
fix: Fix handling of streaming response in AnthropicClaudeInvocationLayer #4993

Merged (5 commits) on Jun 7, 2023

Conversation

silvanocerza
Contributor

Related Issues

Proposed Changes:

Change the default streaming handler in AnthropicClaudeInvocationLayer to a custom one, so that the received streamed response is handled gracefully.
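As a hedged illustration of the problem being fixed: Anthropic's streaming API at the time returned the cumulative completion so far on each event rather than just the new tokens, so a naive handler would re-print already-seen text. A minimal sketch of a handler that chops off the previously seen prefix (names and signatures are illustrative, not the exact Haystack API):

```python
class AnthropicTokenStreamingHandler:
    """Sketch: emit only the newly received text from Anthropic's
    cumulative streaming responses."""

    def __init__(self):
        self.previous_text = ""

    def __call__(self, token_received: str, **kwargs) -> str:
        # token_received is the full completion so far; strip the prefix
        # we already emitted to recover just the new chunk.
        chopped_text = token_received[len(self.previous_text):]
        self.previous_text = token_received
        print(chopped_text, flush=True, end="")
        return chopped_text
```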

How did you test it?

I added new tests for the AnthropicTokenStreamingHandler and changed the AnthropicClaudeInvocationLayer tests.

Notes for the reviewer

More info available in the linked issue.

@silvanocerza silvanocerza requested a review from a team as a code owner May 23, 2023 10:27
@silvanocerza silvanocerza self-assigned this May 23, 2023
@silvanocerza silvanocerza requested review from masci and removed request for a team May 23, 2023 10:27
@silvanocerza silvanocerza changed the title Fix handling of streaming response in AnthropicClaudeInvocationLayer fix: Fix handling of streaming response in AnthropicClaudeInvocationLayer May 23, 2023
@silvanocerza silvanocerza requested a review from vblagoje May 23, 2023 10:27
@coveralls
Collaborator

coveralls commented May 23, 2023

Pull Request Test Coverage Report for Build 5154996112

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • 2348 unchanged lines in 29 files lost coverage.
  • Overall coverage increased (+0.6%) to 40.118%

| File with Coverage Reduction | New Missed Lines | % |
| --- | --- | --- |
| nodes/file_converter/__init__.py | 1 | 93.33% |
| agents/base.py | 2 | 91.25% |
| __init__.py | 2 | 83.33% |
| nodes/prompt/invocation_layer/handlers.py | 2 | 93.88% |
| nodes/preprocessor/base.py | 4 | 80.95% |
| utils/openai_utils.py | 13 | 75.64% |
| nodes/prompt/invocation_layer/anthropic_claude.py | 14 | 82.8% |
| utils/early_stopping.py | 15 | 15.38% |
| nodes/answer_generator/openai.py | 17 | 72.22% |
| nodes/prompt/prompt_template.py | 20 | 89.9% |
Totals Coverage Status
Change from base Build 5055949519: 0.6%
Covered Lines: 8994
Relevant Lines: 22419

💛 - Coveralls

@vblagoje
Member

vblagoje commented May 24, 2023

@silvanocerza I suggest we make AnthropicTokenStreamingHandler an adapter class that takes the user-provided TokenStreamingHandler via init. This instance of TokenStreamingHandler is what we pop from kwargs. We can then always use AnthropicTokenStreamingHandler as an adapter: instead of print(chopped_text, flush=True, end="") we just invoke the user's TokenStreamingHandler with the chopped text. This way, user-provided TokenStreamingHandler callbacks will get the same token-by-token callbacks across all invocation layers. For example, our agents send AgentTokenStreamingHandler as the stream_handler kwarg down every invocation layer stack. In the currently proposed implementation, the user-provided TokenStreamingHandler (sent via kwargs as stream_handler) would still get the old cumulative token callbacks, including in our agents. Once Anthropic fixes their streaming, we can simply retire AnthropicTokenStreamingHandler.
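The adapter approach suggested above can be sketched as follows. This is a hedged illustration, not the exact merged code: the class wraps whatever TokenStreamingHandler the user passed in (e.g. via the stream_handler kwarg) and forwards only the newly received delta to it, so downstream handlers see token-by-token callbacks regardless of Anthropic's cumulative responses.

```python
class AnthropicTokenStreamingHandler:
    """Sketch of the adapter: wrap a user-provided streaming handler and
    forward only the newly streamed text to it."""

    def __init__(self, token_handler):
        # token_handler is the user-provided handler popped from kwargs
        # (illustrative name, not necessarily the exact Haystack signature)
        self.token_handler = token_handler
        self.previous_text = ""

    def __call__(self, token_received: str, **kwargs) -> str:
        # Anthropic streams the cumulative completion; compute the delta
        # and delegate it to the wrapped handler, token-callback style.
        chopped_text = token_received[len(self.previous_text):]
        self.previous_text = token_received
        return self.token_handler(chopped_text, **kwargs)
```

With this shape, an agent's handler (or a plain print-based handler) receives the same incremental tokens it would get from any other invocation layer, and the adapter can be retired once Anthropic's API streams deltas natively.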

Member

@vblagoje vblagoje left a comment


Almost there - if I'm not mistaken, the current version will not work as we want it to for user-provided stream_handler (e.g. our agent handler). We can adapt the proposed solution with a few lines of code and achieve our intended goal.

@silvanocerza
Contributor Author

Merged with branch fix-anthropic-streaming_v2.

@dfokina
Contributor

dfokina commented Jun 2, 2023

Updated the docstrings' language a bit.

Member

@vblagoje vblagoje left a comment


LGTM 🚀

@vblagoje vblagoje merged commit a2156ee into main Jun 7, 2023
@vblagoje vblagoje deleted the fix-anthropic-streaming branch June 7, 2023 08:57
Development

Successfully merging this pull request may close these issues.

Handle Anthropic streaming response
4 participants