issue with tools + streaming + Claude on #133 #138
Comments
I mention this (briefly) in the README, but Claude doesn't support streaming tool use.
This was the case some time ago, but I believe the situation has changed: https://docs.anthropic.com/en/api/messages-streaming#streaming-request-with-tool-use
That's interesting, thanks. We'll have to update Claude's code to handle the streaming tool use. That will happen soon, but not in my current tool-use PR.
I did end up putting this in my latest PR, thanks for letting me know that Claude can now support this!
I tried this just now, or more precisely the mildly updated version

```elisp
(let* ((provider (make-llm-claude
                  :key (exec-path-from-shell-getenv "ANTHROPIC_KEY")
                  :chat-model "claude-3-5-sonnet-20240620"))
       results
       (add-fn (llm-make-tool
                :function (lambda (callback a b)
                            (let ((result (format "%s" (+ a b))))
                              (push (list :tool-call (cons (list a b) result)) results)
                              (funcall callback result)))
                :name "add"
                :description "Sums two numbers."
                :args '((:name "a" :description "A number." :type integer :required t)
                        (:name "b" :description "A number." :type integer :required t))
                :async t))
       (prompt (llm-make-chat-prompt
                "Compute 2+3."
                :tools (list add-fn)))
       done)
  (llm-chat-streaming
   provider prompt
   (lambda (partial) (push (list :partial partial) results))
   (lambda (final)
     (push (list :final final) results)
     (push (list :prompt-after (copy-sequence prompt)) results)
     (setq done t))
   (lambda (err msg)
     (push (list :error err msg) results)
     (setq done t)))
  (while (not done) (sleep-for 0.1))
  (pp-display-expression (nreverse results) "*test*"))
```

which failed for similar reasons as before. It seemed that you were calling the same function for streaming tool use for both OpenAI and Claude, but looking at their docs, they look different. Maybe I missed something. Anyway, I got tool streaming with Claude working on my branch, which I'll PR, referencing this, in just a second.
* llm-claude.el (llm-provider-streaming-media-handler, llm-provider-collect-streaming-tool-uses): Support tool use events in Claude's streaming API by capturing content_block_start events as well as content_block_delta events of type input_json_delta. See #138
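For anyone following along, the event shape the changelog entry refers to can be sketched outside of elisp. The following Python is purely illustrative (the event payloads and the `collect_tool_uses` helper are made up for this sketch, not `llm`'s actual code): per Anthropic's streaming docs, a `content_block_start` announces the tool call with an empty `input`, and subsequent `input_json_delta` deltas carry the arguments as JSON fragments that must be concatenated and parsed at `content_block_stop`.

```python
import json

# Hypothetical event sequence mimicking Claude's streaming tool-use shape.
events = [
    {"type": "content_block_start", "index": 1,
     "content_block": {"type": "tool_use", "id": "toolu_01", "name": "add", "input": {}}},
    {"type": "content_block_delta", "index": 1,
     "delta": {"type": "input_json_delta", "partial_json": '{"a": 2'}},
    {"type": "content_block_delta", "index": 1,
     "delta": {"type": "input_json_delta", "partial_json": ', "b": 3}'}},
    {"type": "content_block_stop", "index": 1},
]

def collect_tool_uses(events):
    """Accumulate partial JSON per block index; parse once the block stops."""
    pending, calls = {}, []
    for ev in events:
        if ev["type"] == "content_block_start" and ev["content_block"]["type"] == "tool_use":
            block = ev["content_block"]
            pending[ev["index"]] = {"id": block["id"], "name": block["name"], "json": []}
        elif ev["type"] == "content_block_delta" and ev["delta"]["type"] == "input_json_delta":
            if ev["index"] in pending:
                pending[ev["index"]]["json"].append(ev["delta"]["partial_json"])
        elif ev["type"] == "content_block_stop" and ev["index"] in pending:
            call = pending.pop(ev["index"])
            calls.append({"id": call["id"], "name": call["name"],
                          "input": json.loads("".join(call["json"]))})
    return calls

print(collect_tool_uses(events))
# → [{'id': 'toolu_01', 'name': 'add', 'input': {'a': 2, 'b': 3}}]
```

The key point is that OpenAI and Claude differ here: Claude splits the tool name (in `content_block_start`) from its arguments (in the deltas), so a handler written for one provider's event stream won't parse the other's.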
Thanks for the fix here. I'm not sure why I had thought I had enabled this; perhaps it had worked on a simpler example. I've added an integration test so we know for sure now.
Hi Andrew, I suspect this is still a work in progress, but figured I'd report what I came across:
Evaluating
yields
which shows that when streaming with Claude, `llm` does not call the tool, let alone record the result in the prompt.