Replies: 9 comments 7 replies
-
I can provide an API key privately if you wish to test on your end.
…On Sun, Nov 3, 2024 at 11:48 PM Andrew Hyatt ***@***.***> wrote:
I think you might be the first to try to use openrouter.
plz unfortunately doesn't have a good way to log all the requests and
responses, AFAIK. @r0man <https://github.com/r0man>, do you have any
advice here for debugging? I expect that openrouter is doing something
slightly unexpected with the responses, although it claims to be Open AI
compatible.
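As a rough workaround for the lack of built-in logging, one option is to advise the plz function itself so that every call's method, URL, and keyword arguments get written to a buffer. This is only a sketch, not a plz feature; my/plz-log is a made-up helper name:
```
;; Rough sketch, not a plz feature: log every plz call's method, URL,
;; and keyword arguments to the *plz-log* buffer.
(require 'plz)

(defun my/plz-log (method url &rest rest)
  "Append METHOD, URL, and the remaining arguments REST to *plz-log*."
  (with-current-buffer (get-buffer-create "*plz-log*")
    (goto-char (point-max))
    (insert (format "%S %s\n%S\n\n" method url rest))))

(advice-add 'plz :before #'my/plz-log)
;; (advice-remove 'plz #'my/plz-log)  ; disable again when done
```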
-
@dto Could you please try to adapt the snippet in #48 (comment) to make a failing call to OpenRouter, and send me the response (minus your credentials) as a file attachment by email?
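For reference, a minimal sketch of such a raw call with plz (only an approximation of the snippet in #48; my-openrouter-key and the request body are placeholders):
```
;; Sketch of a raw OpenRouter call via plz, returning the unparsed
;; response body as a string so it can be saved and attached.
;; `my-openrouter-key' is a placeholder for your own key.
(require 'plz)
(require 'json)

(let ((my-openrouter-key "sk-or-..."))   ; placeholder
  (plz 'post "https://openrouter.ai/api/v1/chat/completions"
    :headers `(("Authorization" . ,(concat "Bearer " my-openrouter-key))
               ("Content-Type" . "application/json"))
    :body (json-encode
           '((model . "qwen/qwen-2.5-72b-instruct")
             (messages . [((role . "user") (content . "Hello"))])))
    :as 'string))
```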
-
@dto and @ahyatt I added minimal support for debugging HTTP responses here: r0man/plz-media-type@a9236fa. Maybe that is of some use.
-
Thank you. I have installed plz-media-type from Git with your changes and have done:
```
(add-to-list 'load-path "/home/dto/src/plz-media-type")
(require 'plz-media-type)
```
But I am not sure how to proceed, or how to get plz to use this feature.
…On Mon, Nov 4, 2024 at 2:41 PM r0man ***@***.***> wrote:
Other things to check for OpenRouter that come to mind are:
- Is the response streaming or not?
- Is the response advertised in the right format, and are the handlers installed correctly? For example, if OpenRouter claims to return OpenAI-compatible responses, do they also contain the right Content-Type header, e.g. text/event-stream? (A rough way to check this is sketched below.)
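A rough way to check that last point with plz, assuming I remember its response accessors correctly (`:as 'response` returns a plz-response struct whose headers are an alist; the key and model are placeholders):
```
;; Sketch: request a streaming completion and inspect the advertised
;; Content-Type header.  For an SSE stream it should be
;; text/event-stream rather than application/json.
(require 'plz)
(require 'json)

(let* ((my-openrouter-key "sk-or-...")   ; placeholder
       (response
        (plz 'post "https://openrouter.ai/api/v1/chat/completions"
          :headers `(("Authorization" . ,(concat "Bearer " my-openrouter-key))
                     ("Content-Type" . "application/json"))
          :body (json-encode
                 '((model . "qwen/qwen-2.5-72b-instruct")
                   (stream . t)
                   (messages . [((role . "user") (content . "Hello"))])))
          :as 'response)))
  (alist-get 'content-type (plz-response-headers response)))
```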
-
One thing that seems suspicious from the response you sent is that it declares the response to be application/json. If this is a streaming response in an OpenAI-compatible format, it should be text/event-stream from what I remember, i.e. a server-sent events response. If those are not aligned, we may parse the response with the wrong parser, e.g. a server-sent events response with the JSON document parser.
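For comparison, an OpenAI-style streaming response served as text/event-stream looks roughly like the following (abridged, illustrative values only), whereas a non-streaming response is a single JSON document like the one further down in this thread:
```
content-type: text/event-stream

data: {"object":"chat.completion.chunk","choices":[{"delta":{"content":"Hel"},"index":0}]}

data: {"object":"chat.completion.chunk","choices":[{"delta":{"content":"lo"},"index":0}]}

data: [DONE]
```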
-
I don't think I set the response to streaming. Here is the debug output; the tool call fails with json-end-of-file within plz--respond. Thanks for your help :)
```
HTTP/2 200
date: Mon, 04 Nov 2024 19:58:30 GMT
content-type: application/json
access-control-allow-credentials: true
access-control-allow-headers: Authorization, User-Agent, X-Api-Key,
X-CSRF-Token, X-Requested-With, Accept, Accept-Version, Content-Length,
Content-MD5, Content-Type, Date, X-Api-Version, HTTP-Referer,
X-Windowai-Title, X-Openrouter-Title, X-Title, X-Stainless-Lang,
X-Stainless-Package-Version, X-Stainless-OS, X-Stainless-Arch,
X-Stainless-Runtime, X-Stainless-Runtime-Version, X-Stainless-Retry-Count,
Protection-Key
access-control-allow-methods: GET,OPTIONS,PATCH,DELETE,POST,PUT
access-control-allow-origin: *
cache-control: public, max-age=0, must-revalidate
content-security-policy: default-src 'self'; script-src 'self'
'unsafe-eval' 'unsafe-inline' https://clerk.openrouter.ai
https://cunning-heron-18.clerk.accounts.dev
https://challenges.cloudflare.com https://checkout.stripe.com
https://connect-js.stripe.com https://js.stripe.com
https://maps.googleapis.com https://www.googletagmanager.com https://*.
ingest.sentry.io; connect-src 'self' https://clerk.openrouter.ai
https://cunning-heron-18.clerk.accounts.dev https://checkout.stripe.com
https://api.stripe.com https://maps.googleapis.com *.google-analytics.com
https://www.googletagmanager.com https://raw.githubusercontent.com wss://
www.walletlink.org/rpc https://*.ingest.sentry.io; frame-src 'self'
https://challenges.cloudflare.com https://checkout.stripe.com
https://connect-js.stripe.com https://js.stripe.com https://hooks.stripe.com
https://us5.datadoghq.com https://*.ingest.sentry.io; img-src 'self' data:
blob: https://img.clerk.com https://*.stripe.com
https://www.googletagmanager.com https://t0.gstatic.com; worker-src 'self'
blob:; style-src 'self' 'unsafe-inline'
sha256-0hAheEzaMe6uXIKV4EehS9pu1am1lj/KnnzrOYqckXk=;
upgrade-insecure-requests
strict-transport-security: max-age=63072000
x-matched-path: /api/v1/chat/completions
x-vercel-id: iad1::w7f2k-1730750310003-cd83616a5691
cf-cache-status: DYNAMIC
vary: accept-encoding
server: cloudflare
cf-ray: 8dd72ddd68908fdb-BOS
content-encoding: br
{"id":"gen-1730750310-iaeJNUIbfvNy1JAXZD28","provider":"DeepInfra","model":"qwen/qwen-2.5-72b-instruct","object":"chat.completion","created":1730750310,"choices":[{"logprobs":null,"finish_reason":"stop","index":0,"message":{"role":"assistant","content":null,"refusal":null,"tool_calls":[{"index":0,"id":"call_YfQTLPaDEkVvV9PEFU2Qs7Hg","function":{"arguments":"","name":"eli-disk-usage"},"type":"function"}]}}],"usage":{"prompt_tokens":2570,"completion_tokens":19,"total_tokens":2589}}
…On Mon, Nov 4, 2024 at 2:53 PM r0man ***@***.***> wrote:
One thing that seems to be suspicious from the response you sent is that
it declares the response to be application/json. If this is a streaming
response and in an OpenAI compatible format it should be text/event-stream
from what I remember, e.g. a server sent events response. If those are not
aligned we may parse the response with the wrong parser, e.g. a server sent
event response with the JSON document parser.
-
Yes, this is with llm-openai-compatible-provider. I think we've found the
source of the issue!
…On Tue, Nov 5, 2024 at 12:54 AM Andrew Hyatt ***@***.***> wrote:
This is helpful, thanks (and thanks @r0man <https://github.com/r0man> for
the pointers in getting the debug). I tried to just manually run
llm-provider-extract-function-calls with an ollama provider and your
response, and it doesn't work, because the function doesn't expect that the
function call is in a choices key in the main object. As you can see in
https://github.com/ollama/ollama/blob/main/docs/api.md, that's what the
docs say to expect.
However, maybe you are using an llm-openai-compatible provider, which I
think is what openrouter says they support. This should in theory work, but
arguments is "", when I think it needs to be "{}". We could potentially
just fix this in the llm package; it doesn't seem crazy to support "",
and I don't think I've actually seen what Open AI gives in this case.
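The kind of tolerance being discussed might look roughly like this (a sketch only, not the actual llm code; the function name is made up): treat an empty "arguments" string as "no arguments" instead of handing it to the JSON parser, which signals json-end-of-file on empty input.
```
;; Sketch only, not the actual llm implementation: parse a tool call's
;; "arguments" string, treating "" (and nil) like an empty object
;; instead of letting the JSON parser signal an error on empty input.
(require 'json)

(defun my/tool-call-arguments (arguments)
  "Return the parsed ARGUMENTS of a tool call, tolerating \"\" and nil."
  (if (or (null arguments) (string= arguments ""))
      nil                                    ; no arguments supplied
    (json-parse-string arguments :object-type 'alist)))

;; (my/tool-call-arguments "")   ;=> nil, instead of a JSON parse error
```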
-
I've got the change to handle empty strings ("") in JSON, and now my application works great. Thank you :)
…On Tue, Nov 5, 2024 at 10:45 AM Andrew Hyatt ***@***.***> wrote:
I've merged in a change in the main branch, please check it out. It will
be in the next release, which will be out in a week or so, I hope.
-
Actually, any of the Qwen tool-capable models will fail in this way: although they work locally, I get the JSON error from plz when trying to use OpenRouter.
Error running timer ‘plz--respond’: (json-end-of-file)
How can I debug this further? toggle-debug-on-error doesn't seem to work for plz--respond in the timer.
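One hedged suggestion for the timer question: errors signalled inside timer functions are usually caught and reduced to that "Error running timer" message, which is why toggle-debug-on-error alone may not open the debugger. Temporarily setting debug-on-signal as well forces the debugger to run even where a handler would catch the error; it is very noisy, so turn it back off afterwards.
```
;; Force the debugger even for errors that are caught by a handler,
;; such as those raised inside timer callbacks.  Very noisy; enable
;; only while reproducing the failure, then set both back to nil.
(setq debug-on-error t
      debug-on-signal t)
```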