What happened?

Setup: Docker Compose self-hosting.

I am seeing an "Error parsing response" error in the Helicone worker OpenAI proxy container and in the helicone-jawn container when sending a request to OpenAI through Helicone.

Despite this error, the request is displayed in the Helicone dashboard, but some fields are missing (e.g. model).

Please also check this issue for some adjustments I had to make to get the Docker Compose setup running: #2965
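The parse errors show up in the logs of both containers. A minimal way to tail them while reproducing the issue, assuming the service names below (they may differ in your docker-compose.yml):

# service names are assumptions; check docker-compose.yml for the actual ones
docker compose logs -f worker-openai-proxy jawn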
Request example:
curl --location 'http://127.0.0.1:8787/v1/chat/completions' \
  --header 'Authorization: Bearer XXXXXXXXXX' \
  --header 'Helicone-Auth: Bearer XXXXXXXXXX' \
  --header 'Content-Type: application/json' \
  --data '{
    "model": "gpt-4o",
    "messages": [
      { "role": "system", "content": "What is football, give one word as answer" }
    ]
  }'
Relevant log output

[wrangler:inf] POST /v1/chat/completions 200 OK (448ms)
Error parsing default response: SyntaxError: Unexpected token '!', "!�"... is not valid JSON
Error processing response body
Error parsing body: SyntaxError: Unexpected token '!', "!� V�WS�"... is not valid JSON
[non-UTF-8 binary response body omitted]
Inserting rate limits for batch
Upserting logs for batch
Finished processing batch
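The binary bytes in the parse error suggest (this is only an assumption, not something confirmed by the logs) that the response body reaching the worker is compressed and is being parsed as JSON before it is decompressed. A quick way to test that theory is to repeat the request while asking for an uncompressed body; the extra Accept-Encoding: identity header below is a diagnostic sketch, not a confirmed fix:

# same request as above, plus Accept-Encoding: identity to request an uncompressed body (assumption-driven diagnostic)
curl --location 'http://127.0.0.1:8787/v1/chat/completions' \
  --header 'Authorization: Bearer XXXXXXXXXX' \
  --header 'Helicone-Auth: Bearer XXXXXXXXXX' \
  --header 'Content-Type: application/json' \
  --header 'Accept-Encoding: identity' \
  --data '{
    "model": "gpt-4o",
    "messages": [
      { "role": "system", "content": "What is football, give one word as answer" }
    ]
  }'

If the dashboard then shows the missing fields (e.g. model), the problem is most likely in how the proxy handles compressed response bodies.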