Chat UI not reacting to failed response with code 400 #955
I think it is the latter exception that prevents the UI from reacting to the illegal-request response. It seems that for some reason the …
I'm experiencing this issue as well, using …
Was just a question to test the AI haha.
Can confirm this is happening to me as well. I'm using Ollama with the default built-in adapter (running v2.3.49).
I'm also getting a 400, but with a different error: `json: cannot unmarshal array into Go struct field ChatRequest.messages of type string`. I'm using a remote Ollama server in custom API mode; the Ollama server works from other chat frontends. v2.3.49
Thanks everyone for reporting this. It seems the issues with Ollama are likely model specific. In #955 (comment), the issue appears to be that the model doesn't support tools, which are currently required to use the lookup-notes function from within the chat. Researching which models support tools should resolve this error. For additional errors, please let me know which model you are using with Ollama; screenshots of error codes/messages are helpful. Lastly, if any models seem to be working particularly well, sharing that would also be helpful 🌴
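If it helps with triage: one way to check whether a given Ollama model accepts tools is to send a minimal `/api/chat` request that includes a `tools` array and see whether the server rejects it. A rough sketch, assuming a local default Ollama install; the model name is a placeholder and the exact error wording from Ollama may differ:

```ts
// Probe whether an Ollama model accepts the `tools` field on /api/chat.
// Models without tool support respond with a non-2xx status and an error body.
async function supportsTools(
  model: string,
  baseUrl = "http://localhost:11434",
): Promise<boolean> {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: false,
      messages: [{ role: "user", content: "ping" }],
      // A trivial tool definition; we only care whether the server accepts the field.
      tools: [
        {
          type: "function",
          function: {
            name: "noop",
            description: "no-op probe",
            parameters: { type: "object", properties: {} },
          },
        },
      ],
    }),
  });
  if (res.ok) return true;
  console.error(`model ${model} rejected tools:`, res.status, await res.text());
  return false;
}

// Example: supportsTools("phi4").then((ok) => console.log("tools:", ok));
```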
For the error I got (`json: cannot unmarshal array into Go struct field ChatRequest.messages of type string`), I copied and pasted the exact same request into Postman and received the same error response. On closer inspection, it seems the content block in the POST request is not compatible with the Ollama format (using deepseek-r1:14b). The content field for Ollama only supports `"content": "some message"`, not `{ "type": "text", "text": "some message" }` (I have verified this in Postman). A question for Ollama users serving from a remote computer: what should we use to connect? The current Ollama preset doesn't seem to support remote servers.
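To illustrate the mismatch described above: Ollama's native `/api/chat` expects `content` to be a plain string, while the OpenAI-style format sends an array of content-part objects, which is what trips the Go unmarshal error. A minimal sketch of flattening the array form into the string form before sending; the helper name is mine, not the plugin's actual code:

```ts
// OpenAI-style content parts, as produced by the Custom API / OpenAI adapter.
type ContentPart = { type: "text"; text: string };

// Ollama's chat endpoint unmarshals `content` into a Go string, so sending
// an array triggers: "cannot unmarshal array into Go struct field
// ChatRequest.messages of type string". Flatten before sending.
function flattenContent(content: string | ContentPart[]): string {
  if (typeof content === "string") return content;
  return content
    .filter((part) => part.type === "text")
    .map((part) => part.text)
    .join("\n");
}

// Example: both shapes become a payload Ollama will accept.
const ollamaMessage = {
  role: "user",
  content: flattenContent([{ type: "text", text: "Hello" }]), // -> "Hello"
};
```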
@evetsagg good to know. The custom API endpoint does indeed make a different request than the Ollama adapter (it uses the default OpenAI adapter, which allows the content-objects array). We might be able to add a server URL setting to the Ollama adapter to override the default localhost. It sounds like that is probably the only thing you need; does that sound right to you? 🌴
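As a stopgap until such a setting exists: Ollama binds to `127.0.0.1:11434` by default, and its own `OLLAMA_HOST` environment variable changes the bind address on the serving machine. Client-side, the adapter only needs a configurable base URL. A sketch under those assumptions; the `serverUrl` parameter is hypothetical, not an existing plugin setting:

```ts
// On the serving machine: `OLLAMA_HOST=0.0.0.0 ollama serve` makes Ollama
// listen on all interfaces instead of only 127.0.0.1.
async function chat(
  model: string,
  prompt: string,
  serverUrl = "http://localhost:11434", // hypothetical override point
): Promise<string> {
  const res = await fetch(`${serverUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: false,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) {
    throw new Error(`Ollama returned ${res.status}: ${await res.text()}`);
  }
  const data = await res.json();
  return data.message.content;
}

// Example against a remote box: chat("deepseek-r1:14b", "hello", "http://192.168.1.20:11434");
```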
Yea, that'll work. Thanks. |
I have the issue when running deepseek-r1:1.5b with Ollama.
@ujikol are you using the Ollama adapter (or Custom API adapter) and are there any errors in the console logs that can be screenshotted? 🌴 |
Custom API does not work either. Manual request works: |
I sent the following message to Phi4 in Ollama (local):
The UI just keeps rotating the spinner, as if it is still waiting for a response.
The Ollama server log shows a 400 response after just 20 ms:
Inspecting the transaction with the inspector shows the following logs.
Request:
Response:
Exception:
Exception:
Version: 2.3.47
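For whoever picks this up: the spinner behavior is consistent with the response handler assuming a 2xx status and throwing while parsing the error body, so the failure never reaches the UI. A minimal sketch of the kind of guard that would surface the 400 instead; the names are illustrative, not the plugin's actual code:

```ts
// Guard against non-2xx responses before parsing, so a 400 from Ollama is
// shown to the user instead of leaving the spinner running forever.
async function sendChat(url: string, payload: unknown): Promise<unknown> {
  let res: Response;
  try {
    res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload),
    });
  } catch (err) {
    // Network-level failure: stop the spinner and report it.
    throw new Error(`Request failed: ${(err as Error).message}`);
  }
  if (!res.ok) {
    // e.g. a 400 from Ollama; the body typically carries an "error" field.
    const body = await res.text();
    throw new Error(`Server returned ${res.status}: ${body}`);
  }
  return res.json();
}
```

The caller would catch this error, stop the spinner, and render the message in the chat view rather than swallowing the exception.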