-
Hi folks! Just getting started with Smart Connections and looking to use a local LLM with LM Studio to process chat requests. At the moment I have a model running on an LM Studio server locally on the machine and am trying to connect it to Smart Connections. The configuration seems fine, but I am getting an API error message in Smart Connections, and LM Studio is reporting a missing "messages" field. Is there anything I am missing to get Smart Connections working with LM Studio server-hosted models?
Replies: 1 comment
-
I have fixed it! Updating this discussion for future reference for anyone wanting to resolve this. I am currently using Mistral hosted on LM Studio, and the "'messages' field is required" error message is caused by enabling the streaming option in the settings of Smart Connections; disabling streaming resolves the issue. Screenshot below of the setting that needs to be disabled for chat to LM Studio via a local server to work.
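For anyone debugging this outside the plugin, here is a minimal sketch of the kind of non-streaming request LM Studio's OpenAI-compatible endpoint expects. The URL (`http://localhost:1234/v1/chat/completions` is LM Studio's default local server address) and the model name `"mistral"` are assumptions; substitute whatever your server reports. The key point is that the payload must carry a `messages` list and that `stream` stays `False`:

```python
import json
import urllib.request

def build_chat_request(prompt, model="mistral", stream=False):
    """Build an OpenAI-style chat completion payload.

    The 'messages' field must be a list of role/content dicts;
    a request without it triggers the error from the thread above.
    Keeping stream=False matches the fix described here.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

def send_chat_request(payload, url="http://localhost:1234/v1/chat/completions"):
    """POST the payload to a locally running LM Studio server."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    payload = build_chat_request("Hello from a local client!")
    print(json.dumps(payload, indent=2))
    # Sending requires the LM Studio server to be running:
    # reply = send_chat_request(payload)
```

If a hand-built request like this succeeds while the plugin's does not, the difference is almost certainly the streaming setting described above.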