In the spirit of self-hosted tools, it would be great if the /chat command could support offline AI text generation, for example with a local Ollama instance.
At the moment, it seems restricted to online services that require an API token (OpenAI, Gemini, Anthropic). If it were possible to query a locally hosted alternative, it would keep everything self-contained and give the bot hoster the freedom to choose whichever model they want.
I've done a small amount of research and found an Ollama API for Node, which might help an implementation on Bastion.
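As a rough illustration of how little plumbing a local backend would need, here is a minimal sketch of querying a local Ollama instance over its documented REST API (`POST /api/chat`). The `chatWithOllama` helper, the default `llama3` model name, and the default `http://localhost:11434` address are assumptions for the example, not anything Bastion currently provides.

```typescript
// Sketch: querying a locally running Ollama server via its REST API.
// The /api/chat route and request shape follow Ollama's documented API;
// chatWithOllama and the "llama3" default are hypothetical names.

interface OllamaMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for Ollama's /api/chat endpoint.
// stream: false asks for a single complete response instead of chunks.
function buildChatRequest(model: string, messages: OllamaMessage[]) {
  return { model, messages, stream: false };
}

// Send a prompt to a local Ollama server and return the reply text.
async function chatWithOllama(
  prompt: string,
  model = "llama3",
  baseUrl = "http://localhost:11434"
): Promise<string> {
  const body = buildChatRequest(model, [{ role: "user", content: prompt }]);
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  // With stream: false, Ollama returns a single { message: { content } }.
  return data.message.content;
}
```

Since the request shape mirrors the OpenAI-style chat payload the command already builds, supporting it could plausibly be a matter of making the base URL configurable rather than adding a whole new provider.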