
[IDEA]: Self-Hosted AI Chat Responses (Ollama) #1082

Open
cotton105 opened this issue Jan 12, 2025 · 1 comment
Labels
💡 Idea Ideas, suggestions or feature requests

Comments

@cotton105

In the spirit of self-hosted tools, it would be great if the /chat command could support offline AI text generation, for example with a local Ollama instance.

At the moment, the command seems restricted to online services that require an API token (OpenAI, Gemini, Anthropic). If it were possible to query a locally hosted alternative, it would help to keep everything self-contained and give the Bot Hoster the freedom to choose whichever model they want.

I've done a small amount of research and found an Ollama API for Node, which might help an implementation on Bastion.
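As a rough sketch of what an integration could look like, Ollama exposes a documented REST endpoint (`POST /api/generate` on port 11434 by default). The host, model name, and `chat` handler below are placeholder assumptions, not Bastion code; a real implementation would read these from the bot's configuration:

```javascript
// Build a request for Ollama's /api/generate endpoint.
// Host and model are assumed defaults; Bastion would take them
// from the Bot Hoster's configuration instead.
function buildGenerateRequest(prompt, { host = "http://localhost:11434", model = "llama3" } = {}) {
  return {
    url: `${host}/api/generate`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false asks Ollama for a single JSON reply
      body: JSON.stringify({ model, prompt, stream: false }),
    },
  };
}

// Hypothetical /chat handler: send the prompt, return the generated text.
async function chat(prompt) {
  const { url, options } = buildGenerateRequest(prompt);
  const res = await fetch(url, options); // global fetch is available in Node 18+
  const data = await res.json();
  return data.response; // Ollama puts the generated text in `response`
}
```

The `ollama` npm package mentioned above wraps this same API (e.g. `ollama.chat({ model, messages })`), which would likely be cleaner than raw `fetch` calls.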

@cotton105 cotton105 added the 💡 Idea Ideas, suggestions or feature requests label Jan 12, 2025

atlanna bot commented Jan 12, 2025

Thank you for opening this issue.
A maintainer will get to it as soon as practical to address this issue.

If this is a support question and not really an issue or suggestion, then please ask it in our Discord Server instead.
