
Generic OpenAI API-compliant AI Chat Endpoint configuration #55

Open
corradodebari opened this issue Dec 5, 2024 · 2 comments

@corradodebari
Contributor

Since model servers like HF TGI/vLLM/OpenLLM expose their models through an OpenAI API-compatible REST endpoint, a generic OpenAI endpoint with a free-text field for the model name would offer a single integration point.
In the case of a model served by HF TGI, the model name is fixed to "tgi", since the server serves only one model at a time.
This generic option should therefore be addable to the Model configuration page more than once, to reach several independent model servers.
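To illustrate the idea, here is a minimal sketch of how such a generic entry could be resolved into an OpenAI-compatible chat request. The function name, fields, and endpoint values are assumptions for illustration, not the project's actual configuration schema.

```python
import json

def build_chat_request(base_url: str, model: str, messages: list) -> tuple:
    """Build the URL and JSON body for an OpenAI-compatible
    /v1/chat/completions call against any model server (TGI, vLLM, ...)."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    payload = {"model": model, "messages": messages}
    return url, json.dumps(payload)

# For HF TGI the model name is fixed to "tgi" (one model per server):
url, body = build_chat_request(
    "http://localhost:8080",  # hypothetical local TGI server
    "tgi",
    [{"role": "user", "content": "Hello"}],
)
```

Because every server speaks the same wire format, only `base_url` and `model` differ per configured entry, which is why a free-text model-name field plus a repeatable configuration block would cover TGI, vLLM, and OpenLLM alike.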

@gotsysdba
Contributor

Make the API field a LOV (list of values) of supported API types (like "Generic OpenAI").
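A LOV like this could be sketched as an enumeration; the entry names below are assumptions, not the project's actual identifiers.

```python
from enum import Enum

class ApiType(str, Enum):
    """Hypothetical list of values for the API field of a model entry."""
    OPENAI = "OpenAI"
    GENERIC_OPENAI = "Generic OpenAI"  # free model-name field, e.g. "tgi" for HF TGI

# Values offered in the configuration page's drop-down:
SUPPORTED_API_TYPES = [t.value for t in ApiType]
```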

@gotsysdba
Contributor

Related to #59

@gotsysdba gotsysdba added this to the v1.0.0 milestone Dec 5, 2024