[Feature Request]: Implement Service Provider Google Support Custom End Point Url #4086
Comments
Please follow the issue template to update the title and description of your issue.
Additional note: This implementation could significantly enhance performance, not only for web-based applications but also for desktop versions.
One-API doesn't support the Google Gemini API definition; you can directly configure the endpoint in the custom endpoint (GPT-like) setting. As for model providers, progress will be tracked in issue #4030.
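To illustrate the "GPT-like custom endpoint" suggestion: a one-api gateway exposes an OpenAI-compatible route, so the client only needs to swap the base URL while the path and payload stay the same. The sketch below is illustrative; the base URL and port are placeholders, not values from this thread.

```typescript
// Sketch (assumed setup): building a GPT-style chat request against a
// one-api gateway. Base URL and model name here are placeholders.
interface ChatRequest {
  url: string;
  body: { model: string; messages: { role: string; content: string }[] };
}

function buildChatRequest(baseUrl: string, model: string, prompt: string): ChatRequest {
  // one-api serves an OpenAI-compatible API, so only the base URL changes;
  // the /v1/chat/completions path and the request shape are unchanged.
  return {
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    body: { model, messages: [{ role: "user", content: prompt }] },
  };
}

const req = buildChatRequest("http://one-api.example.internal:3000", "gemini-pro", "hello");
console.log(req.url); // http://one-api.example.internal:3000/v1/chat/completions
```

With this routing, the client treats Gemini as just another OpenAI-compatible backend behind the gateway.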
It's supported; I've been testing it running on:

```
(base) root@H0llyW00dzZ:/# kubectl top pods
NAME                       CPU(cores)   MEMORY(bytes)
mysql-78mv3xmb37-hpmz5     5m           402Mi
one-api-j3laxb3o30-bz0gu   0m           46Mi
```

To be honest, I like the
@H0llyW00dzZ Cool! The base URL of the Gemini API should be able to be configured via
No, but the env variable GOOGLE_URL can't be used in the desktop version, for example in this one. Also, the desktop version is much more stable when using a custom endpoint URL, and it runs smoothly, unlike web-based applications.
The reason it can't be used in the desktop version is that it would need to be recompiled into binaries again.
Another reason: Rust is very slow at compiling it.
They are the same thing eventually; you can try switching the endpoint to the one-api service.
No, it doesn't work that way. When using the gemini-pro model, it uses Google as the service provider, and using that service is mandatory. https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/blob/main/app/constant.ts#L14C1-L14C77
@H0llyW00dzZ When requesting the Gemini API server, the client first checks whether googleUrl is configured. If it is not set, it will fall back to the URL you just mentioned.
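The fallback described above can be sketched as follows. The default base URL shown is an assumption for illustration; the actual value lives in the repo's `constant.ts` linked earlier.

```typescript
// Sketch of the described fallback: use googleUrl when configured,
// otherwise fall back to a default Gemini base URL (value assumed here).
const DEFAULT_GEMINI_BASE_URL = "https://generativelanguage.googleapis.com/";

function getGeminiBaseUrl(configuredUrl?: string): string {
  // An empty string counts as "not configured".
  return configuredUrl && configuredUrl.length > 0
    ? configuredUrl
    : DEFAULT_GEMINI_BASE_URL;
}

console.log(getGeminiBaseUrl());                        // default base URL
console.log(getGeminiBaseUrl("http://one-api.local/")); // custom endpoint wins
```

So pointing `googleUrl` at a one-api service is enough to reroute Gemini traffic without touching the default constant.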
Oh, for native clients, going through a proxy service is still necessary. It might need to use a proxy service like
@H0llyW00dzZ Have you checked the developer console in a development build? It's likely the requests are being blocked by CORS right now.
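For context on the CORS suspicion: enforcement happens in the browser (or webview), which compares the response's `Access-Control-Allow-Origin` header against the page's origin. The hypothetical helper below mirrors that header check for debugging purposes only; it is not how the client actually enforces CORS.

```typescript
// Hypothetical helper: decides whether a CORS allow-origin header would
// permit a given request origin. Real enforcement is done by the browser;
// this only mirrors the header comparison for reasoning about failures.
function corsAllows(allowOriginHeader: string | null, requestOrigin: string): boolean {
  if (allowOriginHeader === null) return false; // header absent: request blocked
  if (allowOriginHeader === "*") return true;   // wildcard permits any origin
  return allowOriginHeader === requestOrigin;   // otherwise exact match required
}

// An API server that omits the header would be blocked for a local web app:
console.log(corsAllows(null, "http://localhost:3000")); // false
console.log(corsAllows("*", "http://localhost:3000"));  // true
```

This is also why routing through a proxy such as one-api helps: the proxy can respond from the app's own origin or attach permissive CORS headers.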
No response. I am using one-api on k8s.
When using the service provider OpenAI with an OpenAI model it works, but with Gemini it doesn't.
Also, I am pretty sure it's because of this.
Problem Description

In the current version, the service provider `Google` does not support custom endpoints, unlike the service provider `OpenAI`, which provides a custom endpoint URL.

Solution Description

Implementing support for custom endpoints in the service provider `Google` could greatly enhance performance. As an example, I've been operating this repository on `Kubernetes`, utilizing containers such as `one-api` and `litellm` across `13 pods`. This configuration has proven more stable than deployments on `Vercel` or other platforms.

Alternatives Considered

No response

Additional Context

No response