Can I change the default OpenAI model from gpt-3.5-turbo to gpt-4o-mini? #2699
Unanswered · nstuPlyaskin asked this question in Q&A
Hello there, thanks so much for making this powerful tool!

I have a problem adding an OpenAI integration via an API key on the website (Model Providers): I select OpenAI, enter my API key (which has access to gpt-4o-mini), and I get this error:

hint : 102
Fail to access model(gpt-3.5-turbo) using this api key. ERROR: Error code: 403 - {'error': {'message': ... does not have access to model gpt-3.5-turbo', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}

How can I fix this so my key is used with gpt-4o-mini?
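For what it's worth, the same behavior can be checked outside the app with the official OpenAI Python SDK. This is only a minimal sketch (the model names come from the error above, and the key is assumed to be in the OPENAI_API_KEY environment variable); it shows whether the key itself can reach gpt-4o-mini while gpt-3.5-turbo returns 403 model_not_found:

```python
# Minimal sketch: probe which of the two models this API key can actually reach.
# Assumes the official `openai` Python package (v1+) and that the key is set in
# the OPENAI_API_KEY environment variable.
import os

import openai

client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])

for model in ("gpt-3.5-turbo", "gpt-4o-mini"):
    try:
        client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": "ping"}],
            max_tokens=1,
        )
        print(f"{model}: accessible")
    except openai.APIStatusError as exc:
        # A key restricted to gpt-4o-mini typically gets 403 / model_not_found here.
        print(f"{model}: HTTP {exc.status_code} - {exc.message}")
```

If gpt-4o-mini succeeds and gpt-3.5-turbo fails with 403, the key is fine and the problem is only that the integration validates the key against gpt-3.5-turbo by default.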
Replies: 1 comment

- I've already filed an issue for that.