Support for o1 models? #419
Comments
I haven't got access yet so I'm not sure about the request/response structure. If someone could send a MITM capture, I'd be able to implement it.
I think I have the time to capture that. I'll see what I can do.
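For anyone else who wants to grab the same data, here is a minimal sketch of one way to do it with mitmproxy. It assumes the chat traffic goes to a githubcopilot.com host and that you point VSCode (or whichever Copilot client you use) at the proxy; the hostname filter is an assumption, not something confirmed in this thread.

```python
# dump_copilot.py -- a rough mitmproxy addon sketch, not the plugin's code.
# Run with:  mitmdump -s dump_copilot.py
# Assumption: Copilot chat requests go to a *.githubcopilot.com host.
from mitmproxy import http


def response(flow: http.HTTPFlow) -> None:
    """Print request and response bodies for Copilot chat traffic."""
    if "githubcopilot.com" in flow.request.pretty_host:
        print("=== REQUEST ===")
        print(flow.request.method, flow.request.pretty_url)
        print(flow.request.get_text())
        print("=== RESPONSE ===")
        print(flow.response.status_code)
        print(flow.response.get_text())
```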
Here's the raw request and response when asking the o1-preview model in VSCode Insiders.
Here's the request and response when I'm asking the same question in the nvim plugin.
EDIT: Here are a few things that I noticed:
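For reference, the o1 family is known to need a slightly different request shape on the OpenAI side at launch: no system role, no streaming, and max_completion_tokens instead of max_tokens. Whether the Copilot endpoint mirrors all of these is an assumption, but a rough Python sketch of the difference looks like this:

```python
import json

# Hedged sketch only: an OpenAI-style chat payload and how it commonly has to
# change for o1-family models. The exact fields used by the Copilot endpoint
# are an assumption here, not taken from the captures above.
gpt4_payload = {
    "model": "gpt-4",
    "stream": True,
    "max_tokens": 4096,
    "messages": [
        {"role": "system", "content": "You are a coding assistant."},
        {"role": "user", "content": "Explain this function."},
    ],
}

o1_payload = {
    "model": "o1-preview",
    "stream": False,                # o1 models rejected streaming at launch
    "max_completion_tokens": 4096,  # replaces max_tokens for o1 models
    "messages": [
        # no "system" message: o1 models rejected the system role at launch
        {"role": "user", "content": "Explain this function."},
    ],
}

print(json.dumps(o1_payload, indent=2))
```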
Me too, I vote for this.
Has anyone managed to get it to work?
I have not yet attempted this, but incorporating support for this feature would be highly beneficial. Those with a GitHub Copilot subscription also have requests to the o1 models included, which means it is possible to make requests to those models "for free", without paying extra. This enhancement could put Copilot on par with Claude 3.5 Sonnet in terms of coding capabilities.
I just got access today. Will see if I can get it working |
Took almost an hour but o1 pretty much works now. Thank you @alwint3r for the mitm. Didn't even have to install vscode.
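For anyone curious what the client-side change typically involves, one plausible piece (an assumption, not the actual patch) is accepting a single JSON body for o1 instead of the usual SSE stream; a rough sketch in Python:

```python
import json


def parse_chat_response(body: str) -> str:
    """Hedged sketch: extract the assistant text from either a plain JSON
    completion (o1-style, no streaming) or an SSE stream of chunks
    (streaming models). Not the plugin's actual code."""
    body = body.strip()
    if not body.startswith("data:"):
        # Non-streaming response: one JSON object with the full message.
        data = json.loads(body)
        return data["choices"][0]["message"]["content"]

    # Streaming response: concatenate the delta chunks from each SSE event.
    parts = []
    for line in body.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0].get("delta", {})
        parts.append(delta.get("content") or "")
    return "".join(parts)
```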
Can't wait to try it. Thank you!
GitHub has opened a limited preview for OpenAI o1 models. I was lucky enough to be admitted into the limited preview program, and I can see the o1 and o1-preview models when running the :CopilotChatModels command. I tried selecting the o1-mini-2024-09-1 model and asked it some questions but got some errors. Here are some details from the log file: