feature: support Anthropic’s Claude 3.5 Sonnet, Google’s Gemini 1.5 Pro, and OpenAI’s o1-preview and o1-mini #733
Comments
Claude and Gemini are also supported now in Copilot. https://github.blog/news-insights/product-news/bringing-developer-choice-to-copilot/ |
Huge news here, for us Copilot subscribers. I truly hope support will be added in the plugin for all those models, with Copilot as provider. |
The CopilotChat.nvim plugin has recently expanded its capabilities by adding support for several new AI models. Having recently experimented with them, Microsoft's strategic decision to integrate these diverse AI models under the unified GitHub Copilot platform appears to be a well-calculated move. This integration offers subscribers seamless access to multiple state-of-the-art AI models through a single subscription service. Looking forward, there's hope that the same support will be added to avante.nvim as well. |
Maybe change the issue title to note that there are multiple models now available? It might draw more attention. |
Done. |
Are these optional models controlled by copilot.lua? |
@msdone-lwt I don't see a way to change the model from copilot.lua, for some reason. |
If you're referring to the model used by Copilot for code completion, it was originally based on OpenAI Codex (see the official OpenAI Codex documentation). The code completion functionality received a major update in July 2023 (announced in a GitHub blog post): since then, it has been powered by a new model developed through a collaboration between OpenAI, Microsoft Azure AI, and GitHub, offering a 13% latency improvement over its predecessor. For comprehensive information about GitHub Copilot, including features, pricing, and documentation, visit the official GitHub Copilot page. Anyway, the specific name of that new model, as well as the model currently used for code completion, seems to remain undisclosed. I've always been curious about this. If anyone has more detailed insights about the current model, I would greatly appreciate it if you could share them. Thank you! |
I have successfully enabled Claude 3.5 Sonnet support with the following configuration:

```lua
local my_opts = {
  provider = "copilot",
  copilot = {
    model = "claude-3.5-sonnet",
    -- max_tokens = 4096,
  },
}
```

There are some considerations regarding the optimal value for `max_tokens`.

Summary table of GitHub Copilot Chat supported models:
If anyone has insights on how these limits correlate, please share your understanding. Additionally, the following tasks remain:
|
cool |
@msdone-lwt do you happen to live in Eastern Asia? |
I am in China. If I turn on the network proxy, it doesn't respond at all; if I turn it off, it returns an error: model access is not permitted per policy settings 😥 |
I was curious because, just for testing, I played around with the new Claude 3.5 Sonnet integration with https://github.com/CopilotC-Nvim/CopilotChat.nvim. It seems to be working pretty well! 🙂 Joking apart, try this:

```lua
local my_opts = {
  provider = "copilot",
  copilot = {
    model = "claude-3.5-sonnet",
    -- max_tokens = 4096,
  },
}
```
|
Not an issue of avante, but of the GitHub rollout. I'm getting the same. |
When I did my first test using https://github.com/CopilotC-Nvim/CopilotChat.nvim, I also received that error. These are the models I can currently use with my GitHub Copilot subscription: |
|
There is no Claude option in the Copilot plugin in my VSCode either 😂 WTF |
@pidgeon777 Okay, I will try it tomorrow. |
@pidgeon777 How does your "model selector" work? |
It's a very cool feature of https://github.com/CopilotC-Nvim/CopilotChat.nvim: https://github.com/CopilotC-Nvim/CopilotChat.nvim?tab=readme-ov-file#commands |
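For anyone unfamiliar with it, here's a minimal sketch of how that selector is usually invoked (an assumption based on the README linked above: the `:CopilotChatModels` command exists in recent CopilotChat.nvim builds, but its name may differ in older versions):

```lua
-- Open CopilotChat.nvim's model picker from Lua, equivalent to running
-- :CopilotChatModels manually; it lists the models available to your
-- Copilot subscription and switches the active one for later requests.
vim.cmd("CopilotChatModels")
```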
@repparw Just now, I saw that I could use claude-3.5-sonnet. |
Thanks, with this work-around it is enabled now. |
@msdone-lwt and @gam-phon, was it enabled after following the method I shared here?: @repparw did you also try? Did it work for you? |
I have already enabled Anthropic Claude 3.5 Sonnet in Copilot in my GitHub account, but I still don't see Claude when selecting a model in VSCode. Additionally, when I set the Copilot model to claude-3.5-sonnet in avante.nvim, it returns: Error: 'API request failed with status 403. Body: "access denied"' |
Yes, after following your method exactly, it was enabled immediately. Before I did not have access to Claude 3.5 Sonnet |
policy still hasn't rolled out for me, sadly |
rolled out now and working. |
Thanks, I can use Claude now too |
My idea originated from an interesting discovery I made: initially, the model availability issue was also present in the CopilotChat.nvim plugin (repository link). A subsequent commit addressed this, likely by modifying the model activation method, which enabled the use of the Claude 3.5 Sonnet model. These enhancements haven't been implemented in avante.nvim yet, and support for the newly available models is still needed. Currently, our workflow involves using CopilotChat.nvim as a preliminary "unlocking" authentication step, which lets us use all features in avante.nvim afterward. For GitHub Copilot subscribers, I strongly recommend using both plugins in tandem. Based on my testing: |
The hope is for avante.nvim to evolve by incorporating workspace-parsing functionality, which would provide better context awareness when proposing code modifications, and also to support the remaining models offered in the GitHub Copilot subscription. Finally, a model selector when performing requests would be great. @repparw I'm also interested in knowing more about |
Not true. The rollout just wasn't for all users at the same time. I didn't install CopilotChat; I just waited until the policy appeared and enabled it. |
Then you're suggesting it could be a coincidence that, after following the method, they were able to use Claude 3.5 Sonnet? |
Yes. CopilotChat.nvim has nothing to do with it |
|
This opt (from @pidgeon777 above) doesn't work with current Avante.nvim. I got an error with:

```lua
local my_opts = {
  provider = "copilot",
  copilot = {
    model = "claude-3.5-sonnet",
    -- max_tokens = 4096,
  },
}
```

Playing around, here's my current working opt to enable Claude 3.5 Sonnet with a Copilot subscription (precondition: Claude 3.5 Sonnet is enabled for your account):

```lua
local opts = {
  provider = "copilotclaude",
  vendors = {
    copilotclaude = {
      __inherited_from = "copilot",
      api_key_name = "GITHUB_TOKEN",
      model = "claude-3.5-sonnet",
      max_tokens = 4096,
    },
  },
}
```
|
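As a usage note, here's a minimal sketch of how a table like the one above is typically handed to the plugin (assuming a lazy.nvim-style plugin spec; `require("avante").setup()` is the plugin's entry point, but the surrounding spec is illustrative, not the author's exact config):

```lua
-- Illustrative lazy.nvim plugin spec; with lazy.nvim, the `opts` table
-- is forwarded to require("avante").setup(opts) automatically.
{
  "yetone/avante.nvim",
  opts = {
    provider = "copilotclaude",
    vendors = {
      copilotclaude = {
        __inherited_from = "copilot",
        api_key_name = "GITHUB_TOKEN",
        model = "claude-3.5-sonnet",
        max_tokens = 4096,
      },
    },
  },
}
```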
Feature request
OpenAI o1-preview and o1-mini are now available in GitHub Copilot Chat in VS Code and in the GitHub Models playground.
https://github.blog/news-insights/product-news/try-out-openai-o1-in-github-copilot-and-models/
All the necessary information on how this could be ported to avante.nvim can be found at the following link: CopilotC-Nvim/CopilotChat.nvim#419
Motivation
To utilize the OpenAI o1 model, it is not mandatory to rely on the costly API services. Instead, a subscription to GitHub Copilot Chat can suffice for accessing the model's capabilities. This alternative provides a more cost-effective solution while still leveraging the advanced functionalities of the o1 model. By subscribing to GitHub Copilot Chat, users can integrate AI-driven assistance directly into their development workflow, enhancing productivity and code quality without incurring significant expenses.
Other
No response