
feature: support Anthropic’s Claude 3.5 Sonnet, Google’s Gemini 1.5 Pro, and OpenAI’s o1-preview and o1-mini #733

Open
pidgeon777 opened this issue Oct 18, 2024 · 40 comments
Labels
enhancement New feature or request

Comments

@pidgeon777

Feature request

OpenAI o1-preview and o1-mini are now available in GitHub Copilot Chat in VS Code and in the GitHub Models playground.

https://github.blog/news-insights/product-news/try-out-openai-o1-in-github-copilot-and-models/

All the necessary information on how this could be ported to avante.nvim can be found at the following link:

CopilotC-Nvim/CopilotChat.nvim#419

Motivation

Utilizing the OpenAI o1 models does not require the costly API services; a GitHub Copilot Chat subscription suffices for accessing them. This is a more cost-effective way to leverage the advanced capabilities of the o1 models: subscribers get AI-driven assistance integrated directly into their development workflow, enhancing productivity and code quality without incurring significant expense.

Other

No response

@pidgeon777 pidgeon777 added the enhancement New feature or request label Oct 18, 2024
@repparw

repparw commented Oct 29, 2024

Claude and Gemini are also supported now in Copilot.
Maybe add a parameter for picking from the multi-model options?

https://github.blog/news-insights/product-news/bringing-developer-choice-to-copilot/

@pidgeon777
Author

> Claude and Gemini also supported now in copilot.
> maybe do a parameter for picking from the multi model options?
>
> https://github.blog/news-insights/product-news/bringing-developer-choice-to-copilot/

Huge news for us Copilot subscribers. I truly hope support for all those models, with Copilot as the provider, will be added to the plugin.

@pidgeon777
Author

The CopilotChat.nvim plugin has recently expanded its capabilities by adding support for several new AI models: o1-preview, o1-mini, and claude-3.5-sonnet.

Having recently experimented with the claude-3.5-sonnet model for the first time, I can now understand why it has garnered such widespread appreciation. Its ability to adhere to system prompts demonstrates significantly higher accuracy compared to the gpt-4o model that I had been utilizing previously.

Microsoft's strategic decision to integrate these diverse AI models under the unified GitHub Copilot platform appears to be a well-calculated move. This integration offers subscribers seamless access to multiple state-of-the-art AI models through a single subscription service.

Looking forward, there's hope that avante.nvim will be updated to accommodate these new developments in the ecosystem.

@repparw

repparw commented Oct 30, 2024

Maybe change the issue title to note that multiple models are now available? It might draw more attention.

@pidgeon777 pidgeon777 changed the title feature: support GitHub Copilot Chat O1 model feature: support Anthropic’s Claude 3.5 Sonnet, Google’s Gemini 1.5 Pro, and OpenAI’s o1-preview and o1-mini Oct 30, 2024
@pidgeon777
Author

Done.

@msdone-lwt

Are these optional models controlled by copilot.lua?

@blurskye

@msdone-lwt I don't see a way to change the model from copilot.lua, for some reason.

@pidgeon777
Author

If you're referring to the model used by Copilot for code completion, it was originally based on OpenAI Codex:

OpenAI Codex Official Documentation

The Code Completion functionality received a major update in July 2023:

GitHub Blog Post about the Update

Since then, it has been powered by a new model developed through a collaboration between OpenAI, Microsoft Azure AI, and GitHub. This new model offers a 13% latency improvement compared to its predecessor.

While looking at the editor integrations, it's worth noting that the copilot.vim README still mentions that "GitHub Copilot uses OpenAI Codex to suggest code and entire functions in real-time right from your editor." However, it appears that users cannot manually change or select the underlying completion model used by editor plugins like copilot.vim (and consequently copilot.lua). The model selection and updates are managed entirely on GitHub's backend infrastructure.

For comprehensive information about GitHub Copilot, including features, pricing, and documentation, visit the official GitHub Copilot page:

GitHub Copilot Features

In any case, the name of the model currently powering the Code Completion functionality seems to remain undisclosed.

I've always been curious about this information. If anyone has more detailed insights about the current model, I would greatly appreciate if you could share them. Thank you!

@pidgeon777
Author

I have successfully enabled Claude 3.5 Sonnet support with the following configuration:

```lua
local my_opts = {
  provider = "copilot",
  copilot = {
    model = "claude-3.5-sonnet",
    -- max_tokens = 4096,
  },
}
```
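For context, a table like this would typically be handed to the plugin's setup function. A minimal sketch, assuming the standard `require("avante").setup()` entry point (plugin manager wiring omitted):

```lua
-- Sketch: passing the Copilot/Claude opts to avante.nvim.
-- The setup API may differ between plugin versions.
require("avante").setup({
  provider = "copilot",
  copilot = {
    model = "claude-3.5-sonnet",
    -- max_tokens left unset until the correct value is known (see below)
  },
})
```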

One open question is the optimal value for max_tokens. The model specifications table shows "max context" and "max output" values for each model, but how these relate to the max_tokens parameter is not yet clear:

Summary Table of GitHub Copilot Chat Supported Models

| Model Family | Model Name | Type | Max Context | Max Output | Tokenizer | Features |
|---|---|---|---|---|---|---|
| gpt-4-turbo | GPT 4 Turbo | chat | 128000 | 4096 | cl100k_base | tool_calls, parallel_tool_calls |
| o1-mini | o1-mini (Preview) | chat | 128000 | - | o200k_base | - |
| gpt-4 | GPT 4 | chat | 32768 | 4096 | cl100k_base | tool_calls |
| text-embedding-3-small | Embedding V3 small | embeddings | - | - | cl100k_base | dimensions |
| text-embedding-3-small | Embedding V3 small (Inference) | embeddings | - | - | cl100k_base | dimensions |
| claude-3.5-sonnet | Claude 3.5 Sonnet (Preview) | chat | 200000 | 4096 | o200k_base | - |
| gpt-3.5-turbo | GPT 3.5 Turbo | chat | 16384 | 4096 | cl100k_base | tool_calls |
| gpt-4o | GPT 4o | chat | 128000 | 4096 | o200k_base | tool_calls, parallel_tool_calls |
| gpt-4o | GPT 4o | chat | 128000 | 16384 | o200k_base | tool_calls, parallel_tool_calls |
| gpt-4o-mini | GPT 4o Mini | chat | 128000 | 4096 | o200k_base | tool_calls, parallel_tool_calls |
| text-embedding-ada-002 | Embedding V2 Ada | embeddings | - | - | cl100k_base | - |
| o1 | o1-preview (Preview) | chat | 128000 | - | o200k_base | - |

If anyone has insights on how these limits correlate, please share your understanding.
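One plausible reading (an assumption on my part, not confirmed anywhere in this thread) is that max_tokens caps the completion length, so it should not exceed the model's "max output" column. A small sketch of that interpretation; the mapping values are copied from the table above:

```lua
-- Hypothetical helper: derive max_tokens from the "Max Output" column.
-- Both the mapping and the interpretation of max_tokens are assumptions.
local max_output = {
  ["claude-3.5-sonnet"] = 4096,
  ["gpt-4o"] = 16384,
  ["gpt-4o-mini"] = 4096,
}

local function tokens_for(model)
  -- Fall back to 4096, the most common value in the table.
  return max_output[model] or 4096
end
```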

Additionally, the following tasks remain:

  • Add support for o1-preview and o1-mini models in avante.nvim
  • Test whether using model = "claude-3.5-sonnet" is sufficient or if additional adjustments are needed (e.g., in the system prompt)
  • Implement and test support for Google Gemini 1.5 Pro
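If support for the o1 models lands, the configuration would presumably mirror the Claude one. A hypothetical sketch — the `"o1-mini"` model id is an assumption taken from the table above, and avante.nvim does not support this yet at the time of writing:

```lua
-- Hypothetical: selecting an o1 model through the copilot provider.
local o1_opts = {
  provider = "copilot",
  copilot = {
    model = "o1-mini",
    -- The o1 rows report no fixed "max output", so max_tokens is left unset.
  },
}
```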

@msdone-lwt

cool

@msdone-lwt

What is the reason for this? Is it because of my network?
image

@pidgeon777
Author

I never had the issue you mentioned. In my case it is working great:

image

@pidgeon777
Author

@msdone-lwt do you happen to live in Eastern Asia, most likely in one of the following?

  • China
  • Philippines
  • Malaysia
  • Western Indonesia
  • Hong Kong
  • Singapore

@msdone-lwt

I am in China. If I turn on the network proxy, it doesn't respond at all. If I turn it off, it returns an error: `model access is not permitted per policy settings` 😥

@pidgeon777
Author

pidgeon777 commented Oct 31, 2024

I was curious because just for testing I played around with the new Claude 3.5 Sonnet integration with avante.nvim:

image

image

It seems to be working pretty well! 🙂

Joking apart, try this:

1. Install and configure the following (also great) plugin for Claude 3.5 Sonnet usage:

   https://github.com/CopilotC-Nvim/CopilotChat.nvim

2. Try to perform a couple of requests to the Claude 3.5 Sonnet model.

3. Switch back to avante.nvim using the config I posted before:

   ```lua
   local my_opts = {
     provider = "copilot",
     copilot = {
       model = "claude-3.5-sonnet",
       -- max_tokens = 4096,
     },
   }
   ```

4. Try to perform a request again and see if something changes.

@repparw

repparw commented Oct 31, 2024

> model access is not permitted per policy settings

Not an issue of avante, but of the GitHub rollout. I'm getting the same.

Claude 3.5 Sonnet Announcement and Rollout

@pidgeon777
Author

When I did my first test using https://github.com/CopilotC-Nvim/CopilotChat.nvim, I also received that error (I never had it with avante.nvim). A few minutes later, the permission was apparently granted.

These are the models I can currently use with my GitHub Copilot subscription:

image

@msdone-lwt

> model access is not permitted per policy settings
>
> not an issue of avante, but of github rollout. i'm getting the same.
>
> Claude 3.5 Sonnet Announcement and Rollout

There is no option in my settings to enable Claude.
image

@msdone-lwt

> model access is not permitted per policy settings
>
> not an issue of avante, but of github rollout. i'm getting the same.
>
> Claude 3.5 Sonnet Announcement and Rollout
>
> There is no option in my settings to enable Claude. image

There is no Claude option in the Copilot plugin in my VSCode either 😂 WTF
image

@msdone-lwt

@pidgeon777 Okay, I will try it tomorrow.

@repparw

repparw commented Oct 31, 2024 via email

@msdone-lwt

> When I did my first test, using https://github.com/CopilotC-Nvim/CopilotChat.nvim, I also received that error (I never had it with avante.nvim). Probably, few minutes later I got the permission granted.
>
> These are the models I can currently use with my GitHub Copilot subscription:
>
> image

@pidgeon777 How does your "model selector" work?

@msdone-lwt

> you have to wait for it to roll out to your account

I see, but I'm curious why you are in the rollout plan already. Is it random?

@repparw

repparw commented Oct 31, 2024 via email

@pidgeon777
Author

> > When I did my first test, using https://github.com/CopilotC-Nvim/CopilotChat.nvim, I also received that error (I never had it with avante.nvim). Probably, few minutes later I got the permission granted.
> > These are the models I can currently use with my GitHub Copilot subscription:
> > image
>
> @pidgeon777 How does your "model selector" work?

It's a very cool feature of https://github.com/CopilotC-Nvim/CopilotChat.nvim:

https://github.com/CopilotC-Nvim/CopilotChat.nvim?tab=readme-ov-file#commands

@msdone-lwt

@repparw Just now, I saw that I could use claude-3.5-sonnet.
image

@gam-phon

gam-phon commented Nov 1, 2024

@pidgeon777

Thanks, with this workaround it is enabled now.

@pidgeon777
Author

@msdone-lwt and @gam-phon, was it enabled after following the method I shared here?

#733 (comment)

@repparw did you also try? Did it work for you?

@msdone-lwt

I have already enabled Anthropic Claude 3.5 Sonnet in Copilot in my GitHub account, but I still don't see Claude when selecting a model in VSCode. Additionally, when I set the Copilot model to claude-3.5-sonnet in avante.nvim, it returns: Error: 'API request failed with status 403. Body: "access denied"'

@msdone-lwt

image
CopilotChat.nvim

@gam-phon

gam-phon commented Nov 1, 2024

> @msdone-lwt and @gam-phon, was it enabled after following the method I shared here?
>
> #733 (comment)
>
> @repparw did you also try? Did it work for you?

Yes, after following your method exactly, it was enabled immediately. Before that, I did not have access to Claude 3.5 Sonnet.

@repparw

repparw commented Nov 1, 2024

> @repparw did you also try? Did it work for you?

Policy still hasn't rolled out for me, sadly.

@repparw

repparw commented Nov 1, 2024

Rolled out now and working.
Sidenote: has anyone figured out what to do with max_tokens for Claude?

@msdone-lwt

Thanks, I can use Claude now too

@pidgeon777
Author

My idea originated from an interesting discovery I made: initially, the model availability issue was also present in the CopilotChat.nvim plugin (repository link). A subsequent commit addressed it, likely by modifying the model activation method, which enabled the use of the Claude 3.5 Sonnet model.

These enhancements haven't been implemented in avante.nvim yet. There's a need for support of o1-preview and o1-mini models, as well as future compatibility with Gemini 1.5 Pro once it becomes available.

Currently, our workflow involves using CopilotChat.nvim as a preliminary "unlocking" authentication step, enabling us to utilize all features in avante.nvim afterward.

For GitHub Copilot subscribers, I strongly recommend using both plugins in tandem. Based on my testing:

  • CopilotChat.nvim:

    • Seamlessly integrates with most GitHub Copilot features
    • Excels in code-focused conversations
    • Perfect for code analysis and discussions
  • avante.nvim:

    • Superior for actual code modifications
    • Well-structured modification implementation system
    • Potential for growth with workspace parsing capabilities
    • Could benefit from enhanced contextual awareness when suggesting changes

The hope is for avante.nvim to evolve by incorporating workspace parsing functionality, which would provide better context awareness when proposing code modifications, and to support the remaining models offered in the GitHub Copilot subscription. Finally, a model selector for requests would be great.

@repparw I'm also interested in knowing more about max_tokens and similar parameters. Which would be the optimal values, for example?

@repparw

repparw commented Nov 2, 2024

> Currently, our workflow involves using CopilotChat.nvim as a preliminary "unlocking" authentication step, enabling us to utilize all features in avante.nvim afterward.

Not true. The rollout just wasn't for all users at the same time. I didn't install CopilotChat; I just waited until the policy appeared and enabled it.

@pidgeon777
Author

Then you're suggesting it could be a coincidence that, after following the method, they were able to use avante.nvim with Claude 3.5 Sonnet? As far as I know, that could well be the case.

@repparw

repparw commented Nov 2, 2024

Yes. CopilotChat.nvim has nothing to do with it

@Timmy0o0

Timmy0o0 commented Nov 4, 2024

> Thanks, I can use Claude now too

I am also in China; using the Claude model is much slower than the GPT models.

@trongthanh

This config (from @pidgeon777 above) doesn't work with the current avante.nvim; I got `parse_curl_args` errors:

```lua
local my_opts = {
  provider = "copilot",
  copilot = {
    model = "claude-3.5-sonnet",
    -- max_tokens = 4096,
  },
}
```

Playing around, here's my current working config to enable Claude 3.5 Sonnet with a Copilot subscription (with the precondition that Claude 3.5 Sonnet is enabled for your account):

```lua
local opts = {
  provider = "copilotclaude",
  vendors = {
    copilotclaude = {
      __inherited_from = "copilot",
      api_key_name = "GITHUB_TOKEN",
      model = "claude-3.5-sonnet",
      max_tokens = 4096,
    },
  },
}
```
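For completeness, a config like this would typically live inside a plugin manager spec. A sketch assuming lazy.nvim (the spec shape is an assumption on my part, not taken from this thread — adjust to your plugin manager):

```lua
-- Hypothetical lazy.nvim spec wiring in the vendors-based config above.
-- Assumes lazy.nvim passes `opts` to avante.nvim's setup.
{
  "yetone/avante.nvim",
  opts = {
    provider = "copilotclaude",
    vendors = {
      copilotclaude = {
        __inherited_from = "copilot",
        api_key_name = "GITHUB_TOKEN",
        model = "claude-3.5-sonnet",
        max_tokens = 4096,
      },
    },
  },
}
```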
