
GPTBigCodeForCausalLM LoRA support #3011

Closed
tybritten opened this issue Feb 23, 2024 · 3 comments · Fixed by #3949
Comments

@tybritten

Would love to see LoRA support for GPTBigCodeForCausalLM. We have fine tuned adapters per-language.

@simon-mo (Collaborator)

GPTBigCodeForCausalLM is already supported as a model, but it still needs the LoRA weight-loading logic so it knows how to map adapter weights onto its layers. Do you have a public LoRA model that a contributor can test with?

For contributors: the model code is here https://github.com/vllm-project/vllm/blob/57f044945f25d90d1b434014b2719ba6b06fdc44/vllm/model_executor/models/gpt_bigcode.py, and the LoRA loading logic in llama should be a good template, e.g.:

lora_config: Optional[LoRAConfig] = None,
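For anyone picking this up, here is a rough, untested sketch of the shape of the change, mirroring how llama.py exposes its LoRA hooks. The class attributes (packed_modules_mapping, supported_lora_modules, embedding_modules, embedding_padding_modules) follow vLLM's LoRA integration pattern; the GPT-BigCode module names (c_attn, c_proj, c_fc, wte, lm_head) are assumptions based on the usual checkpoint layout, so verify them against gpt_bigcode.py before relying on this.

```python
# Sketch only: possible additions to vllm/model_executor/models/gpt_bigcode.py,
# modeled on the LoRA hooks in llama.py. Module names below are assumptions and
# should be checked against the actual GPTBigCodeModel implementation.
from typing import Optional

from torch import nn
from transformers import GPTBigCodeConfig

from vllm.config import LoRAConfig


class GPTBigCodeForCausalLM(nn.Module):
    # GPT-BigCode fuses Q/K/V into a single c_attn projection, so the packed
    # mapping is just the fused module mapped to itself.
    packed_modules_mapping = {"c_attn": ["c_attn"]}

    # Layers that LoRA adapters are allowed to target (assumed names).
    supported_lora_modules = ["c_fc", "c_proj", "wte", "c_attn", "lm_head"]

    # Embedding modules need special handling if an adapter adds extra vocab.
    embedding_modules = {
        "wte": "input_embeddings",
        "lm_head": "output_embeddings",
    }
    embedding_padding_modules = ["lm_head"]

    def __init__(
        self,
        config: GPTBigCodeConfig,
        linear_method=None,
        lora_config: Optional[LoRAConfig] = None,  # new: thread the LoRA config in
    ) -> None:
        super().__init__()
        self.config = config
        self.lora_config = lora_config

        # The rest of the constructor (transformer stack, lm_head, sampler) would
        # stay as in the existing gpt_bigcode.py. The only LoRA-specific change is
        # growing the vocab size when the adapter ships extra token embeddings.
        self.unpadded_vocab_size = config.vocab_size
        if lora_config:
            self.unpadded_vocab_size += lora_config.lora_extra_vocab_size
```

Once the model exposes these attributes and accepts lora_config, the existing LoRA machinery should be able to map adapter weights onto it, which is the pattern llama.py follows.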

@simon-mo added the good first issue label Feb 23, 2024
@tybritten (Author)

Sure thing, I just uploaded one:
https://huggingface.co/tybritten/lora-for-starcoder

@simon-mo removed the good first issue label Feb 27, 2024
@raywanb (Contributor) commented Mar 13, 2024

btw what language is this adapter for? I'm trying to test it.
