This repository has been archived by the owner on Dec 1, 2024. It is now read-only.

Support for LLaMA #104

Closed
ustcwhy opened this issue Mar 31, 2023 · 1 comment
Comments

ustcwhy commented Mar 31, 2023

Thanks for your wonderful work!
Meta has released its newest LLM, LLaMA, and the checkpoints are available on Hugging Face [1]. zphang has published code for running LLaMA with the transformers repo [2]. For FlexGen, could I directly swap the OPT model for LLaMA to run inference on a local card? Do you have any plans to support LLaMA in the future?

[1] https://huggingface.co/decapoda-research
[2] huggingface/transformers#21955

@BarfingLemurs
Duplicate of #60.

@ustcwhy ustcwhy closed this as completed Apr 10, 2023