Hey there,

Thank you so much for putting this together. I'm running into issues both with downloading the model from pan.baidu and with using it through Hugging Face's transformers API.

On Hugging Face (transformers version 4.37.2):
>>> model = AutoModelForCausalLM.from_pretrained("LinkSoul/Chinese-LLaVA-Baichuan")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/amith/miniconda3/envs/viz_features/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 569, in from_pretrained
raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers.models.llava.configuration_llava.LlavaConfig'> for this kind of AutoModel: AutoModelForCausalLM.
Model type should be one of BartConfig, BertConfig, BertGenerationConfig, BigBirdConfig, BigBirdPegasusConfig, BioGptConfig, BlenderbotConfig, BlenderbotSmallConfig, BloomConfig, CamembertConfig, LlamaConfig, CodeGenConfig, CpmAntConfig, CTRLConfig, Data2VecTextConfig, ElectraConfig, ErnieConfig, FalconConfig, FuyuConfig, GitConfig, GPT2Config, GPT2Config, GPTBigCodeConfig, GPTNeoConfig, GPTNeoXConfig, GPTNeoXJapaneseConfig, GPTJConfig, LlamaConfig, MarianConfig, MBartConfig, MegaConfig, MegatronBertConfig, MistralConfig, MixtralConfig, MptConfig, MusicgenConfig, MvpConfig, OpenLlamaConfig, OpenAIGPTConfig, OPTConfig, PegasusConfig, PersimmonConfig, PhiConfig, PLBartConfig, ProphetNetConfig, QDQBertConfig, Qwen2Config, ReformerConfig, RemBertConfig, RobertaConfig, RobertaPreLayerNormConfig, RoCBertConfig, RoFormerConfig, RwkvConfig, Speech2Text2Config, TransfoXLConfig, TrOCRConfig, WhisperConfig, XGLMConfig, XLMConfig, XLMProphetNetConfig, XLMRobertaConfig, XLMRobertaXLConfig, XLNetConfig, XmodConfig.
>>> quit()
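For reference, the error is saying that AutoModelForCausalLM has no mapping for LlavaConfig; transformers routes LLaVA checkpoints through its vision-language classes instead. Below is a minimal sketch of the alternative load path I'd expect the library to want, assuming the published config/weights actually follow the stock transformers LLaVA layout, which may not hold for this Baichuan-based checkpoint:

```python
from transformers import AutoProcessor, LlavaForConditionalGeneration

repo_id = "LinkSoul/Chinese-LLaVA-Baichuan"

# LlavaConfig is not in the AutoModelForCausalLM mapping; LLaVA checkpoints
# are loaded through the vision-language classes instead.
# Assumption: the published weights match transformers' built-in LLaVA
# implementation, which may not be true for this Baichuan-based model.
processor = AutoProcessor.from_pretrained(repo_id)
model = LlavaForConditionalGeneration.from_pretrained(repo_id)
```

If that also fails, I suspect the checkpoint can only be loaded with the custom code in this repo.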
I'm happy to use your repo directly to run the model, but for some reason the Baidu download isn't working -- I'm unable to install the client required for the download. Would it be possible to share a different download link for your two models?
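In the meantime, since the checkpoint is also hosted on the Hugging Face Hub, one possible workaround for the Baidu issue is to pull the files directly with huggingface_hub and point this repo's own loading code at the local copy. A rough sketch, with an example local_dir:

```python
from huggingface_hub import snapshot_download

# Download the full model repo from the Hugging Face Hub instead of pan.baidu.
# local_dir below is just an example path.
local_path = snapshot_download(
    repo_id="LinkSoul/Chinese-LLaVA-Baichuan",
    local_dir="./Chinese-LLaVA-Baichuan",
)
print("Downloaded to:", local_path)
```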
Thank you!
Update: I was able to get your codebase and the published model weights working -- still no luck with Hugging Face, so I'm leaving this issue open. Feel free to close it, though.