Replies: 1 comment
- Is this on a Mac? We don't support Qwen on Mac yet, only Llama-series models. We will add that support later.
  File "/ossfs/node_39022465/workspace/model.py", line 32, in __init__
    self.model = AirLLMLlama2("/roots/models/Qwen72B/")
  File "/opt/conda/lib/python3.8/site-packages/airllm/airllm.py", line 9, in __init__
    super(AirLLMLlama2, self).__init__(*args, **kwargs)
  File "/opt/conda/lib/python3.8/site-packages/airllm/airllm_base.py", line 104, in __init__
    self.model_local_path, self.checkpoint_path = find_or_create_local_splitted_path(model_local_path_or_repo_id,
  File "/opt/conda/lib/python3.8/site-packages/airllm/utils.py", line 351, in find_or_create_local_splitted_path
    return Path(model_local_path_or_repo_id), split_and_save_layers(model_local_path_or_repo_id, layer_shards_saving_path,
  File "/opt/conda/lib/python3.8/site-packages/airllm/utils.py", line 270, in split_and_save_layers
    if max(shards) > shard:
ValueError: max() arg is an empty sequence
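The `ValueError` comes from calling `max()` on an empty sequence: `split_and_save_layers` apparently found no checkpoint shards in the `Qwen72B` folder, presumably because the Llama2-specific loader does not recognize Qwen's checkpoint layout. A minimal sketch of the failure mode and a defensive pattern, assuming (hypothetically) that `shards` is a list of shard indices parsed from checkpoint filenames:

```python
# Hypothetical sketch of the failure in airllm's split_and_save_layers:
# `shards` holds shard indices parsed from checkpoint filenames. When the
# loader is pointed at a model layout it does not recognize, no filenames
# match, the list stays empty, and max([]) raises
# "ValueError: max() arg is an empty sequence".

def latest_shard(shards):
    """Return the highest shard index, or None when no shards were found."""
    # max() accepts a `default` for empty iterables, so the caller can
    # detect "no shards found" and report a clearer error than the bare
    # ValueError seen in the traceback.
    return max(shards, default=None)

print(latest_shard([1, 2, 3]))  # 3
print(latest_shard([]))         # None
```

With this pattern the caller can raise a targeted message such as "no checkpoint shards found; is this model format supported?" instead of crashing deep inside the utility function.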