
OSError: Incorrect path_or_model_id: 'F:\TOOL\MagicQuill\models\llava-v1.5-7b-finetune-clean' #93

Closed
Firman2024-ffs opened this issue Dec 20, 2024 · 1 comment

Comments

@Firman2024-ffs

I get the error below. How can I solve it?

(MagicQuill) F:\TOOL\MagicQuill>python gradio_run.py
Total VRAM 4096 MB, total RAM 16234 MB
pytorch version: 2.1.2+cu118
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce GTX 1050 Ti : native
Using pytorch cross attention
['F:\TOOL\MagicQuill', 'C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\python310.zip', 'C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\DLLs', 'C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib', 'C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill', 'C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages', 'editable.llava-1.2.2.post1.finder.path_hook', 'F:\TOOL\MagicQuill\MagicQuill']
Traceback (most recent call last):
File "C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages\transformers\utils\hub.py", line 385, in cached_file
resolved_file = hf_hub_download(
File "C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages\huggingface_hub\utils\_validators.py", line 106, in inner_fn
validate_repo_id(arg_value)
File "C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages\huggingface_hub\utils\_validators.py", line 160, in validate_repo_id
raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: 'F:\TOOL\MagicQuill\models\llava-v1.5-7b-finetune-clean'.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "F:\TOOL\MagicQuill\gradio_run.py", line 24, in <module>
llavaModel = LLaVAModel()
File "F:\TOOL\MagicQuill\MagicQuill\llava_new.py", line 26, in __init__
self.tokenizer, self.model, self.image_processor, self.context_len = load_pretrained_model(
File "F:\TOOL\MagicQuill\MagicQuill\LLaVA\llava\model\builder.py", line 116, in load_pretrained_model
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
File "C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 758, in from_pretrained
tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
File "C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 590, in get_tokenizer_config
resolved_config_file = cached_file(
File "C:\Users\Fanny Firman Syah\.conda\envs\MagicQuill\lib\site-packages\transformers\utils\hub.py", line 450, in cached_file
raise EnvironmentError(
OSError: Incorrect path_or_model_id: 'F:\TOOL\MagicQuill\models\llava-v1.5-7b-finetune-clean'. Please provide either the path to a local folder or the repo_id of a model on the Hub.
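For context: `from_pretrained` treats its argument as a local folder first, and only if no such folder exists does it fall back to interpreting the string as a Hub repo id, which then fails validation because repo ids cannot contain backslashes. So this traceback usually means the model folder is missing or incomplete. A minimal sketch (the `expected` file list and `check_model_folder` helper are illustrative, not part of MagicQuill):

```python
import os

def check_model_folder(model_path, expected=("config.json", "tokenizer_config.json")):
    """Return a list of problems with a local model folder; empty if it looks usable.

    `expected` lists files a typical Hugging Face checkpoint folder contains;
    the exact set varies by model, so treat this as a heuristic.
    """
    if not os.path.isdir(model_path):
        return ["folder does not exist: " + model_path]
    return ["missing file: " + name for name in sorted(expected)
            if not os.path.isfile(os.path.join(model_path, name))]

# Check the path from the traceback (raw string so backslashes survive):
for problem in check_model_folder(r"F:\TOOL\MagicQuill\models\llava-v1.5-7b-finetune-clean"):
    print(problem)
```

If this reports the folder as missing, re-download the checkpoint into `models\llava-v1.5-7b-finetune-clean` before running `gradio_run.py` again.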

@zliucz
Member

zliucz commented Dec 25, 2024

Hi, please check issue #54. A screenshot of your folder at F:\TOOL\MagicQuill\models\llava-v1.5-7b-finetune-clean would help us find the problem.
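As a text alternative to a screenshot, the folder contents can be listed and pasted into the issue. A small sketch (the `list_folder` helper is hypothetical; the path is taken from the traceback above):

```python
import os

def list_folder(path):
    """Return sorted (name, size) pairs for entries in `path`, or None if it is missing."""
    if not os.path.isdir(path):
        return None
    return [(name, os.path.getsize(os.path.join(path, name)))
            for name in sorted(os.listdir(path))]

# Adjust to your install location if it differs.
listing = list_folder(r"F:\TOOL\MagicQuill\models\llava-v1.5-7b-finetune-clean")
if listing is None:
    print("Folder not found")
else:
    for name, size in listing:
        print(f"{size:>12}  {name}")
```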

@zliucz zliucz closed this as completed Jan 8, 2025