
[SOLVED] split_torch_state_dict_into_shards #446

Open
freeload101 opened this issue Jul 12, 2024 · 3 comments
Comments

@freeload101

bash -x play.sh
+ '[' '!' -f runtime/envs/koboldai/bin/python ']'
+ bin/micromamba run -r runtime -n koboldai python aiserver.py
Traceback (most recent call last):
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1076, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/modeling_utils.py", line 78, in <module>
    from accelerate import __version__ as accelerate_version
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/accelerate/__init__.py", line 16, in <module>
    from .accelerator import Accelerator
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/accelerate/accelerator.py", line 34, in <module>
    from huggingface_hub import split_torch_state_dict_into_shards
ImportError: cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/huggingface_hub/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "aiserver.py", line 58, in <module>
    from utils import debounce
  File "/opt/koboldai-client/utils.py", line 12, in <module>
    from transformers import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1066, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1078, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
cannot import name 'split_torch_state_dict_into_shards' from 'huggingface_hub' (/opt/koboldai-client/runtime/envs/koboldai/lib/python3.8/site-packages/huggingface_hub/__init__.py)

FIX:
./bin/micromamba run -r runtime -n koboldai pip install --upgrade huggingface_hub
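The root cause is that the installed huggingface_hub predates the release that added split_torch_state_dict_into_shards, while the installed accelerate imports it unconditionally. After running the upgrade, you can verify the fix from inside the same environment. A minimal check, as a sketch (the helper name has_shard_helper is made up for illustration):

```python
import importlib.util

def has_shard_helper():
    """Return True if the installed huggingface_hub (if any) exposes
    split_torch_state_dict_into_shards, the symbol accelerate imports."""
    if importlib.util.find_spec("huggingface_hub") is None:
        return False  # package not installed at all
    import huggingface_hub
    return hasattr(huggingface_hub, "split_torch_state_dict_into_shards")

print(has_shard_helper())
```

If this prints False after the upgrade, the upgrade likely ran against a different Python environment than the one aiserver.py uses.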

@crspangenberg

Where should this command be run?

@GeorgeEBeresford

> Where should this command be run?

I'm not sure about the command he mentioned; I went down a similar path:

  1. Open a command prompt
  2. Navigate to the directory where KoboldAI is installed via cd (e.g. cd "C:\Program Files (x86)\KoboldAI")
  3. Run miniconda3\condabin\activate
    That opens a command prompt with the miniconda context
  4. Type pip install --upgrade huggingface_hub

This fixed the issue for me.
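For context on why an unpinned upgrade is the fix: the missing function only exists from a newer huggingface_hub release onward (the 0.22.0 threshold below is an assumption, and parse/needs_upgrade are illustrative names), so any environment holding an older pin fails the import. A sketch of the version comparison:

```python
def parse(version):
    """Turn a 'major.minor.patch' string into a comparable int tuple."""
    return tuple(int(part) for part in version.split(".")[:3])

def needs_upgrade(installed, required="0.22.0"):
    """True if the installed version predates the release that (assumed
    here) first shipped split_torch_state_dict_into_shards."""
    return parse(installed) < parse(required)

print(needs_upgrade("0.16.4"))  # → True: too old, upgrade needed
print(needs_upgrade("0.24.0"))  # → False: new enough
```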

@Mat4Shell

Mat4Shell commented Aug 12, 2024

Or you can try this:

  1. Launch commandline.bat or commandline.sh (depending on your OS)
  2. Run pip install --upgrade huggingface_hub
  3. Relaunch KoboldAI

I tried this one and it worked for me, since I don't have a condabin folder in my miniconda3 folder.

@henk717 henk717 mentioned this issue Aug 21, 2024