
No module named 'xtuner.parallel' #541

Closed
828Tina opened this issue Apr 2, 2024 · 6 comments


828Tina commented Apr 2, 2024

I installed the environment following the documented steps:
conda create --name xtuner_lxy python=3.10 -y
pip install -U xtuner

The xtuner command works fine for checking the custom dataset:
xtuner check-custom-dataset /home/xtuner/internlm2_7b_qlora_alpaca_e3_copy.py

[screenshot of the successful dataset check]

Fine-tuning fails:
xtuner train internlm2_7b_qlora_alpaca_e3_copy.py --deepspeed deepspeed_zero2

[2024-04-02 11:50:21,749] [INFO] [real_accelerator.py:191:get_accelerator] Setting ds_accelerator to cuda (auto detect)
04/02 11:50:24 - mmengine - WARNING - WARNING: command error: 'No module named 'xtuner.parallel''!
04/02 11:50:24 - mmengine - WARNING -
Arguments received: ['xtuner', 'train', 'internlm2_7b_qlora_alpaca_e3_copy.py', '--deepspeed', 'deepspeed_zero2']. xtuner commands use the following syntax:

      xtuner MODE MODE_ARGS ARGS

      Where   MODE (required) is one of ('list-cfg', 'copy-cfg', 'log-dataset', 'check-custom-dataset', 'train', 'test', 'chat', 'convert', 'preprocess', 'mmbench', 'eval_refcoco')
              MODE_ARG (optional) is the argument for specific mode
              ARGS (optional) are the arguments for specific command

  Some usages for xtuner commands: (See more by using -h for specific command!)
LZHgrla (Collaborator) commented Apr 2, 2024

Hi @828Tina,
Sorry about that! This is a known issue in xtuner v0.1.16, and we are working on it.

Until the bug is fixed, you can downgrade xtuner to 0.1.15, or install xtuner from source.

# Approach 1: downgrade
pip install xtuner==0.1.15
# Approach 2: install from source
git clone https://github.com/InternLM/xtuner.git
cd xtuner
pip install -e .
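Before picking a remedy, the affected release can also be detected programmatically. A minimal sketch (the helper name is hypothetical, and it assumes v0.1.16 is the only broken release, as the maintainer states above):

```python
def is_affected_release(version: str) -> bool:
    """Return True if this xtuner version is the one missing the
    'xtuner.parallel' module (v0.1.16, per the maintainer)."""
    parts = tuple(int(p) for p in version.split("."))
    return parts == (0, 1, 16)
```

In practice you would pass it the string from `xtuner.__version__` (or `pip show xtuner`) and downgrade or install from source only when it returns True.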


ILG2021 commented Apr 3, 2024

0.1.15 has the same problem.

LZHgrla (Collaborator) commented Apr 3, 2024

@ILG2021
Are you using a config copied from 0.1.16? If so, you need to re-copy the config under version 0.1.15.

LZHgrla (Collaborator) commented Apr 3, 2024

@ILG2021
@828Tina
Installing xtuner>=0.1.17 solves this issue.

pip install 'xtuner>=0.1.17'
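To confirm the upgrade actually landed, one can compare the installed version string against the 0.1.17 floor. A simplified sketch using tuple comparison (real projects typically use `packaging.version.parse`, which also handles pre-release tags; this sketch assumes plain dotted numeric versions):

```python
def meets_minimum(installed: str, minimum: str = "0.1.17") -> bool:
    """Check installed >= minimum for simple X.Y.Z version strings."""
    def as_tuple(v: str):
        return tuple(int(p) for p in v.split("."))
    # Tuples compare element-wise, matching numeric version ordering
    return as_tuple(installed) >= as_tuple(minimum)
```

For example, `meets_minimum("0.1.16")` is False, so the environment would still hit the missing-module error.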


ILG2021 commented Apr 3, 2024

OK, thanks.
I'd also like to ask whether xtuner can use QLoRA on Windows. bitsandbytes does not support Windows, and a plain pip install bitsandbytes installs a build without GPU support; I spent a whole afternoon on this without solving it.

LZHgrla (Collaborator) commented Apr 3, 2024

@ILG2021 That's right, the official bitsandbytes does not support Windows. You can follow this issue: bitsandbytes-foundation/bitsandbytes#30

Someone there solved the problem using this repo, though we have not verified it ourselves:
https://github.com/jllllll/bitsandbytes-windows-webui

LZHgrla closed this as completed Apr 11, 2024