Installation Errors #6
Hey Tom, thank you for reporting these issues and sorry for the inconvenience. As you've experienced, this is a large project that integrates several complex libraries, some of which have specific and sometimes challenging installation requirements. Here's a breakdown of the problems and suggested fixes:
Sorry again for the problems and the missing hints in the installation doc. I haven't received much feedback about installation problems so far, so I'll update the guides and improve the install code as well as I can.
I'll probably also add an option to install Linguflex completely without deepspeed. I think it's the most challenging library to install, and the benefit it brings is fairly negligible; everything will work well enough without it.
If you can't get deepspeed installed (which, as I mentioned, would not be your fault, it's hard), you can disable it in the settings: open settings.yaml and, in the speech section, set the parameter coqui_use_deepspeed to False.
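For reference, the relevant part of settings.yaml should look roughly like this (the exact surrounding keys may differ between versions):

```yaml
speech:
  # disable DeepSpeed acceleration for the Coqui TTS engine
  coqui_use_deepspeed: False
```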
Hi, I compiled and installed a deepspeed wheel for a specific Python version and now get no errors when installing LinguFlex, see: microsoft/DeepSpeed#4729. Still have to check llama-cpp-python.
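For anyone else trying this, the build went roughly as follows (a sketch based on the linked DeepSpeed issue; the build flags and the exact wheel filename depend on your Python, CUDA and torch versions):

```bat
:: run inside the activated venv, from a DeepSpeed source checkout,
:: in an "x64 Native Tools" command prompt (AIO / sparse attention are not supported on Windows)
set DS_BUILD_AIO=0
set DS_BUILD_SPARSE_ATTN=0
python setup.py bdist_wheel
:: install the wheel that matches your interpreter (cp310 = Python 3.10)
pip install dist\deepspeed-0.11.2+cuda118-cp310-cp310-win_amd64.whl
```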
This actually installs deepspeed successfully, but because of the DEPRECATION warning the install script detects it as failed. The llama-cpp-python error was fixed by moving the files, thanks @KoljaB
DEPRECATION: omegaconf 2.0.6 has a non-standard dependency specifier PyYAML>=5.1.*. pip 24.1 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of omegaconf or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at pypa/pip#12063
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
realtimestt 0.1.15 requires scipy==1.12.0, but you have scipy 1.11.4 which is incompatible.
realtimestt 0.1.15 requires torch==2.2.2, but you have torch 2.1.2+cu118 which is incompatible.
realtimestt 0.1.15 requires torchaudio==2.2.2, but you have torchaudio 2.1.2+cu118 which is incompatible.
tts 0.22.0 requires librosa>=0.10.0, but you have librosa 0.9.1 which is incompatible.
tts 0.22.0 requires numpy==1.22.0; python_version <= "3.10", but you have numpy 1.26.4 which is incompatible.
stream2sentence 0.2.3 requires emoji==2.8.0, but you have emoji 2.10.1 which is incompatible.
Successfully installed torch-2.1.2+cu118 torchaudio-2.1.2+cu118
Successfully installed PyTorch and Torchaudio for CUDA 11.8.
Installing required deepspeed ...
Failed to install https://github.com/daswer123/deepspeed-windows/releases/download/11.2/deepspeed-0.11.2+cuda118-cp310-cp310-win_amd64.whl. Error: DEPRECATION: omegaconf 2.0.6 has a non-standard dependency specifier PyYAML>=5.1.*. pip 24.1 will enforce this behaviour change. A possible replacement is to upgrade to a newer version of omegaconf or contact the author to suggest that they release a version with a conforming dependency specifiers. Discussion can be found at pypa/pip#12063
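Despite the "Failed to install" line above, the wheel is usually already present in the environment at this point; a quick import check (independent of the install script's error detection) confirms it:

```bat
python -c "import deepspeed; print(deepspeed.__version__)"
```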
-- Configuring incomplete, errors occurred!
*** CMake configuration failed
error: subprocess-exited-with-error
× Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> See above for output.
note: This error originates from a subprocess, and is likely not a problem with pip.
full command: 'C:\Users\DEVO\AI_C\Linguflex\test_env\Scripts\python.exe' 'C:\Users\DEVO\AI_C\Linguflex\test_env\lib\site-packages\pip\_vendor\pyproject_hooks\_in_process\_in_process.py' build_wheel 'C:\Users\DEVO\AppData\Local\Temp\tmpbkvrmjzt'
cwd: C:\Users\DEVO\AppData\Local\Temp\pip-install-v2c7qa2p\llama-cpp-python_6a239ebbad884300a54823bc225c4ef3
Building wheel for llama-cpp-python (pyproject.toml) ... error
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
Failed to install llama-cpp-python. Error: Command '['C:\Users\DEVO\AI_C\Linguflex\test_env\Scripts\python.exe', '-m', 'pip', 'install', 'llama-cpp-python', '--force-reinstall', '--upgrade', '--no-cache-dir', '--verbose']' returned non-zero exit status 1.
You may need to copy MSBuildExtensions files for CUDA 11.8.
Copy all four MSBuildExtensions files from:
C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\extras\visual_studio_integration\MSBuildExtensions
to
C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\BuildCustomizations
before restarting the installation script or manually executing the following command:
pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir --verbose
Do you want to continue without a verified installation of llama-cpp-python? (yes/no):
Do you want to try anyway? (yes/no):
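The copy step suggested above can be done from an elevated command prompt, roughly like this (assuming the default CUDA 11.8 and Visual Studio 2022 Build Tools locations shown in the script output; adjust the paths for other editions or versions):

```bat
:: run as administrator: copy the CUDA MSBuild integration files, then retry the build
xcopy "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.8\extras\visual_studio_integration\MSBuildExtensions\*" "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Microsoft\VC\v170\BuildCustomizations\" /Y
pip install llama-cpp-python --force-reinstall --upgrade --no-cache-dir --verbose
```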
Setting numpy version ...
Successfully installed numpy==1.23.5
Traceback (most recent call last):
File "C:\Users\DEVO\AI_C\Linguflex\download_models.py", line 6, in <module>
from huggingface_hub import hf_hub_download
ModuleNotFoundError: No module named 'huggingface_hub'
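The missing module can be installed into the same virtual environment before re-running the download script (a straightforward fix for the traceback above, run from the Linguflex directory):

```bat
test_env\Scripts\python.exe -m pip install huggingface_hub
test_env\Scripts\python.exe download_models.py
```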