Unable to install flash-attn due to SSL error #756

Closed
tranlm opened this issue Jan 6, 2024 · 5 comments

tranlm commented Jan 6, 2024

Hi,

I'm on an EC2 instance and am trying to install flash-attn, but I keep running into an SSL error. Do you know what's going on? I have openssl-1.1.1l installed.

Here's the output:

[ec2-user@ip-xxx-xx-xx-x ~]$ pip3.10 install flash-attn --no-build-isolation
Defaulting to user installation because normal site-packages is not writeable
Collecting flash-attn
  Using cached flash_attn-2.4.2.tar.gz (2.4 MB)
  Preparing metadata (setup.py) ... done
Requirement already satisfied: torch in ./.local/lib/python3.10/site-packages (from flash-attn) (2.1.2)
Collecting einops (from flash-attn)
  Using cached einops-0.7.0-py3-none-any.whl.metadata (13 kB)
Requirement already satisfied: packaging in ./.local/lib/python3.10/site-packages (from flash-attn) (23.2)
Collecting ninja (from flash-attn)
  Using cached ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl.metadata (5.3 kB)
Requirement already satisfied: filelock in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (3.13.1)
Requirement already satisfied: typing-extensions in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (4.9.0)
Requirement already satisfied: sympy in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (1.12)
Requirement already satisfied: networkx in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (3.2.1)
Requirement already satisfied: jinja2 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (3.1.2)
Requirement already satisfied: fsspec in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (2023.12.2)
Requirement already satisfied: nvidia-cuda-nvrtc-cu12==12.1.105 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (12.1.105)
Requirement already satisfied: nvidia-cuda-runtime-cu12==12.1.105 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (12.1.105)
Requirement already satisfied: nvidia-cuda-cupti-cu12==12.1.105 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (12.1.105)
Requirement already satisfied: nvidia-cudnn-cu12==8.9.2.26 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (8.9.2.26)
Requirement already satisfied: nvidia-cublas-cu12==12.1.3.1 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (12.1.3.1)
Requirement already satisfied: nvidia-cufft-cu12==11.0.2.54 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (11.0.2.54)
Requirement already satisfied: nvidia-curand-cu12==10.3.2.106 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (10.3.2.106)
Requirement already satisfied: nvidia-cusolver-cu12==11.4.5.107 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (11.4.5.107)
Requirement already satisfied: nvidia-cusparse-cu12==12.1.0.106 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (12.1.0.106)
Requirement already satisfied: nvidia-nccl-cu12==2.18.1 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (2.18.1)
Requirement already satisfied: nvidia-nvtx-cu12==12.1.105 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (12.1.105)
Requirement already satisfied: triton==2.1.0 in ./.local/lib/python3.10/site-packages (from torch->flash-attn) (2.1.0)
Requirement already satisfied: nvidia-nvjitlink-cu12 in ./.local/lib/python3.10/site-packages (from nvidia-cusolver-cu12==11.4.5.107->torch->flash-attn) (12.3.101)
Requirement already satisfied: MarkupSafe>=2.0 in ./.local/lib/python3.10/site-packages (from jinja2->torch->flash-attn) (2.1.3)
Requirement already satisfied: mpmath>=0.19 in ./.local/lib/python3.10/site-packages (from sympy->torch->flash-attn) (1.3.0)
Using cached einops-0.7.0-py3-none-any.whl (44 kB)
Using cached ninja-1.11.1.1-py2.py3-none-manylinux1_x86_64.manylinux_2_5_x86_64.whl (307 kB)
Building wheels for collected packages: flash-attn
  Building wheel for flash-attn (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [9 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
      
      
      torch.__version__  = 2.1.2+cu121
      
      
      running bdist_wheel
      Guessing wheel URL:  https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
      error: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)>
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects
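
The failing step is flash-attn's setup.py: it guesses a prebuilt wheel URL and downloads it with Python's urllib, and the certificate check on that HTTPS request is what fails. A sketch of a way to sidestep the download entirely is to force a local compile via the FLASH_ATTENTION_FORCE_BUILD switch mentioned in the project README (worth confirming against the setup.py of the version being installed; a local build needs nvcc and takes much longer):

    # Skip the prebuilt-wheel download and compile the CUDA extension locally
    FLASH_ATTENTION_FORCE_BUILD=TRUE pip3.10 install flash-attn --no-build-isolation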
tridao (Member) commented Jan 6, 2024

Can you download the wheel manually, just to check that the networking is fine?
https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
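
For example, something along these lines from the instance would show whether the download works outside of pip (assuming curl is available; wget behaves the same). If the fetch succeeds, installing the downloaded file directly also avoids the in-setup.py download:

    # Fetch the wheel directly, then install the local file
    curl -LO https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
    pip3.10 install ./flash_attn-2.4.2+cu122torch2.1cxx11abiFALSE-cp310-cp310-linux_x86_64.whl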

tranlm commented Jan 10, 2024

Thanks Tri. After some debugging, I tracked it down to a Poetry issue (I think the same one as in this issue). I'll go ahead and close it.

tranlm closed this as completed Jan 10, 2024
Tejaswgupta commented

@tranlm how did you solve it exactly? I'm having a similar issue.

  Building wheel for flash-attn (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [9 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
      
      
      torch.__version__  = 2.1.2+cu121
      
      
      running bdist_wheel
      Guessing wheel URL:  https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu122torch2.1cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
      error: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1002)>
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Could not build wheels for flash-attn, which is required to install pyproject.toml-based projects

tranlm commented Jan 14, 2024 via email


muhark commented Apr 23, 2024

Just to mention it in case others come across this: I got this error in a mamba (as in the conda alternative) environment that, as far as I know, is not using Poetry.
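
A generic workaround for the "unable to get local issuer certificate" failure in environments like this (a sketch, not something verified in this thread) is to point Python's urllib at a CA bundle the environment trusts, for example certifi's bundle if that package is installed, and then rerun the install:

    # Point OpenSSL/Python at certifi's CA bundle for this shell session
    export SSL_CERT_FILE="$(python -m certifi)"
    pip install flash-attn --no-build-isolation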
