Work-around Dao-AILab/flash-attention#453 when installing Flash Attention
astefanutti committed Jun 26, 2024
1 parent 7f55a1a commit 4a13ddf
Showing 1 changed file with 3 additions and 2 deletions.
examples/ray-finetune-llm-deepspeed/requirements.txt
@@ -3,8 +3,9 @@ awscliv2==2.3.0
 datasets==2.19.2
 deepspeed==0.14.3
 # Flash Attention 2 requires PyTorch to be installed first
-# See https://github.com/Dao-AILab/flash-attention/issues/246
-flash-attn==2.5.9.post1
+# See https://github.com/Dao-AILab/flash-attention/issues/453
+#flash-attn==2.5.9.post1
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.9.post1/flash_attn-2.5.9.post1+cu122torch2.3cxx11abiFALSE-cp311-cp311-linux_x86_64.whl
 peft==0.11.1
 ray[train]==2.23.0
 torch==2.3.1
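The change swaps the PyPI source distribution for a prebuilt release wheel: per the comment kept in the file, flash-attn needs PyTorch present at build time, while the wheel (built for CUDA 12.2, PyTorch 2.3, and CPython 3.11 on Linux x86_64, as its filename tag indicates) installs without compiling anything. Below is a minimal sketch of how the pinned file would typically be consumed; the torch-first step is an assumption carried over from the in-file comment, not something this commit prescribes:

# Install PyTorch first so it is importable if any package still builds from source
pip install torch==2.3.1
# Install the remaining pins, including the prebuilt flash-attn wheel,
# which skips the source build that fails when torch is unavailable
pip install -r examples/ray-finetune-llm-deepspeed/requirements.txt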
