fffiloni committed
Commit 6221263 · verified · Parent(s): d516aa4

update pytorch and flash-attn

Files changed (1): requirements.txt (+4 -4)
requirements.txt CHANGED
@@ -1,6 +1,6 @@
-torch==2.4.1
-torchvision==0.19.1
-torchaudio==2.4.1
+torch==2.6.0
+torchvision==0.21.0
+torchaudio==2.6.0
 opencv-python>=4.9.0.80
 diffusers>=0.31.0
 transformers>=4.49.0
@@ -25,4 +25,4 @@ numpy<2
 ninja
 psutil
 packaging
-https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
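The flash-attn wheel filename encodes the torch version it was built against (the `torch2.6` segment of the local version tag), which is why the torch pin and the wheel URL must be bumped together in this commit. A minimal sketch of a sanity check that parses the torch tag out of a wheel name; `wheel_torch_version` is a hypothetical helper, not part of flash-attn itself:

```python
import re

# Wheel name taken from the updated requirements.txt line above.
WHEEL = ("flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE"
         "-cp310-cp310-linux_x86_64.whl")

def wheel_torch_version(wheel_name: str) -> str:
    """Extract the torch major.minor version encoded in a flash-attn
    wheel filename (e.g. 'torch2.6' -> '2.6')."""
    match = re.search(r"torch(\d+\.\d+)", wheel_name)
    if match is None:
        raise ValueError(f"no torch tag found in {wheel_name!r}")
    return match.group(1)

# Compare against the pinned torch version; a mismatch here would
# surface later as an ABI/import error when flash-attn is loaded.
print(wheel_torch_version(WHEEL))  # -> 2.6
```

Such a check could run in CI so that a future torch bump that forgets the wheel URL (or vice versa) fails fast instead of breaking at import time.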