Commit 9d0a236
Parent: edcba0d

dang

requirements.txt CHANGED (+15 -14)
@@ -12,23 +12,24 @@
 #torchcodec
 # flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.8cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 #
-#
+# if we revert to torch 2.7, we get another error:
+# Missing key in checkpoint state_dict: optimizer.param_groups.scale_shift_table.decoupled_weight_decay.
 #
-torch==2.7.1
-torchvision==0.22.1
-torchdata==0.11.0
-torchao==0.12.0
-torchcodec==0.5.0
-flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.7cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
+#torch==2.7.1
+#torchvision==0.22.1
+#torchdata==0.11.0
+#torchao==0.12.0
+#torchcodec==0.5.0
+#flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.7cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 #
-#
+# so in the end, we have to revert to the 2.6 stack:
 #
-
-
-
-
-
-
+torch==2.6.0
+torchvision==0.21.0
+torchdata==0.10.1
+torchao==0.9.0
+torchcodec==0.4.0
+flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
 
 # something broke in Transformers > 4.55.4
 transformers==4.55.4
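
The missing-key error quoted in the diff is consistent with the Adam/AdamW refactor that, as far as I can tell, landed around torch 2.7, where decoupled_weight_decay became part of each optimizer param_group; an optimizer checkpoint written by one version then fails strict key checks under the other. A minimal sketch to see which keys the local torch version actually writes (the Linear model is just a stand-in; the Space's real model and resume logic are not shown in this commit):

import torch

# Stand-in model and optimizer, only to inspect the checkpoint layout.
model = torch.nn.Linear(4, 4)
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Assumption based on the error message above: under torch 2.7.x the
# param_groups gain a "decoupled_weight_decay" key that 2.6.x lacks, so
# checkpoints saved by one version fail strict loading in the other.
print(torch.__version__)
print(sorted(opt.state_dict()["param_groups"][0].keys()))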
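The flash-attn pin has to agree with the rest of the stack because the wheel filename encodes four compatibility tags: CUDA major version (cu12), torch minor version (torch2.6), C++ ABI flag (cxx11abiFALSE), and CPython tag (cp310). That is why the URL changes from cu12torch2.7 to cu12torch2.6 alongside the torch downgrade. A small sketch that derives the matching tags from the local environment; it assumes a CUDA build of torch and reads the private torch._C._GLIBCXX_USE_CXX11_ABI flag, so treat it as illustrative:

import sys
import torch

# Recover the tags a prebuilt flash-attn wheel filename must match,
# e.g. "cu12torch2.6cxx11abiFALSE" plus "cp310".
cuda_tag = "cu" + (torch.version.cuda or "?").split(".")[0]       # e.g. cu12
torch_tag = "torch" + ".".join(torch.__version__.split(".")[:2])  # e.g. torch2.6
abi_tag = "cxx11abi" + str(torch._C._GLIBCXX_USE_CXX11_ABI).upper()
py_tag = "cp{}{}".format(*sys.version_info[:2])                   # e.g. cp310

print(cuda_tag + torch_tag + abi_tag, py_tag)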
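And a quick post-install sanity check, assuming the pins above resolved cleanly; the expected versions are taken straight from this commit's requirements.txt:

import flash_attn
import torch
import torchvision
import transformers

# torch builds may carry a local suffix such as "+cu124", hence startswith.
assert torch.__version__.startswith("2.6.0"), torch.__version__
assert torchvision.__version__.startswith("0.21.0"), torchvision.__version__
assert transformers.__version__ == "4.55.4", transformers.__version__
print("pinned stack imports cleanly, flash-attn", flash_attn.__version__)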