# Installation Guide

## Install with pip

```bash
pip install .
pip install ".[dev]"  # also installs the development tools
```
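
If you plan to modify the source, an editable install keeps the installed package in sync with your working tree. A minimal sketch, assuming the project's build backend supports editable installs:

```bash
# editable install with the dev extras; source changes take effect without reinstalling
pip install -e ".[dev]"
```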

## Install with Poetry

Ensure you have [Poetry](https://python-poetry.org/docs/#installation) installed on your system.

To install all dependencies:

```bash
poetry install
```
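
To confirm that the virtual environment was created and to see which interpreter Poetry resolved, the standard Poetry commands below can be used (no project-specific assumptions):

```bash
# show the virtualenv path and Python version Poetry is using
poetry env info

# run any command inside that environment
poetry run python --version
```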

### Handling `flash-attn` Installation Issues

If `flash-attn` fails to install because of **PEP 517 build isolation issues** (its build step needs `torch` to be importable, which the isolated build environment does not provide), try one of the following fixes.

#### No-Build-Isolation Installation (Recommended)
```bash
poetry run pip install --upgrade pip setuptools wheel
poetry run pip install flash-attn --no-build-isolation
poetry install
```
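
After applying the workaround, it is worth checking that the compiled extension actually imports. A quick sanity check (assumes the package exposes `__version__`, which current `flash-attn` releases do):

```bash
# verify that flash-attn built correctly inside Poetry's environment
poetry run python -c "import flash_attn; print(flash_attn.__version__)"
```

If compilation runs out of memory, `flash-attn` honors the `MAX_JOBS` environment variable (for example `MAX_JOBS=4 poetry run pip install flash-attn --no-build-isolation`) to cap the number of parallel build jobs.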

#### Install from Git (Alternative)
```bash
poetry run pip install git+https://github.com/Dao-AILab/flash-attention.git
```
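
For reproducible setups you may prefer pinning the Git install to a release tag instead of tracking the default branch. The tag below is purely illustrative; pick one that matches your CUDA and PyTorch versions:

```bash
# example only: replace v2.6.3 with the release you have verified against your stack
poetry run pip install "git+https://github.com/Dao-AILab/flash-attention.git@v2.6.3"
```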

---

### Running the Model

Once the installation is complete, you can run **Wan2.2** using:

```bash
poetry run python generate.py \
  --task t2v-A14B \
  --size '1280*720' \
  --ckpt_dir ./Wan2.2-T2V-A14B \
  --prompt "Two anthropomorphic cats in comfy boxing gear and bright gloves fight intensely on a spotlighted stage."
```
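
The command above assumes the T2V-A14B weights are already present in `./Wan2.2-T2V-A14B`. If you still need to download them, they are typically available on Hugging Face (the repo id `Wan-AI/Wan2.2-T2V-A14B` is an assumption; adjust it to your source):

```bash
# fetch the checkpoint directory referenced by --ckpt_dir
pip install "huggingface_hub[cli]"
huggingface-cli download Wan-AI/Wan2.2-T2V-A14B --local-dir ./Wan2.2-T2V-A14B
```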

#### Test
```bash
bash tests/test.sh
```

#### Format
```bash
black .
isort .
```
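
In CI, or before committing, the same tools can be run in check-only mode so that formatting drift fails the build instead of silently rewriting files (standard `black`/`isort` flags; no project-specific configuration assumed):

```bash
# fail if any file would be reformatted, without modifying it
black --check --diff .
isort --check-only --diff .
```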