What kind of hardware does this require to run locally?
#1 · opened by wangleineo
What kind of hardware does this require to run locally?
Our code has been optimized to require less than 24GB of VRAM when using torch.bfloat16 for computation. The optimized version will be released within the next two days.
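For anyone who wants to try it in the meantime, below is a minimal sketch of loading a checkpoint in torch.bfloat16 with the Hugging Face transformers API. The repo id and the use of AutoModelForCausalLM are placeholders, since this thread doesn't say which loading interface the optimized release will use.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id; substitute the actual model from this repository.
model_id = "your-org/your-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Loading the weights in bfloat16 roughly halves memory versus float32,
# which is what keeps peak VRAM usage under the ~24GB figure mentioned above.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
).to("cuda")
```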
The code for multi-GPU inference has been updated, and low-memory inference is now supported.
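As a rough illustration of what multi-GPU, low-memory loading typically looks like with Accelerate's device_map (the actual script in the repo may differ), the sketch below shards the bfloat16 weights across all visible GPUs:

```python
import torch
from transformers import AutoModelForCausalLM

model_id = "your-org/your-model"  # placeholder repo id

# device_map="auto" (backed by accelerate) spreads the layers across all
# visible GPUs and spills to CPU RAM if they don't fit; low_cpu_mem_usage
# avoids materializing a full copy of the weights on the host while loading.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    low_cpu_mem_usage=True,
)
```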