Quantized using Intel's AutoRound quantization tool with the following command:
```bash
auto-round-best --model facebook/opt-30b --scheme "w4a16"
```
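For inference, a standard `transformers` loading flow should work with this checkpoint. The sketch below is illustrative only and assumes the model was exported in AutoRound's default format, so the `auto-round` package (which supplies the 4-bit kernels) is installed alongside `transformers` and `accelerate`; the prompt and generation settings are placeholders.

```python
# Minimal inference sketch (assumes: pip install auto-round transformers accelerate).
# The auto-round package provides the kernels needed for the W4A16 weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Emanresu/opt-30b-w4g128-AutoRound"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread layers across available GPUs
    torch_dtype="auto",
)

prompt = "Quantization reduces memory usage by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```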
Base model: [facebook/opt-30b](https://huggingface.co/facebook/opt-30b)