ModelCloud/MiniMax-M2-GPTQMODEL-W4A16
Text Generation · Safetensors · English
Tags: minimax, gptqmodel, modelcloud, chat, glm4.6, glm, instruct, int4, gptq, 4bit, w4a16, conversational, custom_code, 4-bit precision
License: modelcloud
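
The tags identify a 4-bit GPTQ checkpoint (W4A16: 4-bit weights, 16-bit activations) that ships custom modeling code. Below is a minimal loading sketch with the transformers library; it assumes a GPTQ backend such as gptqmodel is installed alongside optimum, and the trust_remote_code flag mirrors the custom_code tag. This is an illustration, not usage taken from the model card.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "ModelCloud/MiniMax-M2-GPTQMODEL-W4A16"

    # custom_code tag: the repo ships its own modeling code, so remote code must be trusted
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

    # transformers reads the GPTQ quantization config stored in the checkpoint and
    # dispatches to the installed GPTQ backend; no extra quantization arguments are needed
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",
        trust_remote_code=True,
    )

    messages = [{"role": "user", "content": "Hello!"}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
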
Community discussions (1 open, 0 closed)
#1: Is it compatible with vLLM? (4) · opened 5 days ago by bullerwins
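
For context, trying the checkpoint in vLLM would look roughly like the sketch below. Whether this particular model actually loads and serves correctly in vLLM is exactly what discussion #1 asks and is not confirmed here; the snippet assumes a vLLM build with GPTQ support and again passes trust_remote_code because of the custom_code tag.

    from vllm import LLM, SamplingParams

    # Illustrative attempt only; compatibility of this checkpoint with vLLM is the
    # open question in discussion #1.
    llm = LLM(
        model="ModelCloud/MiniMax-M2-GPTQMODEL-W4A16",
        trust_remote_code=True,  # repo carries custom modeling code
    )
    params = SamplingParams(temperature=0.7, max_tokens=64)
    print(llm.generate(["Hello!"], params)[0].outputs[0].text)
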