unsloth/granite-4.0-350m-GGUF
by Unsloth AI
Tags: Transformers · GGUF · unsloth · granite-4.0 · conversational
arXiv: 0000.00000
License: apache-2.0
Branch: main · 5.45 GB · 1 contributor · History: 37 commits
Latest commit: danielhanchen, "Delete granite-4.0-350m-UD-IQ2_XXS.gguf" (d6a88cd, verified, 8 days ago)
| File | Size |
| --- | --- |
| .gitattributes | 2.91 kB |
| README.md | 33.9 kB |
| granite-4.0-350m-BF16.gguf | 708 MB |
| granite-4.0-350m-IQ4_NL.gguf | 229 MB |
| granite-4.0-350m-IQ4_XS.gguf | 222 MB |
| granite-4.0-350m-Q3_K_M.gguf | 208 MB |
| granite-4.0-350m-Q3_K_S.gguf | 195 MB |
| granite-4.0-350m-Q4_0.gguf | 229 MB |
| granite-4.0-350m-Q4_1.gguf | 244 MB |
| granite-4.0-350m-Q4_K_M.gguf | 237 MB |
| granite-4.0-350m-Q4_K_S.gguf | 229 MB |
| granite-4.0-350m-Q5_K_M.gguf | 264 MB |
| granite-4.0-350m-Q5_K_S.gguf | 260 MB |
| granite-4.0-350m-Q6_K.gguf | 293 MB |
| granite-4.0-350m-Q8_0.gguf | 378 MB |
| granite-4.0-350m-UD-IQ3_XXS.gguf | 172 MB |
| granite-4.0-350m-UD-Q3_K_XL.gguf | 213 MB |
| granite-4.0-350m-UD-Q4_K_XL.gguf | 241 MB |
| granite-4.0-350m-UD-Q5_K_XL.gguf | 265 MB |
| granite-4.0-350m-UD-Q6_K_XL.gguf | 339 MB |
| granite-4.0-350m-UD-Q8_K_XL.gguf | 517 MB |
| imatrix_unsloth.gguf_file | 948 kB |

All files were last changed 8 days ago by the commit "Upload folder using huggingface_hub".
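Each quantization in the listing above can be downloaded directly. A minimal sketch, assuming Hugging Face's standard `/resolve/<revision>/` download-URL scheme and the `granite-4.0-350m-<QUANT>.gguf` file naming shown in the table (the helper `quant_url` is illustrative, not part of any library):

```python
# Sketch: build direct-download URLs for the GGUF quantizations listed above.
# Assumes the standard Hugging Face Hub /resolve/<revision>/ URL scheme.
REPO = "unsloth/granite-4.0-350m-GGUF"
REVISION = "main"

def quant_url(quant: str) -> str:
    """Return the direct-download URL for one quantization tag, e.g. 'Q4_K_M'."""
    filename = f"granite-4.0-350m-{quant}.gguf"
    return f"https://huggingface.co/{REPO}/resolve/{REVISION}/{filename}"

print(quant_url("Q4_K_M"))
# → https://huggingface.co/unsloth/granite-4.0-350m-GGUF/resolve/main/granite-4.0-350m-Q4_K_M.gguf
```

A file fetched this way (or via `huggingface_hub.hf_hub_download`) can then be loaded by GGUF-compatible runtimes such as llama.cpp.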