tsunemoto/dolphin-2.5-mixtral-8x7b-GGUF

Tags: GGUF · English · conversational
Files and versions
Branch: main · 404 GB · 1 contributor · History: 3 commits
Latest commit: 3c58b75 · Update README.md · tsunemoto · almost 2 years ago
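
The file listing below can also be retrieved programmatically. A minimal sketch using huggingface_hub (the same library used for the original upload); only the repo id comes from this page, and it assumes the library is installed:

```python
# List every file tracked in this repository on the Hugging Face Hub.
from huggingface_hub import list_repo_files

repo_id = "tsunemoto/dolphin-2.5-mixtral-8x7b-GGUF"

for path in list_repo_files(repo_id):
    print(path)  # e.g. "dolphin-2.5-mixtral-8x7b.Q4_K_M.gguf"
```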
| File | Size | Last commit | Last modified |
| --- | --- | --- | --- |
| .gitattributes | 2.53 kB | Upload folder using huggingface_hub | almost 2 years ago |
| README.md | 3.91 kB | Update README.md | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q2_K.gguf | 15.5 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q3_K_L.gguf | 20.3 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q3_K_M.gguf | 20.2 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q3_K_S.gguf | 20.1 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q4_0.gguf | 26.3 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q4_1.gguf | 29.2 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q4_K_M.gguf | 26.3 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q4_K_S.gguf | 26.3 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q5_0.gguf | 32.1 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q5_1.gguf | 35 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q5_K_M.gguf | 32.1 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q5_K_S.gguf | 32.1 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q6_K.gguf | 38.3 GB | Upload folder using huggingface_hub | almost 2 years ago |
| dolphin-2.5-mixtral-8x7b.Q8_0.gguf | 49.6 GB | Upload folder using huggingface_hub | almost 2 years ago |
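
GGUF files are meant for llama.cpp-compatible runtimes. A minimal sketch of fetching a single quantization and running it with llama-cpp-python; the repo id and file name come from this page, while the choice of Q4_K_M, the context length, GPU offload, and the prompt are illustrative assumptions:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama  # assumes llama-cpp-python is installed

repo_id = "tsunemoto/dolphin-2.5-mixtral-8x7b-GGUF"
filename = "dolphin-2.5-mixtral-8x7b.Q4_K_M.gguf"  # 26.3 GB; any file above works

# Download into the local Hugging Face cache and return the resolved path.
model_path = hf_hub_download(repo_id=repo_id, filename=filename)

# Context length and GPU offload are illustrative; tune them to your hardware.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=0)

out = llm("Write a haiku about the ocean.", max_tokens=64)
print(out["choices"][0]["text"])
```

As a general rule, the larger quantizations (Q6_K, Q8_0) trade disk space and memory for output quality, while the smaller K-quants (Q2_K, Q3_K_*) fit in less RAM at some quality cost.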