---
base_model: PokeeAI/pokee_research_7b
base_model_relation: quantized
quantized_by: ArtusDev
license: apache-2.0
language:
- en
tags:
- agent
- deepresearch
- llm
- rl
- reinforcementlearning
- exl3
datasets:
- miromind-ai/MiroRL-GenQA
---

# ArtusDev/PokeeAI_pokee_research_7b-EXL3

EXL3 quants of PokeeAI/pokee_research_7b, quantized with exllamav3.

## Quants

| Quant | BPW | Head Bits | Size (GB) |
|--------|-----|-----------|-----------|
| 2.5_H6 | 2.5 | 6 | 3.56 |
| 3.0_H6 | 3.0 | 6 | 3.97 |
| 3.5_H6 | 3.5 | 6 | 4.38 |
| 4.0_H6 | 4.0 | 6 | 4.79 |
| 4.5_H6 | 4.5 | 6 | 5.19 |
| 5.0_H6 | 5.0 | 6 | 5.60 |
| 6.0_H6 | 6.0 | 6 | 6.42 |
| 8.0_H8 | 8.0 | 8 | 8.19 |

## How to Download and Use Quants

You can download a specific quant by targeting its revision with the Hugging Face CLI.

<details>
<summary>Click for download commands</summary>

1. Install the Hugging Face CLI:

```bash
pip install -U "huggingface_hub[cli]"
```

2. Download a specific quant:

```bash
huggingface-cli download ArtusDev/PokeeAI_pokee_research_7b-EXL3 --revision "5.0bpw_H6" --local-dir ./
```

</details>
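If you prefer scripting the download, the same thing can be done from Python with `huggingface_hub.snapshot_download`. A minimal sketch, assuming the `5.0bpw_H6` revision and a local target directory of your choice:

```python
from huggingface_hub import snapshot_download

# Download one quant revision of this repo into a local directory.
# The revision and local_dir below are assumptions; adjust them to the quant you want.
snapshot_download(
    repo_id="ArtusDev/PokeeAI_pokee_research_7b-EXL3",
    revision="5.0bpw_H6",
    local_dir="./pokee_research_7b-exl3-5.0bpw",
)
```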

EXL3 quants can be run with any inference client that supports EXL3, such as TabbyAPI. Refer to the TabbyAPI documentation for setup instructions.
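Once a quant is loaded in TabbyAPI, it exposes an OpenAI-compatible endpoint, so any standard client can talk to it. A minimal sketch using the `openai` Python package, assuming a local TabbyAPI instance on its default port (5000), an API key from your TabbyAPI config, and a hypothetical model name:

```python
from openai import OpenAI

# Point the OpenAI client at the local TabbyAPI server.
# Base URL, port, API key, and model name are assumptions; match them to your setup.
client = OpenAI(base_url="http://127.0.0.1:5000/v1", api_key="your-tabbyapi-key")

response = client.chat.completions.create(
    model="pokee_research_7b",  # assumed name; use whatever model TabbyAPI has loaded
    messages=[{"role": "user", "content": "Summarize the key ideas behind EXL3 quantization."}],
)
print(response.choices[0].message.content)
```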

## Quant Requests

To request EXL3 quants of other models, see the EXL community hub for request guidelines.

## Acknowledgements

Made possible with cloud compute from lium.io.