I have no idea what I’m doing… if this causes the apocalypse, someone please let me know.

Lumimaid-Magnum-12B 8.0bpw h8 EXL2

Includes the measurement.json file for further quantization to other bitrates.

Original Model: https://huggingface.co/Undi95/Lumimaid-Magnum-12B
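Running this quant requires an EXL2-capable loader such as exllamav2 (or a frontend built on it). Below is a minimal loading sketch assuming a recent exllamav2 build; the local model path, prompt, and generation settings are placeholders and not part of this repo.

```python
# Minimal sketch: load the 8.0bpw h8 EXL2 quant with exllamav2 and generate once.
# Assumes a recent exllamav2 release; the local path below is a placeholder.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/models/Lumimaid-Magnum-12B-8.0bpw-h8-exl2"  # placeholder path

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)   # allocate cache as layers are loaded
model.load_autosplit(cache)                # split the model across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

# Mistral-style instruct prompt (see the template in the original model card below).
prompt = "[INST] Write a short greeting. [/INST]"
print(generator.generate(prompt=prompt, max_new_tokens=200, add_bos=True))
```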

Original Model Card

Merge of Lumimaid and Magnum as requested by some.

I used the new DELLA merge method in mergekit and added to the mix a finetune of Nemo trained only on Claude input at 16k context.

Prompt template: Mistral

<s>[INST] {input} [/INST] {output}</s>
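For multi-turn chat the same template is simply repeated; here is a hypothetical helper (the function and its names are mine, not part of the original card, and the multi-turn concatenation assumes the standard Mistral instruct format).

```python
# Hypothetical helper (not from the original card): build a Mistral-style prompt
# of the form <s>[INST] {input} [/INST] {output}</s>, repeated for multi-turn chat.
def build_mistral_prompt(turns: list[tuple[str, str]], next_input: str) -> str:
    prompt = "<s>"
    for user_msg, assistant_msg in turns:          # completed turns
        prompt += f"[INST] {user_msg} [/INST] {assistant_msg}</s>"
    prompt += f"[INST] {next_input} [/INST]"       # open turn awaiting the model
    return prompt

print(build_mistral_prompt([("Hi!", "Hello there.")], "Tell me a story."))
```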