~ The power of Three
This is a merge of pre-trained language models created using mergekit.
If you like my work, consider buying me a coffee to support future merges, GPU time, and experiments.
This model was merged using the DARE TIES merge method, with darkc0de/XortronCriminalComputingConfig as the base (a toy sketch of the method follows the model list below).
The following models were included in the merge:

* Sorawiz/MistralCreative-24B-Chat
* TheDrummer/Cydonia-24B-v3
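DARE TIES combines two ideas: DARE randomly drops a fraction of each model's delta from the base (keeping a `density` fraction of entries and rescaling the survivors by `1/density`), and TIES resolves sign conflicts between the remaining deltas by electing a majority sign per parameter before summing. With `density: 1.0`, as in the configuration below, the DARE drop step is a no-op. Below is a minimal per-tensor sketch in NumPy, for illustration only; the function and variable names are mine, not mergekit's, and the real implementation handles trimming and weight normalization more carefully:

```python
import numpy as np

def dare_ties_merge(base, donors, weights, density=1.0, seed=0):
    """Toy per-tensor DARE-TIES merge, for illustration only.

    base:    np.ndarray parameter tensor from the base model
    donors:  list of np.ndarray tensors of the same shape, one per merged model
    weights: list of floats, the per-model merge weights from the YAML
    density: fraction of delta entries DARE keeps (1.0 = keep everything)
    """
    rng = np.random.default_rng(seed)
    weighted_deltas = []
    for donor, w in zip(donors, weights):
        delta = donor - base                           # task vector
        if density < 1.0:                              # DARE: drop-and-rescale
            keep = rng.random(delta.shape) < density
            delta = np.where(keep, delta, 0.0) / density
        weighted_deltas.append(w * delta)

    # TIES sign election: per-parameter majority sign of the weighted deltas
    elected = np.sign(sum(weighted_deltas))

    # Discard delta entries whose sign disagrees with the elected sign, then sum
    merged_delta = sum(np.where(np.sign(d) == elected, d, 0.0)
                       for d in weighted_deltas)
    return base + merged_delta
```

Note that the configuration lists the base model itself as a source with weight 0.4; since its delta from itself is zero, this mainly pulls the merged result back toward the base rather than adding a third donor.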
The following YAML configuration was used to produce this model:
```yaml
base_model: darkc0de/XortronCriminalComputingConfig
chat_template: auto
merge_method: dare_ties
modules:
  default:
    slices:
      - sources:
          - layer_range: [0, 40]
            model: darkc0de/XortronCriminalComputingConfig
            parameters:
              weight: 0.4
          - layer_range: [0, 40]
            model: Sorawiz/MistralCreative-24B-Chat
            parameters:
              weight: 0.3
          - layer_range: [0, 40]
            model: TheDrummer/Cydonia-24B-v3
            parameters:
              weight: 0.3
out_dtype: bfloat16
parameters:
  density: 1.0
tokenizer: {}
```
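To reproduce the merge, mergekit's CLI can be pointed at the configuration above (saved as, say, `config.yaml`) with `mergekit-yaml config.yaml ./output-model --cuda`. Once merged, the model loads like any other causal LM. A minimal sketch, assuming a hypothetical repo id for this merge:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/the-power-of-three"  # hypothetical; substitute this model's actual repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",           # requires the accelerate package
)

# Build a chat-formatted prompt (the merge config sets chat_template: auto)
inputs = tokenizer.apply_chat_template(
    [{"role": "user", "content": "Write a short story about three rivals joining forces."}],
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```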