Scarlet-Seraph-12B

Overview

Scarlet-Seraph-12B is a merge of Scarlet-Ink-12B, Abyssal-Seraph-12B, and Scarlet-Eclipse-12B, combined with a custom merge method (see the config below).

YAML config:
models:
  - model: Vortex5/Scarlet-Ink-12B
    parameters:
      weight:
        - filter: self_attn
          value: 0.1
        - filter: mlp
          value: 0.33
        - value: 0.33
  - model: Vortex5/Abyssal-Seraph-12B
    parameters:
      weight:
        - filter: self_attn
          value: 0.8
        - filter: mlp
          value: 0.4
        - value: 0.33
  - model: Vortex5/Scarlet-Eclipse-12B
    parameters:
      weight:
        - filter: self_attn
          value: 0.15
        - filter: mlp
          value: 0.45
        - value: 0.33
merge_method: amsf
parameters:
  normalize: true
dtype: bfloat16
tokenizer:
  source: Vortex5/Abyssal-Seraph-12B
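As a rough illustration of what the config above expresses: each model contributes a list of weight rules, where a `filter` entry applies to tensors whose names contain that substring (e.g. `self_attn`, `mlp`) and the unfiltered entry is the fallback, and `normalize: true` rescales the resolved weights to sum to 1. The sketch below is hypothetical, assuming the custom method behaves like a normalized weighted average; the function names and matching logic are illustrative, not the actual merge implementation.

```python
# Hypothetical sketch of per-tensor weight resolution (illustrative only,
# not the actual custom merge code). Rules mirror the YAML config above.
WEIGHT_RULES = {
    "Scarlet-Ink-12B":     [("self_attn", 0.10), ("mlp", 0.33), (None, 0.33)],
    "Abyssal-Seraph-12B":  [("self_attn", 0.80), ("mlp", 0.40), (None, 0.33)],
    "Scarlet-Eclipse-12B": [("self_attn", 0.15), ("mlp", 0.45), (None, 0.33)],
}

def resolve_weight(rules, tensor_name):
    """Return the first matching weight; a None filter is the default."""
    for pattern, value in rules:
        if pattern is None or pattern in tensor_name:
            return value
    raise ValueError(f"no weight rule matched {tensor_name!r}")

def merged_weights(tensor_name, normalize=True):
    """Resolve each model's weight for one tensor, optionally normalized."""
    weights = {m: resolve_weight(r, tensor_name) for m, r in WEIGHT_RULES.items()}
    if normalize:
        total = sum(weights.values())
        weights = {m: w / total for m, w in weights.items()}
    return weights

# Example: attention tensors lean heavily on Abyssal-Seraph-12B
# (0.8 out of a 1.05 total before normalization).
print(merged_weights("model.layers.0.self_attn.q_proj.weight"))
```

Under this reading, the attention blocks are dominated by Abyssal-Seraph-12B while the MLP blocks are split more evenly across all three parents.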
  

Intended Use

πŸ“• Storytelling
🎭 Roleplay
✨ Creative Writing

Credits
