# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the DARE TIES merge method, with TroyDoesAI/BlackSheep-24B as the base model.
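As a rough illustration of what DARE TIES does, here is a simplified per-tensor sketch (not mergekit's actual implementation): DARE randomly drops a fraction of each task vector's entries and rescales the survivors, then TIES elects a majority sign per parameter and discards deltas that disagree with it. All names and the toy tensors below are made up for the demo.

```python
import numpy as np

def dare(delta, density, rng):
    """DARE: randomly Drop a fraction (1 - density) of a task vector's
    entries And REscale the survivors by 1/density, so the expected
    value of the delta is unchanged."""
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def dare_ties_merge(base, models, densities, weights, rng):
    """Simplified DARE-TIES sketch:
    1. form task vectors (model - base),
    2. sparsify each with DARE at its density,
    3. elect a majority sign per parameter from the weighted sum (TIES),
    4. keep only deltas agreeing with the elected sign and add them
       to the base. (mergekit additionally normalizes weights.)"""
    deltas = [dare(m - base, d, rng) * w
              for m, d, w in zip(models, densities, weights)]
    total = np.sum(deltas, axis=0)
    elected_sign = np.sign(total)  # majority sign per entry
    agree = [np.where(np.sign(dl) == elected_sign, dl, 0.0)
             for dl in deltas]
    return base + np.sum(agree, axis=0)

rng = np.random.default_rng(0)
base = np.zeros(4)
m1 = np.array([1.0,  1.0, -1.0, 0.5])
m2 = np.array([1.0, -1.0, -1.0, 0.5])
# Densities of 1.0 disable the random drop, keeping the demo deterministic.
merged = dare_ties_merge(base, [m1, m2],
                         densities=[1.0, 1.0], weights=[1.0, 0.8], rng=rng)
print(merged)
```

Note how the second entry of `m2` is dropped entirely: its sign disagrees with the elected majority sign, so only `m1` contributes there.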

### Models Merged

The following models were included in the merge:

* Gryphe/Pantheon-RP-1.8-24b-Small-3.1
* PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
* lars1234/Mistral-Small-24B-Instruct-2501-writer

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: TroyDoesAI/BlackSheep-24B
    parameters:
      density: 0.9
      weight: 1
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters:
      density: 0.6
      weight: 0.8
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters:
      density: 1
      weight: 0.9
  - model: lars1234/Mistral-Small-24B-Instruct-2501-writer
    parameters:
      density: 0.8
      weight: 0.6
merge_method: dare_ties
base_model: TroyDoesAI/BlackSheep-24B
tokenizer_source: base
parameters:
  rescale: true
dtype: bfloat16
```
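To reproduce the merge, this configuration can be saved as `config.yaml` and passed to mergekit's `mergekit-yaml` CLI. A quick sanity check that the YAML parses as expected (using PyYAML, which is assumed to be available; mergekit depends on it):

```python
import yaml  # PyYAML; assumed installed

CONFIG = """\
models:
  - model: TroyDoesAI/BlackSheep-24B
    parameters: {density: 0.9, weight: 1}
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
    parameters: {density: 0.6, weight: 0.8}
  - model: PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
    parameters: {density: 1, weight: 0.9}
  - model: lars1234/Mistral-Small-24B-Instruct-2501-writer
    parameters: {density: 0.8, weight: 0.6}
merge_method: dare_ties
base_model: TroyDoesAI/BlackSheep-24B
tokenizer_source: base
parameters:
  rescale: true
dtype: bfloat16
"""

cfg = yaml.safe_load(CONFIG)
# Basic sanity checks before handing the file to mergekit-yaml.
assert cfg["merge_method"] == "dare_ties"
assert cfg["base_model"] == cfg["models"][0]["model"]
print(len(cfg["models"]), "models,", cfg["dtype"])
```

The merge itself would then be run as `mergekit-yaml config.yaml ./output-dir` (optionally with `--cuda`); the output directory name here is a placeholder.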
