# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the Karcher mean merge method, with Gryphe/Pantheon-RP-1.8-24b-Small-3.1 as the base model.
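
For intuition: the Karcher (Fréchet) mean generalizes the arithmetic mean to curved spaces; it is the point minimizing the sum of squared geodesic distances to the inputs, and it is typically found by iterating a log-map/exp-map update. The snippet below is a minimal conceptual sketch of that iteration for unit vectors on the hypersphere, written in plain NumPy; it is illustrative only, not mergekit's internal implementation, and the `max_iter`/`tol` names simply mirror the parameters in the configuration further down.

```python
# Conceptual sketch of a Karcher mean on the unit hypersphere (illustrative, not mergekit code).
import numpy as np

def karcher_mean(points: np.ndarray, max_iter: int = 1000, tol: float = 1e-7) -> np.ndarray:
    """Average the rows of `points` (shape [n, d]) as unit vectors on the hypersphere."""
    x = points / np.linalg.norm(points, axis=1, keepdims=True)
    mu = x.mean(axis=0)
    mu /= np.linalg.norm(mu)  # start from the normalized Euclidean mean
    for _ in range(max_iter):
        # Log map: project each point into the tangent space at the current estimate mu.
        cos = np.clip(x @ mu, -1.0, 1.0)
        theta = np.arccos(cos)  # geodesic distance from mu to each point
        tangent = x - cos[:, None] * mu
        norms = np.maximum(np.linalg.norm(tangent, axis=1), 1e-12)
        delta = ((theta / norms)[:, None] * tangent).mean(axis=0)  # mean tangent-space update
        step = np.linalg.norm(delta)
        if step < tol:
            break
        # Exp map: move along the mean update direction while staying on the sphere.
        mu = np.cos(step) * mu + np.sin(step) * (delta / step)
        mu /= np.linalg.norm(mu)
    return mu
```

Conceptually, each row would correspond to one model's (flattened, normalized) weight tensor, so the result interpolates between the source models along geodesics rather than by plain element-wise averaging.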

### Models Merged

The following models were included in the merge:

* cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
* trashpanda-org/MS-24B-Instruct-Mullein-v0
* PocketDoc/Dans-DangerousWinds-V1.1.1-24b
* huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
* ReadyArt/Omega-Darker_The-Final-Directive-24B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
  - model: trashpanda-org/MS-24B-Instruct-Mullein-v0
  - model: PocketDoc/Dans-DangerousWinds-V1.1.1-24b
  - model: huihui-ai/Mistral-Small-24B-Instruct-2501-abliterated
  - model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
  - model: ReadyArt/Omega-Darker_The-Final-Directive-24B
merge_method: karcher
base_model: Gryphe/Pantheon-RP-1.8-24b-Small-3.1
parameters:
  max_iter: 1000
normalize: true
int8_mask: true
tokenizer_source: base
dtype: bfloat16
```
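
For reproducibility, a configuration like the one above is normally saved to a file and run through mergekit. The sketch below uses the Python entry point shown in mergekit's README (`MergeConfiguration`, `run_merge`, `MergeOptions`); the config file name and output directory are placeholders, and exact option names may differ between mergekit versions.

```python
# Sketch: run the YAML configuration above through mergekit (paths are placeholders).
# Assumes `pip install mergekit`; option names follow mergekit's README and may vary by version.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("karcher-merge.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./Mistral-RP-24b-karcher-pro",  # output directory (placeholder)
    options=MergeOptions(cuda=True, copy_tokenizer=True, lazy_unpickle=True),
)
```

The `mergekit-yaml` command-line tool can run the same configuration directly if you prefer not to use the Python API.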

I am the creator and developer of the Karcher merge method. If you would like to collaborate, please contact me by email.

Please support me on Ko-fi: https://ko-fi.com/ogodwin10
