---
base_model:
  - TareksLab/Dungeons-R1-LLaMa-70B
  - TareksLab/Doppleganger-V8-LLaMa-70B
  - TareksLab/Dragons-V1-LLaMa-70B
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged with the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, using [TareksLab/Doppleganger-V8-LLaMa-70B](https://huggingface.co/TareksLab/Doppleganger-V8-LLaMa-70B) as the base.
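
The `weight`, `density`, `epsilon`, and `lambda` values in the configuration below control how each model's parameter deltas are pruned and combined. As a rough illustration only (this is not mergekit's implementation, and the function name is made up for this sketch), `della_linear` can be thought of as: take each model's delta from the base, drop part of the delta entries (keeping roughly a `density` fraction; in real DELLA, `epsilon` controls how the magnitude-based drop probabilities spread), rescale the survivors, sum the deltas with the per-model weights, scale by `lambda`, and add the result back onto the base. The toy PyTorch sketch below shows only that flow for a single tensor, using simple top-k magnitude pruning in place of DELLA's sampled dropping.

```python
import torch

def toy_della_linear(base, finetuned, weights, density=0.7, lam=1.1):
    """Toy, simplified illustration of a della_linear-style merge for ONE tensor.

    NOT mergekit's implementation: real DELLA samples drop probabilities around
    each delta's magnitude rank (epsilon controls how much those probabilities
    vary); here we simply keep the top-`density` fraction of entries by magnitude.
    """
    acc = torch.zeros_like(base)
    for model, w in zip(finetuned, weights):
        delta = model - base                      # task vector relative to the base
        k = max(1, int(density * delta.numel()))  # how many entries to keep
        thresh = delta.abs().flatten().kthvalue(delta.numel() - k + 1).values
        mask = delta.abs() >= thresh              # keep the largest-magnitude deltas
        acc += w * (delta * mask) / density       # rescale survivors, weight per model
    return base + lam * acc                       # lambda scales the merged deltas

# Tiny usage example with random tensors standing in for model weights.
base = torch.randn(4, 4)
models = [torch.randn(4, 4) for _ in range(3)]
merged = toy_della_linear(base, models, weights=[0.35, 0.35, 0.30])
```

In the actual configuration, the three 70B models are combined with weights 0.35/0.35/0.30 at density 0.7, and `lambda: 1.1` slightly amplifies the combined deltas.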

### Models Merged

The following models were included in the merge:

* [TareksLab/Dungeons-R1-LLaMa-70B](https://huggingface.co/TareksLab/Dungeons-R1-LLaMa-70B)
* [TareksLab/Dragons-V1-LLaMa-70B](https://huggingface.co/TareksLab/Dragons-V1-LLaMa-70B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: TareksLab/Dungeons-R1-LLaMa-70B
    parameters:
      weight: 0.35
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: TareksLab/Dragons-V1-LLaMa-70B
    parameters:
      weight: 0.35
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
  - model: TareksLab/Doppleganger-V8-LLaMa-70B
    parameters:
      weight: 0.30
      density: 0.7
      epsilon: 0.2
      lambda: 1.1
merge_method: della_linear
base_model: TareksLab/Doppleganger-V8-LLaMa-70B
parameters:
  normalize: false
  int8_mask: true
dtype: bfloat16
chat_template: llama3
tokenizer:
  source: base
```
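
Since the card lists `library_name: transformers` and the merge sets a Llama 3 chat template, the merged weights can be loaded like any other Llama-style causal LM. The snippet below is a minimal sketch; the model path is a placeholder, so replace it with the merge output directory or the published repository id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/merged-model"  # placeholder: merge output dir or HF repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

messages = [{"role": "user", "content": "Introduce yourself in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```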