007ankit's picture
Upload folder using huggingface_hub
48298c3 verified
metadata
base_model:
  - BioMistral/BioMistral-7B
  - OdiaGenAI/mistral_hindi_7b_base_v1
library_name: transformers
tags:
  - mergekit
  - merge

merge

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the NuSLERP merge method.

Models Merged

The following models were included in the merge:

Configuration

The following YAML configuration was used to produce this model:

slices:
  - sources:
      - model: BioMistral/BioMistral-7B
        layer_range: [0, 32]
      - model: OdiaGenAI/mistral_hindi_7b_base_v1
        layer_range: [0, 32]
    weight: [0.5, 0.5]  # Equivalent to SLERP t=0.5
    nuslerp_flatten: false  # Row/column-wise interpolation
    nuslerp_row_wise: true  # SLERP row vectors instead of column vectors

# Adding the required `merge_method`
merge_method: nuslerp  

# Base model is optional, but included for compatibility


parameters:
  weight: 
    - filter: self_attn
      value: [1, 0.5, 0.3, 0.7, 0]   # Replicating the same t values
    - filter: mlp
      value: [0, 0.5, 0.7, 0.3, 1]   # Reversed values for MLP
    - value: 0.5                      # Default weight for other layers

dtype: bfloat16