---
base_model:
- Sorawiz/Gemma-9B-Chat
- zelk12/MT1-Gen7-gemma-2-9B
- TheDrummer/Tiger-Gemma-9B-v3
- zelk12/MT-Gen6fix-gemma-2-9B
- IlyaGusev/gemma-2-9b-it-abliterated
- zelk12/MT-Merge6-gemma-2-9B
library_name: transformers
tags:
- mergekit
- merge
license: gemma
pipeline_tag: text-generation
---
# MT-Gen12-gemma-2-9B
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method using [zelk12/MT-Merge6-gemma-2-9B](https://huggingface.co/zelk12/MT-Merge6-gemma-2-9B) as a base.
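For intuition, the sketch below illustrates in plain PyTorch what a DARE TIES merge does with each donor model's task vector (the difference between a fine-tuned tensor and the base tensor): elements are randomly dropped down to the configured `density`, the survivors are rescaled and weighted, a per-parameter sign is elected across donors, and only agreeing contributions are summed back onto the base weights. The function and tensor names here are illustrative only; this is not mergekit's actual implementation.

```python
import torch

def dare_ties_merge(base, deltas, densities, weights, normalize=True):
    """Toy per-tensor DARE-TIES merge. `deltas` are (tuned - base) tensors."""
    contributions = []
    for delta, density, weight in zip(deltas, densities, weights):
        # DARE: keep each element with probability `density`, rescale survivors.
        mask = torch.bernoulli(torch.full_like(delta, density))
        contributions.append(weight * (mask * delta / density))
    stacked = torch.stack(contributions)

    # TIES sign election: keep only contributions whose sign agrees with the
    # weighted-sum majority sign for each parameter.
    sign = torch.sign(stacked.sum(dim=0))
    agree = torch.sign(stacked) == sign
    merged = torch.where(agree, stacked, torch.zeros_like(stacked)).sum(dim=0)

    if normalize:
        # Divide by the total weight that actually contributed to each entry.
        w = torch.tensor(weights, dtype=base.dtype).view(-1, *([1] * base.dim()))
        merged = merged / (agree * w).sum(dim=0).clamp(min=1e-8)
    return base + merged
```

mergekit applies this kind of operation tensor by tensor across all the listed donor models; `normalize: true` in the configuration below roughly corresponds to the weight renormalization step in the sketch.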
### Models Merged
The following models were included in the merge:
* [Sorawiz/Gemma-9B-Chat](https://huggingface.co/Sorawiz/Gemma-9B-Chat)
* [zelk12/MT1-Gen7-gemma-2-9B](https://huggingface.co/zelk12/MT1-Gen7-gemma-2-9B)
* [TheDrummer/Tiger-Gemma-9B-v3](https://huggingface.co/TheDrummer/Tiger-Gemma-9B-v3)
* [zelk12/MT-Gen6fix-gemma-2-9B](https://huggingface.co/zelk12/MT-Gen6fix-gemma-2-9B)
* [IlyaGusev/gemma-2-9b-it-abliterated](https://huggingface.co/IlyaGusev/gemma-2-9b-it-abliterated)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: zelk12/MT-Merge6-gemma-2-9B
    # No parameters necessary for base model
  - model: zelk12/MT1-Gen7-gemma-2-9B
    parameters:
      density: 0.8
      weight: 0.8
  - model: IlyaGusev/gemma-2-9b-it-abliterated
    parameters:
      density: 0.75
      weight: 0.75
  - model: Sorawiz/Gemma-9B-Chat
    parameters:
      density: 0.72
      weight: 0.72
  - model: TheDrummer/Tiger-Gemma-9B-v3
    parameters:
      density: 0.67
      weight: 0.67
  - model: zelk12/MT-Gen6fix-gemma-2-9B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: zelk12/MT-Merge6-gemma-2-9B
parameters:
  normalize: true
dtype: bfloat16
```
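### Usage

To try the merged model with the `transformers` library, a minimal generation snippet (assuming this repository's id, `zelk12/MT-Gen12-gemma-2-9B`, and the standard Gemma-2 chat template) might look like the following; the prompt is only a placeholder.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "zelk12/MT-Gen12-gemma-2-9B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a haiku about model merging."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```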