MaziyarPanahi/TheTop-7B-DPO-S2-v0.2

A merge of top 7B models using the SLERP (spherical linear interpolation) method, built with mergekit.
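
For reference, SLERP interpolates between two weight tensors along the arc between them rather than along a straight line, which preserves the norm of the weights better than plain linear averaging. The standard formulation (not specific to this model card) is:

$$
\mathrm{slerp}(p, q; t) = \frac{\sin\bigl((1 - t)\,\Omega\bigr)}{\sin \Omega}\, p + \frac{\sin(t\,\Omega)}{\sin \Omega}\, q,
\qquad \Omega = \arccos\!\left(\frac{p \cdot q}{\lVert p \rVert\,\lVert q \rVert}\right)
$$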

mergekit is a toolkit for merging pre-trained language models. mergekit uses an out-of-core approach to perform unreasonably elaborate merges in resource-constrained situations. Merges can be run entirely on CPU or accelerated with as little as 8 GB of VRAM. Many merging algorithms are supported, with more coming as they catch my attention.
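
A minimal sketch of what a SLERP merge config for mergekit looks like. The source model names and interpolation values below are illustrative placeholders, not the actual recipe behind this model, which is not published in this card:

```yaml
# Hypothetical SLERP merge config -- source models and t values are
# placeholders, not the actual recipe for TheTop-7B-DPO-S2-v0.2.
slices:
  - sources:
      - model: mistralai/Mistral-7B-v0.1   # placeholder first model
        layer_range: [0, 32]
      - model: some-org/other-7b-model     # placeholder second model
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-v0.1
parameters:
  t:
    - filter: self_attn
      value: [0.0, 0.5, 0.3, 0.7, 1.0]     # interpolation schedule for attention layers
    - filter: mlp
      value: [1.0, 0.5, 0.7, 0.3, 0.0]     # interpolation schedule for MLP layers
    - value: 0.5                           # default t for all other tensors
dtype: bfloat16
```

With mergekit installed, a config like this is run with something like `mergekit-yaml config.yml ./merged-model` (adding `--cuda` enables GPU acceleration within the 8 GB VRAM budget mentioned above).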

Model size: 7.24B params
Tensor type: BF16
Format: Safetensors