Mistral-MOE-4X7B-Dark-MultiVerse-24B / mergekit_moe_config.yml
base_model: F:/7B/Multi-Verse-RP
gate_mode: random
dtype: bfloat16
experts_per_token: 2
experts:
- source_model: F:/7B/DarkSapling-7B-v1.0
- source_model: F:/7B/DarkSapling-7B-v1.1
- source_model: F:/7B/DarkSapling-V2
- source_model: F:/7B/Multi-Verse-RP
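
This config selects `Multi-Verse-RP` as the base model, routes each token through 2 of the 4 expert models, and uses randomly initialized gates (no hidden-state routing calibration). Assuming mergekit is installed, a config like this is applied with the `mergekit-moe` command; the output directory name below is illustrative:

```shell
# Sketch of the merge invocation (requires `pip install mergekit`);
# the F:/7B/... source paths in the config must exist locally.
mergekit-moe mergekit_moe_config.yml ./Mistral-MOE-4X7B-Dark-MultiVerse-24B
```

With `gate_mode: random`, the resulting router weights are untrained, so the merged model generally benefits from light fine-tuning before use.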