---
license: gpl-3.0
language:
- eng
tags:
- text-generation
- mergekit
- phi
- bfloat16
- merged
- 18B
- conversational
- story
- abliterated
model_creator: Orion-zhen
base_model: Orion-zhen/phi-4-abliterated
merged_by: otakadelic
model_name: phi-4-abliterated-Orion-v2
---

# 🗡️ Phi-4 Abliterated Orion v2 (18B)

**An altered variant of the legendary [Orion-zhen/phi-4-abliterated](https://huggingface.co/Orion-zhen/phi-4-abliterated) by Orion-zhen.**

Orion-zhen's original wasn't just an abliteration. Capable of portraying complex personalities, delicate emotions, sharp thoughts, and inner conflict, it felt more like a katana than a model: slicing through responses with cold precision.

This version shifts the focus slightly toward storytelling. It is still emotionally complex, still powerful under duress, captivity, or obedience, but with a more narrative-friendly tone. If the original was a katana, this one has a sheath.

## 💡 Summary

- 🧬 Based on **Phi-4 Abliterated** (40-layer model)
- 🔁 Mid-layer region **boosted** via a **repetition pattern**
- 🧠 Final parameter count: **~18.07B**
- 🔢 Data type: **bfloat16**
- 🧩 Ready for: **quantization (GGUF confirmed via llama.cpp)**
- 📦 File structure: Hugging Face `transformers` + `safetensors` compatible

## 🛠️ How It Was Created

Inspired by @mlabonne's [BigQwen2.5-Echo-47B-Instruct](https://huggingface.co/mlabonne/BigQwen2.5-Echo-47B-Instruct), this merge duplicates layers in the middle of the original 40-layer Phi-4 14B while keeping the first and last sections intact. Although the parameter count grew (14B → ~18B), every added layer is a structural duplicate; no new data was trained in. Orion 18B expands its internal architecture by roughly 23%: the widened mid-layers give it more breathing room, more depth, and more flexibility to reason, reflect, and respond.
## ⚙️ Merge Configuration

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
  # First 8 layers
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [0, 8]
  # Second 8 layers: layers 8 and 13 each replicated
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [8, 9]
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [8, 9]
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [9, 13]
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [13, 14]
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [13, 14]
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [14, 16]
  # Third 8 layers
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [16, 24]
  # Third 8 layers x 2!
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [16, 24]
  # Fourth 8 layers
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [24, 32]
  # Fifth 8 layers
  - sources:
    - model: Orion-zhen/phi-4-abliterated
      layer_range: [32, 40]
```
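As a quick sanity check, the `layer_range` entries in the configuration above can be summed to see how many transformer layers the merged model ends up with. This is plain arithmetic on the slice ranges, not a mergekit API:

```python
# The ranges below mirror the passthrough slices in the merge config above.
slices = [
    (0, 8),              # first 8 layers
    (8, 9), (8, 9),      # layer 8 duplicated
    (9, 13),
    (13, 14), (13, 14),  # layer 13 duplicated
    (14, 16),
    (16, 24), (16, 24),  # third 8-layer block duplicated wholesale
    (24, 32),
    (32, 40),
]

total_layers = sum(end - start for start, end in slices)
print(total_layers)                       # 50 layers, up from the base model's 40
print(f"+{total_layers / 40 - 1:.0%}")    # +25% more transformer layers
```

The layer count grows by 25%, while the overall parameter growth quoted above (~23%, 14B → ~18.07B) is slightly smaller, since passthrough duplication copies only decoder layers, not the embeddings or output head.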