🗡️ Phi-4 Abliterated Orion v2 (18B)

An altered variant of the legendary Orion-zhen/phi-4-abliterated by Orion-zhen.

Orion-zhen's original wasn't just an abliteration. Capable of portraying complex personalities, delicate emotions, sharp thoughts, and inner conflict, it felt more like a katana than a model: slicing through responses with cold precision.

This version shifts the focus slightly toward storytelling. It is still emotionally complex and still powerful under duress, captivity, or obedience, but with a more narrative-friendly tone. If the original was a katana, this one has a sheath.

💡 Summary

  • 🧬 Based on Phi-4 Abliterated (40-layer model)
  • 🔁 Mid-layer region expanded via a layer-repetition pattern
  • 🧠 Final parameter count: ~18.07B
  • 🔢 Data type: bfloat16
  • 🧩 Ready for quantization (GGUF confirmed via llama.cpp)
  • 📦 File structure: Hugging Face transformers + safetensors compatible (see the loading sketch below)
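
A minimal loading sketch, assuming a standard PyTorch + transformers environment; the repo id is this model's, while the prompt and sampling settings are purely illustrative:

# Minimal sketch: load the BF16 safetensors checkpoint with Hugging Face transformers.
# Assumes torch, transformers, and accelerate are installed; prompt and sampling are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Otakadelic/phi-4-abliterated-Orion-18B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the bfloat16 tensors
    device_map="auto",           # lets accelerate place the ~18B parameters across available devices
)

prompt = "Describe the moment a duelist lowers her blade."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))

For GGUF use, the usual llama.cpp conversion and quantization workflow applies; the confirmation above refers to that route.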

🛠️ How It Was Created

Inspired by @mlabonne's BigQwen2.5-Echo-47B-Instruct, this merge duplicates layers in the middle of the original Phi-4 14B (40-layer) model while keeping the first and last sections intact. Although the parameter count increased (14B -> 18B), the result is a structural duplicate: no new data was added. Orion 18B expands its internal architecture by roughly 23%; the widened mid-layers give it more breathing room, more depth, and more flexibility to reason, reflect, and respond (see the sanity-check sketch after the configuration).

⚙️ Merge Configuration

dtype: bfloat16
merge_method: passthrough
slices:
  # First 8 layers
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [0, 8]

  # Second 8 layers (8-15), with layers 8 and 13 each duplicated
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [8, 9]
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [8, 9]
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [9, 13]
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [13, 14]
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [13, 14]
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [14, 16]

  # Third 8 layers (16-23)
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [16, 24]
  # Third 8 layers (16-23) again: the whole block is duplicated
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [16, 24]
  # Fourth 8 layers
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [24, 32]

  # Fifth 8 layers
  - sources:
      - model: Orion-zhen/phi-4-abliterated
        layer_range: [32, 40]
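
As a sanity check on the slice list above, here is a small sketch that simply tallies the layer_range spans from the configuration (the ranges are copied verbatim; nothing here touches the weights):

# Sanity-check sketch: count the layers produced by the passthrough slices above.
ranges = [
    (0, 8),                 # first 8 layers
    (8, 9), (8, 9),         # layer 8, duplicated
    (9, 13),
    (13, 14), (13, 14),     # layer 13, duplicated
    (14, 16),
    (16, 24), (16, 24),     # layers 16-23, duplicated as a block
    (24, 32),               # fourth 8 layers
    (32, 40),               # fifth 8 layers
]

total_layers = sum(end - start for start, end in ranges)
print(total_layers)                       # 50 layers in the merged model, up from 40
print(f"{(total_layers - 40) / 40:.0%}")  # +25% depth over the base

Depth grows by 25% while the parameter count grows by roughly 23%, since the embedding and output layers are not duplicated. The configuration uses mergekit's passthrough method, so it should be reproducible with mergekit's standard YAML workflow.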