This is a 5.0bpw/h8 (5.0 bits per weight, 8-bit output head) quantized version of huihui-ai/QwQ-32B-abliterated, produced with exllamav2 with this PR applied.
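
Below is a minimal inference sketch using exllamav2's Python API, following the pattern from the exllamav2 example scripts. The model directory path is a placeholder, and the exact API surface may vary between exllamav2 versions; this assumes a build with the PR mentioned above installed.

```python
# Minimal sketch: load and run this EXL2 quant with exllamav2's Python API.
# Assumes the model files have been downloaded to a local directory
# (the path below is a placeholder) and that exllamav2 is installed.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "/path/to/QwQ-32B-abliterated-exl2-5.0bpw-h8"  # placeholder path

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # cache tensors allocated as layers load
model.load_autosplit(cache)               # split model across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
output = generator.generate(prompt="Why is the sky blue?", max_new_tokens=200)
print(output)
```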