Mistral-Quine-24B-GGUF (mistral-small-3.1-24b-instruct-2503-jackterated-GGUF)
This is an experimental version (text-only for now). For more information about the abliteration technique, refer to this notebook and check out @FailSpy.
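At a high level, abliteration estimates a "refusal direction" in the residual stream (the difference between mean activations on refused and non-refused prompts) and orthogonalizes the model's weights against it. Below is a minimal toy sketch of that projection step in numpy; the sizes and activations are placeholders, not the actual implementation from the linked notebook.

```python
import numpy as np

# Toy sizes: residual-stream width d, layer hidden width h (placeholders).
d, h = 8, 16
rng = np.random.default_rng(0)

# Mean residual-stream activations over prompts the model refuses vs. accepts.
# In practice these come from hooked forward passes; random here for illustration.
mean_refused = rng.normal(size=d)
mean_accepted = rng.normal(size=d)

# Refusal direction: normalized difference of the two means.
r = mean_refused - mean_accepted
r /= np.linalg.norm(r)

# A weight matrix that writes into the residual stream (d x h).
W = rng.normal(size=(d, h))

# Orthogonalize W against r so this layer can no longer write along the refusal direction.
W_abliterated = W - np.outer(r, r @ W)

# Sanity check: the ablated weights produce no component along r.
print(np.abs(r @ W_abliterated).max())  # ~0 up to floating-point error
```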
Available quantizations: 4-bit, 6-bit
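For local inference, a minimal llama-cpp-python sketch is shown below; the quant filename glob is an assumption, so check the repository's file list for the exact 4-bit or 6-bit GGUF you want.

```python
# Requires the llama-cpp-python and huggingface_hub packages.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="JackCloudman/mistral-small-3.1-24b-instruct-2503-jackterated-GGUF",
    filename="*Q4_K_M.gguf",  # hypothetical pattern; substitute an actual file from the repo
    n_ctx=8192,               # context window
    n_gpu_layers=-1,          # offload all layers to GPU when available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me a one-sentence summary of GGUF."}]
)
print(out["choices"][0]["message"]["content"])
```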
Model tree for JackCloudman/mistral-small-3.1-24b-instruct-2503-jackterated-GGUF
Base model: mistralai/Mistral-Small-3.1-24B-Base-2503