
Quantization

Quantized using the default exllamav2 (0.2.9) quantization process.
Original model: https://huggingface.co/TheDrummer/Star-Command-R-32B-v1
exllamav2: https://github.com/turboderp-org/exllamav2
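A 4.25 bpw / 8-bit-head EXL2 quant like this one is typically produced with exllamav2's convert.py script. The following is a hedged sketch only: the paths are placeholders, and the exact invocation used for this upload is an assumption based on the exllamav2 documentation.

```shell
# Sketch, not the uploader's exact command. Flags per exllamav2's convert.py:
#   -i   input model directory (FP16 safetensors)
#   -o   scratch/working directory for measurement and temp files
#   -cf  compiled output folder for the finished quant
#   -b   target bits per weight (the "4.25bpw" in the repo name)
#   -hb  bits for the head layer (the "h8" in the repo name)
python convert.py \
  -i /path/to/Star-Command-R-32B-v1 \
  -o /tmp/exl2-work \
  -cf /path/to/Star-Command-R-32B-v1-4.25bpw-h8-exl2 \
  -b 4.25 \
  -hb 8
```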

Original model card of Star-Command-R-32B-v1


Join our Discord! https://discord.gg/Nbv9pQ88Xb


BeaverAI proudly presents...

Star Command R 32B v1 🌟

An RP finetune of Command R 08-2024


Usage

  • Cohere Instruct format or Text Completion
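The Cohere Instruct format wraps each turn in the model's control tokens. A minimal sketch of building such a prompt by hand, assuming the standard Command R token names (`<BOS_TOKEN>`, `<|START_OF_TURN_TOKEN|>`, and friends); in practice the tokenizer's chat template handles this for you:

```python
def build_cohere_prompt(user_message: str, system_prompt: str = "") -> str:
    """Assemble a single-turn prompt in the Cohere/Command R instruct format.

    Token names assume the standard Command R tokenizer; verify against the
    chat template shipped with the model before relying on them.
    """
    parts = ["<BOS_TOKEN>"]
    if system_prompt:
        parts.append(
            f"<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>{system_prompt}<|END_OF_TURN_TOKEN|>"
        )
    parts.append(
        f"<|START_OF_TURN_TOKEN|><|USER_TOKEN|>{user_message}<|END_OF_TURN_TOKEN|>"
    )
    # Leave the chatbot turn open so the model generates the reply.
    parts.append("<|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>")
    return "".join(parts)

print(build_cohere_prompt("Hello!", system_prompt="You are a helpful assistant."))
```

Text Completion mode skips this wrapping entirely and feeds raw text, which some RP frontends prefer.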

Special Thanks

  • Mr. Gargle for the GPUs! Love you, brotha.

This quant: MetaphoricalCode/Star-Command-R-32B-v1-4.25bpw-h8-exl2