Quantization
Quantized to 4.25 bits per weight with an 8-bit head (h8) using the default exllamav2 (0.2.9) quantization process.
Original model: https://huggingface.co/TheDrummer/Star-Command-R-32B-v1
exllamav2: https://github.com/turboderp-org/exllamav2
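For reference, a quant with these settings (4.25 bpw, 8-bit head) can be produced with exllamav2's convert script. This is a sketch of the command, not the exact invocation used here: the paths are placeholders, and the flags should be checked against the exllamav2 README for your installed version.

```shell
# Placeholder paths; -b sets bits per weight and -hb sets head bits,
# matching the "4.25bpw-h8" naming of this quant.
python convert.py \
    -i /models/Star-Command-R-32B-v1 \
    -o /tmp/exl2-workdir \
    -cf /models/Star-Command-R-32B-v1-4.25bpw-h8-exl2 \
    -b 4.25 \
    -hb 8
```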
Original model card of Star-Command-R-32B-v1
Join our Discord! https://discord.gg/Nbv9pQ88Xb
BeaverAI proudly presents...
Star Command R 32B v1 π
An RP finetune of Command-R-8-2024
Links
- Original: https://huggingface.co/TheDrummer/Star-Command-R-32B-v1
- GGUF: https://huggingface.co/TheDrummer/Star-Command-R-32B-v1-GGUF
Usage
- Cohere Instruct format or Text Completion
Special Thanks
- Mr. Gargle for the GPUs! Love you, brotha.