This is a W8A8-FP8 quant created using llm-compressor, which can be loaded with vLLM.
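For reference, a minimal sketch of loading the quant with vLLM. The repo id is taken from this page; the tensor-parallel size and sampling settings are assumptions to adjust for your hardware.

```python
from vllm import LLM, SamplingParams

# Load the FP8 quant; vLLM picks up the compressed-tensors quantization
# config from the checkpoint. A model of this size typically needs several
# GPUs, so set tensor_parallel_size to your GPU count (2 is an assumption).
llm = LLM(
    model="aikitoria/c4ai-command-a-03-2025-FP8-Dynamic",
    tensor_parallel_size=2,
)

sampling_params = SamplingParams(temperature=0.7, max_tokens=256)

# Simple generation example.
outputs = llm.generate(["Write a short haiku about compression."], sampling_params)
for output in outputs:
    print(output.outputs[0].text)
```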
Base model: CohereForAI/c4ai-command-a-03-2025