---
library_name: ExLlama2
pipeline_tag: text-generation
base_model: google/shieldgemma-2b
base_model_relation: quantized
---
# EXL2 quants for [shieldgemma-2b](https://huggingface.co/google/shieldgemma-2b)

## Automatically quantized using the auto-quant script from [hf-scripts](https://huggingface.co/anthonyg5005/hf-scripts)

### BPW (bits per weight):
[4.25](https://huggingface.co/Anthonyg5005/shieldgemma-2b-exl2/tree/4.25bpw)\
[5.0](https://huggingface.co/Anthonyg5005/shieldgemma-2b-exl2/tree/5.0bpw)\
[6.0](https://huggingface.co/Anthonyg5005/shieldgemma-2b-exl2/tree/6.0bpw)\
[6.5](https://huggingface.co/Anthonyg5005/shieldgemma-2b-exl2/tree/6.5bpw)\
[8.0](https://huggingface.co/Anthonyg5005/shieldgemma-2b-exl2/tree/8.0bpw)\
[measurement.json](https://huggingface.co/Anthonyg5005/shieldgemma-2b-exl2/blob/main/measurement.json)
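
Each BPW above lives on its own branch of this repo. As a rough sketch (not part of the original card), one way to pull a single branch is with `huggingface_hub`; the repo ID and branch name below come from the links above, and the local directory name is just an example:

```python
# Sketch: download one EXL2 BPW branch of this repo with huggingface_hub.
# Branch names match the links above (e.g. "6.0bpw"); local_dir is arbitrary.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="Anthonyg5005/shieldgemma-2b-exl2",
    revision="6.0bpw",                       # any BPW branch listed above
    local_dir="shieldgemma-2b-exl2-6.0bpw",  # where to place the files
)
```

The downloaded folder can then be pointed at by any ExLlamaV2-compatible loader.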