Update README.md
README.md CHANGED
@@ -16,7 +16,7 @@ Here are some perplexity measurements:
 | Model | File size ↓ | PPL (wiki.text.raw) ↓ |
 | --- | --- | --- |
 | [iQ3_xs (bartowski)](https://huggingface.co/bartowski/google_gemma-3-12b-it-GGUF/blob/main/google_gemma-3-12b-it-IQ3_XS.gguf) | 5.21 GB | 10.0755 +/- 0.08024 |
-
+| [This model](https://huggingface.co/stduhpf/google-gemma-3-12b-it-qat-q4_0-gguf-small/blob/main/gemma-3-12b-it-q4_0_s.gguf) | 6.89 GB | 9.2637 +/- 0.07216 |
 | [Q4_0 (bartowski)](https://huggingface.co/bartowski/google_gemma-3-12b-it-GGUF/blob/main/google_gemma-3-12b-it-Q4_0.gguf) | 6.91 GB | 9.5589 +/- 0.07527 |
 | [QAT Q4_0 (google)](https://huggingface.co/google/gemma-3-12b-it-qat-q4_0-gguf/blob/main/gemma-3-12b-it-q4_0.gguf) | 8.07 GB | 9.2565 +/- 0.07212 |
 | [Q5_K_S (bartowski)](https://huggingface.co/bartowski/google_gemma-3-12b-it-GGUF/blob/main/google_gemma-3-12b-it-Q5_K_S.gguf) | 8.23 GB | 9.8540 +/- 0.08016 |
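
For context, perplexity numbers like the ones in the table above are typically produced with llama.cpp's `llama-perplexity` tool over a raw WikiText evaluation file. The commit itself does not describe the measurement setup, so the sketch below is only a hedged illustration of how the new row could be reproduced: the local `llama-perplexity` binary, the `wiki.test.raw` path, and the use of `hf_hub_download` are assumptions, not part of this change; the repo and filename come from the link in the added row.

```python
# Hedged sketch: reproduce one PPL row with llama.cpp's perplexity tool.
# Assumptions: llama.cpp is built locally and `llama-perplexity` is on PATH;
# the evaluation text (WikiText raw test split, shown as "wiki.text.raw" in
# the table header) has been downloaded separately.
import subprocess
from huggingface_hub import hf_hub_download

# Fetch the quantized GGUF referenced by the newly added table row.
model_path = hf_hub_download(
    repo_id="stduhpf/google-gemma-3-12b-it-qat-q4_0-gguf-small",
    filename="gemma-3-12b-it-q4_0_s.gguf",
)

eval_file = "wiki.test.raw"  # assumption: local path to the evaluation text

# Run the perplexity measurement; -m selects the model, -f the text file.
subprocess.run(
    ["llama-perplexity", "-m", model_path, "-f", eval_file],
    check=True,
)
```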