|
--- |
|
base_model: Qwen/Qwen1.5-0.5B |
|
language: |
|
- en |
|
license: other |
|
license_name: tongyi-qianwen-research |
|
license_link: https://huggingface.co/Qwen/Qwen1.5-0.5B/blob/main/LICENSE |
|
pipeline_tag: text-generation |
|
tags: |
|
- pretrained |
|
- llama-cpp |
|
- gguf-my-repo |
|
--- |
|
|
|
*Produced by [Antigma Labs](https://antigma.ai)* |
|
## llama.cpp quantization |
|
Using <a href="https://github.com/ggml-org/llama.cpp">llama.cpp</a> release <a href="https://github.com/ggml-org/llama.cpp/releases/tag/b5165">b5165</a> for quantization.
|
Original model: https://huggingface.co/Qwen/Qwen1.5-0.5B |
|
Run them directly with [llama.cpp](https://github.com/ggml-org/llama.cpp), or any other llama.cpp based project |
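Once a quant file is downloaded, a minimal invocation with the `llama-cli` binary from a recent llama.cpp build might look like this (a sketch, assuming llama.cpp is built and the GGUF file from the table below sits in the current directory; `-n` caps the number of generated tokens):

```shell
# Run the Q4_K_M quant with llama.cpp's CLI
# (assumes llama-cli is on PATH and the model file is in the working directory)
llama-cli -m qwen1.5-0.5b-q4_k_m.gguf \
  -p "The capital of France is" \
  -n 64 \
  --temp 0.7
```

Since this is a base model, the prompt is treated as plain text to be continued rather than a chat turn.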
|
## Prompt format |
|
Qwen1.5-0.5B is a base (pretrained) model, so it has no chat template; prompt it with plain text and let it continue:

```
{prompt}
```
|
## Download a file (not the whole branch) from below: |
|
| Filename | Quant type | File Size | Split | |
|
| -------- | ---------- | --------- | ----- | |
|
| [qwen1.5-0.5b-q4_k_m.gguf](https://huggingface.co/Brianpu/Qwen1.5-0.5B-GGUF/blob/main/qwen1.5-0.5b-q4_k_m.gguf)|Q4_K_M|0.38 GB|False| |
|
|
|
## Downloading using huggingface-cli |
|
<details> |
|
<summary>Click to view download instructions</summary> |
|
First, make sure you have huggingface-cli installed:
|
```
pip install -U "huggingface_hub[cli]"
```
|
Then, you can target the specific file you want: |
|
|
|
```
huggingface-cli download Brianpu/Qwen1.5-0.5B-GGUF --include "qwen1.5-0.5b-q4_k_m.gguf" --local-dir ./
```
|
If the model is bigger than 50GB, it will have been split into multiple files. In order to download them all to a local folder, run: |
|
|
|
```
huggingface-cli download Brianpu/Qwen1.5-0.5B-GGUF --include "qwen1.5-0.5b-q4_k_m.gguf/*" --local-dir ./
```
|
You can either specify a new local-dir (e.g. Qwen1.5-0.5B-GGUF) or download them all in place (./).
|
</details> |
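If you prefer the Python API over the CLI, the same file can be fetched with `hf_hub_download` from `huggingface_hub` (a sketch; the repo id and filename are taken from the table above, and the call requires network access):

```python
# Download a single GGUF file via the huggingface_hub Python API
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="Brianpu/Qwen1.5-0.5B-GGUF",
    filename="qwen1.5-0.5b-q4_k_m.gguf",
    local_dir="./",
)
print(path)  # local path to the downloaded file
```

This is equivalent to the `--include` form of the CLI command above but returns the resolved local path, which is convenient for passing straight to a loader.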
|
|