---
base_model: HuggingFaceTB/SmolLM2-1.7B-Instruct
language:
- en
- bg
license: apache-2.0
tags:
- text-generation-inference
- transformers
- llama
- trl
datasets:
- petkopetkov/oasst1_bg
---
# SmolLM2-1.7B-Instruct-Bulgarian
- **Developed by:** petkopetkov
- **License:** apache-2.0
- **Finetuned from model:** HuggingFaceTB/SmolLM2-1.7B-Instruct

SmolLM2-1.7B-Instruct fine-tuned on the OASST1 dataset translated into Bulgarian.
### Usage
First, install the Transformers library with:
```sh
pip install -U transformers
```
#### Run with the `pipeline` API
```python
import torch
from transformers import pipeline

# Load the fine-tuned model with the text-generation pipeline.
pipe = pipeline(
    "text-generation",
    model="petkopetkov/SmolLM2-1.7B-Instruct-bg",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Колко е 2 + 2?"  # "What is 2 + 2?"
print(pipe(prompt)[0]["generated_text"])
```
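
#### Run with `AutoModelForCausalLM`

For finer control over generation, here is a minimal sketch that loads the model and tokenizer directly. It assumes the tokenizer inherits the chat template from SmolLM2-1.7B-Instruct; the generation parameters shown are illustrative, not prescribed by this model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "petkopetkov/SmolLM2-1.7B-Instruct-bg"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Build a chat-style prompt; assumes the chat template from the base
# instruct model is available in this repository.
messages = [{"role": "user", "content": "Колко е 2 + 2?"}]  # "What is 2 + 2?"
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```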