|
---
library_name: transformers
license: apache-2.0
language:
- en
pipeline_tag: text-generation
base_model:
- explorewithai/Loxa-3B
model_size:
- 4.75B
tags:
- text-generation
- conversational
- language-model
- cpu
widget:
- text: "Hello"
---
|
|
|
|
|
# Model Card for Loxa |
|
|
|
**Model Name:** Loxa-3B |
|
|
|
**Model Family:** Loxa |
|
|
|
**Creator:** AIFRAME |
|
|
|
**Description:** Loxa-3B is a language model optimized to run on CPU-only hardware, particularly the Raspberry Pi 4 and 5 (8 GB+ RAM). It handles math, code, chat, general assistance, science, and formal conversation, achieving 92% overall accuracy.
|
|
|
**Capabilities:** |
|
|
|
* **Mathematics:** Solves problems, performs calculations, explains concepts. |
|
* **Code:** Generates code, understands/debugs existing code, provides explanations. |
|
* **Chat:** Engages in conversations, provides informative and helpful responses. |
|
* **Help:** Offers guidance and clear explanations across various topics. |
|
* **Science:** Discusses scientific topics, explains phenomena, provides insights. |
|
* **Formal Conversations:** Maintains formal etiquette and respectful language. |
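
These capabilities are exercised through the standard role/content chat-message format used by `transformers` text-generation pipelines. A small sketch of a multi-turn math exchange (the assistant turn here is a hand-written placeholder, not actual model output):

```python
# Multi-turn chat history in the role/content format the pipeline expects.
# The assistant reply below is an illustrative placeholder, not real model output.
messages = [
    {"role": "user", "content": "What is 12 * 7?"},
    {"role": "assistant", "content": "12 * 7 = 84."},
    {"role": "user", "content": "Now divide that by 4."},
]

# Every turn must carry a valid role and non-empty content
valid_roles = {"system", "user", "assistant"}
assert all(m["role"] in valid_roles and m["content"] for m in messages)
```

Passing the full history on each call is what lets the model resolve references like "that" in the final turn.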
|
|
|
**Performance:** |
|
|
|
* **Accuracy:** 92% overall accuracy.

* **Resource Usage:** Optimized for the Raspberry Pi 4/5 (8 GB+ RAM). Consult the documentation for detailed metrics.
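
On memory-constrained boards like the Pi, loading without the pipeline helper gives finer control over peak RAM. A minimal sketch, assuming the model exposes the standard `transformers` causal-LM interface; the dtype and memory settings are illustrative choices, not documented recommendations from the model authors:

```python
# Illustrative helpers for memory-conscious CPU loading. The dtype and
# low_cpu_mem_usage settings are assumptions, not official recommendations.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def load_for_cpu(model_id: str = "explorewithai/Loxa-3B"):
    """Load tokenizer and model with reduced peak RAM usage."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # half the memory of float32 weights
        low_cpu_mem_usage=True,      # avoid a second full copy of weights in RAM
    )
    return tokenizer, model


def chat(tokenizer, model, user_message: str, max_new_tokens: int = 128) -> str:
    """Run one chat turn and return the decoded reply."""
    messages = [{"role": "user", "content": user_message}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Slice off the prompt tokens so only the new reply is decoded
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Usage would look like `tokenizer, model = load_for_cpu()` followed by `print(chat(tokenizer, model, "Hello"))`; expect the first load to be slow on CPU-only hardware.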
|
|
|
**Intended Use:** Educational purposes, personal projects, embedded systems, resource-constrained environments. |
|
|
|
**Limitations:** |
|
|
|
May produce incorrect or nonsensical outputs; exercise caution for critical tasks. Performance may degrade with long or complex inputs. See the documentation for details on limitations and biases.
|
|
|
|
|
**How to Use:** See accompanying documentation for installation and usage instructions. |
|
|
|
|
|
## Code Example
|
|
|
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="explorewithai/Loxa-3B")

messages = [
    {"role": "user", "content": "Who are you?"},
]

result = pipe(messages)
print(result)
```
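
Generation behavior can be tuned through standard `pipeline` keyword arguments. The values below are illustrative starting points, not tuned recommendations for this model:

```python
# Illustrative decoding settings for the pipeline call above; these values are
# generic starting points, not recommendations from the model authors.
generation_kwargs = {
    "max_new_tokens": 256,  # cap the length of the generated reply
    "do_sample": True,      # sample instead of greedy decoding
    "temperature": 0.7,     # lower values make output more focused
    "top_p": 0.9,           # nucleus-sampling cutoff
}

# Usage (with `pipe` and `messages` from the example above):
# result = pipe(messages, **generation_kwargs)
```

Greedy decoding (`do_sample=False`) tends to suit math and code tasks, while sampling suits open-ended chat.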