---
language:
- en
- lug
tags:
- luganda
- english
- translation
- gemma
- lora
- peft
license: apache-2.0
base_model: Bronsn/gemma-9b-luganda-pretrained
---
# LoRA Adapters

This repository contains the LoRA adapter weights produced by fine-tuning Bronsn/gemma-9b-luganda-pretrained for Luganda-English translation.
## Details

- Base model: Bronsn/gemma-9b-luganda-pretrained
- Contains the LoRA adapter weights only (not the full model)
- Compatible with the PEFT library
## Configuration

The adapters were trained with the following LoRA configuration:

```python
from peft import LoraConfig

peft_config = LoraConfig(
    r=128,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj",
                    "embed_tokens", "lm_head"],
    lora_alpha=32,
    bias="none",
)
```

Note that `embed_tokens` and `lm_head` are included in `target_modules`, so the embedding and output layers receive low-rank updates in addition to the attention and MLP projections.