Sylheti-MBart

Model Overview

This is a fine-tuned facebook/mbart-large-50-many-to-many-mmt model for translation between Bangla and Sylheti, covering both single-word and full-sentence translations.

Training Info

  • Base Model: facebook/mbart-large-50-many-to-many-mmt
  • Dataset Entries: 262
  • Training Data Points: 499
  • Train Size: 449
  • Validation Size: 50
  • Epochs: 6
  • Batch Size: 1
  • Learning Rate: 1.5e-05
  • Final Training Loss: 9.2842
  • Final Validation Loss: 4.0429
  • Training Date: 2025-05-01 07:01:29
  • Environment: Google Colab (GPU)
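
As a rough illustration, the hyperparameters above map onto Hugging Face Seq2SeqTrainingArguments as sketched below. This is a hypothetical reconstruction: the actual training script is not published, so the output directory and any settings not listed above are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the run described in Training Info;
# values not listed there (output_dir, predict_with_generate) are assumptions.
training_args = Seq2SeqTrainingArguments(
    output_dir="sylheti-mbart",      # assumed path
    num_train_epochs=6,              # Epochs: 6
    per_device_train_batch_size=1,   # Batch Size: 1
    learning_rate=1.5e-5,            # Learning Rate: 1.5e-05
    predict_with_generate=True,      # assumed; typical for seq2seq fine-tuning
)
```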

How to Use

Load the model from Hugging Face (a minimal inference sketch follows the examples below):

  • Repository: kamilhussen24/Sylheti-MBart
  • Example (Bangla → Sylheti): input "কি সত্যি নাকি" → output "কিতা হাছা নি"
  • Example (Sylheti → Bangla): input "ইগু ফুরি নি" → output "এটা মেয়ে নাকি"

Contact

For issues, contact: kamilhussen24

Model Details

  • Format: Safetensors
  • Parameters: 611M
  • Tensor Type: F32