indian_translatorv1
This model is a fine-tuned version of Helsinki-NLP/opus-mt-en-hi on the indiantranslator dataset. It achieves the following results:
- Train Loss: 0.1119
- Epoch: 14
Model description
Solo female tourists in India often face communication barriers due to formal or outdated translations, which can lead to misunderstandings, frustration, and even safety concerns. This model is designed for them: it translates English text into colloquial Hindi. A minimal inference sketch is shown below.
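The sketch below assumes the checkpoint is loaded straight from the Hub with its TensorFlow weights; the example sentence and generation settings are illustrative only, not part of this card.

```python
# Minimal inference sketch: load the fine-tuned checkpoint and translate one
# English sentence into colloquial Hindi. The example sentence is illustrative.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "nashrah18/indian_translatorv1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Where is the nearest metro station?"
input_ids = tokenizer(text, return_tensors="tf").input_ids
output_ids = model.generate(input_ids, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```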
Training and evaluation data
- batch_size = 16
- learning_rate = 5e-5
- weight_decay = 0.01
- num_train_epochs = 15
Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
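
A minimal end-to-end training sketch under these settings follows. The two toy sentence pairs, the column names "english" and "hindi", and the max_length=128 truncation are assumptions for illustration only; the actual indiantranslator data and preprocessing are not described in this card.

```python
# Training sketch with the listed hyperparameters (batch_size=16, lr=5e-5,
# weight_decay=0.01, 15 epochs). The toy data below stands in for the real
# indiantranslator corpus, which this card does not describe.
from datasets import Dataset
from transformers import (AdamWeightDecay, AutoTokenizer,
                          DataCollatorForSeq2Seq, TFAutoModelForSeq2SeqLM)

base = "Helsinki-NLP/opus-mt-en-hi"
tokenizer = AutoTokenizer.from_pretrained(base)
model = TFAutoModelForSeq2SeqLM.from_pretrained(base)

# Placeholder parallel data; column names "english"/"hindi" are assumptions.
toy = Dataset.from_dict({
    "english": ["Where is the bus stop?", "How much does this cost?"],
    "hindi": ["बस स्टॉप कहाँ है?", "यह कितने का है?"],
})

def preprocess(batch):
    # Tokenize the English source and the colloquial Hindi target.
    model_inputs = tokenizer(batch["english"], truncation=True, max_length=128)
    labels = tokenizer(text_target=batch["hindi"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = toy.map(preprocess, batched=True, remove_columns=toy.column_names)

collator = DataCollatorForSeq2Seq(tokenizer, model=model, return_tensors="tf")
tf_train_set = model.prepare_tf_dataset(
    tokenized, batch_size=16, shuffle=True, collate_fn=collator,
    drop_remainder=False,  # keep the tiny toy batch; not needed for real data
)

optimizer = AdamWeightDecay(learning_rate=5e-5, weight_decay_rate=0.01)
model.compile(optimizer=optimizer)  # uses the model's built-in seq2seq loss
model.fit(tf_train_set, epochs=15)
```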
Training results
| Train Loss | Epoch |
|---|---|
| 2.7680 | 0 |
| 1.7672 | 1 |
| 1.2098 | 2 |
| 0.9267 | 3 |
| 0.6953 | 4 |
| 0.5534 | 5 |
| 0.4267 | 6 |
| 0.3309 | 7 |
| 0.2949 | 8 |
| 0.2394 | 9 |
| 0.2157 | 10 |
| 0.1717 | 11 |
| 0.1564 | 12 |
| 0.1278 | 13 |
| 0.1119 | 14 |
Framework versions
- Transformers 4.48.3
- TensorFlow 2.18.0
- Datasets 3.3.2
- Tokenizers 0.21.0
Base model
- Helsinki-NLP/opus-mt-en-hi