---
library_name: transformers
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-en-mr
tags:
- generated_from_trainer
model-index:
- name: english-marathi-colloquial-translator1
  results: []
---

# english-marathi-colloquial-translator1

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-mr](https://huggingface.co/Helsinki-NLP/opus-mt-en-mr) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5252

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged configuration sketch is given at the end of this card):
- learning_rate: 0.0003
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 15.4233       | 0.3636 | 2    | 7.9121          |
| 13.0588       | 0.7273 | 4    | 7.9121          |
| 8.4646        | 1.0    | 6    | 7.9121          |
| 4.474         | 1.3636 | 8    | 3.3942          |
| 1.7662        | 1.7273 | 10   | 0.6550          |
| 0.7863        | 2.0    | 12   | 0.5941          |
| 1.1458        | 2.3636 | 14   | 0.5240          |
| 0.8619        | 2.7273 | 16   | 0.4977          |
| 0.7108        | 3.0    | 18   | 0.4929          |
| 0.8382        | 3.3636 | 20   | 0.4984          |
| 0.5416        | 3.7273 | 22   | 0.4854          |
| 0.2726        | 4.0    | 24   | 0.4787          |
| 0.4561        | 4.3636 | 26   | 0.4806          |
| 0.5203        | 4.7273 | 28   | 0.4890          |
| 0.233         | 5.0    | 30   | 0.4989          |
| 0.4509        | 5.3636 | 32   | 0.4978          |
| 0.3384        | 5.7273 | 34   | 0.4999          |
| 0.1781        | 6.0    | 36   | 0.5054          |
| 0.3225        | 6.3636 | 38   | 0.5101          |
| 0.2737        | 6.7273 | 40   | 0.5159          |
| 0.0695        | 7.0    | 42   | 0.5239          |
| 0.2417        | 7.3636 | 44   | 0.5263          |
| 0.3237        | 7.7273 | 46   | 0.5253          |
| 0.1458        | 8.0    | 48   | 0.5249          |
| 0.2153        | 8.3636 | 50   | 0.5252          |

### Framework versions

- Transformers 4.47.1
- PyTorch 2.6.0+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0
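
The snippet below is a minimal sketch of a `Seq2SeqTrainingArguments` configuration that mirrors the hyperparameters listed under "Training hyperparameters" above. It is an assumed reconstruction, not the exact training script: the `output_dir` value is a placeholder, and dataset preparation plus the `Seq2SeqTrainer` call are omitted because the training data is not documented.

```python
# Assumed reconstruction of the training configuration; only the hyperparameters
# documented in this card are set explicitly.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="english-marathi-colloquial-translator1",  # placeholder output path
    learning_rate=3e-4,                 # learning_rate: 0.0003
    per_device_train_batch_size=4,      # train_batch_size: 4
    per_device_eval_batch_size=4,       # eval_batch_size: 4
    gradient_accumulation_steps=2,      # total_train_batch_size: 8
    lr_scheduler_type="linear",
    warmup_steps=2,                     # lr_scheduler_warmup_steps: 2
    num_train_epochs=10,
    seed=42,
    fp16=True,                          # mixed_precision_training: Native AMP
)
```

The default optimizer in this Transformers version is AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08, matching the values reported above, so no optimizer arguments need to be overridden.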
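
For inference, the checkpoint can be loaded like any other MarianMT-based translation model. This is a minimal sketch assuming the fine-tuned weights are available locally or on the Hub under the name used in this card; the example sentence is arbitrary.

```python
# Minimal inference sketch (the model id below is an assumption based on the
# card title; replace it with the actual local path or Hub repo id).
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "english-marathi-colloquial-translator1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate an English sentence to Marathi.
inputs = tokenizer("How are you doing today?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```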