# barto_exp1_10partition_modelo_asl3000
This model is a fine-tuned version of [vgaraujov/bart-base-spanish](https://huggingface.co/vgaraujov/bart-base-spanish) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.1432
- Model Preparation Time: 0.0031
- Bleu Msl: 0
- Bleu 1 Msl: 0
- Bleu 2 Msl: 0
- Bleu 3 Msl: 0
- Bleu 4 Msl: 0
- Ter Msl: 100
- Bleu Asl: 0
- Bleu 1 Asl: 0.9666
- Bleu 2 Asl: 0.9452
- Bleu 3 Asl: 0.9194
- Bleu 4 Asl: 0.8890
- Ter Asl: 3.9051
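
The BLEU and TER values above are standard corpus-level translation metrics (TER is an edit rate, so lower is better). The exact metric configuration and the MSL/ASL evaluation splits are not documented in this card, so the snippet below is only a hedged sketch of how such scores can be computed with the Hugging Face `evaluate` library, using placeholder predictions and references.

```python
# Illustrative sketch only: corpus BLEU and TER via the `evaluate` library.
# The predictions and references below are placeholders, not this model's data.
import evaluate

bleu = evaluate.load("bleu")  # reports overall BLEU plus per-order n-gram precisions
ter = evaluate.load("ter")    # translation edit rate (lower is better)

predictions = ["el gato duerme en la casa"]
references = [["el gato duerme en la casa"]]  # one list of references per prediction

print(bleu.compute(predictions=predictions, references=references))
print(ter.compute(predictions=predictions, references=references))
```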
## Model description
More information needed
## Intended uses & limitations
More information needed
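
Until usage details are added, the sketch below shows one plausible way to load the checkpoint with the Hugging Face Transformers seq2seq API. The repository id and the Spanish example input are assumptions, and the model's actual input/output format (for example, any gloss conventions) is not documented here.

```python
# Hedged loading/inference sketch. The repository id is assumed from this card;
# the example input and generation settings are placeholders.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "vania2911/barto_exp1_10partition_modelo_asl3000"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Hola, ¿cómo estás?"  # placeholder Spanish input
inputs = tokenizer(text, return_tensors="pt")
output_ids = model.generate(**inputs, num_beams=4, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```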
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
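
The list above maps directly onto Hugging Face Trainer settings. Below is a hedged reconstruction as `Seq2SeqTrainingArguments`: only the explicitly listed values are taken from this card, while `output_dir`, the per-epoch evaluation strategy, and `predict_with_generate` are assumptions.

```python
# Hedged reconstruction of the listed hyperparameters as Seq2SeqTrainingArguments.
# Values not in the card (output_dir, eval_strategy, predict_with_generate) are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="barto_exp1_10partition_modelo_asl3000",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",          # AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed-precision training
    eval_strategy="epoch",        # assumed: metrics below are reported once per epoch
    predict_with_generate=True,   # assumed: required for BLEU/TER at evaluation time
)
```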
### Training results
Training Loss | Epoch | Step | Validation Loss | Model Preparation Time | Bleu Msl | Bleu 1 Msl | Bleu 2 Msl | Bleu 3 Msl | Bleu 4 Msl | Ter Msl | Bleu Asl | Bleu 1 Asl | Bleu 2 Asl | Bleu 3 Asl | Bleu 4 Asl | Ter Asl |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 75 | 0.2051 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9471 | 0.9166 | 0.8870 | 0.8548 | 6.7832 |
No log | 2.0 | 150 | 0.1380 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9564 | 0.9316 | 0.9065 | 0.8757 | 5.1748 |
No log | 3.0 | 225 | 0.1438 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9603 | 0.9361 | 0.9100 | 0.8804 | 5.1748 |
No log | 4.0 | 300 | 0.1273 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9671 | 0.9470 | 0.9254 | 0.8985 | 3.9860 |
No log | 5.0 | 375 | 0.1163 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9730 | 0.9557 | 0.9358 | 0.9114 | 3.4266 |
No log | 6.0 | 450 | 0.1357 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9681 | 0.9474 | 0.9252 | 0.8985 | 3.9860 |
0.2716 | 7.0 | 525 | 0.1560 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9700 | 0.9521 | 0.9321 | 0.9069 | 3.6364 |
0.2716 | 8.0 | 600 | 0.1263 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9748 | 0.9585 | 0.9403 | 0.9171 | 2.9371 |
0.2716 | 9.0 | 675 | 0.1407 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9802 | 0.9663 | 0.9503 | 0.9300 | 2.3776 |
0.2716 | 10.0 | 750 | 0.1396 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9778 | 0.9633 | 0.9449 | 0.9217 | 2.7273 |
0.2716 | 11.0 | 825 | 0.1228 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9771 | 0.9636 | 0.9469 | 0.9263 | 3.0769 |
0.2716 | 12.0 | 900 | 0.1310 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9790 | 0.9661 | 0.9505 | 0.9301 | 2.5175 |
0.2716 | 13.0 | 975 | 0.1295 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9778 | 0.9648 | 0.9493 | 0.9296 | 2.7972 |
0.0167 | 14.0 | 1050 | 0.1349 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9766 | 0.9627 | 0.9461 | 0.9247 | 2.8671 |
0.0167 | 15.0 | 1125 | 0.1347 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9784 | 0.9643 | 0.9476 | 0.9266 | 2.8671 |
0.0167 | 16.0 | 1200 | 0.1330 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9772 | 0.9629 | 0.9455 | 0.9237 | 2.8671 |
0.0167 | 17.0 | 1275 | 0.1476 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9765 | 0.9624 | 0.9456 | 0.9244 | 3.0769 |
0.0167 | 18.0 | 1350 | 0.1475 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9754 | 0.9611 | 0.9441 | 0.9223 | 2.9371 |
0.0167 | 19.0 | 1425 | 0.1467 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9796 | 0.9660 | 0.9498 | 0.9290 | 2.4476 |
0.0061 | 20.0 | 1500 | 0.1371 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9778 | 0.9640 | 0.9468 | 0.9247 | 2.9371 |
0.0061 | 21.0 | 1575 | 0.1331 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9784 | 0.9647 | 0.9482 | 0.9271 | 2.7273 |
0.0061 | 22.0 | 1650 | 0.1366 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9771 | 0.9627 | 0.9449 | 0.9219 | 2.8671 |
0.0061 | 23.0 | 1725 | 0.1247 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9808 | 0.9684 | 0.9532 | 0.9337 | 2.3077 |
0.0061 | 24.0 | 1800 | 0.1282 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9808 | 0.9688 | 0.9537 | 0.9341 | 2.3776 |
0.0061 | 25.0 | 1875 | 0.1286 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9814 | 0.9698 | 0.9554 | 0.9366 | 2.3077 |
0.0061 | 26.0 | 1950 | 0.1291 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9814 | 0.9698 | 0.9553 | 0.9365 | 2.3077 |
0.0013 | 27.0 | 2025 | 0.1287 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9808 | 0.9688 | 0.9537 | 0.9344 | 2.3776 |
0.0013 | 28.0 | 2100 | 0.1303 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9802 | 0.9678 | 0.9521 | 0.9319 | 2.4476 |
0.0013 | 29.0 | 2175 | 0.1308 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9802 | 0.9678 | 0.9521 | 0.9319 | 2.4476 |
0.0013 | 30.0 | 2250 | 0.1309 | 0.0031 | 0 | 0 | 0 | 0 | 0 | 100 | 0 | 0.9802 | 0.9678 | 0.9521 | 0.9319 | 2.4476 |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1