# wav2vec2-xls-r-2b-faroese-100h-30-epochs_20250126
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-2b](https://huggingface.co/facebook/wav2vec2-xls-r-2b) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1114
- WER: 19.2228
- CER: 4.1352
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
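Two of the settings above interact: the effective batch size is train_batch_size × gradient_accumulation_steps = 16 × 2 = 32, and the learning rate follows a cosine decay after a 5,000-step warmup. A minimal stdlib sketch of that schedule, assuming linear warmup and roughly 61,000 total optimizer steps (taken from the last row of the training log; the exact count depends on dataset size):

```python
import math

def lr_at_step(step, base_lr=1e-4, warmup_steps=5000, total_steps=61000):
    """Cosine schedule with linear warmup, mirroring the hyperparameters above.

    `total_steps` is an assumption inferred from the training log below;
    it is not reported explicitly in the card.
    """
    if step < warmup_steps:
        # Linear warmup from 0 up to the peak learning rate.
        return base_lr * step / warmup_steps
    # Cosine decay from the peak down to 0 over the remaining steps.
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

peak = lr_at_step(5000)    # peak LR (1e-4) is reached at the end of warmup
mid = lr_at_step(33000)    # decaying
end = lr_at_step(61000)    # ~0 at the final step
```

The peak learning rate is hit exactly when warmup ends, which matches the early loss curve in the results table: validation loss drops fastest around steps 4,000–8,000.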
### Training results

Training Loss | Epoch | Step | Validation Loss | WER | CER |
---|---|---|---|---|---|
1.0663 | 0.4877 | 1000 | 0.4799 | 54.6284 | 15.5160 |
0.5134 | 0.9754 | 2000 | 0.2491 | 33.6080 | 8.6429 |
0.4176 | 1.4628 | 3000 | 0.2075 | 31.1187 | 7.9399 |
0.3923 | 1.9505 | 4000 | 0.2237 | 31.8632 | 8.1789 |
0.3497 | 2.4379 | 5000 | 0.2379 | 33.6388 | 8.6421 |
0.3539 | 2.9256 | 6000 | 0.2181 | 31.1715 | 8.1561 |
0.3228 | 3.4131 | 7000 | 0.2242 | 31.8016 | 8.2728 |
0.295 | 3.9008 | 8000 | 0.2056 | 30.0656 | 7.6677 |
0.2754 | 4.3882 | 9000 | 0.1886 | 28.7527 | 7.3079 |
0.2802 | 4.8759 | 10000 | 0.1936 | 28.7527 | 7.3607 |
0.2589 | 5.3633 | 11000 | 0.1863 | 27.8627 | 7.0507 |
0.2335 | 5.8510 | 12000 | 0.1745 | 27.9464 | 7.1193 |
0.1956 | 6.3385 | 13000 | 0.1745 | 27.5235 | 6.7982 |
0.2026 | 6.8261 | 14000 | 0.1618 | 26.7657 | 6.7532 |
0.1986 | 7.3136 | 15000 | 0.1658 | 26.6599 | 6.5962 |
0.1901 | 7.8013 | 16000 | 0.1740 | 26.7040 | 6.5820 |
0.1621 | 8.2887 | 17000 | 0.1668 | 25.8933 | 6.3958 |
0.1651 | 8.7764 | 18000 | 0.1534 | 25.6245 | 6.2979 |
0.1315 | 9.2638 | 19000 | 0.1583 | 25.3514 | 6.2411 |
0.1545 | 9.7515 | 20000 | 0.1635 | 24.7830 | 6.1228 |
0.1361 | 10.2390 | 21000 | 0.1524 | 25.3470 | 6.2443 |
0.126 | 10.7267 | 22000 | 0.1495 | 24.6200 | 5.9516 |
0.1291 | 11.2141 | 23000 | 0.1419 | 24.0913 | 5.8892 |
0.1212 | 11.7018 | 24000 | 0.1390 | 23.7961 | 5.6667 |
0.1097 | 12.1892 | 25000 | 0.1435 | 24.1089 | 5.7030 |
0.1132 | 12.6769 | 26000 | 0.1391 | 23.8446 | 5.6494 |
0.0948 | 13.1644 | 27000 | 0.1537 | 23.5273 | 5.6210 |
0.0934 | 13.6520 | 28000 | 0.1445 | 23.1264 | 5.4876 |
0.0854 | 14.1395 | 29000 | 0.1333 | 22.6153 | 5.2343 |
0.0878 | 14.6272 | 30000 | 0.1315 | 22.6638 | 5.3787 |
0.0805 | 15.1146 | 31000 | 0.1304 | 22.4964 | 5.2075 |
0.074 | 15.6023 | 32000 | 0.1295 | 22.4303 | 5.1602 |
0.0734 | 16.0897 | 33000 | 0.1279 | 22.2937 | 5.1404 |
0.0692 | 16.5774 | 34000 | 0.1234 | 22.0602 | 5.0773 |
0.0623 | 17.0649 | 35000 | 0.1309 | 21.7914 | 5.0284 |
0.0693 | 17.5525 | 36000 | 0.1233 | 21.6328 | 4.9448 |
0.057 | 18.0400 | 37000 | 0.1225 | 21.5050 | 4.8935 |
0.0479 | 18.5277 | 38000 | 0.1238 | 21.3288 | 4.8722 |
0.0478 | 19.0151 | 39000 | 0.1285 | 21.2715 | 4.8406 |
0.0483 | 19.5028 | 40000 | 0.1228 | 20.9499 | 4.7191 |
0.0435 | 19.9905 | 41000 | 0.1217 | 20.7516 | 4.6489 |
0.052 | 20.4779 | 42000 | 0.1170 | 20.5269 | 4.5795 |
0.0436 | 20.9656 | 43000 | 0.1179 | 20.5137 | 4.5913 |
0.0388 | 21.4531 | 44000 | 0.1196 | 20.3639 | 4.5211 |
0.0464 | 21.9407 | 45000 | 0.1123 | 20.2009 | 4.4800 |
0.0321 | 22.4282 | 46000 | 0.1151 | 20.0819 | 4.4619 |
0.0354 | 22.9159 | 47000 | 0.1143 | 19.8881 | 4.3893 |
0.033 | 23.4033 | 48000 | 0.1104 | 20.0555 | 4.4145 |
0.0315 | 23.8910 | 49000 | 0.1113 | 19.8352 | 4.3096 |
0.0301 | 24.3784 | 50000 | 0.1149 | 19.7207 | 4.3104 |
0.0272 | 24.8661 | 51000 | 0.1141 | 19.4563 | 4.2236 |
0.0323 | 25.3536 | 52000 | 0.1155 | 19.5092 | 4.2441 |
0.032 | 25.8413 | 53000 | 0.1115 | 19.4211 | 4.2062 |
0.0282 | 26.3287 | 54000 | 0.1113 | 19.3109 | 4.1613 |
0.0257 | 26.8164 | 55000 | 0.1118 | 19.3197 | 4.1542 |
0.023 | 27.3038 | 56000 | 0.1121 | 19.2581 | 4.1463 |
0.0305 | 27.7915 | 57000 | 0.1122 | 19.1743 | 4.1384 |
0.0287 | 28.2790 | 58000 | 0.1123 | 19.1964 | 4.1337 |
0.0252 | 28.7666 | 59000 | 0.1121 | 19.2316 | 4.1344 |
0.0325 | 29.2541 | 60000 | 0.1115 | 19.2228 | 4.1376 |
0.0374 | 29.7418 | 61000 | 0.1114 | 19.2228 | 4.1352 |
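The WER and CER columns above are percentages: edit distance (insertions + deletions + substitutions) between reference and hypothesis, divided by reference length, at the word and character level respectively. A minimal stdlib sketch (the Faroese phrase is a made-up example, not from the evaluation set):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def error_rate(refs, hyps, unit):
    """Corpus-level error rate in percent.

    `unit` splits an utterance into comparison units:
    str.split for WER, list for CER.
    """
    errors = total = 0
    for ref, hyp in zip(refs, hyps):
        ref_u, hyp_u = unit(ref), unit(hyp)
        errors += edit_distance(ref_u, hyp_u)
        total += len(ref_u)
    return 100.0 * errors / total

refs = ["góðan dagin"]
hyps = ["góðan dag"]
wer = error_rate(refs, hyps, str.split)  # 1 wrong word of 2 -> 50.0
cer = error_rate(refs, hyps, list)       # 2 wrong chars of 11 -> ~18.18
```

Note that corpus-level rates weight errors by the total reference length, so they are not a simple average of per-utterance rates.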
### Framework versions

- Transformers 4.48.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
## Model tree

- Fine-tuned model: davidilag/wav2vec2-xls-r-2b-faroese-100h-30-epochs_20250126
- Base model: [facebook/wav2vec2-xls-r-2b](https://huggingface.co/facebook/wav2vec2-xls-r-2b)
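Like other wav2vec2 fine-tunes, this model is trained with a CTC head: it emits per-frame logits over a character vocabulary plus a blank token, and a transcript is recovered by collapsing repeated labels and dropping blanks. A minimal greedy-decoding sketch with a hypothetical toy vocabulary (the real model ships its own tokenizer):

```python
def ctc_greedy_decode(frame_ids, blank_id=0):
    """Collapse a per-frame argmax sequence into labels:
    merge consecutive repeats, then drop the CTC blank token."""
    out = []
    prev = None
    for t in frame_ids:
        if t != prev and t != blank_id:
            out.append(t)
        prev = t
    return out

# Toy vocabulary for illustration only; blank_id=0 is an assumption here.
vocab = {0: "<pad>", 1: "f", 2: "a", 3: "r"}
frames = [1, 1, 0, 2, 2, 2, 0, 3, 3]   # per-frame argmax of the logits
ids = ctc_greedy_decode(frames)        # -> [1, 2, 3]
text = "".join(vocab[i] for i in ids)  # -> "far"
```

Greedy decoding is what a plain `argmax` over the logits gives; beam search with a language model can lower WER further but is outside the scope of this card.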