wav2vec2-large-xlsr-coraa-words-exp-4

This model is a fine-tuned version of Edresson/wav2vec2-large-xlsr-coraa-portuguese on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5520
  • WER: 0.3792
  • CER: 0.1634
  • PER: 0.3736
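
A minimal transcription sketch using the Hugging Face Transformers CTC interface is shown below. The checkpoint identifier, the example audio file, and the resampling step are illustrative assumptions; the card itself does not document an inference recipe.

```python
# Hedged inference sketch: replace "path/to/wav2vec2-large-xlsr-coraa-words-exp-4"
# with the actual location of this checkpoint (local directory or Hub id).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "path/to/wav2vec2-large-xlsr-coraa-words-exp-4"  # placeholder path
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load an audio file and resample to 16 kHz, the rate XLSR-based models expect.
waveform, sample_rate = torchaudio.load("example.wav")  # hypothetical file
if sample_rate != 16_000:
    waveform = torchaudio.transforms.Resample(sample_rate, 16_000)(waveform)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```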

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch in code follows the list):

  • learning_rate: 4e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine_with_restarts
  • num_epochs: 200
  • mixed_precision_training: Native AMP
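
The hyperparameters listed above map onto the standard Hugging Face Trainer API roughly as sketched below. This is an approximation for illustration, not the authors' actual training script; the output directory name and the fp16 flag are assumptions inferred from the card.

```python
# Approximate reconstruction of the training configuration above, assuming the
# standard Trainer API (Transformers 4.28). Model and dataset wiring is omitted.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-coraa-words-exp-4",  # assumed output directory
    learning_rate=4e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine_with_restarts",
    num_train_epochs=200,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```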

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    | PER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| 36.9896       | 0.98  | 21   | 11.5637         | 1.0    | 0.9606 | 1.0    |
| 36.9896       | 2.0   | 43   | 4.4906          | 1.0    | 0.9606 | 1.0    |
| 36.9896       | 2.98  | 64   | 3.8521          | 1.0    | 0.9606 | 1.0    |
| 36.9896       | 4.0   | 86   | 3.6001          | 1.0    | 0.9606 | 1.0    |
| 10.4512       | 4.98  | 107  | 3.3694          | 1.0    | 0.9606 | 1.0    |
| 10.4512       | 6.0   | 129  | 3.2463          | 1.0    | 0.9606 | 1.0    |
| 10.4512       | 6.98  | 150  | 3.1842          | 1.0    | 0.9606 | 1.0    |
| 10.4512       | 8.0   | 172  | 3.1160          | 1.0    | 0.9606 | 1.0    |
| 10.4512       | 8.98  | 193  | 3.0945          | 1.0    | 0.9606 | 1.0    |
| 3.0577        | 10.0  | 215  | 3.0865          | 1.0    | 0.9606 | 1.0    |
| 3.0577        | 10.98 | 236  | 3.0657          | 1.0    | 0.9606 | 1.0    |
| 3.0577        | 12.0  | 258  | 3.0591          | 1.0    | 0.9606 | 1.0    |
| 3.0577        | 12.98 | 279  | 3.0367          | 1.0    | 0.9606 | 1.0    |
| 2.9732        | 14.0  | 301  | 3.0339          | 1.0    | 0.9606 | 1.0    |
| 2.9732        | 14.98 | 322  | 3.0231          | 1.0    | 0.9606 | 1.0    |
| 2.9732        | 16.0  | 344  | 2.9633          | 1.0    | 0.9606 | 1.0    |
| 2.9732        | 16.98 | 365  | 2.8554          | 1.0    | 0.9606 | 1.0    |
| 2.9732        | 18.0  | 387  | 2.7109          | 1.0    | 0.9606 | 1.0    |
| 2.8449        | 18.98 | 408  | 2.3843          | 1.0    | 0.7400 | 1.0    |
| 2.8449        | 20.0  | 430  | 1.8336          | 1.0    | 0.5582 | 1.0    |
| 2.8449        | 20.98 | 451  | 1.3493          | 1.0    | 0.3643 | 1.0    |
| 2.8449        | 22.0  | 473  | 1.0668          | 0.9991 | 0.3247 | 0.9991 |
| 2.8449        | 22.98 | 494  | 0.8907          | 0.9991 | 0.3030 | 0.9991 |
| 1.5487        | 24.0  | 516  | 0.8101          | 0.9972 | 0.2923 | 0.9972 |
| 1.5487        | 24.98 | 537  | 0.7335          | 0.9802 | 0.2799 | 0.9792 |
| 1.5487        | 26.0  | 559  | 0.7279          | 0.9623 | 0.2740 | 0.9623 |
| 1.5487        | 26.98 | 580  | 0.6539          | 0.5264 | 0.1914 | 0.5236 |
| 0.5979        | 28.0  | 602  | 0.6419          | 0.4472 | 0.1821 | 0.4387 |
| 0.5979        | 28.98 | 623  | 0.6303          | 0.4179 | 0.1775 | 0.4085 |
| 0.5979        | 30.0  | 645  | 0.6278          | 0.4047 | 0.1739 | 0.3962 |
| 0.5979        | 30.98 | 666  | 0.6249          | 0.3896 | 0.1727 | 0.3830 |
| 0.5979        | 32.0  | 688  | 0.6010          | 0.3962 | 0.1700 | 0.3868 |
| 0.3815        | 32.98 | 709  | 0.5864          | 0.3943 | 0.1710 | 0.3868 |
| 0.3815        | 34.0  | 731  | 0.5773          | 0.3887 | 0.1703 | 0.3811 |
| 0.3815        | 34.98 | 752  | 0.5817          | 0.3792 | 0.1693 | 0.3708 |
| 0.3815        | 36.0  | 774  | 0.5911          | 0.3887 | 0.1723 | 0.3792 |
| 0.3815        | 36.98 | 795  | 0.5546          | 0.3887 | 0.1671 | 0.3792 |
| 0.3023        | 38.0  | 817  | 0.5548          | 0.3868 | 0.1672 | 0.3774 |
| 0.3023        | 38.98 | 838  | 0.5555          | 0.3821 | 0.1673 | 0.3745 |
| 0.3023        | 40.0  | 860  | 0.5582          | 0.3783 | 0.1676 | 0.3689 |
| 0.3023        | 40.98 | 881  | 0.5644          | 0.3896 | 0.1682 | 0.3811 |
| 0.2719        | 42.0  | 903  | 0.5822          | 0.3830 | 0.1673 | 0.3774 |
| 0.2719        | 42.98 | 924  | 0.5649          | 0.3811 | 0.1652 | 0.3717 |
| 0.2719        | 44.0  | 946  | 0.5553          | 0.3774 | 0.1641 | 0.3698 |
| 0.2719        | 44.98 | 967  | 0.5777          | 0.3745 | 0.1654 | 0.3660 |
| 0.2719        | 46.0  | 989  | 0.5868          | 0.3792 | 0.1664 | 0.3726 |
| 0.2202        | 46.98 | 1010 | 0.5546          | 0.3670 | 0.1626 | 0.3594 |
| 0.2202        | 48.0  | 1032 | 0.5724          | 0.3632 | 0.1624 | 0.3557 |
| 0.2202        | 48.98 | 1053 | 0.5851          | 0.3698 | 0.1657 | 0.3613 |
| 0.2202        | 50.0  | 1075 | 0.5671          | 0.3774 | 0.1637 | 0.3689 |
| 0.2202        | 50.98 | 1096 | 0.5720          | 0.3726 | 0.1651 | 0.3642 |
| 0.1995        | 52.0  | 1118 | 0.5672          | 0.3755 | 0.1644 | 0.3670 |
| 0.1995        | 52.98 | 1139 | 0.5520          | 0.3792 | 0.1634 | 0.3736 |
| 0.1995        | 54.0  | 1161 | 0.5687          | 0.3613 | 0.1617 | 0.3557 |
| 0.1995        | 54.98 | 1182 | 0.5597          | 0.3623 | 0.1598 | 0.3566 |
| 0.1802        | 56.0  | 1204 | 0.5828          | 0.3594 | 0.1605 | 0.3528 |
| 0.1802        | 56.98 | 1225 | 0.5753          | 0.3717 | 0.1636 | 0.3642 |
| 0.1802        | 58.0  | 1247 | 0.5592          | 0.3613 | 0.1627 | 0.3547 |
| 0.1802        | 58.98 | 1268 | 0.5694          | 0.3679 | 0.1650 | 0.3632 |
| 0.1802        | 60.0  | 1290 | 0.6006          | 0.3679 | 0.1640 | 0.3613 |
| 0.1642        | 60.98 | 1311 | 0.6042          | 0.3698 | 0.1644 | 0.3613 |
| 0.1642        | 62.0  | 1333 | 0.5754          | 0.3651 | 0.1633 | 0.3566 |
| 0.1642        | 62.98 | 1354 | 0.5793          | 0.3670 | 0.1630 | 0.3566 |
| 0.1642        | 64.0  | 1376 | 0.5941          | 0.3660 | 0.1627 | 0.3557 |
| 0.1642        | 64.98 | 1397 | 0.5825          | 0.3774 | 0.1624 | 0.3679 |
| 0.1466        | 66.0  | 1419 | 0.5996          | 0.3689 | 0.1627 | 0.3613 |
| 0.1466        | 66.98 | 1440 | 0.5777          | 0.3745 | 0.1631 | 0.3679 |
| 0.1466        | 68.0  | 1462 | 0.5910          | 0.3679 | 0.1620 | 0.3604 |
| 0.1466        | 68.98 | 1483 | 0.5821          | 0.3698 | 0.1626 | 0.3623 |
| 0.1398        | 70.0  | 1505 | 0.5960          | 0.3679 | 0.1643 | 0.3613 |
| 0.1398        | 70.98 | 1526 | 0.5922          | 0.3604 | 0.1640 | 0.3519 |
| 0.1398        | 72.0  | 1548 | 0.5987          | 0.3698 | 0.1633 | 0.3623 |
| 0.1398        | 72.98 | 1569 | 0.6110          | 0.3736 | 0.1654 | 0.3651 |
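
For reference, word- and character-error rates of the kind reported in the table above can be computed with the evaluate library. This is an assumption about tooling, since the card does not state how the metrics were produced, and the example strings are made up; PER is omitted because its exact definition here is not documented.

```python
# Hedged example of computing WER/CER on a pair of hypothetical transcripts.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["ola mundo"]   # hypothetical model output
references = ["olá mundo"]    # hypothetical reference transcript

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```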

Framework versions

  • Transformers 4.28.0
  • PyTorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3