wav2vec2-large-xlsr-facebook-300m-texts-exp-1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics are computed follows the list):

  • Loss: 0.7400
  • WER: 0.5455
  • CER: 0.2311
  • PER: 0.5289
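
As a point of reference, word and character error rates of the kind reported above can be computed with the jiwer library; the reference and hypothesis strings below are invented for illustration, and the phonemizer used to produce this model's PER is not documented in the card.

```python
# A sketch of how WER/CER of the kind reported above can be computed with
# jiwer; the reference and hypothesis strings are invented examples.
import jiwer

reference = "the quick brown fox jumps over the lazy dog"
hypothesis = "the quick brown fox jump over a lazy dog"

print(f"WER: {jiwer.wer(reference, hypothesis):.4f}")  # word-level edit distance
print(f"CER: {jiwer.cer(reference, hypothesis):.4f}")  # character-level edit distance

# PER (phoneme error rate) is the same edit-distance ratio computed over
# phoneme sequences instead of words; given space-separated phoneme strings,
# jiwer.wer applies directly. These phoneme transcripts are invented too.
ref_phones = "dh ah k ae t"
hyp_phones = "dh ah k ah t"
print(f"PER: {jiwer.wer(ref_phones, hyp_phones):.4f}")
```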

Model description

More information needed

Intended uses & limitations

More information needed
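
While no usage notes are documented, the checkpoint is a wav2vec2 model fine-tuned with CTC-style error-rate metrics, so transcription can be attempted as in the minimal sketch below. The repository path and audio file name are placeholders, not documented values.

```python
# A minimal inference sketch, assuming this checkpoint is a standard CTC-based
# wav2vec2 ASR model. The model id and "sample.wav" are placeholders.
import torch
import torchaudio
from transformers import AutoModelForCTC, AutoProcessor

model_id = "your-namespace/wav2vec2-large-xlsr-facebook-300m-texts-exp-1"  # placeholder
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate expected by XLS-R models.
waveform, sample_rate = torchaudio.load("sample.wav")
waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

# Use the first channel; the processor expects a 1-D array of samples.
inputs = processor(waveform[0].numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```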

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 200
  • mixed_precision_training: Native AMP
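
These settings correspond roughly to the following Hugging Face Trainer configuration. This is a reconstruction for illustration, not the original training script; output_dir is an assumption.

```python
# A rough reconstruction of the hyperparameters above with the Trainer API;
# output_dir is assumed, everything else mirrors the list.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-large-xlsr-facebook-300m-texts-exp-1",  # assumed name
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 x 2 = total train batch size 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=200,
    fp16=True,  # "Native AMP" mixed-precision training
)
```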

Training results

| Training Loss | Epoch | Step | Validation Loss | WER | CER | PER |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| 38.9028 | 1.0 | 14 | 18.5048 | 1.0 | 0.9671 | 1.0 |
| 38.9028 | 2.0 | 28 | 9.8810 | 1.0 | 0.9671 | 1.0 |
| 38.9028 | 3.0 | 42 | 7.7650 | 1.0 | 0.9671 | 1.0 |
| 38.9028 | 4.0 | 56 | 6.9159 | 1.0 | 0.9671 | 1.0 |
| 38.9028 | 5.0 | 70 | 6.3930 | 1.0 | 0.9671 | 1.0 |
| 38.9028 | 6.0 | 84 | 5.9372 | 1.0 | 0.9671 | 1.0 |
| 38.9028 | 7.0 | 98 | 5.5319 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 8.0 | 112 | 5.1820 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 9.0 | 126 | 4.8767 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 10.0 | 140 | 4.6011 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 11.0 | 154 | 4.3649 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 12.0 | 168 | 4.1568 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 13.0 | 182 | 3.9743 | 1.0 | 0.9671 | 1.0 |
| 12.9708 | 14.0 | 196 | 3.8195 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 15.0 | 210 | 3.6889 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 16.0 | 224 | 3.5786 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 17.0 | 238 | 3.4828 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 18.0 | 252 | 3.4061 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 19.0 | 266 | 3.3420 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 20.0 | 280 | 3.2863 | 1.0 | 0.9671 | 1.0 |
| 4.98 | 21.0 | 294 | 3.2379 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 22.0 | 308 | 3.1723 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 23.0 | 322 | 3.1202 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 24.0 | 336 | 3.0854 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 25.0 | 350 | 3.0373 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 26.0 | 364 | 3.0513 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 27.0 | 378 | 3.0163 | 1.0 | 0.9671 | 1.0 |
| 3.5425 | 28.0 | 392 | 2.9870 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 29.0 | 406 | 2.9741 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 30.0 | 420 | 2.9698 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 31.0 | 434 | 2.9717 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 32.0 | 448 | 2.9575 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 33.0 | 462 | 2.9537 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 34.0 | 476 | 2.9581 | 1.0 | 0.9671 | 1.0 |
| 3.0988 | 35.0 | 490 | 2.9491 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 36.0 | 504 | 2.9414 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 37.0 | 518 | 2.9396 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 38.0 | 532 | 2.9386 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 39.0 | 546 | 2.9313 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 40.0 | 560 | 2.9585 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 41.0 | 574 | 2.9350 | 1.0 | 0.9671 | 1.0 |
| 2.9753 | 42.0 | 588 | 2.9248 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 43.0 | 602 | 2.9306 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 44.0 | 616 | 2.9244 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 45.0 | 630 | 2.9234 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 46.0 | 644 | 2.9226 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 47.0 | 658 | 2.9173 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 48.0 | 672 | 2.9177 | 1.0 | 0.9671 | 1.0 |
| 2.9468 | 49.0 | 686 | 2.9207 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 50.0 | 700 | 2.9186 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 51.0 | 714 | 2.9123 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 52.0 | 728 | 2.9160 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 53.0 | 742 | 2.9112 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 54.0 | 756 | 2.8988 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 55.0 | 770 | 2.9045 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 56.0 | 784 | 2.9132 | 1.0 | 0.9671 | 1.0 |
| 2.9258 | 57.0 | 798 | 2.8828 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 58.0 | 812 | 2.8750 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 59.0 | 826 | 2.8658 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 60.0 | 840 | 2.8470 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 61.0 | 854 | 2.8315 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 62.0 | 868 | 2.8104 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 63.0 | 882 | 2.7643 | 1.0 | 0.9671 | 1.0 |
| 2.9114 | 64.0 | 896 | 2.7412 | 1.0 | 0.9671 | 1.0 |
| 2.8525 | 65.0 | 910 | 2.7178 | 1.0 | 0.9671 | 1.0 |
| 2.8525 | 66.0 | 924 | 2.6887 | 1.0 | 0.9668 | 1.0 |
| 2.8525 | 67.0 | 938 | 2.6267 | 1.0 | 0.9670 | 1.0 |
| 2.8525 | 68.0 | 952 | 2.5971 | 0.9995 | 0.9575 | 0.9995 |
| 2.8525 | 69.0 | 966 | 2.5567 | 0.9997 | 0.9243 | 0.9997 |
| 2.8525 | 70.0 | 980 | 2.5002 | 1.0 | 0.8266 | 1.0 |
| 2.8525 | 71.0 | 994 | 2.4290 | 1.0 | 0.7585 | 1.0 |
| 2.6546 | 72.0 | 1008 | 2.3754 | 1.0 | 0.7096 | 1.0 |
| 2.6546 | 73.0 | 1022 | 2.3152 | 1.0 | 0.6939 | 1.0 |
| 2.6546 | 74.0 | 1036 | 2.2623 | 1.0 | 0.6643 | 1.0 |
| 2.6546 | 75.0 | 1050 | 2.1567 | 1.0 | 0.6488 | 1.0 |
| 2.6546 | 76.0 | 1064 | 2.0693 | 1.0 | 0.6107 | 1.0 |
| 2.6546 | 77.0 | 1078 | 1.9740 | 1.0 | 0.5911 | 1.0 |
| 2.6546 | 78.0 | 1092 | 1.9123 | 0.9997 | 0.5685 | 0.9997 |
| 2.2421 | 79.0 | 1106 | 1.8264 | 1.0 | 0.5500 | 1.0 |
| 2.2421 | 80.0 | 1120 | 1.7678 | 0.9997 | 0.5253 | 0.9997 |
| 2.2421 | 81.0 | 1134 | 1.6755 | 1.0 | 0.5180 | 1.0 |
| 2.2421 | 82.0 | 1148 | 1.6509 | 1.0 | 0.4988 | 1.0 |
| 2.2421 | 83.0 | 1162 | 1.5905 | 1.0 | 0.4927 | 1.0 |
| 2.2421 | 84.0 | 1176 | 1.5812 | 1.0 | 0.4782 | 1.0 |
| 2.2421 | 85.0 | 1190 | 1.4685 | 1.0 | 0.4804 | 1.0 |
| 1.7467 | 86.0 | 1204 | 1.4286 | 1.0 | 0.4693 | 1.0 |
| 1.7467 | 87.0 | 1218 | 1.4161 | 0.9995 | 0.4555 | 0.9995 |
| 1.7467 | 88.0 | 1232 | 1.3703 | 1.0 | 0.4503 | 1.0 |
| 1.7467 | 89.0 | 1246 | 1.3651 | 0.9980 | 0.4382 | 0.9980 |
| 1.7467 | 90.0 | 1260 | 1.3306 | 0.9970 | 0.4315 | 0.9967 |
| 1.7467 | 91.0 | 1274 | 1.2882 | 0.9957 | 0.4274 | 0.9954 |
| 1.7467 | 92.0 | 1288 | 1.2564 | 0.9939 | 0.4210 | 0.9937 |
| 1.4044 | 93.0 | 1302 | 1.2472 | 0.9726 | 0.3998 | 0.9695 |
| 1.4044 | 94.0 | 1316 | 1.2310 | 0.9518 | 0.3813 | 0.9464 |
| 1.4044 | 95.0 | 1330 | 1.1658 | 0.9563 | 0.3799 | 0.9528 |
| 1.4044 | 96.0 | 1344 | 1.1555 | 0.9317 | 0.3697 | 0.9264 |
| 1.4044 | 97.0 | 1358 | 1.1608 | 0.8918 | 0.3573 | 0.8865 |
| 1.4044 | 98.0 | 1372 | 1.1363 | 0.8720 | 0.3483 | 0.8652 |
| 1.4044 | 99.0 | 1386 | 1.1085 | 0.8479 | 0.3379 | 0.8405 |
| 1.191 | 100.0 | 1400 | 1.0848 | 0.8489 | 0.3348 | 0.8426 |
| 1.191 | 101.0 | 1414 | 1.0706 | 0.8179 | 0.3252 | 0.8106 |
| 1.191 | 102.0 | 1428 | 1.0839 | 0.7905 | 0.3212 | 0.7824 |
| 1.191 | 103.0 | 1442 | 1.0419 | 0.8187 | 0.3210 | 0.8121 |
| 1.191 | 104.0 | 1456 | 1.0630 | 0.7692 | 0.3141 | 0.7593 |
| 1.191 | 105.0 | 1470 | 1.0117 | 0.7831 | 0.3122 | 0.7743 |
| 1.191 | 106.0 | 1484 | 1.0035 | 0.7633 | 0.3064 | 0.7529 |
| 1.191 | 107.0 | 1498 | 1.0035 | 0.7463 | 0.3017 | 0.7364 |
| 1.0047 | 108.0 | 1512 | 0.9954 | 0.7395 | 0.2974 | 0.7288 |
| 1.0047 | 109.0 | 1526 | 0.9830 | 0.7438 | 0.2979 | 0.7349 |
| 1.0047 | 110.0 | 1540 | 0.9699 | 0.7219 | 0.2917 | 0.7115 |
| 1.0047 | 111.0 | 1554 | 0.9626 | 0.7184 | 0.2901 | 0.7075 |
| 1.0047 | 112.0 | 1568 | 0.9367 | 0.7176 | 0.2896 | 0.7075 |
| 1.0047 | 113.0 | 1582 | 0.9550 | 0.7057 | 0.2838 | 0.6960 |
| 1.0047 | 114.0 | 1596 | 0.9140 | 0.6872 | 0.2769 | 0.6790 |
| 0.9031 | 115.0 | 1610 | 0.9214 | 0.6851 | 0.2781 | 0.6752 |
| 0.9031 | 116.0 | 1624 | 0.9040 | 0.6877 | 0.2786 | 0.6767 |
| 0.9031 | 117.0 | 1638 | 0.9107 | 0.6920 | 0.2811 | 0.6813 |
| 0.9031 | 118.0 | 1652 | 0.8947 | 0.6699 | 0.2743 | 0.6585 |
| 0.9031 | 119.0 | 1666 | 0.8975 | 0.6595 | 0.2715 | 0.6480 |
| 0.9031 | 120.0 | 1680 | 0.8817 | 0.6750 | 0.2729 | 0.6646 |
| 0.9031 | 121.0 | 1694 | 0.8686 | 0.6529 | 0.2670 | 0.6412 |
| 0.8165 | 122.0 | 1708 | 0.8622 | 0.6442 | 0.2653 | 0.6323 |
| 0.8165 | 123.0 | 1722 | 0.8741 | 0.6437 | 0.2653 | 0.6320 |
| 0.8165 | 124.0 | 1736 | 0.8656 | 0.6399 | 0.2636 | 0.6282 |
| 0.8165 | 125.0 | 1750 | 0.8582 | 0.6366 | 0.2628 | 0.6249 |
| 0.8165 | 126.0 | 1764 | 0.8547 | 0.6366 | 0.2624 | 0.6249 |
| 0.8165 | 127.0 | 1778 | 0.8515 | 0.6275 | 0.2606 | 0.6158 |
| 0.8165 | 128.0 | 1792 | 0.8515 | 0.6272 | 0.2603 | 0.6153 |
| 0.7588 | 129.0 | 1806 | 0.8351 | 0.6204 | 0.2565 | 0.6089 |
| 0.7588 | 130.0 | 1820 | 0.8311 | 0.6216 | 0.2573 | 0.6089 |
| 0.7588 | 131.0 | 1834 | 0.8403 | 0.6188 | 0.2579 | 0.6061 |
| 0.7588 | 132.0 | 1848 | 0.8281 | 0.6153 | 0.2562 | 0.6036 |
| 0.7588 | 133.0 | 1862 | 0.8210 | 0.6130 | 0.2560 | 0.6008 |
| 0.7588 | 134.0 | 1876 | 0.8312 | 0.6107 | 0.2566 | 0.5990 |
| 0.7588 | 135.0 | 1890 | 0.8171 | 0.6122 | 0.2547 | 0.6013 |
| 0.723 | 136.0 | 1904 | 0.8264 | 0.6056 | 0.2551 | 0.5940 |
| 0.723 | 137.0 | 1918 | 0.8117 | 0.6084 | 0.2540 | 0.5957 |
| 0.723 | 138.0 | 1932 | 0.8066 | 0.6018 | 0.2511 | 0.5886 |
| 0.723 | 139.0 | 1946 | 0.8022 | 0.6023 | 0.2525 | 0.5899 |
| 0.723 | 140.0 | 1960 | 0.7986 | 0.6034 | 0.2518 | 0.5912 |
| 0.723 | 141.0 | 1974 | 0.7937 | 0.5945 | 0.2486 | 0.5820 |
| 0.723 | 142.0 | 1988 | 0.8014 | 0.5950 | 0.2504 | 0.5828 |
| 0.6458 | 143.0 | 2002 | 0.7933 | 0.5907 | 0.2492 | 0.5759 |
| 0.6458 | 144.0 | 2016 | 0.7918 | 0.5922 | 0.2478 | 0.5777 |
| 0.6458 | 145.0 | 2030 | 0.7814 | 0.5876 | 0.2456 | 0.5734 |
| 0.6458 | 146.0 | 2044 | 0.7851 | 0.5866 | 0.2455 | 0.5716 |
| 0.6458 | 147.0 | 2058 | 0.7908 | 0.5884 | 0.2462 | 0.5741 |
| 0.6458 | 148.0 | 2072 | 0.7968 | 0.5894 | 0.2475 | 0.5749 |
| 0.6458 | 149.0 | 2086 | 0.7879 | 0.5851 | 0.2458 | 0.5698 |
| 0.6326 | 150.0 | 2100 | 0.7997 | 0.5835 | 0.2466 | 0.5681 |
| 0.6326 | 151.0 | 2114 | 0.7859 | 0.5841 | 0.2454 | 0.5683 |
| 0.6326 | 152.0 | 2128 | 0.7836 | 0.5785 | 0.2431 | 0.5620 |
| 0.6326 | 153.0 | 2142 | 0.7778 | 0.5769 | 0.2408 | 0.5617 |
| 0.6326 | 154.0 | 2156 | 0.7653 | 0.5731 | 0.2394 | 0.5574 |
| 0.6326 | 155.0 | 2170 | 0.7690 | 0.5790 | 0.2407 | 0.5627 |
| 0.6326 | 156.0 | 2184 | 0.7688 | 0.5731 | 0.2389 | 0.5574 |
| 0.6326 | 157.0 | 2198 | 0.7631 | 0.5703 | 0.2383 | 0.5546 |
| 0.583 | 158.0 | 2212 | 0.7584 | 0.5744 | 0.2396 | 0.5587 |
| 0.583 | 159.0 | 2226 | 0.7607 | 0.5691 | 0.2385 | 0.5531 |
| 0.583 | 160.0 | 2240 | 0.7651 | 0.5675 | 0.2381 | 0.5521 |
| 0.583 | 161.0 | 2254 | 0.7572 | 0.5625 | 0.2372 | 0.5467 |
| 0.583 | 162.0 | 2268 | 0.7561 | 0.5637 | 0.2377 | 0.5475 |
| 0.583 | 163.0 | 2282 | 0.7570 | 0.5602 | 0.2354 | 0.5434 |
| 0.583 | 164.0 | 2296 | 0.7545 | 0.5571 | 0.2341 | 0.5411 |
| 0.5626 | 165.0 | 2310 | 0.7548 | 0.5582 | 0.2350 | 0.5416 |
| 0.5626 | 166.0 | 2324 | 0.7436 | 0.5561 | 0.2324 | 0.5391 |
| 0.5626 | 167.0 | 2338 | 0.7545 | 0.5546 | 0.2335 | 0.5378 |
| 0.5626 | 168.0 | 2352 | 0.7469 | 0.5584 | 0.2342 | 0.5416 |
| 0.5626 | 169.0 | 2366 | 0.7503 | 0.5536 | 0.2333 | 0.5371 |
| 0.5626 | 170.0 | 2380 | 0.7522 | 0.5472 | 0.2325 | 0.5305 |
| 0.5626 | 171.0 | 2394 | 0.7434 | 0.5508 | 0.2321 | 0.5348 |
| 0.5491 | 172.0 | 2408 | 0.7462 | 0.5495 | 0.2328 | 0.5328 |
| 0.5491 | 173.0 | 2422 | 0.7524 | 0.5482 | 0.2322 | 0.5317 |
| 0.5491 | 174.0 | 2436 | 0.7400 | 0.5455 | 0.2311 | 0.5289 |
| 0.5491 | 175.0 | 2450 | 0.7447 | 0.5510 | 0.2334 | 0.5333 |
| 0.5491 | 176.0 | 2464 | 0.7500 | 0.5500 | 0.2328 | 0.5320 |
| 0.5491 | 177.0 | 2478 | 0.7471 | 0.5498 | 0.2331 | 0.5322 |
| 0.5491 | 178.0 | 2492 | 0.7496 | 0.5513 | 0.2328 | 0.5340 |
| 0.5339 | 179.0 | 2506 | 0.7521 | 0.5505 | 0.2322 | 0.5330 |
| 0.5339 | 180.0 | 2520 | 0.7461 | 0.5467 | 0.2315 | 0.5289 |
| 0.5339 | 181.0 | 2534 | 0.7466 | 0.5460 | 0.2312 | 0.5289 |
| 0.5339 | 182.0 | 2548 | 0.7436 | 0.5470 | 0.2312 | 0.5305 |
| 0.5339 | 183.0 | 2562 | 0.7439 | 0.5427 | 0.2308 | 0.5254 |
| 0.5339 | 184.0 | 2576 | 0.7428 | 0.5434 | 0.2309 | 0.5267 |
| 0.5339 | 185.0 | 2590 | 0.7456 | 0.5439 | 0.2309 | 0.5267 |
| 0.5208 | 186.0 | 2604 | 0.7444 | 0.5449 | 0.2308 | 0.5272 |
| 0.5208 | 187.0 | 2618 | 0.7438 | 0.5444 | 0.2306 | 0.5269 |
| 0.5208 | 188.0 | 2632 | 0.7487 | 0.5439 | 0.2310 | 0.5272 |
| 0.5208 | 189.0 | 2646 | 0.7471 | 0.5422 | 0.2300 | 0.5254 |
| 0.5208 | 190.0 | 2660 | 0.7436 | 0.5422 | 0.2304 | 0.5249 |
| 0.5208 | 191.0 | 2674 | 0.7428 | 0.5414 | 0.2299 | 0.5236 |
| 0.5208 | 192.0 | 2688 | 0.7454 | 0.5414 | 0.2295 | 0.5239 |
| 0.5432 | 193.0 | 2702 | 0.7460 | 0.5442 | 0.2299 | 0.5267 |
| 0.5432 | 194.0 | 2716 | 0.7450 | 0.5409 | 0.2295 | 0.5231 |

Framework versions

  • Transformers 4.28.0
  • PyTorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.13.3