helldivers2-jarvis-asr

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 59.4560
  • WER: 0.2597
  • CER: 0.8461
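
Below is a minimal usage sketch for transcribing a clip with this checkpoint. It assumes standard Wav2Vec2 CTC inference with 16 kHz mono audio; the audio file name is a placeholder, and the repo id is `8688chris/helldivers2-jarvis-asr`.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Load the fine-tuned checkpoint and its processor (feature extractor + tokenizer).
processor = Wav2Vec2Processor.from_pretrained("8688chris/helldivers2-jarvis-asr")
model = Wav2Vec2ForCTC.from_pretrained("8688chris/helldivers2-jarvis-asr")
model.eval()

# "command.wav" is a placeholder; wav2vec2-base-960h expects 16 kHz mono input.
speech, _ = librosa.load("command.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```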

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 12
  • eval_batch_size: 12
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 35
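
A hedged sketch of how these hyperparameters map onto a `TrainingArguments` configuration is shown below; `output_dir` and the per-epoch evaluation strategy are assumptions, not taken from the original card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="helldivers2-jarvis-asr",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=35,
    eval_strategy="epoch",  # assumed; matches the per-epoch rows in the results table
)
```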

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|---------------|-------|------|-----------------|--------|--------|
| 592.8701      | 1.0   | 33   | 309.4361        | 0.4890 | 0.8633 |
| 451.2683      | 2.0   | 66   | 203.9322        | 0.3950 | 0.8575 |
| 379.4145      | 3.0   | 99   | 167.4042        | 0.3591 | 0.8544 |
| 336.9402      | 4.0   | 132  | 150.8170        | 0.3398 | 0.8525 |
| 321.581       | 5.0   | 165  | 134.3422        | 0.3232 | 0.8515 |
| 286.1988      | 6.0   | 198  | 118.7198        | 0.3232 | 0.8504 |
| 275.0421      | 7.0   | 231  | 98.2645         | 0.3232 | 0.8503 |
| 271.4439      | 8.0   | 264  | 96.4090         | 0.3149 | 0.8501 |
| 257.0019      | 9.0   | 297  | 90.3913         | 0.3094 | 0.8490 |
| 237.0949      | 10.0  | 330  | 86.4160         | 0.2928 | 0.8484 |
| 245.8697      | 11.0  | 363  | 86.0826         | 0.2901 | 0.8485 |
| 237.8742      | 12.0  | 396  | 84.3617         | 0.2901 | 0.8478 |
| 234.2587      | 13.0  | 429  | 78.6872         | 0.2845 | 0.8481 |
| 209.8943      | 14.0  | 462  | 81.1480         | 0.2873 | 0.8478 |
| 226.1856      | 15.0  | 495  | 86.4198         | 0.2790 | 0.8481 |
| 200.5639      | 16.0  | 528  | 73.4982         | 0.2735 | 0.8473 |
| 206.5734      | 17.0  | 561  | 70.4848         | 0.2735 | 0.8473 |
| 190.7183      | 18.0  | 594  | 69.2320         | 0.2818 | 0.8474 |
| 224.0161      | 19.0  | 627  | 63.0107         | 0.2680 | 0.8467 |
| 202.4105      | 20.0  | 660  | 69.8898         | 0.2790 | 0.8467 |
| 189.4804      | 21.0  | 693  | 64.9169         | 0.2790 | 0.8469 |
| 194.0889      | 22.0  | 726  | 73.2138         | 0.2735 | 0.8464 |
| 177.1985      | 23.0  | 759  | 69.4830         | 0.2735 | 0.8464 |
| 179.883       | 24.0  | 792  | 67.9310         | 0.2624 | 0.8462 |
| 184.1469      | 25.0  | 825  | 65.2100         | 0.2735 | 0.8464 |
| 180.9639      | 26.0  | 858  | 65.6959         | 0.2707 | 0.8465 |
| 161.8609      | 27.0  | 891  | 64.2684         | 0.2597 | 0.8461 |
| 176.6417      | 28.0  | 924  | 64.3977         | 0.2652 | 0.8459 |
| 162.5974      | 29.0  | 957  | 61.5234         | 0.2597 | 0.8458 |
| 174.1075      | 30.0  | 990  | 64.5171         | 0.2597 | 0.8459 |
| 165.3279      | 31.0  | 1023 | 64.0766         | 0.2597 | 0.8458 |
| 174.7633      | 32.0  | 1056 | 70.9056         | 0.2680 | 0.8464 |
| 166.3098      | 33.0  | 1089 | 61.5759         | 0.2707 | 0.8460 |
| 176.9802      | 34.0  | 1122 | 59.8946         | 0.2652 | 0.8461 |
| 161.3542      | 35.0  | 1155 | 59.4560         | 0.2597 | 0.8461 |
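
The WER and CER columns can be reproduced with the Hugging Face `evaluate` library. The sketch below is illustrative only; the prediction and reference strings are placeholders, not samples from the training data.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholders: decoded model outputs and their reference transcripts.
predictions = ["deploy the orbital laser", "reinforce at my position"]
references = ["deploy the orbital laser", "reinforce at my position now"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```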

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.4.1+cu118
  • Datasets 3.5.1
  • Tokenizers 0.21.1

Model size

  • 94.4M parameters (safetensors, F32)
