helldivers2-jarvis-asrV2

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 28.5574
  • WER: 0.2202
  • CER: 0.8428
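WER (word error rate) and CER (character error rate) are normalized edit distances over words and characters, respectively. As a minimal sketch of how these metrics are computed (pure-Python Levenshtein distance; the actual evaluation most likely used a library such as `jiwer` or `evaluate`, which is an assumption):

```python
def levenshtein(a, b):
    # Classic dynamic-programming edit distance between two sequences.
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

def wer(reference, hypothesis):
    # Word error rate: word-level edit distance / number of reference words.
    ref, hyp = reference.split(), hypothesis.split()
    return levenshtein(ref, hyp) / len(ref)

def cer(reference, hypothesis):
    # Character error rate: character-level edit distance / reference length.
    return levenshtein(reference, hypothesis) / len(reference)
```

For example, `wer("activate the jarvis protocol", "activate the service protocol")` is 0.25 (one substituted word out of four). Note that a CER above a WER on the same data, as reported here, usually signals a tokenization or normalization mismatch in the character-level comparison and may be worth double-checking.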

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 12
  • eval_batch_size: 12
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 60
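With a linear scheduler, 50 warmup steps, and 60 epochs × 50 steps/epoch = 3,000 total steps, the learning rate ramps from 0 up to 1e-05 and then decays linearly back to 0. A sketch of this schedule (mirroring the formula behind Transformers' linear schedule with warmup; the 3,000-step total is inferred from the results table below):

```python
def linear_lr(step, base_lr=1e-5, warmup_steps=50, total_steps=3000):
    # Linear warmup from 0 to base_lr over warmup_steps,
    # then linear decay from base_lr down to 0 at total_steps.
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Peak at the end of warmup; zero at the final step.
# linear_lr(50) == 1e-5, linear_lr(3000) == 0.0
```

So the peak learning rate is only held for an instant at step 50; almost all of training runs on the decay ramp.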

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    | CER    |
|---------------|-------|------|-----------------|--------|--------|
| 564.7336      | 1.0   | 50   | 298.1647        | 0.4697 | 0.8612 |
| 417.1263      | 2.0   | 100  | 214.3165        | 0.3927 | 0.8549 |
| 361.7804      | 3.0   | 150  | 187.2314        | 0.3596 | 0.8524 |
| 326.2224      | 4.0   | 200  | 145.8555        | 0.3450 | 0.8509 |
| 288.2027      | 5.0   | 250  | 126.9637        | 0.3413 | 0.8501 |
| 297.8872      | 6.0   | 300  | 99.9533         | 0.3174 | 0.8486 |
| 260.7991      | 7.0   | 350  | 91.3213         | 0.3046 | 0.8480 |
| 248.5329      | 8.0   | 400  | 89.1852         | 0.2936 | 0.8480 |
| 228.2494      | 9.0   | 450  | 71.1274         | 0.2881 | 0.8471 |
| 235.5672      | 10.0  | 500  | 74.1389         | 0.2661 | 0.8465 |
| 231.3615      | 11.0  | 550  | 64.1308         | 0.2697 | 0.8461 |
| 214.3394      | 12.0  | 600  | 63.4379         | 0.2587 | 0.8456 |
| 216.4062      | 13.0  | 650  | 65.4323         | 0.2385 | 0.8453 |
| 207.2749      | 14.0  | 700  | 51.0200         | 0.2385 | 0.8446 |
| 194.7204      | 15.0  | 750  | 53.9227         | 0.2495 | 0.8449 |
| 191.7318      | 16.0  | 800  | 46.7860         | 0.2404 | 0.8446 |
| 184.4219      | 17.0  | 850  | 47.1186         | 0.2459 | 0.8443 |
| 174.2516      | 18.0  | 900  | 50.4025         | 0.2385 | 0.8443 |
| 181.3694      | 19.0  | 950  | 51.3427         | 0.2367 | 0.8440 |
| 171.3787      | 20.0  | 1000 | 57.2478         | 0.2349 | 0.8444 |
| 169.6002      | 21.0  | 1050 | 45.4265         | 0.2367 | 0.8440 |
| 163.5564      | 22.0  | 1100 | 57.1685         | 0.2294 | 0.8441 |
| 179.7925      | 23.0  | 1150 | 42.7982         | 0.2220 | 0.8437 |
| 160.0045      | 24.0  | 1200 | 43.9563         | 0.2275 | 0.8436 |
| 162.1235      | 25.0  | 1250 | 41.3504         | 0.2349 | 0.8437 |
| 171.0586      | 26.0  | 1300 | 35.8618         | 0.2294 | 0.8433 |
| 163.222       | 27.0  | 1350 | 48.1241         | 0.2275 | 0.8436 |
| 144.6168      | 28.0  | 1400 | 35.0105         | 0.2239 | 0.8435 |
| 154.0386      | 29.0  | 1450 | 40.7426         | 0.2312 | 0.8437 |
| 149.5638      | 30.0  | 1500 | 37.7159         | 0.2440 | 0.8438 |
| 152.7088      | 31.0  | 1550 | 44.9629         | 0.2202 | 0.8429 |
| 141.1782      | 32.0  | 1600 | 43.5452         | 0.2202 | 0.8431 |
| 148.6998      | 33.0  | 1650 | 46.8319         | 0.2257 | 0.8433 |
| 156.1795      | 34.0  | 1700 | 40.1366         | 0.2239 | 0.8432 |
| 134.192       | 35.0  | 1750 | 48.7881         | 0.2275 | 0.8433 |
| 136.9826      | 36.0  | 1800 | 50.8378         | 0.2202 | 0.8431 |
| 132.9241      | 37.0  | 1850 | 28.7557         | 0.2183 | 0.8425 |
| 141.7361      | 38.0  | 1900 | 33.2380         | 0.2220 | 0.8429 |
| 133.5196      | 39.0  | 1950 | 42.5577         | 0.2239 | 0.8429 |
| 131.6621      | 40.0  | 2000 | 33.2488         | 0.2275 | 0.8429 |
| 132.694       | 41.0  | 2050 | 32.1173         | 0.2239 | 0.8428 |
| 136.4332      | 42.0  | 2100 | 31.2864         | 0.2183 | 0.8426 |
| 138.5151      | 43.0  | 2150 | 43.6833         | 0.2220 | 0.8427 |
| 133.53        | 44.0  | 2200 | 27.9468         | 0.2183 | 0.8424 |
| 119.6547      | 45.0  | 2250 | 43.3999         | 0.2147 | 0.8426 |
| 134.2982      | 46.0  | 2300 | 28.5882         | 0.2202 | 0.8428 |
| 129.6781      | 47.0  | 2350 | 40.8014         | 0.2165 | 0.8426 |
| 133.2878      | 48.0  | 2400 | 46.5926         | 0.2183 | 0.8425 |
| 120.2284      | 49.0  | 2450 | 30.5833         | 0.2183 | 0.8426 |
| 131.5662      | 50.0  | 2500 | 40.5421         | 0.2202 | 0.8430 |
| 128.9309      | 51.0  | 2550 | 33.1733         | 0.2202 | 0.8426 |
| 125.6526      | 52.0  | 2600 | 33.8879         | 0.2220 | 0.8429 |
| 134.5112      | 53.0  | 2650 | 31.5242         | 0.2183 | 0.8425 |
| 128.9252      | 54.0  | 2700 | 36.8484         | 0.2239 | 0.8430 |
| 120.8643      | 55.0  | 2750 | 35.2391         | 0.2183 | 0.8426 |
| 124.6056      | 56.0  | 2800 | 41.8901         | 0.2183 | 0.8424 |
| 128.6048      | 57.0  | 2850 | 34.9353         | 0.2257 | 0.8427 |
| 137.4         | 58.0  | 2900 | 36.6512         | 0.2183 | 0.8427 |
| 112.7822      | 59.0  | 2950 | 37.5492         | 0.2220 | 0.8426 |
| 132.3333      | 60.0  | 3000 | 28.5574         | 0.2202 | 0.8428 |

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.4.1+cu118
  • Datasets 3.5.1
  • Tokenizers 0.21.1
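To reproduce this environment, the pinned versions above can be installed as follows (a sketch; the `+cu118` PyTorch build comes from the CUDA 11.8 wheel index rather than PyPI, and package names are assumed to be the standard ones):

```shell
pip install transformers==4.51.3 datasets==3.5.1 tokenizers==0.21.1
pip install torch==2.4.1+cu118 --index-url https://download.pytorch.org/whl/cu118
```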
Model size

  • 94.4M parameters (F32, Safetensors)