fs-w-he-tiny-en

This model is a fine-tuned version of openai/whisper-tiny.en on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.7192
  • Wer: 173.9316
  • Cer: 146.1783
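
Since no usage details are provided in this card, the snippet below is a minimal sketch of how the checkpoint could be loaded for transcription with the transformers pipeline API. It assumes the model is published on the Hugging Face Hub as hiwden00/fs-w-he-tiny-en; the audio file name is a placeholder.

```python
# Minimal usage sketch (assumption: the checkpoint is hosted on the Hub
# as "hiwden00/fs-w-he-tiny-en"; "sample.wav" is a placeholder audio file).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="hiwden00/fs-w-he-tiny-en",  # fine-tuned from openai/whisper-tiny.en
)

result = asr("sample.wav")  # accepts a file path, URL, or raw waveform array
print(result["text"])
```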

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 5000
  • mixed_precision_training: Native AMP
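
For reference, the hyperparameters above roughly correspond to the following transformers Seq2SeqTrainingArguments. This is a reconstruction from the listed values, not the author's training script; output_dir is a placeholder and fp16=True is assumed to stand in for "Native AMP".

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of training arguments reconstructed from the hyperparameters above.
# output_dir is a placeholder; fp16=True corresponds to Native AMP mixed precision.
training_args = Seq2SeqTrainingArguments(
    output_dir="./fs-w-he-tiny-en",   # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,
)
```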

Training results

Training Loss   Epoch     Step   Validation Loss   Wer        Cer
10.4685         4.5872    500    10.1251           536.1823   471.2470
3.0813          9.1743    1000   3.6378            239.3162   199.3988
1.4313          13.7615   1500   2.4350            250.0      207.9268
0.6499          18.3486   2000   1.9427            376.7331   318.6104
0.336           22.9358   2500   1.9422            245.7265   195.3538
0.0683          27.5229   3000   2.2195            265.3846   224.6908
0.0104          32.1101   3500   2.3854            177.7778   148.1621
0.0013          36.6972   4000   2.6033            175.7360   158.7513
0.0003          41.2844   4500   2.6985            155.6030   127.8169
0.0002          45.8716   5000   2.7192            173.9316   146.1783
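
The Wer and Cer columns appear to be percentages, and values above 100 are possible because insertions count as errors, so a hypothesis much longer than the reference can push the rate past 100%. A sketch of how such metrics are typically computed with the evaluate library is shown below; the example strings are placeholders, not data from this model.

```python
import evaluate

# Word and character error rates, scaled to percentages as in the table above.
# The prediction/reference strings are placeholders for illustration only.
wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["the model transcribed this"]
references = ["the reference transcription"]

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}, CER: {cer:.4f}")
```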

Framework versions

  • Transformers 4.45.1
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.20.0