## PyLaia Pretraining Metrics

### Training Summary

- Total epochs: 52
- Best validation CER: 0.001036 (epoch 47)
- Best validation WER: 0.004066 (epoch 47)
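For reference, the CER and WER reported here are edit-distance rates: the Levenshtein distance between the predicted and reference transcription, normalized by the reference length, computed at character and word level respectively. Below is a minimal sketch in plain Python; it illustrates the metric definitions, not the PyLaia implementation, and the example strings are made up.

```python
def levenshtein(ref, hyp):
    """Edit distance between two sequences (insertions, deletions, substitutions)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            curr.append(min(
                prev[j] + 1,              # deletion (reference symbol missing)
                curr[j - 1] + 1,          # insertion (extra hypothesis symbol)
                prev[j - 1] + (r != h),   # substitution (or match if equal)
            ))
        prev = curr
    return prev[-1]

def cer(ref, hyp):
    """Character error rate: character-level edit distance / reference length."""
    return levenshtein(list(ref), list(hyp)) / max(len(ref), 1)

def wer(ref, hyp):
    """Word error rate: word-level edit distance / number of reference words."""
    ref_words, hyp_words = ref.split(), hyp.split()
    return levenshtein(ref_words, hyp_words) / max(len(ref_words), 1)

# Toy example (not taken from the validation set)
print(cer("handwritten text", "handwriten text"))  # one deleted character -> 1/16
print(wer("handwritten text", "handwriten text"))  # one wrong word -> 1/2
```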
### Full Metrics Table

Epoch | Train CER | Train loss | Train WER | Val. CER | Val. loss | Val. WER |
---|---|---|---|---|---|---|
0 | 0.102511 | 20.5502 | 0.228909 | 0.005607 | 1.44675 | 0.025887 |
1 | 0.013082 | 2.87236 | 0.061452 | 0.003544 | 0.977254 | 0.015911 |
2 | 0.010093 | 2.25076 | 0.048003 | 0.002787 | 0.806774 | 0.012519 |
3 | 0.008387 | 1.89794 | 0.039916 | 0.002174 | 0.694116 | 0.009469 |
4 | 0.007398 | 1.69734 | 0.03535 | 0.002058 | 0.658268 | 0.009019 |
5 | 0.006802 | 1.57578 | 0.032362 | 0.001917 | 0.614705 | 0.008205 |
6 | 0.006398 | 1.48598 | 0.03044 | 0.001798 | 0.596449 | 0.007565 |
7 | 0.005984 | 1.40728 | 0.02844 | 0.003597 | 1.37793 | 0.01733 |
8 | 0.00573 | 1.35598 | 0.027263 | 0.001649 | 0.571631 | 0.006976 |
9 | 0.005494 | 1.30841 | 0.026087 | 0.001704 | 0.609703 | 0.007205 |
10 | 0.005316 | 1.27343 | 0.025282 | 0.001609 | 0.585576 | 0.006896 |
11 | 0.005129 | 1.23307 | 0.024386 | 0.001624 | 0.566718 | 0.006805 |
12 | 0.005015 | 1.201 | 0.023676 | 0.001487 | 0.561116 | 0.006289 |
13 | 0.004933 | 1.1912 | 0.02329 | 0.001504 | 0.547125 | 0.006369 |
14 | 0.00474 | 1.15883 | 0.022434 | 0.001563 | 0.55139 | 0.006674 |
15 | 0.004657 | 1.13046 | 0.022109 | 0.00145 | 0.527566 | 0.006078 |
16 | 0.004509 | 1.11441 | 0.021305 | 0.001435 | 0.527379 | 0.006123 |
17 | 0.004429 | 1.0977 | 0.020912 | 0.001401 | 0.509225 | 0.00589 |
18 | 0.004358 | 1.07066 | 0.020649 | 0.001421 | 0.496447 | 0.006016 |
19 | 0.004249 | 1.05592 | 0.019901 | 0.001297 | 0.516378 | 0.005331 |
20 | 0.004206 | 1.04969 | 0.019817 | 0.00136 | 0.487321 | 0.005667 |
21 | 0.004093 | 1.03316 | 0.019192 | 0.001548 | 0.526055 | 0.006689 |
22 | 0.004057 | 1.02762 | 0.019002 | 0.001376 | 0.502673 | 0.005828 |
23 | 0.004001 | 1.01451 | 0.018822 | 0.001361 | 0.512399 | 0.005709 |
24 | 0.00395 | 1.00391 | 0.0185 | 0.001298 | 0.496007 | 0.005319 |
25 | 0.003864 | 0.981246 | 0.018071 | 0.001279 | 0.487199 | 0.005308 |
26 | 0.003849 | 0.984405 | 0.018042 | 0.001238 | 0.489225 | 0.005063 |
27 | 0.003797 | 0.971805 | 0.017749 | 0.00119 | 0.487824 | 0.004838 |
28 | 0.003787 | 0.966469 | 0.017727 | 0.001231 | 0.468324 | 0.005042 |
29 | 0.003697 | 0.955839 | 0.017186 | 0.0012 | 0.45356 | 0.00479 |
30 | 0.00369 | 0.961032 | 0.017177 | 0.00114 | 0.453792 | 0.004524 |
31 | 0.003773 | 0.97912 | 0.017593 | 0.001265 | 0.526798 | 0.004998 |
32 | 0.003655 | 0.951006 | 0.017039 | 0.00125 | 0.53167 | 0.00518 |
33 | 0.003613 | 0.936915 | 0.016874 | 0.001192 | 0.481596 | 0.004827 |
34 | 0.00356 | 0.920974 | 0.01664 | 0.00119 | 0.502282 | 0.004755 |
35 | 0.002972 | 0.772762 | 0.013666 | 0.001186 | 0.466747 | 0.004863 |
36 | 0.002802 | 0.726356 | 0.012762 | 0.001127 | 0.458169 | 0.004509 |
37 | 0.002763 | 0.711398 | 0.012652 | 0.00109 | 0.446273 | 0.004346 |
38 | 0.002683 | 0.693865 | 0.012302 | 0.00106 | 0.457616 | 0.004171 |
39 | 0.002673 | 0.693615 | 0.012196 | 0.001068 | 0.453765 | 0.00427 |
40 | 0.002621 | 0.677396 | 0.011972 | 0.001098 | 0.464599 | 0.004382 |
41 | 0.002617 | 0.672459 | 0.011962 | 0.001062 | 0.462883 | 0.004217 |
42 | 0.00261 | 0.674766 | 0.011838 | 0.001058 | 0.460468 | 0.004213 |
43 | 0.00258 | 0.669272 | 0.011772 | 0.001047 | 0.453072 | 0.004141 |
44 | 0.002565 | 0.660597 | 0.011639 | 0.001038 | 0.452772 | 0.004078 |
45 | 0.002528 | 0.654716 | 0.011505 | 0.001044 | 0.452606 | 0.00413 |
46 | 0.002501 | 0.64822 | 0.01137 | 0.00106 | 0.45137 | 0.004214 |
47 | 0.002542 | 0.653288 | 0.011591 | 0.001036 | 0.449435 | 0.004066 |
48 | 0.00252 | 0.650962 | 0.011476 | 0.001038 | 0.446807 | 0.00407 |
49 | 0.002543 | 0.662387 | 0.011516 | 0.00105 | 0.446939 | 0.004147 |
50 | 0.002528 | 0.654322 | 0.01149 | 0.001044 | 0.445585 | 0.004111 |
51 | 0.002518 | 0.653832 | 0.011452 | 0.001051 | 0.447358 | 0.004158 |
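The best-epoch figures quoted in the summary are simply the table rows with the lowest validation CER. The sketch below re-derives them from the table text; only the first and best rows are pasted for brevity, and the parsing logic is illustrative rather than part of PyLaia.

```python
# Recover the "best epoch" figures from the pretraining table above.
table = """\
0 | 0.102511 | 20.5502 | 0.228909 | 0.005607 | 1.44675 | 0.025887 |
47 | 0.002542 | 0.653288 | 0.011591 | 0.001036 | 0.449435 | 0.004066 |
"""

best = None
for line in table.strip().splitlines():
    cells = [c.strip() for c in line.split("|")]
    # Columns: epoch, train CER, train loss, train WER, val CER, val loss, val WER
    epoch, val_cer, val_wer = cells[0], cells[4], cells[6]
    if best is None or float(val_cer) < float(best[1]):
        best = (epoch, val_cer, val_wer)

print(f"best epoch {best[0]}: val CER {best[1]}, val WER {best[2]}")
# -> best epoch 47: val CER 0.001036, val WER 0.004066
```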
## PyLaia Finetuning Metrics

### Training Summary

- Total epochs: 27
- Best validation CER: 0.000869 (epoch 21)
- Best validation WER: 0.003693 (epoch 21)
### Full Metrics Table

Epoch | Train CER | Train loss | Train WER | Val. CER | Val. loss | Val. WER |
---|---|---|---|---|---|---|
0 | 0.183214 | 35.8187 | 0.224537 | 0.001531 | 0.370508 | 0.006411 |
1 | 0.003867 | 0.883387 | 0.018513 | 0.001255 | 0.298736 | 0.005188 |
2 | 0.003428 | 0.769932 | 0.016496 | 0.001143 | 0.245342 | 0.00476 |
3 | 0.003254 | 0.734276 | 0.015593 | 0.001168 | 0.262834 | 0.004929 |
4 | 0.003137 | 0.688775 | 0.01504 | 0.001151 | 0.244334 | 0.004897 |
5 | 0.00301 | 0.667008 | 0.014394 | 0.001086 | 0.230109 | 0.004562 |
6 | 0.002953 | 0.656692 | 0.01415 | 0.001123 | 0.238079 | 0.004684 |
7 | 0.002937 | 0.650136 | 0.014223 | 0.001034 | 0.234374 | 0.004454 |
8 | 0.002913 | 0.644122 | 0.014053 | 0.001017 | 0.233998 | 0.004383 |
9 | 0.002822 | 0.626512 | 0.013617 | 0.001038 | 0.23499 | 0.004427 |
10 | 0.00291 | 0.644877 | 0.014099 | 0.001179 | 0.265034 | 0.005115 |
11 | 0.002319 | 0.512036 | 0.01107 | 0.000923 | 0.203359 | 0.003937 |
12 | 0.002136 | 0.469168 | 0.010157 | 0.0009 | 0.201167 | 0.003825 |
13 | 0.002081 | 0.455159 | 0.00996 | 0.0009 | 0.198162 | 0.003825 |
14 | 0.002001 | 0.433845 | 0.009498 | 0.000908 | 0.197196 | 0.003886 |
15 | 0.001985 | 0.425972 | 0.009495 | 0.000884 | 0.195152 | 0.003785 |
16 | 0.001951 | 0.427156 | 0.009355 | 0.000874 | 0.196313 | 0.003695 |
17 | 0.001979 | 0.431152 | 0.009395 | 0.000897 | 0.194316 | 0.003843 |
18 | 0.00191 | 0.418722 | 0.009082 | 0.000881 | 0.198212 | 0.00374 |
19 | 0.001922 | 0.422092 | 0.009175 | 0.000906 | 0.195854 | 0.003853 |
20 | 0.001939 | 0.410425 | 0.009221 | 0.000888 | 0.19173 | 0.003807 |
21 | 0.001861 | 0.404768 | 0.0089 | 0.000869 | 0.194746 | 0.003693 |
22 | 0.001848 | 0.407902 | 0.008812 | 0.000908 | 0.201331 | 0.00387 |
23 | 0.001844 | 0.40212 | 0.008813 | 0.000894 | 0.193942 | 0.003816 |
24 | 0.001819 | 0.399957 | 0.008706 | 0.000902 | 0.200013 | 0.003802 |
25 | 0.001825 | 0.400195 | 0.00867 | 0.000909 | 0.198319 | 0.003824 |
26 | 0.001822 | 0.395797 | 0.008696 | 0.00088 | 0.193306 | 0.003716 |
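Relative to the best pretraining checkpoint, finetuning lowers the best validation CER from 0.001036 to 0.000869 and the best validation WER from 0.004066 to 0.003693. The quick check below uses only the values from the two summaries above and assumes the two validation sets are comparable enough for a relative comparison to be meaningful.

```python
# Best validation metrics from the two training summaries above.
pretrain = {"cer": 0.001036, "wer": 0.004066}   # epoch 47 of pretraining
finetune = {"cer": 0.000869, "wer": 0.003693}   # epoch 21 of finetuning

for metric in ("cer", "wer"):
    reduction = 1 - finetune[metric] / pretrain[metric]
    print(f"relative {metric.upper()} reduction: {reduction:.1%}")
# -> roughly 16% for CER and 9% for WER
```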