# arabic-hs-degree-prediction
This model is a fine-tuned version of [aubmindlab/bert-base-arabert](https://huggingface.co/aubmindlab/bert-base-arabert) on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the list):

- Loss: 3.1190
- MSE: 3.1190
- MAE: 1.1829
- R2: 0.5254
- Accuracy: 0.4174
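A minimal usage sketch, not from the original card: the MSE/MAE/R2 metrics above suggest a single-output regression head (`num_labels=1`) whose continuous output is the predicted degree, and the input text below is a placeholder.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumption: the checkpoint uses a single-output regression head (num_labels=1),
# consistent with the MSE/MAE/R2 metrics reported above.
model_id = "HrantDinkFoundation/arabic-hs-degree-prediction"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("...", return_tensors="pt", truncation=True)  # placeholder Arabic text
with torch.no_grad():
    degree = model(**inputs).logits.squeeze().item()  # continuous predicted degree
print(degree)
```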
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
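A sketch of how these values map onto `transformers.TrainingArguments`; the `output_dir` is a placeholder, and the model/dataset setup passed to a `Trainer` is not shown:

```python
from transformers import TrainingArguments

# Sketch only: output_dir is assumed; all other values come from the list above.
training_args = TrainingArguments(
    output_dir="arabic-hs-degree-prediction",
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=20,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```

These arguments would then be passed to a `Trainer` together with the model, datasets, and a metrics function.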
### Training results
Training Loss | Epoch | Step | Validation Loss | MSE | MAE | R2 | Accuracy |
---|---|---|---|---|---|---|---|
5.7507 | 0.1017 | 100 | 6.2822 | 6.2822 | 1.9385 | 0.0441 | 0.0830 |
5.2964 | 0.2035 | 200 | 5.5734 | 5.5734 | 1.8619 | 0.1520 | 0.1363 |
5.5627 | 0.3052 | 300 | 5.2112 | 5.2112 | 1.7034 | 0.2071 | 0.2289 |
4.8823 | 0.4069 | 400 | 4.8415 | 4.8415 | 1.6348 | 0.2633 | 0.2643 |
4.431 | 0.5086 | 500 | 4.4674 | 4.4674 | 1.6101 | 0.3203 | 0.2662 |
4.4338 | 0.6104 | 600 | 4.2332 | 4.2332 | 1.5406 | 0.3559 | 0.2611 |
4.359 | 0.7121 | 700 | 4.1760 | 4.1760 | 1.4834 | 0.3646 | 0.2875 |
3.8457 | 0.8138 | 800 | 4.4217 | 4.4217 | 1.4482 | 0.3272 | 0.3531 |
3.945 | 0.9156 | 900 | 3.8921 | 3.8921 | 1.4309 | 0.4078 | 0.3113 |
4.0329 | 1.0173 | 1000 | 3.7213 | 3.7213 | 1.4357 | 0.4338 | 0.2720 |
3.4687 | 1.1190 | 1100 | 3.7389 | 3.7389 | 1.4563 | 0.4311 | 0.2785 |
3.5662 | 1.2208 | 1200 | 3.6180 | 3.6180 | 1.4173 | 0.4495 | 0.2971 |
3.4077 | 1.3225 | 1300 | 3.6398 | 3.6398 | 1.3373 | 0.4462 | 0.3633 |
3.2049 | 1.4242 | 1400 | 3.6723 | 3.6723 | 1.4147 | 0.4412 | 0.2926 |
3.31 | 1.5259 | 1500 | 3.4428 | 3.4428 | 1.3293 | 0.4762 | 0.3267 |
3.305 | 1.6277 | 1600 | 3.4988 | 3.4988 | 1.3174 | 0.4676 | 0.3473 |
3.3609 | 1.7294 | 1700 | 3.4473 | 3.4473 | 1.2963 | 0.4755 | 0.3756 |
3.2378 | 1.8311 | 1800 | 3.4053 | 3.4053 | 1.2767 | 0.4819 | 0.3698 |
2.9712 | 1.9329 | 1900 | 3.3944 | 3.3944 | 1.3107 | 0.4835 | 0.3344 |
2.9234 | 2.0346 | 2000 | 3.3752 | 3.3752 | 1.3278 | 0.4864 | 0.3241 |
2.8122 | 2.1363 | 2100 | 3.3209 | 3.3209 | 1.2657 | 0.4947 | 0.3794 |
2.7341 | 2.2380 | 2200 | 3.2862 | 3.2862 | 1.2335 | 0.5000 | 0.3871 |
2.7065 | 2.3398 | 2300 | 3.5387 | 3.5387 | 1.3335 | 0.4616 | 0.3486 |
2.8505 | 2.4415 | 2400 | 3.2023 | 3.2023 | 1.2427 | 0.5128 | 0.3756 |
2.956 | 2.5432 | 2500 | 3.1469 | 3.1469 | 1.2202 | 0.5212 | 0.3961 |
2.6824 | 2.6450 | 2600 | 3.1683 | 3.1683 | 1.2606 | 0.5179 | 0.3518 |
2.6478 | 2.7467 | 2700 | 3.0612 | 3.0612 | 1.1932 | 0.5342 | 0.3994 |
2.6817 | 2.8484 | 2800 | 3.1506 | 3.1506 | 1.2346 | 0.5206 | 0.3666 |
2.6265 | 2.9502 | 2900 | 3.5059 | 3.5059 | 1.3420 | 0.4666 | 0.3273 |
2.5614 | 3.0519 | 3000 | 3.0260 | 3.0260 | 1.1854 | 0.5396 | 0.3968 |
2.3188 | 3.1536 | 3100 | 3.2470 | 3.2470 | 1.2873 | 0.5059 | 0.3408 |
2.4269 | 3.2553 | 3200 | 3.1014 | 3.1014 | 1.1887 | 0.5281 | 0.4084 |
2.2214 | 3.3571 | 3300 | 3.1259 | 3.1259 | 1.1820 | 0.5244 | 0.4270 |
2.3768 | 3.4588 | 3400 | 3.0962 | 3.0962 | 1.2112 | 0.5289 | 0.3820 |
2.2363 | 3.5605 | 3500 | 3.0551 | 3.0551 | 1.1791 | 0.5352 | 0.4045 |
2.4002 | 3.6623 | 3600 | 3.0551 | 3.0551 | 1.2458 | 0.5351 | 0.3402 |
2.3295 | 3.7640 | 3700 | 3.0461 | 3.0461 | 1.1723 | 0.5365 | 0.4232 |
2.3019 | 3.8657 | 3800 | 3.0123 | 3.0123 | 1.1734 | 0.5417 | 0.4141 |
2.3294 | 3.9674 | 3900 | 3.0258 | 3.0258 | 1.1553 | 0.5396 | 0.4322 |
2.2853 | 4.0692 | 4000 | 3.0522 | 3.0522 | 1.1543 | 0.5356 | 0.4334 |
2.0828 | 4.1709 | 4100 | 3.0527 | 3.0527 | 1.1690 | 0.5355 | 0.4219 |
1.8908 | 4.2726 | 4200 | 3.1150 | 3.1150 | 1.1921 | 0.5260 | 0.4096 |
2.0039 | 4.3744 | 4300 | 3.0075 | 3.0075 | 1.1799 | 0.5424 | 0.4045 |
2.0977 | 4.4761 | 4400 | 3.0723 | 3.0723 | 1.1770 | 0.5325 | 0.4148 |
1.9904 | 4.5778 | 4500 | 3.1245 | 3.1245 | 1.1612 | 0.5246 | 0.4412 |
2.1779 | 4.6796 | 4600 | 3.1405 | 3.1405 | 1.1778 | 0.5222 | 0.4354 |
1.9731 | 4.7813 | 4700 | 2.9948 | 2.9948 | 1.1675 | 0.5443 | 0.4206 |
1.996 | 4.8830 | 4800 | 2.9782 | 2.9782 | 1.1761 | 0.5468 | 0.3910 |
2.1196 | 4.9847 | 4900 | 3.0197 | 3.0197 | 1.1890 | 0.5405 | 0.3961 |
1.5696 | 5.0865 | 5000 | 3.0354 | 3.0354 | 1.1818 | 0.5381 | 0.4122 |
1.7041 | 5.1882 | 5100 | 3.0963 | 3.0963 | 1.1914 | 0.5289 | 0.4064 |
1.7822 | 5.2899 | 5200 | 3.1153 | 3.1153 | 1.1687 | 0.5260 | 0.4277 |
1.8154 | 5.3917 | 5300 | 3.0880 | 3.0880 | 1.1698 | 0.5301 | 0.4289 |
1.8872 | 5.4934 | 5400 | 3.1228 | 3.1228 | 1.2046 | 0.5248 | 0.3910 |
1.9144 | 5.5951 | 5500 | 3.2307 | 3.2307 | 1.2436 | 0.5084 | 0.3749 |
1.7356 | 5.6968 | 5600 | 3.0748 | 3.0748 | 1.1635 | 0.5321 | 0.4264 |
1.8056 | 5.7986 | 5700 | 3.0312 | 3.0312 | 1.1819 | 0.5388 | 0.3981 |
1.7858 | 5.9003 | 5800 | 3.2011 | 3.2011 | 1.2046 | 0.5129 | 0.4058 |
1.7623 | 6.0020 | 5900 | 3.0523 | 3.0523 | 1.1582 | 0.5356 | 0.4180 |
1.5906 | 6.1038 | 6000 | 3.1295 | 3.1295 | 1.1898 | 0.5238 | 0.4103 |
1.5975 | 6.2055 | 6100 | 3.0125 | 3.0125 | 1.1728 | 0.5416 | 0.4084 |
1.6435 | 6.3072 | 6200 | 3.0240 | 3.0240 | 1.1833 | 0.5399 | 0.3994 |
1.7167 | 6.4090 | 6300 | 3.1221 | 3.1221 | 1.1991 | 0.5250 | 0.3923 |
1.592 | 6.5107 | 6400 | 3.0504 | 3.0504 | 1.1515 | 0.5359 | 0.4315 |
1.4698 | 6.6124 | 6500 | 3.2183 | 3.2183 | 1.2272 | 0.5103 | 0.3871 |
1.5992 | 6.7141 | 6600 | 3.0718 | 3.0718 | 1.1815 | 0.5326 | 0.4071 |
1.3935 | 6.8159 | 6700 | 3.0772 | 3.0772 | 1.1626 | 0.5318 | 0.4232 |
1.4834 | 6.9176 | 6800 | 3.1010 | 3.1010 | 1.2021 | 0.5282 | 0.3929 |
1.6854 | 7.0193 | 6900 | 3.0626 | 3.0626 | 1.1693 | 0.5340 | 0.4174 |
1.3724 | 7.1211 | 7000 | 3.1190 | 3.1190 | 1.1912 | 0.5254 | 0.4026 |
1.584 | 7.2228 | 7100 | 3.0804 | 3.0804 | 1.1644 | 0.5313 | 0.4225 |
1.3925 | 7.3245 | 7200 | 3.1830 | 3.1830 | 1.2079 | 0.5157 | 0.4039 |
1.4212 | 7.4262 | 7300 | 3.2901 | 3.2901 | 1.2196 | 0.4994 | 0.4096 |
1.3901 | 7.5280 | 7400 | 3.1260 | 3.1260 | 1.1736 | 0.5244 | 0.4238 |
1.2885 | 7.6297 | 7500 | 3.0702 | 3.0702 | 1.1753 | 0.5329 | 0.4039 |
1.4006 | 7.7314 | 7600 | 3.0681 | 3.0681 | 1.1714 | 0.5332 | 0.4135 |
1.4378 | 7.8332 | 7700 | 3.1086 | 3.1086 | 1.1884 | 0.5270 | 0.4000 |
1.5068 | 7.9349 | 7800 | 3.0886 | 3.0886 | 1.1799 | 0.5301 | 0.4103 |
1.4551 | 8.0366 | 7900 | 3.0851 | 3.0851 | 1.1770 | 0.5306 | 0.4135 |
1.3684 | 8.1384 | 8000 | 3.1484 | 3.1484 | 1.1978 | 0.5210 | 0.4045 |
1.4743 | 8.2401 | 8100 | 3.0832 | 3.0832 | 1.1789 | 0.5309 | 0.4090 |
1.3129 | 8.3418 | 8200 | 3.0417 | 3.0417 | 1.1636 | 0.5372 | 0.4244 |
1.292 | 8.4435 | 8300 | 3.0918 | 3.0918 | 1.1801 | 0.5296 | 0.4109 |
1.3969 | 8.5453 | 8400 | 3.0818 | 3.0818 | 1.1760 | 0.5311 | 0.4174 |
1.3746 | 8.6470 | 8500 | 3.0735 | 3.0735 | 1.1767 | 0.5323 | 0.4193 |
1.3887 | 8.7487 | 8600 | 3.1460 | 3.1460 | 1.1890 | 0.5213 | 0.4148 |
1.1526 | 8.8505 | 8700 | 3.1036 | 3.1036 | 1.1656 | 0.5278 | 0.4322 |
1.2347 | 8.9522 | 8800 | 3.1268 | 3.1268 | 1.1807 | 0.5242 | 0.4225 |
1.3415 | 9.0539 | 8900 | 3.1485 | 3.1485 | 1.1787 | 0.5209 | 0.4244 |
1.3208 | 9.1556 | 9000 | 3.0984 | 3.0984 | 1.1719 | 0.5286 | 0.4270 |
1.2138 | 9.2574 | 9100 | 3.1122 | 3.1122 | 1.1724 | 0.5265 | 0.4270 |
1.2221 | 9.3591 | 9200 | 3.1270 | 3.1270 | 1.1738 | 0.5242 | 0.4289 |
1.2833 | 9.4608 | 9300 | 3.1411 | 3.1411 | 1.1802 | 0.5221 | 0.4232 |
1.3077 | 9.5626 | 9400 | 3.1322 | 3.1322 | 1.1849 | 0.5234 | 0.4174 |
1.2864 | 9.6643 | 9500 | 3.1166 | 3.1166 | 1.1822 | 0.5258 | 0.4193 |
1.2775 | 9.7660 | 9600 | 3.1313 | 3.1313 | 1.1838 | 0.5236 | 0.4206 |
1.3308 | 9.8678 | 9700 | 3.1078 | 3.1078 | 1.1783 | 0.5271 | 0.4199 |
1.2625 | 9.9695 | 9800 | 3.1190 | 3.1190 | 1.1829 | 0.5254 | 0.4174 |
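Note that Validation Loss equals MSE throughout, consistent with an MSE training objective. Below is a hedged sketch of how the four metric columns could be computed, assuming scikit-learn is available and that Accuracy means exact match after rounding predictions to integer degrees (an assumption; the card does not state how Accuracy is derived):

```python
import numpy as np
from sklearn.metrics import (accuracy_score, mean_absolute_error,
                             mean_squared_error, r2_score)

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    """Reproduce the table's columns from continuous predictions and gold degrees."""
    return {
        "mse": mean_squared_error(labels, preds),
        "mae": mean_absolute_error(labels, preds),
        "r2": r2_score(labels, preds),
        # Assumption: accuracy = exact match after rounding to the nearest integer.
        "accuracy": accuracy_score(np.rint(labels).astype(int),
                                   np.rint(preds).astype(int)),
    }

# Example with dummy values (not real model outputs):
print(compute_metrics(np.array([1.2, 3.8, 2.1]), np.array([1.0, 4.0, 3.0])))
```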
### Framework versions
- Transformers 4.45.2
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.20.3