# train_sst2_1744902622
This model is a fine-tuned version of meta-llama/Meta-Llama-3-8B-Instruct on the sst2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.1366
- Num Input Tokens Seen: 35754976
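
Since the framework versions below list PEFT and the model tree points at Meta-Llama-3-8B-Instruct, this checkpoint is presumably a PEFT adapter on top of the base model. The following is a minimal inference sketch, assuming the adapter is hosted as `rbelanec/train_sst2_1744902622` and that training used a prompt asking for a positive/negative label; the prompt format here is an assumption and should be adjusted to match the actual training template.

```python
# Hedged sketch: load the base model, attach the adapter, classify one sentence.
# The adapter repo id comes from this card; the prompt wording is assumed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
adapter_id = "rbelanec/train_sst2_1744902622"  # repo id from this card

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the fine-tuned adapter
model.eval()

prompt = (
    "Classify the sentiment of the following sentence as positive or negative.\n"
    "Sentence: a stirring, funny and finally transporting re-imagining "
    "of beauty and the beast\n"
    "Sentiment:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=4)

# Decode only the newly generated tokens (the predicted label).
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```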
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 0.3
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16 (per-device batch size × gradient accumulation steps)
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
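
For reference, here is a hedged sketch of how these values map onto `transformers.TrainingArguments`. The actual training script is not included in this card; `output_dir` and the 200-step logging/eval cadence are assumptions read off the results table below.

```python
# Sketch only: reconstructs the reported hyperparameters as TrainingArguments.
# Dataset preparation, PEFT/model setup, and Trainer wiring are omitted.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="train_sst2_1744902622",  # assumed; matches the model name
    learning_rate=0.3,                   # high, but typical for prompt-style PEFT
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,       # 4 x 4 = total train batch size of 16
    seed=123,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40_000,                    # "training_steps" above
    eval_strategy="steps",               # assumed from the 200-step eval cadence
    eval_steps=200,
    logging_steps=200,
    include_num_input_tokens_seen=True,  # yields the "Input Tokens Seen" column
)
```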
### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|---|---|---|---|---|
0.4099 | 0.0528 | 200 | 0.4572 | 178464 |
0.3998 | 0.1056 | 400 | 0.3829 | 357184 |
0.4073 | 0.1584 | 600 | 0.3757 | 535488 |
0.4059 | 0.2112 | 800 | 0.3848 | 714592 |
0.3533 | 0.2640 | 1000 | 0.3506 | 893216 |
0.3603 | 0.3167 | 1200 | 0.3607 | 1072832 |
0.3559 | 0.3695 | 1400 | 0.3772 | 1250688 |
0.401 | 0.4223 | 1600 | 0.3816 | 1429824 |
0.3453 | 0.4751 | 1800 | 0.3657 | 1608736 |
0.3909 | 0.5279 | 2000 | 0.4157 | 1787552 |
0.3384 | 0.5807 | 2200 | 0.3461 | 1968064 |
0.3478 | 0.6335 | 2400 | 0.3466 | 2145056 |
0.3469 | 0.6863 | 2600 | 0.3650 | 2323552 |
0.3491 | 0.7391 | 2800 | 0.3578 | 2501632 |
0.3526 | 0.7919 | 3000 | 0.3800 | 2681600 |
0.3443 | 0.8447 | 3200 | 0.3549 | 2859456 |
0.3488 | 0.8975 | 3400 | 0.3485 | 3039712 |
0.3592 | 0.9502 | 3600 | 0.3443 | 3218400 |
0.3477 | 1.0029 | 3800 | 0.3512 | 3395632 |
0.3483 | 1.0557 | 4000 | 0.3722 | 3575248 |
0.3519 | 1.1085 | 4200 | 0.3467 | 3754960 |
0.3651 | 1.1613 | 4400 | 0.3553 | 3932752 |
0.3492 | 1.2141 | 4600 | 0.3504 | 4112272 |
0.3326 | 1.2669 | 4800 | 0.3464 | 4291792 |
0.3653 | 1.3197 | 5000 | 0.3487 | 4472784 |
0.3668 | 1.3724 | 5200 | 0.3496 | 4651696 |
0.361 | 1.4252 | 5400 | 0.3495 | 4829360 |
0.3642 | 1.4780 | 5600 | 0.3506 | 5007920 |
0.3366 | 1.5308 | 5800 | 0.3462 | 5188208 |
0.3595 | 1.5836 | 6000 | 0.3496 | 5366384 |
0.3637 | 1.6364 | 6200 | 0.3471 | 5544176 |
0.3699 | 1.6892 | 6400 | 0.3469 | 5723216 |
0.3378 | 1.7420 | 6600 | 0.3578 | 5902896 |
0.3527 | 1.7948 | 6800 | 0.3469 | 6081040 |
0.3552 | 1.8476 | 7000 | 0.3447 | 6259056 |
0.3267 | 1.9004 | 7200 | 0.3901 | 6437904 |
0.3474 | 1.9531 | 7400 | 0.3460 | 6616176 |
0.3544 | 2.0058 | 7600 | 0.3500 | 6793648 |
0.3556 | 2.0586 | 7800 | 0.3493 | 6973744 |
0.346 | 2.1114 | 8000 | 0.3509 | 7150896 |
0.3559 | 2.1642 | 8200 | 0.3513 | 7330032 |
0.3445 | 2.2170 | 8400 | 0.3444 | 7508816 |
0.3449 | 2.2698 | 8600 | 0.3477 | 7686352 |
0.3379 | 2.3226 | 8800 | 0.3490 | 7864080 |
0.3469 | 2.3753 | 9000 | 0.3447 | 8042928 |
0.3429 | 2.4281 | 9200 | 0.3435 | 8223824 |
0.3341 | 2.4809 | 9400 | 0.3496 | 8402448 |
0.3522 | 2.5337 | 9600 | 0.3446 | 8581936 |
0.3368 | 2.5865 | 9800 | 0.3442 | 8762128 |
0.3741 | 2.6393 | 10000 | 0.3652 | 8939600 |
0.347 | 2.6921 | 10200 | 0.3489 | 9117424 |
0.365 | 2.7449 | 10400 | 0.3700 | 9299120 |
0.3432 | 2.7977 | 10600 | 0.3447 | 9477968 |
0.3394 | 2.8505 | 10800 | 0.3437 | 9658192 |
0.356 | 2.9033 | 11000 | 0.3465 | 9837392 |
0.3413 | 2.9561 | 11200 | 0.3439 | 10014320 |
0.3375 | 3.0087 | 11400 | 0.3450 | 10191904 |
0.3476 | 3.0615 | 11600 | 0.3664 | 10369824 |
0.3464 | 3.1143 | 11800 | 0.3448 | 10547296 |
0.3187 | 3.1671 | 12000 | 0.3486 | 10726592 |
0.3154 | 3.2199 | 12200 | 0.3991 | 10905760 |
0.3458 | 3.2727 | 12400 | 0.3471 | 11086528 |
0.3693 | 3.3255 | 12600 | 0.3438 | 11266208 |
0.3565 | 3.3782 | 12800 | 0.3454 | 11445184 |
0.3565 | 3.4310 | 13000 | 0.3500 | 11623936 |
0.3359 | 3.4838 | 13200 | 0.3452 | 11801312 |
0.3398 | 3.5366 | 13400 | 0.3458 | 11979552 |
0.341 | 3.5894 | 13600 | 0.3488 | 12158496 |
0.3436 | 3.6422 | 13800 | 0.3435 | 12336928 |
0.3458 | 3.6950 | 14000 | 0.3431 | 12516800 |
0.3396 | 3.7478 | 14200 | 0.3433 | 12695648 |
0.3396 | 3.8006 | 14400 | 0.3440 | 12874656 |
0.3443 | 3.8534 | 14600 | 0.3433 | 13053184 |
0.3497 | 3.9062 | 14800 | 0.3474 | 13232576 |
0.343 | 3.9590 | 15000 | 0.3430 | 13410176 |
0.3576 | 4.0116 | 15200 | 0.3443 | 13588176 |
0.3502 | 4.0644 | 15400 | 0.3443 | 13766160 |
0.3278 | 4.1172 | 15600 | 0.3444 | 13945776 |
0.3348 | 4.1700 | 15800 | 0.3409 | 14123120 |
0.3246 | 4.2228 | 16000 | 0.3460 | 14300816 |
0.3533 | 4.2756 | 16200 | 0.3403 | 14479248 |
0.3292 | 4.3284 | 16400 | 0.3436 | 14660976 |
0.3421 | 4.3812 | 16600 | 0.3339 | 14839056 |
0.3406 | 4.4339 | 16800 | 0.3471 | 15016048 |
0.4585 | 4.4867 | 17000 | 0.3426 | 15196432 |
0.3429 | 4.5395 | 17200 | 0.3436 | 15374128 |
0.3087 | 4.5923 | 17400 | 0.3540 | 15553776 |
0.3419 | 4.6451 | 17600 | 0.3432 | 15733520 |
0.3412 | 4.6979 | 17800 | 0.3441 | 15911728 |
0.3391 | 4.7507 | 18000 | 0.3486 | 16091728 |
0.3376 | 4.8035 | 18200 | 0.3449 | 16268208 |
0.3453 | 4.8563 | 18400 | 0.3436 | 16446704 |
0.3522 | 4.9091 | 18600 | 0.3427 | 16627152 |
0.3556 | 4.9619 | 18800 | 0.3435 | 16806032 |
0.3482 | 5.0145 | 19000 | 0.3428 | 16986160 |
0.3503 | 5.0673 | 19200 | 0.3495 | 17164848 |
0.3429 | 5.1201 | 19400 | 0.3430 | 17342800 |
0.3361 | 5.1729 | 19600 | 0.3384 | 17520144 |
0.3844 | 5.2257 | 19800 | 0.3426 | 17697936 |
0.3232 | 5.2785 | 20000 | 0.3328 | 17876496 |
0.3292 | 5.3313 | 20200 | 0.3295 | 18054800 |
0.3124 | 5.3841 | 20400 | 0.3556 | 18232176 |
0.3378 | 5.4368 | 20600 | 0.3282 | 18411760 |
0.2488 | 5.4896 | 20800 | 0.2913 | 18590672 |
0.3317 | 5.5424 | 21000 | 0.2735 | 18770000 |
0.2356 | 5.5952 | 21200 | 0.2657 | 18947664 |
0.234 | 5.6480 | 21400 | 0.2636 | 19127344 |
0.2434 | 5.7008 | 21600 | 0.2597 | 19306864 |
0.2006 | 5.7536 | 21800 | 0.2430 | 19485200 |
0.2932 | 5.8064 | 22000 | 0.2437 | 19664112 |
0.2208 | 5.8592 | 22200 | 0.2235 | 19843216 |
0.1877 | 5.9120 | 22400 | 0.2214 | 20022672 |
0.23 | 5.9648 | 22600 | 0.2081 | 20201808 |
0.1988 | 6.0174 | 22800 | 0.2024 | 20380512 |
0.2189 | 6.0702 | 23000 | 0.2129 | 20560608 |
0.2273 | 6.1230 | 23200 | 0.1930 | 20739200 |
0.1988 | 6.1758 | 23400 | 0.1903 | 20917728 |
0.2019 | 6.2286 | 23600 | 0.2082 | 21097088 |
0.1777 | 6.2814 | 23800 | 0.1904 | 21275360 |
0.2151 | 6.3342 | 24000 | 0.2026 | 21454048 |
0.1871 | 6.3870 | 24200 | 0.1782 | 21631232 |
0.1854 | 6.4398 | 24400 | 0.1979 | 21809632 |
0.1491 | 6.4925 | 24600 | 0.1834 | 21988192 |
0.1252 | 6.5453 | 24800 | 0.1765 | 22168864 |
0.1922 | 6.5981 | 25000 | 0.1776 | 22347392 |
0.1705 | 6.6509 | 25200 | 0.1737 | 22526048 |
0.1719 | 6.7037 | 25400 | 0.1682 | 22704800 |
0.2146 | 6.7565 | 25600 | 0.1686 | 22883200 |
0.1568 | 6.8093 | 25800 | 0.1644 | 23063104 |
0.1452 | 6.8621 | 26000 | 0.1680 | 23242080 |
0.1224 | 6.9149 | 26200 | 0.1612 | 23421312 |
0.1643 | 6.9677 | 26400 | 0.1678 | 23599008 |
0.1497 | 7.0203 | 26600 | 0.1580 | 23777520 |
0.1685 | 7.0731 | 26800 | 0.1582 | 23954320 |
0.1654 | 7.1259 | 27000 | 0.1578 | 24134608 |
0.2289 | 7.1787 | 27200 | 0.1636 | 24312464 |
0.1654 | 7.2315 | 27400 | 0.1562 | 24491696 |
0.1089 | 7.2843 | 27600 | 0.1565 | 24670160 |
0.1367 | 7.3371 | 27800 | 0.1535 | 24848976 |
0.0995 | 7.3899 | 28000 | 0.1536 | 25027536 |
0.1447 | 7.4427 | 28200 | 0.1559 | 25205648 |
0.1411 | 7.4954 | 28400 | 0.1566 | 25384496 |
0.1797 | 7.5482 | 28600 | 0.1544 | 25563856 |
0.1395 | 7.6010 | 28800 | 0.1515 | 25743536 |
0.1068 | 7.6538 | 29000 | 0.1516 | 25921616 |
0.1509 | 7.7066 | 29200 | 0.1516 | 26103376 |
0.1469 | 7.7594 | 29400 | 0.1514 | 26283664 |
0.1529 | 7.8122 | 29600 | 0.1458 | 26463440 |
0.1481 | 7.8650 | 29800 | 0.1462 | 26642352 |
0.1837 | 7.9178 | 30000 | 0.1503 | 26822096 |
0.112 | 7.9706 | 30200 | 0.1469 | 27000688 |
0.1671 | 8.0232 | 30400 | 0.1441 | 27178304 |
0.1247 | 8.0760 | 30600 | 0.1457 | 27356864 |
0.1153 | 8.1288 | 30800 | 0.1423 | 27536640 |
0.1171 | 8.1816 | 31000 | 0.1469 | 27714496 |
0.204 | 8.2344 | 31200 | 0.1436 | 27893536 |
0.1062 | 8.2872 | 31400 | 0.1434 | 28071808 |
0.1214 | 8.3400 | 31600 | 0.1456 | 28250112 |
0.1444 | 8.3928 | 31800 | 0.1419 | 28428832 |
0.1068 | 8.4456 | 32000 | 0.1418 | 28607296 |
0.1513 | 8.4984 | 32200 | 0.1408 | 28787040 |
0.1479 | 8.5511 | 32400 | 0.1420 | 28966560 |
0.1023 | 8.6039 | 32600 | 0.1447 | 29144544 |
0.2131 | 8.6567 | 32800 | 0.1401 | 29323040 |
0.1279 | 8.7095 | 33000 | 0.1418 | 29502176 |
0.1017 | 8.7623 | 33200 | 0.1400 | 29682560 |
0.1098 | 8.8151 | 33400 | 0.1420 | 29860768 |
0.1124 | 8.8679 | 33600 | 0.1403 | 30038624 |
0.0983 | 8.9207 | 33800 | 0.1402 | 30216384 |
0.1087 | 8.9735 | 34000 | 0.1389 | 30395872 |
0.1062 | 9.0261 | 34200 | 0.1416 | 30573920 |
0.0826 | 9.0789 | 34400 | 0.1386 | 30753536 |
0.1712 | 9.1317 | 34600 | 0.1407 | 30931776 |
0.1202 | 9.1845 | 34800 | 0.1404 | 31110592 |
0.1284 | 9.2373 | 35000 | 0.1397 | 31288160 |
0.0793 | 9.2901 | 35200 | 0.1398 | 31465760 |
0.0728 | 9.3429 | 35400 | 0.1392 | 31643168 |
0.0993 | 9.3957 | 35600 | 0.1383 | 31821856 |
0.116 | 9.4485 | 35800 | 0.1385 | 31998368 |
0.1251 | 9.5013 | 36000 | 0.1378 | 32178176 |
0.1478 | 9.5540 | 36200 | 0.1374 | 32356768 |
0.1491 | 9.6068 | 36400 | 0.1371 | 32537792 |
0.0952 | 9.6596 | 36600 | 0.1375 | 32714880 |
0.1537 | 9.7124 | 36800 | 0.1366 | 32893312 |
0.1055 | 9.7652 | 37000 | 0.1370 | 33071840 |
0.1323 | 9.8180 | 37200 | 0.1370 | 33251936 |
0.134 | 9.8708 | 37400 | 0.1370 | 33431008 |
0.1264 | 9.9236 | 37600 | 0.1375 | 33610816 |
0.1379 | 9.9764 | 37800 | 0.1372 | 33790528 |
0.1111 | 10.0290 | 38000 | 0.1369 | 33967520 |
0.089 | 10.0818 | 38200 | 0.1368 | 34145280 |
0.1279 | 10.1346 | 38400 | 0.1373 | 34324448 |
0.0847 | 10.1874 | 38600 | 0.1373 | 34503808 |
0.0885 | 10.2402 | 38800 | 0.1371 | 34683552 |
0.1121 | 10.2930 | 39000 | 0.1369 | 34860896 |
0.1661 | 10.3458 | 39200 | 0.1370 | 35039424 |
0.1234 | 10.3986 | 39400 | 0.1369 | 35217792 |
0.1109 | 10.4514 | 39600 | 0.1373 | 35396000 |
0.1118 | 10.5042 | 39800 | 0.1374 | 35575872 |
0.1245 | 10.5569 | 40000 | 0.1372 | 35754976 |
### Framework versions
- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
## Model tree for rbelanec/train_sst2_1744902622

- Base model: meta-llama/Meta-Llama-3-8B-Instruct