# train_sst2_1744902625
This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.3 on the sst2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.0713
- Num Input Tokens Seen: 33458560
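Since PEFT is listed under the framework versions below, the fine-tuned weights are presumably published as a PEFT adapter on top of the instruct base model. The following is a minimal loading-and-inference sketch, not the documented usage: the adapter repo id is taken from the model tree at the end of this card, and the prompt format is an assumption, since the card does not describe the template used for SST-2.

```python
# Sketch: load the adapter on top of mistralai/Mistral-7B-Instruct-v0.3 with PEFT.
# The adapter repo id and the prompt wording are assumptions, not documented in this card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"
adapter_id = "rbelanec/train_sst2_1744902625"  # assumed adapter repo (see model tree below)

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

# Hypothetical SST-2 style prompt; adjust to whatever template the adapter was trained with.
prompt = (
    "Classify the sentiment of the following sentence as positive or negative.\n"
    "Sentence: a gorgeous, witty, seductive movie.\n"
    "Sentiment:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```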
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- training_steps: 40000
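For reference, the listed hyperparameters map roughly onto `transformers.TrainingArguments` as sketched below. The evaluation and logging cadence (every 200 steps) is inferred from the results table rather than listed above, the effective batch size of 16 assumes a single device, and the model/dataset wiring is omitted.

```python
# Sketch: the hyperparameters above expressed as transformers.TrainingArguments.
# Only values stated or inferred from this card are set; everything else is left at defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_sst2_1744902625",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=123,
    gradient_accumulation_steps=4,   # 4 x 4 = effective train batch size of 16 on one device
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    max_steps=40000,
    eval_strategy="steps",
    eval_steps=200,                  # inferred from the evaluation cadence in the results table
    logging_steps=200,               # inferred from the training-loss cadence in the results table
)
```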
### Training results
Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
---|---|---|---|---|
0.1318 | 0.0528 | 200 | 0.1586 | 166688 |
0.1951 | 0.1056 | 400 | 0.1397 | 334048 |
0.0693 | 0.1584 | 600 | 0.1319 | 500448 |
0.0969 | 0.2112 | 800 | 0.1235 | 667872 |
0.1196 | 0.2640 | 1000 | 0.1205 | 834848 |
0.1125 | 0.3167 | 1200 | 0.1172 | 1002816 |
0.0759 | 0.3695 | 1400 | 0.1145 | 1169088 |
0.2051 | 0.4223 | 1600 | 0.1123 | 1337088 |
0.1022 | 0.4751 | 1800 | 0.1098 | 1505536 |
0.1273 | 0.5279 | 2000 | 0.1084 | 1673024 |
0.1299 | 0.5807 | 2200 | 0.1064 | 1842304 |
0.1179 | 0.6335 | 2400 | 0.1051 | 2007328 |
0.0799 | 0.6863 | 2600 | 0.1043 | 2174880 |
0.1575 | 0.7391 | 2800 | 0.1027 | 2341280 |
0.116 | 0.7919 | 3000 | 0.1017 | 2509440 |
0.0956 | 0.8447 | 3200 | 0.1001 | 2674784 |
0.0767 | 0.8975 | 3400 | 0.0986 | 2843680 |
0.1587 | 0.9502 | 3600 | 0.1008 | 3011904 |
0.0879 | 1.0029 | 3800 | 0.0968 | 3178064 |
0.1003 | 1.0557 | 4000 | 0.0961 | 3345904 |
0.0937 | 1.1085 | 4200 | 0.0947 | 3514608 |
0.0834 | 1.1613 | 4400 | 0.0943 | 3680560 |
0.0678 | 1.2141 | 4600 | 0.0933 | 3849328 |
0.1148 | 1.2669 | 4800 | 0.0934 | 4017200 |
0.0771 | 1.3197 | 5000 | 0.0917 | 4187184 |
0.0694 | 1.3724 | 5200 | 0.0915 | 4354416 |
0.0979 | 1.4252 | 5400 | 0.0907 | 4519856 |
0.1024 | 1.4780 | 5600 | 0.0901 | 4687280 |
0.0657 | 1.5308 | 5800 | 0.0896 | 4856112 |
0.0484 | 1.5836 | 6000 | 0.0896 | 5022736 |
0.1531 | 1.6364 | 6200 | 0.0881 | 5188656 |
0.0504 | 1.6892 | 6400 | 0.0878 | 5356208 |
0.1301 | 1.7420 | 6600 | 0.0874 | 5523952 |
0.1359 | 1.7948 | 6800 | 0.0867 | 5690672 |
0.0949 | 1.8476 | 7000 | 0.0864 | 5857072 |
0.0511 | 1.9004 | 7200 | 0.0864 | 6024976 |
0.0878 | 1.9531 | 7400 | 0.0859 | 6191664 |
0.0956 | 2.0058 | 7600 | 0.0850 | 6357472 |
0.1002 | 2.0586 | 7800 | 0.0851 | 6525984 |
0.0851 | 2.1114 | 8000 | 0.0841 | 6692320 |
0.0766 | 2.1642 | 8200 | 0.0838 | 6860064 |
0.1541 | 2.2170 | 8400 | 0.0846 | 7026528 |
0.0482 | 2.2698 | 8600 | 0.0834 | 7192384 |
0.1002 | 2.3226 | 8800 | 0.0832 | 7358816 |
0.0775 | 2.3753 | 9000 | 0.0827 | 7526496 |
0.0724 | 2.4281 | 9200 | 0.0824 | 7696064 |
0.0841 | 2.4809 | 9400 | 0.0831 | 7863456 |
0.1016 | 2.5337 | 9600 | 0.0818 | 8031776 |
0.0756 | 2.5865 | 9800 | 0.0819 | 8199584 |
0.0949 | 2.6393 | 10000 | 0.0810 | 8366016 |
0.0677 | 2.6921 | 10200 | 0.0812 | 8531808 |
0.0611 | 2.7449 | 10400 | 0.0807 | 8702976 |
0.0474 | 2.7977 | 10600 | 0.0804 | 8870944 |
0.0933 | 2.8505 | 10800 | 0.0812 | 9039680 |
0.1127 | 2.9033 | 11000 | 0.0813 | 9206880 |
0.0633 | 2.9561 | 11200 | 0.0802 | 9372128 |
0.0816 | 3.0087 | 11400 | 0.0794 | 9538768 |
0.0781 | 3.0615 | 11600 | 0.0791 | 9705232 |
0.0599 | 3.1143 | 11800 | 0.0793 | 9871632 |
0.0713 | 3.1671 | 12000 | 0.0794 | 10039472 |
0.0291 | 3.2199 | 12200 | 0.0789 | 10206320 |
0.0547 | 3.2727 | 12400 | 0.0785 | 10376240 |
0.0882 | 3.3255 | 12600 | 0.0787 | 10544464 |
0.0322 | 3.3782 | 12800 | 0.0781 | 10712240 |
0.0395 | 3.4310 | 13000 | 0.0778 | 10879120 |
0.0472 | 3.4838 | 13200 | 0.0779 | 11045072 |
0.0689 | 3.5366 | 13400 | 0.0781 | 11211312 |
0.09 | 3.5894 | 13600 | 0.0774 | 11378128 |
0.0392 | 3.6422 | 13800 | 0.0780 | 11544592 |
0.1368 | 3.6950 | 14000 | 0.0771 | 11713040 |
0.1223 | 3.7478 | 14200 | 0.0774 | 11880432 |
0.106 | 3.8006 | 14400 | 0.0765 | 12048176 |
0.049 | 3.8534 | 14600 | 0.0771 | 12215792 |
0.0427 | 3.9062 | 14800 | 0.0769 | 12383792 |
0.052 | 3.9590 | 15000 | 0.0764 | 12549680 |
0.0927 | 4.0116 | 15200 | 0.0763 | 12716448 |
0.0437 | 4.0644 | 15400 | 0.0767 | 12882752 |
0.0549 | 4.1172 | 15600 | 0.0764 | 13051200 |
0.0587 | 4.1700 | 15800 | 0.0761 | 13217024 |
0.0562 | 4.2228 | 16000 | 0.0757 | 13382784 |
0.0657 | 4.2756 | 16200 | 0.0764 | 13549216 |
0.0374 | 4.3284 | 16400 | 0.0752 | 13719072 |
0.1196 | 4.3812 | 16600 | 0.0752 | 13884928 |
0.0847 | 4.4339 | 16800 | 0.0751 | 14051584 |
0.0485 | 4.4867 | 17000 | 0.0769 | 14220704 |
0.0352 | 4.5395 | 17200 | 0.0749 | 14387008 |
0.1084 | 4.5923 | 17400 | 0.0749 | 14555808 |
0.0591 | 4.6451 | 17600 | 0.0755 | 14723456 |
0.116 | 4.6979 | 17800 | 0.0749 | 14890880 |
0.0692 | 4.7507 | 18000 | 0.0755 | 15059744 |
0.0686 | 4.8035 | 18200 | 0.0746 | 15224512 |
0.1239 | 4.8563 | 18400 | 0.0744 | 15392960 |
0.0474 | 4.9091 | 18600 | 0.0744 | 15561696 |
0.0925 | 4.9619 | 18800 | 0.0744 | 15728800 |
0.0724 | 5.0145 | 19000 | 0.0741 | 15897552 |
0.0674 | 5.0673 | 19200 | 0.0740 | 16064688 |
0.0695 | 5.1201 | 19400 | 0.0740 | 16231120 |
0.0706 | 5.1729 | 19600 | 0.0737 | 16397744 |
0.1331 | 5.2257 | 19800 | 0.0738 | 16564176 |
0.0663 | 5.2785 | 20000 | 0.0737 | 16731600 |
0.0327 | 5.3313 | 20200 | 0.0748 | 16898064 |
0.0879 | 5.3841 | 20400 | 0.0738 | 17064080 |
0.0532 | 5.4368 | 20600 | 0.0736 | 17231888 |
0.0614 | 5.4896 | 20800 | 0.0735 | 17399184 |
0.0563 | 5.5424 | 21000 | 0.0745 | 17566160 |
0.0631 | 5.5952 | 21200 | 0.0736 | 17732304 |
0.0431 | 5.6480 | 21400 | 0.0733 | 17900880 |
0.0466 | 5.7008 | 21600 | 0.0733 | 18070192 |
0.0843 | 5.7536 | 21800 | 0.0732 | 18237168 |
0.0494 | 5.8064 | 22000 | 0.0731 | 18403856 |
0.1229 | 5.8592 | 22200 | 0.0732 | 18571248 |
0.0307 | 5.9120 | 22400 | 0.0731 | 18738672 |
0.0534 | 5.9648 | 22600 | 0.0730 | 18905744 |
0.0806 | 6.0174 | 22800 | 0.0731 | 19073440 |
0.0733 | 6.0702 | 23000 | 0.0732 | 19241920 |
0.1169 | 6.1230 | 23200 | 0.0732 | 19409408 |
0.0757 | 6.1758 | 23400 | 0.0731 | 19577024 |
0.0495 | 6.2286 | 23600 | 0.0728 | 19744608 |
0.0752 | 6.2814 | 23800 | 0.0727 | 19911488 |
0.0694 | 6.3342 | 24000 | 0.0726 | 20078944 |
0.0617 | 6.3870 | 24200 | 0.0727 | 20244928 |
0.093 | 6.4398 | 24400 | 0.0725 | 20411232 |
0.0579 | 6.4925 | 24600 | 0.0728 | 20578080 |
0.0712 | 6.5453 | 24800 | 0.0725 | 20746592 |
0.1026 | 6.5981 | 25000 | 0.0727 | 20913344 |
0.0384 | 6.6509 | 25200 | 0.0725 | 21081952 |
0.0928 | 6.7037 | 25400 | 0.0724 | 21248384 |
0.0907 | 6.7565 | 25600 | 0.0723 | 21415872 |
0.0511 | 6.8093 | 25800 | 0.0729 | 21584000 |
0.1154 | 6.8621 | 26000 | 0.0723 | 21751168 |
0.0398 | 6.9149 | 26200 | 0.0722 | 21918816 |
0.0674 | 6.9677 | 26400 | 0.0723 | 22084384 |
0.0688 | 7.0203 | 26600 | 0.0722 | 22251776 |
0.0766 | 7.0731 | 26800 | 0.0722 | 22418080 |
0.0622 | 7.1259 | 27000 | 0.0721 | 22587392 |
0.0562 | 7.1787 | 27200 | 0.0721 | 22753056 |
0.0631 | 7.2315 | 27400 | 0.0724 | 22920768 |
0.0828 | 7.2843 | 27600 | 0.0718 | 23087296 |
0.0412 | 7.3371 | 27800 | 0.0721 | 23254400 |
0.0324 | 7.3899 | 28000 | 0.0721 | 23422752 |
0.0441 | 7.4427 | 28200 | 0.0721 | 23588352 |
0.0616 | 7.4954 | 28400 | 0.0723 | 23755840 |
0.0565 | 7.5482 | 28600 | 0.0721 | 23923680 |
0.0559 | 7.6010 | 28800 | 0.0719 | 24091168 |
0.0394 | 7.6538 | 29000 | 0.0721 | 24258016 |
0.0899 | 7.7066 | 29200 | 0.0718 | 24427808 |
0.0231 | 7.7594 | 29400 | 0.0718 | 24596288 |
0.0492 | 7.8122 | 29600 | 0.0718 | 24764192 |
0.0627 | 7.8650 | 29800 | 0.0719 | 24932000 |
0.0346 | 7.9178 | 30000 | 0.0718 | 25100224 |
0.0597 | 7.9706 | 30200 | 0.0722 | 25267808 |
0.0569 | 8.0232 | 30400 | 0.0720 | 25433440 |
0.0757 | 8.0760 | 30600 | 0.0717 | 25600672 |
0.0524 | 8.1288 | 30800 | 0.0718 | 25769408 |
0.0424 | 8.1816 | 31000 | 0.0717 | 25936160 |
0.0652 | 8.2344 | 31200 | 0.0718 | 26103744 |
0.0822 | 8.2872 | 31400 | 0.0715 | 26270560 |
0.0691 | 8.3400 | 31600 | 0.0719 | 26437536 |
0.031 | 8.3928 | 31800 | 0.0719 | 26604480 |
0.0484 | 8.4456 | 32000 | 0.0716 | 26771680 |
0.1148 | 8.4984 | 32200 | 0.0716 | 26940256 |
0.073 | 8.5511 | 32400 | 0.0715 | 27107680 |
0.0813 | 8.6039 | 32600 | 0.0718 | 27274048 |
0.1232 | 8.6567 | 32800 | 0.0717 | 27440544 |
0.0994 | 8.7095 | 33000 | 0.0716 | 27608000 |
0.0363 | 8.7623 | 33200 | 0.0715 | 27776704 |
0.016 | 8.8151 | 33400 | 0.0717 | 27942752 |
0.0744 | 8.8679 | 33600 | 0.0716 | 28108864 |
0.0325 | 8.9207 | 33800 | 0.0714 | 28275296 |
0.0517 | 8.9735 | 34000 | 0.0716 | 28443520 |
0.028 | 9.0261 | 34200 | 0.0716 | 28609776 |
0.061 | 9.0789 | 34400 | 0.0716 | 28777712 |
0.1408 | 9.1317 | 34600 | 0.0717 | 28944144 |
0.0362 | 9.1845 | 34800 | 0.0716 | 29111152 |
0.0993 | 9.2373 | 35000 | 0.0716 | 29278000 |
0.0391 | 9.2901 | 35200 | 0.0716 | 29443792 |
0.0398 | 9.3429 | 35400 | 0.0716 | 29609072 |
0.0981 | 9.3957 | 35600 | 0.0715 | 29776592 |
0.0716 | 9.4485 | 35800 | 0.0716 | 29941616 |
0.066 | 9.5013 | 36000 | 0.0717 | 30110160 |
0.0694 | 9.5540 | 36200 | 0.0716 | 30277744 |
0.1284 | 9.6068 | 36400 | 0.0716 | 30447152 |
0.028 | 9.6596 | 36600 | 0.0713 | 30612976 |
0.0429 | 9.7124 | 36800 | 0.0714 | 30780240 |
0.0227 | 9.7652 | 37000 | 0.0715 | 30948048 |
0.05 | 9.8180 | 37200 | 0.0715 | 31116368 |
0.0342 | 9.8708 | 37400 | 0.0715 | 31283888 |
0.0368 | 9.9236 | 37600 | 0.0716 | 31452560 |
0.0681 | 9.9764 | 37800 | 0.0714 | 31620720 |
0.0867 | 10.0290 | 38000 | 0.0713 | 31786016 |
0.0869 | 10.0818 | 38200 | 0.0715 | 31952768 |
0.0735 | 10.1346 | 38400 | 0.0714 | 32120320 |
0.0173 | 10.1874 | 38600 | 0.0715 | 32287584 |
0.0469 | 10.2402 | 38800 | 0.0716 | 32455072 |
0.0459 | 10.2930 | 39000 | 0.0713 | 32621184 |
0.0397 | 10.3458 | 39200 | 0.0714 | 32788960 |
0.0401 | 10.3986 | 39400 | 0.0716 | 32955776 |
0.0332 | 10.4514 | 39600 | 0.0716 | 33122816 |
0.0907 | 10.5042 | 39800 | 0.0716 | 33291072 |
0.0616 | 10.5569 | 40000 | 0.0716 | 33458560 |
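The validation-loss curve above can be reproduced from a checkpoint's `trainer_state.json`, which the `Trainer` writes into the output directory. A minimal plotting sketch follows; the file path is an assumption.

```python
# Sketch: plot validation loss over steps from a Trainer checkpoint's trainer_state.json.
# The path below is a placeholder for wherever the checkpoint was saved.
import json
import matplotlib.pyplot as plt

with open("trainer_state.json") as f:
    state = json.load(f)

# Entries containing "eval_loss" correspond to the evaluation rows in the table above.
eval_points = [(e["step"], e["eval_loss"]) for e in state["log_history"] if "eval_loss" in e]
steps, losses = zip(*eval_points)

plt.plot(steps, losses)
plt.xlabel("Step")
plt.ylabel("Validation loss")
plt.title("train_sst2_1744902625: validation loss")
plt.show()
```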
### Framework versions
- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1
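A small sanity-check sketch, assuming these packages are installed locally, for confirming that the environment matches the pinned versions:

```python
# Sketch: compare installed package versions against the versions pinned in this card.
import peft, transformers, torch, datasets, tokenizers

expected = {
    "peft": "0.15.1",
    "transformers": "4.51.3",
    "torch": "2.6.0+cu124",
    "datasets": "3.5.0",
    "tokenizers": "0.21.1",
}
installed = {
    "peft": peft.__version__,
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"mismatch (expected {want})"
    print(f"{name}: {have} {status}")
```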
## Model tree for rbelanec/train_sst2_1744902625

- Base model: mistralai/Mistral-7B-v0.3
- Fine-tuned from: mistralai/Mistral-7B-Instruct-v0.3