---
library_name: peft
license: gemma
base_model: google/gemma-3-1b-it
tags:
- llama-factory
- prompt-tuning
- generated_from_trainer
model-index:
- name: train_qqp_1744902593
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# train_qqp_1744902593

This model is a fine-tuned version of [google/gemma-3-1b-it](https://huggingface.co/google/gemma-3-1b-it) on the qqp dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0834
- Num Input Tokens Seen: 51858816

## Model description

train_qqp_1744902593 is a PEFT prompt-tuning adapter for [google/gemma-3-1b-it](https://huggingface.co/google/gemma-3-1b-it), trained with LLaMA-Factory on the qqp dataset. Prompt tuning trains only a small set of soft-prompt (virtual token) embeddings while the base model weights stay frozen, so this repository contains just the adapter rather than full model weights. A minimal loading sketch is shown below.
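
Because only the adapter is stored here, it must be loaded on top of the base model. The snippet below is a minimal, hedged sketch: the adapter repository id is a placeholder (this card does not record where the adapter is published), and the exact prompt format used during training is not documented here.

```python
# Hedged loading sketch -- the adapter id below is a placeholder, not a value from this card.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "google/gemma-3-1b-it"
adapter_id = "your-username/train_qqp_1744902593"  # placeholder: replace with the real repo id or a local path

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attaches the trained soft prompt

# Prompt tuning prepends the learned virtual tokens internally, so generation works as usual.
messages = [{"role": "user", "content": "Do these two questions ask the same thing?\nQ1: How can I learn Python quickly?\nQ2: What is the fastest way to learn Python?"}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
output_ids = model.generate(input_ids=input_ids, max_new_tokens=16)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```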

## Intended uses & limitations

More information needed

## Training and evaluation data

The adapter was trained and evaluated on the qqp dataset (presumably Quora Question Pairs, a duplicate-question detection task) as prepared by LLaMA-Factory; the evaluation split and the prompt formatting used are not recorded in this card.
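
Assuming qqp here refers to the GLUE Quora Question Pairs task (the usual meaning of the name), the raw data can be inspected with the `datasets` library as sketched below; this does not reproduce whatever instruction formatting LLaMA-Factory applied before training.

```python
# Hedged sketch: assumes "qqp" means the GLUE Quora Question Pairs configuration.
from datasets import load_dataset

qqp = load_dataset("nyu-mll/glue", "qqp")  # splits: train / validation / test
example = qqp["train"][0]
print(example["question1"], example["question2"], example["label"], sep="\n")  # label: 1 = duplicate, 0 = not
```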

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative configuration sketch follows the list):
- learning_rate: 0.3
- train_batch_size: 4
- eval_batch_size: 4
- seed: 123
- gradient_accumulation_steps: 4
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: cosine
- training_steps: 40000
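
For orientation, these values map onto a PEFT prompt-tuning setup roughly as follows. This is an illustrative reconstruction, not the exact LLaMA-Factory configuration: the number of virtual tokens is not recorded in this card, and the evaluation/logging interval of 200 steps is inferred from the results table below.

```python
# Illustrative reconstruction of the recipe above; num_virtual_tokens is a placeholder.
from peft import PromptTuningConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM, TrainingArguments

base = AutoModelForCausalLM.from_pretrained("google/gemma-3-1b-it")
peft_config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=20,  # placeholder: not stated in this card
)
model = get_peft_model(base, peft_config)  # only the soft-prompt embeddings are trainable

args = TrainingArguments(
    output_dir="train_qqp_1744902593",
    learning_rate=0.3,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,   # 4 x 4 = total train batch size 16
    seed=123,
    optim="adamw_torch",             # betas=(0.9, 0.999) and epsilon=1e-08 are its defaults
    lr_scheduler_type="cosine",
    max_steps=40000,
    eval_strategy="steps",
    eval_steps=200,                  # inferred from the results table
    logging_steps=200,
)
# Pass `model`, `args`, and the formatted qqp splits to a transformers Trainer
# (or run the equivalent LLaMA-Factory command) to reproduce a comparable run.
```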

### Training results

| Training Loss | Epoch | Step | Validation Loss | Input Tokens Seen |
|:-------------:|:------:|:-----:|:---------------:|:-----------------:|
| 0.1845 | 0.0098 | 200 | 0.1865 | 260832 |
| 0.1357 | 0.0195 | 400 | 0.1530 | 518880 |
| 0.1369 | 0.0293 | 600 | 0.1775 | 780768 |
| 0.1529 | 0.0391 | 800 | 0.1497 | 1038304 |
| 0.1607 | 0.0489 | 1000 | 0.1531 | 1296288 |
| 0.1522 | 0.0586 | 1200 | 0.1732 | 1554400 |
| 0.1425 | 0.0684 | 1400 | 0.1563 | 1813856 |
| 0.1449 | 0.0782 | 1600 | 0.1475 | 2074816 |
| 0.1475 | 0.0879 | 1800 | 0.1589 | 2332544 |
| 0.137 | 0.0977 | 2000 | 0.1468 | 2594720 |
| 0.1557 | 0.1075 | 2200 | 0.1660 | 2853856 |
| 0.1409 | 0.1173 | 2400 | 0.1445 | 3112928 |
| 0.1626 | 0.1270 | 2600 | 0.1463 | 3374048 |
| 0.117 | 0.1368 | 2800 | 0.1490 | 3637152 |
| 0.1551 | 0.1466 | 3000 | 0.1501 | 3896512 |
| 0.1438 | 0.1564 | 3200 | 0.1470 | 4157216 |
| 0.1477 | 0.1661 | 3400 | 0.1468 | 4418592 |
| 0.1282 | 0.1759 | 3600 | 0.1482 | 4677248 |
| 0.1609 | 0.1857 | 3800 | 0.1421 | 4934080 |
| 0.1478 | 0.1954 | 4000 | 0.1382 | 5191936 |
| 0.1375 | 0.2052 | 4200 | 0.1396 | 5451200 |
| 0.1283 | 0.2150 | 4400 | 0.1436 | 5711648 |
| 0.1314 | 0.2248 | 4600 | 0.1376 | 5970048 |
| 0.1196 | 0.2345 | 4800 | 0.1407 | 6226272 |
| 0.111 | 0.2443 | 5000 | 0.1341 | 6486336 |
| 0.1272 | 0.2541 | 5200 | 0.1347 | 6744864 |
| 0.1251 | 0.2638 | 5400 | 0.1302 | 7006944 |
| 0.125 | 0.2736 | 5600 | 0.1256 | 7267584 |
| 0.1792 | 0.2834 | 5800 | 0.1430 | 7529280 |
| 0.1262 | 0.2932 | 6000 | 0.1473 | 7788992 |
| 0.1255 | 0.3029 | 6200 | 0.1346 | 8052736 |
| 0.1161 | 0.3127 | 6400 | 0.1272 | 8311808 |
| 0.1323 | 0.3225 | 6600 | 0.1323 | 8568416 |
| 0.1171 | 0.3323 | 6800 | 0.1348 | 8830400 |
| 0.1009 | 0.3420 | 7000 | 0.1383 | 9091040 |
| 0.1267 | 0.3518 | 7200 | 0.1242 | 9350272 |
| 0.1283 | 0.3616 | 7400 | 0.1259 | 9609312 |
| 0.1166 | 0.3713 | 7600 | 0.1234 | 9867648 |
| 0.1238 | 0.3811 | 7800 | 0.1205 | 10127328 |
| 0.1282 | 0.3909 | 8000 | 0.1210 | 10383808 |
| 0.1402 | 0.4007 | 8200 | 0.1194 | 10643424 |
| 0.1137 | 0.4104 | 8400 | 0.1418 | 10901760 |
| 0.1264 | 0.4202 | 8600 | 0.1229 | 11159584 |
| 0.1273 | 0.4300 | 8800 | 0.1183 | 11420640 |
| 0.1054 | 0.4397 | 9000 | 0.1203 | 11683072 |
| 0.0973 | 0.4495 | 9200 | 0.1146 | 11941600 |
| 0.1241 | 0.4593 | 9400 | 0.1186 | 12198528 |
| 0.1244 | 0.4691 | 9600 | 0.1284 | 12455968 |
| 0.1204 | 0.4788 | 9800 | 0.1117 | 12716992 |
| 0.0962 | 0.4886 | 10000 | 0.1182 | 12974048 |
| 0.0938 | 0.4984 | 10200 | 0.1114 | 13231360 |
| 0.0973 | 0.5081 | 10400 | 0.1162 | 13489760 |
| 0.1063 | 0.5179 | 10600 | 0.1102 | 13750592 |
| 0.1189 | 0.5277 | 10800 | 0.1123 | 14009088 |
| 0.1164 | 0.5375 | 11000 | 0.1126 | 14268352 |
| 0.0954 | 0.5472 | 11200 | 0.1158 | 14527072 |
| 0.1284 | 0.5570 | 11400 | 0.1140 | 14787040 |
| 0.1099 | 0.5668 | 11600 | 0.1116 | 15045600 |
| 0.0898 | 0.5766 | 11800 | 0.1122 | 15306176 |
| 0.1014 | 0.5863 | 12000 | 0.1101 | 15565184 |
| 0.1279 | 0.5961 | 12200 | 0.1206 | 15824576 |
| 0.1099 | 0.6059 | 12400 | 0.1139 | 16083104 |
| 0.1128 | 0.6156 | 12600 | 0.1065 | 16342784 |
| 0.0954 | 0.6254 | 12800 | 0.1108 | 16601824 |
| 0.1 | 0.6352 | 13000 | 0.1078 | 16860320 |
| 0.1017 | 0.6450 | 13200 | 0.1070 | 17118528 |
| 0.1085 | 0.6547 | 13400 | 0.1096 | 17378528 |
| 0.0944 | 0.6645 | 13600 | 0.1084 | 17638400 |
| 0.0976 | 0.6743 | 13800 | 0.1095 | 17898336 |
| 0.1022 | 0.6840 | 14000 | 0.1127 | 18158528 |
| 0.1218 | 0.6938 | 14200 | 0.1051 | 18418528 |
| 0.1068 | 0.7036 | 14400 | 0.1221 | 18679264 |
| 0.0971 | 0.7134 | 14600 | 0.1046 | 18940320 |
| 0.0899 | 0.7231 | 14800 | 0.1046 | 19196416 |
| 0.1098 | 0.7329 | 15000 | 0.1061 | 19454912 |
| 0.0999 | 0.7427 | 15200 | 0.1030 | 19715616 |
| 0.1159 | 0.7524 | 15400 | 0.1056 | 19976768 |
| 0.1484 | 0.7622 | 15600 | 0.1052 | 20234592 |
| 0.0815 | 0.7720 | 15800 | 0.1022 | 20493056 |
| 0.0977 | 0.7818 | 16000 | 0.1015 | 20750368 |
| 0.1011 | 0.7915 | 16200 | 0.1002 | 21010432 |
| 0.0978 | 0.8013 | 16400 | 0.1017 | 21270112 |
| 0.1306 | 0.8111 | 16600 | 0.1031 | 21531456 |
| 0.0884 | 0.8209 | 16800 | 0.1047 | 21788384 |
| 0.0938 | 0.8306 | 17000 | 0.1015 | 22045600 |
| 0.089 | 0.8404 | 17200 | 0.1006 | 22303808 |
| 0.0837 | 0.8502 | 17400 | 0.1000 | 22562496 |
| 0.0792 | 0.8599 | 17600 | 0.1037 | 22821376 |
| 0.1053 | 0.8697 | 17800 | 0.1070 | 23080448 |
| 0.1144 | 0.8795 | 18000 | 0.0996 | 23338016 |
| 0.1452 | 0.8893 | 18200 | 0.1016 | 23598208 |
| 0.1016 | 0.8990 | 18400 | 0.0998 | 23857824 |
| 0.069 | 0.9088 | 18600 | 0.1044 | 24117056 |
| 0.0994 | 0.9186 | 18800 | 0.0984 | 24375456 |
| 0.0993 | 0.9283 | 19000 | 0.1011 | 24635712 |
| 0.1135 | 0.9381 | 19200 | 0.1038 | 24895360 |
| 0.1089 | 0.9479 | 19400 | 0.0978 | 25156480 |
| 0.1263 | 0.9577 | 19600 | 0.0971 | 25415936 |
| 0.1038 | 0.9674 | 19800 | 0.0984 | 25677472 |
| 0.0826 | 0.9772 | 20000 | 0.0982 | 25934656 |
| 0.1066 | 0.9870 | 20200 | 0.0971 | 26193248 |
| 0.0992 | 0.9968 | 20400 | 0.0959 | 26449184 |
| 0.1161 | 1.0065 | 20600 | 0.0994 | 26710048 |
| 0.1113 | 1.0163 | 20800 | 0.0943 | 26968800 |
| 0.0776 | 1.0261 | 21000 | 0.1056 | 27230240 |
| 0.07 | 1.0359 | 21200 | 0.0986 | 27489152 |
| 0.1263 | 1.0456 | 21400 | 0.1003 | 27746528 |
| 0.0887 | 1.0554 | 21600 | 0.0972 | 28009568 |
| 0.0905 | 1.0652 | 21800 | 0.0948 | 28270592 |
| 0.0639 | 1.0750 | 22000 | 0.0981 | 28533952 |
| 0.092 | 1.0847 | 22200 | 0.0935 | 28788352 |
| 0.0877 | 1.0945 | 22400 | 0.0944 | 29047328 |
| 0.1056 | 1.1043 | 22600 | 0.0956 | 29306368 |
| 0.0945 | 1.1140 | 22800 | 0.0938 | 29567616 |
| 0.0968 | 1.1238 | 23000 | 0.0925 | 29829920 |
| 0.0798 | 1.1336 | 23200 | 0.0925 | 30092128 |
| 0.0825 | 1.1434 | 23400 | 0.0941 | 30349984 |
| 0.1152 | 1.1531 | 23600 | 0.0944 | 30605344 |
| 0.0853 | 1.1629 | 23800 | 0.0917 | 30867648 |
| 0.0941 | 1.1727 | 24000 | 0.0917 | 31127744 |
| 0.0994 | 1.1824 | 24200 | 0.0918 | 31383392 |
| 0.0724 | 1.1922 | 24400 | 0.0973 | 31641056 |
| 0.098 | 1.2020 | 24600 | 0.0920 | 31900960 |
| 0.1033 | 1.2118 | 24800 | 0.0912 | 32158304 |
| 0.0739 | 1.2215 | 25000 | 0.0919 | 32419552 |
| 0.058 | 1.2313 | 25200 | 0.0948 | 32677888 |
| 0.0679 | 1.2411 | 25400 | 0.0916 | 32936608 |
| 0.1096 | 1.2508 | 25600 | 0.0964 | 33195264 |
| 0.0767 | 1.2606 | 25800 | 0.0946 | 33454720 |
| 0.0938 | 1.2704 | 26000 | 0.0894 | 33714496 |
| 0.0885 | 1.2802 | 26200 | 0.0903 | 33972576 |
| 0.1014 | 1.2899 | 26400 | 0.0899 | 34231488 |
| 0.106 | 1.2997 | 26600 | 0.0915 | 34491904 |
| 0.076 | 1.3095 | 26800 | 0.0887 | 34751008 |
| 0.0884 | 1.3193 | 27000 | 0.0900 | 35006432 |
| 0.0715 | 1.3290 | 27200 | 0.0913 | 35264896 |
| 0.0641 | 1.3388 | 27400 | 0.0899 | 35523424 |
| 0.0644 | 1.3486 | 27600 | 0.0902 | 35781024 |
| 0.0943 | 1.3583 | 27800 | 0.0892 | 36040224 |
| 0.066 | 1.3681 | 28000 | 0.0901 | 36297952 |
| 0.1466 | 1.3779 | 28200 | 0.0880 | 36557056 |
| 0.109 | 1.3877 | 28400 | 0.0881 | 36815904 |
| 0.0693 | 1.3974 | 28600 | 0.0914 | 37076064 |
| 0.089 | 1.4072 | 28800 | 0.0892 | 37333536 |
| 0.0607 | 1.4170 | 29000 | 0.0901 | 37593216 |
| 0.0662 | 1.4267 | 29200 | 0.0879 | 37850816 |
| 0.0756 | 1.4365 | 29400 | 0.0872 | 38111232 |
| 0.0639 | 1.4463 | 29600 | 0.0868 | 38370144 |
| 0.0729 | 1.4561 | 29800 | 0.0877 | 38629280 |
| 0.0771 | 1.4658 | 30000 | 0.0872 | 38887744 |
| 0.0929 | 1.4756 | 30200 | 0.0878 | 39146016 |
| 0.0642 | 1.4854 | 30400 | 0.0871 | 39406240 |
| 0.0639 | 1.4952 | 30600 | 0.0871 | 39664736 |
| 0.0726 | 1.5049 | 30800 | 0.0869 | 39922240 |
| 0.0909 | 1.5147 | 31000 | 0.0859 | 40181504 |
| 0.083 | 1.5245 | 31200 | 0.0863 | 40439712 |
| 0.0546 | 1.5342 | 31400 | 0.0860 | 40700736 |
| 0.1009 | 1.5440 | 31600 | 0.0859 | 40963072 |
| 0.0774 | 1.5538 | 31800 | 0.0856 | 41224800 |
| 0.0765 | 1.5636 | 32000 | 0.0856 | 41485536 |
| 0.1073 | 1.5733 | 32200 | 0.0853 | 41743456 |
| 0.0824 | 1.5831 | 32400 | 0.0860 | 42005696 |
| 0.1164 | 1.5929 | 32600 | 0.0852 | 42267520 |
| 0.0754 | 1.6026 | 32800 | 0.0868 | 42528896 |
| 0.1035 | 1.6124 | 33000 | 0.0858 | 42786240 |
| 0.0824 | 1.6222 | 33200 | 0.0859 | 43043616 |
| 0.085 | 1.6320 | 33400 | 0.0845 | 43300896 |
| 0.0677 | 1.6417 | 33600 | 0.0846 | 43559424 |
| 0.1163 | 1.6515 | 33800 | 0.0844 | 43815424 |
| 0.0684 | 1.6613 | 34000 | 0.0862 | 44074432 |
| 0.0725 | 1.6710 | 34200 | 0.0847 | 44334304 |
| 0.0811 | 1.6808 | 34400 | 0.0843 | 44594368 |
| 0.0729 | 1.6906 | 34600 | 0.0840 | 44852576 |
| 0.0726 | 1.7004 | 34800 | 0.0843 | 45109056 |
| 0.076 | 1.7101 | 35000 | 0.0839 | 45367936 |
| 0.0653 | 1.7199 | 35200 | 0.0841 | 45627104 |
| 0.0772 | 1.7297 | 35400 | 0.0838 | 45885312 |
| 0.069 | 1.7395 | 35600 | 0.0842 | 46145568 |
| 0.0938 | 1.7492 | 35800 | 0.0838 | 46409504 |
| 0.0684 | 1.7590 | 36000 | 0.0840 | 46669472 |
| 0.0909 | 1.7688 | 36200 | 0.0840 | 46929280 |
| 0.1004 | 1.7785 | 36400 | 0.0835 | 47188416 |
| 0.0782 | 1.7883 | 36600 | 0.0837 | 47447328 |
| 0.0857 | 1.7981 | 36800 | 0.0835 | 47707040 |
| 0.0703 | 1.8079 | 37000 | 0.0842 | 47966176 |
| 0.0636 | 1.8176 | 37200 | 0.0835 | 48227328 |
| 0.0774 | 1.8274 | 37400 | 0.0835 | 48485632 |
| 0.0662 | 1.8372 | 37600 | 0.0835 | 48744768 |
| 0.0792 | 1.8469 | 37800 | 0.0836 | 49002400 |
| 0.0914 | 1.8567 | 38000 | 0.0834 | 49259584 |
| 0.0747 | 1.8665 | 38200 | 0.0835 | 49518144 |
| 0.0847 | 1.8763 | 38400 | 0.0834 | 49775776 |
| 0.0699 | 1.8860 | 38600 | 0.0834 | 50036384 |
| 0.0633 | 1.8958 | 38800 | 0.0835 | 50298720 |
| 0.082 | 1.9056 | 39000 | 0.0835 | 50560704 |
| 0.0564 | 1.9153 | 39200 | 0.0834 | 50820416 |
| 0.0749 | 1.9251 | 39400 | 0.0834 | 51080832 |
| 0.0717 | 1.9349 | 39600 | 0.0835 | 51339424 |
| 0.0663 | 1.9447 | 39800 | 0.0835 | 51597120 |
| 0.0783 | 1.9544 | 40000 | 0.0835 | 51858816 |

### Framework versions

- PEFT 0.15.1
- Transformers 4.51.3
- Pytorch 2.6.0+cu124
- Datasets 3.5.0
- Tokenizers 0.21.1