thanavut committed
Commit a59aa7f · verified · 1 Parent(s): c42358f
README.md CHANGED
@@ -18,10 +18,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [avsolatorio/GIST-large-Embedding-v0](https://huggingface.co/avsolatorio/GIST-large-Embedding-v0) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.3318
- - F1: 0.6260
- - Roc Auc: 0.7856
- - Accuracy: 0.1786
 
 ## Model description
 
@@ -41,58 +41,88 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
- - train_batch_size: 16
- - eval_batch_size: 16
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 40
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
- | 0.4488 | 1.0 | 25 | 0.3675 | 0.0779 | 0.5325 | 0.0179 |
- | 0.3356 | 2.0 | 50 | 0.3240 | 0.1910 | 0.5740 | 0.0536 |
- | 0.2818 | 3.0 | 75 | 0.2998 | 0.3079 | 0.6141 | 0.0357 |
- | 0.2346 | 4.0 | 100 | 0.2767 | 0.4724 | 0.6938 | 0.0893 |
- | 0.1954 | 5.0 | 125 | 0.2833 | 0.4403 | 0.6850 | 0.0714 |
- | 0.1605 | 6.0 | 150 | 0.2706 | 0.5153 | 0.7220 | 0.0536 |
- | 0.134 | 7.0 | 175 | 0.2719 | 0.5218 | 0.7311 | 0.1071 |
- | 0.1133 | 8.0 | 200 | 0.2776 | 0.5369 | 0.7475 | 0.0714 |
- | 0.0935 | 9.0 | 225 | 0.2626 | 0.5796 | 0.7555 | 0.1429 |
- | 0.0808 | 10.0 | 250 | 0.2669 | 0.5778 | 0.7576 | 0.125 |
- | 0.0694 | 11.0 | 275 | 0.2633 | 0.5963 | 0.7731 | 0.1429 |
- | 0.0573 | 12.0 | 300 | 0.2661 | 0.5658 | 0.7612 | 0.1071 |
- | 0.0496 | 13.0 | 325 | 0.2543 | 0.6004 | 0.7643 | 0.1429 |
- | 0.0429 | 14.0 | 350 | 0.2735 | 0.5936 | 0.7729 | 0.1071 |
- | 0.0366 | 15.0 | 375 | 0.2694 | 0.6179 | 0.7848 | 0.1429 |
- | 0.0323 | 16.0 | 400 | 0.2724 | 0.6217 | 0.7865 | 0.1429 |
- | 0.0289 | 17.0 | 425 | 0.2821 | 0.6157 | 0.7734 | 0.1786 |
- | 0.0257 | 18.0 | 450 | 0.2787 | 0.6399 | 0.7854 | 0.1786 |
- | 0.0229 | 19.0 | 475 | 0.2887 | 0.6114 | 0.7774 | 0.1071 |
- | 0.02 | 20.0 | 500 | 0.2807 | 0.6394 | 0.7970 | 0.1429 |
- | 0.0182 | 21.0 | 525 | 0.2852 | 0.6343 | 0.7797 | 0.1786 |
- | 0.0165 | 22.0 | 550 | 0.2899 | 0.6132 | 0.7774 | 0.1607 |
- | 0.0148 | 23.0 | 575 | 0.3000 | 0.6285 | 0.7888 | 0.1607 |
- | 0.0136 | 24.0 | 600 | 0.2950 | 0.6409 | 0.7908 | 0.1429 |
- | 0.0123 | 25.0 | 625 | 0.3034 | 0.6165 | 0.7815 | 0.1607 |
- | 0.0112 | 26.0 | 650 | 0.3061 | 0.6384 | 0.7949 | 0.1607 |
- | 0.0103 | 27.0 | 675 | 0.3041 | 0.6371 | 0.7906 | 0.1964 |
- | 0.0095 | 28.0 | 700 | 0.3189 | 0.6204 | 0.7836 | 0.1429 |
- | 0.009 | 29.0 | 725 | 0.3115 | 0.6267 | 0.7890 | 0.1786 |
- | 0.0083 | 30.0 | 750 | 0.3168 | 0.6264 | 0.7856 | 0.1786 |
- | 0.008 | 31.0 | 775 | 0.3199 | 0.6320 | 0.7866 | 0.1786 |
- | 0.0075 | 32.0 | 800 | 0.3271 | 0.6208 | 0.7839 | 0.1607 |
- | 0.0072 | 33.0 | 825 | 0.3219 | 0.6240 | 0.7856 | 0.1607 |
- | 0.0068 | 34.0 | 850 | 0.3257 | 0.6312 | 0.7849 | 0.1786 |
- | 0.0065 | 35.0 | 875 | 0.3249 | 0.6247 | 0.7855 | 0.1786 |
- | 0.0063 | 36.0 | 900 | 0.3296 | 0.6291 | 0.7881 | 0.1786 |
- | 0.0062 | 37.0 | 925 | 0.3302 | 0.6227 | 0.7844 | 0.1786 |
- | 0.006 | 38.0 | 950 | 0.3287 | 0.6260 | 0.7856 | 0.1786 |
- | 0.0058 | 39.0 | 975 | 0.3317 | 0.6260 | 0.7856 | 0.1786 |
- | 0.0058 | 40.0 | 1000 | 0.3318 | 0.6260 | 0.7856 | 0.1786 |
 
 
 ### Framework versions
 
 
 This model is a fine-tuned version of [avsolatorio/GIST-large-Embedding-v0](https://huggingface.co/avsolatorio/GIST-large-Embedding-v0) on the None dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.5549
+ - F1: 0.6828
+ - Roc Auc: 0.9255
+ - Accuracy: 0.1053
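
The gap between the low accuracy and the much higher F1/ROC AUC suggests a multi-label setup scored with subset (exact-match) accuracy; that is an assumption, since the dataset is not documented. A minimal sketch of how such metrics are typically computed with scikit-learn, using toy predictions rather than values from this model:

```python
# Illustrative only: assumes multi-label classification with sigmoid scores
# and a 0.5 threshold; y_true and probs are toy values, not from this model.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])                    # gold label matrix
probs  = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.4], [0.6, 0.3, 0.2]])  # model sigmoid scores
y_pred = (probs >= 0.5).astype(int)                                     # hard predictions

print("F1 (micro):     ", f1_score(y_true, y_pred, average="micro"))
print("ROC AUC (micro):", roc_auc_score(y_true, probs, average="micro"))
print("Subset accuracy:", accuracy_score(y_true, y_pred))  # every label must match exactly
```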
 
 ## Model description
 
 
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+ - num_epochs: 70
 - mixed_precision_training: Native AMP
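
As a rough guide to reproducing the run, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows; `output_dir` and the per-epoch evaluation strategy are assumptions, not values recorded in this repository:

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# output_dir and evaluation_strategy below are assumptions.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="out",                  # hypothetical path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=70,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="epoch",       # assumed: the table reports one eval per epoch
)
```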
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
+ | 0.4089 | 1.0 | 50 | 0.3415 | 0.1336 | 0.7999 | 0.0702 |
+ | 0.3111 | 2.0 | 100 | 0.3084 | 0.2771 | 0.8501 | 0.0526 |
+ | 0.2544 | 3.0 | 150 | 0.2851 | 0.4233 | 0.8679 | 0.0526 |
+ | 0.209 | 4.0 | 200 | 0.2893 | 0.4545 | 0.8868 | 0.0526 |
+ | 0.1688 | 5.0 | 250 | 0.2560 | 0.5307 | 0.9137 | 0.1053 |
+ | 0.1335 | 6.0 | 300 | 0.2679 | 0.4982 | 0.9001 | 0.0702 |
+ | 0.1043 | 7.0 | 350 | 0.2689 | 0.5758 | 0.9070 | 0.1053 |
+ | 0.0813 | 8.0 | 400 | 0.2786 | 0.5994 | 0.9112 | 0.1228 |
+ | 0.0686 | 9.0 | 450 | 0.2742 | 0.6150 | 0.9119 | 0.1053 |
+ | 0.0553 | 10.0 | 500 | 0.2751 | 0.6498 | 0.9076 | 0.1404 |
+ | 0.0463 | 11.0 | 550 | 0.2905 | 0.5894 | 0.9156 | 0.1228 |
+ | 0.0401 | 12.0 | 600 | 0.2786 | 0.6313 | 0.9189 | 0.1579 |
+ | 0.0319 | 13.0 | 650 | 0.3090 | 0.6502 | 0.9127 | 0.1053 |
+ | 0.0277 | 14.0 | 700 | 0.2876 | 0.6024 | 0.9072 | 0.0877 |
+ | 0.0248 | 15.0 | 750 | 0.2991 | 0.6546 | 0.9275 | 0.0702 |
+ | 0.02 | 16.0 | 800 | 0.3128 | 0.6345 | 0.9217 | 0.0526 |
+ | 0.0176 | 17.0 | 850 | 0.3139 | 0.6782 | 0.9239 | 0.0877 |
+ | 0.0147 | 18.0 | 900 | 0.3128 | 0.6739 | 0.9232 | 0.1053 |
+ | 0.0128 | 19.0 | 950 | 0.3035 | 0.6718 | 0.9217 | 0.1228 |
+ | 0.0108 | 20.0 | 1000 | 0.3298 | 0.6531 | 0.9155 | 0.1053 |
+ | 0.0098 | 21.0 | 1050 | 0.3470 | 0.6596 | 0.9183 | 0.1053 |
+ | 0.0084 | 22.0 | 1100 | 0.3471 | 0.6674 | 0.9170 | 0.1404 |
+ | 0.0071 | 23.0 | 1150 | 0.3483 | 0.6756 | 0.9123 | 0.1228 |
+ | 0.0064 | 24.0 | 1200 | 0.3600 | 0.6734 | 0.9158 | 0.1053 |
+ | 0.0058 | 25.0 | 1250 | 0.3636 | 0.6734 | 0.9172 | 0.1228 |
+ | 0.0051 | 26.0 | 1300 | 0.3687 | 0.6826 | 0.9216 | 0.1053 |
+ | 0.0043 | 27.0 | 1350 | 0.3859 | 0.6627 | 0.9215 | 0.0877 |
+ | 0.0038 | 28.0 | 1400 | 0.3724 | 0.6759 | 0.9299 | 0.1053 |
+ | 0.0034 | 29.0 | 1450 | 0.4112 | 0.6869 | 0.9195 | 0.1228 |
+ | 0.0029 | 30.0 | 1500 | 0.3952 | 0.6985 | 0.9207 | 0.1404 |
+ | 0.0026 | 31.0 | 1550 | 0.4265 | 0.6762 | 0.9204 | 0.1228 |
+ | 0.0023 | 32.0 | 1600 | 0.4360 | 0.6861 | 0.9195 | 0.1053 |
+ | 0.002 | 33.0 | 1650 | 0.4182 | 0.6735 | 0.9271 | 0.0877 |
+ | 0.0018 | 34.0 | 1700 | 0.4394 | 0.6678 | 0.9211 | 0.0877 |
+ | 0.0016 | 35.0 | 1750 | 0.4406 | 0.6890 | 0.9288 | 0.0877 |
+ | 0.0014 | 36.0 | 1800 | 0.4398 | 0.6771 | 0.9240 | 0.1053 |
+ | 0.0013 | 37.0 | 1850 | 0.4394 | 0.6849 | 0.9226 | 0.0877 |
+ | 0.0012 | 38.0 | 1900 | 0.4642 | 0.6712 | 0.9147 | 0.0702 |
+ | 0.0011 | 39.0 | 1950 | 0.4667 | 0.6744 | 0.9223 | 0.0877 |
+ | 0.001 | 40.0 | 2000 | 0.4570 | 0.6662 | 0.9222 | 0.1053 |
+ | 0.0009 | 41.0 | 2050 | 0.4608 | 0.6871 | 0.9257 | 0.1053 |
+ | 0.0008 | 42.0 | 2100 | 0.4586 | 0.6771 | 0.9290 | 0.1053 |
+ | 0.0007 | 43.0 | 2150 | 0.4737 | 0.6903 | 0.9208 | 0.1228 |
+ | 0.0006 | 44.0 | 2200 | 0.4784 | 0.6812 | 0.9251 | 0.1053 |
+ | 0.0006 | 45.0 | 2250 | 0.4752 | 0.7063 | 0.9188 | 0.1404 |
+ | 0.0006 | 46.0 | 2300 | 0.4852 | 0.6938 | 0.9261 | 0.1053 |
+ | 0.0005 | 47.0 | 2350 | 0.4978 | 0.6881 | 0.9276 | 0.1053 |
+ | 0.0005 | 48.0 | 2400 | 0.5036 | 0.6664 | 0.9243 | 0.0877 |
+ | 0.0005 | 49.0 | 2450 | 0.5029 | 0.6782 | 0.9241 | 0.0877 |
+ | 0.0004 | 50.0 | 2500 | 0.5160 | 0.6713 | 0.9268 | 0.0877 |
+ | 0.0004 | 51.0 | 2550 | 0.5217 | 0.6789 | 0.9253 | 0.1053 |
+ | 0.0004 | 52.0 | 2600 | 0.5203 | 0.6842 | 0.9254 | 0.1228 |
+ | 0.0003 | 53.0 | 2650 | 0.5242 | 0.6773 | 0.9197 | 0.1228 |
+ | 0.0003 | 54.0 | 2700 | 0.5248 | 0.6887 | 0.9261 | 0.1053 |
+ | 0.0003 | 55.0 | 2750 | 0.5309 | 0.6796 | 0.9256 | 0.1053 |
+ | 0.0003 | 56.0 | 2800 | 0.5356 | 0.6827 | 0.9251 | 0.1228 |
+ | 0.0003 | 57.0 | 2850 | 0.5360 | 0.6693 | 0.9234 | 0.1053 |
+ | 0.0003 | 58.0 | 2900 | 0.5420 | 0.6866 | 0.9272 | 0.1053 |
+ | 0.0003 | 59.0 | 2950 | 0.5517 | 0.6793 | 0.9245 | 0.1053 |
+ | 0.0002 | 60.0 | 3000 | 0.5482 | 0.6855 | 0.9249 | 0.0877 |
+ | 0.0002 | 61.0 | 3050 | 0.5514 | 0.6798 | 0.9239 | 0.1053 |
+ | 0.0002 | 62.0 | 3100 | 0.5580 | 0.6824 | 0.9240 | 0.1053 |
+ | 0.0002 | 63.0 | 3150 | 0.5566 | 0.6821 | 0.9258 | 0.1053 |
+ | 0.0002 | 64.0 | 3200 | 0.5582 | 0.6776 | 0.9253 | 0.1053 |
+ | 0.0002 | 65.0 | 3250 | 0.5574 | 0.6816 | 0.9264 | 0.1053 |
+ | 0.0002 | 66.0 | 3300 | 0.5607 | 0.6767 | 0.9251 | 0.1053 |
+ | 0.0002 | 67.0 | 3350 | 0.5523 | 0.6851 | 0.9244 | 0.1053 |
+ | 0.0002 | 68.0 | 3400 | 0.5572 | 0.6804 | 0.9255 | 0.1053 |
+ | 0.0002 | 69.0 | 3450 | 0.5537 | 0.6828 | 0.9252 | 0.1053 |
+ | 0.0002 | 70.0 | 3500 | 0.5549 | 0.6828 | 0.9255 | 0.1053 |
 
 
 ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:da99fdf7e0bb11d3600630d6305da278d8cd6a9cdf5bf405b31e2b942dbf0292
 size 1340688368
 
 version https://git-lfs.github.com/spec/v1
+ oid sha256:263925bb36ce4d10de790e3925912a622b297c87c05f7b334bc9b8a3cc6365d7
 size 1340688368
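
The `oid sha256:` lines above are Git LFS pointer entries; the roughly 1.3 GB weight file itself is stored out of band. To confirm that a downloaded `model.safetensors` matches the new pointer, a small hashlib sketch (the local path is a placeholder for wherever the file was saved):

```python
# Sketch: verify a downloaded file against the sha256 recorded in the LFS pointer.
# "model.safetensors" below is a placeholder for the local download path.
import hashlib

expected = "263925bb36ce4d10de790e3925912a622b297c87c05f7b334bc9b8a3cc6365d7"

h = hashlib.sha256()
with open("model.safetensors", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
        h.update(chunk)

print("match" if h.hexdigest() == expected else "mismatch")
```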
runs/Mar17_18-21-47_c33c53608e12/events.out.tfevents.1710699710.c33c53608e12.34.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:28998a11a40c975d26db3d5c16ffe1e26fbab497fb3a2864732102fa0ac0a31b
+ size 5239
runs/Mar17_18-23-00_c33c53608e12/events.out.tfevents.1710699780.c33c53608e12.34.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4737106c2908aff9cd4a6679b509f3612edd72c25959fd8426c7e2dbcb342ecd
+ size 5239
runs/Mar17_18-23-14_c33c53608e12/events.out.tfevents.1710699795.c33c53608e12.34.2 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:190eec7824c9eab2ef4d972c8750ae18074142ed8c835a73362fe4b46c69fed4
+ size 49739
tokenizer.json CHANGED
@@ -2,7 +2,7 @@
   "version": "1.0",
   "truncation": {
     "direction": "Right",
-    "max_length": 384,
     "strategy": "LongestFirst",
     "stride": 0
   },
 
   "version": "1.0",
   "truncation": {
     "direction": "Right",
+    "max_length": 512,
     "strategy": "LongestFirst",
     "stride": 0
   },
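
This change raises the truncation limit recorded in tokenizer.json from 384 to 512 tokens, the usual cap for BERT-style encoders. For inference, the equivalent explicit call looks roughly like this; `REPO_ID` is a placeholder for this model's Hub path, which is not spelled out here:

```python
# Sketch: tokenize with the same 512-token cap that the updated
# tokenizer.json records. "REPO_ID" stands in for this model's Hub path.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("REPO_ID")
enc = tok("some long text " * 300, truncation=True, max_length=512)
print(len(enc["input_ids"]))  # at most 512
```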
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:7b8843979bc85a81d08ffdff4c9fc17eb1af5a480b4c652c8042bfd34b094d8c
 size 4856
 
 version https://git-lfs.github.com/spec/v1
+ oid sha256:6467daf6f409ea8b8010541dab00352a1bf0ba35d285a65068c44b17ce474c13
 size 4856