Dnq2025 committed
Commit 13cb0b6 · verified · 1 Parent(s): b370897

Model save

Files changed (1):
  1. README.md +107 -134
README.md CHANGED
@@ -3,8 +3,6 @@ library_name: transformers
  license: other
  base_model: facebook/mask2former-swin-base-IN21k-ade-semantic
  tags:
- - image-segmentation
- - vision
  - generated_from_trainer
  model-index:
  - name: mask2former-finetuned-ER-Mito-LD3
@@ -16,9 +14,9 @@ should probably proofread and complete it, then remove this comment. -->

  # mask2former-finetuned-ER-Mito-LD3

- This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on the Dnq2025/Mask2former_Pretrain dataset.
+ This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 33.5405
+ - Loss: 39.8755
  - Dummy: 1.0

  ## Model description
@@ -38,9 +36,9 @@ More information needed
  ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.0001
- - train_batch_size: 5
- - eval_batch_size: 5
+ - learning_rate: 0.0004
+ - train_batch_size: 4
+ - eval_batch_size: 4
  - seed: 1337
  - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: polynomial
@@ -48,133 +46,108 @@ The following hyperparameters were used during training:

  ### Training results

- | Training Loss | Epoch | Step | Validation Loss | Dummy |
- |:-------------:|:--------:|:-----:|:---------------:|:-----:|
- | 48.5372 | 1.0 | 104 | 36.3338 | 1.0 |
- | 33.5327 | 2.0 | 208 | 31.7351 | 1.0 |
- | 29.7691 | 3.0 | 312 | 30.4858 | 1.0 |
- | 26.3002 | 4.0 | 416 | 28.6091 | 1.0 |
- | 24.7501 | 5.0 | 520 | 27.0967 | 1.0 |
- | 23.4495 | 6.0 | 624 | 26.6241 | 1.0 |
- | 23.274 | 7.0 | 728 | 27.1544 | 1.0 |
- | 21.1617 | 8.0 | 832 | 27.4625 | 1.0 |
- | 20.373 | 9.0 | 936 | 27.5745 | 1.0 |
- | 20.4295 | 10.0 | 1040 | 27.6942 | 1.0 |
- | 20.2526 | 11.0 | 1144 | 27.7829 | 1.0 |
- | 19.2572 | 12.0 | 1248 | 27.2960 | 1.0 |
- | 19.0089 | 13.0 | 1352 | 26.0039 | 1.0 |
- | 18.3621 | 14.0 | 1456 | 26.5623 | 1.0 |
- | 18.0517 | 15.0 | 1560 | 26.2700 | 1.0 |
- | 18.3139 | 16.0 | 1664 | 27.2972 | 1.0 |
- | 17.6129 | 17.0 | 1768 | 26.4869 | 1.0 |
- | 17.8402 | 18.0 | 1872 | 27.7618 | 1.0 |
- | 16.6494 | 19.0 | 1976 | 27.5173 | 1.0 |
- | 17.0833 | 20.0 | 2080 | 28.1242 | 1.0 |
- | 16.5967 | 21.0 | 2184 | 29.1195 | 1.0 |
- | 16.2634 | 22.0 | 2288 | 27.0367 | 1.0 |
- | 16.6797 | 23.0 | 2392 | 27.1799 | 1.0 |
- | 16.0344 | 24.0 | 2496 | 26.6408 | 1.0 |
- | 15.7701 | 25.0 | 2600 | 28.4040 | 1.0 |
- | 15.6061 | 26.0 | 2704 | 28.0687 | 1.0 |
- | 15.3311 | 27.0 | 2808 | 27.1765 | 1.0 |
- | 15.2464 | 28.0 | 2912 | 28.2050 | 1.0 |
- | 15.0459 | 29.0 | 3016 | 28.6291 | 1.0 |
- | 14.7514 | 30.0 | 3120 | 27.8241 | 1.0 |
- | 15.0833 | 31.0 | 3224 | 29.1936 | 1.0 |
- | 15.0817 | 32.0 | 3328 | 28.4044 | 1.0 |
- | 14.3201 | 33.0 | 3432 | 28.3709 | 1.0 |
- | 14.5918 | 34.0 | 3536 | 29.3898 | 1.0 |
- | 14.7177 | 35.0 | 3640 | 28.5130 | 1.0 |
- | 13.9919 | 36.0 | 3744 | 27.7597 | 1.0 |
- | 14.2267 | 37.0 | 3848 | 29.2324 | 1.0 |
- | 13.7801 | 38.0 | 3952 | 28.3574 | 1.0 |
- | 14.1839 | 39.0 | 4056 | 28.8711 | 1.0 |
- | 13.7545 | 40.0 | 4160 | 28.2947 | 1.0 |
- | 14.1627 | 41.0 | 4264 | 29.4866 | 1.0 |
- | 13.5155 | 42.0 | 4368 | 29.8527 | 1.0 |
- | 13.704 | 43.0 | 4472 | 29.4292 | 1.0 |
- | 13.6644 | 44.0 | 4576 | 29.2324 | 1.0 |
- | 13.2006 | 45.0 | 4680 | 29.5414 | 1.0 |
- | 13.1545 | 46.0 | 4784 | 29.6988 | 1.0 |
- | 13.5744 | 47.0 | 4888 | 28.9933 | 1.0 |
- | 12.8073 | 48.0 | 4992 | 28.9770 | 1.0 |
- | 13.3773 | 49.0 | 5096 | 30.3950 | 1.0 |
- | 12.9506 | 50.0 | 5200 | 31.2871 | 1.0 |
- | 13.0674 | 51.0 | 5304 | 29.5711 | 1.0 |
- | 13.1265 | 52.0 | 5408 | 31.0887 | 1.0 |
- | 13.1392 | 53.0 | 5512 | 29.8433 | 1.0 |
- | 12.6108 | 54.0 | 5616 | 29.6436 | 1.0 |
- | 12.7608 | 55.0 | 5720 | 29.8706 | 1.0 |
- | 12.8723 | 56.0 | 5824 | 30.0596 | 1.0 |
- | 12.5437 | 57.0 | 5928 | 30.1367 | 1.0 |
- | 12.1387 | 58.0 | 6032 | 30.4089 | 1.0 |
- | 12.948 | 59.0 | 6136 | 30.5375 | 1.0 |
- | 12.2869 | 60.0 | 6240 | 32.3827 | 1.0 |
- | 12.7717 | 61.0 | 6344 | 30.6397 | 1.0 |
- | 12.4924 | 62.0 | 6448 | 30.7005 | 1.0 |
- | 12.3031 | 63.0 | 6552 | 29.9865 | 1.0 |
- | 12.5575 | 64.0 | 6656 | 31.0697 | 1.0 |
- | 11.9496 | 65.0 | 6760 | 31.5794 | 1.0 |
- | 12.0462 | 66.0 | 6864 | 31.6537 | 1.0 |
- | 12.7167 | 67.0 | 6968 | 30.7411 | 1.0 |
- | 11.8595 | 68.0 | 7072 | 30.4970 | 1.0 |
- | 11.7458 | 69.0 | 7176 | 30.8332 | 1.0 |
- | 12.2058 | 70.0 | 7280 | 32.0951 | 1.0 |
- | 12.0874 | 71.0 | 7384 | 32.4695 | 1.0 |
- | 11.705 | 72.0 | 7488 | 31.3117 | 1.0 |
- | 12.0 | 73.0 | 7592 | 30.6540 | 1.0 |
- | 11.9852 | 74.0 | 7696 | 34.2950 | 1.0 |
- | 11.7597 | 75.0 | 7800 | 31.6361 | 1.0 |
- | 11.8713 | 76.0 | 7904 | 31.1082 | 1.0 |
- | 11.705 | 77.0 | 8008 | 31.8714 | 1.0 |
- | 11.5474 | 78.0 | 8112 | 31.0299 | 1.0 |
- | 11.8387 | 79.0 | 8216 | 31.3672 | 1.0 |
- | 11.7057 | 80.0 | 8320 | 31.6435 | 1.0 |
- | 11.5656 | 81.0 | 8424 | 31.1940 | 1.0 |
- | 11.6578 | 82.0 | 8528 | 31.8184 | 1.0 |
- | 11.3049 | 83.0 | 8632 | 31.8668 | 1.0 |
- | 11.5542 | 84.0 | 8736 | 32.8192 | 1.0 |
- | 11.3942 | 85.0 | 8840 | 30.9723 | 1.0 |
- | 11.6955 | 86.0 | 8944 | 31.3487 | 1.0 |
- | 11.4862 | 87.0 | 9048 | 32.0451 | 1.0 |
- | 11.5867 | 88.0 | 9152 | 31.9769 | 1.0 |
- | 11.0975 | 89.0 | 9256 | 31.9721 | 1.0 |
- | 11.5126 | 90.0 | 9360 | 35.3877 | 1.0 |
- | 11.067 | 91.0 | 9464 | 33.7614 | 1.0 |
- | 11.3857 | 92.0 | 9568 | 32.7046 | 1.0 |
- | 11.5511 | 93.0 | 9672 | 32.1096 | 1.0 |
- | 11.0961 | 94.0 | 9776 | 32.8302 | 1.0 |
- | 11.2935 | 95.0 | 9880 | 32.6688 | 1.0 |
- | 11.2398 | 96.0 | 9984 | 32.2807 | 1.0 |
- | 11.0444 | 97.0 | 10088 | 32.2766 | 1.0 |
- | 11.3157 | 98.0 | 10192 | 32.4437 | 1.0 |
- | 11.0191 | 99.0 | 10296 | 32.3851 | 1.0 |
- | 11.1406 | 100.0 | 10400 | 32.1389 | 1.0 |
- | 11.1237 | 101.0 | 10504 | 32.4886 | 1.0 |
- | 10.9485 | 102.0 | 10608 | 32.5051 | 1.0 |
- | 10.9188 | 103.0 | 10712 | 32.8615 | 1.0 |
- | 11.3029 | 104.0 | 10816 | 33.0388 | 1.0 |
- | 11.2023 | 105.0 | 10920 | 32.4923 | 1.0 |
- | 10.9634 | 106.0 | 11024 | 32.3288 | 1.0 |
- | 11.257 | 107.0 | 11128 | 31.8855 | 1.0 |
- | 11.0193 | 108.0 | 11232 | 34.0067 | 1.0 |
- | 10.6401 | 109.0 | 11336 | 33.2946 | 1.0 |
- | 11.0542 | 110.0 | 11440 | 34.0535 | 1.0 |
- | 10.888 | 111.0 | 11544 | 32.7206 | 1.0 |
- | 10.9706 | 112.0 | 11648 | 33.1238 | 1.0 |
- | 11.0075 | 113.0 | 11752 | 32.9882 | 1.0 |
- | 10.7895 | 114.0 | 11856 | 32.7985 | 1.0 |
- | 10.9181 | 115.0 | 11960 | 32.9143 | 1.0 |
- | 10.5938 | 116.0 | 12064 | 33.0722 | 1.0 |
- | 10.4932 | 117.0 | 12168 | 34.2366 | 1.0 |
- | 10.9761 | 118.0 | 12272 | 33.8880 | 1.0 |
- | 10.6918 | 119.0 | 12376 | 34.3289 | 1.0 |
- | 10.896 | 120.0 | 12480 | 33.6095 | 1.0 |
- | 10.6876 | 121.0 | 12584 | 33.8608 | 1.0 |
- | 10.5666 | 122.0 | 12688 | 33.6994 | 1.0 |
- | 10.8161 | 123.0 | 12792 | 33.6172 | 1.0 |
- | 10.7195 | 124.0 | 12896 | 33.5397 | 1.0 |
- | 10.6712 | 124.0385 | 12900 | 33.4906 | 1.0 |
+ | Training Loss | Epoch | Step | Validation Loss | Dummy |
+ |:-------------:|:-----:|:-----:|:---------------:|:-----:|
+ | 57.794 | 1.0 | 129 | 47.5030 | 1.0 |
+ | 45.635 | 2.0 | 258 | 42.4622 | 1.0 |
+ | 43.6742 | 3.0 | 387 | 40.1383 | 1.0 |
+ | 37.5286 | 4.0 | 516 | 41.1964 | 1.0 |
+ | 33.7618 | 5.0 | 645 | 34.4374 | 1.0 |
+ | 31.5899 | 6.0 | 774 | 39.8242 | 1.0 |
+ | 29.0727 | 7.0 | 903 | 33.3223 | 1.0 |
+ | 27.8483 | 8.0 | 1032 | 30.9625 | 1.0 |
+ | 26.0904 | 9.0 | 1161 | 31.7084 | 1.0 |
+ | 26.1043 | 10.0 | 1290 | 31.8088 | 1.0 |
+ | 24.3038 | 11.0 | 1419 | 30.3361 | 1.0 |
+ | 23.6493 | 12.0 | 1548 | 30.2030 | 1.0 |
+ | 23.9146 | 13.0 | 1677 | 31.0806 | 1.0 |
+ | 21.9133 | 14.0 | 1806 | 31.3974 | 1.0 |
+ | 22.3071 | 15.0 | 1935 | 32.0925 | 1.0 |
+ | 21.0819 | 16.0 | 2064 | 29.9367 | 1.0 |
+ | 21.0089 | 17.0 | 2193 | 30.0420 | 1.0 |
+ | 20.9169 | 18.0 | 2322 | 29.2938 | 1.0 |
+ | 19.7935 | 19.0 | 2451 | 31.3945 | 1.0 |
+ | 19.8749 | 20.0 | 2580 | 29.8457 | 1.0 |
+ | 19.2973 | 21.0 | 2709 | 29.0713 | 1.0 |
+ | 18.5436 | 22.0 | 2838 | 29.0846 | 1.0 |
+ | 18.5996 | 23.0 | 2967 | 29.8810 | 1.0 |
+ | 19.1228 | 24.0 | 3096 | 29.3016 | 1.0 |
+ | 18.0519 | 25.0 | 3225 | 30.7155 | 1.0 |
+ | 17.7073 | 26.0 | 3354 | 28.7168 | 1.0 |
+ | 17.5055 | 27.0 | 3483 | 28.9899 | 1.0 |
+ | 17.4854 | 28.0 | 3612 | 30.1944 | 1.0 |
+ | 17.0048 | 29.0 | 3741 | 29.2829 | 1.0 |
+ | 16.8731 | 30.0 | 3870 | 30.1208 | 1.0 |
+ | 16.683 | 31.0 | 3999 | 30.7583 | 1.0 |
+ | 16.6109 | 32.0 | 4128 | 30.6232 | 1.0 |
+ | 15.8261 | 33.0 | 4257 | 29.4162 | 1.0 |
+ | 16.9002 | 34.0 | 4386 | 30.4388 | 1.0 |
+ | 16.3081 | 35.0 | 4515 | 29.9756 | 1.0 |
+ | 15.4745 | 36.0 | 4644 | 28.8214 | 1.0 |
+ | 15.938 | 37.0 | 4773 | 29.1001 | 1.0 |
+ | 15.9947 | 38.0 | 4902 | 31.0533 | 1.0 |
+ | 15.2328 | 39.0 | 5031 | 31.6211 | 1.0 |
+ | 15.202 | 40.0 | 5160 | 33.1383 | 1.0 |
+ | 15.0583 | 41.0 | 5289 | 31.4089 | 1.0 |
+ | 14.573 | 42.0 | 5418 | 31.5681 | 1.0 |
+ | 14.7401 | 43.0 | 5547 | 30.5548 | 1.0 |
+ | 14.6052 | 44.0 | 5676 | 31.3953 | 1.0 |
+ | 14.1299 | 45.0 | 5805 | 30.8153 | 1.0 |
+ | 13.6851 | 46.0 | 5934 | 30.9693 | 1.0 |
+ | 14.6677 | 47.0 | 6063 | 31.9361 | 1.0 |
+ | 13.6493 | 48.0 | 6192 | 34.3328 | 1.0 |
+ | 14.166 | 49.0 | 6321 | 32.6231 | 1.0 |
+ | 13.7388 | 50.0 | 6450 | 33.1736 | 1.0 |
+ | 13.0849 | 51.0 | 6579 | 34.9522 | 1.0 |
+ | 13.2502 | 52.0 | 6708 | 35.7990 | 1.0 |
+ | 13.5116 | 53.0 | 6837 | 31.5737 | 1.0 |
+ | 12.6993 | 54.0 | 6966 | 33.2650 | 1.0 |
+ | 13.3602 | 55.0 | 7095 | 34.8914 | 1.0 |
+ | 12.9585 | 56.0 | 7224 | 35.9862 | 1.0 |
+ | 12.7434 | 57.0 | 7353 | 34.9106 | 1.0 |
+ | 12.7299 | 58.0 | 7482 | 34.0106 | 1.0 |
+ | 12.717 | 59.0 | 7611 | 36.3588 | 1.0 |
+ | 12.0563 | 60.0 | 7740 | 35.0923 | 1.0 |
+ | 13.012 | 61.0 | 7869 | 38.7323 | 1.0 |
+ | 12.2878 | 62.0 | 7998 | 34.9967 | 1.0 |
+ | 12.2794 | 63.0 | 8127 | 37.5577 | 1.0 |
+ | 12.4147 | 64.0 | 8256 | 37.2733 | 1.0 |
+ | 12.0032 | 65.0 | 8385 | 35.3015 | 1.0 |
+ | 12.2793 | 66.0 | 8514 | 35.2806 | 1.0 |
+ | 12.2309 | 67.0 | 8643 | 36.2488 | 1.0 |
+ | 11.7082 | 68.0 | 8772 | 35.6687 | 1.0 |
+ | 11.8694 | 69.0 | 8901 | 36.0470 | 1.0 |
+ | 11.782 | 70.0 | 9030 | 35.4055 | 1.0 |
+ | 11.6254 | 71.0 | 9159 | 36.7066 | 1.0 |
+ | 11.5873 | 72.0 | 9288 | 36.1084 | 1.0 |
+ | 11.6251 | 73.0 | 9417 | 38.2932 | 1.0 |
+ | 11.4589 | 74.0 | 9546 | 36.5570 | 1.0 |
+ | 11.7378 | 75.0 | 9675 | 35.9887 | 1.0 |
+ | 11.4933 | 76.0 | 9804 | 36.4713 | 1.0 |
+ | 11.2566 | 77.0 | 9933 | 36.9622 | 1.0 |
+ | 11.25 | 78.0 | 10062 | 37.1016 | 1.0 |
+ | 11.2962 | 79.0 | 10191 | 37.8711 | 1.0 |
+ | 11.0868 | 80.0 | 10320 | 38.5714 | 1.0 |
+ | 11.2786 | 81.0 | 10449 | 38.1493 | 1.0 |
+ | 11.1528 | 82.0 | 10578 | 39.0100 | 1.0 |
+ | 11.089 | 83.0 | 10707 | 38.5474 | 1.0 |
+ | 10.954 | 84.0 | 10836 | 38.9405 | 1.0 |
+ | 11.0157 | 85.0 | 10965 | 39.3872 | 1.0 |
+ | 10.9849 | 86.0 | 11094 | 39.4875 | 1.0 |
+ | 10.5423 | 87.0 | 11223 | 39.1179 | 1.0 |
+ | 11.1968 | 88.0 | 11352 | 39.4084 | 1.0 |
+ | 10.6376 | 89.0 | 11481 | 39.8218 | 1.0 |
+ | 10.7131 | 90.0 | 11610 | 39.2553 | 1.0 |
+ | 10.8252 | 91.0 | 11739 | 39.1368 | 1.0 |
+ | 10.6456 | 92.0 | 11868 | 38.9194 | 1.0 |
+ | 10.8488 | 93.0 | 11997 | 39.5955 | 1.0 |
+ | 10.8675 | 94.0 | 12126 | 39.4760 | 1.0 |
+ | 10.4757 | 95.0 | 12255 | 40.4844 | 1.0 |
+ | 10.3191 | 96.0 | 12384 | 39.0673 | 1.0 |
+ | 10.6073 | 97.0 | 12513 | 39.3767 | 1.0 |
+ | 10.3038 | 98.0 | 12642 | 39.6969 | 1.0 |
+ | 11.0709 | 99.0 | 12771 | 39.9325 | 1.0 |
+ | 10.5951 | 100.0 | 12900 | 39.8755 | 1.0 |


  ### Framework versions
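For reference, the updated hyperparameters in this commit map onto `transformers` `TrainingArguments` roughly as follows. This is a minimal sketch, not the author's training script: the `output_dir` is hypothetical, `num_train_epochs=100` is inferred from the training log ending at epoch 100.0 / step 12900, and single-device training is assumed so that `train_batch_size` equals `per_device_train_batch_size`.

```python
from transformers import TrainingArguments

# Sketch only: output_dir is hypothetical and num_train_epochs is inferred
# from the log above; everything else mirrors the hyperparameter list.
training_args = TrainingArguments(
    output_dir="mask2former-finetuned-ER-Mito-LD3",  # hypothetical path
    learning_rate=4e-4,              # 0.0004, raised from 0.0001 in this commit
    per_device_train_batch_size=4,   # lowered from 5
    per_device_eval_batch_size=4,    # lowered from 5
    seed=1337,
    optim="adamw_torch",             # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    num_train_epochs=100,            # the log above ends at epoch 100
)
```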
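A hedged usage sketch for the resulting checkpoint follows. The repo id is an assumption pieced together from the committer name and the model name, and the snippet assumes the repo ships an image-processor config alongside the weights; adjust both if the model lives elsewhere.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

# Assumed repo id (committer + model name); not confirmed by this page.
ckpt = "Dnq2025/mask2former-finetuned-ER-Mito-LD3"
processor = AutoImageProcessor.from_pretrained(ckpt)
model = Mask2FormerForUniversalSegmentation.from_pretrained(ckpt)
model.eval()

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Merge the predicted masks into a single (H, W) label map per image.
seg_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # PIL size is (W, H)
)[0]
print(seg_map.shape)
```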