---
library_name: transformers
license: other
base_model: facebook/mask2former-swin-base-IN21k-ade-semantic
tags:
- generated_from_trainer
model-index:
- name: mask2former-finetuned-ER-Mito-LD3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# mask2former-finetuned-ER-Mito-LD3

This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 39.8755
- Dummy: 1.0
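As a usage sketch (not part of the auto-generated card): assuming the fine-tuned checkpoint is published under the model name above, semantic-segmentation inference with the `transformers` Mask2Former API might look like this. The repo id and image path are placeholders, not confirmed by the card.

```python
# Hypothetical repo id -- replace with wherever this checkpoint is actually hosted.
CHECKPOINT = "mask2former-finetuned-ER-Mito-LD3"


def segment(image_path: str, checkpoint: str = CHECKPOINT):
    """Run semantic segmentation on one image with the fine-tuned Mask2Former."""
    # Imports are local so the sketch can be read without these packages installed.
    import torch
    from PIL import Image
    from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

    processor = AutoImageProcessor.from_pretrained(checkpoint)
    model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint)
    model.eval()

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Collapse the mask/class query outputs into a per-pixel label map,
    # resized back to the original (height, width) of the input image.
    seg_map = processor.post_process_semantic_segmentation(
        outputs, target_sizes=[image.size[::-1]]
    )[0]
    return seg_map  # torch.LongTensor of shape (height, width)
```

The label-id-to-class mapping comes from the checkpoint's `model.config.id2label`.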

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0004
- train_batch_size: 4
- eval_batch_size: 4
- seed: 1337
- optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: polynomial
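For reference (an illustration, not part of the generated card): the polynomial scheduler decays the learning rate from its initial value toward a small floor over the total number of training steps; with `transformers`' default `power=1.0` the decay is linear. A stdlib-only sketch of that rule, where the `lr_end` floor and warmup handling are assumptions based on the library's defaults:

```python
def polynomial_lr(step, total_steps, base_lr=4e-4, lr_end=1e-7,
                  power=1.0, warmup_steps=0):
    """Polynomial LR decay (power=1.0 reduces to linear decay)."""
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps        # linear warmup ramp
    if step >= total_steps:
        return lr_end                               # floor once training ends
    remaining = 1 - (step - warmup_steps) / (total_steps - warmup_steps)
    return (base_lr - lr_end) * remaining ** power + lr_end


# With this card's learning_rate of 0.0004 over the 12900 steps in the table:
print(polynomial_lr(0, 12900))       # full base_lr at the start
print(polynomial_lr(6450, 12900))    # roughly halfway decayed at the midpoint
```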

### Training results

| Training Loss | Epoch | Step | Validation Loss | Dummy |
|:-------------:|:-----:|:-----:|:---------------:|:-----:|
| 57.794 | 1.0 | 129 | 47.5030 | 1.0 |
| 45.635 | 2.0 | 258 | 42.4622 | 1.0 |
| 43.6742 | 3.0 | 387 | 40.1383 | 1.0 |
| 37.5286 | 4.0 | 516 | 41.1964 | 1.0 |
| 33.7618 | 5.0 | 645 | 34.4374 | 1.0 |
| 31.5899 | 6.0 | 774 | 39.8242 | 1.0 |
| 29.0727 | 7.0 | 903 | 33.3223 | 1.0 |
| 27.8483 | 8.0 | 1032 | 30.9625 | 1.0 |
| 26.0904 | 9.0 | 1161 | 31.7084 | 1.0 |
| 26.1043 | 10.0 | 1290 | 31.8088 | 1.0 |
| 24.3038 | 11.0 | 1419 | 30.3361 | 1.0 |
| 23.6493 | 12.0 | 1548 | 30.2030 | 1.0 |
| 23.9146 | 13.0 | 1677 | 31.0806 | 1.0 |
| 21.9133 | 14.0 | 1806 | 31.3974 | 1.0 |
| 22.3071 | 15.0 | 1935 | 32.0925 | 1.0 |
| 21.0819 | 16.0 | 2064 | 29.9367 | 1.0 |
| 21.0089 | 17.0 | 2193 | 30.0420 | 1.0 |
| 20.9169 | 18.0 | 2322 | 29.2938 | 1.0 |
| 19.7935 | 19.0 | 2451 | 31.3945 | 1.0 |
| 19.8749 | 20.0 | 2580 | 29.8457 | 1.0 |
| 19.2973 | 21.0 | 2709 | 29.0713 | 1.0 |
| 18.5436 | 22.0 | 2838 | 29.0846 | 1.0 |
| 18.5996 | 23.0 | 2967 | 29.8810 | 1.0 |
| 19.1228 | 24.0 | 3096 | 29.3016 | 1.0 |
| 18.0519 | 25.0 | 3225 | 30.7155 | 1.0 |
| 17.7073 | 26.0 | 3354 | 28.7168 | 1.0 |
| 17.5055 | 27.0 | 3483 | 28.9899 | 1.0 |
| 17.4854 | 28.0 | 3612 | 30.1944 | 1.0 |
| 17.0048 | 29.0 | 3741 | 29.2829 | 1.0 |
| 16.8731 | 30.0 | 3870 | 30.1208 | 1.0 |
| 16.683 | 31.0 | 3999 | 30.7583 | 1.0 |
| 16.6109 | 32.0 | 4128 | 30.6232 | 1.0 |
| 15.8261 | 33.0 | 4257 | 29.4162 | 1.0 |
| 16.9002 | 34.0 | 4386 | 30.4388 | 1.0 |
| 16.3081 | 35.0 | 4515 | 29.9756 | 1.0 |
| 15.4745 | 36.0 | 4644 | 28.8214 | 1.0 |
| 15.938 | 37.0 | 4773 | 29.1001 | 1.0 |
| 15.9947 | 38.0 | 4902 | 31.0533 | 1.0 |
| 15.2328 | 39.0 | 5031 | 31.6211 | 1.0 |
| 15.202 | 40.0 | 5160 | 33.1383 | 1.0 |
| 15.0583 | 41.0 | 5289 | 31.4089 | 1.0 |
| 14.573 | 42.0 | 5418 | 31.5681 | 1.0 |
| 14.7401 | 43.0 | 5547 | 30.5548 | 1.0 |
| 14.6052 | 44.0 | 5676 | 31.3953 | 1.0 |
| 14.1299 | 45.0 | 5805 | 30.8153 | 1.0 |
| 13.6851 | 46.0 | 5934 | 30.9693 | 1.0 |
| 14.6677 | 47.0 | 6063 | 31.9361 | 1.0 |
| 13.6493 | 48.0 | 6192 | 34.3328 | 1.0 |
| 14.166 | 49.0 | 6321 | 32.6231 | 1.0 |
| 13.7388 | 50.0 | 6450 | 33.1736 | 1.0 |
| 13.0849 | 51.0 | 6579 | 34.9522 | 1.0 |
| 13.2502 | 52.0 | 6708 | 35.7990 | 1.0 |
| 13.5116 | 53.0 | 6837 | 31.5737 | 1.0 |
| 12.6993 | 54.0 | 6966 | 33.2650 | 1.0 |
| 13.3602 | 55.0 | 7095 | 34.8914 | 1.0 |
| 12.9585 | 56.0 | 7224 | 35.9862 | 1.0 |
| 12.7434 | 57.0 | 7353 | 34.9106 | 1.0 |
| 12.7299 | 58.0 | 7482 | 34.0106 | 1.0 |
| 12.717 | 59.0 | 7611 | 36.3588 | 1.0 |
| 12.0563 | 60.0 | 7740 | 35.0923 | 1.0 |
| 13.012 | 61.0 | 7869 | 38.7323 | 1.0 |
| 12.2878 | 62.0 | 7998 | 34.9967 | 1.0 |
| 12.2794 | 63.0 | 8127 | 37.5577 | 1.0 |
| 12.4147 | 64.0 | 8256 | 37.2733 | 1.0 |
| 12.0032 | 65.0 | 8385 | 35.3015 | 1.0 |
| 12.2793 | 66.0 | 8514 | 35.2806 | 1.0 |
| 12.2309 | 67.0 | 8643 | 36.2488 | 1.0 |
| 11.7082 | 68.0 | 8772 | 35.6687 | 1.0 |
| 11.8694 | 69.0 | 8901 | 36.0470 | 1.0 |
| 11.782 | 70.0 | 9030 | 35.4055 | 1.0 |
| 11.6254 | 71.0 | 9159 | 36.7066 | 1.0 |
| 11.5873 | 72.0 | 9288 | 36.1084 | 1.0 |
| 11.6251 | 73.0 | 9417 | 38.2932 | 1.0 |
| 11.4589 | 74.0 | 9546 | 36.5570 | 1.0 |
| 11.7378 | 75.0 | 9675 | 35.9887 | 1.0 |
| 11.4933 | 76.0 | 9804 | 36.4713 | 1.0 |
| 11.2566 | 77.0 | 9933 | 36.9622 | 1.0 |
| 11.25 | 78.0 | 10062 | 37.1016 | 1.0 |
| 11.2962 | 79.0 | 10191 | 37.8711 | 1.0 |
| 11.0868 | 80.0 | 10320 | 38.5714 | 1.0 |
| 11.2786 | 81.0 | 10449 | 38.1493 | 1.0 |
| 11.1528 | 82.0 | 10578 | 39.0100 | 1.0 |
| 11.089 | 83.0 | 10707 | 38.5474 | 1.0 |
| 10.954 | 84.0 | 10836 | 38.9405 | 1.0 |
| 11.0157 | 85.0 | 10965 | 39.3872 | 1.0 |
| 10.9849 | 86.0 | 11094 | 39.4875 | 1.0 |
| 10.5423 | 87.0 | 11223 | 39.1179 | 1.0 |
| 11.1968 | 88.0 | 11352 | 39.4084 | 1.0 |
| 10.6376 | 89.0 | 11481 | 39.8218 | 1.0 |
| 10.7131 | 90.0 | 11610 | 39.2553 | 1.0 |
| 10.8252 | 91.0 | 11739 | 39.1368 | 1.0 |
| 10.6456 | 92.0 | 11868 | 38.9194 | 1.0 |
| 10.8488 | 93.0 | 11997 | 39.5955 | 1.0 |
| 10.8675 | 94.0 | 12126 | 39.4760 | 1.0 |
| 10.4757 | 95.0 | 12255 | 40.4844 | 1.0 |
| 10.3191 | 96.0 | 12384 | 39.0673 | 1.0 |
| 10.6073 | 97.0 | 12513 | 39.3767 | 1.0 |
| 10.3038 | 98.0 | 12642 | 39.6969 | 1.0 |
| 11.0709 | 99.0 | 12771 | 39.9325 | 1.0 |
| 10.5951 | 100.0 | 12900 | 39.8755 | 1.0 |

### Framework versions