---
library_name: transformers
license: other
base_model: facebook/mask2former-swin-base-IN21k-ade-semantic
tags:
- generated_from_trainer
model-index:
- name: mask2former-finetuned-ER-Mito-LD3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# mask2former-finetuned-ER-Mito-LD3

This model is a fine-tuned version of [facebook/mask2former-swin-base-IN21k-ade-semantic](https://huggingface.co/facebook/mask2former-swin-base-IN21k-ade-semantic) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 33.4906
- Dummy: 1.0
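
The card ships without a usage example, so here is a minimal semantic-segmentation inference sketch. The checkpoint identifier and input image below are placeholders, not values confirmed by the card:

```python
# Minimal inference sketch; checkpoint path and image are hypothetical placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation

checkpoint = "mask2former-finetuned-ER-Mito-LD3"  # placeholder repo id or local path
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = Mask2FormerForUniversalSegmentation.from_pretrained(checkpoint).eval()

image = Image.open("example.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Merge the predicted mask and class logits into a per-pixel label map
# at the original image resolution.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # (height, width)
)[0]
print(semantic_map.shape)
```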

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent configuration is sketched after this list):
- learning_rate: 0.0001
- train_batch_size: 5
- eval_batch_size: 5
- seed: 1337
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: polynomial
- training_steps: 12900
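
As a convenience, here is a sketch of how these settings map onto `transformers.TrainingArguments`. It is a reconstruction under stated assumptions, not the original training script: the output directory and evaluation strategy are placeholders, and scheduler details beyond `lr_scheduler_type="polynomial"` (e.g. the polynomial power) are left at library defaults.

```python
# Hypothetical reconstruction of the listed hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mask2former-finetuned-ER-Mito-LD3",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=5,
    per_device_eval_batch_size=5,
    seed=1337,
    optim="adamw_torch",          # AdamW, torch implementation
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=12900,              # fixed step budget rather than num_train_epochs
    eval_strategy="epoch",        # assumed from the per-epoch validation losses below
)
```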

### Training results

| Training Loss | Epoch | Step | Validation Loss | Dummy |
|:-------------:|:--------:|:-----:|:---------------:|:-----:|
| 48.5372 | 1.0 | 104 | 36.3338 | 1.0 |
| 33.5327 | 2.0 | 208 | 31.7351 | 1.0 |
| 29.7691 | 3.0 | 312 | 30.4858 | 1.0 |
| 26.3002 | 4.0 | 416 | 28.6091 | 1.0 |
| 24.7501 | 5.0 | 520 | 27.0967 | 1.0 |
| 23.4495 | 6.0 | 624 | 26.6241 | 1.0 |
| 23.274 | 7.0 | 728 | 27.1544 | 1.0 |
| 21.1617 | 8.0 | 832 | 27.4625 | 1.0 |
| 20.373 | 9.0 | 936 | 27.5745 | 1.0 |
| 20.4295 | 10.0 | 1040 | 27.6942 | 1.0 |
| 20.2526 | 11.0 | 1144 | 27.7829 | 1.0 |
| 19.2572 | 12.0 | 1248 | 27.2960 | 1.0 |
| 19.0089 | 13.0 | 1352 | 26.0039 | 1.0 |
| 18.3621 | 14.0 | 1456 | 26.5623 | 1.0 |
| 18.0517 | 15.0 | 1560 | 26.2700 | 1.0 |
| 18.3139 | 16.0 | 1664 | 27.2972 | 1.0 |
| 17.6129 | 17.0 | 1768 | 26.4869 | 1.0 |
| 17.8402 | 18.0 | 1872 | 27.7618 | 1.0 |
| 16.6494 | 19.0 | 1976 | 27.5173 | 1.0 |
| 17.0833 | 20.0 | 2080 | 28.1242 | 1.0 |
| 16.5967 | 21.0 | 2184 | 29.1195 | 1.0 |
| 16.2634 | 22.0 | 2288 | 27.0367 | 1.0 |
| 16.6797 | 23.0 | 2392 | 27.1799 | 1.0 |
| 16.0344 | 24.0 | 2496 | 26.6408 | 1.0 |
| 15.7701 | 25.0 | 2600 | 28.4040 | 1.0 |
| 15.6061 | 26.0 | 2704 | 28.0687 | 1.0 |
| 15.3311 | 27.0 | 2808 | 27.1765 | 1.0 |
| 15.2464 | 28.0 | 2912 | 28.2050 | 1.0 |
| 15.0459 | 29.0 | 3016 | 28.6291 | 1.0 |
| 14.7514 | 30.0 | 3120 | 27.8241 | 1.0 |
| 15.0833 | 31.0 | 3224 | 29.1936 | 1.0 |
| 15.0817 | 32.0 | 3328 | 28.4044 | 1.0 |
| 14.3201 | 33.0 | 3432 | 28.3709 | 1.0 |
| 14.5918 | 34.0 | 3536 | 29.3898 | 1.0 |
| 14.7177 | 35.0 | 3640 | 28.5130 | 1.0 |
| 13.9919 | 36.0 | 3744 | 27.7597 | 1.0 |
| 14.2267 | 37.0 | 3848 | 29.2324 | 1.0 |
| 13.7801 | 38.0 | 3952 | 28.3574 | 1.0 |
| 14.1839 | 39.0 | 4056 | 28.8711 | 1.0 |
| 13.7545 | 40.0 | 4160 | 28.2947 | 1.0 |
| 14.1627 | 41.0 | 4264 | 29.4866 | 1.0 |
| 13.5155 | 42.0 | 4368 | 29.8527 | 1.0 |
| 13.704 | 43.0 | 4472 | 29.4292 | 1.0 |
| 13.6644 | 44.0 | 4576 | 29.2324 | 1.0 |
| 13.2006 | 45.0 | 4680 | 29.5414 | 1.0 |
| 13.1545 | 46.0 | 4784 | 29.6988 | 1.0 |
| 13.5744 | 47.0 | 4888 | 28.9933 | 1.0 |
| 12.8073 | 48.0 | 4992 | 28.9770 | 1.0 |
| 13.3773 | 49.0 | 5096 | 30.3950 | 1.0 |
| 12.9506 | 50.0 | 5200 | 31.2871 | 1.0 |
| 13.0674 | 51.0 | 5304 | 29.5711 | 1.0 |
| 13.1265 | 52.0 | 5408 | 31.0887 | 1.0 |
| 13.1392 | 53.0 | 5512 | 29.8433 | 1.0 |
| 12.6108 | 54.0 | 5616 | 29.6436 | 1.0 |
| 12.7608 | 55.0 | 5720 | 29.8706 | 1.0 |
| 12.8723 | 56.0 | 5824 | 30.0596 | 1.0 |
| 12.5437 | 57.0 | 5928 | 30.1367 | 1.0 |
| 12.1387 | 58.0 | 6032 | 30.4089 | 1.0 |
| 12.948 | 59.0 | 6136 | 30.5375 | 1.0 |
| 12.2869 | 60.0 | 6240 | 32.3827 | 1.0 |
| 12.7717 | 61.0 | 6344 | 30.6397 | 1.0 |
| 12.4924 | 62.0 | 6448 | 30.7005 | 1.0 |
| 12.3031 | 63.0 | 6552 | 29.9865 | 1.0 |
| 12.5575 | 64.0 | 6656 | 31.0697 | 1.0 |
| 11.9496 | 65.0 | 6760 | 31.5794 | 1.0 |
| 12.0462 | 66.0 | 6864 | 31.6537 | 1.0 |
| 12.7167 | 67.0 | 6968 | 30.7411 | 1.0 |
| 11.8595 | 68.0 | 7072 | 30.4970 | 1.0 |
| 11.7458 | 69.0 | 7176 | 30.8332 | 1.0 |
| 12.2058 | 70.0 | 7280 | 32.0951 | 1.0 |
| 12.0874 | 71.0 | 7384 | 32.4695 | 1.0 |
| 11.705 | 72.0 | 7488 | 31.3117 | 1.0 |
| 12.0 | 73.0 | 7592 | 30.6540 | 1.0 |
| 11.9852 | 74.0 | 7696 | 34.2950 | 1.0 |
| 11.7597 | 75.0 | 7800 | 31.6361 | 1.0 |
| 11.8713 | 76.0 | 7904 | 31.1082 | 1.0 |
| 11.705 | 77.0 | 8008 | 31.8714 | 1.0 |
| 11.5474 | 78.0 | 8112 | 31.0299 | 1.0 |
| 11.8387 | 79.0 | 8216 | 31.3672 | 1.0 |
| 11.7057 | 80.0 | 8320 | 31.6435 | 1.0 |
| 11.5656 | 81.0 | 8424 | 31.1940 | 1.0 |
| 11.6578 | 82.0 | 8528 | 31.8184 | 1.0 |
| 11.3049 | 83.0 | 8632 | 31.8668 | 1.0 |
| 11.5542 | 84.0 | 8736 | 32.8192 | 1.0 |
| 11.3942 | 85.0 | 8840 | 30.9723 | 1.0 |
| 11.6955 | 86.0 | 8944 | 31.3487 | 1.0 |
| 11.4862 | 87.0 | 9048 | 32.0451 | 1.0 |
| 11.5867 | 88.0 | 9152 | 31.9769 | 1.0 |
| 11.0975 | 89.0 | 9256 | 31.9721 | 1.0 |
| 11.5126 | 90.0 | 9360 | 35.3877 | 1.0 |
| 11.067 | 91.0 | 9464 | 33.7614 | 1.0 |
| 11.3857 | 92.0 | 9568 | 32.7046 | 1.0 |
| 11.5511 | 93.0 | 9672 | 32.1096 | 1.0 |
| 11.0961 | 94.0 | 9776 | 32.8302 | 1.0 |
| 11.2935 | 95.0 | 9880 | 32.6688 | 1.0 |
| 11.2398 | 96.0 | 9984 | 32.2807 | 1.0 |
| 11.0444 | 97.0 | 10088 | 32.2766 | 1.0 |
| 11.3157 | 98.0 | 10192 | 32.4437 | 1.0 |
| 11.0191 | 99.0 | 10296 | 32.3851 | 1.0 |
| 11.1406 | 100.0 | 10400 | 32.1389 | 1.0 |
| 11.1237 | 101.0 | 10504 | 32.4886 | 1.0 |
| 10.9485 | 102.0 | 10608 | 32.5051 | 1.0 |
| 10.9188 | 103.0 | 10712 | 32.8615 | 1.0 |
| 11.3029 | 104.0 | 10816 | 33.0388 | 1.0 |
| 11.2023 | 105.0 | 10920 | 32.4923 | 1.0 |
| 10.9634 | 106.0 | 11024 | 32.3288 | 1.0 |
| 11.257 | 107.0 | 11128 | 31.8855 | 1.0 |
| 11.0193 | 108.0 | 11232 | 34.0067 | 1.0 |
| 10.6401 | 109.0 | 11336 | 33.2946 | 1.0 |
| 11.0542 | 110.0 | 11440 | 34.0535 | 1.0 |
| 10.888 | 111.0 | 11544 | 32.7206 | 1.0 |
| 10.9706 | 112.0 | 11648 | 33.1238 | 1.0 |
| 11.0075 | 113.0 | 11752 | 32.9882 | 1.0 |
| 10.7895 | 114.0 | 11856 | 32.7985 | 1.0 |
| 10.9181 | 115.0 | 11960 | 32.9143 | 1.0 |
| 10.5938 | 116.0 | 12064 | 33.0722 | 1.0 |
| 10.4932 | 117.0 | 12168 | 34.2366 | 1.0 |
| 10.9761 | 118.0 | 12272 | 33.8880 | 1.0 |
| 10.6918 | 119.0 | 12376 | 34.3289 | 1.0 |
| 10.896 | 120.0 | 12480 | 33.6095 | 1.0 |
| 10.6876 | 121.0 | 12584 | 33.8608 | 1.0 |
| 10.5666 | 122.0 | 12688 | 33.6994 | 1.0 |
| 10.8161 | 123.0 | 12792 | 33.6172 | 1.0 |
| 10.7195 | 124.0 | 12896 | 33.5397 | 1.0 |
| 10.6712 | 124.0385 | 12900 | 33.4906 | 1.0 |

### Framework versions

- Transformers 4.50.0.dev0
- Pytorch 2.4.1
- Datasets 3.3.2
- Tokenizers 0.21.0
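
For exact reproduction it may help to pin the versions above; here is a small convenience sketch (not part of the generated card) to compare a local environment against that list:

```python
# Print installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)
```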