---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: DinoVdo-large-2025_01_27_45863-bs32_freeze
  results: []
---

DinoVdo is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

- Loss: 0.1236
- F1 Micro: 0.8136
- F1 Macro: 0.7060
- Accuracy: 0.3071
| Class | F1 per class |
|:-------------------------|-------:|
| Acropore_branched | 0.9015 |
| Acropore_digitised | 0.6437 |
| Acropore_sub_massive | 0.3853 |
| Acropore_tabular | 0.9293 |
| Algae_assembly | 0.7615 |
| Algae_drawn_up | 0.4765 |
| Algae_limestone | 0.7542 |
| Algae_sodding | 0.8600 |
| Atra/Leucospilota | 0.8369 |
| Bleached_coral | 0.6405 |
| Blurred | 0.6294 |
| Dead_coral | 0.7340 |
| Fish | 0.7477 |
| Homo_sapiens | 0.7788 |
| Human_object | 0.7629 |
| Living_coral | 0.6290 |
| Millepore | 0.8251 |
| No_acropore_encrusting | 0.6364 |
| No_acropore_foliaceous | 0.8077 |
| No_acropore_massive | 0.7208 |
| No_acropore_solitary | 0.4468 |
| No_acropore_sub_massive | 0.6970 |
| Rock | 0.8818 |
| Rubble | 0.7686 |
| Sand | 0.9235 |
| Sea_cucumber | 0.8234 |
| Sea_urchins | 0.7079 |
| Sponge | 0.3861 |
| Syringodium_isoetifolium | 0.9720 |
| Thalassodendron_ciliatum | 0.9886 |
| Useless | 0.9745 |
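These are multilabel metrics: F1 Micro pools true/false positives across all classes, F1 Macro is the unweighted mean of the per-class F1 scores above, and Accuracy is most likely the strict exact-match (subset) score, which is why it sits far below both F1 values. A minimal sketch of how such metrics can be computed with scikit-learn (an assumption about the evaluation, not a copy of the actual code):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Toy binary indicator matrices of shape (n_samples, n_classes);
# each row holds the labels present in one image.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 1]])

f1_micro = f1_score(y_true, y_pred, average="micro")  # pools TP/FP/FN across classes
f1_macro = f1_score(y_true, y_pred, average="macro")  # unweighted mean of per-class F1
subset_acc = accuracy_score(y_true, y_pred)           # exact match: all labels must agree

print(f"F1 micro={f1_micro:.4f}, F1 macro={f1_macro:.4f}, accuracy={subset_acc:.4f}")
```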
---

# Model description
DinoVdo is a model built on top of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
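As a rough illustration, a head of that shape might look like the following PyTorch sketch; the layer widths, dropout rate, and layer ordering are assumptions, not the exact architecture (1024 is the dinov2-large hidden size, 31 the number of classes in the tables below):

```python
import torch.nn as nn

class ClassificationHead(nn.Module):
    """Illustrative head: linear -> ReLU -> batch norm -> dropout -> linear."""

    def __init__(self, in_features: int = 1024, num_classes: int = 31, p_drop: float = 0.2):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_features, 512),  # project the pooled DINOv2 features
            nn.ReLU(),
            nn.BatchNorm1d(512),
            nn.Dropout(p_drop),
            nn.Linear(512, num_classes),  # one logit per label
        )

    def forward(self, x):
        # Raw logits; a sigmoid (e.g. inside BCEWithLogitsLoss) turns them into
        # independent per-class probabilities for the multilabel task.
        return self.layers(x)
```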
The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
---

# Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
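A minimal inference sketch, assuming the checkpoint loads through the standard `transformers` Auto classes and that the repository id below is the one hosting this card (both are assumptions; substitute the real id and loading code if they differ):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

MODEL_ID = "lombardata/DinoVdo-large-2025_01_27_45863-bs32_freeze"  # assumed repo id

processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("reef_quadrat.jpg")  # any RGB underwater photo
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel output: a sigmoid per class, keep labels above a 0.5 threshold.
probs = torch.sigmoid(logits)[0]
labels = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(labels)
```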
---

# Training and evaluation data

Details on the number of images for each class are given in the following table:
| Class | train | test | val | Total |
|:-------------------------|--------:|-------:|------:|--------:|
| Acropore_branched | 1480 | 469 | 459 | 2408 |
| Acropore_digitised | 571 | 156 | 161 | 888 |
| Acropore_sub_massive | 150 | 52 | 41 | 243 |
| Acropore_tabular | 999 | 292 | 298 | 1589 |
| Algae_assembly | 2554 | 842 | 842 | 4238 |
| Algae_drawn_up | 367 | 130 | 123 | 620 |
| Algae_limestone | 1651 | 562 | 559 | 2772 |
| Algae_sodding | 3142 | 994 | 981 | 5117 |
| Atra/Leucospilota | 1084 | 349 | 359 | 1792 |
| Bleached_coral | 219 | 69 | 72 | 360 |
| Blurred | 191 | 68 | 61 | 320 |
| Dead_coral | 1980 | 648 | 636 | 3264 |
| Fish | 2018 | 661 | 642 | 3321 |
| Homo_sapiens | 161 | 63 | 58 | 282 |
| Human_object | 156 | 55 | 59 | 270 |
| Living_coral | 397 | 151 | 153 | 701 |
| Millepore | 386 | 127 | 124 | 637 |
| No_acropore_encrusting | 442 | 141 | 142 | 725 |
| No_acropore_foliaceous | 204 | 47 | 35 | 286 |
| No_acropore_massive | 1030 | 341 | 334 | 1705 |
| No_acropore_solitary | 202 | 55 | 46 | 303 |
| No_acropore_sub_massive | 1402 | 428 | 426 | 2256 |
| Rock | 4481 | 1495 | 1481 | 7457 |
| Rubble | 3092 | 1015 | 1016 | 5123 |
| Sand | 5839 | 1945 | 1935 | 9719 |
| Sea_cucumber | 1407 | 437 | 450 | 2294 |
| Sea_urchins | 328 | 110 | 107 | 545 |
| Sponge | 267 | 98 | 105 | 470 |
| Syringodium_isoetifolium | 1213 | 392 | 390 | 1995 |
| Thalassodendron_ciliatum | 781 | 262 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |
---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:
- **Number of Epochs**: 91
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch below)
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
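A minimal PyTorch sketch of this optimizer and scheduler setup (the stand-in model and loss value are placeholders): `ReduceLROnPlateau` watches the validation loss and multiplies the learning rate by 0.1 once it has stalled for 5 epochs, which matches the 0.001 → 0.0001 → ... steps visible in the Learning Rate column of the results below.

```python
import torch
import torch.nn as nn

model = nn.Linear(1024, 31)  # placeholder for the frozen-encoder classifier

# Only trainable parameters are optimized; with a frozen encoder that is the head.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(91):
    val_loss = 0.13  # placeholder: compute the real validation loss here
    scheduler.step(val_loss)  # LR drops 10x after 5 epochs without improvement
```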
## Data Augmentation

Data were augmented using the following transformations (a code sketch follows the two lists):

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00
Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
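The transform names above match Kornia's augmentation API (`ColorJiggle` and `RandomPerspective` in particular), so a hedged sketch of an equivalent train pipeline might look as follows; the resize resolution and the jitter/distortion magnitudes are illustrative assumptions:

```python
import torch
import torch.nn as nn
import kornia.augmentation as K

# Operates on batched image tensors in [0, 1], shape (B, C, H, W).
train_transforms = nn.Sequential(
    K.Resize((224, 224)),                        # assumed input resolution
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),   # brightness/contrast/saturation/hue
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)

batch = torch.rand(4, 3, 256, 256)  # dummy batch for a smoke test
augmented = train_transforms(batch)
```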
## Training results

Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
--- | --- | --- | --- | --- | ---
1 | 0.16775080561637878 | 0.2293 | 0.7433 | 0.5138 | 0.001
2 | 0.15366077423095703 | 0.2436 | 0.7614 | 0.5771 | 0.001
3 | 0.14831368625164032 | 0.2415 | 0.7763 | 0.6194 | 0.001
4 | 0.14640773832798004 | 0.2555 | 0.7808 | 0.6276 | 0.001
5 | 0.14515382051467896 | 0.2520 | 0.7788 | 0.6421 | 0.001
6 | 0.14404040575027466 | 0.2579 | 0.7802 | 0.6147 | 0.001
7 | 0.14518576860427856 | 0.2503 | 0.7787 | 0.6141 | 0.001
8 | 0.14463570713996887 | 0.2534 | 0.7776 | 0.6193 | 0.001
9 | 0.14786836504936218 | 0.2454 | 0.7813 | 0.6363 | 0.001
10 | 0.14251072704792023 | 0.2607 | 0.7866 | 0.6366 | 0.001
11 | 0.14535894989967346 | 0.2618 | 0.7908 | 0.6566 | 0.001
12 | 0.14042578637599945 | 0.2600 | 0.7895 | 0.6439 | 0.001
13 | 0.1413952261209488 | 0.2531 | 0.7883 | 0.6490 | 0.001
14 | 0.14056158065795898 | 0.2649 | 0.7894 | 0.6348 | 0.001
15 | 0.13873133063316345 | 0.2632 | 0.7906 | 0.6470 | 0.001
16 | 0.14008578658103943 | 0.2604 | 0.7858 | 0.6331 | 0.001
17 | 0.13811993598937988 | 0.2520 | 0.7955 | 0.6651 | 0.001
18 | 0.13870170712471008 | 0.2702 | 0.7914 | 0.6498 | 0.001
19 | 0.1373777985572815 | 0.2632 | 0.7940 | 0.6356 | 0.001
20 | 0.13864819705486298 | 0.2551 | 0.7850 | 0.6393 | 0.001
21 | 0.13566707074642181 | 0.2646 | 0.7943 | 0.6496 | 0.001
22 | 0.1371580958366394 | 0.2723 | 0.7972 | 0.6400 | 0.001
23 | 0.13614478707313538 | 0.2604 | 0.7938 | 0.6595 | 0.001
24 | 0.1362675279378891 | 0.2649 | 0.7954 | 0.6418 | 0.001
25 | 0.1358671337366104 | 0.2733 | 0.7963 | 0.6500 | 0.001
26 | 0.13484793901443481 | 0.2691 | 0.7984 | 0.6555 | 0.001
27 | 0.13669784367084503 | 0.2688 | 0.7944 | 0.6535 | 0.001
28 | 0.13569706678390503 | 0.2677 | 0.7923 | 0.6398 | 0.001
29 | 0.14052562415599823 | 0.2639 | 0.7924 | 0.6637 | 0.001
30 | 0.13725879788398743 | 0.2723 | 0.7875 | 0.6405 | 0.001
31 | 0.13544805347919464 | 0.2719 | 0.7986 | 0.6562 | 0.001
32 | 0.13693773746490479 | 0.2649 | 0.7930 | 0.6463 | 0.001
33 | 0.13195939362049103 | 0.2747 | 0.8015 | 0.6660 | 0.0001
34 | 0.1299898624420166 | 0.2834 | 0.8043 | 0.6756 | 0.0001
35 | 0.12946264445781708 | 0.2810 | 0.8066 | 0.6772 | 0.0001
36 | 0.13086125254631042 | 0.2817 | 0.8046 | 0.6795 | 0.0001
37 | 0.1278763711452484 | 0.2824 | 0.8054 | 0.6792 | 0.0001
38 | 0.12904110550880432 | 0.2848 | 0.8078 | 0.6814 | 0.0001
39 | 0.12716704607009888 | 0.2939 | 0.8116 | 0.6833 | 0.0001
40 | 0.1293308585882187 | 0.2908 | 0.8116 | 0.6879 | 0.0001
41 | 0.12695887684822083 | 0.2918 | 0.8089 | 0.6864 | 0.0001
42 | 0.12624548375606537 | 0.2914 | 0.8109 | 0.6837 | 0.0001
43 | 0.1261172592639923 | 0.2949 | 0.8123 | 0.6984 | 0.0001
44 | 0.12830273807048798 | 0.2935 | 0.8106 | 0.6834 | 0.0001
45 | 0.12624593079090118 | 0.2932 | 0.8113 | 0.7010 | 0.0001
46 | 0.12462077289819717 | 0.2960 | 0.8147 | 0.6964 | 0.0001
47 | 0.12529432773590088 | 0.2988 | 0.8126 | 0.6923 | 0.0001
48 | 0.12631145119667053 | 0.2977 | 0.8133 | 0.6954 | 0.0001
49 | 0.1252526491880417 | 0.3037 | 0.8158 | 0.6952 | 0.0001
50 | 0.12632089853286743 | 0.3005 | 0.8136 | 0.7008 | 0.0001
51 | 0.1246422603726387 | 0.3009 | 0.8158 | 0.7019 | 0.0001
52 | 0.12534211575984955 | 0.2911 | 0.8092 | 0.6949 | 0.0001
53 | 0.12436465919017792 | 0.3023 | 0.8154 | 0.7019 | 1e-05
54 | 0.12488020956516266 | 0.3009 | 0.8154 | 0.7040 | 1e-05
55 | 0.12366042286157608 | 0.3005 | 0.8144 | 0.6998 | 1e-05
56 | 0.12352865934371948 | 0.3033 | 0.8168 | 0.7004 | 1e-05
57 | 0.1239086389541626 | 0.3030 | 0.8157 | 0.7002 | 1e-05
58 | 0.12343526631593704 | 0.3026 | 0.8157 | 0.6995 | 1e-05
59 | 0.12345146387815475 | 0.3047 | 0.8150 | 0.7012 | 1e-05
60 | 0.1239377036690712 | 0.2981 | 0.8128 | 0.6932 | 1e-05
61 | 0.12398885935544968 | 0.3009 | 0.8174 | 0.7076 | 1e-05
62 | 0.12334412336349487 | 0.3019 | 0.8152 | 0.7032 | 1e-05
63 | 0.12325507402420044 | 0.3023 | 0.8158 | 0.7023 | 1e-05
64 | 0.12346883863210678 | 0.3047 | 0.8152 | 0.6999 | 1e-05
65 | 0.12324482202529907 | 0.2977 | 0.8145 | 0.7001 | 1e-05
66 | 0.12292143702507019 | 0.3012 | 0.8145 | 0.7004 | 1e-05
67 | 0.12375594675540924 | 0.3016 | 0.8159 | 0.6993 | 1e-05
68 | 0.1228519007563591 | 0.2998 | 0.8176 | 0.7039 | 1e-05
69 | 0.12302352488040924 | 0.3058 | 0.8157 | 0.7006 | 1e-05
70 | 0.12284138053655624 | 0.3040 | 0.8170 | 0.7009 | 1e-05
71 | 0.12295401096343994 | 0.3019 | 0.8158 | 0.7043 | 1e-05
72 | 0.123215451836586 | 0.3016 | 0.8171 | 0.7025 | 1e-05
73 | 0.12291014939546585 | 0.3054 | 0.8174 | 0.7049 | 1e-05
74 | 0.12304174154996872 | 0.3009 | 0.8141 | 0.6942 | 1e-05
75 | 0.1232200339436531 | 0.3033 | 0.8161 | 0.7001 | 1e-06
76 | 0.12267689406871796 | 0.3058 | 0.8171 | 0.7020 | 1e-06
77 | 0.12284990400075912 | 0.3079 | 0.8191 | 0.7060 | 1e-06
78 | 0.123690165579319 | 0.3019 | 0.8166 | 0.7072 | 1e-06
79 | 0.12329532951116562 | 0.3047 | 0.8156 | 0.6992 | 1e-06
80 | 0.12325812131166458 | 0.3026 | 0.8172 | 0.6994 | 1e-06
81 | 0.12240613251924515 | 0.3054 | 0.8176 | 0.7037 | 1e-06
82 | 0.12270382046699524 | 0.3002 | 0.8151 | 0.6972 | 1e-06
83 | 0.12315402179956436 | 0.2995 | 0.8146 | 0.6939 | 1e-06
84 | 0.1225922629237175 | 0.3026 | 0.8177 | 0.7017 | 1e-06
85 | 0.1230376735329628 | 0.3072 | 0.8181 | 0.7059 | 1e-06
86 | 0.12335028499364853 | 0.3033 | 0.8168 | 0.7059 | 1e-06
87 | 0.12355341017246246 | 0.3005 | 0.8143 | 0.6944 | 1e-06
88 | 0.12264065444469452 | 0.3079 | 0.8186 | 0.7029 | 1e-07
89 | 0.1229795441031456 | 0.3075 | 0.8179 | 0.7084 | 1e-07
90 | 0.12317965924739838 | 0.3037 | 0.8195 | 0.7105 | 1e-07
91 | 0.12267619371414185 | 0.3019 | 0.8155 | 0.6972 | 1e-07
---

# Framework Versions

- **Transformers**: 4.48.0
- **Pytorch**: 2.5.1+cu124
- **Datasets**: 3.0.2
- **Tokenizers**: 0.21.0