# segformer-b0-finetuned-morphpadver1-hgo-coord
This model is a fine-tuned version of nvidia/mit-b0 on the NICOPOI-9/morphpad_coord_hgo_512_4class dataset. It achieves the following results on the evaluation set:
- Loss: 0.0306
- Mean Iou: 0.9858
- Mean Accuracy: 0.9928
- Overall Accuracy: 0.9928
- Accuracy 0-0: 0.9933
- Accuracy 0-90: 0.9937
- Accuracy 90-0: 0.9943
- Accuracy 90-90: 0.9898
- Iou 0-0: 0.9885
- Iou 0-90: 0.9850
- Iou 90-0: 0.9826
- Iou 90-90: 0.9872
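The reported Mean Iou and Mean Accuracy are unweighted averages of the four per-class values above (Overall Accuracy, by contrast, is pixel-weighted; the two coincide here because the classes are roughly balanced). A quick check:

```python
# Per-class evaluation metrics copied from the list above
# (classes: 0-0, 0-90, 90-0, 90-90)
per_class_iou = {"0-0": 0.9885, "0-90": 0.9850, "90-0": 0.9826, "90-90": 0.9872}
per_class_acc = {"0-0": 0.9933, "0-90": 0.9937, "90-0": 0.9943, "90-90": 0.9898}

mean_iou = sum(per_class_iou.values()) / len(per_class_iou)
mean_acc = sum(per_class_acc.values()) / len(per_class_acc)

print(round(mean_iou, 4))  # 0.9858
print(round(mean_acc, 4))  # 0.9928
```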
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 80
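With a linear scheduler, the learning rate decays from 6e-05 toward 0 over the course of training. A minimal sketch, assuming no warmup (the card does not state a warmup step count) and using the last logged step from the results table below as an approximate total:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 6e-05) -> float:
    """Linearly decay the learning rate from base_lr to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# ~124000 optimizer steps at batch size 1 (last step logged in the results table)
total = 124000
print(linear_lr(0, total))       # 6e-05 at the start
print(linear_lr(62000, total))   # 3e-05 halfway through
print(linear_lr(total, total))   # 0.0 at the end
```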
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.2185 | 2.5445 | 4000 | 1.2349 | 0.2290 | 0.3745 | 0.3762 | 0.2785 | 0.4062 | 0.4936 | 0.3198 | 0.2085 | 0.2334 | 0.2525 | 0.2216 |
1.0978 | 5.0891 | 8000 | 1.1020 | 0.2905 | 0.4487 | 0.4508 | 0.3780 | 0.5302 | 0.5341 | 0.3524 | 0.2937 | 0.2870 | 0.2991 | 0.2822 |
0.9886 | 7.6336 | 12000 | 1.0139 | 0.3231 | 0.4871 | 0.4896 | 0.4154 | 0.4500 | 0.7245 | 0.3585 | 0.3291 | 0.3272 | 0.3266 | 0.3096 |
0.9358 | 10.1781 | 16000 | 0.9575 | 0.3517 | 0.5195 | 0.5215 | 0.3765 | 0.6411 | 0.5865 | 0.4740 | 0.3438 | 0.3539 | 0.3617 | 0.3473 |
0.8735 | 12.7226 | 20000 | 0.8853 | 0.4007 | 0.5704 | 0.5726 | 0.4998 | 0.5637 | 0.7536 | 0.4647 | 0.4109 | 0.3953 | 0.4055 | 0.3913 |
0.7186 | 15.2672 | 24000 | 0.6833 | 0.5558 | 0.7151 | 0.7141 | 0.7389 | 0.6650 | 0.6919 | 0.7647 | 0.5919 | 0.5261 | 0.5453 | 0.5598 |
0.6514 | 17.8117 | 28000 | 0.4379 | 0.7017 | 0.8243 | 0.8243 | 0.8344 | 0.8161 | 0.8279 | 0.8187 | 0.7198 | 0.6807 | 0.6933 | 0.7130 |
0.603 | 20.3562 | 32000 | 0.2900 | 0.7980 | 0.8879 | 0.8874 | 0.9117 | 0.8490 | 0.8888 | 0.9020 | 0.8160 | 0.7726 | 0.7893 | 0.8142 |
0.2448 | 22.9008 | 36000 | 0.2154 | 0.8496 | 0.9184 | 0.9185 | 0.9330 | 0.9179 | 0.9170 | 0.9058 | 0.8683 | 0.8329 | 0.8445 | 0.8527 |
0.2766 | 25.4453 | 40000 | 0.2004 | 0.8612 | 0.9254 | 0.9254 | 0.9487 | 0.9059 | 0.9381 | 0.9088 | 0.8717 | 0.8469 | 0.8635 | 0.8628 |
0.6278 | 27.9898 | 44000 | 0.1410 | 0.8976 | 0.9459 | 0.9459 | 0.9426 | 0.9377 | 0.9559 | 0.9474 | 0.9075 | 0.8863 | 0.8932 | 0.9034 |
0.1684 | 30.5344 | 48000 | 0.1163 | 0.9137 | 0.9549 | 0.9548 | 0.9595 | 0.9417 | 0.9579 | 0.9605 | 0.9245 | 0.9046 | 0.9069 | 0.9187 |
0.0638 | 33.0789 | 52000 | 0.0927 | 0.9338 | 0.9657 | 0.9657 | 0.9697 | 0.9589 | 0.9715 | 0.9627 | 0.9406 | 0.9291 | 0.9291 | 0.9363 |
0.0749 | 35.6234 | 56000 | 0.0836 | 0.9382 | 0.9680 | 0.9680 | 0.9714 | 0.9663 | 0.9680 | 0.9664 | 0.9449 | 0.9325 | 0.9339 | 0.9414 |
0.045 | 38.1679 | 60000 | 0.0624 | 0.9545 | 0.9767 | 0.9767 | 0.9787 | 0.9751 | 0.9763 | 0.9766 | 0.9587 | 0.9521 | 0.9499 | 0.9573 |
0.1278 | 40.7125 | 64000 | 0.0635 | 0.9546 | 0.9767 | 0.9767 | 0.9773 | 0.9743 | 0.9813 | 0.9737 | 0.9598 | 0.9521 | 0.9492 | 0.9572 |
0.0443 | 43.2570 | 68000 | 0.0598 | 0.9584 | 0.9787 | 0.9787 | 0.9815 | 0.9723 | 0.9858 | 0.9752 | 0.9624 | 0.9548 | 0.9548 | 0.9617 |
0.0337 | 45.8015 | 72000 | 0.0549 | 0.9622 | 0.9807 | 0.9807 | 0.9877 | 0.9804 | 0.9820 | 0.9726 | 0.9648 | 0.9587 | 0.9622 | 0.9632 |
0.0434 | 48.3461 | 76000 | 0.0539 | 0.9643 | 0.9816 | 0.9817 | 0.9793 | 0.9779 | 0.9913 | 0.9781 | 0.9691 | 0.9611 | 0.9565 | 0.9703 |
0.1576 | 50.8906 | 80000 | 0.0577 | 0.9656 | 0.9825 | 0.9825 | 0.9799 | 0.9822 | 0.9825 | 0.9856 | 0.9694 | 0.9634 | 0.9653 | 0.9645 |
0.025 | 53.4351 | 84000 | 0.0453 | 0.9724 | 0.9860 | 0.9860 | 0.9856 | 0.9884 | 0.9840 | 0.9858 | 0.9762 | 0.9698 | 0.9697 | 0.9739 |
0.0318 | 55.9796 | 88000 | 0.0401 | 0.9733 | 0.9865 | 0.9865 | 0.9884 | 0.9845 | 0.9865 | 0.9865 | 0.9766 | 0.9700 | 0.9714 | 0.9753 |
0.1355 | 58.5242 | 92000 | 0.0453 | 0.9764 | 0.9880 | 0.9880 | 0.9896 | 0.9874 | 0.9889 | 0.9861 | 0.9796 | 0.9742 | 0.9731 | 0.9786 |
0.0256 | 61.0687 | 96000 | 0.0359 | 0.9817 | 0.9907 | 0.9908 | 0.9902 | 0.9925 | 0.9902 | 0.9901 | 0.9846 | 0.9808 | 0.9783 | 0.9833 |
0.019 | 63.6132 | 100000 | 0.0320 | 0.9819 | 0.9908 | 0.9909 | 0.9914 | 0.9908 | 0.9936 | 0.9875 | 0.9838 | 0.9812 | 0.9787 | 0.9841 |
0.0713 | 66.1578 | 104000 | 0.0319 | 0.9827 | 0.9912 | 0.9912 | 0.9940 | 0.9922 | 0.9937 | 0.9847 | 0.9859 | 0.9812 | 0.9807 | 0.9828 |
0.1036 | 68.7023 | 108000 | 0.0369 | 0.9807 | 0.9902 | 0.9903 | 0.9932 | 0.9916 | 0.9946 | 0.9813 | 0.9844 | 0.9807 | 0.9790 | 0.9788 |
0.0575 | 71.2468 | 112000 | 0.0338 | 0.9843 | 0.9921 | 0.9921 | 0.9939 | 0.9913 | 0.9929 | 0.9901 | 0.9870 | 0.9822 | 0.9814 | 0.9867 |
0.0136 | 73.7913 | 116000 | 0.0259 | 0.9870 | 0.9934 | 0.9934 | 0.9926 | 0.9936 | 0.9946 | 0.9930 | 0.9889 | 0.9852 | 0.9850 | 0.9891 |
0.045 | 76.3359 | 120000 | 0.0310 | 0.9844 | 0.9921 | 0.9921 | 0.9913 | 0.9926 | 0.9941 | 0.9902 | 0.9866 | 0.9834 | 0.9805 | 0.9871 |
0.6665 | 78.8804 | 124000 | 0.0306 | 0.9858 | 0.9928 | 0.9928 | 0.9933 | 0.9937 | 0.9943 | 0.9898 | 0.9885 | 0.9850 | 0.9826 | 0.9872 |
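For reference, per-class IoU and accuracy of the kind reported above are typically derived from the segmentation confusion matrix (pixel counts of true positives, false positives, and false negatives per class). A minimal sketch with hypothetical numbers, not taken from this model:

```python
def per_class_metrics(conf):
    """conf[i][j] = number of pixels with true class i predicted as class j."""
    n = len(conf)
    iou, acc = [], []
    for c in range(n):
        tp = conf[c][c]
        fn = sum(conf[c]) - tp                       # true c, predicted as something else
        fp = sum(conf[r][c] for r in range(n)) - tp  # predicted c, true class differs
        iou.append(tp / (tp + fp + fn))
        acc.append(tp / (tp + fn))                   # per-class accuracy (recall)
    return iou, acc

# Toy 2-class confusion matrix (illustrative only)
conf = [[90, 10],
        [5, 95]]
iou, acc = per_class_metrics(conf)
print(iou)  # [0.8571..., 0.8636...]
print(acc)  # [0.9, 0.95]
```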
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0