# segformer-b4-finetuned-segments-chargers-full-v4.1
This model is a fine-tuned version of `nvidia/mit-b4` on the `dskong07/chargers-full-v0.1` dataset. It achieves the following results on the evaluation set:
- Loss: 0.4050
- Mean Iou: 0.7302
- Mean Accuracy: 0.8405
- Overall Accuracy: 0.9155
- Accuracy Unlabeled: nan
- Accuracy Screen: 0.8874
- Accuracy Body: 0.9156
- Accuracy Cable: 0.6425
- Accuracy Plug: 0.8031
- Accuracy Void-background: 0.9539
- Iou Unlabeled: nan
- Iou Screen: 0.7807
- Iou Body: 0.8343
- Iou Cable: 0.5652
- Iou Plug: 0.5556
- Iou Void-background: 0.9154
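As a sanity check, the reported Mean IoU and Mean Accuracy are the unweighted averages over the five labeled classes (the `unlabeled` class is `nan` and excluded). A minimal sketch reproducing them from the per-class numbers above:

```python
# Per-class evaluation metrics from the list above; the "unlabeled"
# class is nan and is excluded from the means.
iou = {
    "screen": 0.7807,
    "body": 0.8343,
    "cable": 0.5652,
    "plug": 0.5556,
    "void-background": 0.9154,
}
accuracy = {
    "screen": 0.8874,
    "body": 0.9156,
    "cable": 0.6425,
    "plug": 0.8031,
    "void-background": 0.9539,
}

mean_iou = sum(iou.values()) / len(iou)
mean_accuracy = sum(accuracy.values()) / len(accuracy)

print(round(mean_iou, 4))       # 0.7302
print(round(mean_accuracy, 4))  # 0.8405
```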
## Model description
More information needed
## Intended uses & limitations
More information needed
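As a starting point, the checkpoint can be loaded for semantic-segmentation inference with the `transformers` library. This is an untested sketch: the image path is a placeholder, and the label names are taken from the per-class metrics above rather than verified against the dataset config.

```python
# Hedged sketch: segment one image with the fine-tuned SegFormer checkpoint.
# Class names mirror the classes reported in the evaluation metrics.
CLASSES = ["unlabeled", "screen", "body", "cable", "plug", "void-background"]


def segment(image_path: str):
    import torch
    from PIL import Image
    from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

    ckpt = "irvingz/segformer-b4-finetuned-segments-chargers-full-v4.1"
    processor = AutoImageProcessor.from_pretrained(ckpt)
    model = SegformerForSemanticSegmentation.from_pretrained(ckpt)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

    # Upsample logits to the input resolution, then take the per-pixel argmax.
    upsampled = torch.nn.functional.interpolate(
        logits, size=image.size[::-1], mode="bilinear", align_corners=False
    )
    return upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```

Calling `segment("charger.jpg")` (placeholder path) would return a per-pixel class-id map that can be indexed into `CLASSES`.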
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
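With `lr_scheduler_type: linear` and, assuming no warmup, the learning rate decays linearly from 6e-05 to 0 over training. A minimal sketch; the total-step count is inferred from the training table (step 20 ≈ epoch 2.22, i.e. ~9 optimizer steps per epoch, so ~450 steps over 50 epochs), not stated in the card:

```python
# Sketch of a warmup-free linear decay schedule
# (lr_scheduler_type: linear with zero warmup steps).
LEARNING_RATE = 6e-5
TOTAL_STEPS = 450  # inferred: ~9 steps/epoch * 50 epochs (see training table)


def lr_at_step(step: int) -> float:
    """Learning rate after `step` optimizer steps under linear decay to 0."""
    remaining = max(0, TOTAL_STEPS - step)
    return LEARNING_RATE * (remaining / TOTAL_STEPS)


# Starts at 6e-05, halves midway, reaches 0 at the final step.
print(lr_at_step(0))    # 6e-05
print(lr_at_step(225))  # 3e-05
print(lr_at_step(450))  # 0.0
```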
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Screen | Accuracy Body | Accuracy Cable | Accuracy Plug | Accuracy Void-background | Iou Unlabeled | Iou Screen | Iou Body | Iou Cable | Iou Plug | Iou Void-background |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.5049 | 2.2222 | 20 | 0.8068 | 0.5018 | 0.7473 | 0.8447 | nan | 0.7238 | 0.9311 | 0.4504 | 0.7647 | 0.8667 | 0.0 | 0.6849 | 0.7101 | 0.3604 | 0.4256 | 0.8298 |
0.2822 | 4.4444 | 40 | 0.5160 | 0.6209 | 0.7951 | 0.8640 | nan | 0.9440 | 0.7910 | 0.5499 | 0.7662 | 0.9246 | nan | 0.6072 | 0.7047 | 0.4638 | 0.4484 | 0.8802 |
0.1945 | 6.6667 | 60 | 0.4086 | 0.6900 | 0.8274 | 0.8978 | nan | 0.8896 | 0.8973 | 0.6177 | 0.7982 | 0.9342 | nan | 0.7376 | 0.8019 | 0.5163 | 0.4930 | 0.9009 |
0.2245 | 8.8889 | 80 | 0.4093 | 0.7007 | 0.8260 | 0.9020 | nan | 0.8361 | 0.9279 | 0.6242 | 0.8090 | 0.9328 | nan | 0.7540 | 0.8136 | 0.5345 | 0.4992 | 0.9021 |
0.1808 | 11.1111 | 100 | 0.3741 | 0.7021 | 0.8068 | 0.9051 | nan | 0.7754 | 0.9242 | 0.6154 | 0.7694 | 0.9498 | nan | 0.7222 | 0.8089 | 0.5384 | 0.5339 | 0.9070 |
0.1342 | 13.3333 | 120 | 0.3797 | 0.7106 | 0.8224 | 0.9066 | nan | 0.8693 | 0.9029 | 0.6329 | 0.7564 | 0.9504 | nan | 0.7614 | 0.8113 | 0.5539 | 0.5205 | 0.9061 |
0.1204 | 15.5556 | 140 | 0.4131 | 0.7112 | 0.8367 | 0.9063 | nan | 0.8737 | 0.9160 | 0.6331 | 0.8214 | 0.9392 | nan | 0.7638 | 0.8212 | 0.5585 | 0.5070 | 0.9057 |
0.1042 | 17.7778 | 160 | 0.3944 | 0.7180 | 0.8386 | 0.9096 | nan | 0.8884 | 0.9107 | 0.6271 | 0.8207 | 0.9461 | nan | 0.7674 | 0.8208 | 0.5567 | 0.5350 | 0.9103 |
0.093 | 20.0 | 180 | 0.3910 | 0.7231 | 0.8400 | 0.9121 | nan | 0.9020 | 0.9020 | 0.6366 | 0.8069 | 0.9526 | nan | 0.7731 | 0.8274 | 0.5605 | 0.5424 | 0.9119 |
0.0989 | 22.2222 | 200 | 0.3632 | 0.7260 | 0.8355 | 0.9142 | nan | 0.9002 | 0.9133 | 0.6173 | 0.7920 | 0.9547 | nan | 0.7818 | 0.8316 | 0.5560 | 0.5466 | 0.9139 |
0.0937 | 24.4444 | 220 | 0.3956 | 0.7241 | 0.8367 | 0.9130 | nan | 0.8941 | 0.9060 | 0.6165 | 0.8118 | 0.9550 | nan | 0.7743 | 0.8286 | 0.5584 | 0.5467 | 0.9128 |
0.0987 | 26.6667 | 240 | 0.4233 | 0.7240 | 0.8409 | 0.9125 | nan | 0.9040 | 0.9013 | 0.6209 | 0.8244 | 0.9537 | nan | 0.7752 | 0.8274 | 0.5574 | 0.5477 | 0.9124 |
0.0935 | 28.8889 | 260 | 0.4249 | 0.7252 | 0.8392 | 0.9124 | nan | 0.8810 | 0.9102 | 0.6346 | 0.8192 | 0.9512 | nan | 0.7748 | 0.8265 | 0.5619 | 0.5515 | 0.9115 |
0.0827 | 31.1111 | 280 | 0.4266 | 0.7241 | 0.8409 | 0.9124 | nan | 0.8800 | 0.9169 | 0.6387 | 0.8208 | 0.9480 | nan | 0.7736 | 0.8301 | 0.5659 | 0.5388 | 0.9121 |
0.0785 | 33.3333 | 300 | 0.4034 | 0.7285 | 0.8386 | 0.9144 | nan | 0.8847 | 0.9154 | 0.6273 | 0.8122 | 0.9533 | nan | 0.7804 | 0.8317 | 0.5631 | 0.5539 | 0.9135 |
0.0733 | 35.5556 | 320 | 0.4061 | 0.7316 | 0.8446 | 0.9150 | nan | 0.8993 | 0.9031 | 0.6489 | 0.8164 | 0.9554 | nan | 0.7853 | 0.8312 | 0.5658 | 0.5614 | 0.9144 |
0.0683 | 37.7778 | 340 | 0.4115 | 0.7266 | 0.8347 | 0.9148 | nan | 0.8581 | 0.9301 | 0.6239 | 0.8105 | 0.9511 | nan | 0.7685 | 0.8340 | 0.5604 | 0.5545 | 0.9154 |
0.1087 | 40.0 | 360 | 0.4317 | 0.7297 | 0.8459 | 0.9143 | nan | 0.9070 | 0.9005 | 0.6467 | 0.8211 | 0.9544 | nan | 0.7768 | 0.8286 | 0.5663 | 0.5620 | 0.9146 |
0.0634 | 42.2222 | 380 | 0.4252 | 0.7296 | 0.8426 | 0.9150 | nan | 0.8926 | 0.9141 | 0.6285 | 0.8250 | 0.9529 | nan | 0.7812 | 0.8341 | 0.5644 | 0.5541 | 0.9144 |
0.0786 | 44.4444 | 400 | 0.3969 | 0.7294 | 0.8408 | 0.9151 | nan | 0.8860 | 0.9139 | 0.6511 | 0.7993 | 0.9535 | nan | 0.7787 | 0.8333 | 0.5647 | 0.5551 | 0.9153 |
0.0693 | 46.6667 | 420 | 0.4133 | 0.7292 | 0.8422 | 0.9147 | nan | 0.8853 | 0.9138 | 0.6427 | 0.8168 | 0.9525 | nan | 0.7782 | 0.8334 | 0.5674 | 0.5523 | 0.9145 |
0.0663 | 48.8889 | 440 | 0.4050 | 0.7302 | 0.8405 | 0.9155 | nan | 0.8874 | 0.9156 | 0.6425 | 0.8031 | 0.9539 | nan | 0.7807 | 0.8343 | 0.5652 | 0.5556 | 0.9154 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0