# segformer-b5-finetuned-ade20k-morphpadver1-hgo-coord-v9_mix_resample_40epochs
This model is a fine-tuned version of nvidia/segformer-b5-finetuned-ade-640-640 on the NICOPOI-9/morphpad_coord_hgo_512_4class_v2 dataset. It achieves the following results on the evaluation set:
- Loss: 0.5955
- Mean Iou: 0.7753
- Mean Accuracy: 0.8720
- Overall Accuracy: 0.8743
- Accuracy 0-0: 0.8115
- Accuracy 0-90: 0.9059
- Accuracy 90-0: 0.8813
- Accuracy 90-90: 0.8893
- Iou 0-0: 0.7564
- Iou 0-90: 0.7853
- Iou 90-0: 0.7894
- Iou 90-90: 0.7700
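As a sanity check, the reported Mean IoU and Mean Accuracy are the unweighted averages of the four per-class values listed above (a minimal sketch; the class names and scores are copied from the summary):

```python
# Per-class scores from the evaluation summary above.
per_class_iou = {"0-0": 0.7564, "0-90": 0.7853, "90-0": 0.7894, "90-90": 0.7700}
per_class_acc = {"0-0": 0.8115, "0-90": 0.9059, "90-0": 0.8813, "90-90": 0.8893}

def unweighted_mean(scores: dict) -> float:
    """Average the per-class scores without class-frequency weighting."""
    return sum(scores.values()) / len(scores)

mean_iou = unweighted_mean(per_class_iou)  # ≈ 0.7753
mean_acc = unweighted_mean(per_class_acc)  # ≈ 0.8720
print(f"Mean IoU: {mean_iou:.4f}, Mean Accuracy: {mean_acc:.4f}")
```

Overall Accuracy (0.8743) differs slightly from Mean Accuracy because it is computed over all pixels rather than averaged per class.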
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 40
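With `lr_scheduler_type: linear`, the learning rate decays linearly from 6e-05 toward 0 over the course of training. A minimal pure-Python sketch of that schedule, assuming zero warmup steps (no warmup is listed above):

```python
BASE_LR = 6e-05  # learning_rate from the hyperparameters above

def linear_lr(step: int, total_steps: int, base_lr: float = BASE_LR) -> float:
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

total_steps = 116_000  # roughly the final step count seen in the results table
print(linear_lr(0, total_steps))        # 6e-05 at the start
print(linear_lr(58_000, total_steps))   # 3e-05 at the halfway point
print(linear_lr(116_000, total_steps))  # 0.0 at the end
```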
### Training results
Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy 0-0 | Accuracy 0-90 | Accuracy 90-0 | Accuracy 90-90 | Iou 0-0 | Iou 0-90 | Iou 90-0 | Iou 90-90 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1.162 | 1.3638 | 4000 | 1.1730 | 0.3007 | 0.4629 | 0.4758 | 0.3342 | 0.5531 | 0.6459 | 0.3186 | 0.2606 | 0.3421 | 0.3576 | 0.2426 |
0.395 | 2.7276 | 8000 | 0.9973 | 0.4133 | 0.5816 | 0.5930 | 0.4473 | 0.6859 | 0.7131 | 0.4801 | 0.3740 | 0.4579 | 0.4501 | 0.3713 |
0.3669 | 4.0914 | 12000 | 0.7348 | 0.5875 | 0.7355 | 0.7422 | 0.6643 | 0.7867 | 0.8261 | 0.6650 | 0.5753 | 0.6040 | 0.6029 | 0.5679 |
0.1682 | 5.4552 | 16000 | 0.5960 | 0.6669 | 0.7992 | 0.8012 | 0.7239 | 0.7890 | 0.8507 | 0.8333 | 0.6476 | 0.6794 | 0.6793 | 0.6613 |
0.375 | 6.8190 | 20000 | 0.6153 | 0.6717 | 0.8059 | 0.8041 | 0.8256 | 0.7924 | 0.7818 | 0.8238 | 0.6541 | 0.6957 | 0.6718 | 0.6652 |
0.0555 | 8.1827 | 24000 | 0.5139 | 0.7146 | 0.8366 | 0.8337 | 0.8633 | 0.7891 | 0.8253 | 0.8688 | 0.7066 | 0.7163 | 0.7193 | 0.7164 |
0.0922 | 9.5465 | 28000 | 0.5495 | 0.7061 | 0.8232 | 0.8292 | 0.7713 | 0.8948 | 0.8771 | 0.7495 | 0.6945 | 0.7226 | 0.7186 | 0.6887 |
0.0843 | 10.9103 | 32000 | 0.5586 | 0.7233 | 0.8358 | 0.8409 | 0.7749 | 0.8844 | 0.8935 | 0.7903 | 0.7022 | 0.7486 | 0.7297 | 0.7127 |
0.0548 | 12.2741 | 36000 | 0.5068 | 0.7430 | 0.8519 | 0.8536 | 0.7967 | 0.8802 | 0.8552 | 0.8754 | 0.7230 | 0.7646 | 0.7530 | 0.7314 |
0.0703 | 13.6379 | 40000 | 0.4079 | 0.7834 | 0.8771 | 0.8791 | 0.8567 | 0.9133 | 0.8792 | 0.8592 | 0.7719 | 0.7917 | 0.7903 | 0.7796 |
0.0532 | 15.0017 | 44000 | 0.4843 | 0.7503 | 0.8549 | 0.8588 | 0.7706 | 0.8987 | 0.8868 | 0.8634 | 0.7228 | 0.7687 | 0.7628 | 0.7471 |
0.0702 | 16.3655 | 48000 | 0.4905 | 0.7572 | 0.8605 | 0.8626 | 0.8501 | 0.8941 | 0.8697 | 0.8280 | 0.7580 | 0.7694 | 0.7678 | 0.7335 |
0.0389 | 17.7293 | 52000 | 0.4224 | 0.7777 | 0.8726 | 0.8760 | 0.8338 | 0.9119 | 0.9028 | 0.8418 | 0.7611 | 0.7940 | 0.7879 | 0.7680 |
0.0441 | 19.0931 | 56000 | 0.5241 | 0.7439 | 0.8517 | 0.8537 | 0.8437 | 0.8851 | 0.8598 | 0.8180 | 0.7361 | 0.7455 | 0.7595 | 0.7346 |
0.0278 | 20.4569 | 60000 | 0.4326 | 0.7632 | 0.8623 | 0.8669 | 0.8353 | 0.9140 | 0.9066 | 0.7936 | 0.7663 | 0.7758 | 0.7795 | 0.7314 |
11.8254 | 21.8207 | 64000 | 0.4380 | 0.7765 | 0.8743 | 0.8747 | 0.8708 | 0.8755 | 0.8820 | 0.8687 | 0.7650 | 0.7907 | 0.7788 | 0.7716 |
0.0239 | 23.1845 | 68000 | 0.4872 | 0.7606 | 0.8630 | 0.8650 | 0.8762 | 0.8800 | 0.8908 | 0.8051 | 0.7477 | 0.7866 | 0.7649 | 0.7434 |
0.128 | 24.5482 | 72000 | 0.4604 | 0.7731 | 0.8700 | 0.8730 | 0.8505 | 0.9075 | 0.8951 | 0.8268 | 0.7637 | 0.7875 | 0.7842 | 0.7568 |
0.0482 | 25.9120 | 76000 | 0.5039 | 0.7559 | 0.8602 | 0.8620 | 0.8134 | 0.8757 | 0.8786 | 0.8732 | 0.7406 | 0.7746 | 0.7678 | 0.7407 |
0.0566 | 27.2758 | 80000 | 0.5290 | 0.7681 | 0.8671 | 0.8701 | 0.8100 | 0.9024 | 0.8928 | 0.8631 | 0.7410 | 0.7847 | 0.7865 | 0.7603 |
0.0148 | 28.6396 | 84000 | 0.7038 | 0.7441 | 0.8530 | 0.8541 | 0.8293 | 0.8652 | 0.8607 | 0.8570 | 0.7317 | 0.7596 | 0.7541 | 0.7311 |
0.0164 | 30.0034 | 88000 | 0.5058 | 0.7846 | 0.8776 | 0.8803 | 0.8808 | 0.9156 | 0.8982 | 0.8158 | 0.7733 | 0.8034 | 0.7946 | 0.7671 |
0.0019 | 31.3672 | 92000 | 0.5421 | 0.7682 | 0.8669 | 0.8697 | 0.8498 | 0.9145 | 0.8761 | 0.8272 | 0.7508 | 0.7768 | 0.7832 | 0.7619 |
0.0028 | 32.7310 | 96000 | 0.6268 | 0.7829 | 0.8773 | 0.8791 | 0.8674 | 0.9000 | 0.8917 | 0.8501 | 0.7662 | 0.7947 | 0.7993 | 0.7712 |
0.0103 | 34.0948 | 100000 | 0.5294 | 0.7730 | 0.8697 | 0.8731 | 0.8520 | 0.9142 | 0.8966 | 0.8160 | 0.7595 | 0.7844 | 0.7930 | 0.7552 |
0.0508 | 35.4586 | 104000 | 0.5242 | 0.7829 | 0.8763 | 0.8790 | 0.8294 | 0.9140 | 0.8913 | 0.8704 | 0.7691 | 0.7895 | 0.7964 | 0.7766 |
0.0102 | 36.8224 | 108000 | 0.6011 | 0.7772 | 0.8716 | 0.8756 | 0.8096 | 0.9239 | 0.8983 | 0.8546 | 0.7583 | 0.7816 | 0.7976 | 0.7714 |
0.0311 | 38.1862 | 112000 | 0.6471 | 0.7771 | 0.8727 | 0.8754 | 0.8152 | 0.9135 | 0.8833 | 0.8789 | 0.7553 | 0.7848 | 0.7924 | 0.7760 |
0.0089 | 39.5499 | 116000 | 0.5955 | 0.7753 | 0.8720 | 0.8743 | 0.8115 | 0.9059 | 0.8813 | 0.8893 | 0.7564 | 0.7853 | 0.7894 | 0.7700 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.1.0
- Datasets 3.2.0
- Tokenizers 0.21.0