# smids_1x_deit_small_adamax_00001_fold1
This model is a fine-tuned version of [facebook/deit-small-patch16-224](https://huggingface.co/facebook/deit-small-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.6979
- Accuracy: 0.8648
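
As a usage illustration only, the snippet below loads this checkpoint through the `transformers` image-classification pipeline. The repository id is taken from the title above; the image path is a placeholder, not part of the original training data.

```python
# Minimal inference sketch, assuming the checkpoint is hosted on the Hugging Face Hub
# under the repository id shown in the title; the image path is a placeholder.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_1x_deit_small_adamax_00001_fold1",
)

# Replace with a path or URL to an image from the target dataset's domain.
predictions = classifier("example_image.png")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```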
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an illustrative configuration sketch follows this list):
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
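
The sketch below shows one way these hyperparameters could be expressed with the `transformers` `Trainer` API; it is not the original training script. The dataset loading, class count, and output directory are placeholders, per-epoch evaluation is inferred from the results table below, and the optimizer is left at the Trainer default rather than the Adam/Adamax setting reported above.

```python
# A minimal configuration sketch, assuming a standard Trainer fine-tuning setup.
# Placeholders: num_classes, output_dir, and the datasets are not taken from the original run.
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

base_model = "facebook/deit-small-patch16-224"
num_classes = 3  # placeholder: set to the number of classes in the imagefolder dataset

processor = AutoImageProcessor.from_pretrained(base_model)
model = AutoModelForImageClassification.from_pretrained(
    base_model,
    num_labels=num_classes,
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)

training_args = TrainingArguments(
    output_dir="smids_1x_deit_small_adamax_00001_fold1",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: evaluate once per epoch, as in the table below
    save_strategy="epoch",        # assumption
    # optimizer not set here; the log above reports Adam betas/epsilon,
    # while the model name suggests Adamax
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds)  # datasets not shown
# trainer.train()
```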
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.5894 | 1.0 | 76 | 0.5595 | 0.7830 |
0.4626 | 2.0 | 152 | 0.4242 | 0.8314 |
0.2923 | 3.0 | 228 | 0.3702 | 0.8531 |
0.2746 | 4.0 | 304 | 0.3590 | 0.8464 |
0.2111 | 5.0 | 380 | 0.3390 | 0.8631 |
0.1639 | 6.0 | 456 | 0.3346 | 0.8614 |
0.108 | 7.0 | 532 | 0.3491 | 0.8698 |
0.1272 | 8.0 | 608 | 0.3559 | 0.8681 |
0.132 | 9.0 | 684 | 0.3508 | 0.8648 |
0.0579 | 10.0 | 760 | 0.3614 | 0.8715 |
0.0748 | 11.0 | 836 | 0.3587 | 0.8681 |
0.0507 | 12.0 | 912 | 0.3817 | 0.8715 |
0.0272 | 13.0 | 988 | 0.4000 | 0.8781 |
0.0136 | 14.0 | 1064 | 0.4159 | 0.8731 |
0.0187 | 15.0 | 1140 | 0.4377 | 0.8648 |
0.0055 | 16.0 | 1216 | 0.4760 | 0.8698 |
0.0039 | 17.0 | 1292 | 0.5096 | 0.8648 |
0.0182 | 18.0 | 1368 | 0.5153 | 0.8698 |
0.0025 | 19.0 | 1444 | 0.5136 | 0.8698 |
0.0086 | 20.0 | 1520 | 0.5513 | 0.8664 |
0.0136 | 21.0 | 1596 | 0.5585 | 0.8765 |
0.0139 | 22.0 | 1672 | 0.5755 | 0.8648 |
0.0008 | 23.0 | 1748 | 0.5786 | 0.8765 |
0.0007 | 24.0 | 1824 | 0.6119 | 0.8748 |
0.013 | 25.0 | 1900 | 0.6090 | 0.8681 |
0.0086 | 26.0 | 1976 | 0.6766 | 0.8581 |
0.0066 | 27.0 | 2052 | 0.6214 | 0.8681 |
0.0074 | 28.0 | 2128 | 0.6445 | 0.8664 |
0.0116 | 29.0 | 2204 | 0.6664 | 0.8614 |
0.0003 | 30.0 | 2280 | 0.6454 | 0.8648 |
0.006 | 31.0 | 2356 | 0.6504 | 0.8681 |
0.0144 | 32.0 | 2432 | 0.6501 | 0.8698 |
0.0003 | 33.0 | 2508 | 0.6602 | 0.8698 |
0.0003 | 34.0 | 2584 | 0.6626 | 0.8648 |
0.0124 | 35.0 | 2660 | 0.6658 | 0.8698 |
0.0103 | 36.0 | 2736 | 0.6772 | 0.8698 |
0.0136 | 37.0 | 2812 | 0.6878 | 0.8648 |
0.0002 | 38.0 | 2888 | 0.6900 | 0.8681 |
0.0083 | 39.0 | 2964 | 0.6827 | 0.8631 |
0.0002 | 40.0 | 3040 | 0.6875 | 0.8698 |
0.0045 | 41.0 | 3116 | 0.6912 | 0.8664 |
0.0002 | 42.0 | 3192 | 0.6876 | 0.8614 |
0.0047 | 43.0 | 3268 | 0.6912 | 0.8631 |
0.0002 | 44.0 | 3344 | 0.7110 | 0.8598 |
0.0002 | 45.0 | 3420 | 0.6957 | 0.8648 |
0.0002 | 46.0 | 3496 | 0.6969 | 0.8648 |
0.0002 | 47.0 | 3572 | 0.6980 | 0.8648 |
0.0039 | 48.0 | 3648 | 0.6956 | 0.8681 |
0.0002 | 49.0 | 3724 | 0.6983 | 0.8648 |
0.0002 | 50.0 | 3800 | 0.6979 | 0.8648 |
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0