# nb-sbert-base-edu-scorer-lr3e4-bs32
This model is a fine-tuned version of [NbAiLab/nb-sbert-base](https://huggingface.co/NbAiLab/nb-sbert-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.1391
- Precision: 0.4950
- Recall: 0.32
- F1 Macro: 0.3154
- Accuracy: 0.3455
## Model description
More information needed
## Intended uses & limitations
More information needed
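In the absence of an official usage example, the following is a minimal sketch. It assumes the checkpoint loads as a standard sequence-classification model that returns a single regression logit (an educational-value score on the 0–5 scale used in the test labels below); if the head instead outputs six class logits, the argmax branch applies. The example text is illustrative only.

```python
# Minimal usage sketch, not an official example. Assumes a single regression
# logit (0-5 educational score); falls back to argmax for a 6-class head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "versae/nb-sbert-base-edu-scorer-lr3e4-bs32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Fotosyntese er prosessen der planter omdanner lysenergi til kjemisk energi."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

if logits.shape[-1] == 1:
    score = logits.squeeze().item()          # regression head: raw 0-5 score
else:
    score = float(logits.argmax(dim=-1))     # classification head: top label

print(f"Educational score: {score:.2f}")
print("Educational (binary, threshold at 3):", round(score) >= 3)
```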
## Test results
Binary classification accuracy (threshold at label 3): 79.27%
Test Report:

| Label | Precision | Recall | F1-score | Support |
|---|---|---|---|---|
| 0 | 0.78 | 0.49 | 0.60 | 100 |
| 1 | 0.32 | 0.38 | 0.35 | 100 |
| 2 | 0.29 | 0.51 | 0.37 | 100 |
| 3 | 0.24 | 0.34 | 0.28 | 100 |
| 4 | 0.35 | 0.16 | 0.22 | 100 |
| 5 | 1.00 | 0.04 | 0.08 | 50 |
| accuracy | | | 0.35 | 550 |
| macro avg | 0.49 | 0.32 | 0.32 | 550 |
| weighted avg | 0.45 | 0.35 | 0.34 | 550 |
Confusion Matrix (rows: true label, columns: predicted label; row sums match the per-class support above):

| | 0 | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|---|
| 0 | 49 | 43 | 5 | 3 | 0 | 0 |
| 1 | 12 | 38 | 42 | 7 | 1 | 0 |
| 2 | 2 | 30 | 51 | 17 | 0 | 0 |
| 3 | 0 | 8 | 47 | 34 | 11 | 0 |
| 4 | 0 | 1 | 24 | 59 | 16 | 0 |
| 5 | 0 | 0 | 6 | 24 | 18 | 2 |
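The report and matrix follow scikit-learn's output conventions. Below is a sketch of how such figures could be reproduced, assuming `y_true` and `y_pred` are integer labels in the 0–5 range obtained by rounding and clipping the model's score on the test split; the variable names and placeholder values are illustrative, not taken from the original evaluation script.

```python
# Sketch of reproducing the test report, confusion matrix, and the binary
# accuracy at threshold 3. y_true / y_pred are placeholders, not real data.
import numpy as np
from sklearn.metrics import accuracy_score, classification_report, confusion_matrix

y_true = np.array([0, 1, 2, 3, 4, 5, 3, 2])  # placeholder gold labels
y_pred = np.array([0, 1, 2, 2, 3, 4, 3, 1])  # placeholder model predictions

print(classification_report(y_true, y_pred))
print(confusion_matrix(y_true, y_pred))      # rows = true, columns = predicted

# Binary accuracy with the threshold at label 3: a text counts as
# "educational" when its score is 3 or higher.
print(accuracy_score(y_true >= 3, y_pred >= 3))
```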
## Test metrics
- epoch = 20.0
- eval_accuracy = 0.3455
- eval_f1_macro = 0.3154
- eval_loss = 1.1391
- eval_precision = 0.495
- eval_recall = 0.32
- eval_runtime = 0:00:05.66
- eval_samples_per_second = 97.116
- eval_steps_per_second = 3.178
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 20
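For reference, a minimal sketch of how these settings would be expressed as `transformers` `TrainingArguments`. Only the values listed above are filled in; `output_dir` is a placeholder and the dataset/metrics plumbing is omitted, since the original training script is not published here.

```python
# Hedged sketch of the hyperparameters above as TrainingArguments.
# Everything not listed in this card is left at the library defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nb-sbert-base-edu-scorer-lr3e4-bs32",  # placeholder path
    learning_rate=3e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",              # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```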
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 Macro | Accuracy |
|---|---|---|---|---|---|---|---|
No log | 0 | 0 | 3.2995 | 0.0587 | 0.1667 | 0.0869 | 0.3524 |
0.7648 | 0.3368 | 1000 | 0.7304 | 0.4028 | 0.3358 | 0.3363 | 0.4918 |
0.7537 | 0.6736 | 2000 | 0.7005 | 0.4079 | 0.3483 | 0.3481 | 0.493 |
0.7174 | 1.0104 | 3000 | 0.6792 | 0.4157 | 0.3607 | 0.3625 | 0.5032 |
0.6713 | 1.3473 | 4000 | 0.6772 | 0.4212 | 0.3606 | 0.3630 | 0.484 |
0.6703 | 1.6841 | 5000 | 0.6570 | 0.4203 | 0.3585 | 0.3630 | 0.514 |
0.6936 | 2.0209 | 6000 | 0.6464 | 0.4116 | 0.3563 | 0.3603 | 0.5134 |
0.6942 | 2.3577 | 7000 | 0.6597 | 0.4005 | 0.3606 | 0.3627 | 0.5014 |
0.678 | 2.6945 | 8000 | 0.6517 | 0.4192 | 0.3652 | 0.3705 | 0.5244 |
0.6397 | 3.0313 | 9000 | 0.6397 | 0.4371 | 0.3669 | 0.3700 | 0.5126 |
0.6528 | 3.3681 | 10000 | 0.6725 | 0.4178 | 0.3699 | 0.3704 | 0.4856 |
0.6221 | 3.7050 | 11000 | 0.6370 | 0.4208 | 0.3672 | 0.3698 | 0.5108 |
0.5952 | 4.0418 | 12000 | 0.6464 | 0.4201 | 0.3629 | 0.3684 | 0.5248 |
0.614 | 4.3786 | 13000 | 0.6336 | 0.4247 | 0.3619 | 0.3667 | 0.5248 |
0.5978 | 4.7154 | 14000 | 0.6384 | 0.4205 | 0.3879 | 0.3903 | 0.5146 |
0.5992 | 5.0522 | 15000 | 0.6378 | 0.4238 | 0.3848 | 0.3886 | 0.516 |
0.61 | 5.3890 | 16000 | 0.6252 | 0.4302 | 0.3721 | 0.3764 | 0.5262 |
0.5936 | 5.7258 | 17000 | 0.6489 | 0.4754 | 0.4015 | 0.4092 | 0.517 |
0.5715 | 6.0626 | 18000 | 0.6327 | 0.4216 | 0.3769 | 0.3816 | 0.5168 |
0.5624 | 6.3995 | 19000 | 0.6425 | 0.4305 | 0.3812 | 0.3878 | 0.537 |
0.5979 | 6.7363 | 20000 | 0.6388 | 0.4243 | 0.3727 | 0.3759 | 0.5246 |
0.5284 | 7.0731 | 21000 | 0.6272 | 0.4234 | 0.3770 | 0.3814 | 0.5234 |
0.5926 | 7.4099 | 22000 | 0.6329 | 0.4978 | 0.3948 | 0.4108 | 0.531 |
0.5509 | 7.7467 | 23000 | 0.6361 | 0.5074 | 0.4001 | 0.4145 | 0.5198 |
0.5477 | 8.0835 | 24000 | 0.6281 | 0.4344 | 0.3776 | 0.3848 | 0.5284 |
0.5431 | 8.4203 | 25000 | 0.6586 | 0.4333 | 0.3568 | 0.3592 | 0.533 |
0.552 | 8.7572 | 26000 | 0.6311 | 0.5080 | 0.3937 | 0.4091 | 0.5242 |
0.5067 | 9.0940 | 27000 | 0.6317 | 0.4188 | 0.3794 | 0.3829 | 0.5194 |
0.5351 | 9.4308 | 28000 | 0.6339 | 0.4192 | 0.3782 | 0.3833 | 0.5254 |
0.5429 | 9.7676 | 29000 | 0.6277 | 0.4192 | 0.3811 | 0.3839 | 0.5226 |
0.5171 | 10.1044 | 30000 | 0.6314 | 0.5087 | 0.3889 | 0.4005 | 0.523 |
0.504 | 10.4412 | 31000 | 0.6608 | 0.4205 | 0.3807 | 0.3813 | 0.4998 |
0.5315 | 10.7780 | 32000 | 0.6389 | 0.4210 | 0.3767 | 0.3807 | 0.5198 |
0.5042 | 11.1149 | 33000 | 0.6375 | 0.4197 | 0.3795 | 0.3838 | 0.5258 |
0.5241 | 11.4517 | 34000 | 0.6423 | 0.4085 | 0.3783 | 0.3798 | 0.5168 |
0.5277 | 11.7885 | 35000 | 0.6428 | 0.4130 | 0.3795 | 0.3829 | 0.5282 |
0.505 | 12.1253 | 36000 | 0.6543 | 0.4182 | 0.3903 | 0.3905 | 0.5148 |
0.4923 | 12.4621 | 37000 | 0.6453 | 0.4181 | 0.3791 | 0.3832 | 0.5192 |
0.4689 | 12.7989 | 38000 | 0.6612 | 0.4317 | 0.4020 | 0.4042 | 0.5092 |
0.4658 | 13.1357 | 39000 | 0.6425 | 0.4131 | 0.3742 | 0.3781 | 0.522 |
0.4848 | 13.4725 | 40000 | 0.6549 | 0.4709 | 0.3844 | 0.3934 | 0.5064 |
0.4889 | 13.8094 | 41000 | 0.6459 | 0.4464 | 0.3893 | 0.3969 | 0.5198 |
0.4586 | 14.1462 | 42000 | 0.6515 | 0.4529 | 0.3927 | 0.4003 | 0.5218 |
0.4644 | 14.4830 | 43000 | 0.6429 | 0.4769 | 0.3825 | 0.3944 | 0.5258 |
0.4666 | 14.8198 | 44000 | 0.6565 | 0.4619 | 0.3963 | 0.4078 | 0.5182 |
0.4524 | 15.1566 | 45000 | 0.6487 | 0.4558 | 0.3882 | 0.3977 | 0.5222 |
0.4431 | 15.4934 | 46000 | 0.6475 | 0.4739 | 0.3851 | 0.3967 | 0.5266 |
0.4591 | 15.8302 | 47000 | 0.6521 | 0.4534 | 0.3862 | 0.3959 | 0.5244 |
0.4504 | 16.1671 | 48000 | 0.6543 | 0.4703 | 0.3817 | 0.3930 | 0.5234 |
0.4402 | 16.5039 | 49000 | 0.6601 | 0.4393 | 0.3975 | 0.4042 | 0.5132 |
0.4395 | 16.8407 | 50000 | 0.6556 | 0.4695 | 0.3849 | 0.3962 | 0.5238 |
0.4285 | 17.1775 | 51000 | 0.6577 | 0.4672 | 0.3852 | 0.3946 | 0.5178 |
0.4181 | 17.5143 | 52000 | 0.6540 | 0.4444 | 0.3854 | 0.3944 | 0.5226 |
0.4407 | 17.8511 | 53000 | 0.6544 | 0.4422 | 0.3868 | 0.3956 | 0.5262 |
0.3957 | 18.1879 | 54000 | 0.6594 | 0.4200 | 0.3868 | 0.3915 | 0.5182 |
0.4051 | 18.5248 | 55000 | 0.6572 | 0.4354 | 0.3861 | 0.3935 | 0.5192 |
0.424 | 18.8616 | 56000 | 0.6549 | 0.4476 | 0.3847 | 0.3941 | 0.5256 |
0.4271 | 19.1984 | 57000 | 0.6566 | 0.4471 | 0.3847 | 0.3936 | 0.5172 |
0.4225 | 19.5352 | 58000 | 0.6557 | 0.4473 | 0.3898 | 0.3984 | 0.5206 |
0.4312 | 19.8720 | 59000 | 0.6556 | 0.4468 | 0.3884 | 0.3973 | 0.5228 |
### Framework versions
- Transformers 4.53.2
- Pytorch 2.7.1+cu126
- Datasets 4.0.0
- Tokenizers 0.21.2