helldivers2-jarvis-asrV4

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 25.6691
  • Wer: 0.17
  • Cer: 0.7603
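
Wer and Cer here are word and character error rates: the token-level edit (Levenshtein) distance between reference and hypothesis, divided by the reference length. A minimal sketch of the computation (illustrative only, not the exact evaluation code used for this card):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two token sequences (words or characters)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            # deletion, insertion, substitution
            cur[j] = min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + cost)
        prev = cur
    return prev[n]

def wer(ref, hyp):
    # word error rate: edit distance over word tokens / reference word count
    ref_w, hyp_w = ref.split(), hyp.split()
    return edit_distance(ref_w, hyp_w) / len(ref_w)

def cer(ref, hyp):
    # character error rate: edit distance over characters / reference length
    return edit_distance(list(ref), list(hyp)) / len(ref)
```

For example, `wer("a b c", "a x c")` and `cer("abc", "axc")` both give 1/3 (one substitution against a reference of length three).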

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 12
  • eval_batch_size: 12
  • seed: 42
  • optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 60
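
With a linear scheduler and 50 warmup steps over 60 epochs of 50 steps each (3000 total steps, per the results table below), the learning rate ramps from 0 to 1e-05 during the first 50 steps and then decays linearly back to 0. A sketch of that schedule, mirroring the behavior of transformers' `get_linear_schedule_with_warmup` (the 3000-step total is read off the table, not stated explicitly in the config):

```python
def linear_lr_with_warmup(step, base_lr=1e-05, warmup_steps=50, total_steps=3000):
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # warmup ramp
    # decay from base_lr at warmup_steps down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

At step 25 the rate is 5e-06, it peaks at 1e-05 at step 50, and reaches 0 at step 3000.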

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 507.0049 | 1.0 | 50 | 245.1068 | 0.3743 | 0.7814 |
| 334.0895 | 2.0 | 100 | 161.0169 | 0.3129 | 0.7744 |
| 279.7883 | 3.0 | 150 | 130.1554 | 0.2886 | 0.7716 |
| 244.374 | 4.0 | 200 | 104.5179 | 0.2714 | 0.7701 |
| 211.0537 | 5.0 | 250 | 94.9101 | 0.2471 | 0.7683 |
| 214.1148 | 6.0 | 300 | 86.9921 | 0.2514 | 0.7681 |
| 192.2161 | 7.0 | 350 | 75.8522 | 0.2357 | 0.7663 |
| 173.9121 | 8.0 | 400 | 75.7438 | 0.2243 | 0.7658 |
| 166.7699 | 9.0 | 450 | 68.1917 | 0.2243 | 0.7653 |
| 157.0137 | 10.0 | 500 | 59.1111 | 0.2171 | 0.7647 |
| 166.9989 | 11.0 | 550 | 50.2394 | 0.2143 | 0.7643 |
| 147.1322 | 12.0 | 600 | 45.1821 | 0.2171 | 0.7639 |
| 142.869 | 13.0 | 650 | 43.5846 | 0.2143 | 0.7641 |
| 131.6637 | 14.0 | 700 | 43.0457 | 0.2071 | 0.7635 |
| 126.8534 | 15.0 | 750 | 45.8238 | 0.21 | 0.7635 |
| 121.8376 | 16.0 | 800 | 36.9011 | 0.2029 | 0.7630 |
| 123.6633 | 17.0 | 850 | 38.0945 | 0.2043 | 0.7630 |
| 104.7618 | 18.0 | 900 | 37.8898 | 0.2014 | 0.7626 |
| 111.8574 | 19.0 | 950 | 32.1545 | 0.1986 | 0.7619 |
| 105.2309 | 20.0 | 1000 | 34.3187 | 0.1943 | 0.7617 |
| 99.6247 | 21.0 | 1050 | 32.9372 | 0.1843 | 0.7613 |
| 92.9916 | 22.0 | 1100 | 31.1413 | 0.1871 | 0.7613 |
| 102.2719 | 23.0 | 1150 | 28.5850 | 0.1814 | 0.7610 |
| 93.6705 | 24.0 | 1200 | 31.5977 | 0.1814 | 0.7611 |
| 96.9849 | 25.0 | 1250 | 34.8815 | 0.1843 | 0.7615 |
| 105.9482 | 26.0 | 1300 | 28.3215 | 0.1757 | 0.7608 |
| 94.6159 | 27.0 | 1350 | 26.6447 | 0.1771 | 0.7606 |
| 83.7973 | 28.0 | 1400 | 27.4755 | 0.1814 | 0.7611 |
| 85.906 | 29.0 | 1450 | 23.9488 | 0.1757 | 0.7605 |
| 87.2979 | 30.0 | 1500 | 23.0084 | 0.1771 | 0.7605 |
| 77.1649 | 31.0 | 1550 | 22.0342 | 0.17 | 0.7603 |
| 76.6646 | 32.0 | 1600 | 22.6523 | 0.1729 | 0.7604 |
| 84.5785 | 33.0 | 1650 | 26.1900 | 0.1743 | 0.7604 |
| 87.0745 | 34.0 | 1700 | 24.5733 | 0.1729 | 0.7603 |
| 80.8229 | 35.0 | 1750 | 26.2349 | 0.1743 | 0.7605 |
| 79.2393 | 36.0 | 1800 | 24.3964 | 0.1714 | 0.7603 |
| 74.8672 | 37.0 | 1850 | 23.2604 | 0.1671 | 0.7602 |
| 80.8669 | 38.0 | 1900 | 19.7348 | 0.17 | 0.7602 |
| 77.9821 | 39.0 | 1950 | 21.5322 | 0.1686 | 0.7604 |
| 70.6975 | 40.0 | 2000 | 26.5959 | 0.17 | 0.7603 |
| 73.8925 | 41.0 | 2050 | 24.7372 | 0.17 | 0.7602 |
| 82.1079 | 42.0 | 2100 | 24.7751 | 0.1686 | 0.7602 |
| 71.1623 | 43.0 | 2150 | 25.9074 | 0.1729 | 0.7604 |
| 82.281 | 44.0 | 2200 | 25.5285 | 0.1714 | 0.7604 |
| 69.5761 | 45.0 | 2250 | 24.7130 | 0.1714 | 0.7603 |
| 80.4887 | 46.0 | 2300 | 25.2155 | 0.1686 | 0.7602 |
| 74.3038 | 47.0 | 2350 | 24.2761 | 0.1671 | 0.7601 |
| 69.2056 | 48.0 | 2400 | 24.0605 | 0.1671 | 0.7601 |
| 67.871 | 49.0 | 2450 | 25.4337 | 0.17 | 0.7602 |
| 73.1627 | 50.0 | 2500 | 24.6985 | 0.1671 | 0.7601 |
| 67.3236 | 51.0 | 2550 | 27.8274 | 0.1686 | 0.7603 |
| 66.3366 | 52.0 | 2600 | 24.6496 | 0.17 | 0.7602 |
| 76.2498 | 53.0 | 2650 | 23.5383 | 0.1714 | 0.7603 |
| 76.629 | 54.0 | 2700 | 24.6941 | 0.1671 | 0.7601 |
| 64.3516 | 55.0 | 2750 | 24.4289 | 0.1686 | 0.7602 |
| 64.3968 | 56.0 | 2800 | 24.8174 | 0.1686 | 0.7602 |
| 73.7818 | 57.0 | 2850 | 26.2579 | 0.1686 | 0.7602 |
| 74.5849 | 58.0 | 2900 | 26.1643 | 0.1671 | 0.7602 |
| 65.7819 | 59.0 | 2950 | 25.5072 | 0.1671 | 0.7602 |
| 73.0206 | 60.0 | 3000 | 25.6691 | 0.17 | 0.7603 |
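
WER plateaus around 0.167 from roughly epoch 37 onward while validation loss fluctuates, so the final checkpoint (epoch 60, WER 0.17) is not the best by error rate. One simple selection criterion, sketched with a few rows from the table above (the (WER, CER) tie-break is an illustrative assumption, not part of the original training script):

```python
# (epoch, validation_loss, wer, cer) rows copied from the table above
rows = [
    (37, 23.2604, 0.1671, 0.7602),
    (47, 24.2761, 0.1671, 0.7601),
    (60, 25.6691, 0.1700, 0.7603),
]

# pick the checkpoint with the lowest WER, breaking ties on CER
best = min(rows, key=lambda r: (r[2], r[3]))
# → epoch 47: it ties epoch 37 on WER but has a lower CER
```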

Framework versions

  • Transformers 4.51.3
  • Pytorch 2.4.1+cu118
  • Datasets 3.5.1
  • Tokenizers 0.21.1

Model size: 94.4M parameters (F32, Safetensors)

Model tree: 8688chris/helldivers2-jarvis-asrV4, finetuned from facebook/wav2vec2-base-960h