# Model Card for zakariajaadi/distilbert-base-uncased-imdb
- This model is a distilbert-base-uncased checkpoint fine-tuned for sequence classification on the IMDB dataset using LoRA (Low-Rank Adaptation), which reduces the number of trainable parameters for efficient fine-tuning.
- The model was fine-tuned for 5 epochs with:
  - batch size: 16
  - learning rate: 5e-05
  - maximum sequence length: 256
- The best validation accuracy achieved was 89.7%.
- The evaluation set was created by sampling 15% of the training data.
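To illustrate the parameter savings LoRA provides at the rank used here (r=4), a back-of-the-envelope count for a single 768x768 DistilBERT weight matrix, assuming LoRA's standard two-matrix decomposition (the exact layers adapted are an assumption, since the card does not list them):

```python
# Rough parameter count for LoRA at rank r=4 on one DistilBERT
# weight matrix (hidden size 768). Illustrative arithmetic only.
d = 768                     # DistilBERT hidden size
r = 4                       # LoRA rank used in this fine-tune
full_params = d * d         # frozen dense weight: 589,824 parameters
lora_params = r * (d + d)   # adapter matrices A (r x d) and B (d x r): 6,144
print(f"trainable fraction per adapted matrix: {lora_params / full_params:.2%}")
# -> trainable fraction per adapted matrix: 1.04%
```

Only the small A and B matrices are trained; the original weight stays frozen, which is why LoRA fine-tuning fits comfortably in memory even at batch size 16.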
## Model settings and hyperparameters
- max_length = 256
- batch_size = 16
- num_epochs = 5
- lr = 5e-5
- warmup_ratio = 0.1
- weight_decay = 0.1
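The hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the author's published training script: `output_dir` is a placeholder, and `max_length` is applied at tokenization time rather than here.

```python
from transformers import TrainingArguments

# Sketch of the training configuration; values mirror the list above.
training_args = TrainingArguments(
    output_dir="distilbert-imdb-lora",  # placeholder path, not from the card
    num_train_epochs=5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    learning_rate=5e-5,
    warmup_ratio=0.1,
    weight_decay=0.1,
)
```

A `Trainer` built with these arguments, the LoRA-wrapped model, and tokenized IMDB splits would reproduce the setup described above.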
## LoRA settings
- r = 4
- lora_alpha = 32
- lora_dropout = 0.01
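These settings correspond to a peft `LoraConfig` along the following lines. Note that `target_modules` is an assumption: the card does not say which layers were adapted, and `q_lin`/`v_lin` are DistilBERT's attention projection names.

```python
from peft import LoraConfig, TaskType

# Sketch of the LoRA configuration matching the settings above.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,          # sequence classification
    r=4,
    lora_alpha=32,
    lora_dropout=0.01,
    target_modules=["q_lin", "v_lin"],   # assumption: not stated in the card
)
```

Wrapping the base classifier with `get_peft_model(base_model, lora_config)` would then freeze the original weights and train only the LoRA adapters.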
## GitHub profile
For more projects and code, check out my GitHub: zakariajaadi
## Base model
- distilbert/distilbert-base-uncased