Model Card for zakariajaadi/distilbert-base-uncased-imdb

  • This model is distilbert-base-uncased fine-tuned for sequence classification on the IMDB dataset using LoRA (Low-Rank Adaptation), which keeps fine-tuning efficient by reducing the number of trainable parameters.
  • The model was fine-tuned for 5 epochs with:
    • batch size: 16
    • learning rate: 5e-05
    • maximum sequence length: 256
  • The best validation accuracy achieved was 89.7%.
  • The evaluation set was created by sampling 15% of the training data.
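
How to use

A minimal inference sketch using the transformers library (this assumes the LoRA weights were merged into the base model before being pushed to the Hub; the printed label names come from the checkpoint's id2label config):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "zakariajaadi/distilbert-base-uncased-imdb"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

review = "This movie was an absolute delight from start to finish."
# Truncate to the same maximum sequence length used during fine-tuning.
inputs = tokenizer(review, truncation=True, max_length=256, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```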

Model settings and hyperparameters

  • max_length=256
  • batch_size=16
  • num_epochs=5
  • lr=5e-5
  • warmup_ratio=0.1
  • weight_decay=0.1
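
These values map onto Hugging Face TrainingArguments roughly as in the sketch below (the output directory name is illustrative; the actual training script is not included in this card):

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters above; the real training
# script may differ in evaluation/saving strategy and other details.
# max_length=256 is applied at tokenization time, not here.
training_args = TrainingArguments(
    output_dir="distilbert-imdb-lora",  # hypothetical output directory
    per_device_train_batch_size=16,
    num_train_epochs=5,
    learning_rate=5e-5,
    warmup_ratio=0.1,
    weight_decay=0.1,
)
```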

LoRA settings

  • r=4
  • lora_alpha=32
  • lora_dropout=0.01
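
A minimal sketch of the corresponding peft configuration; note that target_modules is an assumption (DistilBERT's attention query/value projections) and is not specified in this card:

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSequenceClassification

base_model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# LoRA settings from this card; target_modules is an assumption
# (DistilBERT's attention query/value projections), not stated above.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=4,
    lora_alpha=32,
    lora_dropout=0.01,
    target_modules=["q_lin", "v_lin"],
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # shows how few parameters LoRA trains
```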

πŸ“Œ GitHub profile

For more projects and code, check out my GitHub: zakariajaadi
