Description

The task was to fine-tune the base model using a hand-written DoRA implementation.

Details

All layers were frozen except the LoRA layers.
LoRA parameters: r=8, alpha=16, target_submodules=["k_proj", "v_proj"].
The most successful hyperparameters: BATCH_SIZE = 8, LEARNING_RATE = 3e-3, NUM_EPOCHS = 1.
Only 10k examples from the training set were used.
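For reference, below is a minimal sketch of what such a hand-written DoRA linear layer could look like in PyTorch. DoRA decomposes the adapted weight into a trainable magnitude vector and a direction term (the frozen pretrained weight plus the scaled low-rank LoRA update, renormalized row-wise). The class name and the wrapping loop at the bottom are illustrative assumptions, not the exact course code; the `model.model.layers` / `self_attn` module path assumes a LLaMA-style layout.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoRALinear(nn.Module):
    """Sketch of a DoRA adapter wrapped around a frozen nn.Linear."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze everything except the adapter
        out_features, in_features = base.weight.shape
        self.scaling = alpha / r
        # Low-rank update: B @ A starts at zero, so training begins at W0.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        # Trainable magnitude, initialized to the row norms of W0, so the
        # initial forward pass is identical to the base layer.
        self.magnitude = nn.Parameter(
            base.weight.norm(p=2, dim=1, keepdim=True).detach().clone()
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Direction = (W0 + (alpha/r) * B @ A), normalized per output row,
        # then rescaled by the learned magnitude (recomputed every call).
        w = self.base.weight + self.scaling * (self.lora_B @ self.lora_A)
        w = self.magnitude * (w / w.norm(p=2, dim=1, keepdim=True))
        return F.linear(x, w, self.base.bias)

# Wrapping only the target submodules (k_proj, v_proj), assuming a
# LLaMA-style module path; adjust for the actual base architecture.
# for block in model.model.layers:
#     block.self_attn.k_proj = DoRALinear(block.self_attn.k_proj, r=8, alpha=16)
#     block.self_attn.v_proj = DoRALinear(block.self_attn.v_proj, r=8, alpha=16)
```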

Metrics

We reached Validation F1: 0.29556541198212705. Test-set results are shown in the figure below.

[Figure: test-set metrics]

Generation examples

"QT @user In the original draft of the 7th book, Remus Lupin survived the Battle of Hogwarts. #HappyBirthdayRemusLupin"
positive
neutralizing 
neutral 
neutral 

==========
"Ben Smith / Smith (concussion) remains out of the lineup Thursday, Curtis #NHL #SJ"
neutral
neutral
neutral
neutral
neutral},
neutral {
==========
Sorry bout the stream last night I crashed out but will be on tonight for sure. Then back to Minecraft in pc tomorrow night.
neutral
neutral
neutral {positive
neutral {
neutral.</
==========
Chase Headley's RBI double in the 8th inning off David Price snapped a Yankees streak of 33 consecutive scoreless innings against Blue Jays
neutral
neutral
neutral
neutral
neutralization 
neut
==========
@user Alciato: Bee will invest 150 million in January, another 200 in the Summer and plans to bring Messi by 2017"
positive
neutral
neutral  
neutral  
neutral}, 
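The repeated, sometimes malformed labels above come from drawing several samples per tweet. Below is a hedged sketch of how such samples might be reproduced, assuming the published checkpoint loads as an ordinary causal LM via transformers; the bare-tweet prompt format and the generation settings are assumptions, not the exact course setup.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("CMCenjoyer/llm-course-hw3-dora")
model = AutoModelForCausalLM.from_pretrained("CMCenjoyer/llm-course-hw3-dora")

tweet = "Ben Smith / Smith (concussion) remains out of the lineup Thursday, Curtis #NHL #SJ"
inputs = tok(tweet, return_tensors="pt")
out = model.generate(
    **inputs,
    max_new_tokens=4,        # the expected output is a single short label
    do_sample=True,          # sampling explains the varied labels per tweet
    num_return_sequences=5,
)
prompt_len = inputs["input_ids"].shape[1]
for seq in out:
    # Print only the newly generated label tokens, one line per sample.
    print(tok.decode(seq[prompt_len:], skip_special_tokens=True))
```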