L8.5 Logistic Regression in PyTorch -- Code Example

-------

This video is part of my Introduction to Deep Learning course.

-------

Comments

Thank you for this course, it's really good. May I ask something I still can't figure out? If we train a logistic regression model and add L2 regularization to the binary cross-entropy loss, i.e. bin_loss + c * sum(W_i**2), where W_i is the weight for x[i], should the L2 term include the bias coefficient (b in these videos) or not?
And with L2 regularization, the partial derivatives 2*c*W_i should also be added to the already-computed gradient of the binary loss, giving dLoss/dW_i + 2*c*W_i, right? So if we don't include the bias coefficient in the L2 term, nothing is added to dLoss/db, unlike the other dLoss/dW_i. Did I get that right?

СтрингерБелл
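
Regarding the L2 question above: below is a minimal, illustrative sketch (not code from the video) of logistic regression trained with binary cross entropy plus an L2 penalty in PyTorch, where the penalty is applied to the weights only and the bias is left out. The toy data, the penalty strength c, and the learning rate are assumptions made up for this example.

```python
import torch
import torch.nn.functional as F

# Hypothetical toy data: 100 samples, 2 features, binary labels
torch.manual_seed(0)
X = torch.randn(100, 2)
y = (X[:, 0] + X[:, 1] > 0).float()

# Logistic regression parameters: weight vector w and bias b
w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

c = 0.01   # L2 strength (illustrative value)
lr = 0.1   # learning rate (illustrative value)

for epoch in range(100):
    logits = X @ w + b
    # Binary cross entropy plus L2 penalty on the weights only;
    # the bias b is deliberately excluded from the penalty term.
    loss = F.binary_cross_entropy_with_logits(logits, y) + c * (w ** 2).sum()

    loss.backward()  # autograd adds 2*c*w to dLoss/dw;
                     # dLoss/db gets no extra term because b is not penalized
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
```

A similar effect can be obtained with the weight_decay argument of the torch.optim optimizers, which applies an L2-style penalty to every parameter in a parameter group; to keep the bias out of it, the bias is typically placed in a separate parameter group with weight_decay=0.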