13.4.1 Recursive Feature Elimination (L13: Feature Selection)

In this video, we start our discussion of wrapper methods for feature selection. In particular, we cover Recursive Feature Elimination (RFE) and see how we can use it in scikit-learn to select features based on linear model coefficients.
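Below is a minimal sketch of this workflow, assuming scikit-learn's RFE wrapper around a logistic regression estimator; the Wine dataset, the number of selected features, and the other parameter values are illustrative choices and are not taken from the video.

# Minimal RFE sketch (illustrative values, not from the video)
from sklearn.datasets import load_wine
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=123, stratify=y)

# RFE repeatedly refits the estimator and drops the feature with the
# smallest absolute coefficient until n_features_to_select remain.
selector = RFE(
    estimator=LogisticRegression(max_iter=1000),
    n_features_to_select=5,   # assumed value for illustration
    step=1,                   # remove one feature per iteration
)

# Standardize features so coefficient magnitudes are comparable.
pipe = make_pipeline(StandardScaler(), selector)
pipe.fit(X_train, y_train)

print("Selected feature mask:", selector.support_)
print("Feature ranking (1 = selected):", selector.ranking_)
print("Test accuracy:", pipe.score(X_test, y_test))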
Logistic regression lectures:
L8.0 Logistic Regression – Lecture Overview (06:28)
L8.1 Logistic Regression as a Single-Layer Neural Network (09:15)
L8.2 Logistic Regression Loss Function (12:57)
L8.3 Logistic Regression Loss Derivative and Training (19:57)
L8.4 Logits and Cross Entropy (06:47)
L8.5 Logistic Regression in PyTorch – Code Example (19:02)
L8.6 Multinomial Logistic Regression / Softmax Regression (17:31)
L8.7.1 OneHot Encoding and Multi-category Cross Entropy (15:34)
L8.7.2 OneHot Encoding and Multi-category Cross Entropy Code Example (15:04)
L8.8 Softmax Regression Derivatives for Gradient Descent (19:38)
L8.9 Softmax Regression Code Example Using PyTorch (25:39)
-------
This video is part of my Introduction to Machine Learning course.
-------