Logistic Regression Part 1 | Perceptron Trick

This is the first part of the Logistic Regression series. We'll kick things off by introducing the Perceptron Trick, a foundational concept that sets the stage for understanding how Logistic Regression works.
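For readers who want to see the idea in code first, here is a minimal Python sketch of the perceptron trick, assuming 2D points X with 0/1 labels y and the standard simplified update w_new = w_old + lr * (y - y_hat) * x. The names perceptron_trick and step are illustrative, not from the video.

import numpy as np

def step(z):
    # Step activation: 1 if the point falls on the positive side of the line
    return 1 if z >= 0 else 0

def perceptron_trick(X, y, lr=0.1, epochs=1000):
    # Fold the intercept into the weights by prepending a column of ones
    X = np.insert(X, 0, 1, axis=1)
    w = np.ones(X.shape[1])  # initial guess for the separating line
    for _ in range(epochs):
        i = np.random.randint(0, X.shape[0])  # pick a random training point
        y_hat = step(np.dot(w, X[i]))         # current prediction (0 or 1)
        w = w + lr * (y[i] - y_hat) * X[i]    # moves the line only when misclassified
    return w  # [c, a, b] for the boundary ax + by + c = 0

The update is a no-op when y equals y_hat, so the line shifts only in response to misclassified points, which is exactly the behaviour the lecture builds up step by step.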

⌚Time Stamps⌚

00:00 - Intro
00:42 - What is Logistic Regression
02:36 - Prerequisite for Logistic Regression
05:07 - Perceptron Trick
14:56 - How to label regions?
18:15 - Transformations
30:44 - Algorithm
41:43 - Simplified Algorithm
Comments

Nitish, I am a software architect with 17 years of experience. I am trying to make a transition into Data Science and am doing a Data Science course from one of the startups, but I must say that, content-wise, your free content is far superior to what I am doing. As I am very strong in both maths and statistics, I know the maths behind almost all the ML and neural network algorithms. Your explanation is absolutely spot on. I wish you all the best!!!

saptarshipeter

Thank God I found your channel, and thank God I didn't ignore the Hindi subject as a child, being from the south.

krishnakanthmacherla

Hello sir, I am pursuing my master's in Data Science here in the US. Your tutorials are undoubtedly more instructive and helpful than a master's degree, in my opinion. Please keep sharing your knowledge, and please continue with the deep learning tutorials. I eagerly wait for you to upload a video to the deep learning playlist.

sowmyachinthapally

Can't explain how good this is... Should be marked as THE lecture to go through before you search the internet....

aniket

This is the real Logistic Regression. Thanks a Lot Sir.

rambaldotra

Literally, you are my inspiration for DSA. I recently solved my 1000th LeetCode problem, within four months.

manujkumarjoshi

Excellent teaching!! Grateful to you for bringing this program to those of us who come from entirely different backgrounds and are in transition from other fields to Data Science. Hats off to you, Nitish Sir!!

laxmimansukhani

You, sir, are an awesome educator.
I'll never fail to do my part to help your channel grow... it's gold.

Also, thank you for going through the trouble of visualising the graphs on Desmos. It really helped.

rohitekka

You really explained it in a very easy way...!! I am loving this playlist.

youganjansarki

Nitish, you are the best teacher. Your way of explanation is very engaging. Thanks for this kind of content.

anupambayen

You are the best teacher for machine learning and deep learning.

visheshmp

❤❤❤❤ Love for DEEP AND MACHINE LEARNING

duniya-wale

No, I am not sad that you have less fame than others; I am sad because I wish I had found your channel earlier. You were never in my recommended channels. Please increase your reach, not for fame, but because many people need you.

ndbweurt

Until now I thought Krish Naik was the best; now I know this man is the best of them all, he has outclassed everyone. Hats off to your effort, respected sir.

sahiltiwari

At 22:44 the line does not match the equation. The equation is 2x + 3y + 5 = 0, which implies that the slope and the y-intercept are both negative, but the line drawn has a positive y-intercept.

MohammedMusab-krvo
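A quick check of the algebra in the comment above, written out step by step:

2x + 3y + 5 = 0
3y = -2x - 5
y = -(2/3)x - 5/3

So the slope is -2/3 and the y-intercept is -5/3, both negative, which supports the comment's point.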

Thank you, Sir. Completed on 17th September 2024, 8:19 AM.

hasanrants

Such a genius I found on the internet🤩

abhishekbhardwaj

At 28:00: understood, sir, but what is the reasoning behind this technique of transforming the line?! I mean, how is the technique even working?

bharathchandrapanasa
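One way to see why the transformation works, algebraically: suppose a point p (with the constant 1 prepended, as in the sketch above) is actually positive (label 1) but the current line predicts 0, i.e. w · p < 0. After the update w_new = w + lr * p,

w_new · p = w · p + lr * (p · p) = w · p + lr * ||p||^2

Since ||p||^2 > 0, the dot product strictly increases, so each update moves the line toward classifying p correctly; the case of a misclassified negative point is symmetric, with subtraction decreasing the dot product.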

Hi Sir, first of all, thank you for your efforts.
I have one question: you said the data must be linearly separable, but for a high-dimensional dataset, how can we check whether the data is linearly separable or not?

jimmysandhu

I have a confusion: in ax + by + c = 0, is y what we calculate using y = mx + c, or is it itself an input feature?

DharmendraKumar-DS
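On the general-form question in the last comment, a short note from standard coordinate geometry (not specific to this video): the two forms describe the same line whenever b ≠ 0.

ax + by + c = 0
by = -ax - c
y = -(a/b)x - c/b, so the slope is m = -a/b and the y-intercept is -c/b

In the 2D classification setting, x and y are both input features (the two axes of the scatter plot), and ax + by + c = 0 is the decision boundary rather than a prediction formula.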