Logistic Regression Part 2 | Perceptron Trick Code

This is the second part of Logistic Regression. Follow the video until the end to understand the concept in detail.

============================
Do you want to learn from me?
============================

📱 Grow with us:
Comments

Bro, your channel is nothing less than a diamond.

xijinping

My eyes are filled with tears after watching this video. I thought I was too dumb to learn ML, and I almost wasted three weeks trying to understand it. But this helped me so much! Thank you, dada.

jaypople

I swear I have never seen a channel like this before.
Nitish sir, you are a gem.

UmerFarooq-zvky

Huge respect, Nitish sir. It's so simplified.

divyab

One more problem with high-dimensional data:
"Before coding, we have to choose the 1 and 0 classes in the data wisely."
In the example you took two-dimensional data, so you were able to plot it and see which class has to be assigned 1 and which 0, because one class ends up on the positive side of the line and the other on the negative side.
But with high-dimensional data, how will you decide which class to assign 1 and which to assign 0, since you can't plot and visualise it?

BTCSashishgupta

Hello sir, I have a query about how we plot a line using np.linspace(-3, 3, 100).

mukeshsirvi
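On the np.linspace question above: np.linspace(-3, 3, 100) just builds 100 evenly spaced x-values between -3 and 3; the matching y-values come from the line equation, and plotting the pairs traces the line. A minimal sketch, assuming a hypothetical slope m and intercept b already recovered from the trained weights:

```python
import numpy as np

# hypothetical slope and intercept, e.g. recovered from trained weights
m, b = -1.5, 0.5

# 100 evenly spaced x-values from -3 to 3
x_vals = np.linspace(-3, 3, 100)
y_vals = m * x_vals + b      # matching y-value for each x on the line

# passing these pairs to matplotlib traces the line:
# import matplotlib.pyplot as plt
# plt.plot(x_vals, y_vals)
print(x_vals[0], x_vals[-1], len(x_vals))
```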

Sir, please make a video on undersampling and oversampling.

humerashaikh

What will be the values of m and b when there are more than 2 features (higher-dimensional data)? As we'll get more than 2 coefficients, how will we calculate the value of "m" for the hyperplane?

uditpandey
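On the question above: for the two-feature case the slope and intercept come from rearranging the boundary equation. With weights [w0, w1, w2] (w0 the bias), w1*x1 + w2*x2 + w0 = 0 gives x2 = -(w1/w2)*x1 - w0/w2, so m = -w1/w2 and b = -w0/w2. With more than two features there is no single slope m; the boundary is the hyperplane w·x + w0 = 0 and can't be drawn as one line. A sketch with made-up weights:

```python
# hypothetical trained weights: bias first, then one weight per feature;
# the decision boundary is w1*x1 + w2*x2 + w0 = 0
w0, w1, w2 = -8.0, 4.0, 3.0

# rearranged as x2 = m*x1 + b
m = -w1 / w2    # slope of the 2D decision boundary
b = -w0 / w2    # intercept
print(m, b)
```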

Excellent explanation, but I have a doubt.

I have the line equation 3x + 2y - 9 = 0 and a point (1, 1) which is in the negative region of the line. I want to bring this negative point to the positive side, so I apply the transformation logic explained in the video:

(3+1)x + (2+1)y + (-9+1) = 0

which makes the transformed line as:

4x + 3y - 8 = 0

However, when I checked, the point (1, 1) is still in the negative region of the transformed line.

Can anyone explain, please?

DharmendraKumar-DS
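On the doubt above: a single update is not guaranteed to flip the point; the trick only moves the line toward the point, and the update is repeated until the point lands on the correct side. A quick check with the numbers from the comment, assuming the unscaled update from the video, with the point (1, 1) augmented to (1, 1, 1) for the constant term:

```python
import numpy as np

w = np.array([3.0, 2.0, -9.0])   # the line 3x + 2y - 9 = 0 as coefficients (a, b, c)
p = np.array([1.0, 1.0, 1.0])    # the point (1, 1), augmented with 1 for the constant term

updates = 0
while np.dot(w, p) <= 0:         # point still on the negative side
    w = w + p                    # perceptron-trick update: pull the line toward the point
    updates += 1

# after one update the line is 4x + 3y - 8 = 0 and (1, 1) is still negative;
# after a second update it is 5x + 4y - 7 = 0 and (1, 1) becomes positive
print(updates, w)
```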

Sir, is there any possibility of getting this OneNote in PDF format? Please sir, it would be really helpful.

KamaleshgowdaOfficial

At 6:54 you say you returned W[0] (the intercept term), but it's not the intercept term; and if it is, then why are you finding it again, sir?

hadibuxmahessar

Sir, you teach the best and explain the best, but the problem is that I don't understand where to apply what I've learned. You say to pick up any Kaggle dataset and look at it, but I can't find a dataset for the same problem.

tanmayshinde

But how did we calculate x for the final line plot? We only got the slope and intercept.

acceleratedofficial

JavaScript error: No module named IPython.

I tried everything possible (installing, importing, etc.), but it's still the same issue.

amansah

How will the model perform if we have an outlier?

bhushanbowlekar

Hi,
In the code, under the function def perceptron(X, y):

there are 1000 epochs. In each epoch we should traverse every data point, but in the code every epoch only picks one random data point and updates the equation using that single point.
So, according to me, since there are 100 data points, each epoch should use all 100 of them to update the coefficients. Is my understanding right, or have I missed something?
Please let me know.

Thanks

muditmishra
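On the epochs question above: in the video's code the 1000 iterations are not full epochs; each iteration picks a single random point (stochastic updates), so some points may be visited many times and others rarely. A variant where one epoch really does traverse every data point could look like this sketch (my own rewrite, not the video's exact code; the toy data is made up):

```python
import numpy as np

def step(z):
    return 1 if z > 0 else 0

def perceptron_epochs(X, y, epochs=100, lr=0.1):
    # one epoch = one full pass over every data point,
    # unlike one random point per iteration
    X = np.insert(X, 0, 1, axis=1)          # prepend a column of 1s for the bias
    weights = np.ones(X.shape[1])
    for _ in range(epochs):
        for j in range(X.shape[0]):         # visit all points each epoch
            y_hat = step(np.dot(X[j], weights))
            weights = weights + lr * (y[j] - y_hat) * X[j]
    return weights[0], weights[1:]

# tiny, linearly separable toy data (hypothetical)
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([0, 0, 1, 1])
b, w = perceptron_epochs(X, y)
print(b, w)
```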

I am getting the error that the name 'step' is not defined. Please help me, sir.

def step(z):
    # this helper was missing, hence the NameError
    return 1 if z > 0 else 0

def perceptron(X, y):
    X = np.insert(X, 0, 1, axis=1)
    weights = np.ones(X.shape[1])
    lr = 0.1
    for i in range(1000):
        j = np.random.randint(0, 100)
        y_hat = step(np.dot(X[j], weights))
        weights = weights + lr * (y[j] - y_hat) * X[j]  # the update step was missing here
    return weights[0], weights[1:]

vinilreddy

I cannot understand the code there; what is happening is totally beyond my head.
Can anyone tell me where I am going wrong? Am I not meant for machine learning?
I easily understand the theoretical part, but the code part is too tough for me.

subhanjalpant

NameError: name 'step' is not defined

for i in range(1000):
    j = np.random.randint(0, 100)
    y_hat = step(np.dot(X[j], weights))

vinilreddy