Support Vector Assertion - Practical Machine Learning Tutorial with Python p.22

In this tutorial, we cover the assertion for the calculation of a support vector within the Support Vector Machine.

Comments

Man, you're amazing, thank you!
PS: please never stop spreading knowledge.

EatShitCh

A big thank you here. The whole series is not just very comprehensive but pellucid as well.

zafarahmad

I am not sure if it is mentioned during the video, but the projection of u onto w is the dot product of u with w's unit vector, not with w itself:
u . w / ||w|| = ||u|| * cos(alpha) = projection of u onto w.
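
A quick numerical check of that formula, as a minimal NumPy sketch (the vectors here are made up purely for illustration):

import numpy as np

u = np.array([3.0, 4.0])
w = np.array([1.0, 2.0])

# Scalar projection of u onto w: dot u with w's *unit* vector.
proj = np.dot(u, w) / np.linalg.norm(w)

# Same value via ||u|| * cos(alpha), where alpha is the angle
# between u and w.
cos_alpha = np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w))
print(proj, np.linalg.norm(u) * cos_alpha)  # both ~4.9193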

chamberhorsefeeder

It is sad to see how the views go down with each video... I hope everyone reading this doesn't give up and becomes a successful machine learning engineer.

neillunavat

Love the fact that you explain the principles involved.

HilaryKansiime

How do we get the output of u . w + b as -1 or 1? Why not other numbers? Please explain.

rajivkumar

Wonderful video series! I'm learning a ton from it. I had a few questions related to this video.

At around 5:05, you introduce the formal notion of a "support vector" as X sub -sv and X sub +sv. What exactly do these variables denote? Are these the data points that are "closest" to the decision boundary (on either side)? Also, what happened to the inequality, i.e. <= 0 or >= 0? How do we know that the equations at 5:05 equate to -1 and +1?

Finally, I'm confused by Y sub i. Is this just a constant that we introduce for clarity? At around 8:30, why do we multiply the two equations by this value?

I know that's a lot... but any help would be appreciated. Thanks so much!

michaelwalczyk

I'm probably late, but I get confused at 2:55. When you project u onto w, you say u is on the + side of the hyperplane if u . w + b >= 0.

Couldn't you just check if u projected onto w > ||w||, or something?
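
For what it's worth, here is the rule as stated, sketched in Python with made-up values for w and b (assuming they have already been fit). Note that the boundary sits where the scalar projection of u onto w equals -b/||w||, which need not equal ||w||:

import numpy as np

def classify(u, w, b):
    # Decision rule from the video: u is on the + side
    # of the hyperplane when u . w + b >= 0.
    return 1 if np.dot(u, w) + b >= 0 else -1

w, b = np.array([1.0, 2.0]), -4.0             # made-up fitted values
print(classify(np.array([3.0, 4.0]), w, b))   # 3 + 8 - 4 = 7 >= 0 -> 1
print(classify(np.array([0.0, 1.0]), w, b))   # 0 + 2 - 4 = -2 < 0 -> -1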

thecactus

Could someone explain why it has to be -1 or 1? Like, what determined this value? (From 5:10 to 5:40.)

JohanSebastianCorn

y(x . w + b) - 1 = 0 is not the same equation for both; remember, y is positive for one class and negative for the other.
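
Spelled out with one made-up support vector on each side of the street (the numbers are chosen only so the constraints hold exactly):

import numpy as np

w, b = np.array([1.0, 1.0]), -3.0

x_pos, y_pos = np.array([2.0, 2.0]), +1   # x . w + b = +1
x_neg, y_neg = np.array([1.0, 1.0]), -1   # x . w + b = -1

# Multiplying by the class label y folds both constraints
# into the single form y * (x . w + b) - 1 = 0.
print(y_pos * (np.dot(x_pos, w) + b) - 1)   # 0.0
print(y_neg * (np.dot(x_neg, w) + b) - 1)   # 0.0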

bornofdata

"So first of all, what is, like, what's the, we know that eventually" xD, nice one.
Thanks for an amazing video. I couldn't understand the math, so I had to go search on my own for a couple of days, but still, thanks for the tutorials.

becauseiwanttoanime

Why do we need to multiply the class y with the expression x . w + b?

huojinchowdhury

Great video! You're really making machine learning accessible! I have a question on the value of b; could you explain its value a little more? When you say it is the bias, is that the spread between the two support vectors?

irishuserk

I don't get the part with X_-sv and X_+sv. What are those?

mohammednagdy

I'm pretty confused as to why you're not using the projection formula to determine the distance to the decision boundary.

Joe-cszk

I really love your videos. You explain things very well. Could you make some videos about other algorithms, like random forests and decision trees? :)

李云龙-cq

Could the magnitude on the RHS of these equations have been anything other than 1?

RishikavsAnnie

But why are these values 1 and -1? Is it just an example, and could they be 3 and 5 or anything else?

johnshepard

Did you ever explain what a support vector is? All the previous videos in this series helped me understand from scratch (though I learnt the math behind linear regression from Khan Academy), but this one makes no sense to me.

chandeepadissanayake

Isn't the projection of vector u onto vector w equal to u . w / ||w||?
Why did you consider only the dot product?

SanketPatole