Support Vector Machines - THE MATH YOU SHOULD KNOW

In this video, we are going to see exactly why SVMs are so versatile by getting into the math that powers them.

If you like this video and want to see more content on Data Science, Machine Learning, Deep Learning, and AI, hit that SUBSCRIBE button. And ring that damn bell for notifications when I upload.

Comments
Author

Like I said at the end of the video, I have some ideas for this channel. Here are some types of videos:
1. Concepts with Comedy - Short but pithy explanations of certain big-idea concepts in the Data Science space (e.g. the A/B Testing video I made), with some bad comedy to keep it engaging.
2. Algorithm Concepts - Explaining the details of machine learning algorithms (like this one).
3. The Math You Should Know - A subset of (2) with a ton of math (e.g. the Linear Regression, Logistic Regression, and Kernels videos I made).
4. Papers and Discussions - Reading and explaining the concepts and math behind recent research papers in deep learning and machine learning (e.g. the Attention, Mask R-CNN, and CycleGAN videos).

I made explicit playlists of these videos, so check 'em out in the "playlists" section. Add the entire playlist for some continuous quality content ;)

I'll have a video on high-dimensional data and PCA coming up - either the next video or the one after that. So subscribe to keep an eye out for it!

I have some goals for this channel. Let's make it big! Thanks for all your support!

CodeEmporium

Thank you for saving my summer project essay! It's really helpful to have someone show me the details, rather than the one sentence saying "do this with Lagrange multipliers" in Boswell's paper 🥳

alicezhou

This is exactly what I was looking for: an end-to-end explanation clearly showing the steps involved. Thanks a ton, man! ❤

PRUTHVIRAJRGEEB

Great stuff, hats off !
Dude kindly keep making videos of "Maths you should know"
Can you please go for Hidden Markov Models and Kalman filters ?

muhammadshaheer

You were just wonderful! You explained the concept amazingly, exactly what I needed to hear, and at the right speed.

priyankakaswan

With super-crisp explanations of the math behind it, super-awesome visualizations, and picture-perfect presentation, this video is a nice contribution to the ML community. Super awesome work!! Keep it up.

ananyadas

The best SVM video! Thank you for your excellent work!

pingchuanma

Why doesn't anyone go through an example with real numbers so we can actually see these formulas in action?

ericaltf
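In that spirit, here is a tiny sketch with real numbers (all values made up for illustration): a hyperplane w = (1, 1), b = -3, which happens to be the canonical maximum-margin separator for support vectors (1, 1) with label -1 and (2, 2) with label +1, since y(w·x + b) = 1 at both points. New points are classified by the sign of w·x + b:

```python
import numpy as np

# Hypothetical trained SVM: hyperplane x1 + x2 - 3 = 0
# Support vectors: (1,1) labeled -1 and (2,2) labeled +1,
# both satisfying the canonical condition y * (w.x + b) = 1
w = np.array([1.0, 1.0])
b = -3.0

def classify(x):
    # SVM decision rule: sign(w.x + b)
    return 1 if w @ x + b >= 0 else -1

print(classify(np.array([0.5, 0.5])))  # 0.5 + 0.5 - 3 = -2 < 0, so -1
print(classify(np.array([3.0, 2.0])))  # 3 + 2 - 3 = 2 > 0, so +1
```

Plugging concrete numbers through the decision rule like this makes the abstract formulas much easier to follow.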

Thank you for the deep dive into RBF kernels. My hope is to fill in the gaps in my math so I can watch these videos and get better intuition for these topics. Not quite there yet, but we're getting there!

maria

Most crisp and to the point explanation.

theghostwhowalk

Maybe a tad too rushed, but balancing the time to make a video against getting it released is probably a tricky business :)

That being said, I like your videos 1000x more than Siraj, wish I could move some of his views to you ;)

Keep it up!

dammi

Hey, that slack variable you introduced in the objective function basically gives us control over the margin around the decision boundary, right (mathematically speaking, in what you wrote)? Because in practical cases data points are definitely not always linearly separable, due to the presence of outliers, so whenever our model misclassifies, the slack term makes sure the margin constraint is relaxed. So we end up somewhere between underfitting (no slack variable) and overfitting (slack variable present along with the penalty term). Am I right in my intuition? Please reply, dada...

subhajitsarkar
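The trade-off described in the comment above is controlled by the penalty C in the soft-margin objective: a small C tolerates more slack and gives a wider margin, while a large C punishes violations and narrows it. A minimal sketch using scikit-learn's `SVC` on made-up toy data (the data and C values here are just illustrative):

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data: two clusters plus one overlapping +1 point at (1.5, 1.5),
# so the classes are not perfectly linearly separable
X = np.array([[0, 0], [1, 1], [1, 0], [0, 1],
              [3, 3], [4, 4], [4, 3], [3, 4], [1.5, 1.5]])
y = np.array([-1, -1, -1, -1, 1, 1, 1, 1, 1])

for C in (0.01, 100.0):
    clf = SVC(kernel="linear", C=C).fit(X, y)
    margin = 2 / np.linalg.norm(clf.coef_)  # margin width is 2 / ||w||
    print(f"C={C}: margin width = {margin:.2f}, "
          f"support vectors = {len(clf.support_)}")
```

With the small C the fit tolerates the stray point and keeps a wide margin; with the large C it shrinks the margin trying to honor every point, which is the underfit-vs-overfit dial the comment describes.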

Very well done on the explanation, and I am obsessed with your math explanations, especially the terminology and how you use it.

Saravananmicrosoft

I would be thankful if you could tell me what math is required for SVMs, and any reference for learning it. I have searched the internet for a while, and every single result just says "the basic math." I'm so frustrated with the internet right now.

muhtasirimran

Dude.. You are a saviour! Great work. Keep it coming.. 👍🏻

rtgunti

@2:20, in the equation for the distance of a point vector from the hyperplane, the denominator should be ||w|| rather than ||w|| squared.

yuktikaura
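The correction in the comment above, distance = |w·x + b| / ||w||, is easy to check numerically; a minimal sketch with made-up values for w, b, and a test point:

```python
import numpy as np

# Hypothetical hyperplane w.x + b = 0 with ||w|| = 5, and a test point x
w = np.array([3.0, 4.0])
b = -2.0
x = np.array([1.0, 2.0])

# Signed distance from x to the hyperplane: (w.x + b) / ||w||
distance = (w @ x + b) / np.linalg.norm(w)
print(distance)  # (3*1 + 4*2 - 2) / 5 = 9/5 = 1.8
```

Dividing by ||w||² instead would give 9/25 = 0.36, which is not the geometric distance; only the ||w|| denominator makes the result invariant to rescaling w and b.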

OMG, this is so helpful! So simple but so general. Thank you very much, sir!

blmppes

This is wonderful. Can you please make a video on Pegasos (Primal Estimated sub-GrAdient SOlver for SVM)?

ochanabondhu

Excellent explanation, keep up the good work.

cdsjatin

0:42 that “kernalization” had me laughing 😂😂😂😂

exoticcoder