Lecture 4 'Curse of Dimensionality / Perceptron' - Cornell CS4780 SP17

Comments

1:00 A few words about k-nearest neighbors
2:00 Curse of dimensionality - examining the k nearest neighbors as the dimensionality grows
11:00 Perhaps the high-dimensional data lies in a subspace or a low-dimensional sub-manifold
12:55 A manifold, roughly: Euclidean distances work locally but not globally
15:45 Detecting a manifold by creating spheres
16:30 It helps to think about the true dimensionality of the data
17:55 It is always good to try to reduce the dimensionality
18:50 Demo - k-nearest neighbors
21:30 Demo - curse of dimensionality
32:45 Advantages and disadvantages of kNN
36:45 Perceptron
38:10 Works better in high-dimensional spaces where points are far apart, but not in low dimensions - the opposite of kNN
39:35 Mathematically, how do we define a hyperplane?
43:40 How do we find the hyperplane?
46:00 Geometrically, the hyperplane now goes through the origin; we removed b (see the sketch right after this list)
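A minimal numeric sketch of the 46:00 trick, under my own reading of it (assuming NumPy; the weights, bias, and point below are made up for illustration): appending a constant 1 to every input and folding the bias b into the weight vector turns a hyperplane {x : w·x + b = 0} into one that passes through the origin.

```python
import numpy as np

# Hypothetical weights, bias, and a point chosen so that it lies on the hyperplane.
w = np.array([2.0, -1.0])
b = 0.5
x = np.array([1.0, 2.5])          # 2*1.0 - 1*2.5 + 0.5 = 0

# Original formulation with an explicit bias term.
print(np.dot(w, x) + b)           # 0.0 -> x lies on the hyperplane

# Absorb b: augment x with a constant 1 and w with b.
x_aug = np.append(x, 1.0)
w_aug = np.append(w, b)
print(np.dot(w_aug, x_aug))       # 0.0 -> same hyperplane, now through the origin in d+1 dims
```

After this augmentation a perceptron only has to learn a single weight vector, with no separate bias term.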

michaelmellinger

Up until now it has been such a pleasure to listen to you, Prof. Kilian Weinberger. You have formalized all the ideas I have been learning from the internet, books, and my professor's lectures at university. I am quite excited for what's coming. I'll stay tuned.

ksblbzw

Professor Kilian, you're a legend!! Amazing lectures with a beautiful sense of humor.

naifalkhunaizi

It's really great to learn about the assumptions behind an algorithm's success and its limitations! It truly helps in choosing the better algorithm :)

kirtanpatel

That is one of the best lectures I've seen so far on this topic! Thanks, Prof! I now have to make time to work through the whole course. Awesome teaching.

MrSyncope

You are my favorite teacher in the whole world. Every time I stumble on a difficult subject in ML and DL and get discouraged, I rewatch your lectures and feel inspired again. Thank you for being so great at teaching.

minhtamnguyen

What an eye-opening, insightful lecture (and, more generally, series of lectures)! Thank you, Prof. Weinberger. This class is the best, friendliest, and most fun way to learn machine learning, by far.

sergiujava

Thank you, Prof. Weinberger! Such a good lecture!

ylee

Thank you for posting your videos! They're the best!

jaimecristalino

Thank you so much for your work!
I was really curious about manifolds, but couldn't find any good explanation, and yours is just brilliant!

mebwiri

Great lecture. Really helped me get a high-level understanding of what's going on.

amsrremix

Sir, the previous lectures were very good and explained everything well. What was unique about today's lecture was connecting why k-nearest neighbors can't be used on high-dimensional data, which other lectures miss. That said, this lecture might be fine for some students, but for me it was like learning German.

ali

Great explanation of the curse of dimensionality, thank you.

DavesTechChannel

You are simply a wonderful lecturer. The whole course is amazing.

HuyNguyen-cvzb

"Any questions at this point?"👨‍🎓 "Raise your hand if this is making sense."👨‍🎓

chillmode

In one of your homework problems in the 2017 Spring folder, I couldn't grasp the second claim of Q1, i.e. "This supports our second claim: as the dimensionality increases, the distance growth between normally distributed points overwhelms the degree to which some of those points are closer together than others". I have proved that the limit of 4*sigma_d / mu_d approaches 0 as d -> infinity, but its relation to the claim above is puzzling me a bit (in fact, I would say I could not completely comprehend the claim itself either). Incredibly thankful for all your previous replies, Prof. Kilian.
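A small simulation sketch of that claim (my own, not from the homework; assuming NumPy and SciPy): for points drawn from a standard Gaussian, the mean pairwise distance grows roughly like sqrt(2d) while its standard deviation stays roughly constant, so the ratio sigma_d / mu_d shrinks toward 0 and all points end up looking almost equally far apart.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
for d in [2, 10, 100, 1000]:
    X = rng.standard_normal((500, d))      # 500 points from a standard Gaussian in d dimensions
    dists = pdist(X)                       # all pairwise Euclidean distances
    mu, sigma = dists.mean(), dists.std()
    print(f"d={d:4d}  mean={mu:7.2f}  std={sigma:5.2f}  std/mean={sigma/mu:.3f}")
```

The last column shrinking toward 0 is the sense in which the distance growth "overwhelms" how much closer some pairs are than others.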

sudhanshuvashisht

Hello, thank you for the lecture.
Could you please comment more on the probability (1 - 2*epsilon) for the uniform distribution on the interval [0, 1] that you used when explaining the curse of dimensionality?
It is at approximately 10:30.
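My reading of the 10:30 argument (a sketch, not the professor's answer; assuming NumPy): a single uniform coordinate on [0, 1] lands in the interior [epsilon, 1 - epsilon] with probability 1 - 2*epsilon, so with d independent coordinates the chance that all of them avoid the epsilon-border is (1 - 2*epsilon)^d, which vanishes as d grows.

```python
import numpy as np

eps = 0.05
rng = np.random.default_rng(1)
for d in [1, 10, 100]:
    X = rng.uniform(size=(50_000, d))                        # uniform points in the unit cube [0, 1]^d
    interior = np.all((X > eps) & (X < 1 - eps), axis=1)     # no coordinate within eps of the border
    print(f"d={d:3d}  empirical={interior.mean():.4f}  (1 - 2*eps)**d={(1 - 2*eps)**d:.4f}")
```

So in high dimensions almost every point sits within epsilon of the cube's boundary, which is one face of the curse of dimensionality.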

mihnearomanovschi

Hello Sir!! Firstly, thanks for sharing the lectures so that we can watch them.
Secondly, regarding the math background session you mention in the video: I could not find it on the course website or anywhere else.
Are there notes or a video of that class that we could look at?
Thanks in advance.

SinghCoder

Great answer! Also, can you tell me whether there is any way to know the number of intrinsic dimensions in our data, so that it becomes easy to decide whether to apply kNN directly or to do PCA first and then apply kNN?
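One common heuristic for that (my own sketch, not necessarily what the professor recommends; assuming NumPy and scikit-learn, with synthetic data for illustration): fit PCA and count how many components are needed to explain, say, 99% of the variance. If that count is much smaller than the ambient dimension, projecting with PCA before kNN is usually worthwhile.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
low_dim = rng.standard_normal((500, 5))             # data that truly lives in 5 dimensions
X = low_dim @ rng.standard_normal((5, 100))         # embedded in 100 ambient dimensions
X += 0.01 * rng.standard_normal(X.shape)            # plus a little noise

pca = PCA().fit(X)
cumvar = np.cumsum(pca.explained_variance_ratio_)
intrinsic = int(np.searchsorted(cumvar, 0.99)) + 1  # components needed for 99% of the variance
print("estimated intrinsic dimension:", intrinsic)  # expected to print about 5 here
```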

raghavgaur

It's very helpful and quite detailed. However, I can't find the exercises/assignments for this course on the course webpage. It would be a great help if someone could direct me to them. Thanks.

easterPole