Machine Learning Lecture 28 'Ball Trees / Decision Trees' - Cornell CS4780 SP17

Lecture Notes:
Comments

I needed to learn about KD Trees and Ball Trees for a specific application, but after just two videos I am hooked! Now I am going to watch the whole course. Thank you so much, Kilian, for making it public :)

aleksandarcvetkovic

Helpful. Sololearn uses KD trees in the machine learning tutorial.

allennelson

Really great content. And very much appreciated! Thanks :)

marcelo.

1:20 I thought he would say, "if something does not make sense then contact me" 😂😂😂

scchouhansanjay

Thanks a lot for the unique explanation! :)

salmaabdelmonem

A huge thanks to you, Prof. Kilian!!! Big fan of your teaching. You really kill the topic; it seems nothing more is left to know about it. 😀 I have a little question: why are ball trees not more famous, given that they work so well and are insensitive to the dimensionality of the feature space?

rjain

Question about ball tree complexity: to construct the ball tree, we need to perform argmax distance(x, x_i) for x_i in S, so the algorithm still needs to go through all the data points and compute their distances to the randomly chosen point. In this sense, compared to kNN, I don't see any advantage in using a ball tree, since the complexity is almost the same.

Or is it because we consider constructing the ball tree a pre-computation done before testing, and we don't count that time toward the final query time?

Many thanks!

vincentxu
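For reference, here is a minimal NumPy sketch of the construction step the question describes (the function name, the min_leaf cutoff, and the two-scan pivot heuristic are assumptions, not the lecture's exact pseudocode). Each level of the recursion does a few O(n) passes, so building the tree costs roughly O(n log n), and that cost is indeed paid once as preprocessing rather than per test query:

```python
import numpy as np

def build_ball_tree(points, min_leaf=10):
    # Bounding ball of this node: one O(n) pass.
    center = points.mean(axis=0)
    radius = np.linalg.norm(points - center, axis=1).max()
    node = {"center": center, "radius": radius,
            "points": None, "left": None, "right": None}
    if len(points) <= min_leaf:
        node["points"] = points  # leaf: queries brute-force these points
        return node
    # Two more linear scans pick far-apart pivots:
    # p1 = argmax dist(points[0], x_i), then p2 = argmax dist(p1, x_i).
    p1 = points[np.linalg.norm(points - points[0], axis=1).argmax()]
    p2 = points[np.linalg.norm(points - p1, axis=1).argmax()]
    # Assign every point to its nearer pivot: another O(n) pass.
    mask = (np.linalg.norm(points - p1, axis=1)
            <= np.linalg.norm(points - p2, axis=1))
    if mask.all() or not mask.any():  # degenerate split (duplicates): stop
        node["points"] = points
        return node
    node["left"] = build_ball_tree(points[mask], min_leaf)
    node["right"] = build_ball_tree(points[~mask], min_leaf)
    return node
```

The payoff comes at query time: the stored centers and radii let whole subtrees be pruned, so the O(n log n) build is amortized over all test points, which answers the second paragraph of the question.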

If axis-aligned splits are the issue, would random axes work? Why did we create spheres instead of randomly oriented hyperplanes? Are there any particular advantages to spheres vs. hyperplanes? Any pointers would help. Thanks.

ThePatelprateek
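One concrete advantage of spheres, sketched below as a guess at the intended answer rather than the lecture's own wording: a ball yields an exact lower bound on the distance from a query to everything inside it via the triangle inequality, and that bound needs only a center, a radius, and a metric, so it works unchanged for any distance function:

```python
import numpy as np

def can_prune_ball(query, center, radius, kth_best_dist):
    # Triangle inequality: every point x inside the ball satisfies
    #   dist(query, x) >= dist(query, center) - radius.
    # If even this optimistic bound is worse than the k-th best
    # distance found so far, the whole subtree can be skipped.
    return np.linalg.norm(query - center) - radius > kth_best_dist
```

A randomly oriented hyperplane would also split the data, but bounding the distance to a half-space ties the pruning test to the split geometry and to Euclidean space, while the ball bound above is metric-agnostic.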

Hi,
I got a slightly different understanding of KD trees from the Python scikit-learn implementation. It says it uses, or switches to, a brute-force method in the final stage of the search to find the k nearest neighbors. The documentation does not talk about going back up the tree and checking for neighbors hiding in other partitions.

Not sure if I was able to state my confusion clearly.

Your ball tree example seems good, but scikit-learn is quiet about ball trees, although it supports them for the kNN algorithm.

bhupensinha
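For reference, scikit-learn does expose both structures directly, and its leaf_size parameter is exactly the brute-force handover the comment mentions; a small usage sketch (the shapes and seed are made up):

```python
import numpy as np
from sklearn.neighbors import BallTree, KDTree

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 10))  # training points
q = rng.standard_normal((5, 10))     # query points

# leaf_size is where the search switches to brute force: leaves
# holding up to ~leaf_size points are scanned exhaustively.
tree = BallTree(X, leaf_size=40)     # KDTree(X, leaf_size=40) is analogous
dist, ind = tree.query(q, k=3)       # distances and indices of the 3-NN
```

The backtracking the lecture describes (checking sibling partitions whose ball or box might still hide a closer neighbor) still happens inside query; the documentation just emphasizes the brute-force scan within leaves, which may be the source of the confusion.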

What if we incorporate the label as an additional dimension and begin splitting on it first? Won't that always ensure that each leaf has only one label? (Because, in a way, we are trying to explain the variability in labels through variability in the dimensions/features, so the variability in labels must be significant.)

saketdwivedi

I always tried to help; each time I did, the cult took advantage. Now it's their problem, and I don't know if I can help anybody. Everything has changed since last night.

Dragon-Slayr

That answer was really psych. Damnnnn!!!

subhasdh

Could anyone explain to me why ball trees are better than kd-trees in high-dimensional spaces (with a lower-dimensional manifold)? I find it hard to imagine/understand when the Prof. explains (12:55) that kd-trees are bad even in this setting (a low-dimensional manifold in a high ambient dimension) because of their axis-aligned splits.

sudhanshuvashisht
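One way to build intuition is to try it empirically. The sketch below (the sizes and the random linear embedding are assumptions) puts the data on a 5-dimensional subspace of a 200-dimensional ambient space; kd-tree splits cut ambient axes, which say little about position on the manifold, while balls adapt to where the points actually lie:

```python
import time
import numpy as np
from sklearn.neighbors import BallTree, KDTree

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 200))        # fixed linear embedding
X = rng.standard_normal((20000, 5)) @ A  # 5-d manifold in 200-d space
q = rng.standard_normal((200, 5)) @ A    # queries on the same manifold

for Tree in (KDTree, BallTree):
    tree = Tree(X, leaf_size=40)
    t0 = time.perf_counter()
    tree.query(q, k=5)
    print(Tree.__name__, round(time.perf_counter() - t0, 3))
```

Treat the timings as an illustration rather than a benchmark; the size of the gap depends on the data, the leaf size, and the metric.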

Question: why construct spheres instead of hyperplanes as in a kd-tree, just not axis-aligned ones?

prateekpatel

There goes the professor recruiting Arthur for a PhD :P

vatsan

Hi. Are ball trees and KD trees the same?

RGDot

6:20 It's like the bottom-up method, huh?

adiflorense