CS480/680 Lecture 2: K-nearest neighbours

Comments

Day 1 complete. Amazing lecture. Thank you, professor.

rain

Inductive learning and deductive learning are two approaches to machine learning: inductive learning generalizes from examples, while deductive learning derives conclusions from given rules and knowledge.

vamsiteja-ntmq
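
A minimal sketch of the contrast, assuming a toy 1-D pass/fail prediction task (the task, the threshold, and the example points are illustrative assumptions, not from the lecture): the deductive predictor applies a rule given up front, while the inductive predictor derives its answer from labelled examples, here via a 1-nearest-neighbour lookup.

```python
# Toy contrast (illustrative assumption: a 1-D "hours studied" feature
# predicting pass/fail; the threshold and examples are made up).

# Deductive: the rule is given up front; no data is consulted.
def deductive_predict(hours: float) -> str:
    return "pass" if hours >= 4.0 else "fail"

# Inductive: the rule is induced from labelled examples
# (here with a 1-nearest-neighbour lookup).
examples = [(1.0, "fail"), (2.0, "fail"), (5.0, "pass"), (7.0, "pass")]

def inductive_predict(hours: float) -> str:
    nearest = min(examples, key=lambda ex: abs(ex[0] - hours))
    return nearest[1]

print(deductive_predict(3.5))  # applies the fixed rule -> "fail"
print(inductive_predict(3.5))  # label of the closest labelled example
```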

Hi! Has dimensionality reduction been discussed in any of these lectures, PCA and t-SNE specifically? If yes, please share the lecture number. I didn't see any lecture title covering this topic.

ambujmittal

Great lecture! I have a question: in K-nearest neighbours, is it possible for the test accuracy to become larger than the training accuracy as K increases? Thanks!

stevenyang
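
A minimal sketch that lets you observe the behaviour this question asks about, assuming scikit-learn and a synthetic dataset (neither is from the lecture):

```python
# Assumptions: scikit-learn is installed; the data is a synthetic, slightly
# noisy two-class problem, not the dataset used in the lecture.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=400, n_features=5, flip_y=0.1,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)

# Compare training and test accuracy as the neighbourhood size k grows.
for k in [1, 5, 15, 45]:
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_tr, y_tr)
    print(f"k={k:2d}  train acc={knn.score(X_tr, y_tr):.3f}  "
          f"test acc={knn.score(X_te, y_te):.3f}")
```

With k = 1 the training accuracy is essentially 100%, since every training point is its own nearest neighbour, so it sits well above the test accuracy; as k grows the training accuracy drops, and on noisy data it can indeed fall below the test accuracy.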

Patient lecturer!
In K-fold cross-validation, the average accuracy is reported. So is there no test set? Or is the average accuracy reported as the training accuracy?
In K-fold cross-validation you get K models, so which of the K models should be used?

subramaniannk
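
A minimal sketch of the usual protocol, assuming scikit-learn, synthetic data, and KNN with k chosen by 5-fold cross-validation (the lecture's exact protocol may differ): the averaged fold accuracy is a validation estimate used for model selection, none of the K per-fold models is kept, and a separate held-out test set can still be reserved for the final evaluation.

```python
# Assumptions: scikit-learn, synthetic data, and KNN with k selected by
# 5-fold cross-validation; the lecture's exact protocol may differ.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=500, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

best_k, best_cv = None, -1.0
for k in [1, 3, 5, 9, 15]:
    # 5 accuracies, one per fold; their mean estimates generalization.
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                             X_tr, y_tr, cv=5)
    if scores.mean() > best_cv:
        best_k, best_cv = k, scores.mean()

# None of the 5 per-fold models is kept: retrain once on all the training
# data with the selected k, then evaluate on the held-out test set.
final = KNeighborsClassifier(n_neighbors=best_k).fit(X_tr, y_tr)
print(f"chosen k={best_k}, mean CV accuracy={best_cv:.3f}, "
      f"test accuracy={final.score(X_te, y_te):.3f}")
```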

Hi … I have a question: why do we need a lot of data when the hypothesis space is larger? Couldn't I use the same exact data for all the hypotheses in the hypothesis space?

hibamajdy
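
A minimal sketch of why the same small dataset is not enough, assuming scikit-learn, synthetic data, and 1-nearest neighbour standing in for a very flexible (large) hypothesis space: with few examples, many hypotheses fit the data equally well, including ones that generalize badly, so the data cannot single out one that also generalizes, and the training/test gap only shrinks as more data becomes available.

```python
# Assumptions: scikit-learn, synthetic 2-D data; 1-NN stands in for a very
# flexible hypothesis class that can fit any training set perfectly.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

for n in [30, 100, 300, 1000, 3000]:
    X, y = make_classification(n_samples=n + 1000, n_features=2,
                               n_informative=2, n_redundant=0,
                               random_state=2)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=n,
                                              random_state=2)
    knn = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)
    # Gap between (near-perfect) training accuracy and test accuracy.
    gap = knn.score(X_tr, y_tr) - knn.score(X_te, y_te)
    print(f"n={n:4d}  train-test gap={gap:.3f}")
```

The same data is indeed reused to score every candidate hypothesis, but with only a handful of examples many very different hypotheses achieve the same training accuracy; the printed gap typically shrinks as n grows, which is why a larger hypothesis space calls for more data.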