How to implement Decision Trees from scratch with Python

In the fourth lesson of the Machine Learning from Scratch course, we will learn how to implement Decision Trees. This one is a bit longer due to all the details we need to implement, but we will go through it all in less than 40 minutes.

Welcome to the Machine Learning from Scratch course by AssemblyAI.
Thanks to libraries like Scikit-learn, we can use most ML algorithms with a couple of lines of code. But knowing how these algorithms work under the hood is very important, and implementing them hands-on is a great way to achieve this.

And most of them are easier to implement than you'd think.

In this course, we will learn how to implement 10 such algorithms.
We will quickly go through how each algorithm works and then implement it in Python with the help of NumPy.
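
To give a taste of what "from scratch with NumPy" looks like for this lesson, here is a minimal sketch of an entropy-based decision tree classifier. The names (entropy, Node, SimpleDecisionTree) and the brute-force split search are illustrative assumptions, not necessarily the exact code built in the video.

import numpy as np

def entropy(y):
    # p_i = fraction of samples in the node that belong to class i
    p = np.bincount(y) / len(y)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

class Node:
    def __init__(self, feature=None, threshold=None, left=None, right=None, value=None):
        self.feature, self.threshold = feature, threshold
        self.left, self.right, self.value = left, right, value

class SimpleDecisionTree:
    def __init__(self, max_depth=10, min_samples_split=2):
        self.max_depth = max_depth
        self.min_samples_split = min_samples_split
        self.root = None

    def fit(self, X, y):
        self.root = self._grow(X, y, depth=0)

    def _grow(self, X, y, depth):
        # stopping criteria: depth limit, pure node, or too few samples
        if depth >= self.max_depth or len(np.unique(y)) == 1 or len(y) < self.min_samples_split:
            return Node(value=np.bincount(y).argmax())  # leaf: most common label
        feature, threshold = self._best_split(X, y)
        if feature is None:
            return Node(value=np.bincount(y).argmax())
        left_mask = X[:, feature] <= threshold
        left = self._grow(X[left_mask], y[left_mask], depth + 1)
        right = self._grow(X[~left_mask], y[~left_mask], depth + 1)
        return Node(feature=feature, threshold=threshold, left=left, right=right)

    def _best_split(self, X, y):
        # pick the (feature, threshold) pair with the highest information gain
        best_gain, best = 0.0, (None, None)
        parent = entropy(y)
        for feature in range(X.shape[1]):
            for threshold in np.unique(X[:, feature]):
                left, right = y[X[:, feature] <= threshold], y[X[:, feature] > threshold]
                if len(left) == 0 or len(right) == 0:
                    continue
                children = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
                gain = parent - children
                if gain > best_gain:
                    best_gain, best = gain, (feature, threshold)
        return best

    def predict(self, X):
        return np.array([self._traverse(x, self.root) for x in X])

    def _traverse(self, x, node):
        if node.value is not None:  # reached a leaf
            return node.value
        branch = node.left if x[node.feature] <= node.threshold else node.right
        return self._traverse(x, branch)

Usage is then just tree = SimpleDecisionTree(max_depth=10); tree.fit(X_train, y_train); accuracy = np.mean(tree.predict(X_test) == y_test), with X a 2-D NumPy array of numeric features and y non-negative integer class labels.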


#MachineLearning #DeepLearning
Comments

By far the best set of videos on ML algorithms from scratch, and I have seen so many GitHub repos and YouTube videos on the same subject. This content is 10/10.

TheWorldwideapple

I have understood so much about the underlying theory of decision trees and found that, apart from the split criterion, we are basically traversing to the end of the tree and assigning values at the leaf nodes. The split and stopping criteria are also intuitive. Thank you for this incredible resource.

piyusharora

I think there is a small error in the entropy calculation at 23:37. The formula shown uses the logarithm to base 2, so I think np.log2() would be correct, as np.log() is the natural logarithm. Nevertheless, the video, as well as the whole series, is great! Thank you for the content!

MrThought
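
On the point above: a small sketch of the entropy helper with the base-2 log, where p is per-class counts over the number of samples in the node (names are illustrative):

import numpy as np

def entropy(y):
    # p_i is the fraction of samples in the node belonging to class i
    p = np.bincount(y) / len(y)
    p = p[p > 0]                        # drop zero counts to avoid log2(0)
    return -np.sum(p * np.log2(p))      # base-2 log, matching the on-screen formula

print(entropy(np.array([0, 0, 1, 1])))  # 1.0 bit; np.log would give ~0.693 instead

Since ln(x) = ln(2) * log2(x), using the natural log only rescales every gain by the same constant, so the tree would still pick the same splits; the values just are not in bits.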

Unbelievable how helpful this was! More than I learned in lecture.

samuellefischer

OMG, so glad I found your channel. Your way of education is super useful. Intuition, hands-on coding, what else can I expect?

KitkatTrading

I love you...💖💖 your tutorials saved my machine learning assignments....😭💖

sherimshin

Thank you very much for this video! Your explanation is excellent, and the quality is outstanding!

abeldomokos

One of the best videos I have watched on the subject. Great job, and thank you for creating it. 👍🙏

Karankaran-mxlb

Thanks, I have been searching for this.

thangha

Thanks for a clear presentation, it helped me a lot!

sunnybhatti

Great implementation.
May I know why the Gini index was not considered? It can give better results compared to entropy and information gain, and the code complexity might have been reduced.
Kindly share your thoughts 🙂

ajithdevadiga
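
For the question above: Gini impurity is a drop-in alternative criterion that avoids the logarithm. A rough sketch of both, assuming integer class labels (the video itself derives the entropy/information-gain version):

import numpy as np

def gini(y):
    # Gini impurity: 1 - sum_i p_i^2 (no logarithm needed)
    p = np.bincount(y) / len(y)
    return 1.0 - np.sum(p ** 2)

def entropy(y):
    p = np.bincount(y) / len(y)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

y = np.array([0, 0, 0, 1, 1])
print(gini(y), entropy(y))   # ~0.48 vs ~0.97; both are 0 for a pure node

In practice the two criteria usually choose very similar splits; Gini is marginally cheaper to compute, so swapping it in mainly saves the log call rather than changing the results much.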

You give an incorrect explanation of p(X) at 4:46: n is not the total number of nodes, but rather the total number of data points in the node.

BenBuddendorff
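
In other words, the probabilities in the entropy formula are per-class counts divided by the number of samples reaching that node, e.g. (illustrative values):

import numpy as np

y_node = np.array([0, 0, 1, 1, 1])   # the 5 training samples that reach this node
n = len(y_node)                       # n = number of data points in the node, not number of nodes
p = np.bincount(y_node) / n           # p(X) per class -> [0.4, 0.6]
print(p)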

So nice of you to share this. Thanks so much.

yusmanisleidissotolongo

Thank you so much for this video, it helped a lot. I watched it twice.

gamingDivyaa

Excellent! It will take me a couple of days to digest everything covered in these 37 minutes, but I have no doubt it will be worth it. Thank you.

Wizhka

I'm learning a lot of Python while copying your code.

MiroKrotky

You are just GREAT. Thanks so very much.

yusmanisleidissotolongo

The example at the beginning shows a dataset with mixed data types (categorical and numerical); however, it looks like the code you provided only handles numerical data points, right?

XuyenNguyen-jbsj
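
Regarding the question above: a threshold-based tree like this expects numeric features, so categorical columns would need to be encoded first. A minimal sketch using ordinal encoding with np.unique (one-hot encoding is the safer general choice, since integer codes impose an arbitrary order; the column name here is hypothetical):

import numpy as np

color = np.array(["red", "green", "blue", "green"])   # hypothetical categorical column
_, codes = np.unique(color, return_inverse=True)      # map each category to an integer code
print(codes)                                          # [2 1 0 1], now usable in threshold splits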

Bruh, seeing people translate human language into a programming language that easily is amazing. I want that experience too.

slyceryk

Comprehensive video. I've used your method, but my accuracy scores are low, at 0.67. What might be the issue?

derrick