Tutorial 37: Entropy In Decision Tree Intuition

Entropy gives a measure of the impurity of a node. In the decision tree building process, two important decisions have to be made: which variable is best to split a node on, and what the best split point is.
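For example, a node holding 3 samples of one class and 2 of the other has entropy -(3/5)log2(3/5) - (2/5)log2(2/5) ≈ 0.971 bits, while a pure node has entropy 0. A minimal Python sketch for illustration (the helper name entropy is my own, not from the video):

    import numpy as np

    def entropy(labels):
        # Shannon entropy (in bits) of the class labels reaching a node
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    print(entropy(["yes", "yes", "yes", "no", "no"]))  # ~0.971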

Buy the best book on Machine Learning and Deep Learning with Python, sklearn, and TensorFlow from the Amazon URL below:

Connect with me here:

Subscribe to my unboxing channel

Below are the various playlists created on ML, Data Science, and Deep Learning. Please subscribe and support the channel. Happy learning!

You can buy my book on Finance with Machine Learning and Deep Learning from the URL below

🙏🙏🙏🙏🙏🙏🙏🙏
YOU JUST NEED TO DO 3 THINGS to support my channel:
LIKE, SHARE & SUBSCRIBE TO MY YOUTUBE CHANNEL
Comments

One of the great teachers in the Machine Learning field. You are my best teacher in ML. Thank you so much, sir, for spreading your knowledge.

shivadumnawar

I checked all the code in your book. Everything works like a charm. I can guess that you mastered Machine Learning by struggling through it. Those who are spoon-fed cannot be half as good as you. Great job! We wish you all the success.

SALESENGLISH

This is what I was looking for. Thank you so much for making this video. Eagerly waiting for the video on information gain. Please keep going 🙏

aaroncode

Thank you. We all need teachers like you. God bless you. You're a blessing for us college students who are struggling with offline colleges after the reopening.

yamika.

Hi, there might be a calculation mistake in the entropy part; it's not 0.78. Can you please mention that in a caption in the video or in the description, so that people don't get it wrong in the future? Great video!!

ABINASHPANDA-beug

You are doing an awesome job, exceeding our expectations. Good job, Krish. You just nail down the concepts in a line or two; that's the way I like it.

rchilumuri

You cleared all my doubts. Excellent explanation 😍😍😍😍

mdrashidansari

Thank you for a great tutorial. The entropy value is actually 0.97 and not 0.78.

sameerkhnl

Good explanation, Krish. Now my misconceptions about decision trees are dwindling away. Thanks.

sandipansarkar

You clearly explain the mathematics of machine learning algorithms! Thank you for your effort.

ayberkctis

Best channel for Data Science Beginners

bhavikdudhrejiya

Thanks for the video. At 05:48, how does -(3/5)log2(3/5) - (2/5)log2(2/5) equal 0.78??? I think the correct answer is 0.971.

Could you explain?

maximumthefirst
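A quick check in plain Python (an illustrative snippet, not from the video) confirms 0.971:

    from math import log2

    # Entropy of a node with a 3:2 class split (p = 3/5, q = 2/5)
    p, q = 3/5, 2/5
    print(-p * log2(p) - q * log2(q))  # 0.9709505944546686, i.e. ~0.971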

You should start explaining from the root node: take the entropy of all of f1, f2, and f3 first, then select the best one as the root node; then calculate the entropy of the remaining data for f2 and f3, select the next best as a node, and continue the same process.

cequest
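A rough sketch of the greedy procedure described in that comment (illustrative only; the feature names f1/f2 and the helper names are assumptions, not the video's code): at each node, compute the information gain of every remaining feature, split on the best one, and recurse on the children.

    from math import log2
    from collections import Counter

    def entropy(labels):
        total = len(labels)
        return -sum((c / total) * log2(c / total) for c in Counter(labels).values())

    def information_gain(rows, labels, feature):
        # Parent entropy minus the weighted entropy of the children after the split
        parent = entropy(labels)
        weighted = 0.0
        for value in {row[feature] for row in rows}:
            child = [lab for row, lab in zip(rows, labels) if row[feature] == value]
            weighted += len(child) / len(labels) * entropy(child)
        return parent - weighted

    def best_feature(rows, labels, features):
        return max(features, key=lambda f: information_gain(rows, labels, f))

    rows = [{"f1": "a", "f2": "x"}, {"f1": "b", "f2": "x"}, {"f1": "a", "f2": "y"}]
    labels = ["yes", "no", "yes"]
    print(best_feature(rows, labels, ["f1", "f2"]))  # f1 has the higher gain here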

Thank you, thank you, thank you! After this I am ready for my test tomorrow... You are a boss with these concepts!! Please keep making more. I'll definitely subscribe and share with friends.

keamogetsethaoge

Nice explanation... but I'm also looking for the deep learning videos. Please don't stop the DL series partway through.

VivekKumar-nffh

Explained in a great way... Thank you, Krish.

abdulkayumshaikh

Very helpful, sir. Thank you, you are the best :)

AbhishekRana-yeuw

Your teaching curriculum is very easy to understand.

lekhnathojha

This is one of the best explanations. Thank you so much, sir.

Lavanya-pe

Hi sir, this video is the 37th in the ML playlist, but we don't have any decision tree video before it.

hemantsharma