Tutorial 39- Gini Impurity Intuition In Depth In Decision Tree

Please join as a member of my channel to get additional benefits like Data Science materials, live streaming for members, and more

Please do subscribe to my other channel too

Connect with me here:

Comments

I have known this Prof. since 2019. If he is talking about something and I don't understand, I always know I'm the problem. Thank you, Professor

abdulahiopejin

Very well said, just one correction at 5:24: starting from 0, as the probability of + increases and reaches 0.5 (50%), entropy maxes out at 1. After that the probability of + continues to increase, while the entropy starts to decrease; when the probability reaches 1 (100%), entropy returns to 0.

praos
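
A minimal Python sketch (my own illustration, not from the video; only the standard library is assumed) that checks this shape of the binary entropy curve:

```python
from math import log2

def entropy(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p); 0*log2(0) is taken as 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Entropy rises from 0 at p+ = 0, peaks at 1 when p+ = 0.5,
# then falls back to 0 as p+ continues on to 1.
for p in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(f"p+ = {p:.2f} -> H = {entropy(p):.3f}")
```

Running it prints the symmetric inverted-U the comment describes: H = 0 at p+ = 0 and 1, H = 0.811 at p+ = 0.25 and 0.75, and the maximum H = 1 at p+ = 0.5.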

Thank you so much. Wonderful explanation. Your videos have been a savior in many circumstances, especially for beginners.

vasaviedukulla

Thank you for the explanation. 👍
For this, I plotted 0.5·(Entropy) − Gini. The actual equation would be:
y = 0.5·(−p·log₂(p) − (1−p)·log₂(1−p)) − 2p(1−p)
Intuitively, for splits where the class probabilities are between 0 and 0.5, Entropy penalizes splits more than Gini. Therefore, using Entropy instead of Gini, it is more likely to choose a feature that creates a leaf node and an evenly distributed node.
Overall, I think trees with Entropy have more early leaf nodes and are deeper. On the other hand, trees with Gini are wider.

emadfarjami
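
To make this comparison concrete, here is a minimal sketch (my own, assuming the standard binary formulas Entropy = −p·log₂(p) − (1−p)·log₂(1−p) and Gini = 2p(1−p)) that tabulates y = 0.5·Entropy − Gini:

```python
from math import log2

def entropy(p):
    # Binary entropy; the 0*log2(0) endpoints are taken as 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def gini(p):
    # Binary Gini impurity: 1 - p**2 - (1 - p)**2 == 2*p*(1 - p).
    return 2 * p * (1 - p)

# y = 0.5*Entropy - Gini is nonnegative on [0, 1], with the largest gap
# at skewed class probabilities, matching the claim that entropy
# penalizes impure splits more strongly than Gini away from p = 0.5.
for p in [0.05, 0.10, 0.25, 0.40, 0.50]:
    y = 0.5 * entropy(p) - gini(p)
    print(f"p = {p:.2f}  0.5*H = {0.5 * entropy(p):.3f}  Gini = {gini(p):.3f}  y = {y:.3f}")
```

The difference is zero at p = 0, 0.5, and 1 and positive in between, peaking near p ≈ 0.1 (and, by symmetry, near p ≈ 0.9), consistent with entropy penalizing skewed splits more heavily than Gini.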

Man, this is beautiful; I love the way the concept clicked for me. I wish for more videos from Krish sir... can I have more of these sweet, decisive, perfectly explained classes? Yeah, I want more of these.

ashwinissac

You're quite an amazing teacher, brother; I really enjoyed it.

deepakdhaka.

Wow, so easily explained. I used to hate maths, but now with your videos I am gaining confidence and it feels simple. Thanks Krish.

neelark

For the entropy curve that you have described, I think this explanation is better: when the probability is 0.5, it is the worst case and entropy is maximized. After that, whether the positive probability increases or decreases (meaning the negative probability increases), the system is purer and thus the entropy is reduced.

niloufarfouladi

Small correction at 5:30: as p+ increases, entropy increases until it reaches 1 (at p+ = 0.5); after that it starts decreasing even though p+ keeps increasing.

mujeebrahman

I was watching Stanford University lectures.
Even they can't teach like you.
Thank you, sir.
The video was amazing.

varunshrivastav

This video is computationally efficient, because you don't need any others after watching this.

rileywong

Great explanation, sir!
I gave the 2,000th like. Now you give my comment a heart <3

utkarsh

I respect and love you, sir; the reason is that your teaching technique really impresses us. Thanks for explaining the Gini index and entropy.

praveenpandey.

Thank you so much, you are amazing! Greetings from Peru.

victorreyesalvarado

You are the best; with you I could learn about every complicated question so easily! Thank you so much, love you.

nahidzeinali

Much-awaited video, Krish. Thanks a lot.

yogoai

There was a small error in the video when you said the entropy increases as p+ increases, and after that entropy decreases with the decrease in p+. I think p+ is also increasing there. Anyway, sir, great explanation; I got to learn everything clearly.

arindamghosh

I subscribed when you had 5k subscribers

tejaltatiwar

Well, I don't know if it's a problem for other people or not, but I am used to this teaching technique (board and marker), so I find it more comfortable than PPT. I randomly saw your video and this is exactly what I needed.

prithicksamui

Thank you! You are really a great teacher. Such a good lecture can save me a huge amount of time.

fengjeremy