3 Decision Tree | ID3 Algorithm | Solved Numerical Example by Mahesh Huddar


id3 algorithm decision tree,
id3 algorithm in machine learning,
decision tree in ML,
decision tree solved example,
decision tree numerical example solved,
id3 algorithm in data mining,
id3 algorithm decision tree in data mining,
id3 algorithm decision tree python,
id3 algorithm decision tree in machine learning,
id3 algorithm example,
id3 algorithm in data mining with an example,
id3 in data mining,
decision tree problem,
decision tree problem in big data analytics,
decision tree machine learning,
decision tree,
decision tree in data mining,
decision tree analysis,
decision tree by Mahesh Huddar,
Comments

You have a mistake in Gain(S, a1) at 3:50: it should be (5/10) * 0 and not (5/10) * 1, since E(S_false) = 0.0 (the result is right though, you just mistyped it). Other than that, nicely explained; I learnt how this works thanks to your videos.

padelis_doulis
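
To make the point in the comment above concrete, here is a minimal Python sketch of how ID3's information gain weights each branch by its size and entropy. The class counts below are assumptions for illustration only (a 10-instance set split by a1 into two branches of 5, with the "false" branch pure), not the exact table from the video.

import math

def entropy(pos, neg):
    # Entropy of a subset with `pos` positive and `neg` negative examples.
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count:
            p = count / total
            h -= p * math.log2(p)
    return h

# Assumed counts: parent set 5 positive / 5 negative, a1 = true branch 4/1,
# a1 = false branch 0/5 (pure, so its entropy is 0).
parent = entropy(5, 5)
gain_a1 = parent - (5 / 10) * entropy(4, 1) - (5 / 10) * entropy(0, 5)
print(round(gain_a1, 4))  # the false-branch term is (5/10) * 0, not (5/10) * 1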

Dear Mahesh sir,
If I'm not wrong, then at 10:30 the Entropy(S) must be 0.7219,
and the Gain(S, A3) answer would be 0.7219 - 0 - 0 = 0.7219.

Thank you

techfort

At 9:46, Entropy(S) should be 0.7219, right? So the answer must be 0.7219 - 0.6486 = 0.07294. Kindly correct me if I am wrong.

VamshiChaithanya
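
As a quick check on the 0.7219 figure mentioned in the two comments above: that is the entropy of any subset with a 2-to-8 (or 8-to-2) class split. A small sketch under that assumption; the actual subset is the one shown in the video.

import math

def entropy_from_counts(*counts):
    # Shannon entropy (base 2) of a class distribution given raw counts.
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

print(round(entropy_from_counts(2, 8), 4))  # 0.7219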

The entropy value of "false" in a1 is zero, but while calculating the final value you have taken it as 1.

ramyasrikanteswara

Love you sir, it helps in our semester exam ❤

ayushmanbhargabagopalbiswas

Thank you so much for this video; I really do appreciate it. It's amazing how explicit you made it, and very understandable too. I was able to finish my assignment in a short time (zero stress).

emmanuellafakonyui

Thanks sir, it helped me so much in my semester exam.

quangminhtran

For the guys confused about how to enter log base 2 into the calculator: most calculators cannot take log base 2 directly, but you can work it out with the change-of-base rule. For example, to find log base 2 of 5, compute (log 5) / (log 2), where log is in base 10.

kushalsharma
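
The change-of-base trick from the comment above, written out in Python for anyone who wants to verify their calculator steps (math.log2 gives the value directly; dividing base-10 logs reproduces the calculator method):

import math

x = 5
print(math.log2(x))                   # direct log base 2: 2.3219...
print(math.log10(x) / math.log10(2))  # same value via base-10 logs (calculator method)
print(math.log(x) / math.log(2))      # same value via natural logs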

Is it okay to leave a2 out of the tree?

johnrooney

If two attributes have equal gain, which one do we have to consider for the next node?

yourbrother

Correct me if I am wrong, but where is the numerical data in this dataset? Every feature is categorical, right?

santoshvadisala

Sir, please, can you also solve last year's Dec 2019 / Jan 2020 paper problems?

thegeethaart

Instances 1 and 2 are the same, and so are 4 and 5. Shouldn't we merge each pair into a single instance?

miklospoth

Sir, can you just say which entropy value must be considered after dividing the table in the above example? Should we consider Entropy(true) or the initial entropy?

vinayg

Finally, for a1, sir, you have put false = 1; this is wrong.
Please correct it, sir: false = 0.

alimuddinkhan

Today I got exactly this question in the exam, and all I had watched, yesterday and today, was that 23-minute video. That's it.

sravanr