Decision Tree In Machine Learning | Decision Tree Algorithm In Python | Machine Learning | Simplilearn

This Decision Tree in Machine Learning tutorial will help you understand the basics of decision trees and how the Decision Tree algorithm works. At the end, we implement a Decision Tree algorithm in Python for loan payment prediction. This tutorial is ideal for both beginners and professionals who want to learn machine learning algorithms.

The following topics are covered in this Decision Tree Algorithm tutorial:
0. Introduction (00:00)
1. What is Machine Learning? (02:25)
2. Types of Machine Learning (03:27)
3. Problems in Machine Learning (04:43)
4. What is a Decision Tree? (06:29)
5. What problems does a Decision Tree solve? (07:11)
6. Advantages of a Decision Tree (07:54)
7. How does a Decision Tree work? (10:55)
8. Use Case - Loan Repayment Prediction (14:32)

#DecisionTreeMachineLearning #DecisionTree #DecisionTreeAlgorithm #DecisionTreeAlgorithmInMachineLearning #DecisionTreePython #DecisionTrees #DecisionTreeExample #MachineLearningAlgorithms #MachineLearningTutorial #Simplilearn

What is a Decision Tree Algorithm?
A Decision Tree is a supervised machine learning algorithm used to solve classification problems. A decision tree is generally drawn upside down, with its root at the top; building it from the root downward is known as the top-down approach. Decision trees can handle non-linear data sets effectively, and the technique is used in many real-world areas such as engineering, civil planning, law, and business.
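
As a rough illustration of the supervised, top-down workflow described above, here is a minimal scikit-learn sketch on a built-in toy dataset (not the loan dataset used in the video); the parameter values are illustrative only:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Toy classification data, standing in for the video's loan dataset
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Grow the tree top-down, choosing each split by information gain (entropy)
clf = DecisionTreeClassifier(criterion="entropy", random_state=42)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))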

➡️ About Post Graduate Program In AI And Machine Learning

This AI and ML course is designed to enhance your career in AI and ML by demystifying concepts like machine learning, deep learning, NLP, computer vision, reinforcement learning, and more. You'll also have access to 4 live sessions, led by industry experts, covering the latest advancements in AI, such as generative modeling, ChatGPT, OpenAI, and chatbots.

✅ Key Features

- Post Graduate Program certificate and Alumni Association membership
- Exclusive hackathons and Ask Me Anything sessions by IBM
- 3 Capstones and 25+ Projects with industry data sets from Twitter, Uber, Mercedes Benz, and many more
- Master Classes delivered by Purdue faculty and IBM experts
- Simplilearn's JobAssist helps you get noticed by top hiring companies
- Gain access to 4 live online sessions on the latest AI trends such as ChatGPT, generative AI, explainable AI, and more
- Learn about the applications of ChatGPT, OpenAI, Dall-E, Midjourney & other prominent tools

✅ Skills Covered

- ChatGPT
- Generative AI
- Explainable AI
- Generative Modeling
- Statistics
- Python
- Supervised Learning
- Unsupervised Learning
- NLP
- Neural Networks
- Computer Vision
- And Many More…

👉 Learn More At:

🔥🔥 Interested in Attending Live Classes? Call Us: IN - 18002127688 / US - +18445327688
Comments

We hope this video was useful. The link for the dataset used in the video is provided in the description. Thanks!

SimplilearnOfficial

I liked the demonstration and visualizations. Simple and clear.
The entropy value of 0.57, I believe, was calculated using log base 10. When I tried log base 2, I got around 1.91.

Thanks for the video.

lihang
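
A quick sketch of the entropy calculation discussed above, assuming a vector of class proportions; the log base only rescales the result, and a base-2 value of 1.905 times log10(2) is roughly 0.57, which matches the two numbers mentioned:

import numpy as np

def entropy(proportions, base=2):
    # Shannon entropy: H = -sum(p_i * log_base(p_i)) over the class proportions
    p = np.asarray(proportions, dtype=float)
    p = p[p > 0]  # 0 * log(0) is treated as 0
    return -np.sum(p * np.log(p)) / np.log(base)

# Hypothetical class proportions (not the exact data from the video)
p = [0.4, 0.3, 0.2, 0.1]
print(entropy(p, base=2))   # entropy in bits (log base 2)
print(entropy(p, base=10))  # same split in base 10: the base-2 value times log10(2) ≈ 0.301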

I was really amazed: when I could not find good content and was at the last moment before quitting, I hit upon a plan like the thirsty crow and searched for "Simplilearn" to satisfy my soul and knowledge. Really appreciable effort; carry on till the last moment.

raja.

This tutorial was precise and clear. I loved it. Thank you.

albinchacko

Thank you, I really liked how you break down each step to understand how the code works. Thanks!

cinaralima

I guess you should add a minus sign before the entropy formula.

kumarsaurabh

Well explained. Enjoyed and got to understand decision trees better.
God bless you all!

olashoretijesunimi

Hello. Nice explanation. Could you please send along the data set you used for the use case?

Also, I have a couple of questions about the disadvantages of a decision tree. You say that the decision tree has high variance and low bias (at 9:02 in the presentation).

#1. I am not sure why low bias is a disadvantage. Why is that?

#2. I assume that the closer the decision tree gets to zero entropy (i.e. each leaf contains only one class of data), the higher the variance and the lower the bias. Is that correct?

#3. I guess that the art of building a decision tree is finding the best sequence of splitting criteria to employ, and knowing when to stop splitting. You can always continue to split until you reach zero entropy, but toward the end of that process you are just learning noise (overfitting). I assume that choosing the "best" sequence of splitting criteria essentially determines how soon you start learning noise.

Please comment on these three questions if you would.

Thanks

johnread
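
On the overfitting point raised in the questions above, here is a small illustrative sketch of how pre-pruning (stopping splits before every leaf reaches zero entropy) trades some variance for bias; the data and hyperparameter values are made up, not the video's settings:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fully grown tree: splits until leaves are pure (zero entropy) -> low bias, high variance
full = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X_tr, y_tr)

# Pre-pruned tree: stop early so the last splits do not just memorise noise
pruned = DecisionTreeClassifier(criterion="entropy", max_depth=4, min_samples_leaf=20, random_state=0).fit(X_tr, y_tr)

for name, model in [("full", full), ("pruned", pruned)]:
    print(name, "train:", model.score(X_tr, y_tr), "test:", model.score(X_te, y_te))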

Just Woooow, the way you teach, simple and perfect!

ariap

Nicely done! You have a like and a new subscriber.

rzipper

Excellent tutorial on Decision Tree Classifier. Thanks a ton. It was awesome.

rajamoorthy

My question: the model is good and I understand what you have done. However, as a newbie, how can I actually see the tree instead of just accuracy scores everywhere, if you understand what I am trying to say? That is, is there any way I can actually visualise the graph? Also, how can I append my Y predictions (whether the person will be approved for a loan) to my original dataset? Everywhere I see people doing the usual scikit-learn train/test split, modelling, predicting, and checking accuracy, but I don't see anyone actually inputting one row of data and seeing whether the person should be approved for a loan or not.

Are you able to help me with this? I can PM my email if needed.

DJSHIM
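
On the question above about scoring a single applicant and attaching predictions to the original data, here is a minimal self-contained sketch; the DataFrame, column names, and values are hypothetical stand-ins for the video's loan dataset:

import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Tiny hypothetical loan dataset (columns and values are made up)
df = pd.DataFrame({
    "income":         [45000, 30000, 80000, 22000, 60000, 35000],
    "loan_amount":    [12000, 15000, 20000, 18000, 10000,  9000],
    "credit_history": [1, 0, 1, 0, 1, 1],
    "approved":       [1, 0, 1, 0, 1, 1],
})
feature_cols = ["income", "loan_amount", "credit_history"]

clf = DecisionTreeClassifier(random_state=0).fit(df[feature_cols], df["approved"])

# Score a single applicant by passing a one-row DataFrame with the same columns
one_applicant = pd.DataFrame([{"income": 50000, "loan_amount": 11000, "credit_history": 1}])
print("Predicted approval:", clf.predict(one_applicant)[0])

# Attach a prediction for every row back onto the original dataset
df["predicted_approval"] = clf.predict(df[feature_cols])
print(df)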

Good example. Kindly send the data set.
1) Please explain arguments like criterion, random_state, max_depth, and min_samples_leaf in more depth.

2) Please show how to draw the diagram of the final decision tree after prediction, in detail.

gautam
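
On the arguments and the tree diagram asked about above, here is a brief annotated sketch; the values chosen are illustrative, not the ones used in the video:

import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, plot_tree

X, y = load_iris(return_X_y=True)

clf = DecisionTreeClassifier(
    criterion="entropy",   # split-quality measure: "entropy" (information gain) or "gini"
    max_depth=3,           # cap on tree depth, limiting how many splits can stack up
    min_samples_leaf=5,    # every leaf must keep at least this many training samples
    random_state=42,       # fixes tie-breaking between equally good splits, for reproducibility
)
clf.fit(X, y)

# Draw the fitted tree: one box per node showing its split rule, sample count, and class counts
plot_tree(clf, filled=True)
plt.show()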

Hey there,

I tried to recalculate the entropy (12:45) and entered the data into an online calculator as well as my own calculator. Unfortunately, the online calculator and I both get a different result (1.905) from the one you present in the video, with the exact same data and the same formula. So, are you sure about the result?

Basketstyler

15:10
from sklearn.model_selection import train_test_split

jjmondal

Awesome tutorial. Simple and Clear explanations.

csorex

Let me say thank you so much for this hugely valuable work. You make it so easy; you are a professional teacher.

hamzawi

Wow man, it's been an awesome session, learning decision trees as simply as possible 👏👏👏👌

guhanathanprathish

Thanks a lot for this video tutorial!!

west

Thank You! I received all the required data sets.

pphot