Naive Bayes Classifier | Naive Bayes Algorithm | Naive Bayes Classifier With Example | Simplilearn

This Naive Bayes Classifier tutorial video will introduce you to the basic concepts of the Naive Bayes classifier, the Naive Bayes algorithm, and Bayes' theorem in general. You will understand conditional probability, where the Naive Bayes classifier is used, and how the Naive Bayes algorithm works. By the end of this video, you will also implement the Naive Bayes algorithm for text classification in Python.

The topics covered in this Naive Bayes video are as follows:
00:00 - 01:06 Introduction and Agenda
01:06 - 05:45 What is Naive Bayes?
05:45 - 06:30 Why do we need Naive Bayes?
06:30 - 20:17 Understanding Naive Bayes Classifier
20:17 - 22:36 Advantages of Naive Bayes Classifier
22:36 - 43:45 Demo - Text Classification using Naive Bayes

#NaiveBayesClassifier #NaiveBayes #NaiveBayesAlgorithm #NaiveBayesInMachineLearning #NaiveBayesMachineLearning #NaiveBayesClassifierExample #MachineLearningAlgorithms #MachineLearning #Simplilearn

What is Naive Bayes Classifier?
Naive Bayes is a supervised learning algorithm based on applying Bayes' theorem with a "naive" independence assumption. Bayes' rule gives the formula for the probability of Y given X. The method is called naive because it assumes the features X are conditionally independent of each other given the class Y.

➡️ About Caltech Post Graduate Program In Data Science
This Post Graduate Program in Data Science builds on Caltech's academic eminence. The program covers critical Data Science topics like Python programming, R programming, Machine Learning, Deep Learning, and Data Visualization tools through an interactive learning model with live sessions by global practitioners and practical labs.

✅ Key Features
- Simplilearn's JobAssist helps you get noticed by top hiring companies
- Caltech PG program in Data Science completion certificate
- Earn up to 14 CEUs from Caltech CTME
- Masterclasses delivered by distinguished Caltech faculty and IBM experts
- Caltech CTME Circle membership
- Online convocation by Caltech CTME Program Director
- IBM certificates for IBM courses
- Access to hackathons and Ask Me Anything sessions from IBM
- 25+ hands-on projects from the likes of Amazon, Walmart, Uber, and many more
- Seamless access to integrated labs
- Capstone projects in 3 domains
- Simplilearn’s Career Assistance to help you get noticed by top hiring companies
- 8X higher interaction in live online classes by industry experts

✅ Skills Covered
- Exploratory Data Analysis
- Descriptive Statistics
- Inferential Statistics
- Model Building and Fine Tuning
- Supervised and Unsupervised Learning
- Ensemble Learning
- Deep Learning
- Data Visualization

🔥🔥 Interested in Attending Live Classes? Call Us: IN - 18002127688 / US - +18445327688
Comments

Can I just say how helpful you have been by making everything look so simple in Python? There are other YouTubers who just presume we are all qualified Data Scientists and know how to write Machine Learning functions from scratch.


You, on the other hand, teach us the basics of each ML topic and teach us how to implement ML on datasets SIMPLY!!!!

DJSHIM

THANK YOU SO MUCH,
YOUR EXPLANATION IS EASY TO FOLLOW AND UNDERSTAND.

juniorJxE

The Demo part was just awesome ... Thanks a lot for the effort

subtlethingsinlife

I must say your explanation is excellent compared to others. You're to the point and not unnecessarily stretching, as I've seen in other tutorials.
Also, please help me with the dataset.

slkslk

At 13:07, in the calculation slide, P(B|A) = P(Weekday | Buy) is wrong. It should be 9/24, not 2/6; 2/6 is P(Weekday | No Buy). Is my understanding correct?

sandyjust

1. For the first problem (12:00 into the lesson), P(No Buy | Weekday) can also be calculated by seeing that there are 11 visits on a weekday and 2 of them are No Buy. Therefore the answer is 2/11, which does match the answer via Bayes (if the Bayes calculation does not round the fractions).

2. In the example using Holiday, Discount = Yes and Delivery = Yes, I was surprised that P(No Buy | B) + P(Buy | B) does not equal 1. Given that these are the same condition, isn't the probability of Buy or No Buy 100%?

3. When you are doing the classification of articles, is it true that the likelihood tables have the article types as the row headers and the key words as the column headers, so there would be one very large likelihood table?

rzipper
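Point 1 in the comment above can be checked with exact fractions. The counts used here (30 total visits, 11 on weekdays, 6 "No Buy" overall, 2 "No Buy" on weekdays) are assumed from the commenters' reading of the video's table, not verified against it:

```python
# Direct counting vs. Bayes' rule for P(No Buy | Weekday).
# Counts are assumed from the comments above (not verified against the video).
from fractions import Fraction

total_visits = 30
weekday_visits = 11
no_buy_total = 6
weekday_no_buy = 2

# Direct count: 2 of the 11 weekday visits are "No Buy"
direct = Fraction(weekday_no_buy, weekday_visits)

# Bayes: P(NoBuy|Weekday) = P(Weekday|NoBuy) * P(NoBuy) / P(Weekday)
bayes = (Fraction(weekday_no_buy, no_buy_total)
         * Fraction(no_buy_total, total_visits)
         / Fraction(weekday_visits, total_visits))

print(direct, bayes)  # both 2/11: the two routes agree when fractions stay exact
```

Keeping exact fractions (rather than rounding 2/6 or 6/30 to decimals) is what makes the two calculations match, as the commenter suspected.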

16:27 Result is ≈0.02
17:16 Result is ≈0.98

19:15 Then the sum of probabilities is ≈1 (NOT 1.164, because the probabilities of two complementary events sum to 1)

Likelihood of Purchase: 0.98 / 1 = 98%
Likelihood of No Purchase: 0.02 / 1 = 2%

And then, as 98% > 2%, we can conclude that the customer will buy on a holiday with a discount and free delivery.

demydteslenko
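The normalization step the comment above describes can be sketched directly: the raw Naive Bayes scores need not sum to 1 because the shared denominator P(B) is dropped from the numerator-only calculation, and dividing each raw score by their sum restores proper probabilities. The raw scores below are illustrative values chosen to sum to the 1.164 quoted from the video, not the video's exact numbers:

```python
# Raw Naive Bayes scores are numerators only (likelihood * prior), so they
# need not sum to 1; the common denominator P(B) cancels when comparing.
# Dividing by the sum of the raw scores yields proper posteriors.
# These raw values are illustrative, chosen so they sum to 1.164.
raw = {"Buy": 0.986, "No Buy": 0.178}
total = sum(raw.values())                          # 1.164
posterior = {k: v / total for k, v in raw.items()}
print(posterior)  # normalized probabilities, summing to 1
```

After normalization the comparison is unchanged: the class with the larger raw score still has the larger posterior, which is why the classifier's decision does not depend on computing P(B).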

very clear explanation with great examples, thank you!

s.e.

Machine Learning is the future, and yours can begin today. Comment below with your email to get our latest Machine Learning Career Guide. Let your journey begin.



Do you have any questions on this topic? Please share your feedback in the comment section below and we'll have our experts answer it for you. Also, if you would like to have the dataset for implementing Naive Bayes Classifier in Python, please comment below and we will get back to you.
Thanks for watching the video. Cheers!

SimplilearnOfficial

TQVM for the Machine Learning series!!!

digigoliath

Heyy, thank you so much.
I am doing a project called Order Prediction.
My dataset has the following columns:
count, year, month, day, hour, working_day, weekend_day, public_holiday

How should I apply Naive Bayes to it?

shreyamangade

Your videos are amazing. Thanks! I have subscribed!

siyizheng

Hello @Simplilearn! I have become glued to your videos. Please can you kindly help with the dataset too, to this email? It will be greatly appreciated. Thanks in anticipation.

teejay

YOUR tutorials are awesome!!! Thankx for sharing. God bless u!!

shivamojha

Very good explanation. I have a question: at 19:36 the sum of probabilities is 1.164; how can a sum of probabilities be greater than 1?

fazalerabbi

The way you explain is amazing. Could you do me a favour and provide the dataset?

altafqureshi

Nice and useful! Just curious to ask a question. As far as I understand, in the shopping use case we used three features (Day, Free Delivery & Discount), and in the text categorization use case we used one feature (the text of the email). Is that really treated as one feature here? Or will the Naive Bayes algorithm internally create multiple features from the given input (the email text)?

HARIRAM-lisr

Just what I was looking for, thank you so much.

Mega

Hello, thank you for this interesting explanation.
Please, how do I get the dataset you used for this?
Thank you.

chiedudennis

Wonderful tutorial! Great work and thanks! :)

tymothylim