## 9 – SL NB 08 S Bayesian Learning 2 V1 V6

So let’s see. We have three spam emails and one of them contains the word ‘easy,’ which means the probability of an email containing the word ‘easy’ given that it’s spam is one-third. Since two out of the three spam emails contain the word ‘money,’ the probability of an email containing the word …
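The counting described here can be sketched in Python. The three emails below are hypothetical stand-ins for the spam set, chosen only so the counts match the excerpt (one contains ‘easy,’ two contain ‘money’):

```python
# Hypothetical spam emails, chosen so that 1 of 3 contains 'easy'
# and 2 of 3 contain 'money', matching the counts in the transcript.
spam_emails = [
    "easy money now",
    "send money today",
    "hello friend",
]

def word_likelihood(word, emails):
    """Fraction of emails whose words include the given word."""
    return sum(word in email.split() for email in emails) / len(emails)

p_easy_given_spam = word_likelihood("easy", spam_emails)    # 1/3
p_money_given_spam = word_likelihood("money", spam_emails)  # 2/3
```

These fractions are exactly the conditional probabilities P(‘easy’ | spam) and P(‘money’ | spam) the lesson is computing.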

## 8 – SL NB 07 Q Bayesian Learning 1 V1 V4

Now the question is, how do we use this wonderful Bayes theorem to do machine learning? And the answer is: repeatedly. Let’s look at this example, a spam email classifier. So let’s say we have some data in the form of a bunch of emails. Some of them are spam and some of them are …

## 7 – SL NB 06 S False Positives V1 V3

Well, let’s see. Let’s use Bayes theorem to calculate it. We’ll use the following notation: S will stand for sick, H will stand for healthy, and the plus sign will stand for testing positive. So since one out of every 10,000 people is sick, we get that P of S is 0.0001. Similarly, P of …
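The numbers set up here can be plugged straight into Bayes theorem. This sketch assumes, as the lesson does, that the 99 percent accuracy applies both to sick patients (who test positive) and to healthy patients (who test negative):

```python
# P(S): 1 in 10,000 people are sick.
p_sick = 0.0001
p_healthy = 1 - p_sick   # P(H)

# Assumed from the "99 percent accuracy" in the excerpt:
p_pos_given_sick = 0.99     # P(+|S): sick patients test positive
p_pos_given_healthy = 0.01  # P(+|H): healthy patients wrongly test positive

# Bayes theorem: P(S|+) = P(S) P(+|S) / [P(S) P(+|S) + P(H) P(+|H)]
p_sick_given_pos = (p_sick * p_pos_given_sick) / (
    p_sick * p_pos_given_sick + p_healthy * p_pos_given_healthy
)
print(round(p_sick_given_pos, 4))  # 0.0098
```

Despite the “99 percent accurate” test, a positive result means the patient is sick with probability under one percent, because healthy people vastly outnumber sick ones.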

## 6 – SL NB 05 Q False Positives V1 V2

Now, let’s look at an interesting application of Bayes Theorem. Let’s say we’re not feeling very well and we go to the doctor. The doctor says there’s a terrible disease going on; I’ll administer a test for you. Moreover, she says that the test has 99 percent accuracy. More specifically, she says that for every …

## 5 – SL NB 04 Bayes Theorem V1 V2

So, let’s look at a formal version of Bayes Theorem. Initially, we start with an event, and this event could be A or B. The probabilities for each are here, P of A, and P of B. Now, we observe a third event, and that event can either happen or not happen both for A …
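The excerpt cuts off before the formula itself. For two exhaustive events A and B and an observed third event (written here as R, a stand-in name), the form the lesson is building toward is:

```latex
P(A \mid R) = \frac{P(A)\,P(R \mid A)}{P(A)\,P(R \mid A) + P(B)\,P(R \mid B)}
```

The denominator is just P(R) expanded over the two ways R can occur, which is why the posteriors for A and B add to one.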

## 4 – SL NB 03 Guess The Person Now V1 V2

Bayes Theorem can get a little more complex. Let’s take a look at a small example and what we’ll do here is we’ll mess a bit with the prior probability. So again, we have Alex and Brenda in the office, and we saw someone pass by quickly and we don’t know who the person is. …

## 3 – SL NB 02 Known And Inferred V1 V2

In the last video, we saw an example of Bayes theorem. But here’s the main idea, and it’s a very powerful theorem. What it does is switch from what we know to what we infer. What we know in this case is the probability that Alex wears red and the probability that Brenda …

## 2 – SL NB 01 Guess The Person V1 V1

We’ll start with an example. Let’s say we’re in an office and there are two people, Alex and Brenda, and they’re both there the same amount of time. While we’re in the office, we see someone passing by really fast; we can’t tell who it is, but we’d like to take a guess. …

## 12 – MLND SL NB Solution Naive Bayes Algorithm

So the way to do this is to actually divide each one by the sum of both. This will make sure that they add to one. For the first one, we have one over 12 divided by one over 12 plus one over 40, which is 10 divided by 13. And for the second one, …
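The normalization step described here can be checked exactly with Python’s `fractions` module:

```python
from fractions import Fraction

# The two unnormalized posteriors from the transcript.
unnormalized = [Fraction(1, 12), Fraction(1, 40)]
total = sum(unnormalized)  # 13/120

# Divide each by the sum so the results add to one.
posteriors = [p / total for p in unnormalized]
print(posteriors)  # [Fraction(10, 13), Fraction(3, 13)]
```

Working in exact fractions confirms the arithmetic: (1/12) / (13/120) = 10/13, and the two posteriors sum to one.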

## 11 – MLND SL NB Naive Bayes Algorithm

Now, here’s where the word naive comes in Naive Bayes. We’re going to make a pretty naive assumption here. Let’s look at the probability of two events happening together, so P of A and B. We can also read this as P of A intersection B. And we’re going to say that this is the …
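The naive assumption being introduced is that P(A ∩ B) can be approximated by the product P(A) · P(B), as if A and B were independent, even when they may not be. A minimal sketch, with hypothetical per-word likelihoods (not from the excerpt’s data):

```python
# Hypothetical conditional probabilities for illustration:
p_a = 1/3   # e.g. P(contains 'easy' | spam)
p_b = 2/3   # e.g. P(contains 'money' | spam)

# The naive (independence) assumption: P(A and B) ~= P(A) * P(B).
p_a_and_b = p_a * p_b  # 2/9
```

This is what lets Naive Bayes handle many words at once: the joint likelihood of a whole email is just the product of the per-word likelihoods.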

## 10 – SL NB 09 Bayesian Learning 3 V1 V4

So, let’s do this calculation a bit more in detail. Since we have eight emails in total, and three of them are spam and five of them are non-spam, or ham, our prior probabilities are three over eight for spam and five over eight for ham. So, on to calculating the posteriors. Say we have …
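The prior-times-likelihood step can be sketched as follows. The priors 3/8 and 5/8 come from the excerpt; the per-word likelihoods are assumptions for illustration (1/3 of spam emails containing the word, and, say, 1/5 of ham emails), since the excerpt cuts off before giving them:

```python
from fractions import Fraction

# Priors from the transcript: 3 spam and 5 ham out of 8 emails.
p_spam = Fraction(3, 8)
p_ham = Fraction(5, 8)

# Assumed likelihoods of seeing a given word (hypothetical values):
p_word_given_spam = Fraction(1, 3)
p_word_given_ham = Fraction(1, 5)

# Unnormalized posteriors: prior times likelihood.
joint_spam = p_spam * p_word_given_spam  # 1/8
joint_ham = p_ham * p_word_given_ham     # 1/8
total = joint_spam + joint_ham

# Normalize so the posteriors add to one.
print(joint_spam / total, joint_ham / total)  # 1/2 1/2
```

With these particular numbers the evidence is balanced, so the posterior splits evenly; different likelihoods would tip it toward spam or ham.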

## 1 – Naive Bayes Intro V2

Hello again, and welcome to the Naive Bayes section. Naive Bayes is a probabilistic algorithm based on the concept of conditional probability. This algorithm has great benefits, such as being easy to implement and very fast to train. We’ll be studying one of its very interesting applications, natural language processing. In …