9 – SL NB 08 S Bayesian Learning 2 V1 V6

So let’s see. We have three spam emails and one of them contains the word ‘easy,’ which means the probability of an email containing the word ‘easy’ given that it’s spam is one-third. Since two out of the three spam emails contain the word ‘money,’ the probability of an email containing the word …
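The counts in this excerpt translate directly into conditional probabilities. A minimal sketch, assuming only the numbers stated in the lesson (three spam emails, one containing ‘easy’ and two containing ‘money’):

```python
# Conditional word probabilities from the lesson's counts.
# P(word | spam) = (# spam emails containing word) / (# spam emails)
from fractions import Fraction

spam_total = 3       # spam emails in the lesson
spam_with_easy = 1   # spam emails containing 'easy'
spam_with_money = 2  # spam emails containing 'money'

p_easy_given_spam = Fraction(spam_with_easy, spam_total)    # 1/3
p_money_given_spam = Fraction(spam_with_money, spam_total)  # 2/3

print(p_easy_given_spam, p_money_given_spam)  # 1/3 2/3
```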

7 – SL NB 06 S False Positives V1 V3

Well, let’s see. Let’s use Bayes’ theorem to calculate it. We’ll use the following notation: S will stand for sick, H will stand for healthy, and the plus sign will stand for testing positive. Since one out of every 10,000 people is sick, we get that P of S is 0.0001. Similarly, P of …
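The calculation set up here can be sketched in a few lines. This assumes the excerpt’s prior P(S) = 0.0001 and reads the 99 percent accuracy as: the test is positive for 99% of sick patients and negative for 99% of healthy ones (the exact reading the lesson uses may differ):

```python
# Bayes' theorem for the medical-test example:
# P(S | +) = P(+ | S) * P(S) / P(+)
p_s = 0.0001          # prior: 1 in 10,000 people is sick
p_h = 1 - p_s         # prior probability of being healthy
p_pos_given_s = 0.99  # sensitivity (assumed reading of "99% accuracy")
p_pos_given_h = 0.01  # false-positive rate (assumed)

# Total probability of testing positive, over both groups.
p_pos = p_pos_given_s * p_s + p_pos_given_h * p_h

p_s_given_pos = p_pos_given_s * p_s / p_pos
print(round(p_s_given_pos, 4))  # 0.0098
```

Under these assumptions, a positive test still leaves under a one percent chance of actually being sick, which is the surprise the lesson is building toward.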

6 – SL NB 05 Q False Positives V1 V2

Now, let’s look at an interesting application of Bayes’ theorem. Let’s say we’re not feeling very well and we go to the doctor. The doctor says there’s a terrible disease going around and that she’ll administer a test. Moreover, she says that the test has 99 percent accuracy. More specifically, she says that for every …

10 – SL NB 09 Bayesian Learning 3 V1 V4

So, let’s do this calculation in a bit more detail. Since we have eight emails in total, three of them spam and five of them non-spam (or ham), our prior probabilities are three over eight for spam and five over eight for ham. So, on to calculating the posteriors. Say we have …
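The prior-then-posterior step described above can be sketched as follows. The priors (3/8 spam, 5/8 ham) and the spam likelihood of ‘easy’ (1/3) come from the lesson; the ham likelihood used below is a hypothetical placeholder, since the excerpt cuts off before giving it:

```python
# Priors from the lesson's counts: 8 emails, 3 spam, 5 ham.
from fractions import Fraction

prior_spam = Fraction(3, 8)
prior_ham = Fraction(5, 8)

p_easy_given_spam = Fraction(1, 3)  # from the lesson's spam counts
p_easy_given_ham = Fraction(1, 5)   # hypothetical placeholder value

# Posterior = prior * likelihood, then normalize so the two sum to one.
unnorm_spam = prior_spam * p_easy_given_spam
unnorm_ham = prior_ham * p_easy_given_ham
norm = unnorm_spam + unnorm_ham

posterior_spam = unnorm_spam / norm
posterior_ham = unnorm_ham / norm
print(posterior_spam, posterior_ham)
```

Working with `Fraction` keeps the arithmetic exact, which matches how the lesson reasons with fractions like three over eight rather than decimals.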

1 – Naive Bayes Intro V2

Hello again, and welcome to the Naive Bayes section. Naive Bayes is a probabilistic algorithm based on the concept of conditional probability. This algorithm has great benefits, such as being easy to implement and very fast to train. We’ll be studying one of its very interesting applications: natural language processing. In …