9 – NLPND POS 08 Quiz How Many Paths Now V1

So, here we are, we have our diagram, which is called a trellis diagram. And in here, we’re going to find the path of the highest probability. Let’s recall the transition and emission probability tables. We’ll record the emission probabilities, but we won’t record the transition ones yet because they would look too messy. But …
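
To make the trellis concrete, here is a minimal sketch of it as a Python data structure, assuming the lesson’s noun/modal/verb tag set. The emission numbers are placeholders, since the excerpt doesn’t show the actual tables.

```python
# Minimal trellis sketch for "Jane will spot Will" with tags N, M, V.
# The emission values are placeholders, not the lesson's actual numbers.
words = ["Jane", "will", "spot", "Will"]
tags = ["N", "M", "V"]

# emission[tag][word] = P(word | tag)
emission = {
    "N": {"Jane": 0.2, "will": 0.0, "spot": 0.1, "Will": 0.2},
    "M": {"Jane": 0.0, "will": 0.7, "spot": 0.0, "Will": 0.0},
    "V": {"Jane": 0.0, "will": 0.0, "spot": 0.3, "Will": 0.0},
}

# One trellis column per word; each node pairs a tag with its emission probability.
trellis = [[(tag, emission[tag][word]) for tag in tags] for word in words]
for word, column in zip(words, trellis):
    print(word, column)
```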

8 – NLPND POS 07 Solution How Many Paths V1

Correct. The answer is 81. And the reason is that we have three possibilities for Jane, three for will, three for spot, and three for Will. Here are the 81 possibilities. So this is not so bad. We could check 81 products of four things and find the largest one. But what if we have …
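
Since 81 candidates is small, the brute-force check is easy to write down. A sketch, with placeholder transition and emission tables (the excerpt doesn’t include the real numbers); only the 3^4 = 81 enumeration and the product-of-four-factors scoring come from the excerpt:

```python
from itertools import product

words = ["Jane", "will", "spot", "Will"]
tags = ["N", "M", "V"]

# Placeholder tables: transition[prev][next] = P(next tag | prev tag),
# emission[tag][word] = P(word | tag). "<s>" marks the start state.
transition = {
    "<s>": {"N": 0.8, "M": 0.1, "V": 0.1},
    "N":   {"N": 0.1, "M": 0.6, "V": 0.3},
    "M":   {"N": 0.1, "M": 0.1, "V": 0.8},
    "V":   {"N": 0.8, "M": 0.1, "V": 0.1},
}
emission = {
    "N": {"Jane": 0.2, "will": 0.0, "spot": 0.1, "Will": 0.2},
    "M": {"Jane": 0.0, "will": 0.7, "spot": 0.0, "Will": 0.0},
    "V": {"Jane": 0.0, "will": 0.0, "spot": 0.3, "Will": 0.0},
}

def path_probability(path):
    prob, prev = 1.0, "<s>"
    for tag, word in zip(path, words):
        prob *= transition[prev][tag] * emission[tag][word]
        prev = tag
    return prob

# 3 tags ^ 4 words = 81 candidate paths; keep the most probable one.
paths = list(product(tags, repeat=len(words)))
assert len(paths) == 81
best = max(paths, key=path_probability)
print(best, path_probability(best))
```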

7 – NLPND POS 06 Quiz How Many Paths V1

So now, here is how the Hidden Markov Model generates a sentence. We start by placing ourselves at the start. And with some probability, we walk around the hidden states. Let’s say we walk into the N of noun. Once we’re at the N, then we have some options to generate observations. So let’s say …
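
A small sketch of that generative walk, with made-up probabilities: start at the start state, hop between hidden states according to the transition table, and emit a word at each state according to the emission table.

```python
import random

random.seed(0)

# Placeholder HMM: all transition and emission values are illustrative only.
# "<s>" is the start state, "<e>" the end state.
transition = {
    "<s>": {"N": 0.8, "M": 0.1, "V": 0.1},
    "N":   {"M": 0.4, "V": 0.3, "<e>": 0.3},
    "M":   {"V": 0.8, "N": 0.1, "<e>": 0.1},
    "V":   {"N": 0.7, "M": 0.1, "<e>": 0.2},
}
emission = {
    "N": {"Jane": 0.5, "Will": 0.5},
    "M": {"will": 1.0},
    "V": {"spot": 0.6, "see": 0.4},
}

def generate_sentence():
    state, words = "<s>", []
    while True:
        # Walk to the next hidden state according to the transition table.
        nxt = random.choices(list(transition[state]), transition[state].values())[0]
        if nxt == "<e>":
            return words
        # At each hidden state, emit an observation (a word).
        words.append(random.choices(list(emission[nxt]), emission[nxt].values())[0])
        state = nxt

print(generate_sentence())
```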

6 – NLPND POS 05 Hidden Markov Models V1

The idea for Hidden Markov Models is the following: Let’s say that a way of tagging the sentence “Jane will spot Will” is noun-modal-verb-noun and we’ll calculate a probability associated with this tagging. So we need two things. First of all, how likely is it that a noun is followed by a modal and a …
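
In symbols, the probability of a tagging t_1 … t_n for words w_1 … w_n multiplies one transition factor and one emission factor per word (t_0 is the start state); this is the standard HMM joint probability the excerpt is describing:

```latex
P(t_1,\dots,t_n,\ w_1,\dots,w_n) = \prod_{i=1}^{n} P(t_i \mid t_{i-1})\, P(w_i \mid t_i)
```

For the tagging noun-modal-verb-noun of “Jane will spot Will”, that is four transition factors times four emission factors.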

5 – NLPND POS 04 When Bigrams Wont Work V1

Okay. So now let’s complicate the problem a little bit. We still have Mary, Jane, and Will. And let’s say there’s a new member of the gang called Spot. And now our data is formed by the following four sentences: “Mary Jane can see Will.”, “Spot will see Mary.”, “Will Jane spot Mary?”, and “Mary …

4 – NLPND POS 03 Bigrams V1

Now, of course we can’t really expect the Lookup Table method to work all the time. Here’s a new example where the sentences are getting a bit more complicated. Now, our data is formed by the sentences “Mary will see Jane.”, “Will will see Mary.”, and “Jane will see Will.” And the tags are as …
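
The excerpt cuts off before the method itself, but one plausible sketch of a bigram table maps each pair of adjacent words to the tags seen for the second word; the N/M/V tags attached to the training sentences below are my assumption:

```python
from collections import Counter, defaultdict

# The three training sentences from the excerpt; the N/M/V tags are
# assumptions based on the lesson's noun, modal, verb tag set.
training = [
    [("Mary", "N"), ("will", "M"), ("see", "V"), ("Jane", "N")],
    [("Will", "N"), ("will", "M"), ("see", "V"), ("Mary", "N")],
    [("Jane", "N"), ("will", "M"), ("see", "V"), ("Will", "N")],
]

# Count how often each (previous word, word) pair carries each tag.
bigram_tags = defaultdict(Counter)
for sentence in training:
    prev = "<s>"
    for word, tag in sentence:
        bigram_tags[(prev, word.lower())][tag] += 1
        prev = word.lower()

# Tag a word by its most frequent tag given the previous word:
print(bigram_tags[("will", "will")])   # "will" after "Will" -> modal
print(bigram_tags[("<s>", "will")])    # "Will" at sentence start -> noun
```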

3 – NLPND POS 02 Lookup Table V1

So, let’s start with some friends. Let’s say we have our friend, Mary, our friend, Jane, and our friend, Will. And we want to tag the sentence: Mary saw Will. And to figure this out, we have some exciting data in the form of some sentences, which are “Mary saw Jane.” and “Jane saw Will.” …
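
A minimal sketch of the lookup-table method: count each word’s tags in the training data, then tag new words with their most frequent tag. The N/V tags below are assumptions consistent with the lesson’s tag set.

```python
from collections import Counter, defaultdict

# The two training sentences from the excerpt; the noun (N) and verb (V)
# tags attached here are assumptions.
training = [
    [("Mary", "N"), ("saw", "V"), ("Jane", "N")],
    [("Jane", "N"), ("saw", "V"), ("Will", "N")],
]

# Lookup table: for each word, count how often each tag appears.
counts = defaultdict(Counter)
for sentence in training:
    for word, tag in sentence:
        counts[word][tag] += 1

def tag_sentence(words):
    # Tag each word with its most frequent tag in the training data.
    return [(w, counts[w].most_common(1)[0][0]) for w in words]

print(tag_sentence(["Mary", "saw", "Will"]))
# [('Mary', 'N'), ('saw', 'V'), ('Will', 'N')]
```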

2 – NLPND POS 01 Intro V1

Okay, here’s the problem we’re going to solve in this section. We have a sentence, for example, Mary had a little lamb, and we need to figure out which words are nouns, verbs, adjectives, adverbs, et cetera. Now, a thorough knowledge of grammar is not needed in this section; we only need to know that words are …

14 – 4 Outro POS V1

Great job. In this section, you have learned about Hidden Markov Models and how to apply them to part-of-speech tagging. Now in the project, you will have the chance to apply this algorithm to a real data set of sentences to create your own part-of-speech tagger. Let’s go.

13 – NLPND POS 12 Viterbi Algorithm V2

In this video, we’ll develop the Viterbi algorithm in detail for this example. And if you’ve seen dynamic programming, that’s exactly what we’re doing. So here’s our trellis diagram with the two probability tables, emission and transition. Let’s draw this a bit larger with all the probabilities. Recall that on each node we have the …
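
For reference, here is a compact Viterbi sketch over a trellis like the one described: at each node we keep only the best path into it, which is the dynamic-programming step. The tables are placeholders, not the lesson’s numbers.

```python
words = ["Jane", "will", "spot", "Will"]
tags = ["N", "M", "V"]
transition = {
    "<s>": {"N": 0.8, "M": 0.1, "V": 0.1},
    "N":   {"N": 0.1, "M": 0.6, "V": 0.3},
    "M":   {"N": 0.1, "M": 0.1, "V": 0.8},
    "V":   {"N": 0.8, "M": 0.1, "V": 0.1},
}
emission = {
    "N": {"Jane": 0.2, "will": 0.0, "spot": 0.1, "Will": 0.2},
    "M": {"Jane": 0.0, "will": 0.7, "spot": 0.0, "Will": 0.0},
    "V": {"Jane": 0.0, "will": 0.0, "spot": 0.3, "Will": 0.0},
}

# best[i][t] = probability of the best path ending in tag t at word i;
# back[i][t] remembers which previous tag achieved it.
best = [{t: transition["<s>"][t] * emission[t][words[0]] for t in tags}]
back = [{}]
for i, word in enumerate(words[1:], start=1):
    best.append({})
    back.append({})
    for t in tags:
        prev = max(tags, key=lambda p: best[i - 1][p] * transition[p][t])
        best[i][t] = best[i - 1][prev] * transition[prev][t] * emission[t][word]
        back[i][t] = prev

# Walk the backpointers from the most probable final tag.
path = [max(tags, key=lambda t: best[-1][t])]
for i in range(len(words) - 1, 0, -1):
    path.append(back[i][path[-1]])
path.reverse()
print(list(zip(words, path)), best[-1][path[-1]])
```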

12 – NLPND POS 11 Viterbi Algorithm Idea V1

So in the last video, we saw a method of removing paths which brought our calculations down from 81 paths to four. But that was still a bit lucky and it required a lot of thinking. We want a better algorithm. So let’s think. Let’s look at only these two paths and let’s merge them. …

11 – NLPND POS 10 Solution Which Path Is More Likely V1

Correct. The answer is noun-modal-verb-noun. And to help us out, here are the probabilities for all the paths, where you can see that the highest one is the bottom-right one, which corresponds to noun-modal-verb-noun. Precisely this one; this is the winner.

10 – NLPND POS 09 Quiz Which Path Is More Likely V1

And the answer is four. And here are the four paths we have to check. They are noun-noun-noun-noun, noun-modal-noun-noun, noun-noun-verb-noun, and noun-modal-verb-noun. Now, let me write down the probabilities on the edges and the vertices. So, new quiz: which one of these gives …

1 – 3 Intro POS V1

Hi again. Welcome to the part-of-speech tagging section. In this section, we’ll study a very interesting problem, which consists of tagging sentences with their parts of speech, like noun, verb, adjective, et cetera. These types of models are particularly useful for applications like grammar or spelling checkers. We’ll use several methods to implement …

9 – SL NB 08 S Bayesian Learning 2 V1 V6

So let’s see. We have three spam emails and one of them contains the word ‘easy,’ which means the probability of an email containing the word ‘easy’ given that it’s spam is one-third. Since two out of the three spam emails contain the word ‘money,’ the probability of an email containing the word …
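
Those likelihoods are plain count ratios. A tiny sketch, with made-up email texts chosen to match the counts in the excerpt:

```python
from fractions import Fraction

# Toy data matching the excerpt: three spam emails, one containing 'easy'
# and two containing 'money'. The exact email texts are made up.
spam_emails = [
    "easy money now",
    "earn money fast",
    "meeting tomorrow",
]

def p_word_given_spam(word):
    # Fraction of spam emails that contain the word.
    hits = sum(word in email.split() for email in spam_emails)
    return Fraction(hits, len(spam_emails))

print(p_word_given_spam("easy"))   # 1/3
print(p_word_given_spam("money"))  # 2/3
```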

8 – SL NB 07 Q Bayesian Learning 1 V1 V4

Now the question is, how do we use this wonderful Bayes theorem to do machine learning? And the answer is: repeatedly. Let’s look at this example, a spam email classifier. So let’s say we have some data in the form of a bunch of emails. Some of them are spam and some of them are …
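
A minimal Naive Bayes sketch of the idea, assuming a 50/50 spam prior and made-up training emails; each word’s likelihood is a per-class count ratio (no smoothing), combined with Bayes’ theorem:

```python
# Made-up training data for illustration only.
spam = ["easy money now", "earn money fast", "win easy cash"]
ham = ["meeting at noon", "see the notes", "lunch tomorrow"]

def likelihood(word, emails):
    # Fraction of emails in the class containing the word (no smoothing).
    return sum(word in e.split() for e in emails) / len(emails)

def p_spam_given(word, prior_spam=0.5):
    # Bayes theorem: P(spam | word) is proportional to P(word | spam) P(spam).
    num = likelihood(word, spam) * prior_spam
    den = num + likelihood(word, ham) * (1 - prior_spam)
    return num / den if den else 0.0

print(p_spam_given("money"))  # 1.0 here, since 'money' never appears in ham
```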

7 – SL NB 06 S False Positives V1 V3

Well, let’s see. Let’s use Bayes theorem to calculate it. We’ll use the following notation: S will stand for sick, H will stand for healthy, and the plus sign will stand for testing positive. So since one out of every 10,000 people is sick, we get that P of S is 0.0001. Similarly, P of …
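
In code, with the numbers from the excerpt, and assuming the 99 percent accuracy applies to both sick and healthy people:

```python
# Numbers from the excerpt: 1 in 10,000 people is sick, and the test is
# 99 percent accurate; we assume that holds for sick and healthy alike.
p_sick = 0.0001
p_healthy = 1 - p_sick
p_pos_given_sick = 0.99
p_pos_given_healthy = 0.01

# Bayes theorem: P(S | +) = P(+ | S) P(S) / P(+)
p_pos = p_pos_given_sick * p_sick + p_pos_given_healthy * p_healthy
p_sick_given_pos = p_pos_given_sick * p_sick / p_pos
print(p_sick_given_pos)  # about 0.0098: under 1 percent despite a positive test
```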

6 – SL NB 05 Q False Positives V1 V2

Now, let’s look at an interesting application of Bayes Theorem. Let’s say we’re not feeling very well and we go to the doctor. The doctor says, “There’s a terrible disease going around; I’ll administer a test for you.” Moreover, she says that the test has 99 percent accuracy. More specifically, she says that for every …

5 – SL NB 04 Bayes Theorem V1 V2

So, let’s look at a formal version of Bayes Theorem. Initially, we start with an event, and this event could be A or B. The probabilities for each are here, P of A, and P of B. Now, we observe a third event, and that event can either happen or not happen both for A …
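
Writing R for the observed third event (the letter R is my choice; the excerpt doesn’t name it), the theorem the video builds up to reads:

```latex
P(A \mid R) = \frac{P(R \mid A)\, P(A)}{P(R \mid A)\, P(A) + P(R \mid B)\, P(B)}
```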

4 – SL NB 03 Guess The Person Now V1 V213

Bayes Theorem can get a little more complex. Let’s take a look at a small example and what we’ll do here is we’ll mess a bit with the prior probability. So again, we have Alex and Brenda in the office, and we saw someone pass by quickly and we don’t know who the person is. …
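
A small numeric sketch of how a skewed prior changes the posterior; the office-day prior and the likelihood of the observed clue (say, wearing red) below are made up for illustration:

```python
from fractions import Fraction

# Hypothetical numbers: Alex is in the office 2 days out of 3, Brenda 1 out
# of 3 (the prior); the glimpsed person wore red, which Alex wears 20% of
# the time and Brenda 80% (the likelihoods).
prior = {"Alex": Fraction(2, 3), "Brenda": Fraction(1, 3)}
p_red = {"Alex": Fraction(1, 5), "Brenda": Fraction(4, 5)}

# Bayes theorem: posterior proportional to prior times likelihood.
evidence = sum(prior[p] * p_red[p] for p in prior)
posterior = {p: prior[p] * p_red[p] / evidence for p in prior}
print(posterior)  # {'Alex': 1/3, 'Brenda': 2/3}: the skewed prior partly
                  # offsets Brenda's higher likelihood
```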