9 – NLPND POS 08 Quiz How Many Paths Now V1

So, here we are, we have our diagram, which is called a trellis diagram. And in here, we’re going to find the path of the highest probability. Let’s recall the transition and emission probability tables. We’ll record the emission probabilities, but we won’t record the transition ones yet because they would look too messy. But …
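
For concreteness, here is a minimal sketch of how those two tables and the trellis columns could be represented in Python. Every number below is a made-up placeholder rather than a value from the lesson’s tables, and the N/M/V labels are shorthand I’m assuming for noun, modal, and verb.

```python
# Hypothetical probability tables; the real values come from the lesson.
transition = {            # P(next tag | current tag)
    "<s>": {"N": 0.6, "M": 0.2, "V": 0.2},
    "N":   {"N": 0.1, "M": 0.5, "V": 0.4},
    "M":   {"N": 0.2, "M": 0.1, "V": 0.7},
    "V":   {"N": 0.8, "M": 0.1, "V": 0.1},
}
emission = {              # P(word | tag)
    "N": {"jane": 0.3, "will": 0.2, "spot": 0.1, "mary": 0.4},
    "M": {"will": 0.8, "can": 0.2},
    "V": {"spot": 0.4, "see": 0.5, "will": 0.1},
}

# One trellis column per word of "Jane will spot Will"; each node is a
# (tag, emission probability) entry for that word.
words = ["jane", "will", "spot", "will"]
trellis = [{tag: emission[tag].get(word, 0.0) for tag in ("N", "M", "V")}
           for word in words]
```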

8 – NLPND POS 07 Solution How Many Paths V1

Correct. The answer is 81. And the reason is that we have three possibilities for Jane, three for will, three for spot, and three for Will. Here are the 81 possibilities. So this is not so bad. We could check 81 products of four things and find the largest one. But what if we have …
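
A quick way to confirm the count: with three candidate tags for each of the four words, the candidate taggings are all combinations of three choices made four times, 3 × 3 × 3 × 3 = 81. A small sketch, using N/M/V as shorthand tag names:

```python
from itertools import product

tags = ("N", "M", "V")                   # three candidate tags per word
words = ["Jane", "will", "spot", "Will"]

paths = list(product(tags, repeat=len(words)))
print(len(paths))                        # 3 ** 4 == 81 candidate tag sequences
```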

7 – NLPND POS 06 Quiz How Many Paths V1

So now, here is how the Hidden Markov Model generates a sentence. We start by placing ourselves at the start state. And with some probability, we walk around the hidden states. Let’s say we walk into the N for noun. Once we’re at the N, we have some options to generate observations. So let’s say …
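
Here is a hedged sketch of that generative walk, assuming the transition and emission tables are nested dictionaries like the earlier sketch, and that the transition table also contains an end state "<e>" so the walk knows when to stop; the function and state names are mine, not the lesson’s.

```python
import random

def sample_from(dist):
    """Draw one outcome from a {outcome: probability} dictionary."""
    r, running = random.random(), 0.0
    for outcome, p in dist.items():
        running += p
        if r <= running:
            return outcome
    return outcome                        # guard against floating-point rounding

def generate_sentence(transition, emission, start="<s>", end="<e>"):
    """Walk the hidden states from the start, emitting one word per state,
    until the walk steps into the end state."""
    words, state = [], start
    while True:
        state = sample_from(transition[state])
        if state == end:
            return words
        words.append(sample_from(emission[state]))
```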

6 – NLPND POS 05 Hidden Markov Models V1

The idea behind Hidden Markov Models is the following: let’s say that one way of tagging the sentence “Jane will spot Will” is noun-modal-verb-noun, and we’ll calculate a probability associated with this tagging. So we need two things. First of all, how likely is it that a noun is followed by a modal and a …
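
A sketch of that calculation, assuming the tables are nested dictionaries keyed by tag and that a start state "<s>" and an end state "<e>" bracket the sentence; the function name and table layout are my assumptions, not details from the lesson.

```python
def tagging_probability(words, tags, transition, emission, start="<s>", end="<e>"):
    """Multiply, for each position, the probability that this tag follows the
    previous tag by the probability that this tag emits this word."""
    p, previous = 1.0, start
    for word, tag in zip(words, tags):
        p *= transition[previous].get(tag, 0.0)   # e.g. how likely a modal follows a noun
        p *= emission[tag].get(word, 0.0)         # e.g. how likely a modal emits "will"
        previous = tag
    return p * transition[previous].get(end, 0.0)

# With tables shaped like the earlier sketch:
# tagging_probability(["jane", "will", "spot", "will"], ["N", "M", "V", "N"],
#                     transition, emission)
```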

5 – NLPND POS 04 When Bigrams Wont Work V1

Okay. So now let’s complicate the problem a little bit. We still have Mary, Jane, and Will, and let’s say there’s a new member of the gang called Spot. And now our data is formed by the following four sentences: Mary Jane can see Will. Spot will see Mary. Will Jane spot Mary? And Mary …

4 – NLPND POS 03 Bigrams V1

Now, of course we can’t really expect the Lookup Table method to work all the time. Here’s a new example where the sentences are getting a bit more complicated. Now, our data is formed by the sentences “Mary will see Jane.”, “Will will see Mary.”, and “Jane will see Will.” And the tags are as …
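
One plausible way to build the bigram table is to key it on adjacent word pairs and store the most frequent tag pair seen for each bigram. Both that keying scheme and the noun/modal/verb tags attached to the three training sentences below are my assumptions for illustration, not details stated in the excerpt.

```python
from collections import Counter, defaultdict

def build_bigram_table(tagged_sentences):
    """Map each (word, next word) pair to its most frequent (tag, next tag) pair."""
    counts = defaultdict(Counter)
    for sentence in tagged_sentences:                    # [(word, tag), ...]
        for (w1, t1), (w2, t2) in zip(sentence, sentence[1:]):
            counts[(w1.lower(), w2.lower())][(t1, t2)] += 1
    return {pair: tag_pairs.most_common(1)[0][0]
            for pair, tag_pairs in counts.items()}

train = [
    [("Mary", "N"), ("will", "M"), ("see", "V"), ("Jane", "N")],
    [("Will", "N"), ("will", "M"), ("see", "V"), ("Mary", "N")],
    [("Jane", "N"), ("will", "M"), ("see", "V"), ("Will", "N")],
]
table = build_bigram_table(train)
print(table[("will", "will")])   # ('N', 'M'): the noun "Will" followed by the modal "will"
```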

3 – NLPND POS 02 Lookup Table V1

So, let’s start with some friends. Let’s say we have our friend Mary, our friend Jane, and our friend Will. And we want to tag the sentence: Mary saw Will. And to figure this out, we have some exciting data in the form of some sentences, which are “Mary saw Jane.” and “Jane saw Will.” …
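
A minimal sketch of the lookup-table method: count how often each word appears with each tag and keep the most frequent tag per word. The noun/verb tags on the two training sentences are filled in by me for illustration.

```python
from collections import Counter, defaultdict

def build_lookup_table(tagged_sentences):
    """Map each word to the tag it carries most often in the data."""
    counts = defaultdict(Counter)
    for sentence in tagged_sentences:
        for word, tag in sentence:
            counts[word.lower()][tag] += 1
    return {word: tags.most_common(1)[0][0] for word, tags in counts.items()}

# "Mary saw Jane." and "Jane saw Will.", with assumed noun (N) / verb (V) tags.
train = [
    [("Mary", "N"), ("saw", "V"), ("Jane", "N")],
    [("Jane", "N"), ("saw", "V"), ("Will", "N")],
]
lookup = build_lookup_table(train)
print([lookup[w.lower()] for w in "Mary saw Will".split()])   # ['N', 'V', 'N']
```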

2 – NLPND POS 01 Intro V1

Okay, here’s the problem we’re going to solve in this section. We have a sentence, for example, “Mary had a little lamb,” and we need to figure out which words are nouns, verbs, adjectives, adverbs, et cetera. Now, a thorough knowledge of grammar is not needed in this section; we only need to know that words are …

13 – NLPND POS 12 Viterbi Algorithm V2

In this video, we’ll develop the Viterbi algorithm in detail for this example. And if you’ve seen dynamic programming, that’s exactly what we’re doing. So here’s our trellis diagram with the two probability tables, emission and transition. Let’s draw this a bit larger with all the probabilities. Recall that on each node we have the …
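
Here is a hedged sketch of the dynamic-programming idea, written against transition and emission tables shaped like the earlier dictionaries and assuming the transition table has entries into an end state "<e>"; the variable names are mine, and the version developed in the video is the reference.

```python
def viterbi(words, tags, transition, emission, start="<s>", end="<e>"):
    """For each word and tag, keep only the most probable path ending in that
    tag, then walk the stored back-pointers to recover the best tagging."""
    # best[i][tag] = (probability of the best path over the first i+1 words
    #                 that ends in `tag`, the tag it came from)
    best = [{} for _ in words]
    for tag in tags:
        best[0][tag] = (transition[start].get(tag, 0.0)
                        * emission[tag].get(words[0], 0.0), None)
    for i in range(1, len(words)):
        for tag in tags:
            prob, came_from = max(
                ((best[i - 1][prev][0]
                  * transition[prev].get(tag, 0.0)
                  * emission[tag].get(words[i], 0.0), prev)
                 for prev in tags),
                key=lambda pair: pair[0])
            best[i][tag] = (prob, came_from)
    # Close each path with the transition into the end state, then backtrack.
    prob, last = max(((best[-1][tag][0] * transition[tag].get(end, 0.0), tag)
                      for tag in tags), key=lambda pair: pair[0])
    path = [last]
    for i in range(len(words) - 1, 0, -1):
        path.append(best[i][path[-1]][1])
    return list(reversed(path)), prob
```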

12 – NLPND POS 11 Viterbi Algorithm Idea V1

So in the last video, we saw a method of removing paths which brought our calculations down from 81 paths to four. But that was still a bit lucky and it required a lot of thinking. We want a better algorithm. So let’s think. Let’s look at only these two paths and let’s merge them. …
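
The key observation behind the merge can be shown with two made-up numbers: once two partial paths arrive at the same trellis node, only the more probable of the two can possibly be extended into the overall winner, so the other can be dropped on the spot.

```python
# Two hypothetical partial paths that reach the same node of the trellis.
paths_into_node = {
    ("N", "N", "V"): 0.0006,   # made-up probability of this partial path
    ("N", "M", "V"): 0.0084,   # made-up probability of this partial path
}
# Any completion of the weaker path is also a completion of the stronger one,
# at a probability that is never higher, so only the stronger one survives.
survivor = max(paths_into_node, key=paths_into_node.get)
print(survivor, paths_into_node[survivor])   # ('N', 'M', 'V') 0.0084
```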

10 – NLPND POS 09 Quiz Which Path Is More Likely V1

And the answer is four. And here are the four paths we have to check. They are noun-noun-noun-noun, noun-modal-noun-noun, noun-noun-verb-noun, and noun-modal-verb-noun. Now, let me write down the probabilities on the edges and the vertices. So, new quiz: which one of these gives …
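
To make the bookkeeping concrete, here is what checking one of those four paths looks like, with made-up numbers standing in for the probabilities written on the trellis:

```python
from math import prod

# Hypothetical probabilities for the path noun-modal-verb-noun over
# "Jane will spot Will"; the quiz uses the numbers shown on the trellis.
edges = [0.6, 0.5, 0.7, 0.8, 0.5]   # <s>->N, N->M, M->V, V->N, N-><e>
nodes = [0.3, 0.8, 0.4, 0.3]        # emissions of Jane, will, spot, Will

print(prod(edges) * prod(nodes))    # multiply everything along the path
# Repeat for the other three candidate paths and keep the largest product.
```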

1 – 3 Intro POS V1

Hi again. Welcome to the part-of-speech tagging section. In this section, we’ll study a very interesting problem, which consists of tagging sentences with their parts of speech, like noun, verb, adjective, et cetera. These types of models are particularly useful for applications like grammar or spelling checkers. We’ll use several methods to implement …