## 7 – Sakoe-Chiba Bounds

Let’s suppose we have two signals that really are not that similar. Dynamic time warping could allow them to match much better than they should. Look at this example where we use a different, shorter signal along with the same long one as in the last example. >> Okay. >> Well, I have to start … Read more
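The Sakoe-Chiba idea can be sketched as a band constraint on the classic DTW recurrence: alignment cells too far from the diagonal are simply never filled in, so dissimilar signals can't be warped into an artificially good match. The sequences, the `radius` parameter, and the absolute-difference cost below are illustrative assumptions, not the lesson's data:

```python
def dtw_band(a, b, radius=2):
    """DTW distance restricted to a Sakoe-Chiba band of the given radius."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        # Only cells within `radius` of the (length-scaled) diagonal are filled.
        center = i * m // n
        for j in range(max(1, center - radius), min(m, center + radius) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # step down
                                 D[i][j - 1],      # step right
                                 D[i - 1][j - 1])  # diagonal step
    return D[n][m]
```

With a small radius, a path that would need to wander far off the diagonal is forbidden, so the reported distance between genuinely different signals stays large.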

## 6 – Dynamic Time Warping

Okay, so all dynamic time warping is doing is trying to align the samples between two whistles we’re comparing so that they best match up. >> Yep, let’s line up the two signals we are trying to compare on the x and y-axes. That will allow us to more easily see how we are matching … Read more
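The sample-alignment idea described above can be sketched with the standard dynamic-programming recurrence: each cell holds the cheapest cumulative cost of matching the two prefixes, allowing samples to repeat so timing differences are absorbed. The toy sequences are made up for illustration:

```python
def dtw(a, b):
    """Unconstrained DTW distance between two 1-D sequences (sketch)."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # A sample may match the other signal's previous, current,
            # or repeated sample, which is what "warps" time.
            D[i][j] = cost + min(D[i - 1][j],
                                 D[i][j - 1],
                                 D[i - 1][j - 1])
    return D[n][m]
```

Lining the two signals up on the x- and y-axes, as the lesson suggests, is exactly this grid: the best alignment is the cheapest path from one corner to the other.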

## 5 – Euclidean Distance Not Sufficient

You always say do the simple thing first and add intelligence only if necessary. What happens if we just do Euclidean distance here? >> Let’s try it and find out. Here are some reasonable delta frequency numbers for the top and bottom graphs. >> The top time series has 21 samples, but the bottom one … Read more
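Why the simple approach falls short can be sketched directly: plain Euclidean distance requires equal-length series and always compares sample i to sample i, so even a one-sample delay of an otherwise identical signal scores badly. The pulse values below are made-up illustration data, not the lesson's delta-frequency numbers:

```python
import math

def euclidean(a, b):
    # Only defined when lengths match, and sample i is always
    # compared to sample i -- no tolerance for timing shifts.
    assert len(a) == len(b), "Euclidean distance needs equal-length series"
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Two copies of the same pulse, one delayed by a single sample:
pulse = [0, 0, 5, 5, 0, 0]
delayed = [0, 0, 0, 5, 5, 0]
```

Here `euclidean(pulse, delayed)` is large even though the shapes are identical, which is exactly the failure that motivates warping time.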

## 4 – Warping Time

That will help our recognizer handle the whistles no matter what frequency they start at. Next, let’s work on the time warping. >> Just to be clear, you’re talking about the problem where I could draw out your name, saying Thaaad, or say your name quickly, like Thad. >> Yep, that’s precisely the problem we … Read more
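One common way to get the start-frequency invariance mentioned above is to use frame-to-frame frequency differences (delta features) instead of absolute frequencies; a whistle transposed to a different starting pitch then produces the same feature sequence. The numbers below are made up for illustration:

```python
def delta(seq):
    # Frame-to-frame differences: transposing the whole whistle up or
    # down leaves this sequence unchanged.
    return [b - a for a, b in zip(seq, seq[1:])]

whistle = [100, 120, 150, 150, 130]          # frequencies over time (toy)
transposed = [f + 40 for f in whistle]       # same shape, higher start
```

Both contours yield the identical delta sequence, so the recognizer sees them as the same whistle shape.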

## 3 – Problems Matching Dolphin Whistles

That problem is going to be the focus for this lesson. We need to be able to handle classes of signals, where each example may be warped in time a bit differently. >> Okay, well, for features, can we just use the whistle frequencies through time? >> Normally, that would be a good idea, but … Read more

## 2 – Dolphin Whistles

Let’s start with a problem I’m currently exploring, dolphin communication. Here’s a spectrogram of a dolphin whistle. In actuality, dolphins have several types of vocalizations, including burst pulses and echolocation, but whistles are the easiest to see in a spectrogram. >> First, let’s talk about what a spectrogram is. On this spectrogram, the x-axis is … Read more
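The spectrogram described above (time on the x-axis, frequency on the y-axis) can be sketched in a few lines: split the signal into frames and take DFT magnitudes in each frame. Real tools add windowing and overlap; this bare-bones version and its test tone are assumptions for illustration:

```python
import cmath
import math

def spectrogram(signal, frame_len=8):
    # Each frame is one column of the spectrogram (one time slice);
    # each DFT magnitude within a frame is one frequency bin.
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, frame_len)]

    def dft_mag(frame):
        N = len(frame)
        return [abs(sum(frame[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                        for n in range(N)))
                for k in range(N // 2)]  # keep only non-negative frequencies

    return [dft_mag(f) for f in frames]

# A pure tone completing one cycle every 8 samples:
tone = [math.cos(2 * math.pi * n / 8) for n in range(16)]
S = spectrogram(tone)
```

For a pure tone, every column of `S` peaks in the same frequency bin, which is why a whistle shows up as a clean curve in a spectrogram.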

## 1 – Pattern Recognition through Time Intro

Thad, we’ve covered a lot of different machine learning algorithms. But what about situations where we have time series? >> You mean like speech recognition? >> Yeah, I know you’ve worked in sign language recognition and handwriting recognition. They seem to be similar problems. >> Yep, they are. Pattern recognition through time is one of … Read more

## 9 – NLPND POS 08 Quiz How Many Paths Now V1

So, here we are, we have our diagram, which is called a trellis diagram. And in here, we’re going to find the path of the highest probability. Let’s recall the transition and emission probability tables. We’ll record the emission probabilities, and we won’t record the transition ones yet because they would look too messy. But … Read more

## 8 – NLPND POS 07 Solution How Many Paths V1

Correct. The answer is 81. And the reason is that we have three possibilities for Jane, three for will, three for spot, and three for Will. Here are the 81 possibilities. So this is not so bad. We could check 81 products of four things and find the largest one. But what if we have … Read more
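The 3-choices-per-word counting argument can be checked by brute-force enumeration. Assuming the three tags are noun (N), modal (M), and verb (V), as in the lesson's example:

```python
from itertools import product

# Each of the four words in "Jane will spot Will" can carry any of
# three tags, so there are 3**4 = 81 candidate tag sequences.
tags = ["N", "M", "V"]
paths = list(product(tags, repeat=4))
```

For a 20-word sentence this would already be 3**20 (about 3.5 billion) paths, which is why the lesson soon moves to a smarter algorithm.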

## 7 – NLPND POS 06 Quiz How Many Paths V1

So now, here is how the Hidden Markov Model generates a sentence. We start by placing ourselves at the start. And with some probability, we walk around the hidden states. Let’s say we walk into the N of noun. Once we’re at the N, then we have some options to generate observations. So let’s say … Read more
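The generative walk described above (move between hidden states, emit a word at each one) can be sketched as follows. The probability tables are made-up toy values, not the lesson's:

```python
import random

# Transition probabilities between hidden tags ("<s>"/"</s>" mark start/end)
transitions = {"<s>": {"N": 1.0},
               "N": {"M": 0.8, "</s>": 0.2},
               "M": {"V": 1.0},
               "V": {"N": 1.0}}
# Emission probabilities: which word each hidden tag generates
emissions = {"N": {"Jane": 0.5, "Will": 0.5},
             "M": {"will": 1.0},
             "V": {"spot": 1.0}}

def pick(dist):
    """Sample a key from a {key: probability} dict."""
    return random.choices(list(dist), weights=list(dist.values()))[0]

def generate(max_len=10):
    state, words = "<s>", []
    for _ in range(max_len):
        state = pick(transitions[state])   # walk the hidden states
        if state == "</s>":
            break
        words.append(pick(emissions[state]))  # emit an observation
    return words
```

Every run produces a grammatical-looking toy sentence such as "Jane will spot Will", because the words are observed but the tag path that produced them stays hidden.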

## 6 – NLPND POS 05 Hidden Markov Models V1

The idea for Hidden Markov Models is the following: Let’s say that a way of tagging the sentence “Jane will spot Will” is noun-modal-verb-noun and we’ll calculate a probability associated with this tagging. So we need two things. First of all, how likely is it that a noun is followed by a modal and a … Read more
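Scoring the tagging noun-modal-verb-noun works by multiplying exactly those two kinds of numbers: transition probabilities between consecutive tags and emission probabilities of each word given its tag. The probability values below are toy assumptions, not the lesson's tables:

```python
# Toy transition probabilities, including sentence start "<s>" and end "</s>"
transitions = {("<s>", "N"): 0.75, ("N", "M"): 0.4, ("M", "V"): 0.8,
               ("V", "N"): 1.0, ("N", "</s>"): 0.5}
# Toy emission probabilities: P(word | tag)
emissions = {("N", "Jane"): 0.2, ("M", "will"): 0.75,
             ("V", "spot"): 0.5, ("N", "Will"): 0.3}

def score(words, tags):
    """Probability of one tagging: product of transitions and emissions."""
    p = transitions[("<s>", tags[0])]
    for i, (w, t) in enumerate(zip(words, tags)):
        p *= emissions[(t, w)]
        if i + 1 < len(tags):
            p *= transitions[(tags[i], tags[i + 1])]
    p *= transitions[(tags[-1], "</s>")]
    return p

p = score(["Jane", "will", "spot", "Will"], ["N", "M", "V", "N"])
```

Each candidate tagging gets such a product, and the tagger picks the one with the highest value.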

## 5 – NLPND POS 04 When Bigrams Wont Work V1

Okay. So now let’s complicate the problem a little bit. We still have Mary, Jane, and Will. And let’s say there’s a new member of the gang called Spot. And now our data is formed by the following four sentences: “Mary Jane can see Will.” “Spot will see Mary.” “Will Jane spot Mary?” And “Mary … Read more

## 4 – NLPND POS 03 Bigrams V1

Now, of course we can’t really expect the Lookup Table method to work all the time. Here’s a new example where the sentences are getting a bit more complicated. Now, our data is formed by the sentences: “Mary will see Jane.” “Will will see Mary.” And “Jane will see Will.” And the tags are as … Read more
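The bigram idea can be sketched as a lookup table keyed on the pair (previous word, word) instead of the word alone, which is what disambiguates the two uses of "will". The tag sequences assigned to the training sentences below are an assumption (noun-modal-verb-noun), since the excerpt cuts off before listing them:

```python
from collections import Counter, defaultdict

# The three training sentences from the example, with assumed tags.
training = [
    (["Mary", "will", "see", "Jane"], ["N", "M", "V", "N"]),
    (["Will", "will", "see", "Mary"], ["N", "M", "V", "N"]),
    (["Jane", "will", "see", "Will"], ["N", "M", "V", "N"]),
]

# Count how often each tag follows each (previous word, word) bigram.
counts = defaultdict(Counter)
for words, tags in training:
    prev = "<s>"
    for w, t in zip(words, tags):
        counts[(prev, w)][t] += 1
        prev = w

def tag_bigram(sentence):
    out, prev = [], "<s>"
    for w in sentence:
        out.append(counts[(prev, w)].most_common(1)[0][0])
        prev = w
    return out
```

Note the weakness previewed by the lesson title that follows: any bigram never seen in training has an empty counter, so this tagger simply fails on new word pairs.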

## 3 – NLPND POS 02 Lookup Table V1

So, let’s start with some friends. Let’s say we have our friend, Mary, our friend, Jane, and our friend, Will. And we want to tag the sentence: Mary saw Will. And to figure this out, we have some exciting data in the form of some sentences, which are “Mary saw Jane.”, and “Jane saw Will.” … Read more
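The lookup-table approach can be sketched as: count which tag each word carried in the training sentences, then always assign the most frequent one. The tags (N for noun, V for verb) are an assumption modeled on the example, since the excerpt cuts off before the table itself:

```python
from collections import Counter, defaultdict

# Training data from the example: two tagged sentences.
training = [
    (["Mary", "saw", "Jane"], ["N", "V", "N"]),
    (["Jane", "saw", "Will"], ["N", "V", "N"]),
]

# Count tag occurrences per word, then keep each word's most common tag.
counts = defaultdict(Counter)
for words, tags in training:
    for w, t in zip(words, tags):
        counts[w][t] += 1
table = {w: c.most_common(1)[0][0] for w, c in counts.items()}

def tag(sentence):
    return [table[w] for w in sentence]
```

On "Mary saw Will" this works perfectly, because every word appeared in training with exactly one tag; the next lessons show where it breaks down.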

## 2 – NLPND POS 01 Intro V1

Okay, here’s the problem we’re going to solve in this section. We have a sentence, for example, Mary had a little lamb, and we need to figure out which words are nouns, verbs, adjectives, adverbs, et cetera. Now, a thorough knowledge of grammar is not needed in this section; we only need to know that words are … Read more

## 14 – 4 Outro POS V1

Great job. In this section, you have learned about Hidden Markov Models and how to apply them to part-of-speech tagging. Now, in the project, you will have the chance to apply this algorithm to a real data set of sentences to create your own part-of-speech tagger. Let’s go.

## 13 – NLPND POS 12 Viterbi Algorithm V2

In this video, we’ll develop the Viterbi algorithm in detail for this example. And if you’ve seen dynamic programming, that’s exactly what we’re doing. So here’s our trellis diagram with the two probability tables, emission and transition. Let’s draw this a bit larger with all the probabilities. Recall that on each node we have the … Read more
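The dynamic-programming step on the trellis can be sketched as: for each word and each tag, keep only the highest-probability path reaching that tag, then extend. The probability tables below are made-up toy values, not the lesson's exact numbers:

```python
def viterbi(words, tags, start_p, trans_p, emit_p):
    # best[t] = (probability, path) of the best path ending in tag t
    best = {t: (start_p[t] * emit_p[t].get(words[0], 0.0), [t]) for t in tags}
    for w in words[1:]:
        nxt = {}
        for t in tags:
            # Merge all incoming paths: keep only the most probable one.
            p, path = max(((best[s][0] * trans_p[s][t], best[s][1])
                           for s in tags), key=lambda x: x[0])
            nxt[t] = (p * emit_p[t].get(w, 0.0), path + [t])
        best = nxt
    return max(best.values(), key=lambda x: x[0])

# Toy tables for tags noun (N), modal (M), verb (V):
tags = ["N", "M", "V"]
start_p = {"N": 0.8, "M": 0.1, "V": 0.1}
trans_p = {"N": {"N": 0.1, "M": 0.6, "V": 0.3},
           "M": {"N": 0.1, "M": 0.1, "V": 0.8},
           "V": {"N": 0.8, "M": 0.1, "V": 0.1}}
emit_p = {"N": {"Jane": 0.4, "Will": 0.3, "spot": 0.1},
          "M": {"will": 0.9},
          "V": {"spot": 0.7, "see": 0.2}}

prob, path = viterbi(["Jane", "will", "spot", "Will"],
                     tags, start_p, trans_p, emit_p)
```

Because each column of the trellis keeps just one survivor per tag, the work grows linearly with sentence length instead of exponentially like the 81-path enumeration.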

## 12 – NLPND POS 11 Viterbi Algorithm Idea V1

So in the last video, we saw a method of removing paths which brought our calculations down from 81 paths to four. But that was still a bit lucky and it required a lot of thinking. We want a better algorithm. So let’s think. Let’s look at only these two paths and let’s merge them. … Read more

## 11 – NLPND POS 10 Solution Which Path Is More Likely V1

Correct. The answer is noun-modal-verb-noun. And to help us out, here are the probabilities for all the paths where you can see that the highest one is the bottom right one, which corresponds to noun-modal-verb-noun. Precisely this one, this is the winner.

## 10 – NLPND POS 09 Quiz Which Path Is More Likely V1

And the answer is four. And here are the four paths we have to check. They are noun noun noun noun, noun modal noun noun, noun noun verb noun, and noun modal verb noun. Now, let me write down the probabilities in the edges and the vertices. So new quiz. Which one of these gives … Read more