## 9 – I vs We Quiz Solution

Here’s the answer: it could be the probability distributions in the middle states, as well as the likely time spent in the middle states.

## 8 – I vs We Quiz

What property of the observed sequences of delta_ys can help tell the difference between the two gestures? The probability distributions with respect to the starting states, the probability distributions in the middle states, the likely time spent in the middle states, or none of the above? Select all answers that could apply.

## 7 – HMM: _We_

Great, now here’s the HMM I created for the gesture _We_. >> Hold on. I would have used four states here. Why did you only use three? >> Well, it was mostly to simplify the problem for our purposes. Note that the middle section varies a little bit more in delta y than with …

## 6 – HMM: _I_

Okay, I’ve made an HMM for the sign language word _I_. >> Great, how did you pick those states? >> Well, the gesture seemed like it had three separate motions. So I made each of those its own state and chose the transition probabilities based on the timing. >> We can take a look at the …

## 5 – Delta-y Quiz Solution

Here’s the answer.

## 4 – Delta-y Quiz

Here are several plots of y versus t. Given these plots, match each y versus t plot with its derivative plot, delta y versus t.

## 31 – Baum-Welch

So what’s next? >> A process called Baum-Welch re-estimation. >> That’s like expectation-maximization again, right? >> Correct. >> But how does it differ from what we just did? >> It’s very similar, but with Baum-Welch, every sample of the data contributes to every state proportionally to the probability of that frame of data …
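The idea that "every frame contributes to every state proportionally" corresponds to the per-frame state posteriors (often written gamma) computed by the forward-backward procedure, which is the E-step of Baum-Welch. Here is a minimal sketch for a discrete HMM; the model numbers are illustrative only, not values from the lesson.

```python
import numpy as np

def state_posteriors(start, trans, emit, obs):
    """gamma[t, i] = P(hidden state i at time t | entire observation sequence).

    These "soft counts" are what Baum-Welch re-estimation uses: every frame
    contributes to every state's updated parameters in proportion to gamma.
    """
    N, T = len(start), len(obs)
    alpha = np.zeros((T, N))               # forward probabilities
    beta = np.zeros((T, N))                # backward probabilities
    alpha[0] = start * emit[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ trans) * emit[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = trans @ (emit[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Toy 3-state left-to-right model (illustrative numbers, not the course's)
start = np.array([1.0, 0.0, 0.0])
trans = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.5, 0.5],
                  [0.0, 0.0, 1.0]])
emit = np.array([[0.9, 0.05, 0.05],
                 [0.05, 0.9, 0.05],
                 [0.05, 0.05, 0.9]])
gamma = state_posteriors(start, trans, emit, obs=[0, 1, 2])
```

Unlike the single Viterbi path used earlier, every state gets a nonzero share of each frame wherever the model allows it, which is exactly the difference the dialogue is pointing at.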

## 30 – HMM Training

When we started this lesson, we created our models by inspection; however, most of the time we want to train using the data itself. When using HMMs for gesture recognition, I like to have at least 12 examples for each gesture I’m trying to recognize, and five examples at a minimum. >> For illustration purposes let’s …

## 3 – Sign Language Recognition

We will use sign language recognition as our first application of HMMs. For example, let’s consider the signs _I_ and _We_ and create HMMs for each of them. Here’s _I_. _We_ is a little different. Let’s focus on the _I_ gesture. We’ll use delta y as our first feature here. >> Wait a …
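The delta y feature here is just the frame-to-frame change in the hand's vertical position. A quick sketch of extracting it with NumPy, using a made-up y trace rather than real tracking data:

```python
import numpy as np

# y position of the hand over time (toy trace: move up, hold, move down)
y = np.array([0.0, 1.0, 2.0, 2.0, 2.0, 1.0, 0.0])

# delta y: the frame-to-frame difference, i.e. a discrete derivative of y(t)
delta_y = np.diff(y)
print(delta_y)  # [ 1.  1.  0.  0. -1. -1.]
```

Positive values correspond to upward motion, zeros to holding still, and negative values to downward motion, which is what the three-state gesture models key on.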

## 29 – New Observation Sequence for _We_ Solution

Here’s the resulting probability for _We_: 2.91 × 10⁻⁵. Note that this answer is higher than what we got for the model of _I_, indicating that this observation sequence probably came from a _We_ gesture. This is a different result from what we saw previously, showing how the additional time spent in …

## 28 – New Observation Sequence for _We_

Now let’s do the same thing for _We_. We have the same observation sequence as in the previous quiz, where the middle zero was replaced with the sequence negative one, zero, and one. We’ve given you new probabilities for _We_. So go ahead and tell us the probability of this observation sequence given our model for …

## 27 – New Observation Sequence for _I_ Solution

Here’s the answer. By multiplying all the transition and output probabilities along the curvy path through the new trellis, we get the resulting probability for _I_: 1.42 × 10⁻⁵.
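The arithmetic in this solution is just a running product along one path: at each step, multiply in the transition probability into that state and the output probability of the observation emitted there. A minimal sketch with made-up probabilities, not the actual trellis values from the video:

```python
# Probability of one specific path through a trellis.
# These numbers are illustrative only, not the course's trellis values.
transitions = [1.0, 0.5, 0.5, 0.5]   # P(entering each state along the path)
outputs     = [0.7, 0.8, 0.8, 0.6]   # P(observation | state) at each step

p = 1.0
for a, b in zip(transitions, outputs):
    p *= a * b          # multiply in one transition and one emission

print(p)
```

With real gesture models these products shrink quickly (hence results like 10⁻⁵), which is why practical implementations usually work with log probabilities instead.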

## 26 – New Observation Sequence for _I_

Let’s look at a new observation sequence. We’ve replaced the middle 0 observation with a new sequence: -1, 0, and 1. Given these probabilities, can you tell us the probability of this observation sequence, given the model for _I_?

## 25 – Which Gesture is Recognized?

So it looks like it’s a lot more probable that the model for _I_ generated this data. >> Yep. The main difference between the values for the models producing this observation sequence has to do with the middle state. Remember that we used delta y even though it is a relatively bad feature for distinguishing …

## 24 – _We_ Viterbi Path Solution

Here’s the most likely path through the trellis. Notice that it’s very similar to the path for _I_, but the probability is much smaller.

## 23 – _We_ Viterbi Path

Finally, we need to determine the most likely sequence through the trellis. Check the boxes to indicate the best path, and then fill in the probability of that path here.
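The "best path" this quiz asks for is what the Viterbi algorithm computes by dynamic programming over the trellis. Here is a minimal sketch for a discrete HMM; the toy left-to-right model at the bottom uses illustrative numbers, not the trellis from the video.

```python
import numpy as np

def viterbi(start, trans, emit, obs):
    """Most likely state sequence and its probability for a discrete HMM.

    start: (N,) initial probs; trans: (N,N) transition probs;
    emit: (N,M) output probs; obs: list of observation symbol indices.
    """
    N, T = len(start), len(obs)
    prob = np.zeros((N, T))             # best path probability ending in state i at time t
    back = np.zeros((N, T), dtype=int)  # backpointers for path recovery
    prob[:, 0] = start * emit[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = prob[:, t - 1] * trans[:, j]
            back[j, t] = np.argmax(scores)
            prob[j, t] = scores[back[j, t]] * emit[j, obs[t]]
    # Trace the backpointers from the best final state
    path = [int(np.argmax(prob[:, -1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[path[-1], t]))
    return path[::-1], prob[:, -1].max()

# Toy 3-state left-to-right model (illustrative numbers only)
start = np.array([1.0, 0.0, 0.0])
trans = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.5, 0.5],
                  [0.0, 0.0, 1.0]])
emit = np.array([[0.9, 0.05, 0.05],
                 [0.05, 0.9, 0.05],
                 [0.05, 0.05, 0.9]])
path, p = viterbi(start, trans, emit, obs=[0, 1, 2])
print(path, p)  # [0, 1, 2] 0.18225
```

Checking boxes in the trellis and multiplying along them, as the quiz has you do, is exactly this computation carried out by hand.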

## 22 – _We_ Output Probabilities Quiz Solution

Here’s the answer.

## 21 – _We_ Output Probabilities Quiz

In the last quiz, you looked at the transition probabilities. Now, let’s consider the output probabilities. We filled out some of the probabilities to get you started. Choose from these answers and fill out the remaining nodes in the trellis.

## 20 – _We_ Transition Probabilities Quiz Solution

Here is the answer. Note that the main difference between _I_ and _We_ is in the transitions for state two.

## 2 – HMM Representation

We should probably go over how to represent an HMM. >> Right, the Russell and Norvig book draws them like a Markov chain and adds an output node for each state. In this representation, which is common in the machine learning community, each X_i represents a frame of data. X_0 is the beginning state, which …
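Whatever the diagram style, an HMM boils down to three sets of numbers: initial state probabilities, a transition matrix, and per-state output probabilities. One minimal way to hold them is a small container like the following; the 3-state model and its numbers are illustrative only, not values from the lesson.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class DiscreteHMM:
    """Minimal HMM: N hidden states emitting symbols from a finite alphabet."""
    start: np.ndarray   # (N,)   initial state probabilities
    trans: np.ndarray   # (N, N) trans[i, j] = P(state j at t+1 | state i at t)
    emit: np.ndarray    # (N, M) emit[i, k]  = P(observing symbol k | state i)

# A toy 3-state left-to-right model in the spirit of the gesture HMMs above
hmm = DiscreteHMM(
    start=np.array([1.0, 0.0, 0.0]),
    trans=np.array([[0.5, 0.5, 0.0],
                    [0.0, 0.5, 0.5],
                    [0.0, 0.0, 1.0]]),
    emit=np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.1, 0.2, 0.7]]),
)

# Sanity checks: each row of trans and emit is a probability distribution
assert np.allclose(hmm.trans.sum(axis=1), 1.0)
assert np.allclose(hmm.emit.sum(axis=1), 1.0)
```

The zeros above the diagonal in `trans` encode the left-to-right structure used for gestures: states are visited in order and never revisited.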