## 8 – Normalizing 2

And now, finally, we come up with the actual posterior, whereas this one over here is often called the joint probability of the two events. And the posterior is obtained by dividing this guy over here by this normalizer. So let’s do this over here: let’s divide this guy over here by this normalizer to get my …
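The whole pipeline described above, joints, normalizer, then posteriors, can be sketched with the cancer-test numbers that appear later in this lesson (prior 0.01, sensitivity 0.9, false-positive rate 0.1):

```python
# Joint probabilities, using the lesson's cancer-test numbers:
joint_cancer = 0.01 * 0.9      # P(C) * P(Pos | C)  = 0.009
joint_no_cancer = 0.99 * 0.1   # P(~C) * P(Pos | ~C) = 0.099

# The normalizer is the total probability of a positive test.
normalizer = joint_cancer + joint_no_cancer           # 0.108

# Divide each joint by the normalizer to get posteriors that sum to 1.
p_cancer_given_pos = joint_cancer / normalizer        # ~0.0833
p_no_cancer_given_pos = joint_no_cancer / normalizer  # ~0.9167

print(round(p_cancer_given_pos, 4), round(p_no_cancer_given_pos, 4))
```

Note that the two posteriors keep the same ratio as the joints but now add up to 1, which is exactly what the normalization step is for.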

## 7 – Normalizing 1 Solution

And, yes, the answer is 0.108. Technically, what this really means is the probability of a positive test result–that’s the area in the circle that I just marked. By virtue of what we learned earlier, it’s just the sum of the two terms over here, which gives us 0.108.

## 6 – Normalizing 1

The normalization proceeds in two steps. We just normalize these guys to keep their ratio the same but make sure they add up to 1. So let’s first compute the sum of these two guys. Please let me know what it is.

## 59 – Bayes Rule Conclusion

Great job finishing this lesson on Bayes rule. At this point, you’ve now gained a ton of valuable knowledge about probability, conditional probability and Bayes rule. To reinforce your understanding of these topics, let’s go through some probability practice using Python. You’ll do this in the next lesson.

## 58 – Using Sensor Data

Once we gather sensor data about the car’s surroundings and its movement, we can then use this information to improve our initial location prediction. For example, say we sense lane markers and specific terrain, and we say, hmm. Actually, we know from previously collected data that if we sense lane lines close to the sides of …

## 57 – Bayes’ Rule and Robotics

Bayes rule is extremely important in robotics, and it can be described in one sentence: given an initial prediction, if we gather additional related data, data that our initial prediction depends on, we can improve that prediction. For example, let’s say our initial prediction, also known as a prior belief, is an estimate of a …

## 56 – Reducing Uncertainty

Let’s talk a bit more about why uncertainty is so important in the field of robotics and self-driving cars. We know that quantities like the speed, the direction, and the location of a car are challenging to measure, and we can’t measure them perfectly. There’s some uncertainty in each of these measurements. We also know …

## 55 – Sebastian At Home Solution

And I get 0.0217, which is a really small number. The way I get there is by taking the probability of being home times the probability of rain at home, and normalizing it by that same number plus the corresponding term for being gone, which is 0.6 times the probability of rain given that I’ve been gone …
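The calculation described above can be sketched in a few lines. The transcript truncates the actual rain probabilities, so the two values marked ASSUMED below are hypothetical numbers chosen only for illustration (they happen to reproduce the quoted 0.0217):

```python
# Known from the transcript: gone 60% of the time.
p_gone = 0.6
p_home = 1.0 - p_gone    # 0.4

# ASSUMED values (the transcript truncates the real problem setup):
p_rain_home = 0.01       # chance of rain given I'm home
p_rain_gone = 0.3        # chance of rain given I'm traveling

# Joint probabilities of (location, rain).
joint_home = p_home * p_rain_home   # 0.004
joint_gone = p_gone * p_rain_gone   # 0.18

# Posterior probability of being home, given that it is raining.
p_home_given_rain = joint_home / (joint_home + joint_gone)
print(round(p_home_given_rain, 4))  # 0.0217
```

Under these assumed numbers, rain is strong evidence that Sebastian is away, which is why the posterior comes out so small.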

## 54 – Sebastian At Home

This test is actually directly taken from my life and you’ll smile when you see my problem. I used to travel a lot. It was so bad for a while. I would find myself in a bed not knowing what country I’m in. I kid you not. So let’s say, I’m gone 6 …

## 53 – Generalizing

So what have you learned? In Bayes Rule, there can be more than just the two underlying causes, cancer and non-cancer. There might be 3, 4, or 5, any number. We can apply exactly the same math, but we have to keep track of more values. In fact, the robot might also have more than just …
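The generalization described above can be sketched as one small function (the function name is mine; the demo numbers are the three-cell robot world from the sensing quizzes in this lesson):

```python
def bayes_update(priors, likelihoods):
    """Return posteriors over any number of hypotheses.

    priors[i]      -- prior belief in hypothesis i
    likelihoods[i] -- probability of the observed data under hypothesis i
    """
    joints = [p * l for p, l in zip(priors, likelihoods)]
    normalizer = sum(joints)            # total probability of the data
    return [j / normalizer for j in joints]

# Three-cell robot world: uniform prior, the robot sees red,
# and only Cell A is likely to look red (0.9 vs. 0.1 for B and C).
posteriors = bayes_update([1/3, 1/3, 1/3], [0.9, 0.1, 0.1])
print([round(p, 3) for p in posteriors])  # [0.818, 0.091, 0.091]
```

The same function works unchanged for 4, 5, or any number of hypotheses; only the lengths of the two input lists grow.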

## 52 – Robot Sensing 8 Solution

As usual, we divide this guy over here by the normalizer, which gives us 0.818. Realize that all these numbers are a little bit approximate here. Same for this guy: it’s approximately 0.091. And this is completely symmetrical, 0.091. And, surprise, these guys all add up to 1.
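A quick check of the division above. The two 0.033 joints for Cells B and C come from the earlier quizzes in this sequence; the 0.3 joint for Cell A is an assumption (prior 1/3 times a 0.9 chance of seeing red there), consistent with the 0.818 posterior:

```python
# Joint probabilities of (cell, red). Cell A's likelihood of 0.9 is assumed.
joint_a = 1/3 * 0.9   # 0.3
joint_b = 1/3 * 0.1   # ~0.033
joint_c = 1/3 * 0.1   # ~0.033

normalizer = joint_a + joint_b + joint_c   # ~0.367

# Divide each joint by the normalizer; the three posteriors sum to 1.
print(round(joint_a / normalizer, 3))  # 0.818
print(round(joint_b / normalizer, 3))  # 0.091
print(round(joint_c / normalizer, 3))  # 0.091
```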

## 51 – Robot Sensing 8

And now we calculate the desired posterior probability for all 3 possible outcomes. So please plug them in over here.

## 5 – Prior And Posterior Solution

Obviously, P(C), which is 0.01, times 0.9 is 0.009, whereas 0.99 times 0.1, this guy over here, is 0.099. What we’ve computed here is the absolute area in here, which is 0.009, and the absolute area in here, which is 0.099.

## 49 – Robot Sensing 7

So here’s the $100,000 question. What is our normalizer?

## 48 – Robot Sensing 6 Solution

And the answer is exactly the same as this over here, because the prior is the same for B and C, and those probabilities are the same for B and C, so they should be exactly the same.

## 47 – Robot Sensing 6

Finally, probability of C and Red. What is that?

## 46 – Robot Sensing 5 Solution

Well, the answer is, you multiply our prior of 1/3 with the probability of seeing red in Cell B. Since Cell B shows green with 0.9 probability, red is 0.1. So 0.1 times this guy over here gives 0.033.

## 45 – Robot Sensing 5

What’s the joint probability for Cell B?