8 – Speeding Up Enumeration 2 Solution

The answer is that the alarm is dependent on both John and Mary, and so we can draw both nodes in. Intuitively that makes sense, because if John calls, then it's more likely that the alarm has occurred; likewise if Mary calls; and if both call, it's even more likely. So you can figure out the …

6 – Speeding Up Enumeration Solution

And the answer is that the node for Mary calling in this network is dependent on John calling. In the previous network, they were independent given that we knew the alarm had occurred. But here, we don't know whether the alarm has occurred, and so the nodes are dependent, because having information about one …
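In symbols, with J, M, and A standing for JohnCalls, MaryCalls, and Alarm, the two situations differ in exactly one conditioning variable:

\[
P(M \mid J, a) = P(M \mid a), \qquad\text{but}\qquad P(M \mid J) \neq P(M).
\]

Given the alarm, the two callers are conditionally independent; with the alarm unobserved, evidence about one caller flows through A to the other.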

5 – Speeding Up Enumeration

We’ve seen how to do enumeration to solve the inference problem on belief networks. For a simple network like the alarm network, that’s all we need to know. There are only five variables, so even if all five of them were hidden, there’d only be 32 rows in the table to sum up. From a theoretical …
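To make the scale concrete, here is a minimal enumeration sketch for the query P(+b | +j, +m) on this network, summing the full joint directly. The priors 0.001 and 0.002 appear in the neighboring excerpts; the remaining CPT values are assumptions taken from the standard textbook version of this example.

```python
from itertools import product

# CPTs for the alarm network. The priors 0.001 and 0.002 appear in the
# transcript; the other values are assumed from the textbook version.
P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(+a | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(+j | A)
P_M = {True: 0.70, False: 0.01}                      # P(+m | A)

def joint(b, e, a, j, m):
    """Probability of one full assignment, read straight off the CPTs."""
    pa = P_A[(b, e)]
    p = P_B[b] * P_E[e] * (pa if a else 1 - pa)
    p *= P_J[a] if j else 1 - P_J[a]
    p *= P_M[a] if m else 1 - P_M[a]
    return p

# P(B | +j, +m): of the 32 rows in the full joint, only the 8 consistent
# with the evidence j=true, m=true contribute; sum them and normalize.
num = {b: sum(joint(b, e, a, True, True)
              for e, a in product([True, False], repeat=2))
       for b in [True, False]}
print("P(+b | +j, +m) =", num[True] / (num[True] + num[False]))  # ~0.284
```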

4 – Enumeration Solution

We get the answer by reading numbers off the conditional probability tables. So the probability of B being positive is 0.001; the probability of E being positive, because we’re dealing with the positive case now for the variable E, is 0.002; the probability of A being positive, because we’re dealing with that case, given that B is positive. And …
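Written out, the term being assembled is one product of CPT entries. The first two factors are given in the excerpt; the remaining three (0.95, 0.9, 0.7) are the values from the standard version of this network and are assumed here:

\[
P(+b)\,P(+e)\,P(+a \mid +b, +e)\,P(+j \mid +a)\,P(+m \mid +a)
 = 0.001 \times 0.002 \times 0.95 \times 0.9 \times 0.7.
\]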

33 – Monty Hall Letter

Now as a final epilogue, I have here a copy of the letter written by Monty Hall himself in 1990 to Professor Lawrence Denenberg of Harvard, who with Harry Lewis wrote a statistics book in which they use the Monty Hall problem as an example. And they wrote to Monty asking him for permission to …

32 – Monty Hall Problem Solution

The answer is that you have a one-third chance of winning if you stick with door number one, and a two-thirds chance if you switch to door number two. How do we explain that? Why isn’t it 50/50? It’s true that there are two possibilities, but we have learned from …
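A quick Monte Carlo check of this claim (a sketch, not from the original lesson): simulate many games and compare the win rates of the two strategies.

```python
import random

def play(switch: bool) -> bool:
    """One round of Monty Hall; returns True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that is neither the player's pick nor the car.
    monty = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != monty)
    return pick == car

N = 100_000
print("stick: ", sum(play(False) for _ in range(N)) / N)  # ~1/3
print("switch:", sum(play(True) for _ in range(N)) / N)   # ~2/3
```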

31 – Monty Hall Problem

Now, just one more thing. I can’t help but describe what is probably the most famous probability problem of all. It’s called the Monty Hall Problem, after the game show host. The idea is that you’re on a game show, and there are three doors: door number 1, door number 2, and door number 3. And …

30 – Gibbs Sampling

A technique called Gibbs sampling, named after the physicist Josiah Gibbs, takes all the evidence into account, not just the upstream evidence. It uses a method called Markov chain Monte Carlo, or MCMC. The idea is that we resample just one variable at a time, conditioned on all the others. That is, we have a …
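A minimal Gibbs-sampling sketch on the sprinkler network from the neighboring videos, for the query P(Rain | +s, +w). The CPT values are the ones commonly used with this example and should be read as assumptions; the key step is that each resampled variable is conditioned on the current values of all the others.

```python
import random

# Sprinkler-network CPTs; the usual values for this example, assumed here.
P_C = 0.5
P_S = {True: 0.10, False: 0.50}                    # P(+s | C)
P_R = {True: 0.80, False: 0.20}                    # P(+r | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}  # P(+w | S, R)

def joint(c, s, r, w):
    """Full-joint probability of one assignment."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    pw = P_W[(s, r)]
    return p * (pw if w else 1 - pw)

def gibbs(n_steps=50_000, burn_in=1_000):
    """Estimate P(+r | +s, +w): evidence stays fixed, C and R resampled."""
    state = {"C": True, "R": True}                 # arbitrary starting point
    hits = kept = 0
    for step in range(n_steps):
        for var in ("C", "R"):
            # P(var | all other variables) is proportional to the full joint.
            w = []
            for val in (True, False):
                state[var] = val
                w.append(joint(state["C"], True, state["R"], True))
            state[var] = random.random() < w[0] / (w[0] + w[1])
        if step >= burn_in:
            hits += state["R"]
            kept += 1
    return hits / kept

print("P(+r | +s, +w) ~", gibbs())                 # ~0.88 with these CPTs
```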

3 – Enumeration

Now we’re going to talk about how to do inference on a Bayes net. We’ll start with our familiar network, and we’ll talk about a method called enumeration, which goes through all the possibilities, adds them up, and comes up with an answer. So what we do is start by stating the problem. We’re going to …
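For the alarm network, the query worked through here is presumably P(+b | +j, +m), consistent with the solution excerpt above; by enumeration it expands into a sum over the hidden variables E and A of products of CPT entries (the standard rewriting, shown for reference):

\[
P(+b \mid +j, +m) \;\propto\; \sum_{e} \sum_{a} P(+b)\,P(e)\,P(a \mid +b, e)\,P(+j \mid a)\,P(+m \mid a).
\]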

29 – Likelihood Weighting 2

Likelihood weighting is a great technique, but it doesn’t solve all our problems. Suppose we want to compute the probability of C given +s and +r. In other words, we’re constraining sprinkler and rain to always be positive. Since we use the evidence only when we generate a node that has that evidence as a parent, the …

27 – Likelihood Weighting Solution

The answer is, we’re looking for the probability of having a positive w, given a positive s and a positive r. That’s in this row, so it’s 0.99. So we take our old weight and multiply it by 0.99, giving us a final weight of 0.099 for a sample of +c, +s, +r and …
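The implied arithmetic, with the earlier weight of 0.1 recovered from the final value (0.099 / 0.99) and interpreted as P(+s | +c), following the procedure described in the next excerpt:

\[
w \;=\; P(+s \mid +c) \times P(+w \mid +s, +r) \;=\; 0.1 \times 0.99 \;=\; 0.099.
\]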

26 – Likelihood Weighting

In likelihood weighting, we’re going to be collecting samples just like before, but we’re going to add a probabilistic weight to each sample. Now, let’s say we want to compute the probability of rain given that the sprinklers are on and the grass is wet. We start as before: we make a choice for Cloudy. …
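A sketch of that procedure for P(Rain | +s, +w), using the same assumed sprinkler-network CPTs as in the Gibbs sketch above: non-evidence variables are sampled top-down from their conditional distributions, evidence variables are fixed, and each sample’s weight is the product of the probabilities of the evidence given its parents.

```python
import random

# Sprinkler-network CPTs (assumed values, as in the Gibbs sketch above).
P_C = 0.5
P_S = {True: 0.10, False: 0.50}                    # P(+s | C)
P_R = {True: 0.80, False: 0.20}                    # P(+r | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}  # P(+w | S, R)

def weighted_sample():
    """One sample with evidence S=+s, W=+w held fixed."""
    weight = 1.0
    c = random.random() < P_C      # sample Cloudy from its prior
    s = True                       # evidence: not sampled...
    weight *= P_S[c]               # ...weighted by P(+s | c) instead
    r = random.random() < P_R[c]   # sample Rain given Cloudy
    w = True                       # evidence
    weight *= P_W[(s, r)]          # weighted by P(+w | s, r)
    return r, weight

def likelihood_weighting(n=100_000):
    """Estimate P(+r | +s, +w) as a weighted fraction of the samples."""
    num = den = 0.0
    for _ in range(n):
        r, wgt = weighted_sample()
        den += wgt
        num += wgt * r
    return num / den

print("P(+r | +s, +w) ~", likelihood_weighting())  # ~0.88 with these CPTs
```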

25 – Rejection Sampling

But there’s a problem with rejection sampling: if the evidence is unlikely, you end up rejecting a lot of the samples. Let’s go back to the alarm network, where we had variables for burglary and for an alarm, and say we’re interested in computing the probability of a burglary given that the alarm goes off. …
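A sketch of the problem on a two-node Burglary → Alarm fragment (the CPT values here are illustrative assumptions): samples are drawn from the prior and discarded unless they match the evidence +a, so when the alarm is rare, nearly all of the work is wasted.

```python
import random

# Two-node fragment: Burglary -> Alarm. Illustrative, assumed values.
P_B = 0.001                       # P(+b)
P_A = {True: 0.95, False: 0.01}   # P(+a | B)

def rejection_sample_burglary(n=1_000_000):
    """Estimate P(+b | +a) and report how many samples were kept."""
    accepted = burglaries = 0
    for _ in range(n):
        b = random.random() < P_B
        a = random.random() < P_A[b]
        if not a:                 # sample inconsistent with evidence +a
            continue              # ...rejected
        accepted += 1
        burglaries += b
    print(f"kept {accepted} of {n} samples ({accepted / n:.2%})")
    return burglaries / accepted

print("P(+b | +a) ~", rejection_sample_burglary())  # ~1% kept, answer ~0.087
```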

24 – Approximate Inference 2

Now, the probability of sampling a particular value of a variable, choosing a +w or a -w, depends on the values of the parents. But those are chosen according to the conditional probability tables. So, in the limit, the count of each sampled value will approach the true probability. That is, with an infinite number of samples, this …
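In symbols, if N(x) is the number of samples in which assignment x occurs out of N total, the claim is the usual consistency property of this sampling procedure:

\[
\lim_{N \to \infty} \frac{N(x)}{N} \;=\; P(x).
\]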

23 – Sampling Example Solution

The answer to the question is that we look at the parents. We find that the sprinkler variable is negative, so we’re looking at this part of the table, and the rain variable is positive, so we’re looking at this part. So it would be these two rows that we would consider, and thus we’d …

22 – Sampling Example

Here’s a new network that we’ll use to investigate how sampling can be used to do inference. In this network we have four variables, and they’re all Boolean. Cloudy tells us whether or not it’s cloudy outside, and that can have an effect on whether the sprinklers are turned on and whether it’s raining. And those …
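A sketch of the basic sampling step on this Cloudy / Sprinkler / Rain / WetGrass network: draw each variable in topological order, conditioning on the values already drawn for its parents. The CPT values are the ones commonly used with this example and are assumptions here.

```python
import random

# CPTs for the four-variable network (assumed values).
P_C = 0.5
P_S = {True: 0.10, False: 0.50}                    # P(+s | C)
P_R = {True: 0.80, False: 0.20}                    # P(+r | C)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.01}  # P(+w | S, R)

def prior_sample():
    """Draw one full sample, parents before children."""
    c = random.random() < P_C
    s = random.random() < P_S[c]
    r = random.random() < P_R[c]
    w = random.random() < P_W[(s, r)]
    return c, s, r, w

# Estimate P(+w) by plain counting over many samples.
n = 100_000
print("P(+w) ~", sum(prior_sample()[3] for _ in range(n)) / n)
```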

21 – Approximate Inference

Now I want to talk about approximate inference by means of sampling. What do I mean by that? Say we want to deal with a joint probability distribution, like the distribution of heads and tails over these two coins. We can build a table and then start counting by sampling. Here, we have our first …
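A minimal version of that counting idea (a sketch, assuming two fair coins): sample the pair repeatedly and tally the four joint outcomes; each relative frequency approaches its true probability of 0.25.

```python
import random
from collections import Counter

# Tally joint outcomes of two fair coins by sampling.
N = 10_000
counts = Counter(
    (random.choice("HT"), random.choice("HT")) for _ in range(N)
)
for outcome, n in sorted(counts.items()):
    print(outcome, n / N)   # each approaches 0.25
```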