Welcome to the next lecture on conditional probabilities in the course Intuitive Introduction to Probability. In this lecture, we will learn about Bayes Rule, a rather complicated-looking formula that has many applications. But don't worry, you don't actually have to use the complicated formula or memorise it. I will show you how we can easily do the relevant calculations in probability tables. Before we look at an ugly formula, let's look at a small, simple example to introduce some ideas. <i>A company has 3 suppliers,</i> <i>S, T, and U.</i> <i>The parts that these suppliers deliver to the manufacturer,</i> <i>to our company, are either good or defective.</i> <i>For the sake of discussion, let's call the defective ones bad parts.</i> <i>Good parts have a probability of 90%,</i> <i>and the opposite, bad parts, have a probability of 10%.</i> The company looked at some recent data to see: is there any difference between our suppliers S, T, and U? And here's the data. <i>Among the good parts, 60% came from S,</i> <i>25% from T,</i> <i>and the last 15% from the supplier U.</i> <i>Among the bad parts, 40% came from the supplier S,</i> <i>and 30% each from T and U.</i> <i>And now, our manufacturer is asking the following question.</i> <i>Which of my suppliers delivers the best parts,</i> <i>that is, the smallest proportion of defective parts,</i> <i>and which is my worst supplier?</i> <i>So, in the language of probability,</i> it's asking the following question. <i>What's the probability of a good part, given</i> <i>that it comes from supplier S?</i> <i>What's the probability of a good part, given that it came from T?</i> <i>What's the probability of a good part, given that it came from U?</i> We can't answer that question yet because, let's look at our data, our data set tells us something different. <i>We learned the probability of good, 0.9.
Probability of bad, 0.1.</i> <i>And then, the other probabilities are conditional probabilities,</i> <i>but they are the wrong way around. We have the probability of S, given good,</i> <i>because we were told that among the good parts,</i> <i>60% came from S, and at the bottom of the slide</i> <i>we see the other numbers.</i> <i>But what we are interested in is the probability of good, given S.</i> <i>So, what should we do?</i> <i>Let's create our probability table with the data that's given.</i> <i>Notice, we have good and bad parts on the one hand,</i> <i>and we have the 3 suppliers S, T, and U on the other.</i> <i>For good and bad, we were given the numbers 0.9 and 0.1,</i> <i>so we can fill in the right margin of our little table,</i> <i>and the sum definitely should be 1.</i> <i>In the interior, we can now use the general multiplication rule</i> <i>for dependent events to create the intersection probabilities.</i> <i>For example, the probability that a part is good and from S</i> <i>is 0.6 times 0.9.</i> <i>And so on, all the way to the probability of bad and U,</i> <i>0.3 times 0.1.</i> <i>We can fill in all the probabilities in the interior,</i> <i>add up every column, and what do we get?</i> <i>Voila! There's our complete probability table.</i> <i>So, we see 58% of all parts are from supplier S,</i> <i>25.5% of all parts are from supplier T,</i> <i>and the remaining 16 and a half percent are from supplier U.</i> <i>Now, we can calculate the probabilities we really care about.</i> <i>Here we go.</i> <i>Probability of good, given S.</i> <i>Remember how we do this: we take the probability from the interior,</i> <i>in our case 0.54,</i> <i>the joint probability, and divide by the marginal probability</i> <i>of supplier S, 0.58.</i> <i>And what do we learn?
93.1% of the parts</i> <i>that come from supplier S are good.</i> <i>Or, more formally, in the language of probability,</i> <i>the probability of good, given S, is 0.931.</i> <i>You see all the calculations, and what do we learn?</i> <i>We see that the proportion of good parts</i> <i>is the largest for supplier S.</i> <i>Put differently, the proportion of bad parts</i> <i>is the smallest for supplier S.</i> <i>And supplier U is actually the worst in this little case.</i> <i>So, what did we actually just do?</i> <i>We flipped the conditional probabilities around.</i> <i>This casual language of flipping probabilities around is actually</i> <i>very popular among people in probability theory,</i> <i>so that's why I use this everyday language.</i> <i>So, we were given the probabilities of S given good,</i> <i>U given bad, and so on.</i> <i>And in the end, we calculated the reverse conditional probabilities</i> <i>of good given S, good given T,</i> <i>all the way to bad given U.</i> Now, there is actually a very general rule behind what we did in this little toy example, a rule which does exactly what we just did, and that's the famous Bayes Rule. Let me derive it for you.
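Before the derivation, the table calculation from the supplier example can be sketched in a few lines of plain Python. This is only an illustrative sketch: the numbers are the ones from the slides, but the variable names are my own.

```python
# Given: P(good) = 0.9, P(bad) = 0.1, and the "wrong way" conditionals
# P(supplier | quality), e.g. P(S | good) = 0.60.
p_quality = {"good": 0.9, "bad": 0.1}
p_supplier_given_quality = {
    "good": {"S": 0.60, "T": 0.25, "U": 0.15},
    "bad":  {"S": 0.40, "T": 0.30, "U": 0.30},
}

# Interior of the table: joint probabilities via the multiplication rule,
# P(quality and supplier) = P(supplier | quality) * P(quality).
joint = {
    (q, s): p_supplier_given_quality[q][s] * p_quality[q]
    for q in p_quality
    for s in ("S", "T", "U")
}

# Bottom margin: P(supplier) is the sum of its column.
p_supplier = {s: joint[("good", s)] + joint[("bad", s)] for s in ("S", "T", "U")}

# Flip the condition: P(good | supplier) = P(good and supplier) / P(supplier).
for s in ("S", "T", "U"):
    print(s, round(joint[("good", s)] / p_supplier[s], 3))
# S 0.931
# T 0.882
# U 0.818
```

The printed values match the lecture: supplier S has the highest proportion of good parts, supplier U the lowest.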
<i>Recall the general multiplication rule that we saw before</i> <i>for conditional probabilities,</i> <i>which is: the intersection probability equals the conditional probability</i> <i>times the probability of the condition.</i> <i>If we use this once with the event B as the condition,</i> <i>and once with our event A as the condition,</i> <i>and then set these 2 right-hand sides equal,</i> <i>we get the formula at the bottom of the slide in purple.</i> <i>So the probability of A given B</i> <i>equals the probability of B given A,</i> <i>times the probability of A, divided by the probability of B.</i> <i>Notice, on the left we have the probability of A given B,</i> <i>and on the right we have the probability of B given A.</i> <i>So, if you give me B given A</i> <i>and the marginal probabilities,</i> <i>I can flip the conditions around</i> <i>using this formula.</i> <i>Now we can go a step further and say</i> <i>that sometimes you may not have the probability of B.</i> <i>In that case,</i> <i>and that's actually what also happened in our little calculation,</i> <i>we can calculate it by adding up the elements in a probability table. Here,</i> <i>for completeness, I give you the formula at the top.</i> <i>If you now take this formula and put it into the flipping formula,</i> <i>we get the formula at the bottom with the yellow background,</i> <i>and that's the famous Bayes Rule for 2 events.</i> <i>In our case, we just have good and bad,</i> <i>or in general language, A and A complement.</i> <i>And what this Bayes Rule says is:</i> <i>knowing the conditional probabilities of B given A</i> <i>and B given the complement of A,</i> <i>we can flip the conditions around</i> <i>and calculate the probability of A given B.</i> <i>And we could also calculate the probability of the complement of A, given B.</i> <i>There's nothing special about our example,</i> <i>good and bad, or A and A complement.</i> This scales up. Remember, our probability tables can be as large as we want.
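The two-event version of the rule can be written as a small helper function. This is a sketch of my own, not anything from the course materials; the function name and argument names are mine.

```python
def bayes_two_events(p_b_given_a, p_b_given_not_a, p_a):
    """Bayes Rule for an event A and its complement:
    P(A|B) = P(B|A) P(A) / (P(B|A) P(A) + P(B|A^c) P(A^c))."""
    p_not_a = 1.0 - p_a
    # Denominator: P(B) by adding up the two rows of the probability table.
    p_b = p_b_given_a * p_a + p_b_given_not_a * p_not_a
    return p_b_given_a * p_a / p_b

# Supplier example with A = "good part", B = "part came from S":
# P(S|good) = 0.6, P(S|bad) = 0.4, P(good) = 0.9.
print(round(bayes_two_events(0.6, 0.4, 0.9), 3))  # 0.931
```

Note that the denominator is exactly the column sum from the probability table, which is why filling in the table and flipping by hand gives the same answer as the formula.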
And the same is true for Bayes Rule. <i>So, if we now have not just 2, but a larger number M of</i> <i>mutually exclusive and collectively exhaustive events,</i> <i>then, as you see here, the formula scales up.</i> <i>And as you see, it looks kind of ugly.</i> <i>This formula usually scares my students,</i> <i>and so I learned over the years to de-emphasise the actual rule</i> <i>and this nasty-looking formula,</i> <i>and instead teach this via simple examples,</i> <i>like the example I showed you at the beginning. The best way is:</i> <i>fill in the probability table and then ask yourself,</i> <i>which conditional probability do I really need?</i> <i>Calculate it using the definition of conditional probability,</i> <i>and that's essentially you applying this complicated-looking formula.</i> <i>Let me wrap up.</i> It is possible to flip these conditional probabilities around, to go from A given B to B given A. The famous formula behind this is Bayes Rule, but I strongly encourage you to build a probability table and calculate the probabilities this way. This concludes our module on conditional probabilities. As I said at the beginning of the module, this is not an easy concept, and I showed you a few examples. In the next module, I will focus very much on the concepts of conditional probability, dependence, and independence in the context of some real-world problems. I could say real-world disasters that happened. And maybe you still think these concepts are very abstract, and you ask yourself, do I really need this in everyday life? The answer is yes! Yes! And I will show you some cool applications. So please come back for our next module. Thank you very much. And enjoy the session with the TA on calculating some probabilities.
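As a closing sketch, the scaled-up rule for M events can also be written in a few lines. Again, this is my own illustrative code, checked against the two-event supplier example from the lecture.

```python
def bayes(likelihoods, priors):
    """General Bayes Rule for M mutually exclusive, collectively
    exhaustive events A_1..A_M and an observed event B:
    P(A_i | B) = P(B | A_i) P(A_i) / sum_j P(B | A_j) P(A_j)."""
    # Denominator: the total probability of B (the sum in the ugly formula).
    p_b = sum(l * p for l, p in zip(likelihoods, priors))
    return [l * p / p_b for l, p in zip(likelihoods, priors)]

# Check with M = 2: A_1 = good, A_2 = bad, B = "part came from S".
# P(S|good) = 0.6, P(S|bad) = 0.4, P(good) = 0.9, P(bad) = 0.1.
print([round(x, 3) for x in bayes([0.6, 0.4], [0.9, 0.1])])  # [0.931, 0.069]
```

The same function works unchanged for any number of events M, which is exactly the sense in which the rule "scales up".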