0:30

So, here is the conditional probability definition that we saw in the previous lecture.

The probability of A given B is equal to the probability of the intersection

divided by the probability of B.

If you want the probability of B given A,

it's the probability of the intersection divided by P of A.

Now if we do a little algebra and

multiply both sides of this equation by the denominator,

so in the first equation, for example, by the P of B,

we get this multiplication rule: the probability of the intersection of A and

B equals the conditional probability of A given B times the probability of B.
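As a quick sanity check, here is that rule with made-up numbers (a minimal Python sketch; the probabilities are hypothetical, not from the lecture):

```python
# General multiplication rule: P(A and B) = P(A | B) * P(B)
p_B = 0.5           # hypothetical probability of event B
p_A_given_B = 0.3   # hypothetical conditional probability of A given B

p_A_and_B = p_A_given_B * p_B
print(p_A_and_B)  # 0.15
```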

You look at this math and you say, this looks kind of abstract.

Actually, I am sure you have used it, perhaps without knowing it.

We use this rule actually all the time.

So, let me try to convince you that this is actually quite intuitive.

2:12

So, look at the numbers.

If I tell you that 17.6% of people live in Zurich and that, of those people,

19.6% are young, what would you do?

You would say, yeah, 0.176 times 0.196.

That's now the fraction of people that have both.

They're both young and live in the Canton of Zurich.

You do this and you get 3.45%.

I'm sure we have all done this a lot in our lives.

If I give you half a cake, and I cut that half into five pieces and

give you one piece, how much of the total cake did you get?

We started out with half.

I gave you one-fifth, 20%, of that, so you would say, 0.2 times 0.5 is 0.1.

I just gave you 10% of the cake.

So we multiply these probabilities all the time,

just as in this example or in the cake example.
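Both everyday calculations can be sketched in a few lines of Python, using the numbers from the lecture:

```python
# Zurich example: fraction living in Zurich times fraction of those who are young.
p_zurich = 0.176               # P(Zurich)
p_young_given_zurich = 0.196   # P(0-19 years | Zurich)
p_both = p_zurich * p_young_given_zurich
print(round(p_both, 4))  # 0.0345, i.e. about 3.45%

# Cake example: one-fifth of one-half of the cake.
print(0.2 * 0.5)  # 0.1, i.e. 10% of the total cake
```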

But now, let's take a look at this everyday

calculation in the context of conditional probabilities.

3:18

If we take these proportions as our definition of probability, the empirical

probability concept, concept number two from a previous module,

then we would say: the probability that someone is from Zurich is 0.176.

The probability of someone being young given they live in the Canton of Zurich,

that is, the probability of 0 to 19 years given the person is from Zurich, is 0.196.

And now if we use the multiplication rule, the probability of Zurich and

0 to 19 years old is the probability of Zurich times the conditional

probability of 0 to 19 years given Zurich; blah, blah, blah, you do some math.

3.45%, exactly what your gut feeling gave you before.

So, this abstract-looking multiplication rule for conditional

probabilities is actually an everyday concept.

A proportion of a proportion and then we multiply.

4:19

Let me remind you once more of the concept of independence.

Independence meant that the occurrence of one event does not affect

the chances of another event occurring.

That was: the probability of A is equal to the probability of A given B.

If that's not the case, if they're unequal, we say the events are dependent.

Now what happens if we take our multiplication rule and

assume that A and B are independent?

In that case,

the conditional probability of A given B is just the original probability of A.

Now replace the conditional probability in that general multiplication rule and

you get a specialized multiplication rule, and that's the multiplication rule I

showed you in the previous module when we talked about independent events.

So probability of A intersection B is equal to P(A) times P(B).

Look, that looks much easier than the general multiplication rule.

That's why people like the independence assumption and this rule.
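A tiny sketch of why the rule simplifies, with hypothetical numbers; under independence, the conditional probability is just the unconditional one:

```python
p_A, p_B = 0.4, 0.25   # hypothetical probabilities of two independent events

# Independence: conditioning on B tells us nothing new about A.
p_A_given_B = p_A

# The general rule and the special rule then give the same answer.
general = p_A_given_B * p_B   # P(A | B) * P(B)
special = p_A * p_B           # P(A) * P(B)
print(general == special)  # True
```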

Let's look at an example where we can easily use it.

Let's say you play a dice game with three dice.

What's the probability of three ones?

One on the first roll, one on the second roll, and one on the last die.

So you want the probability of a one and a one, and a one?

Rolling three dice, they are independent, so I'm allowed to multiply the probabilities:

one-sixth times one-sixth times one-sixth is one in two hundred sixteen.

And there is nothing special about one, one, one.

I can ask you: what's the probability of first a one, then a three, then a five?

Same math.

So you see, rather large, complex events like three numbers in a row

(and you can do this for 20 numbers in a row, or for 200 numbers in a row)

suddenly get very, very easy under the assumption of independence.

We can just multiply their probabilities.
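The dice calculation, sketched with exact fractions:

```python
from fractions import Fraction

p_face = Fraction(1, 6)   # probability of any one specified face on a fair die

# Three independent rolls: just multiply the probabilities.
p_three = p_face * p_face * p_face
print(p_three)  # 1/216 -- same for (1,1,1) as for (1,3,5)

# Independence makes long sequences just as easy, e.g. 20 in a row:
p_twenty = p_face ** 20
```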

As simple as this is, we have to be careful in real-world applications.

There, we often have to ask ourselves: is that assumption reasonable?

Can we really assume independence?

If yes, great for you.

We can use the independence multiplication rule.

If however, the answer is no, you are not allowed to use this rule.

You may get into real trouble.

And later on in this course, I will show you some devastating applications

where people assumed independence and terrible real world things happened.

Here now, I want to give you a very simple example.

Let's say, you have a machine in an assembly line and

that machine carries a heavy load.

And as a result, it breaks down on average on one out of ten days.

On nine out of ten days, it can handle the workload and it works fine.

So if we use historical data now and

concept number two, the empirical definition,

we can say: the probability of a good day of working is 0.9,

of a breakdown 0.1. And here's now the question.

What is the probability that this machine works two days in a row?

So if I don't know anything more,

I would have to use the general multiplication rule.

The probability of working well on the first day and

working well on the second day.

Yes: the probability of working well on the first day times the conditional

probability of working well on the second day,

given it worked well on the first day.

One probability I know, P of working well on the first day is 0.9.

That's from my data, but what's the conditional probability?

Now, I'm in trouble.

So now, I would love to assume independence.

If I have independence, then I can use the simpler rule for

independent events at the bottom of the slide,

0.9 times 0.9, 0.9 squared is 0.81.
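Under the independence assumption, the two-day calculation is just (a minimal sketch with the lecture's numbers):

```python
p_work = 0.9   # empirical probability the machine works on a given day

# Independence assumption: the second day doesn't depend on the first.
p_two_days_in_a_row = p_work * p_work
print(round(p_two_days_in_a_row, 2))  # 0.81
```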

But if I cannot assume independence, then I need more data.

Now, what is it?

8:53

If you ask engineers, engineers like to talk about the so-called bathtub curve.

A new machine often has breakdowns because it isn't perfectly calibrated, so

the probability of failure is maybe a little elevated; the machine working well for a day or

two or three indicates that the machine is calibrated, and the probability changes.

It reaches a low, it remains constant for a while.

And eventually, due to wear and tear, it goes up.

In the middle range, independence is an okay assumption.

But at the front-end and at the back-end, it isn't.
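To make the bathtub idea concrete, here is a sketch with entirely made-up day-by-day numbers (elevated breakdown risk early and late, low in the middle). Under dependence, you chain the day-specific conditional probabilities rather than squaring one constant:

```python
# Hypothetical conditional probabilities of working on day i,
# given the machine worked on all previous days (bathtub-shaped risk).
p_work_by_day = [0.85, 0.90, 0.95, 0.95, 0.95, 0.90, 0.85]

# General multiplication rule, chained over the days.
p_all_days = 1.0
for p in p_work_by_day:
    p_all_days *= p

# A naive independence answer using the constant mid-life value 0.95
# would be too optimistic.
p_naive = 0.95 ** len(p_work_by_day)
print(p_all_days < p_naive)  # True
```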

So here, we already see how a very trivial application

of multiplying probabilities can run into rather tricky issues.

Do we have independence or do we not have independence?

And that's crucial for our calculations.

And as I said, we will see more cool examples in lectures to come.

Let me wrap up this lecture.

We have seen the general multiplication rule for

conditional probabilities, and the special case of independence.

This multiplication rule greatly simplifies calculations, but

be careful when assuming independence.

Thanks for your attention.

Please come back for more fun with probabilities.

Thank you.