[MUSIC] In this session, we discuss conditional probabilities and the Bayes theorem. They are important to understand how the logic of probabilities helps managers to make decisions.

We start with conditional probabilities. A conditional probability is the probability that a random variable takes a given set of values if another random variable has taken a specific set of values. Conditional probabilities are the basis for prediction, because they tell us how the probability of an event changes if we observe another event. The conditioning event is the signal that helps us to make a better prediction of the focal event. As a matter of fact, two events are independent if the observation of the conditioning event does not change the probability of the focal event; in this case, the signal is not informative. But if it changes it, the signal is informative, because it tells us whether the focal event is more or less likely to occur than before we observed the signal.

As an example, in a class there are 20 students coming from Europe, North America, or any of the other regions of the world. Consider a table that relates the number of letters in the last name of each student to whether they come from any one of the three areas. In each cell of the table, you find the number of students coming from the area in the column whose last name has the number of letters indicated in the row. The last row reports the marginal values, that is, the total number of students coming from the area in the column, and the last column reports the total number of students with the number of letters in the row. The cell in the bottom right corner is equal to the total number of students. If we divide all numbers in the cells by 20, we have probabilities, or relative frequencies rather than absolute frequencies. The bottom right cell then reports one, which is the sum of all probabilities.

In this table, the frequencies in the cells are joint probabilities, that is, the probability of observing the event in the row and the event in the column. For example, there are two students coming from North America whose last name has five letters. The joint probability of these two events is two divided by 20, or 10%. Denote with X the number of letters in the last name and with Y the region. The conditional probability of X equal to 5, conditional upon Y equal to North America, is equal to two divided by seven. This is the joint probability, two divided by 20, divided by the marginal probability of observing Y equal to North America, which is equal to seven divided by 20.

We make two remarks. The first one is that the conditional probability helps our prediction. Suppose you need to predict whether the last name of one specific student has five letters. If you have no information, you set this probability equal to the marginal probability, that is, 3/20. Suppose instead that you know that she is North American. In this case, you condition the probability on the signal "she is North American," and the probability becomes two divided by seven. The information tells you that it is more likely that the last name of the student is composed of five letters. Of course, the probability could also fall: consider the probability that the student's last name has nine letters when you know that she is North American. And there could be independence: consider the case in which you want to assess the probability that the student has seven letters in their last name, and you receive a signal that tells you that she comes neither from Europe nor from North America. These computations are worked through in the sketch below.
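To make the arithmetic concrete, here is a minimal sketch in Python. The session does not display the full table, so the cell counts below are a hypothetical reconstruction, chosen only to be consistent with the figures quoted in the session (20 students, seven North Americans, a marginal of 3/20 for five-letter names, and two five-letter North American names); everything else about the cells is an assumption.

```python
import numpy as np

# Hypothetical contingency table (the full table is not shown in the
# session; these counts are an illustrative reconstruction consistent
# with the figures the session quotes).
# Rows: letters in the last name (4 through 9).
# Columns: Europe, North America, Other.
letters = np.array([4, 5, 6, 7, 8, 9])
counts = np.array([
    [1, 1, 0],   # 4 letters
    [1, 2, 0],   # 5 letters (two five-letter North Americans, as quoted)
    [1, 2, 3],   # 6 letters
    [3, 0, 1],   # 7 letters
    [1, 2, 1],   # 8 letters
    [1, 0, 0],   # 9 letters
])
n = counts.sum()                      # 20 students in total

joint = counts / n                    # joint probabilities P(X=x, Y=y)
p_region = joint.sum(axis=0)          # marginals P(Y) = [8/20, 7/20, 5/20]
p_letters = joint.sum(axis=1)         # marginals P(X), e.g. P(X=5) = 3/20

print(joint[1, 1])                    # P(X=5, Y=NA) = 2/20 = 0.10
print(p_letters[1])                   # P(X=5) = 3/20 = 0.15 (no signal)
print(joint[1, 1] / p_region[1])      # P(X=5 | Y=NA) = 2/7: the signal raises it
print(joint[5, 1] / p_region[1])      # P(X=9 | Y=NA) = 0: the signal can also lower it
print(joint[3, 2] / p_region[2],      # P(X=7 | Y=Other) = 0.20 ...
      p_letters[3])                   # ... equals P(X=7) = 0.20: independence
```

With this reconstruction, the three cases from the remark all appear: the signal raises the probability of five letters from 0.15 to roughly 0.286, lowers the probability of nine letters, and leaves the probability of seven letters unchanged.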
The second remark is that the conditional probability and the joint probability are related: the former is equal to the latter divided by the marginal probability of the conditioning event.

The conditional expectation is the mean of the distribution conditional upon a given event. In our example, the unconditional mean number of letters is the sum of the number of letters in the rows multiplied by the marginal probabilities in the last column, which is equal to 6.4. This is the average number of letters of the last name of the students in this class. Conditional upon the students coming from Europe, the mean is the sum of the number of letters in the rows multiplied by the probabilities of each number conditional upon the student coming from Europe. You can easily check that this conditional mean is 6.625. This is the average number of letters of the last name of the European students in this class. The first sketch below reproduces both means.

We now turn to the Bayes theorem. The Bayes theorem tells us that the probability of X conditional on Y is equal to the probability of Y conditional on X, times the marginal probability of X, divided by the marginal probability of Y. It is a way to retrieve the probability of X conditional on Y from the probability of Y conditional on X and the marginal probabilities. Some terminology is also useful. The probability P(X) is sometimes called the prior probability. The probability P(X given Y) is the posterior probability, because it is the probability of X after the update provided by the signal.

To understand the theorem, consider the probability that the student's last name has eight or more letters conditional upon her being European, which is two divided by eight, or 25%. Using the Bayes theorem, this probability is equal to the probability that the student is European conditional upon having eight or more letters in the last name, multiplied by the probability of having eight or more letters in the last name, divided by the probability of being European. We have then retrieved the probability of having eight or more letters in the last name, conditional upon being European, from the probability of being European conditional upon having eight or more letters in the last name. The second sketch below works through these numbers.
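First, the conditional expectation. This sketch reuses the same hypothetical table as above; of these numbers, only the two means, 6.4 and 6.625, and the Europe marginal of 8/20 are quoted in the session.

```python
import numpy as np

# Same hypothetical table as in the previous sketch.
letters = np.array([4, 5, 6, 7, 8, 9])
counts = np.array([[1, 1, 0], [1, 2, 0], [1, 2, 3],
                   [3, 0, 1], [1, 2, 1], [1, 0, 0]])
joint = counts / counts.sum()

# Unconditional mean: sum over x of x * P(X=x), using the marginal of X.
p_letters = joint.sum(axis=1)
print((letters * p_letters).sum())        # 6.4

# Conditional mean given Y = Europe: form the conditional distribution
# P(X=x | Europe) = P(X=x, Europe) / P(Europe), then average over it.
p_europe = joint[:, 0].sum()              # 8/20
p_given_europe = joint[:, 0] / p_europe
print((letters * p_given_europe).sum())   # 6.625
```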
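And a sketch of the Bayes computation. The session quotes the result, 25%, and the Europe marginal, 8/20; the values of the reverse conditional P(Europe | eight or more letters) and of the marginal P(eight or more letters) below come from the hypothetical table, not from the session.

```python
# Bayes theorem: P(X | Y) = P(Y | X) * P(X) / P(Y).
# Goal: recover P(eight or more letters | European) from the
# reverse conditional and the two marginals.
p_europe = 8 / 20             # marginal P(Y = Europe), quoted in the session
p_long = 5 / 20               # marginal P(X >= 8): hypothetical, from the table above
p_europe_given_long = 2 / 5   # P(Europe | X >= 8): hypothetical, from the table above

# Posterior probability of a long last name given the signal "European".
p_long_given_europe = p_europe_given_long * p_long / p_europe
print(p_long_given_europe)    # 0.25, i.e. two divided by eight
```

Note how the numerator multiplies back to the joint probability, 2/20, so the posterior is just the joint divided by the marginal of the conditioning event, exactly as in the second remark.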