[MUSIC] Hello everyone. In the previous lecture, we proved that the linear program introduced for Max-Cut has an integrality gap of two. To prove that, we worked with a family of random graphs. Remember, in this family we have n nodes, and for each possible pair of nodes we flip a coin independently: the coin comes up heads with probability p and tails with probability 1 - p. We are going to set p = c/n for some constant c.

One of the properties that we needed to ensure in order to prove the gap was Property 1: that the LP value almost matches the upper bound, say it is at least 0.99 times the number of edges. To prove that, we first argue that a graph drawn from this family has more than √n short cycles only with small probability. But how can we prove this probabilistic statement? We are going to use Markov's inequality. Suppose we have a non-negative random variable X and a positive real number a. Then Markov's inequality says that the probability that X is at least a is at most the ratio between the expected value of X and the value a.

Let's sketch a proof of this fact. Consider the auxiliary function H_a(x), which equals a if x is at least a, and equals 0 if x is less than a. By definition, the random variable X is always at least H_a(X), so the expected value of X is at least the expected value of H_a(X). The latter equals a times the probability that X is at least a, since H_a(X) takes the value a exactly when X ≥ a and is zero otherwise. Dividing both sides by a, we get that E[X]/a is at least the probability that X is at least a, which proves Markov's inequality.

Now let's see how we can apply Markov's inequality to prove our random-graph result. Remember, the goal is to prove that the probability of having more than √n short cycles is at most 0.5.
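As a quick sanity check, Markov's inequality can be verified numerically. Here is a minimal Python sketch; the exponential distribution is an arbitrary choice of a non-negative random variable, not something from the lecture. Note that the inequality holds exactly for the empirical distribution of any sample, so the assertion below never fails.

```python
import random

random.seed(0)

# Draw many samples of a non-negative random variable X.
# (Exponential with mean 1 is an arbitrary illustrative choice.)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mean_x = sum(samples) / len(samples)

for a in [1.0, 2.0, 4.0]:
    # Empirical probability that X is at least a.
    tail = sum(1 for x in samples if x >= a) / len(samples)
    # Markov's bound: P(X >= a) <= E[X] / a.
    bound = mean_x / a
    # Markov holds for the empirical distribution itself, so this is exact.
    assert tail <= bound
    print(f"a={a}: P(X >= a) ~ {tail:.4f} <= E[X]/a ~ {bound:.4f}")
```

The bound is often loose (for a = 4, the true tail of an exponential is far below E[X]/a), but it requires nothing about X beyond non-negativity, which is exactly why it is useful here.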
So if we let X be the random variable that counts the number of short cycles, and we take a = √n, then we are in the setting of Markov's inequality. Let's try to use it. We are also going to assume that n = c^{3g}; the values are chosen appropriately in order to match the desired upper bound on the probability. Applying Markov's inequality with X being the number of short cycles and a = √n, we get that the probability that the number of short cycles is at least √n is upper bounded by the expected value of X divided by √n.

Now let's find an upper bound on the expected value of X, in order to obtain the desired 0.5. The number of short cycles can be upper bounded by the following sum. Consider all possible lengths of the short cycles, that is, lengths l from 3 to g - 1. The number of possibilities for choosing a cycle of length l is upper bounded by n^l / l. And in order for such a cycle to appear, the random-graph experiment must give a head on each of the l edges of the cycle, which happens with probability (c/n)^l, because the coin flips are independent. Therefore the expected value of X is at most the sum from l = 3 to g - 1 of c^l / l, which is upper bounded by c^g.

Now we can plug this upper bound into the inequality we obtained using Markov's inequality. We get that the probability that X is at least √n is not greater than c^g / √n = 1 / c^{g/2}, which, by our choice of n = c^{3g}, equals 1 / n^{1/6}. Finally, we can pick n appropriately so that this probability is at most 0.5. This is possible by choosing, for example, n to be at least 2^6 = 64.
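To see that the numbers work out, here is a small Python sketch checking the chain of bounds for one concrete choice of parameters (c = 2 and g = 4 are arbitrary illustrative values, not fixed by the lecture):

```python
import math

# Illustrative parameters (arbitrary choices for this sketch).
c, g = 2, 4
n = c ** (3 * g)  # the assumed relation n = c^{3g}

# Upper bound on E[X]: sum of c^l / l over cycle lengths l = 3 .. g-1.
expected_upper = sum(c**l / l for l in range(3, g))
assert expected_upper <= c**g  # E[X] <= c^g

# Markov: P(X >= sqrt(n)) <= E[X] / sqrt(n) <= c^g / sqrt(n).
markov_bound = c**g / math.sqrt(n)
assert math.isclose(markov_bound, 1 / c ** (g / 2))  # = 1 / c^{g/2}
assert math.isclose(markov_bound, n ** (-1 / 6))     # = 1 / n^{1/6}

# With n >= 2^6 = 64, the bound 1/n^{1/6} is at most 0.5.
assert n >= 64 and markov_bound <= 0.5
print(f"n = {n}, probability bound = {markov_bound}")
```

With these parameters, n = 4096 and the bound evaluates to 0.25, comfortably below 0.5, matching the conclusion in the lecture.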
So we have seen how a tool like Markov's inequality can be very useful for arguing about the existence of graphs satisfying desired properties, such as Property 1 for the linear programming integrality gap.