[MUSIC] We have an approximation algorithm for Maxcut, and it runs in polynomial time. Now let's analyze it and see how close we get to the unknown optimal solution. As always, there are two parts to the analysis: bound the value of OPT, and bound the value of the output of the algorithm. First, bounding the value of OPT (upper bounding, in this case, because this is a maximization problem). By definition of a relaxation, OPT is at most the weighted sum of (1 - Vi.Vj)/2. And what is the dot product Vi.Vj? By definition, it is the length of Vi, which is one, times the length of Vj, also one, times the cosine of the angle between the two. Let theta_ij denote the angle between Vi and Vj; theta_ij is somewhere between zero and pi. So OPT is at most the weighted sum of (1 - cos theta_ij)/2. That's our upper bound. Next, we turn to lower bounding the value of the output. What is the value of the output? Since the rounding is randomized, the algorithm is randomized, so we bound the expected value of the output. The expected value of the output is the weighted sum of the expected value of (1 - Xi Xj)/2, where Xi and Xj are the values, one or minus one, associated to Vi and Vj depending on whether each vector is above or below the hyperplane H. In other words, we have the weighted sum of the probabilities that Xi is different from Xj, that is, the probability that the two vectors are separated by the random hyperplane. And what is this probability when H is a random hyperplane? Let's see. The nice thing about these random hyperplanes is that the two vectors Vi and Vj together span a two-dimensional subspace, and a random hyperplane, restricted to those two dimensions, looks like a random line. So all we have to worry about is two unit vectors on the unit circle.
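This two-dimensional picture is easy to explore empirically: draw many random hyperplanes and count how often they separate two unit vectors at a given angle. Below is a quick Monte Carlo sketch; the function name and trial count are my own choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def separation_probability(theta, trials=200_000):
    """Estimate the probability that a random hyperplane through the
    origin separates two unit vectors at angle theta.  Only the plane
    spanned by the two vectors matters, so we can work in 2 dimensions."""
    vi = np.array([1.0, 0.0])
    vj = np.array([np.cos(theta), np.sin(theta)])
    # A random hyperplane is {x : g.x = 0} for a random Gaussian normal g;
    # it separates vi from vj iff g.vi and g.vj have opposite signs.
    g = rng.standard_normal((trials, 2))
    return np.mean(np.sign(g @ vi) != np.sign(g @ vj))

for theta in (0.5, 1.0, 2.0, 3.0):
    print(f"theta={theta}: estimate {separation_probability(theta):.3f}, "
          f"theta/pi = {theta/np.pi:.3f}")
```

The estimates match theta/pi to within sampling error, which is exactly the probability computed in the analysis that follows.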
A random line cutting at some random angle through the origin of the circle. What is the probability that this line separates Vi from Vj? Very simple. The angle between the two vectors is theta_ij, and the line has to fall within that angle, out of all possibilities from zero to pi. The angle of a random line is uniform, so the probability that H separates Vi from Vj is exactly theta_ij/pi. What this means is that the expected value of the output is equal to the weighted sum of theta_ij/pi, while the value of OPT is at most the weighted sum of (1 - cos theta_ij)/2. And then, there's a miracle lemma. Miracle lemma: for every theta, theta/pi is at least approximately 0.878 times (1 - cos theta)/2. What luck. That is just some trigonometric fact, and because of it, we can deduce that the expected value of the output is at least 0.878 times OPT. Very simple. So what do we have? We have designed an algorithm that outputs a random cut of the graph whose expected value is at least 87% of the unknown optimal value. Better than 50%: we have better than a factor-of-two approximation. So it all depends on that magical lemma. How do you prove that lemma? That's just basic calculus. You plot the function that to every angle theta associates (pi/theta) times (1 - cos theta)/2, as theta ranges from zero to pi. You look at this function, plot it, and find its maximum. The maximum, attained around theta ≈ 2.33, is 1/0.878, strictly less than two. You can prove it by calculus, but that is the proof by plotting, so to speak. So, to recap: we now have an algorithm that uses the ellipsoid method to solve the SDP relaxation of the Maxcut problem, rounds by choosing a random hyperplane and using it to separate the unit vectors into two parts, and then outputs the resulting cut of the vertices of the graph.
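The "proof by plotting" of the miracle lemma can be sketched numerically: minimize the ratio (theta/pi) / ((1 - cos theta)/2) over a fine grid of (0, pi]. The grid resolution below is my own choice.

```python
import numpy as np

# Ratio (theta/pi) / ((1 - cos theta)/2) on a fine grid of (0, pi].
# Near theta = 0 the ratio blows up, so the minimum is interior.
theta = np.linspace(1e-6, np.pi, 1_000_000)
ratio = (theta / np.pi) / ((1 - np.cos(theta)) / 2)

alpha = ratio.min()                  # the approximation constant, ~0.8786
theta_star = theta[ratio.argmin()]   # minimizing angle, ~2.33 radians
print(alpha, theta_star)
```

The reciprocal 1/alpha is about 1.14, the maximum of the plotted function, comfortably less than two.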
So we have a theorem, with proof, that the resulting cut has expected value at least 0.878 times OPT.
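The rounding step recapped above can be put into code. Here is a minimal sketch of the hyperplane rounding, assuming the unit vectors have already been produced by an SDP solver (which the sketch does not include); the function name and interface are my own.

```python
import numpy as np

def hyperplane_rounding(V, W, rng=None):
    """Round an SDP solution for Maxcut to a cut.

    V : (n, d) array whose rows are the unit vectors v_i from the
        solved relaxation (assumed given; any unit vectors work here).
    W : (n, n) symmetric nonnegative weight matrix of the graph.
    Returns (x, value): x[i] in {-1, +1} is the side of vertex i,
    and value is the total weight of the cut.
    """
    if rng is None:
        rng = np.random.default_rng()
    g = rng.standard_normal(V.shape[1])   # normal vector of the random hyperplane H
    x = np.sign(V @ g)                    # which side of H each vector falls on
    x[x == 0] = 1.0                       # break ties (a probability-zero event)
    # Cut value: sum over edges i < j of w_ij * (1 - x_i * x_j) / 2.
    value = np.triu(W * (1 - np.outer(x, x)) / 2, k=1).sum()
    return x, value
```

For instance, on the 4-cycle the relaxation is exact: with v_0 = v_2 = (1, 0) and v_1 = v_3 = (-1, 0), every random hyperplane separates the opposite vectors, so the rounding always returns the optimal cut of weight 4.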