[MUSIC] So we have defined a rule for the parallel transport of a vector with an upper index. The rule is

$\delta A^\mu = -\Gamma^\mu_{\nu\alpha}(x)\, A^\nu(x)\, dx^\alpha.$

So the question is: what is the rule for parallel transporting a vector with a lower index? It should be in agreement with the rule above, and to define it we have to bear in mind that parallel transport is basically a kind of transformation of a vector, a kind of rotation, as we move it along some path, not necessarily a geodesic, in our space or space-time. Under such a transformation, of course, the scalar product does not change: $\delta(A_\mu B^\mu) = 0$ under parallel transport. As a result we have the rule: $B^\mu\, \delta A_\mu = -A_\mu\, \delta B^\mu$. Now here we can use the upper-index rule for the vector $B^\mu$; as a result we obtain that the right-hand side is $A_\mu\, \Gamma^\mu_{\nu\alpha}\, B^\nu\, dx^\alpha$. Relabeling the dummy indices $\mu \leftrightarrow \nu$ and noting that $B^\mu$ is arbitrary, one obtains

$\delta A_\mu = \Gamma^\nu_{\mu\alpha}(x)\, A_\nu(x)\, dx^\alpha,$

in addition to the first equation. As a result, we have the following definition of a covariant derivative. Let me write it explicitly. The covariant derivative of a vector $A^\mu$ with an upper index is

$A^\mu_{;\alpha} \equiv D_\alpha A^\mu = \partial_\alpha A^\mu + \Gamma^\mu_{\nu\alpha}\, A^\nu$

(the semicolon and $D_\alpha$ are different notations for the same thing). This just follows from the equations I have written so far. Similarly,

$A_{\mu;\alpha} \equiv D_\alpha A_\mu = \partial_\alpha A_\mu - \Gamma^\nu_{\mu\alpha}\, A_\nu.$

So this is the covariant derivative. One more comment is in order: apart from this $\Gamma^\mu_{\nu\alpha}$, we are also going to use $\Gamma$ with all lower indices, which by definition is

$\Gamma_{\beta\nu\alpha} \equiv g_{\beta\mu}\, \Gamma^\mu_{\nu\alpha}.$

Well, a natural question to ask is: what is the covariant derivative of tensors with more indices?
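To make the index gymnastics concrete, here is a small sympy sketch (my own illustration, not part of the lecture) that builds the Christoffel symbols from a metric and applies the two vector formulas above. The flat plane in polar coordinates $(r, \phi)$ with $g_{\mu\nu} = \mathrm{diag}(1, r^2)$ is an assumed example, chosen because its symbols are simple and nonzero.

```python
import sympy as sp

# Assumed example: flat 2D plane in polar coordinates (r, phi),
# with metric g_{mn} = diag(1, r^2).
r, phi = sp.symbols('r phi', positive=True)
x = [r, phi]
g = sp.diag(1, r**2)     # metric g_{mn}
ginv = g.inv()           # inverse metric g^{mn}

def Gamma(m, n, a):
    """Christoffel symbol Gamma^m_{na} = (1/2) g^{mb} (d_n g_{ba} + d_a g_{bn} - d_b g_{na})."""
    return sp.Rational(1, 2) * sum(
        ginv[m, b] * (sp.diff(g[b, a], x[n]) + sp.diff(g[b, n], x[a]) - sp.diff(g[n, a], x[b]))
        for b in range(2))

def D_upper(A, a):
    """Components of D_a A^m = d_a A^m + Gamma^m_{na} A^n (upper index: plus sign)."""
    return [sp.simplify(sp.diff(A[m], x[a]) + sum(Gamma(m, n, a) * A[n] for n in range(2)))
            for m in range(2)]

def D_lower(A, a):
    """Components of D_a A_m = d_a A_m - Gamma^n_{ma} A_n (lower index: minus sign)."""
    return [sp.simplify(sp.diff(A[m], x[a]) - sum(Gamma(n, m, a) * A[n] for n in range(2)))
            for m in range(2)]

print(Gamma(0, 1, 1))    # Gamma^r_{phi phi} = -r
print(Gamma(1, 0, 1))    # Gamma^phi_{r phi} = 1/r
A = [sp.Integer(1), sp.Integer(0)]   # components (A^r, A^phi) of a radial vector field
print(D_upper(A, 1))     # [0, 1/r]
```

The sign difference between `D_upper` and `D_lower` is exactly what keeps the scalar product $A_\mu B^\mu$ invariant under parallel transport.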
Well, to show examples, let me just write a few formulas explicitly. Suppose we have a tensor with two upper indices, $\mu$ and $\nu$, and we want to take its covariant derivative (the notation is the same as before). Then, of course, we have to apply $\Gamma$ to each index separately:

$D_\alpha A^{\mu\nu} = \partial_\alpha A^{\mu\nu} + \Gamma^\mu_{\beta\alpha}\, A^{\beta\nu} + \Gamma^\nu_{\beta\alpha}\, A^{\mu\beta}.$

So this is the rule for such a tensor. Now if we want to apply the covariant derivative to a tensor with one upper and one lower index, the result is as follows, with a plus sign for the upper index and a minus sign for the lower one:

$D_\alpha A^\mu{}_\nu = \partial_\alpha A^\mu{}_\nu + \Gamma^\mu_{\beta\alpha}\, A^\beta{}_\nu - \Gamma^\beta_{\nu\alpha}\, A^\mu{}_\beta.$

And furthermore, if we have to apply $D_\alpha$ to something which carries two lower indices, we have

$D_\alpha A_{\mu\nu} = \partial_\alpha A_{\mu\nu} - \Gamma^\beta_{\mu\alpha}\, A_{\beta\nu} - \Gamma^\beta_{\nu\alpha}\, A_{\mu\beta}.$

So these are the formulas, and similarly for more complicated tensors with more indices of different kinds in different places. It is not hard: if you do many exercises with tensors, you will get used to it. It is not horribly complicated.

One thing one has to bear in mind at this point is that in Minkowski space, in the case when the metric is $\eta_{\mu\nu}$, it follows that $\Gamma^\mu_{\nu\alpha} = 0$. I have to stress that it is zero for Minkowski space; if we have flat space in different, curvilinear coordinates, $\Gamma$ is not necessarily zero. It is only in Minkowski coordinates that it vanishes. Well, that goes in accordance with the fact that in flat space parallel transport along a closed contour does not change the vector. But it is not only that: in principle $\Gamma$ can be nonzero, and still the vector does not transform if we parallel transport it along a closed contour.
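The rank-2 rules and the Minkowski remark can both be checked mechanically. The sketch below (again my own illustration, not from the lecture) applies the two-lower-index formula to the metric itself, which must give zero for any metric (metric compatibility), and verifies that the Christoffel symbols vanish for the constant Minkowski metric but not for the flat plane in polar coordinates, an assumed example.

```python
import sympy as sp

r, phi, t, X = sp.symbols('r phi t X', positive=True)

def christoffel(g, coords):
    """Nested list G[m][k][a] = Gamma^m_{ka} built from the metric g_{mn}."""
    n, ginv = len(coords), g.inv()
    return [[[sp.simplify(sp.Rational(1, 2) * sum(
        ginv[m, b] * (sp.diff(g[b, a], coords[k]) + sp.diff(g[b, k], coords[a])
                      - sp.diff(g[k, a], coords[b])) for b in range(n)))
        for a in range(n)] for k in range(n)] for m in range(n)]

def D_lower2(A, a, g, coords):
    """D_a A_{mn} = d_a A_{mn} - Gamma^b_{ma} A_{bn} - Gamma^b_{na} A_{mb}."""
    n, G = len(coords), christoffel(g, coords)
    return sp.Matrix(n, n, lambda m, k: sp.simplify(
        sp.diff(A[m, k], coords[a])
        - sum(G[b][m][a] * A[b, k] for b in range(n))
        - sum(G[b][k][a] * A[m, b] for b in range(n))))

# Constant Minkowski metric in (t, X): all derivatives of the metric vanish,
# so every Christoffel symbol is zero and D_a reduces to d_a.
eta = sp.diag(-1, 1)
assert christoffel(eta, [t, X]) == [[[0, 0], [0, 0]], [[0, 0], [0, 0]]]

# Flat plane in polar coordinates: Gamma is nonzero even though the space is flat.
g_polar = sp.diag(1, r**2)
print(christoffel(g_polar, [r, phi])[0][1][1])   # Gamma^r_{phi phi} = -r
print(D_lower2(g_polar, 0, g_polar, [r, phi]))   # zero matrix: D_a g_{mn} = 0
```

That $D_\alpha g_{\mu\nu} = 0$ comes out identically is a useful sanity check on the signs in the two-lower-index formula.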
We will see this fact in a different manner at some point during this lecture. [MUSIC]