0:40

We will say that a process X_t is stationary or strictly

stationary if all of

its finite-dimensional distributions are invariant under shifts in time.

That is, (X_{t_1+h}, ..., X_{t_n+h}) is equal in distribution to (X_{t_1}, ..., X_{t_n}), and this equality should hold for all time moments t_1, ..., t_n and for any h larger than zero.

Another kind of stationarity, the so-called weak stationarity, is the following. We will say that X_t is weakly stationary if, first of all, the mathematical expectation m(t) doesn't depend on t; it is a constant.

And on the other hand, the covariance function K(t, s), which is, recall, the covariance between X_t and X_s, depends only on the difference between t and s. We can express this mathematical condition in two ways. First, we can say that K(t, s) is the same as K(t+h, s+h) for all t and s and for any h larger than zero. Or, in other words, we can say that there exists some function gamma, which is called the autocovariance function (a function from R to R), such that K(t, s) is equal to gamma at the point t - s. Weak stationarity has a lot of different names.

For instance, the same kind of stationarity is sometimes called

second-order stationarity or wide-sense stationarity.

Also, there is a notion of covariance stationarity.

All these notions are the same.

So we either speak of strict stationarity and discuss the properties of all finite-dimensional distributions, or we speak only about the mathematical expectation and the covariance function.

I will discuss the relation between these two kinds of stationarity a bit later.

But now, let me just give a couple of properties of this function gamma, because this function will play an essential role in this lecture. So which properties does this autocovariance function have? Let me just list some of them.

4:08

First of all, let me mention that gamma at zero is non-negative.

In fact, gamma at zero is equal to the covariance between X_t and X_t for any time moment t, and this is exactly the variance of X_t. The variance is non-negative, and therefore the gamma function at zero is also non-negative.

Secondly, let me mention that the absolute value of gamma at t is less than or equal to gamma at zero. That is, the function gamma can take negative values, but it is bounded: its absolute value at any t is in any case smaller than or equal to gamma at zero.

This is also a very simple fact, because the covariance between X_t and X_0 is in any case smaller than or equal to the square root of the variance of X_t multiplied by the square root of the variance of X_0. This is just the Cauchy-Schwarz inequality. And since the variance of X_t is equal to gamma at zero, this is the same as the square root of gamma at zero multiplied by the square root of gamma at zero, and therefore it is gamma at zero. So basically, this property is also proven.

And the third property,

which I would like to mention now,

is that the function gamma is even.

6:11

This is also nothing more than a property of the covariance function, because gamma at t is equal to the covariance between X_t and X_0, which is the same as the covariance between X_0 and X_t, and this is gamma at the point minus t. Therefore, the function gamma is even.

So once more, the main properties of the autocovariance function are the following. First, gamma at zero is non-negative. Secondly, its absolute value for any t is smaller than or equal to gamma at zero. And the third property is that this function is even.
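These three properties can also be checked numerically on a simulated path. The following sketch (assuming NumPy; the helper `sample_autocovariance` is my own illustrative estimator, not from the lecture) estimates gamma from data and verifies that gamma(0) is non-negative, that |gamma(t)| is bounded by gamma(0), and that gamma is even.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)  # a sample path; white noise, just for illustration

def sample_autocovariance(x, lag):
    """Biased sample autocovariance: gamma_hat(h) = (1/N) * sum (x_t - mean)(x_{t+|h|} - mean)."""
    x = np.asarray(x, dtype=float)
    n, h = len(x), abs(lag)          # gamma is even, so only |lag| matters
    xc = x - x.mean()
    return float(np.dot(xc[:n - h], xc[h:]) / n)

gamma0 = sample_autocovariance(x, 0)
assert gamma0 >= 0                                                  # gamma(0) = Var(X_t) >= 0
assert all(abs(sample_autocovariance(x, k)) <= gamma0
           for k in range(1, 50))                                   # |gamma(t)| <= gamma(0)
assert sample_autocovariance(x, 5) == sample_autocovariance(x, -5)  # gamma is even
```

The biased estimator (dividing by N rather than N - h) is used deliberately: for it, the bound |gamma_hat(h)| <= gamma_hat(0) holds deterministically, mirroring the Cauchy-Schwarz argument above.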

Let me now provide some examples of stationarity and weak stationarity and discuss the relation between these two kinds of stationarity.

First, let me assume that the process X_t has a finite second moment.

You know that this assumption is very common in the context of stochastic processes, because if you want to write a covariance function, and you need to do so in many situations, then you immediately assume that this condition is fulfilled.

And in this case, it turns out that if the process X_t is strictly stationary, then it is also weakly stationary.

7:51

I guess that this statement is intuitively very clear, because if X_t has finite-dimensional distributions which are invariant under shifts in time, then, of course, the conditions on the mean and the covariance should also be fulfilled. But I don't want to prove it, because intuitively it should be so.

And the second statement: for Gaussian processes, the notions of strict and weak stationarity are the same. So for Gaussian processes, X_t is strictly stationary if and only if X_t is weakly stationary. I think that this statement is also intuitively clear if you recall that in the case of Gaussian processes, the mean function and the covariance function completely determine the distribution. So, please keep in mind these two statements.

And now, I will proceed with examples.

The first example is the so-called white noise process.

9:15

This process is defined for integer t; sometimes only non-negative values are assumed, so 0, 1, 2, 3, and so on, but sometimes one can also consider negative values, so I'll write here plus-minus 1, plus-minus 2, plus-minus 3.

So, X_t is drawn from a fixed distribution, such that X_t and X_s are uncorrelated if t is not equal to s. Moreover, the mathematical expectation of X_t is equal to a constant, and it is normally assumed that this constant is equal to zero, and the variance of X_t is also a constant, sigma squared. In this case, it is very common to denote the white noise process as WN(0, sigma squared).

Okay. The covariance between X_t and X_s is equal to zero if t is not equal to s. This means that the covariance function can be represented as sigma squared multiplied by the indicator that t is equal to s, and therefore this covariance function can be represented as an autocovariance function at the point t - s. The autocovariance function in this case is equal to sigma squared multiplied by the indicator that the argument is equal to zero.

So, we conclude that this process is weakly stationary.

In fact, the mathematical expectation is a constant, which is equal to zero, and the covariance function is expressed through this autocovariance function.

In the general case, this process is not strictly stationary, but there are some particular cases where it is so. One particular case is when X_1, X_2, and so on are in fact independent identically distributed random variables. In this case, the process X_t is known as i.i.d. noise. Without a doubt, i.i.d. noise is also strictly stationary. Another particular case is when the process X_t is a Gaussian process.
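As a sanity check, one can simulate white noise and verify the defining properties empirically. This is a minimal sketch assuming NumPy; the variance sigma2 and all tolerances are chosen ad hoc for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0                              # illustrative noise variance
n_paths, n_steps = 20_000, 50
# Gaussian case of WN(0, sigma^2): independent draws, mean 0, variance sigma^2
X = rng.normal(0.0, np.sqrt(sigma2), size=(n_paths, n_steps))

means = X.mean(axis=0)                    # E[X_t] should be ~0 for every t
variances = X.var(axis=0)                 # Var(X_t) should be ~sigma^2 for every t
cov_0_7 = np.mean(X[:, 0] * X[:, 7])      # Cov(X_t, X_s) should be ~0 for t != s

assert np.all(np.abs(means) < 0.1)
assert np.all(np.abs(variances - sigma2) < 0.3)
assert abs(cov_0_7) < 0.1
```

Note that because the draws here are Gaussian and independent, this particular simulated process is in fact i.i.d. noise, and so it illustrates the strictly stationary special case as well.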

12:24

Well, let me now provide some other examples.

The second example is a random walk.

We have discussed this example when we started the theory of Markov Chains and

it was one of our first examples of a Markov Chain.

Just recall that a process S_n is a random walk if it is equal to S_{n-1} + psi_n, where psi_1, psi_2, and so on are a sequence of independent identically distributed random variables taking the value 1 with probability p and -1 with probability 1 - p. It is also assumed that S_0 is equal to zero. Without a doubt, we can also rewrite the definition in the following form: S_n is equal to psi_1 + ... + psi_n. The mathematical expectation of S_n is equal to n multiplied by the mathematical expectation of psi_1, and it is equal to n multiplied by 2p - 1.

Therefore, if p is not equal to one half,

then the mathematical expectation of S_n depends on n. In this case, we get that the process S_n is not stationary in the weak sense. But you know that if a process is strictly stationary, then it is weakly stationary. And from here, by simple logic, one concludes that if it is not weakly stationary, it is also not strictly stationary.

Therefore, if p is not equal to one half,

then the process is neither strictly nor weakly stationary.

So, to continue our consideration,

we should concentrate on the case p is equal to one half.

And in this case,

mathematical expectation of S n is equal to zero.

As for the covariance function, let me take two integer values n and m, and let me assume that n is larger than m. Then the covariance between S_n and S_m is the covariance between S_m + psi_{m+1} + ... + psi_n and S_m. Since the covariance is linear in each argument, we get here the covariance between S_m and S_m plus the covariance between the sum psi_{m+1} + ... + psi_n and S_m. You know that S_m is a sum of psi_1, ..., psi_m, and here we have a sum of psis with indices larger than m, starting from m + 1. Since psi_1, psi_2, and so on are independent and identically distributed, this second covariance is equal to zero. As for the first summand, it is equal to the variance of S_m, and the variance of S_m is equal to m multiplied by the variance of psi_1.

16:05

So, you see that this covariance depends on m, that is, on the minimum between n and m. Therefore, what we have here is that the covariance function cannot be decomposed as some function of the argument n - m. If you are not sure that this is so, I advise you to consider the time moments n + h and m + h, as is done here, and then you will immediately realize that the minimum cannot be represented as a function of the difference between the arguments. So finally, we conclude that a random walk is neither weakly nor strictly stationary, both in the case when p is equal to one half and in the case when p is any other number between zero and 1.
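The computation above, Cov(S_n, S_m) = min(n, m) * Var(psi_1), can be confirmed by Monte Carlo simulation. A minimal sketch assuming NumPy and p = 1/2, so that the means are zero and Var(psi_1) = 1:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 50_000, 30
# Steps psi_i = +1 or -1 with probability 1/2 each; S_n = psi_1 + ... + psi_n
psi = rng.choice([-1, 1], size=(n_paths, n_steps))
S = np.cumsum(psi, axis=1)                    # S[:, k] holds S_{k+1}

n, m = 25, 10
cov_nm = np.mean(S[:, n - 1] * S[:, m - 1])   # E[S_n S_m]; the means are 0 at p = 1/2
# Cov(S_n, S_m) = min(n, m) * Var(psi_1) = min(n, m), since Var(psi_1) = 1
assert abs(cov_nm - min(n, m)) < 0.5
```

Shifting both indices by the same h changes min(n + h, m + h) to min(n, m) + h, so the covariance is visibly not a function of n - m alone.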

Let me continue. Our third example will be a Brownian motion.

17:22

It is not stationary, just because if you consider the variance of the Brownian motion, you get that the variance is equal to t. This is because, by definition, B_t - B_s has a normal distribution with mean zero and variance equal to t - s. If you now substitute s equal to zero, then you get here B_0, which is zero, and here t - s will be equal to t. Therefore, the variance of B_t is equal to t. But if the process were weakly stationary, then the variance of B_t would be equal to the value of the function gamma at the point zero, and therefore would not depend on t. You know that this is not the case, and therefore we conclude that Brownian motion is not weakly stationary.

We can also show the same fact in another way. For instance, we can consider the covariance function; it is equal to the minimum between the arguments. And as we have discussed previously, the minimum cannot be represented as a function of the difference between the arguments.

Let me now clarify this more precisely.

So, if you take K(t + h, s + h) and assume that t is larger than s, then what we have here is actually s + h. And if the process were weakly stationary, then this should be equal to K(t, s), that is, to the minimum between t and s, which is equal to s. And of course, the equality s + h = s is not possible for any positive h. Well, finally, we conclude that Brownian motion is not weakly stationary, and therefore it is also not strictly stationary.

In this case we can also employ the fact that for Gaussian processes,

strict stationarity and weak stationarity are exactly the same.
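The growing variance can be illustrated numerically by approximating Brownian motion on a grid with cumulative sums of independent Gaussian increments. This is a sketch, with the grid step and tolerances chosen ad hoc:

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, dt = 20_000, 100, 0.1
# Approximate Brownian motion on a grid: B_t is a cumulative sum
# of independent N(0, dt) increments
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)

k1, k2 = 20, 80                      # grid indices of the time moments t1 and t2
t1, t2 = k1 * dt, k2 * dt
v1 = B[:, k1 - 1].var()              # empirical Var(B_{t1}), should be close to t1
v2 = B[:, k2 - 1].var()              # empirical Var(B_{t2}), should be close to t2

# Var(B_t) = t grows with t; for a weakly stationary process,
# Var(X_t) = gamma(0) would have to be a constant
assert abs(v1 - t1) < 0.2
assert abs(v2 - t2) < 0.5
```

The two empirical variances differ by roughly a factor of four, matching t2/t1 and ruling out a constant gamma(0).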