0:00

In this lecture, we'll talk about the autocovariance function.

Objectives are the following.

We'll recall random variables from our introductory statistics and probability class, and we'll recall the covariance of two random variables.

We will give a new definition of a time series.

We'll characterize a time series as a realization of a stochastic process.

We'll talk about stochastic processes versus deterministic processes as well, and we'll define the autocovariance function.

So what's a random variable?

A random variable is a function that goes from a sample space to the real numbers. Members of the sample space are all possible outcomes of the experiment, and if we map each possible outcome of the experiment to a number on the real line, we get a random variable.

0:57

But for us, we'll look at it in a slightly different way. We're going to look at it as a machine. Basically, it's a machine that produces these random numbers.

Now once it produces a lot of numbers, those numbers together are a dataset. If we start with data like this, we can say they're all coming from this machine, this random variable X. If I know the properties of the random variable, for example the distribution of this random variable, I can say something meaningful about my dataset.

1:30

So here we have a random variable on the right-hand side, and a dataset on the left-hand side: 45, 36, 27, it's a dataset. But if we assume that it comes from this one random variable X, we are now left with X, and mathematically we work on X, and then we infer something meaningful about the dataset using the properties of X.

From your probability and statistics class, you already know that random variables might be discrete or continuous. A discrete random variable produces countably many possible numbers on the real line. For example, on the left-hand side, X is a discrete random variable; the possible outcomes of X are 20, 30, 57, and so forth, so basically they're countably many. But on the right-hand side, we have a continuous random variable Y, and it might take any point in between, let's say, 10 and 60.

Now before we do the experiment, everything is random, right? You flip a coin, you have randomness. It can be heads or tails.

But once we flip the coin, the result of the experiment is known, the randomness is gone. So the same thing happens here, right?

Once we do the experiment, let's say X becomes 20, the discrete random variable X becomes 20, which means the randomness is gone now. And we have an exact value for it, it's 20. We call that 20 a realization of the random variable X.

Same thing for Y. Y is a continuous random variable. But say we do the experiment. The randomness is gone, now we have a value for it. Let's say it's 30.29. And then we say 30.29 is a realization of the random variable Y.
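The idea of a realization is easy to see in code. Here is a minimal sketch, reusing the lecture's example values (the discrete outcomes 20, 30, 57 and the continuous interval 10 to 60):

```python
import random

random.seed(0)

# Discrete random variable X: countably many possible outcomes,
# here the values 20, 30, 57 from the lecture's example.
def draw_X():
    return random.choice([20, 30, 57])

# Continuous random variable Y: can take any point between 10 and 60.
def draw_Y():
    return random.uniform(10, 60)

# Doing the "experiment" removes the randomness: each call
# returns one concrete realization.
x = draw_X()
y = draw_Y()
print(x, y)
```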

3:19

If we have two random variables, X and Y, we learned this notion called covariance from our probability class; it somehow measures the linear dependence between two random variables, right? We are talking about this abstractly. If you have two datasets, covariance will tell us something about the linear dependence of the pair of datasets.

But right now, we model each of our datasets with a random variable, X and Y. Abstractly, we are defining the covariance of X and Y using the formula Cov(X, Y) = E[(X - E[X])(Y - E[Y])]. And that defines covariance. And let me just mention that Cov(X, Y) = Cov(Y, X); it's symmetric.
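The formula and its symmetry can be checked numerically. This is a sketch with simulated data; the linear relationship y = 2x + noise is an illustrative choice, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(42)

# Two linearly related samples (an illustrative choice).
x = rng.normal(size=10_000)
y = 2 * x + rng.normal(scale=0.5, size=10_000)

# Cov(X, Y) = E[(X - E[X]) (Y - E[Y])], estimated from the samples.
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
cov_yx = np.mean((y - y.mean()) * (x - x.mean()))

print(cov_xy)                       # close to 2 * Var(X), i.e. about 2
print(np.isclose(cov_xy, cov_yx))   # symmetry: Cov(X, Y) = Cov(Y, X)
```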

4:17

We talked about random variables, but now we put a lot of random variables together and give them a sequence. For example, there's the first random variable X1, then the second, and so on: at time one it's X1, at time two it's X2, at time three it's X3, and now you have a sequence of random variables. We call it a stochastic process. Each one of these random variables might have its own distribution, might have its own expectation, might have its own variance.

But the way to think about a stochastic process is to think of it versus a deterministic process. In a deterministic process, for example, take the solution of an ordinary differential equation. You start with some point, and the solution of the ODE will tell you the exact trajectory, so you know exactly where you're going to be at the next time step, the next time step, and so forth. A stochastic process is basically the opposite of that. At every step you have some randomness. You don't know exactly where you're going to be. But there is some distribution of X at that time step.
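The contrast can be sketched in a few lines. The particular ODE dx/dt = -0.1x and the noise level below are illustrative choices, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(50)

# Deterministic process: the solution of dx/dt = -0.1 x with x(0) = 1
# is x(t) = exp(-0.1 t).  Every run gives exactly the same trajectory.
deterministic = np.exp(-0.1 * t)

# Stochastic process: fresh randomness enters at every time step, so
# we only know a distribution for each X_t, not an exact value.
stochastic = deterministic + rng.normal(scale=0.1, size=t.size)

print(deterministic[:3])
print(stochastic[:3])
```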

5:25

Now, we are ready to define a time series in a slightly different way. Let me remind you of our first definition. What was a time series? A time series is any dataset, but collected at different times.

But now we say, wait a minute, maybe there is some stochastic process going on in the background that we are not aware of, which is X1, X2, X3, and so forth, and the realization of X1 is my first datapoint in the time series, the realization of X2 is my second datapoint in my time series. So 30, 29, 57, and so on: this time series that I started with and am mainly trying to analyze is actually a realization of the stochastic process going on in the background.

So if I know the stochastic process, if I know X1, X2, X3, and how it changes, then I can say something meaningful about my time series. But realize that the stochastic process X1, X2, X3, and so forth might come with an ensemble of realizations; I mean, it might generate its own ensemble of time series. But I only have one time series. By having only one time series, basically one point at each time, you would like to say something meaningful about the stochastic process.
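A minimal sketch of this picture, using a Gaussian random walk as a stand-in stochastic process (the lecture does not fix a particular one):

```python
import numpy as np

rng = np.random.default_rng(0)

# One stochastic process (a Gaussian random walk), an ensemble of
# realizations: each row is one time series the process could produce.
n_realizations, n_steps = 5, 100
ensemble = rng.normal(size=(n_realizations, n_steps)).cumsum(axis=1)

# In practice we observe only ONE row -- a single time series,
# one point at each time -- and from it we try to say something
# meaningful about the process itself.
observed = ensemble[0]
print(ensemble.shape, observed[:3])
```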

6:49

The autocovariance function is defined by basically just taking the covariance of different elements in our sequence, in our stochastic process. If we take Xt and Xs, where s and t might be different locations, and we take the covariance of them, we get gamma(s, t) = Cov(Xs, Xt), and we call that the autocovariance. And if we take the covariance of Xt with itself, of course we get the variance at that time point.
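Here is a hedged numerical check of gamma(s, t) = Cov(Xs, Xt), estimated across a simulated ensemble. The random walk is an illustrative process for which theory gives gamma(s, t) = min(s, t) when the steps have unit variance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of random-walk realizations: X_t = e_1 + ... + e_t with
# unit-variance steps, so gamma(s, t) = Cov(X_s, X_t) = min(s, t).
n_real, n_steps = 100_000, 30
walks = rng.normal(size=(n_real, n_steps)).cumsum(axis=1)

# 0-based column s corresponds to time s + 1.
s, t = 9, 19   # times 10 and 20

xs, xt = walks[:, s], walks[:, t]
gamma_st = np.mean((xs - xs.mean()) * (xt - xt.mean()))   # ~ min(10, 20) = 10

# gamma(t, t) is just the variance at that time point.
gamma_tt = np.mean((xt - xt.mean()) ** 2)                 # ~ 20

print(round(gamma_st, 1), round(gamma_tt, 1))
```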

7:33

What matters is the time difference between these random variables. In other words, we look at, for example, random variable Xt and random variable Xt+k. It doesn't matter what t is. The time difference is k, and the time difference actually decides the nature, decides the fate, of our autocovariance.

And the reason is the following. We assume we're working with a stationary time series. Remember, in a stationary time series we said the properties of one part of the time series are the same as the properties of the other parts of the time series.

8:13

So in this case, if you look at X1 and Xk+1, or X10 and X10+k, these are different parts of the time series, but in both cases we only have k steps in between. The properties of these sections of the time series must be the same. So the covariance of Xk+1 with X1 is the same as the covariance of X10+k with X10. And we call that gamma k.

Â And we call that gamma k.

Â So gamma is our autocovariance function.

Â Gamma k is going to be called autocovariance coefficient, but

Â we usually do not have the stochastic process, right?

Â We only have a time series, just a realization of the stochastic process.

Â So we're going to use that to approximate gamma k with Ck,

Â which we will call the autocovariance coefficient.
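The approximation of gamma k from a single time series can be sketched as follows. The 1/n normalization is the usual convention, and the white-noise test series is an illustrative choice:

```python
import numpy as np

def sample_autocovariance(x, k):
    """c_k = (1/n) * sum_{t=1}^{n-k} (x_t - xbar) * (x_{t+k} - xbar)."""
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), np.mean(x)
    return np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n

rng = np.random.default_rng(7)
series = rng.normal(size=500)   # white noise: gamma_k ~ 0 for k > 0

c0 = sample_autocovariance(series, 0)   # c_0 is the sample variance
c1 = sample_autocovariance(series, 1)   # should be near zero
print(round(c0, 2), round(c1, 2))
```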

9:07

So what have we learned in this lecture? We have learned the definition of a stochastic process, which is a collection of random variables. And we learned how to characterize a time series in a slightly different way, by realizing that it is actually a realization of a stochastic process. And we learned how to define the autocovariance function of a time series.
