Hi, in this final lecture on tipping points, what I wanna do is talk

about how we can measure tips. Now remember, previously, we've talked about

active tips or direct tips, where the variable itself causes the system to tip.

And then contextual tips, where something in the environment, remember, in the case

of the percolation model, causes the system to tip. We've also talked about tips
within class, from one equilibrium to another equilibrium, and tips across
classes, from equilibrium to complex. What we'd like now is some way of
measuring how big a tip is: was it a tip that was completely unexpected, or a
tip that was, you know, sort of likely to happen? Or just how rare was the event
that the system tipped into? So to do that, we need some way of measuring. Now,
that's a key thing in this models class, right? We
wanna be able to take these models to data, some way to use data.

So, if we're gonna use data we have to have some measures. So, what I wanna do in

this very simple lecture is just introduce two different measures of the extent
of a tip. One is gonna use something

called the diversity index. The other's gonna use a formula called entropy which

is used a lot in physics, not that much in social science. Okay, so let's get

started. Let's first think about an active tip. With an active tip, what we have
is, we've got, you know, this ball up here, and it's sitting on top of this
peak. It could either go this way or this way. If it heads this way it's gonna
go down here, or it could head down here. Each of these things has a 50 percent
chance of occurring. Once the system has tipped, then it's either gonna be to
the right or to the left. So if it's down here, now there's a 100 percent chance
that it's over here, and there's no chance it's over here on the left. When we think

about a tipping point, what we can think of is that previously, what was gonna

happen was uncertain. It could either go left or it could go right, but once the

tip has occurred, we know what's happened. We know the system is now completely gonna

be on the right. So one way to think about tipping points, and the measure we're
going to introduce, is gonna depend on that idea: that the uncertainty goes away. Initially,

there was some uncertainty. It could go left or right, but after the tip, we know

where it's gonna go. So we're gonna measure tippiness by reductions in
uncertainty. To get there, we first need a measure of uncertainty. The way to
think about that is to think about changes in outcomes. So, initially

there's a whole bunch of different outcomes that could occur, but then after
the system tips, maybe only one outcome can occur, if it goes to some
equilibrium. Or it could be the case that there's a whole bunch of other things
that could occur. So it could be that you thought for sure A was gonna happen,
but after the tip it could be B, C, or D. So, what you

want to think of is changes in the likelihoods of different outcomes. So, how

do we measure that? How do we measure likelihoods of different outcomes? So,

we're gonna use, as I mentioned, two measures. One is gonna be this thing

called the diversity index, which is used a lot in social science: in
economics, political science, sociology. The other is going to be a measure
called entropy, which comes from physics and information theory. So here's the idea. Suppose we

have four outcomes, A, B, C, and D, and suppose each one has a probability of
one-fourth. So each one has a one-fourth probability of being true. The question
is how do we measure how uncertain that is? Here's what we're gonna do. First
we're gonna introduce the diversity index, and here's how it works. Suppose
you've got four possibilities, A, B, C, and D. So here they are: PA, PB, PC, and PD. So remember from

probability that, since those are the only things that can happen, these four
probabilities are gonna have to sum up to one. Well, we want some measure of how
diverse this distribution is over these four outcomes, 'cuz if it's a fourth, a
fourth, a fourth, a fourth, that's more diverse than if it's a half, a half,
zero, zero. So here's the idea of the diversity index. We basically first
compute the probability that, if two people meet, they are of the same type. So

let's suppose this is a distribution across people, and there are types A, B, C,

and D. Well, what are the odds that they're both type As? Well, that would be
PA times PA. And what would be the probability that they're both type Bs?
Well, that would be PB times PB. And what's the probability they're both type
Cs? Well again, that's PC times PC. And finally, they could both be Ds. I can
write this in a fancy way using a summation sign: the sum, for i equals A, B, C,
or D, of Pi squared. Right, so I can just take each of these probabilities, PA,
PB, PC, PD, square them, and just sum those up. So this little funny sign here means a

summation sign. So let's suppose I have PA equals PB equals PC equals PD, so
they're all one-fourth. Well, then what I'm gonna get is one-fourth squared,
plus one-fourth squared, plus one-fourth squared, plus one-fourth squared,
which is one-sixteenth plus one-sixteenth plus one-sixteenth plus
one-sixteenth, which is four-sixteenths, which is one-fourth. So that's nice:
it's one-fourth. To get the diversity index, what we do is we just take one over
the summation of those Pi squares. So that's going to be one over one-fourth,
which is equal to four. So what that's telling us is, look, it's like there's
four types here. Okay, so the diversity index is just one over the summation of
these Pi squares. So before, when we had PA equals PB
equals PC equals PD, we got a diversity index of four. Well, let's suppose we

have PA is a half, PB is a third, and PC is a sixth. So what we have is we've
got three types, A's, B's, and C's, but they're not evenly distributed. So what
we'd like is for our diversity index to be maybe a little less than three,
because this isn't quite three full types. So let's see what happens when we
compute it. We're going to get PA squared, which is one-fourth, plus PB squared,
which is one-ninth, plus PC squared, which is one thirty-sixth. So you put all
these over 36, and you get nine plus four plus one over 36. That's fourteen over
36. If we take the diversity index, that's going to be the reciprocal of that,
which is 36 over fourteen, so that's two and eight-fourteenths, or two and
four-sevenths. So what the diversity index tells us is, well, you know, this is
not quite three, because you're more likely to get A than you are to get C, and
so we'll call this two and four-sevenths.
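That arithmetic is easy to check in code. Here's a minimal sketch (the function name `diversity_index` is my own, not from the lecture):

```python
def diversity_index(probs):
    """Diversity index: one over the sum of the squared probabilities,
    i.e. the effective number of types."""
    return 1.0 / sum(p * p for p in probs)

# The example above: PA = 1/2, PB = 1/3, PC = 1/6
print(diversity_index([1/2, 1/3, 1/6]))  # 36/14 = 2.571..., i.e. two and four-sevenths
```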

So the diversity index sort of tells you approximately how many different
types of things there are. So why does this work to measure tips? Well, if you

think of it as: if initially we thought there were five places it could go, and
then after the tip there was only one place it could go, the diversity index
would flip from five to one. If initially it could go to A with probability
one-half, B with probability one-third, and C with probability one-sixth, then
we could say, well, initially there were two and four-sevenths places it could
go. But then if it tips and goes to C, we could say now there's only one place
it could go. And so we get a reduction in the diversity index from two and
four-sevenths down to one, and that helps us measure the extent of the tip. Okay, now let's turn to

entropy. Entropy is a slightly more complicated formula, but it also measures
the degree of uncertainty. Entropy, again, comes from physics and information
theory, and it looks like this: it's equal to minus the summation of those
Pi's, which looks similar to what we had before, but now each Pi is multiplied
by the log base two of Pi. Now what is the log base two? The log base two of two
raised to the power x equals x. So it tells you the exponent you'd need to raise
two to in order to get the number. So if I take the log base two of one-fourth,
that's gonna equal the log base two of two to the minus two, right? Because I
can write one-fourth as two to the minus two. So that's equal to minus two. So if we take our

example we had before and compute the entropy, we get minus: one-fourth log base
two of one-fourth, plus one-fourth log base two of one-fourth, plus one-fourth
log base two of one-fourth, and then we've got one more, plus one-fourth log
base two of one-fourth. And if you haven't done logs before, don't worry about
it. I just want you to get a sense of the idea behind this, what entropy
measures, so I'll explain conceptually what this is after we finish this
calculation. So we're gonna get minus: one-fourth times minus two, plus
one-fourth times minus two, plus one-fourth times minus two, plus one-fourth
times minus two. And we get this, remember, because the log base two of
one-fourth is just minus two. So that gives us minus: four times one-fourth
times minus two, so I get minus minus two, which
equals two. Okay, so what does entropy tell us? Well, entropy

tells us the number of bits of information we have to know in order to
identify the outcome. So, let's go back to

our example with the four outcomes, A, B, C, and D, all equally likely.
Well, how many pieces of information would I have to tell you in order to
identify what the outcome is? Well, the first thing I could tell you is whether
it's an A or B, versus a C or D, right? So I could divide this in half, and then
the second thing I could tell you is whether it's A or it's B. So I can always
identify which one it is by asking two questions. Question one could be: is it
A or B, or C or D? So you have to name which of these two sets it's in. And then
question two would depend on question one. If you said it was an A or B, I would
just ask, okay, is it A? And if you said yes, then it's A. And if you said no, then

it's B. So entropy tells you just how many pieces of information you would have to

know to identify the outcome. And so if there's four outcomes, you'd only need

two. So that's the difference: the diversity index tells you sort of the number
of types, and entropy tells you the amount of information you would need to
identify the type. Right? So in our example, the diversity index was four,
because there were four equally likely types, right? A, B, C, and D were all
equally likely. And the entropy was two, because with two questions you could
identify which the type was, right? You could ask, is it A or B, or C or D? And
if they said C or D, you could then ask, is it D? If they said yes, it's D; if
they said no, it's C. So entropy is the amount of information, and the diversity
index is the number of types. Either one of
these things will work. So let's see this in action with our first example.
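Both measures fit in a few lines of code. Here's a minimal sketch (function names are my own), applied to the four equally likely outcomes from above:

```python
import math

def diversity_index(probs):
    """One over the sum of squared probabilities: the effective number of types."""
    return 1.0 / sum(p * p for p in probs)

def entropy(probs):
    """Bits of information needed to identify the outcome."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

probs = [1/4, 1/4, 1/4, 1/4]   # A, B, C, D equally likely
print(diversity_index(probs))  # 4.0: four types
print(entropy(probs))          # 2.0: two yes/no questions
```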

Remember, initially you're sitting up here, and there's a 50 percent

chance it goes here, and a 50 percent chance it goes here. So the diversity

index, in this case would be two, right? Because there's two outcomes, and each one

is equally likely. And so if we did our formula, we'd just get one-half squared
plus one-half squared, which is one-fourth plus one-fourth, which is equal to
one-half. And then we take one over one-half, and that gives us a diversity
index of two. Now, if you compute the entropy of this, we won't do it, you're

going to get the answer one. So, why? Because all you need is one piece of

information. You would say is it to the left or is it to the right? And that would

tell you where the ball is. So the diversity index is two, and the entropy is
one. Well, what happens after this tips? After this tips, it goes over, let's
say, to the left with a probability of 100 percent. Well, what's going to happen
to the diversity index? Well, when the probabilities go to 100 percent and zero,
the diversity index is now one, right? Because you know where it is. And the
entropy is zero, because you don't need to be told anything. You don't need any
information, because it's already on the left. So there's no uncertainty left in
the system at all. You can see this thing: the diversity index went from two to
one, and the entropy went
from one to zero. That's how we can measure how much a system tipped, right?
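Here's that before-and-after comparison as a quick code sketch (names are my own; the zero-probability outcome is skipped in the entropy sum, since 0 times log 0 is taken to be zero):

```python
import math

def diversity_index(probs):
    return 1.0 / sum(p * p for p in probs)

def entropy(probs):
    # Skip zero-probability outcomes: 0 * log2(0) is treated as 0.
    return sum(-p * math.log2(p) for p in probs if p > 0)

before = [0.5, 0.5]   # ball on the peak: left or right, 50/50
after = [1.0, 0.0]    # tipped: on the left for sure

print(diversity_index(before), entropy(before))  # 2.0 1.0
print(diversity_index(after), entropy(after))    # 1.0 0.0
```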

Because it's telling us it went from, we don't know what's going to happen, it
can either go left or right, to, now we know where it is. So the number of types
decreased from two to one. Or, if we think in terms of information, we can say,
previously I needed one piece of information; now I need zero. So if you think
of tipping points, when a system suddenly tips to a new thing, we go from a
situation where we didn't know what was going to happen to a situation where we
knew what was gonna happen. So that would be true if we had a between-class tip from a

complex situation to an equilibrium situation. Alternatively, right, we can
have a system tip from a nice equilibrium, where the diversity index was low and
the entropy was zero, to one that became incredibly complex, where uncertainty
was very high. Then the diversity index would increase, and the entropy would
also increase. So what we've got is these measures of tippiness, which will be
given by an increase or a decrease in both the diversity index and in entropy.
But the tip in a system is basically telling us that what's gonna happen is
likely to change. The way to think of tips is as changes in the likelihood of

outcomes. Remember, we talked about these direct tips or these contextual tips, or

these between-class and within-class tips. What's going to be happening with a
tipping point is that what we thought was going to happen is no longer going to
happen. Or, we didn't know what was going to happen, and now we know what's
going to happen, so the system has tipped. To measure that, we need some measure
of uncertainty, and we've introduced two, right? We introduced the diversity
index, and we introduced entropy. So that's tipping points. It's been sort of a big

thing, but we've learned some stuff, right? We've learned that just because we
see a kink in the graph doesn't mean it's a tipping point. That could just be a
growth process, or it could be a diffusion process. We've also learned there's a
difference between direct tips and contextual tips, and between between-class
and within-class tips. And we've learned a little bit about how to measure those

tips. Okay, thank you.