And answering this question, as we just saw, leads to a deeper question, and indeed a more surprising observation out of the Milgram experiment in the late sixties, and that is the algorithmic small world.

One model by Kleinberg, two years after the Watts-Strogatz model, can be described briefly as follows, and then we have a homework problem to go into further detail about that. So now we would like to find out the quantity little l, which is the length of the greedy search path in a social network, and we want this number to be small and also to grow slowly as a function of the number of nodes, just like big L.

In Kleinberg's model, he says that, true, we will have longer links, but the longer the link is, the less likely it is to be established. That is the starting point of Kleinberg's model to explain how these small worlds arise. Well, how much less likely? Let's say the distance is denoted by r, which could be the physical distance or some composite metric of social and physical distance. Then the probability that such a long link exists is proportional to r to the minus alpha. So alpha is some parameter that controls the rate of decay of the likelihood of the longer links.

And then it turns out that Kleinberg showed that if this rate of decay, alpha, is exactly equal to the dimension of the space (for example, in our world, three dimensions; in the world of, say, a grid, two dimensions; in the world of a line or a circle, one dimension), then if alpha is three, or two, or one in these three different cases, or in general as long as alpha is exactly equal to the dimension of the space in which the network resides, the algorithmic small world will hold, and this little l will be small and grow slowly as a function of n again. So why is that?
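Kleinberg's claim is easy to check by simulation in the one-dimensional case, a circle, where the critical exponent is alpha = 1. The following sketch is my own illustration, not part of the lecture; the function names and the parameters n, trials, and seed are assumptions chosen for the demo, and the long-range link is resampled at every step for simplicity.

```python
import random

def ring_dist(i, j, n):
    """Shortest-arc (lattice) distance between nodes i and j on a ring of n nodes."""
    d = abs(i - j)
    return min(d, n - d)

def sample_long_link(u, n, alpha, rng):
    """Pick one long-range contact for node u with probability
    proportional to ring_dist(u, v) ** (-alpha)."""
    weights = [0.0 if v == u else ring_dist(u, v, n) ** (-alpha)
               for v in range(n)]
    return rng.choices(range(n), weights=weights)[0]

def greedy_search_length(n, alpha, trials, seed=0):
    """Average greedy-search path length little l: at each step, move to
    whichever known contact (the two ring neighbours plus one freshly
    sampled long-range link) is closest to the target."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        source, target = rng.randrange(n), rng.randrange(n)
        cur, steps = source, 0
        while cur != target:
            candidates = [(cur - 1) % n, (cur + 1) % n,
                          sample_long_link(cur, n, alpha, rng)]
            # Greedy rule: step to the candidate nearest the target.
            cur = min(candidates, key=lambda v: ring_dist(v, target, n))
            steps += 1
        total += steps
    return total / trials

# At the critical exponent (alpha = dimension = 1) greedy search stays
# short; far above it, long links vanish and paths revert to the ring
# distance, roughly n/4 on average.
l_critical = greedy_search_length(512, alpha=1.0, trials=30)
l_offcritical = greedy_search_length(512, alpha=10.0, trials=30)
```

Running both calls shows the gap directly: with alpha well above the dimension, the sampled "long-range" contact is almost always a nearest neighbour, so the greedy path length collapses back to the plain ring distance.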

Again, the proofs are in the homework, and we'll go through a similar model in just a minute. But intuitively, what's going on is the following. Look at, say, two-dimensional space, or in general k-dimensional space, okay? Then look at a ball of radius r, then look at another ball of radius 2r, and ask how many more people live in this larger space. The probability of having a link to any one of these nodes drops as r to the minus k, and yet the number of nodes living in the space, assuming evenly distributed node locations, is proportional to r to the positive k. So, as you go farther out, there are more people that you might know, even though the chance that you know any one of them is smaller. And it turns out that if these two growth and decay rates cancel each other out, then you have a pretty much constant expected number of people you will know at each scale as the space grows. And that turns out to be the key idea: the number of people going up compensates for the smaller chance of knowing any individual as the social circle expands.
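This cancellation can be checked with one line of arithmetic per scale. The numbers below are my own toy illustration, taking k = 2 and alpha = 2 and counting the grid points in a ball of radius r as roughly r to the k:

```python
# In k-dimensional space with alpha = k, the shell between radius r and
# 2r holds about (2r)**k - r**k nodes, each reachable by a long link
# with probability about r**(-alpha); the two factors cancel per scale.
k, alpha = 2, 2.0
expected_links = [((2 * r) ** k - r ** k) * r ** (-alpha)
                  for r in (1, 2, 4, 8, 16, 32)]
print(expected_links)  # the same value (3.0) at every scale
```

Were alpha even slightly different from k, these per-scale products would grow or shrink geometrically in r instead of staying constant, which is exactly the sensitivity discussed next.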

And that's what's underlying this result. Now, you may say this result is a little too brittle, because we need alpha to be exactly equal to the number of space dimensions, k. It cannot be even a little bigger than that, say k + 0.001, and it cannot be k - 0.001 either.

And part of this brittle nature of the result is due to the way we define the algorithmic small world: we want l to grow as a log, or at most a polynomial of a log, function. Okay.

And part of the reason is the way we construct the likelihood of longer links. In the advanced material part of the lecture video we will talk about yet another alternative model that explains the algorithmic small world without this brittle nature. But before then, we will first talk about a variant of Kleinberg's model, called the Watts-Dodds-Newman model,