The terminology would then be to say that the running time of merge sort is big O of n log n.

So in other words, when you say that an algorithm's running time is big O of some function f of n, what you mean is that after you've dropped the lower order terms and suppressed the leading constant factor, you're left with the function f of n. Intuitively, that is what the big O notation means.
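To make that intuition concrete, here is a small sketch. The specific bound 6n log2 n + 6n is a hypothetical operation count chosen for illustration (it is not from any particular analysis in this video); dropping the lower order term 6n and suppressing the leading constant 6 leaves n log n, and the ratio between the exact count and the big O term settles toward that suppressed constant as n grows.

```python
import math

def exact_count(n):
    # Hypothetical exact operation count: 6*n*log2(n) + 6*n.
    # The constants 6 are illustrative, not from any specific analysis.
    return 6 * n * math.log2(n) + 6 * n

def big_o_term(n):
    # After dropping the lower order term (6n) and suppressing the
    # leading constant (6), only n*log2(n) remains.
    return n * math.log2(n)

# As n grows, exact_count(n) / big_o_term(n) approaches the suppressed
# constant 6, which is why big O ignores it: it does not change the
# growth rate, only the scale.
for n in [10, 1_000, 1_000_000]:
    print(n, exact_count(n) / big_o_term(n))
```

Running this shows the ratio shrinking toward 6 as n increases, so for large inputs the n log n term alone governs how the running time grows.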

So to be clear, I'm certainly not asserting that constant factors never matter when you're designing and analyzing algorithms.

Rather, I'm just saying that when you think about high-level algorithmic approaches, when you want to compare fundamentally different ways of solving a problem, asymptotic analysis is often the right tool for giving you guidance about which one is going to perform better, especially on reasonably large inputs.

Now, once you've committed to a particular algorithmic solution to a problem, of course you might then want to work harder to improve the leading constant factor, perhaps even to improve the lower order terms.

By all means, if the future of your startup depends on how efficiently you implement some particular set of lines of code, have at it. Make it as fast as you can.

In the rest of this video, I want to go through four very simple examples. In fact, these examples are so simple that if you have any experience with big O notation, you're probably better off skipping the rest of this video and moving on to the mathematical formalism that we begin in the next video.

But if you've never seen it before, I hope these simple examples will get you oriented.