I can do linear search, which will

take O of n, or I can sort first,

and then do binary search.

But this is going to take, as we have seen already,

n log n plus O of log n, and this is of course O of n log n.

So it doesn't make any sense to sort, and do binary search.

Just do linear search, and you are going to get O of n.

So would we ever sort, and do binary search over this?

The answer is no.
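As a concrete sketch of the two options (a minimal Python illustration, not from the lecture; it uses the standard-library `bisect` module for the binary search):

```python
from bisect import bisect_left

def linear_search(items, target):
    # O(n): scan every element until we find the target.
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def sorted_binary_search(sorted_items, target):
    # O(log n): assumes the list has already been sorted.
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = [42, 7, 19, 3, 88]
print(linear_search(data, 19))                  # search the list as-is: O(n)
print(sorted_binary_search(sorted(data), 19))   # O(n log n) sort, then O(log n) search
```

For a single query, the sort dominates the cost, which is exactly the point being made here.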

If all you need to do is just one search operation,

if all you care about is a one-time search, you should do linear search.

If all you care about is two, three, four,

seven, a small number of searches, do linear search.

But imagine what would happen if you have a list with millions of items, and

your search operation is done millions of times.

Then it is worth putting the effort into sorting first, and

then doing all the other search queries in log n, okay?

So, this is a very important algorithmic principle here.

That sometimes, it is worth pre-processing the data,

before we start doing the actual operations we're interested in.

The pre-processing here was sorting, so that I can do the search efficiently.

This is worth it when n log n is not going to be the main factor,

when the cost of the search operations actually overwhelms the cost of sorting.

So I will always have n log n for the sorting, plus k

times log n, where k is the number of search queries that I will be executing.

If k is much larger than n, this is worth it.

If k is too small this is not worth it.

Okay.
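The trade-off can be sketched with a back-of-the-envelope cost model (a rough sketch that ignores constant factors; the function names are my own, not from the lecture):

```python
import math

def linear_cost(n, k):
    # k searches, each a full O(n) scan of the list.
    return k * n

def sort_then_search_cost(n, k):
    # One O(n log n) sort up front, then k binary searches at O(log n) each.
    return n * math.log2(n) + k * math.log2(n)

n = 1_000_000
for k in (1, 10, 1_000_000):
    winner = "linear" if linear_cost(n, k) < sort_then_search_cost(n, k) else "sort first"
    print(f"k = {k:>9}: {winner}")
```

With n equal to a million, a handful of queries favors plain linear search, while a million queries makes the one-time sort pay for itself many times over.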

So, now that we have seen sort, you can start using sort as a pre-processing technique:

pre-process the data, and then do the other operations much, much faster.

This belongs, in some sense, to the category of transform and conquer:

I wanted to do a search on a list,

I transformed the instance of the problem by sorting it,

and then I applied my actual solution to the problem, okay?

So keep in mind that while sorting is about getting a sorted list of items,

it is also a very powerful technique for pre-processing the data, so

that subsequent operations, whether they are searching or comparison and

so on, can be done very efficiently.

Okay?

There are many operations that, if you do them on an unsorted list, are going to be

very inefficient.

And these operations, once you have pre-processed the data so

that it is sorted, all of a sudden become efficient.

So it's very important to keep in mind that if you are doing just one search

operation, or five, or 100, or 1,000, it is not worth sorting first.

Do linear search.

If the number of queries keeps growing all the time, and

it is a very large number,

it is worth pre-processing the data so

that every time you do the query, it takes a short amount of time, okay?

So in this set of lectures we have seen the divide-and-conquer technique, for

merge sort, for binary search: how we divide a big problem into sub-problems,

then solve each one of them, and then merge the results.

Remember that there are three steps.
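Those three steps can be sketched in a minimal (and deliberately unoptimized) Python merge sort:

```python
def merge_sort(items):
    # Base case: a list of length 0 or 1 is already sorted.
    if len(items) <= 1:
        return items
    # Step 1: divide the problem into two sub-problems.
    mid = len(items) // 2
    # Step 2: solve each sub-problem recursively.
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Step 3: merge the two sorted halves.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```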

We have seen that

the natural implementation of these algorithms is recursive, where you

call the function recursively on the instances of the sub-problems.

We saw that these recursive algorithms naturally give rise

to formulas that we call recurrences for the running time of the algorithms.

We have seen how recurrences of this specific form can be solved using

the master theorem, to give us an explicit function that we understand.
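For example, the merge-sort recurrence falls under the form the master theorem handles (a sketch of the balanced case; the notation follows the usual statement of the theorem, not the lecture slides):

```latex
% Master theorem form: T(n) = a\,T(n/b) + O(n^d), with a \ge 1, b > 1, d \ge 0.
% Merge sort: a = 2 sub-problems, b = 2 (each half the size), d = 1 (linear merge).
T(n) = 2\,T(n/2) + O(n)
% Here d = \log_b a \; (1 = \log_2 2), so the balanced case gives
T(n) = O\!\left(n^d \log n\right) = O(n \log n).
```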

I want to emphasize once more:

the master theorem is not a framework for solving all recurrences.

The recurrences have to be of a specific form, satisfying the conditions, for

it to apply.

And the last thing we saw is that, using this technique of sorting,

I can pre-process lists of numbers,

of strings, and so on, so that later, when I apply other operations like searching or

comparing the two closest elements and so on,

I can get that operation done much faster.
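The "closest elements" operation just mentioned can be sketched like this (a minimal Python illustration; the function names are my own): brute force on an unsorted list compares all pairs in O(n squared), while after sorting, the closest pair must be adjacent, so one linear scan suffices.

```python
def closest_pair_unsorted(nums):
    # Brute force on an unsorted list: compare every pair, O(n^2).
    best = None
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            d = abs(nums[i] - nums[j])
            if best is None or d < best:
                best = d
    return best

def closest_pair_sorted(nums):
    # Pre-process by sorting (O(n log n)); in a sorted list the closest
    # pair must be adjacent, so a single O(n) scan finds it.
    s = sorted(nums)
    return min(b - a for a, b in zip(s, s[1:]))

data = [30, 4, 17, 9, 26]
print(closest_pair_unsorted(data))  # → 4
print(closest_pair_sorted(data))    # → 4  (the pair 26, 30)
```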