The primary topics in this part of the specialization are: greedy algorithms (scheduling, minimum spanning trees, clustering, Huffman codes) and dynamic programming (knapsack, sequence alignment, optimal search trees).


From the course by Stanford University

Greedy Algorithms, Minimum Spanning Trees, and Dynamic Programming


From the lesson

Week 4

Advanced dynamic programming: the knapsack problem, sequence alignment, and optimal binary search trees.

- Tim Roughgarden, Professor, Computer Science

We've now got two dynamic programming algorithms under our belt.

We know how to compute weighted independence sets in path graphs.

We also know a dynamic programming solution to the famous knapsack problem.

But before I proceed to still more useful and famous dynamic programming algorithms, let's pause for a sanity check.

Let's go through a worked example for that dynamic programming algorithm for knapsack, just to make sure that everything is crystal clear.

So, for reference, let me just rewrite the key points of the knapsack algorithm.

So first, we have a 2D array A.

And for initialization, whenever i equals 0, that is, when you can't use any items at all, of course the optimal solution value is 0. And then I'm just going to rewrite the same recurrence that we had in the previous couple of videos.

So you'll recall that in the main loop, when you're considering a given item i and a residual knapsack capacity x, you take the better of two solutions. Either you inherit the solution with i minus 1 and the same residual capacity x, which corresponds to not taking item i; or, if you do take item i, you get a credit of v sub i, the value of that item, but the residual capacity drops from x to x minus the weight of item i, and you look up the optimal solution for that smaller subproblem.
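As a sketch, the recurrence just described can be written out in code. This is my own minimal Python rendering, not the lecture's pseudocode; the names `knapsack_table`, `values`, and `weights` are mine:

```python
def knapsack_table(values, weights, W):
    """Fill the 2D array A of the recurrence.

    A[i][x] = best total value using only the first i items,
    with residual knapsack capacity x.
    """
    n = len(values)
    # Initialization: A[0][x] = 0 for all x (no items available).
    A = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        v_i, w_i = values[i - 1], weights[i - 1]  # item i, 1-indexed as in the lecture
        for x in range(W + 1):
            if w_i > x:
                # Not enough residual capacity: forced to inherit.
                A[i][x] = A[i - 1][x]
            else:
                # Better of skipping item i, or taking it (value v_i,
                # residual capacity drops to x - w_i).
                A[i][x] = max(A[i - 1][x], v_i + A[i - 1][x - w_i])
    return A
```

The optimal value then sits in the entry A[n][W], the biggest subproblem.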

So, for our concrete example, let's just look at a case with 4 items, an initial knapsack capacity capital W equal to 6, and the values and weights of the 4 items as follows.
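The slide with those numbers isn't reproduced in the transcript, but the instance can be reconstructed from the walkthrough below, where each value and weight is stated as the table gets filled in. A sketch of the data, with names of my own choosing:

```python
# Instance reconstructed from the walkthrough (the slide itself
# is not part of the transcript).
W = 6                    # knapsack capacity, capital W
values = [3, 2, 4, 4]    # v_1, v_2, v_3, v_4
weights = [4, 3, 2, 3]   # w_1, w_2, w_3, w_4
```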

So, I'm going to describe the most straightforward

implementation of the dynamic programming algorithm for knapsack.

We're literally just going to explicitly form the 2D

array A.

One index for i will range between 0 and n, the other

index for x will range between 0 and capital W, the original capacity.

There are certainly a lot of optimizations you can apply when filling in these tables.

In fact, the future programming assignment will ask you to consider some.

But just to make sure the basic algorithm is totally

clear, let's just stick with that naive implementation for now.

So we begin with the initialization. Setting A[0][x] to be equal to 0 for all x just means we take the leftmost column, corresponding to i equals 0, and we fill that in entirely with 0's.

Now in a real implementation there's no reason

to explicitly store these 0's, you know they are

there, but again to keep the picture clear let's

just fill in the 0's in the left column.

Now we proceed to the main double for loop, and recall the outer for loop just increments the index i for the item that we're considering, so the outer for loop is considering each of the columns in turn from left to right.

Now for a fixed value of i for a fixed column, in the

inner for loop we consider all values of x from 0 to capital W.

So that just corresponds to filling in the entries of a given column from bottom to top.

So we begin this double for loop with the second

column i equal 1 and at the bottom x equals 0.

Now in general when we fill in one of these

table entries, we're going to take the better of two solutions.

Either we just inherit the solution from the column to the left in the same row.

This corresponds to skipping item i.

Or we add the value of the current item to the optimal solution that we extract from the column one to the left, but also w sub i rows down. So, that corresponds to reducing the residual capacity by w sub i.

Now, for the early subproblems in a column, where

you have essentially 0 residual capacity, it's kind of degenerate.

Right so, if your residual capacity x is

actually less than the weight of the current

item you're considering, w sub i, you have

no choice but to inherit the previous solution.

You're actually not allowed to pack the current item i.

So concretely in this example,

in the first column, notice the weight of the first item is 4.

So that means if x equals 0 or 1 or 2 or 3.

We're actually not permitted to choose item

one, we're going to run out of residual capacity.

So in the bottommost four rows, we're forced to inherit the solution from the column to the left. So the first four 0's from the left column get copied over to the column with i equals 1.

Now, in this first outer iteration of the for-loop, once x reaches 4, there's actually an interesting decision to make.

We have the option of inheriting the 0,

immediately to the left.

Or, we can take item 1, thereby getting a value of 3, and then we inherit the solution from A[0][0]. Now A[0][0] holds a 0, but we're obviously happy to get this value of three. So that's going to determine the max: we're going to accept v sub 1 plus A[0][0], and that gives us a three in this next table entry.

By the same reasoning, we're going to fill in

the uppermost rows of column one with these threes.

We certainly would prefer just taking item one, which we're now allowed to do since we have enough residual capacity, as opposed to just inheriting the 0 from the column one to the left.

So that's how we fill in the column of i equal one.

Moving on to i equal two.

Again, so here notice that item two has a weight of three.

So again, in the bottommost three rows, when x equals 0, or one, or two, there's nothing we can do. We don't have the option of picking item number two; we don't have enough residual capacity.

So we just copy the values from column i equal 1 over.

Those happen to be 0's, so that gives us three 0's to start column two.

Now when i equals 2 and x equals 3, we have enough residual capacity to take item two.

And we're certainly much happier to take the value of 2, the value of item number two, as opposed to inheriting the 0 one column to the left.

So that gives us a two in the column i

equals 2, with a residual capacity x equal to 3.

So perhaps the first truly interesting table entry to fill in is when i equals 2 and x equals 4, because in this case, we actually have two different non-trivial solutions to pick from.

So one solution, as usual, is to just copy over the number one column to the left. But in this case there's no longer a 0 to the left; there's a 3 to the left, right? A[1][4] is equal to 3.

Our other option is to take the second item, getting us a value of 2, and add that to whatever is in the column one to the left, but shifted down by 3, because 3 is the weight of the second item.

And you'll note that A[1][1] is 0. So, picking the current item would just give us 2 plus 0 equals 2.

We'd prefer to just inherit the 3, from the column one to the left.

That is, we prefer to skip item two, so

that we get the value from item one instead.

The upper two rows of the column with i equal 2 are filled in with 3's for exactly the same reason.

So for example

in the top row, what are our options?

Well we can inherit the three that's one to the left.

If we actually use item number two, that knocks the residual capacity down to three, and A[1][3] equals 0, so again, you'd rather have the three than the two.

Moving on to the penultimate column, i equals 3. For the usual reasons, we have to fill in the two bottommost rows with a 0.

Notice that the weight of the third item equals 2.

So we need a residual capacity x equal to 2 before we can actually use it.

Now, when x equals 2, we'd much rather have the value

of 4 for the current item than to copy the 0 over.

So we're going to fill in the entry A[3][2] with a 4, the value of the third item.

Similar reasons apply when x equals three or four.

So here the alternative starts looking better, right, so in the row where x

equals three the value that we'd inherit would be two.

In the row where x equals four, the value we'd inherit would be three.

But in both of these cases we'd prefer to have the

immediate gratification of the value of four from the third item.

And just inherit the empty solution for the remaining residual capacity.

So when x equals 3 and x equals 4, we're just going to take the third item and get a value of 4.

Now good things really start happening once x equals 5, because at that point we can grab the third item and get its value of 4. But now we knock down the residual capacity only to 5 minus 2, which equals 3, and when you have i equal 2 and x equal 3, you actually get a value of 2 in that case.

So, we're going to fill in the entry for i equals 3 and

x equal 5 with 4, the value of the current item, plus 2,

the optimal solution to the corresponding sub-problem.

It gets even better when x equals 6.

We're again going to take the third item, get its value of 4.

But now, in the smaller subproblem, when i equals 2 and x equals 4, we actually get a value of 3, so the value here is going to be 4 plus 3, or 7.

Moving to the last column when i equals 4, so here the fourth item has weight 3, so

that means when x equals 0, or 1, or 2, we don't even have the option of picking it.

We have no choice but to copy over the

number from the column to the left, so that's going to

give us a 0, and a 0, and a 4, for the first three rows of the final column.

So now, in the row with x equals 3, we do have the option of picking the fourth item.

That'll give us a value of 4.

You also have the option of just copying over

the value in the column to the left. That will also give us a 4.

So, we have a tie.

And in some sense it doesn't matter.

So we're going to fill in the entry of 4, when i equals 4, and x equals 3.

The exact same reasoning applies when x equals 4.

We're making the comparison between two things of equal value, both equal to 4.

Now, when x equals 5, we let the good times roll.

We can take the fourth item, and we get a value of 4 for the fourth item.

But now, when we subtract out its weight, we get a residual capacity of 2,

and when x equals 2 and i equals 3, we also get a value of 4.

So in this entry, we can write 4 plus 4, or 8.

And that's superior to the alternative, which is just

inheriting the six from the column to the left.

The same story holds when x equals six: we can take the fourth item, get the immediate gratification of four, and get four from the smaller subproblem when i equals 3 and x equals 3. That eight beats out the alternative, inheriting the seven from the column to the left.

So that completes the forward pass of the dynamic

programming algorithm, filling in the table systematically using our recurrence.

When it completes, of course, the optimal

solution value is in the upper right corner.

It's the value of the biggest sub-problem.

So we know, at this point, that the max value

of any feasible solution to this knapsack instance is 8.
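As a sanity check of my own (not part of the lecture), a straightforward fill of the table on the reconstructed instance reproduces the entries narrated above, with the 8 landing in the upper right corner:

```python
def fill_table(values, weights, W):
    # Straightforward table fill, exactly as narrated above.
    n = len(values)
    A = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        v_i, w_i = values[i - 1], weights[i - 1]
        for x in range(W + 1):
            A[i][x] = A[i - 1][x]               # inherit: skip item i
            if w_i <= x:                        # taking item i is feasible
                A[i][x] = max(A[i][x], v_i + A[i - 1][x - w_i])
    return A

A = fill_table([3, 2, 4, 4], [4, 3, 2, 3], 6)
print(A[3][6], A[4][6])  # 7 8: column i = 3 tops out at 7; the answer is 8
```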

And as we've discussed, after you've filled in the table in the forward pass, if you want to get your grubby little hands on the optimal solution itself, you can do that with a reverse pass.

There's a reconstruction post-processing step.

How does that work?

Well you start with the biggest subproblems, in this

case when i equals 4 and x equals 6.

And then you ask: by which branch of the recurrence did we fill in this table entry? And that gives us guidance over whether to pick this item or not.

So, how did we get this 8?

Did we inherit it from the column one

to the left, corresponding to not taking item 4?

Or did we get it from the column to the left, in the subproblem with residual capacity decreased by the weight of item four, corresponding to taking item four?

Well, if you look at it, we didn't just inherit the solution from the column to the left; we used the branch of the recurrence that does indeed correspond to taking item four.

That was the better of the two options we used to fill in this table entry.

So, that means that in the optimal solution, item four will be there.

So, having figured that out, we trace back through

the table, we say, okay, well, let's look then

at the table entry that we used to build

up to this optimal solution to the big problem.

So, that corresponds to going one column to the left, and a number of rows down equal to the weight of this fourth item, that is, three rows down.

Now we ask exactly the same question.

Did we inherit the solution from the previous column, corresponding to skipping this item, or did we use this item and build on a solution to a smaller subproblem from the previous column?

Well, again, you know, where did this 4 come from?

It didn't come from immediately to the left.

It came from the value of item 3

plus the optimal solution with a decreased residual capacity.

So that means that the optimal solution also includes item number three.

So, what do we do then?

We trace back, we go one column to the left, and we have to go now two rows down.

Two rows because the weight of the third item is two.

So, that leads us to A[2][1], and at this point, the

residual capacity is so small that we have no choice but to

inherit the numbers from the left, so we're not going to be able

to pack either item one or item two into the optimal solution.

So at this point, we just go left during our traceback,

and then when we get to i equals 0, we're done.

We know we've constructed the optimal solution, which in

this case is item number 3 and item number 4.

That's how

you get an optimal value of 8 in this particular knapsack instance.
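Putting the forward pass and the reverse pass together, the whole worked example can be replayed in code. This is my own sketch of the traceback described above, with hypothetical names:

```python
def knapsack_with_reconstruction(values, weights, W):
    n = len(values)
    A = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):                   # forward pass
        v_i, w_i = values[i - 1], weights[i - 1]
        for x in range(W + 1):
            A[i][x] = A[i - 1][x]
            if w_i <= x:
                A[i][x] = max(A[i][x], v_i + A[i - 1][x - w_i])
    # Reverse pass: at each (i, x), ask which branch of the
    # recurrence produced A[i][x].
    chosen, x = [], W
    for i in range(n, 0, -1):
        v_i, w_i = values[i - 1], weights[i - 1]
        if w_i <= x and A[i][x] == v_i + A[i - 1][x - w_i] and A[i][x] != A[i - 1][x]:
            chosen.append(i)   # item i was taken...
            x -= w_i           # ...so drop the residual capacity by w_i
        # otherwise item i was skipped: just move one column left
    return A[n][W], sorted(chosen)

print(knapsack_with_reconstruction([3, 2, 4, 4], [4, 3, 2, 3], 6))
# → (8, [3, 4]): items three and four, total value 8, as in the lecture
```

On ties (as when i equals 4 and x equals 3 above), this sketch prefers skipping the item; as noted in the lecture, it doesn't matter which branch you follow in that case.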
