## Transcript

The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

CHARLES LEISERSON: Today's material is going to rely on the fact that everybody has taken an algorithms class, so I want to remind you of some of the stuff you learned in your algorithms class. And if you don't remember this, it's probably good to bone up on it a little bit, because it's going to be essential. And that is the topic of divide-and-conquer recurrences.

Everybody remember divide-and-conquer recurrences? There's a general method for solving most of the ones we want, called the Master Method. It deals with recurrences of the form T(n) = a T(n/b) + f(n). This is generally interpreted as: I have a problem of size n, and I can solve it by solving a subproblems of size n/b, where it costs me f(n) work to do that division and to accumulate the results into my final result. For all these recurrences, the unstated base case, since this is a running time, is that T(n) is constant when n is small.
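Written out symbolically (added here for reference; the lecture states it verbally), the recurrence and its implicit base case are:

```latex
T(n) = a\,T\!\left(\tfrac{n}{b}\right) + f(n),
\qquad T(n) = \Theta(1) \ \text{for small } n.
```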

So does that make sense? Everybody familiar with this? Right? Well, we're going to review it anyway, because I don't like to just assume and then leave 20% of you, more or less, behind in the woods. So let's just remind ourselves of what this means. It's easy to understand this in terms of a recursion tree.

I start out with the running time, and the idea of a recursion tree is to re-express it using the recurrence. So if I re-express this, written a little bit differently, I can put an f(n) at the root and have a copies of T(n/b). That's exactly the same running time as I had in T(n); I've simply expressed it with the right-hand side. Then I do it again at every level: I expand all the leaves. I only expanded one here because I ran out of space. And you keep doing that until you get down to T(1).

And the trick with these recurrences is to add across the rows. The first row adds up to f(n). The second row adds up to a f(n/b). The third is a² f(n/b²), and so forth. Now, the height: since I'm taking n and dividing it by b each time, how many times can I divide by b until I get to something of constant size? That's just log base b of n.
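In symbols, the row sums and the height of the recursion tree are (a restatement of what was just said, not from the original slides):

```latex
\text{row } k:\quad a^k\, f\!\left(\tfrac{n}{b^k}\right),
\qquad k = 0, 1, \ldots, h,
\qquad h = \log_b n.
```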

So, so far this is all review. Any questions here, for anybody? OK. So I get the height, and then, if there's T(1) work at every leaf, I look at how many leaves there are. For this analysis we're going to assume everything works out: n is a perfect power of b, and so forth. So if I go down k levels, how many subproblems are there at level k? a to the k. And how many levels am I going down? h, which is log base b of n. So I end up with a^(log_b n) times what's at the leaf, which is T(1), and T(1) is constant. And a^(log_b n) is the same as n^(log_b a).

OK, that's just a little bit of exponent algebra. One way to see it is to take the log of both sides of both expressions and realize that all that's used is the commutative law. If you take the log base b of a^(log_b n), you get (log_b n)(log_b a); and if you take the log base b of n^(log_b a), you get the exponent (log_b a)(log_b n). Same thing, just in a different order. This little bit of math matters because we're interested in the growth in n, and we prefer the form with n in the base of the power rather than buried inside a log in the exponent.
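The identity can be checked in one line by taking log base b of both forms, as just described:

```latex
\log_b\!\bigl(a^{\log_b n}\bigr)
= (\log_b n)(\log_b a)
= (\log_b a)(\log_b n)
= \log_b\!\bigl(n^{\log_b a}\bigr)
\quad\Longrightarrow\quad
a^{\log_b n} = n^{\log_b a}.
```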

So that's basically the number of leaves. The question, then, is how much work there is if I add up all of those rows, all the way down. It turns out there's a trick, and the trick is to compare n^(log_b a) with f(n). There are three cases that arise very commonly, and for the most part, those three cases are what we're going to see.

So case 1 is the case where n^(log_b a) is much bigger than f(n). And by much bigger, I mean bigger by a polynomial amount: there's an epsilon greater than 0 such that the ratio between the two is at least n^epsilon. In other words, f(n) is O(n^(log_b a - epsilon)), with the epsilon subtracted in the exponent, which is the same as n^(log_b a) divided by n^epsilon. In that case, the row sums are geometrically increasing, so a constant fraction of the weight is in the leaves, and the answer is that T(n) is Theta(n^(log_b a)). So if n^(log_b a) is bigger than f(n), the answer is n^(log_b a), as long as it's bigger by a polynomial amount.
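In cheat-sheet notation, case 1 reads:

```latex
\textbf{Case 1:}\quad
f(n) = O\!\bigl(n^{\log_b a - \varepsilon}\bigr) \ \text{for some } \varepsilon > 0
\quad\Longrightarrow\quad
T(n) = \Theta\!\bigl(n^{\log_b a}\bigr).
```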

Now, case 2 is the situation where n^(log_b a) is approximately equal to f(n); they're very similar in growth. Specifically, we're going to look at the case where f(n) is n^(log_b a) times a polylogarithmic factor, log^k n, for some constant k greater than or equal to 0. That "greater than or equal to 0" is very important: you can't do this for negative k. Even though negative k is defined and meaningful, this is not the answer when k is negative. But if k is greater than or equal to 0, then what's happening is that the total is growing arithmetically from beginning to end, and when you solve it, you essentially add an extra log term. So if f(n) is n^(log_b a) log^k n, the answer is n^(log_b a) log^(k+1) n: you kick in one extra log. Intuitively, the rows are almost all equal and there are about log n of them, so you tack on an extra log. That's not quite the math, but it's good intuition.
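Case 2, stated symbolically:

```latex
\textbf{Case 2:}\quad
f(n) = \Theta\!\bigl(n^{\log_b a} \log^{k} n\bigr), \ k \ge 0
\quad\Longrightarrow\quad
T(n) = \Theta\!\bigl(n^{\log_b a} \log^{k+1} n\bigr).
```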

And then finally, case 3 is the case where n^(log_b a) is much less than f(n), and specifically where it is smaller by, once again, a polynomial factor: an n^epsilon factor for some epsilon greater than 0. It's also the case here that f has to satisfy what's called a regularity condition. That condition is satisfied by all the functions we're going to look at: polynomials, polynomials times logarithms, and things of that nature. It's not satisfied by weird functions like sines and cosines, and, more relevantly, it's not satisfied if you have things like exponentials. But for all the things we're going to look at, that's the case. Here the row sums are geometrically decreasing, so almost all the work is at the root, and the root basically costs f(n), so the solution is Theta(f(n)).
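Case 3, with the regularity condition spelled out (the standard form, which the lecture mentions but does not write down):

```latex
\textbf{Case 3:}\quad
f(n) = \Omega\!\bigl(n^{\log_b a + \varepsilon}\bigr) \ \text{for some } \varepsilon > 0,
\quad a\,f\!\left(\tfrac{n}{b}\right) \le c\,f(n) \ \text{for some } c < 1
\quad\Longrightarrow\quad
T(n) = \Theta\!\bigl(f(n)\bigr).
```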

We're going to hand out a cheat sheet, so if we could conscript some of the TAs to get that distributed as quickly as possible. OK, so let's do a little puzzle here. Here's the cheat sheet; that's basically what's on it. And we'll do a little in-class self-quiz.

So we have T(n) = 4T(n/2) + n. And the solution is? This is a thing that, as a computer scientist, you just memorize, so that in any situation you don't even have to look at the cheat sheet; you just know it. It's one of those basic things that all computer scientists should know. It's kind of like, what's 2 to the 15th? What is it?

AUDIENCE: [INAUDIBLE]

CHARLES LEISERSON: Yes. And interestingly, that's my office number: I'm in 32-G768. I'm the only one in the Stata Center with a power-of-2 office number, and that was totally unplanned. So if you need to remember my office number, it's 2 to the 15th.

OK, so what's the solution here?

AUDIENCE: Case 1.

CHARLES LEISERSON: It's case 1. And what's the solution?

AUDIENCE: n squared?

CHARLES LEISERSON: n squared. Very good. n^(log_b a) is n^(log_2 4); log base 2 of 4 is 2, so that's n squared. That's much bigger than n. So it's case 1, and the answer is Theta(n²). Pretty easy. Yeah.

AUDIENCE: [INAUDIBLE]

CHARLES LEISERSON: Yeah, it's n² log n. Once again, the first part is the same: n^(log_b a) is n². And n² is n² log⁰ n, so it's case 2 with k = 0, and you just tack on an extra log factor. So it's n² log n.

And then, of course, we've got to do this one. Yeah.

AUDIENCE: [INAUDIBLE]

CHARLES LEISERSON: Yeah, n cubed, because once again, n^(log_b a) is n². That's much less than n³; n³ is bigger, so that dominates. So we have Theta(n³).
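The three quiz answers can be sanity-checked by unrolling each recurrence exactly. The sketch below is not from the lecture; it assumes the three puzzles are T(n) = 4T(n/2) + f(n) with f(n) = n, n², and n³, as the answers imply, and takes T(1) = 1. It compares the unrolled values against exact closed forms consistent with the master-method answers Theta(n²), Theta(n² log n), and Theta(n³):

```python
# Unroll T(n) = 4*T(n//2) + f(n) exactly for n a power of 2, T(1) = 1,
# and compare against closed forms implied by the master method.
from functools import lru_cache
from math import log2

def solve(f):
    """Return a memoized T with T(n) = 4*T(n//2) + f(n), T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n == 1 else 4 * T(n // 2) + f(n)
    return T

T1 = solve(lambda n: n)       # case 1: Theta(n^2)
T2 = solve(lambda n: n * n)   # case 2: Theta(n^2 log n)
T3 = solve(lambda n: n ** 3)  # case 3: Theta(n^3)

n = 2 ** 10
assert T1(n) == 2 * n * n - n          # exact solution: 2n^2 - n
assert T2(n) == n * n * (log2(n) + 1)  # exact solution: n^2 (log2 n + 1)
assert T3(n) == 2 * n ** 3 - n * n     # exact solution: 2n^3 - n^2
```

The leading terms of the closed forms (n², n² log n, n³) match the three master-method cases, which is the point of the quiz.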

Yeah.

AUDIENCE: Theta of n squared.

CHARLES LEISERSON: No. Which case do you think it is?

AUDIENCE: [INAUDIBLE]

CHARLES LEISERSON: Case 2?

AUDIENCE: Yeah.

CHARLES LEISERSON: OK. No. Yeah.

AUDIENCE: None of the cases?

CHARLES LEISERSON: It's none of the cases; it's a trick question. Oh, I'm a nasty guy. This is one where the master method does not apply. This would be case 2, but k has to be greater than or equal to 0, and here k is minus 1, so case 2 doesn't apply. And case 1 doesn't apply either: we're comparing n² to n²/log n, and the ratio there is 1/log n, and that--