THE UNIVERSITY OF NEW SOUTH WALES 
Algorithms 
COMP3121/9101 
School of Computer Science and Engineering 
University of New South Wales Sydney 
3. RECURRENCES 
COMP3121/9101 1 / 22 
Asymptotic notation 
“Big Oh” notation: f(n) = O(g(n)) is an abbreviation for: 
“There exist positive constants c and n0 such that 
0 ≤ f(n) ≤ c g(n) for all n ≥ n0”. 
In this case we say that g(n) is an asymptotic upper bound for 
f(n). 
f(n) = O(g(n)) means that f(n) does not grow substantially faster 
than g(n) because a multiple of g(n) eventually dominates f(n). 
Clearly, the multiplicative constants c of interest will be larger than 1, 
thus “enlarging” g(n). 
Asymptotic notation 
“Omega” notation: f(n) = Ω(g(n)) is an abbreviation for: 
“There exist positive constants c and n0 such that 
0 ≤ c g(n) ≤ f(n) for all n ≥ n0.” 
In this case we say that g(n) is an asymptotic lower bound for 
f(n). 
f(n) = Ω(g(n)) essentially says that f(n) grows at least as fast as 
g(n), because f(n) eventually dominates a multiple of g(n). 
Since c g(n) ≤ f(n) if and only if g(n) ≤ (1/c) f(n), we have 
f(n) = Ω(g(n)) if and only if g(n) = O(f(n)). 
“Theta” notation: f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) 
and f(n) = Ω(g(n)); thus, f(n) and g(n) have the same 
asymptotic growth rate. 
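These definitions can be checked numerically on concrete functions. A minimal sketch (the witnesses c = 4 and n0 = 5 are one valid choice among many, and sampling a range is of course an illustration, not a proof):

```python
# Checking the Big-Oh definition for f(n) = 3n + 5 and g(n) = n:
# we claim f(n) = O(g(n)) with witnesses c = 4 and n0 = 5,
# i.e. 0 <= f(n) <= c*g(n) for all n >= n0 (here: 3n + 5 <= 4n iff n >= 5).

def f(n):
    return 3 * n + 5

def g(n):
    return n

c, n0 = 4, 5
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10**6, 997))

# With c' = 1 the same f also satisfies the Omega definition,
# hence f(n) = Theta(g(n)).
assert all(1 * g(n) <= f(n) for n in range(1, 10**6, 997))
```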
Recurrences 
Recurrences are important to us because they arise in estimations 
of time complexity of divide-and-conquer algorithms. 
Merge-Sort(A, p, r) *sorting A[p..r]* 
1 if p < r 
2 then q ← ⌊(p + r)/2⌋ 
3 Merge-Sort(A, p, q) 
4 Merge-Sort(A, q + 1, r) 
5 Merge(A, p, q, r) 
Since Merge(A, p, q, r) runs in linear time, the runtime T(n) of 
Merge-Sort(A, p, r), where n = r − p + 1 is the number of elements 
being sorted, satisfies 
T(n) = 2 T(n/2) + c n 
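The pseudocode above translates directly into code; the sketch below also counts the element moves performed by Merge, so the “c n” overhead in the recurrence becomes visible (with this way of counting, the total is exactly n log2 n moves when n is a power of 2):

```python
import math

work = 0  # counts element moves performed by Merge; each call does linear work

def merge_sort(a, p, r):           # sorts a[p..r] in place, as in the pseudocode
    if p < r:
        q = (p + r) // 2
        merge_sort(a, p, q)
        merge_sort(a, q + 1, r)
        merge(a, p, q, r)

def merge(a, p, q, r):             # merges the sorted runs a[p..q] and a[q+1..r]
    global work
    left, right = a[p:q + 1], a[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]; i += 1
        else:
            a[k] = right[j]; j += 1
        work += 1                  # one unit of the "c n" overhead per element

n = 1024
data = list(range(n))[::-1]        # worst-case-ish input: reverse sorted
merge_sort(data, 0, n - 1)
assert data == sorted(range(n))
assert work == n * math.ceil(math.log2(n))   # n log2 n moves for n a power of 2
```

The count satisfies exactly W(n) = 2 W(n/2) + n with W(1) = 0, i.e. the recurrence from the slide with c = 1 and no base-case cost.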
Recurrences 
Let a ≥ 1 be an integer and b > 1 a real number; 
Assume that a divide-and-conquer algorithm: 
reduces a problem of size n to a many problems of smaller size n/b; 
the overhead cost of splitting up/combining the solutions for size 
n/b into a solution for size n is f(n), 
then the time complexity of such algorithm satisfies 
T(n) = a T(n/b) + f(n) 
Note: we should be writing 
T(n) = a T(⌈n/b⌉) + f(n) 
but it can be shown that ignoring the integer parts and additive 
constants is OK when it comes to obtaining the asymptotics. 
T(n) = a T(n/b) + f(n) 

[Recursion tree for the recurrence: the root is one instance of size n; it spawns a many instances of size n/b, each of which spawns a many instances of size n/b², and so on, down to instances of size 1. The depth of the recursion is log_b n.] 
Some recurrences can be solved explicitly, but this tends to be 
tricky. 
Fortunately, to estimate the efficiency of an algorithm we do not need 
the exact solution of a recurrence. 
We only need to find: 
1 the growth rate of the solution, i.e., its asymptotic behaviour; 
2 the (approximate) sizes of the constants involved (more about 
that later). 
This is what the Master Theorem provides (when it is 
applicable). 
Master Theorem: 
Let: 
a ≥ 1 be an integer and b > 1 a real; 
f(n) > 0 be a non-decreasing function; 
T (n) be the solution of the recurrence T (n) = aT (n/b) + f(n); 
Then: 
1 If f(n) = O(n^(log_b a − ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)); 
2 If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) log₂ n); 
3 If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and for some c < 1 and some n0, 
a f(n/b) ≤ c f(n) 
holds for all n > n0, then T(n) = Θ(f(n)); 
4 If none of these conditions hold, the Master Theorem is NOT applicable. 
(But often the proof of the Master Theorem can be tweaked to obtain the 
asymptotics of the solution T(n) in a case when the Master Theorem does 
not apply; an example is T(n) = 2T(n/2) + n log n). 
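For the common special case f(n) = n^d, the three cases reduce to comparing d with log_b a, and the regularity condition of case 3 holds automatically, since a (n/b)^d = (a/b^d) n^d with a/b^d < 1. A small helper sketching this special case (the function name and its string output are our own conventions, not standard notation):

```python
import math

def master_theorem_poly(a, b, d):
    """Asymptotics of T(n) = a T(n/b) + n^d via the Master Theorem."""
    crit = math.log(a, b)                 # the critical exponent log_b a
    if math.isclose(d, crit):
        return f"Theta(n^{d:g} log n)"    # case 2
    if d < crit:
        return f"Theta(n^{crit:g})"       # case 1: n^d = O(n^(crit - eps))
    return f"Theta(n^{d:g})"              # case 3: regularity holds, a/b^d < 1

assert master_theorem_poly(4, 2, 1) == "Theta(n^2)"        # T(n) = 4T(n/2) + n
assert master_theorem_poly(2, 2, 1) == "Theta(n^1 log n)"  # merge sort
assert master_theorem_poly(3, 4, 1) == "Theta(n^1)"        # T(n) = 3T(n/4) + n
```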
Master Theorem - a remark 
Note that for any b > 1, 
log_b n = log_b 2 · log₂ n. 
Since b > 1 is constant (does not depend on n), we have, for 
c = log_b 2 > 0, 
log_b n = c log₂ n and log₂ n = (1/c) log_b n. 
Thus, 
log_b n = Θ(log₂ n) 
and also 
log₂ n = Θ(log_b n). 
So whenever we have f(n) = Θ(g(n) log n) we do not have to specify 
what base the log is - all bases produce equivalent asymptotic 
estimates (but we do have to specify b in expressions such as 
n^(log_b a)). 
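The change-of-base identity is easy to sanity-check numerically (math.log(x, b) is Python's logarithm to base b; the base 3.7 below is an arbitrary example):

```python
import math

n, b = 10**6, 3.7            # b is an arbitrary base > 1
c = math.log(2, b)           # the constant log_b 2 from the slide
assert math.isclose(math.log(n, b), c * math.log2(n))    # log_b n = c log2 n
assert math.isclose(math.log2(n), math.log(n, b) / c)    # log2 n = (1/c) log_b n
```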
Master Theorem - Examples 
Let T(n) = 4T(n/2) + n; 
then n^(log_b a) = n^(log₂ 4) = n²; 
thus f(n) = n = O(n^(2−ε)) for any ε < 1. 
The condition of case 1 is satisfied; thus, T(n) = Θ(n²). 
Let T(n) = 2T(n/2) + 5n; 
then n^(log_b a) = n^(log₂ 2) = n¹ = n; 
thus f(n) = 5n = Θ(n) = Θ(n^(log₂ 2)). 
Thus, the condition of case 2 is satisfied; and so, 
T(n) = Θ(n^(log₂ 2) log n) = Θ(n log n). 
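For n a power of 2 and base case T(1) = 1 (an arbitrary choice that does not affect the Θ-class), both recurrences have exact closed forms consistent with the Master Theorem answers: T(n) = 2n² − n in the first example and T(n) = n(1 + 5 log₂ n) in the second. A quick check:

```python
def T1(n):                      # T(n) = 4T(n/2) + n, with T(1) = 1
    return 1 if n == 1 else 4 * T1(n // 2) + n

def T2(n):                      # T(n) = 2T(n/2) + 5n, with T(1) = 1
    return 1 if n == 1 else 2 * T2(n // 2) + 5 * n

for k in range(11):             # n = 1, 2, 4, ..., 1024
    n = 2 ** k
    assert T1(n) == 2 * n * n - n        # Theta(n^2), Master Theorem case 1
    assert T2(n) == n * (1 + 5 * k)      # Theta(n log n), Master Theorem case 2
```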
Master Theorem - Examples 
Let T(n) = 3T(n/4) + n; 
then n^(log_b a) = n^(log₄ 3) < n^0.8; 
thus f(n) = n = Ω(n^(0.8+ε)) for any ε < 0.2. 
Also, a f(n/b) = 3 f(n/4) = (3/4) n ≤ c n = c f(n) for c = 0.9 < 1. 
Thus, Case 3 applies, and T(n) = Θ(f(n)) = Θ(n). 
Let T(n) = 2T(n/2) + n log₂ n; 
then n^(log_b a) = n^(log₂ 2) = n¹ = n. 
Thus, f(n) = n log₂ n = Ω(n). 
However, f(n) = n log₂ n ≠ Ω(n^(1+ε)), no matter how small ε > 0 is. 
This is because for every ε > 0 and every c > 0, no matter how 
small, log₂ n < c · n^ε for all sufficiently large n. 
Homework: Prove this. 
Hint: Use L'Hôpital's Rule to show that log n / n^ε → 0. 
Thus, in this case the Master Theorem does not apply! 
Master Theorem - Proof: 
Since 
T(n) = a T(n/b) + f(n)    (1) 
implies (by applying it to n/b in place of n) 
T(n/b) = a T(n/b²) + f(n/b)    (2) 
and (by applying (1) to n/b² in place of n) 
T(n/b²) = a T(n/b³) + f(n/b²)    (3) 
and so on . . ., we get 
T(n) = a T(n/b) + f(n) 
     = a ( a T(n/b²) + f(n/b) ) + f(n)               [substituting (2) for T(n/b)] 
     = a² T(n/b²) + a f(n/b) + f(n) 
     = a² ( a T(n/b³) + f(n/b²) ) + a f(n/b) + f(n)  [substituting (3) for T(n/b²)] 
     = a³ T(n/b³) + a² f(n/b²) + a f(n/b) + f(n) = . . . 
Master Theorem Proof: 
Continuing in this way ⌊log_b n⌋ − 1 many times we get 
T(n) = a³ T(n/b³) + a² f(n/b²) + a f(n/b) + f(n) 
     = . . . 
     = a^⌊log_b n⌋ T(n/b^⌊log_b n⌋) + a^(⌊log_b n⌋−1) f(n/b^(⌊log_b n⌋−1)) + . . . 
       + a³ f(n/b³) + a² f(n/b²) + a f(n/b) + f(n) 
     ≈ a^(log_b n) T(n/b^(log_b n)) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) 
We now use a^(log_b n) = n^(log_b a): 
T(n) ≈ n^(log_b a) T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i)    (4) 
Note that so far we did not use any assumptions on f(n) . . . 
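When n is an exact power of b, the “≈” in (4) is in fact an equality (with T(1) as the base case), which can be confirmed by computing both sides; a sketch, with arbitrary example choices of a, b and f:

```python
def T(n, a, b, f):                       # T(n) = a T(n/b) + f(n), with T(1) = 1
    return 1 if n == 1 else a * T(n // b, a, b, f) + f(n)

def unrolled(n, a, b, f):                # right-hand side of (4) with T(1) = 1
    k = 0
    while b ** (k + 1) <= n:             # k = log_b n when n is an exact power of b
        k += 1
    return a ** k * 1 + sum(a ** i * f(n // b ** i) for i in range(k))

for n in (1, 2, 8, 64, 1024):            # a = 2, b = 2, f(n) = n (merge sort)
    assert T(n, 2, 2, lambda m: m) == unrolled(n, 2, 2, lambda m: m)
for n in (1, 3, 27, 729):                # a = 5, b = 3, f(n) = n^2
    assert T(n, 5, 3, lambda m: m * m) == unrolled(n, 5, 3, lambda m: m * m)
```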
T(n) ≈ n^(log_b a) T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) 
Case 1: f(m) = O(m^(log_b a − ε)) 
Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = Σ_{i=0}^{⌊log_b n⌋−1} a^i O((n/b^i)^(log_b a − ε)) 
= O( Σ_{i=0}^{⌊log_b n⌋−1} a^i (n/b^i)^(log_b a − ε) ) 
= O( n^(log_b a − ε) Σ_{i=0}^{⌊log_b n⌋−1} a^i / (b^i)^(log_b a − ε) ) 
= O( n^(log_b a − ε) Σ_{i=0}^{⌊log_b n⌋−1} ( a / b^(log_b a − ε) )^i ) 
= O( n^(log_b a − ε) Σ_{i=0}^{⌊log_b n⌋−1} ( a / (b^(log_b a) b^(−ε)) )^i ) 
= O( n^(log_b a − ε) Σ_{i=0}^{⌊log_b n⌋−1} (b^ε)^i )        [since b^(log_b a) = a] 
= O( n^(log_b a − ε) · ((b^ε)^⌊log_b n⌋ − 1) / (b^ε − 1) ); 
we are using Σ_{i=0}^{m−1} q^i = (q^m − 1)/(q − 1). 
Master Theorem Proof: 
Case 1 - continued: 
Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = O( n^(log_b a − ε) · ((b^ε)^⌊log_b n⌋ − 1) / (b^ε − 1) ) 
= O( n^(log_b a − ε) · ((b^⌊log_b n⌋)^ε − 1) / (b^ε − 1) ) 
= O( n^(log_b a − ε) · (n^ε − 1) / (b^ε − 1) ) 
= O( (n^(log_b a) − n^(log_b a − ε)) / (b^ε − 1) ) 
= O( n^(log_b a) ) 
Since we had: T(n) ≈ n^(log_b a) T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) 
we get: 
T(n) ≈ n^(log_b a) T(1) + O(n^(log_b a)) = Θ(n^(log_b a)) 
Master Theorem Proof: 
Case 2: f(m) = Θ(m^(log_b a)) 
Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = Σ_{i=0}^{⌊log_b n⌋−1} a^i Θ((n/b^i)^(log_b a)) 
= Θ( Σ_{i=0}^{⌊log_b n⌋−1} a^i (n/b^i)^(log_b a) ) 
= Θ( n^(log_b a) Σ_{i=0}^{⌊log_b n⌋−1} a^i / (b^i)^(log_b a) ) 
= Θ( n^(log_b a) Σ_{i=0}^{⌊log_b n⌋−1} ( a / b^(log_b a) )^i ) 
= Θ( n^(log_b a) Σ_{i=0}^{⌊log_b n⌋−1} 1 )        [since b^(log_b a) = a] 
= Θ( n^(log_b a) ⌊log_b n⌋ ) 
Master Theorem Proof: 
Case 2 (continued): 
Thus, 
Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) = Θ( n^(log_b a) log_b n ) = Θ( n^(log_b a) log₂ n ) 
because log_b n = log₂ n · log_b 2 = Θ(log₂ n). Since we had (4): 
T(n) ≈ n^(log_b a) T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) 
we get: 
T(n) ≈ n^(log_b a) T(1) + Θ( n^(log_b a) log₂ n ) = Θ( n^(log_b a) log₂ n ) 
Master Theorem Proof: 
Case 3: f(m) = Ω(m^(log_b a + ε)) and a f(n/b) ≤ c f(n) for some 0 < c < 1. 
We get by substitution: 
f(n/b) ≤ (c/a) f(n) 
f(n/b²) ≤ (c/a) f(n/b) 
f(n/b³) ≤ (c/a) f(n/b²) 
. . . 
f(n/b^i) ≤ (c/a) f(n/b^(i−1)) 
By chaining these inequalities we get 
f(n/b²) ≤ (c/a) f(n/b) ≤ (c/a) · (c/a) f(n) = (c²/a²) f(n) 
f(n/b³) ≤ (c/a) f(n/b²) ≤ (c/a) · (c²/a²) f(n) = (c³/a³) f(n) 
. . . 
f(n/b^i) ≤ (c/a) f(n/b^(i−1)) ≤ (c/a) · (c^(i−1)/a^(i−1)) f(n) = (c^i/a^i) f(n) 
Master Theorem Proof: 
Case 3 (continued): 
We got f(n/b^i) ≤ (c^i/a^i) f(n). 
Thus, 
Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) ≤ Σ_{i=0}^{⌊log_b n⌋−1} a^i (c^i/a^i) f(n) < f(n) Σ_{i=0}^{∞} c^i = f(n)/(1 − c) 
Since we had (4): 
T(n) ≈ n^(log_b a) T(1) + Σ_{i=0}^{⌊log_b n⌋−1} a^i f(n/b^i) 
and since f(n) = Ω(n^(log_b a + ε)) we get: 
T(n) < n^(log_b a) T(1) + O(f(n)) = O(f(n)) 
but we also have 
T(n) = a T(n/b) + f(n) > f(n) 
thus, 
T(n) = Θ(f(n)) 
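The geometric bound above can be sanity-checked on a concrete instance of our own choosing: for T(n) = 3T(n/2) + n² we have f(n) = n² = Ω(n^(log₂ 3 + ε)) and a f(n/b) = 3 f(n/2) = (3/4) n², so the regularity condition holds with c = 3/4, and the overhead sum must stay below f(n)/(1 − c) = 4n²:

```python
n = 2 ** 12                      # an exact power of b = 2
a, c = 3, 3 / 4                  # 3 f(n/2) = (3/4) n^2, so regularity holds, c = 3/4
f = lambda m: m * m

k = n.bit_length() - 1           # k = log2 n
overhead = sum(a ** i * f(n // 2 ** i) for i in range(k))
assert overhead < f(n) / (1 - c)          # the bound f(n)/(1 - c) = 4 n^2

def T(m):                        # T(n) = 3 T(n/2) + n^2, with T(1) = 1
    return 1 if m == 1 else 3 * T(m // 2) + f(m)

assert f(n) < T(n) < 5 * f(n)    # consistent with T(n) = Theta(f(n)) = Theta(n^2)
```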
Master Theorem Proof: Homework 
Exercise 1: Show that the condition 
f(n) = Ω(n^(log_b a + ε)) 
follows from the condition 
a f(n/b) ≤ c f(n) for some 0 < c < 1. 
Example: Let us estimate the asymptotic growth rate of T (n) which satisfies 
T (n) = 2T (n/2) + n logn 
Note: we have seen that the Master Theorem does NOT apply, but the technique 
used in its proof still works! So let us just unwind the recurrence and sum up the 
logarithmic overheads. 
T(n) = 2T(n/2) + n log n 
= 2( 2T(n/2²) + (n/2) log(n/2) ) + n log n 
= 2² T(n/2²) + n log(n/2) + n log n 
= 2²( 2T(n/2³) + (n/2²) log(n/2²) ) + n log(n/2) + n log n 
= 2³ T(n/2³) + n log(n/2²) + n log(n/2) + n log n 
. . . 
= 2^(log n) T(n/2^(log n)) + n log(n/2^(log n − 1)) + . . . + n log(n/2²) + n log(n/2) + n log n 
= n T(1) + n( log n · log n − log 2^(log n − 1) − . . . − log 2² − log 2 ) 
= n T(1) + n( (log n)² − (log n − 1) − . . . − 3 − 2 − 1 ) 
= n T(1) + n( (log n)² − (log n)(log n − 1)/2 ) 
= n T(1) + n( (log n)²/2 + (log n)/2 ) 
= Θ( n (log n)² ) 
(all logarithms above are base 2) 


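The unwinding gives the exact identity T(n) = n T(1) + n((log₂ n)² + log₂ n)/2 for n = 2^k, which a direct recursion (with the assumption T(1) = 1) confirms:

```python
import math

def T(n):                        # T(n) = 2 T(n/2) + n log2(n), with T(1) = 1
    return 1 if n == 1 else 2 * T(n // 2) + n * int(math.log2(n))

for k in range(1, 16):
    n = 2 ** k
    assert T(n) == n + n * (k * k + k) // 2   # n T(1) + n((log n)^2 + log n)/2
```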
PUZZLE! 
Five pirates have to split 100 bars of gold. They all line up and proceed as follows: 
1 The first pirate in line gets to propose a way to split up the gold (for example: 
everyone gets 20 bars). 
2 All the pirates, including the proposer, vote on whether to accept the proposal. 
If the proposal is rejected, the pirate who made it is killed. 
3 The next pirate in line then makes his proposal, and the 4 remaining pirates vote 
again. If the vote is tied (2 vs 2), the proposing pirate is still killed; only a majority 
can accept a proposal. The process continues until a proposal is accepted or there is 
only one pirate left. Assume that every pirate: 
above all wants to live; 
given that he will be alive, wants to get as much gold as possible; 
given the maximal possible amount of gold, wants to see any other pirate killed, 
just for fun. 
Each pirate knows his exact position in line, and 
all of the pirates are excellent puzzle solvers. 
Question : What proposal should the first pirate make? 
Hint: Assume first that there are only two pirates, and see what happens. Then assume that 
there are three pirates who have figured out what happens when there are only two, and see 
what they would do. Next, assume that there are four pirates who have figured out the 
three-pirate case. Finally, assume that there are five pirates who have figured out the 
four-pirate case. 
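The backward induction suggested in the hint can be mechanised. A sketch of a solver (the encoding of the preferences follows the assumptions above, in order: life first, then gold, then blood; the proposer buys the cheapest votes needed for a strict majority, breaking ties toward pirates earlier in line):

```python
def outcome(k, gold=100):
    """Result for k pirates: list of (survives, gold) from the current
    proposer (position 0) to the last pirate in line (position k-1)."""
    if k == 1:
        return [(True, gold)]                 # a lone pirate keeps everything
    cont = outcome(k - 1, gold)               # what happens if the proposer is killed
    # Price of pirate i's yes-vote: 0 if he dies in the continuation (life first),
    # otherwise his continuation gold + 1 (on equal gold he votes no, for blood).
    price = [0 if not cont[i - 1][0] else cont[i - 1][1] + 1 for i in range(1, k)]
    needed = k // 2                            # own vote + k//2 others > k/2 (ties kill)
    cheapest = sorted(range(k - 1), key=lambda i: price[i])[:needed]
    if sum(price[i] for i in cheapest) > gold:
        return [(False, 0)] + cont             # no affordable majority: proposer dies
    shares = [0] * (k - 1)
    for i in cheapest:
        shares[i] = price[i]
    return [(True, gold - sum(shares))] + [(True, s) for s in shares]

print(outcome(5))   # the first pirate's optimal proposal and everyone's fate
```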