Q.) Give the solutions for the following recurrences: i. T(n) = 2T(n/2) + n ii. T(n) = T(n-1) + 1/n

Subject: Data Structures

Solution for Recurrence i: T(n) = 2T(n/2) + n

This is a common recurrence relation that can be solved using the Master Theorem. The Master Theorem gives the asymptotic solution of recurrence relations of the form:

T(n) = aT(n/b) + f(n)

where,

  • a ≥ 1 is the number of subproblems in the recursion
  • n/b is the size of each subproblem (for simplicity, assume n is an exact power of b)
  • f(n) is the cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions

The recurrence given is T(n) = 2T(n/2) + n, which matches the form required by the Master Theorem with a=2, b=2, and f(n)=n.
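
For intuition, this is exactly the recurrence of merge sort: two recursive calls on the two halves of the input (a = 2, b = 2) plus a linear-time merge (f(n) = n). Here is a minimal illustrative sketch (not part of the original problem):

```python
def merge_sort(xs):
    """Sort a list; the running time satisfies T(n) = 2T(n/2) + Theta(n)."""
    if len(xs) <= 1:
        return xs                      # base case: constant work
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])        # T(n/2)
    right = merge_sort(xs[mid:])       # T(n/2)
    # Merge step: Theta(n) work, i.e. the "+ n" term in the recurrence.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))     # [1, 2, 5, 7, 9]
```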

The Master Theorem gives the following solutions, depending on how f(n) compares with n^(log_b(a)):

  1. If f(n) = O(n^(log_b(a) - ε)) for some ε > 0, then T(n) = Θ(n^(log_b(a))).
  2. If f(n) = Θ(n^(log_b(a)) * log^k(n)) for some k ≥ 0, then T(n) = Θ(n^(log_b(a)) * log^(k+1)(n)).
  3. If f(n) = Ω(n^(log_b(a) + ε)) for some ε > 0, and if af(n/b) ≤ cf(n) for some c < 1 and sufficiently large n, then T(n) = Θ(f(n)).
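
As a sketch of how these cases are applied mechanically, the helper below classifies recurrences whose driving function is a simple polynomial, f(n) = Θ(n^d) (which covers f(n) = n with d = 1). The function name and parameters are illustrative, not standard library code:

```python
import math

def master_theorem(a, b, d):
    """Asymptotic solution of T(n) = a*T(n/b) + Theta(n^d), polynomial f(n) only."""
    crit = math.log(a, b)                      # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Theta(n^{d:g} * log n)"       # case 2 with k = 0
    elif d < crit:
        return f"Theta(n^{crit:g})"            # case 1: recursive work dominates
    else:
        # case 3: f(n) dominates (the regularity condition holds for polynomial f)
        return f"Theta(n^{d:g})"

print(master_theorem(2, 2, 1))                 # the given recurrence -> Theta(n^1 * log n)
```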

For our recurrence:

  • a = 2
  • b = 2
  • f(n) = n
  • log_b(a) = log_2(2) = 1

Now we compare f(n) with n^(log_b(a)):

f(n) = n = n^(log_2(2)) = n^1

Since f(n) is equal to n^(log_b(a)), we are in case 2 of the Master Theorem with k=0.

Therefore, the solution to the recurrence is:

T(n) = Θ(n^(log_2(2)) * log^(0+1)(n)) = Θ(n * log(n))
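
As a quick sanity check (assuming a base case T(1) = 1, which the problem does not specify), we can compute T(n) directly for powers of two and confirm that T(n)/(n log n) settles to a constant:

```python
import math

def T(n):
    # T(n) = 2*T(n/2) + n with the assumed base case T(1) = 1.
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in (4, 8, 12, 16, 20):
    n = 2 ** k
    print(n, T(n) / (n * math.log2(n)))   # ratio is 1 + 1/log2(n), approaching 1
```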

Solution for Recurrence ii: T(n) = T(n-1) + 1/n

This recurrence does not fit the Master Theorem, so we'll solve it using the iterative method (also known as the method of backward substitution).

We will expand the recurrence step by step:

T(n) = T(n-1) + 1/n
     = T(n-2) + 1/(n-1) + 1/n
     = T(n-3) + 1/(n-2) + 1/(n-1) + 1/n
     = ...
     = T(1) + 1/2 + 1/3 + ... + 1/(n-1) + 1/n

Assuming T(1) is some constant c (the base case is not given), the recurrence sums the harmonic series 1/2 + 1/3 + ... + 1/n (all terms except the first). The harmonic series grows logarithmically, so the sum can be approximated by an integral:

T(n) ≈ c + ∫ from 1 to n of (1/x) dx = c + [ln(x)] from 1 to n = c + ln(n) - ln(1) = c + ln(n)

Therefore, the solution to the recurrence is:

T(n) = Θ(ln(n)) = Θ(log n)
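
Again as a sanity check (assuming the base case T(1) = c = 0 purely for illustration), summing the recurrence directly shows that T(n) - ln(n) converges to a constant, consistent with T(n) = Θ(ln n):

```python
import math

def T(n, c=0.0):
    # Unrolled form of T(n) = T(n-1) + 1/n with assumed base case T(1) = c.
    total = c
    for k in range(2, n + 1):
        total += 1.0 / k
    return total

for n in (10, 100, 10_000, 1_000_000):
    print(n, T(n), math.log(n), T(n) - math.log(n))   # difference tends to a constant
```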

Here's a summary table of the solutions:

Recurrence Relation   | Solution Method  | Solution
T(n) = 2T(n/2) + n    | Master Theorem   | Θ(n log(n))
T(n) = T(n-1) + 1/n   | Iterative Method | Θ(ln(n))

These solutions provide the asymptotic behavior of the given recurrence relations.