Recurrences#

TL;DR

Recurrence relations are equations that define the runtime of an algorithm (or the space usage of a data structure) recursively, in terms of the cost of the same problem on smaller inputs. In other words, they describe the time or space complexity of an algorithm by expressing it as a function of the size of the input data.

In the context of data structures and algorithms, recurrences are commonly used to analyze the performance of recursive algorithms such as quicksort, mergesort, and binary search. By using recurrence relations, we can obtain a closed-form expression for the runtime complexity of such algorithms and make predictions about their performance as the input size grows.

Solving recurrence relations can be challenging, and there are several methods to do so, including the substitution method, the recursion tree method, and the master theorem. Understanding and using recurrence relations is an important skill for anyone studying data structures and algorithms, as it enables them to analyze the performance of algorithms and design more efficient solutions.


Factorial of n (formula)#

\(n!\)

For each positive integer \(n\), the quantity \(n\) factorial, denoted \(n!\), is defined to be the product of all the integers from \(1\) to \(n\):

\[n! = n · (n − 1) \dots 3·2·1\]

Zero factorial, denoted 0!, is defined to be 1:

\[0! = 1\]

Discrete Mathematics with Applications, 4th ed.

Examples

Simplify the following expressions:

\(\frac{8!}{7!}\)
\[\begin{split}\begin{align} \frac{8!}{7!} & = \frac{8 * 7!}{7!} \\ & = 8 \\ \end{align}\end{split}\]
\(\frac{5!}{2!*3!}\)
\[\begin{split}\begin{align} \frac{5!}{2!*3!} & = \frac{5 * 4 * 3!}{2! *3!} \\ & = \frac{5 * 4}{2 * 1} \\ & = \frac{20}{2} \\ & = 10 \\ \end{align}\end{split}\]
\(\frac{(n)!}{(n - 3)!}\)
\[\begin{split}\begin{align} \frac{(n)!}{(n - 3)!} & = \frac{n * (n - 1) * (n - 2) * (n - 3)!}{(n - 3)!} \\ & = n * (n - 1) * (n - 2) \\ & = n^3 - 3n^2 + 2n \end{align}\end{split}\]

code

int fact(int num) {
  if (num == 0) return 1;        // base case: 0! = 1
  return num * fact(num - 1);    // recursive case: n! = n * (n - 1)!
}
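For example, fact(5) returns 120 (= 5 · 4 · 3 · 2 · 1). Note that a 32-bit int overflows past \(12!\); a 64-bit integer type extends the range to \(20!\).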

https://www.geeksforgeeks.org/factorial-formula/

Use Cases

Permutation

  • gives the number of ways to select \(r\) elements from \(n\) elements when order matters

\[^nP_r = \frac{n!}{(n - r)!}\]
Example

Three different fruits are to be distributed among a group of 10 people. Find the total number of ways this can be possible.

With \(n = 10\) and \(r = 3\):

\[\begin{split}\begin{align} ^{10}P_3 &= \frac{10!}{(10 - 3)!} \\ &= \frac{10!}{7!} \\ &= \frac{10 × 9 × 8 × 7!}{7!} \\ &= 10 × 9 × 8 \\ &= 720 \end{align}\end{split}\]

Combination

  • gives the number of ways to select \(r\) elements from \(n\) elements where order does not matter

\[^nC_r = \frac{n!}{r!\ (n - r)!}\]
Example

Find the number of ways 3 students can be selected from a class of 50 students.

With \(n = 50\) and \(r = 3\):

\[\begin{split}\begin{align} ^{50}C_3 &= \frac{50!}{3!\ (50 - 3)!} \\ &= \frac{50 × 49 × 48 × 47!}{3! × 47!} \\ &= \frac{50 × 49 × 48}{6} \\ &= 19,600 \end{align}\end{split}\]
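As an illustration, here is a minimal C sketch of both formulas (the helper names nPr and nCr are our own). It multiplies the falling product directly rather than computing full factorials, which would overflow for inputs like \(n = 50\):

#include <stdio.h>

// nPr: the falling product n * (n - 1) * ... * (n - r + 1)
unsigned long long nPr(unsigned int n, unsigned int r) {
  unsigned long long p = 1;
  for (unsigned int i = 0; i < r; i++)
    p *= n - i;
  return p;
}

// nCr: multiply and divide incrementally; the division is exact at each
// step because the running value is always a binomial coefficient
unsigned long long nCr(unsigned int n, unsigned int r) {
  unsigned long long c = 1;
  for (unsigned int i = 1; i <= r; i++)
    c = c * (n - r + i) / i;
  return c;
}

int main(void) {
  printf("10P3 = %llu\n", nPr(10, 3));   // 720
  printf("50C3 = %llu\n", nCr(50, 3));   // 19600
  return 0;
}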

Recurrence Relations#

By itself, a recurrence does not describe the running time of an algorithm

  • need a closed-form solution (non-recursive description)

  • exact closed-form solution may not exist, or may be too difficult to find

For most recurrences, an asymptotic solution of the form \(\Theta()\) is acceptable

  • …in the context of analysis of algorithms

Methods of Solving Recurrences#

Unrolling

In the unrolling method, we repeatedly substitute the recurrence relation into itself until we reach a base case. Let’s unroll the recurrence relation for a specific value of \(n\):

\[\begin{split}\begin{align} T(n) &= 2T \bigg(\frac{n}{2} \bigg) + n \\ &= 2 \bigg[2T \bigg(\frac{n}{4} \bigg) + \frac{n}{2} \bigg] + n \\ &= 4T \bigg(\frac{n}{4} \bigg) + 2n + n \\ &= 4 \bigg[2T \bigg(\frac{n}{8} \bigg) + \frac{n}{4} \bigg] + 2n + n \\ &= 8T \bigg(\frac{n}{8} \bigg) + 4n + 2n + n \\ &= 2^kT \bigg(\frac{n}{2^k} \bigg) + kn \\ \end{align}\end{split}\]

We continue this process until we reach the base case, which occurs when \(\frac{n}{2^k} = 1\). Solving for \(k\), we find that \(k = log_2\ (n)\). Therefore, the final unrolled expression becomes:

\[ \ \ T(n) = 2^{log_2 (n)} \ T(1) + n \ log_2 \ (n) \ \ \]

Since \(T(1)\) is a constant, we can simplify the expression further:

\[\begin{split}\begin{align} T(n) &= nT(1) + n\ log_2 \ (n) \\ &= O(n\ log_2\ (n)) \end{align}\end{split}\]

Thus, the solution obtained through unrolling suggests that the time complexity of the original recurrence relation is \(O(n\ log_2 (n))\).
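As a sanity check on this closed form, a small C harness (our own, not part of the original analysis) can evaluate the recurrence directly from its definition; for \(n\) a power of two it reproduces \(T(n) = n\ log_2\ (n) + n\) exactly:

#include <stdio.h>

// Evaluate T(n) = 2T(n/2) + n with T(1) = 1 directly from its definition
long long T(long long n) {
  if (n <= 1) return 1;       // base case
  return 2 * T(n / 2) + n;    // two half-size subproblems plus n local work
}

int main(void) {
  // for n a power of two, T(n) = n * log2(n) + n exactly
  for (long long n = 2; n <= 1024; n *= 2)
    printf("n = %4lld   T(n) = %6lld\n", n, T(n));
  return 0;
}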

Guessing

In the guessing method, we make an educated guess or hypothesis about the form of the solution based on the recurrence relation. We then use mathematical induction or substitution to prove our guess.

Let’s guess that the solution to the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\) is \(T(n) = O(n\ log\ (n))\).

We assume that \(T(k) ≤ ck\ log\ (k)\) for some constant \(c\), where \(k < n\). Now we substitute this assumption into the recurrence relation:

\[\begin{split}\begin{align} T(n) &= 2T \bigg(\frac{n}{2} \bigg) + n \\ &≤ 2 \Bigg(\ c \bigg( \frac{n}{2} \bigg)\ log\ \bigg(\frac{n}{2} \bigg) \Bigg) + n \\ &= cn\ log (n) - cn\ log (2) + n \\ &= cn\ log (n) - cn + n \\ &= cn\ log (n) + (n - cn) \\ \end{align}\end{split}\]

To ensure that \(T(n) ≤ cn\ log\ (n)\) holds, we need to find a value of \(c\) such that \((n - cn) ≤ 0\). By choosing \(c ≥ 1\), we can guarantee that \(T(n) ≤ cn\ log\ (n)\).

Hence, the solution \(T(n) = O(n\ log\ (n))\) satisfies the recurrence relation.

Recursion Tree

In the recursion tree method, we draw a tree to represent the recursive calls made in the recurrence relation. Each level of the tree corresponds to a recursive call, and we analyze the work done at each level.

For our example, the recursion tree for \(T(n) = 2T(\frac{n}{2}) + n\) would have a root node representing \(T(n)\), two child nodes representing \(T(\frac{n}{2})\), and so on. Each node represents a recursive call, and the work done at each node is represented by the term n.

The recursion tree will have \(log_2\ (n)\) levels since we divide the problem size by \(2\) at each level. There are \(2^i\) nodes at level \(i\), but each node works on a subproblem of size \(\frac{n}{2^i}\) and therefore does \(\frac{n}{2^i}\) work. The total work at each level is thus \(2^i * \frac{n}{2^i} = n\).

Summing up the work done at each level, we have:

\[\begin{split}\begin{align} Total\ work &= n + n + n + ... + n\ \ (log_2\ (n)\ times) \\ &= n\ log_2\ (n) \\ \end{align}\end{split}\]

Therefore, the time complexity of the recurrence relation is \(O(n\ log\ (n))\).

Note that all three methods, unrolling, guessing, and the recursion tree, agree on \(O(n\ log\ (n))\) for this recurrence, as they must; the recursion tree simply makes the per-level accounting explicit.

Master Theorem

The master theorem is a mathematical formula used to analyze the time complexity of divide-and-conquer algorithms. It provides a solution for recurrence relations of the form:

\[T(n) = aT \bigg(\frac{n}{b} \bigg) + f(n)\]

Where:

  • \(T(n)\) represents the time complexity of the algorithm,

  • \(a\) is the number of recursive subproblems,

  • \(\frac{n}{b}\) is the size of each subproblem,

  • \(f(n)\) is the time complexity of the remaining work done outside the recursive calls.

The master theorem provides three cases based on the relationship between \(f(n)\) and the subproblem part \(aT(\frac{n}{b})\).

Case 1

If \(f(n) = O(n^c)\) for some constant \(c < log_b(a)\), then the time complexity is dominated by the subproblem part, and the overall time complexity is given by

\[T(n) = \Theta(n^{log_b(a)})\]
Case 2

If \(f(n) = \Theta(n^c \ log^k\ (n))\) for some constant \(k ≥ 0\), where \(c = log_b(a)\), then the subproblem part and the non-recursive part grow at the same rate, and the overall time complexity is given by

\[T(n) = \Theta(n^{log_b(a)} \ log^{(k+1)}(n))\]
Case 3

If \(f(n) = \Omega(n^c)\) for some constant \(c > log_b(a)\), and if \(f(n)\) satisfies the regularity condition (\(af(\frac{n}{b}) ≤ kf(n)\) for some constant \(k < 1\) and sufficiently large \(n\)), then the non-recursive part dominates the time complexity, and the overall time complexity is given by

\[T(n) = \Theta(f(n))\]
Example 1

\[T(n) = 4T \bigg(\frac{n}{2} \bigg) + n\]
Step 1 : Identify the values of a, b, and f(n)

In this case,

\[a = 4\]
\[b = 2\]
\[f(n) = n\]
Step 2 : Compare f(n) with n^log_b(a)

Here, \(f(n) = n\) and \(n^{log_b(a)} = n^{log_2(4)} = n^2\)

Since \(f(n) = n = n^1 < n^2 = n^{log_b(a)}\), we fall into Case 1 of the master theorem.

Step 3 : Determine the overall time complexity

Using Case 1, the overall time complexity is \(T(n) = \Theta(n^{log_b(a)})\).

In this case, \(T(n) = \Theta(n^{log_2\ (4)}) = \Theta(n^2)\).

So, the overall time complexity of the algorithm in this example is \(\Theta(n^2)\), which means the algorithm’s running time grows quadratically with the input size \(n\).
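For intuition, here is a hypothetical code skeleton (our own, in the style of the examples later in this section) whose running time satisfies this recurrence: four recursive calls on half-size inputs plus a linear pass.

void Test(int n) {                // = T(n)
  if (n <= 1) return;             // = 1
  for (int i = 0; i < n; i++) {   // f(n) = n
    // some statement
  }
  Test(n / 2);                    // a = 4 recursive calls,
  Test(n / 2);                    // each on a subproblem
  Test(n / 2);                    // of size n / b = n / 2
  Test(n / 2);
}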

Example 2

\[T(n) = 9T \bigg(\frac{n}{3} \bigg) + n^2\]
Step 1 : Identify the values of a, b, and f(n)

In this case:

\[a = 9 \ \ \ (number\ of\ subproblems)\]
\[b = 3 \ \ \ (each\ subproblem\ has\ size\ \frac{n}{3})\]
\[ \ f(n) = n^2 \ \]
Step 2 : Compare f(n) with n^log_b(a)

Here, \(f(n) = n^2\) and \(n^{log_b(a)} = n^{log_3(9)} = n^2\)

Since \(f(n) = n^2 = n^{log_b(a)}\), we fall into Case 2 of the master theorem.

Step 3 : Determine the overall time complexity

Using Case 2, the overall time complexity is

\[T(n) = Θ(n^2 \ log(n))\]
Example 3

\[T(n) = 2T \bigg(\frac{n}{2} \bigg) + n^3\]
Step 1 : Identify the values of a, b, and f(n)

In this case,

\[a = 2\]
\[b = 2\]
\[f(n) = n^3\]
Step 2 : Compare f(n) with n^log_b(a)

Here, \(f(n) = n^3\) and \(n^{log_b(a)} = n^{log_2(2)} = n^1 = n\)

Since \(f(n) = n^3 > n = n^{log_b(a)}\), we fall into Case 3 of the master theorem.

Step 3: Check if f(n) satisfies the regularity condition

In this case, we can see that \(af(\frac{n}{b}) = 2 \big(\frac{n}{2} \big)^3 = (\frac{1}{4})n^3 \le kn^3\) for \(k = \frac{1}{4}\) and sufficiently large \(n\).

The regularity condition is satisfied.

Step 4 : Determine the overall time complexity.

Using Case 3, the overall time complexity is

\[T(n) = Θ(n^3)\]

Returning to our running example, the recurrence relation is \(T(n) = 2T(\frac{n}{2}) + n\), where \(a = 2\), \(b = 2\), and \(f(n) = n\).

Now, let’s compare the function \(f(n)\) with \(n^{log_b\ (a)}\):

\(f(n) = n\), which is asymptotically equal to \(n^{log_2\ (2)} = n\).

According to the Master Theorem:

If \(f(n) = \Theta(n^{log_b\ (a)}\ log^k\ (n))\) for some \(k ≥ 0\), then \(T(n) = \Theta(n^{log_b\ (a)}\ log^{k+1}\ (n))\). In our case, \(f(n) = \Theta(n) = \Theta(n^{log_2\ (2)}\ log^0\ (n))\), so Case 2 applies with \(k = 0\). Since the condition is met, the time complexity of the recurrence relation is \(\Theta(n^{log_2\ (2)}\ log\ (n)) = \Theta(n\ log\ (n))\).

Therefore, according to the Master Theorem, the time complexity of the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\) is \(\Theta(n\ log\ (n))\), which aligns with the results obtained from the unrolling and guessing methods.

It’s important to note that the Master Theorem is applicable in specific cases, and not all recurrence relations can be solved using this theorem. However, for recurrence relations that follow the prescribed form, it provides a convenient way to determine the time complexity.

Examples#

Recurrence Relation: Dividing Function

void Test(int n) {       // = T(n)
  if (n > 0) {
    printf("%d", n);     // = 1
    Test(n / 2);         // = T(n/2)
  }
}
\[\begin{split} T(n) = \begin{cases} 1, & \text{$n = 1$} \\[2ex] T(\frac{n}{2}) + 1, & \text{$n \gt 1$} \end{cases} \end{split}\]
Unrolling

Let’s unroll the recurrence relation step by step, writing the constant work per call as \(c\):

\[\begin{split}\begin{align} Step\ 1 : \ & T(n) = T(\frac{n}{2}) + c \\ Step\ 2 : \ & T(n) = \bigg[T(\frac{n}{4}) + c \bigg] + c \\ & = T(\frac{n}{4}) + 2c \\ Step\ 3 : \ & T(n) = \bigg[T(\frac{n}{8}) + c \bigg] + 2c \\ & = T(\frac{n}{8}) + 3c \\ \end{align}\end{split}\]

Continuing this process, we can unroll the recurrence relation up to k steps:

\[ \ \ T(n) = T \bigg(\frac{n}{2^k} \bigg) + kc \ \ \]

We keep unrolling until we reach the base case, which occurs when \(\frac{n}{2^k} = 1\). Solving for \(k\), we find that \(k = log_2\ (n)\).

Now, let’s substitute this value of \(k\) into the unrolled relation:

\[ \ \ T(n) = T(\frac{n}{2^{log_2\ (n)}}) + log_2\ (n)c \ \ \]

Since \(\frac{n}{2^{log_2\ (n)}} = 1\), we simplify further:

\[T(n) = T(1) + log_2\ (n)c\]

Since \(T(1)\) is a constant time complexity for the base case of a binary search, we can replace it with a constant, say \(d\):

\[T(n) = d + log_2\ (n)c\]

Finally, we can simplify this expression as:

\[T(n) = O(log\ n)\]

Therefore, after unrolling the recurrence relation for binary search, we obtain the time complexity of \(O(log\ n)\).

This analysis shows that the time complexity of a binary search algorithm is logarithmic with respect to the size of the array being searched, which aligns with the intuitive understanding of the algorithm’s efficiency.
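As a concrete instance of this recurrence, here is a minimal recursive binary search sketch (the function name and signature are our own): each call does constant work and then recurses on one half of the range, so its running time satisfies \(T(n) = T(\frac{n}{2}) + 1\).

// Search for key in the sorted array a[lo..hi]; returns its index or -1
int bsearch_rec(const int a[], int lo, int hi, int key) {
  if (lo > hi) return -1;          // base case: empty range
  int mid = lo + (hi - lo) / 2;    // constant work per call ("+ 1")
  if (a[mid] == key) return mid;
  if (a[mid] < key)
    return bsearch_rec(a, mid + 1, hi, key);   // T(n/2)
  return bsearch_rec(a, lo, mid - 1, key);     // T(n/2)
}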

Guessing

Let’s make a guess or hypothesis about the form of the solution based on the recurrence relation \(T(n) = T(\frac{n}{2}) + 1\).

Let’s assume that \(T(n) = O(log\ (n))\).

We assume that \(T(k) ≤ c\ log\ (k)\) for some constant \(c\), where \(k < n\). Now, let’s substitute this assumption into the recurrence relation:

\[\begin{split}\begin{align} T(n) &= T \bigg(\frac{n}{2} \bigg) + 1 \\ &≤ c\ log\ \frac{n}{2} + 1 \\ &= c(log\ (n) - 1) + 1 \\ &= c\ log\ (n) - c + 1 \\ &= c\ log\ (n) + (1 - c) \\ \end{align}\end{split}\]

To ensure that \(T(n) ≤ c\ log\ (n)\) holds, we need to find a value of \(c\) such that \((1 - c) ≤ 0\). By choosing \(c ≥ 1\), we can guarantee that \(T(n) ≤ c\ log\ (n)\).

Hence, the solution \(T(n) = O(log\ n)\) satisfies the recurrence relation.

Recursion Tree

At each level of the recursion tree, we divide the problem size by \(2\), following the recurrence relation \(T(n) = T(\frac{n}{2}) + 1\). The work done at each level is constant, represented by the “+ 1” term.

Let’s start with the initial value \(T(n)\) and recursively split it into smaller sub-problems until we reach the base case.

(Figure: recursion tree for \(T(n) = T(\frac{n}{2}) + 1\): a single chain of nodes \(T(n), T(\frac{n}{2}), T(\frac{n}{4}), \dots\), each doing constant work. Source: https://media.geeksforgeeks.org/wp-content/uploads/20210608091942/img1-300x141.PNG)

The recursion tree will have \(log_2\ (n)\) levels since we divide the problem size by \(2\) at each level. At each level, the work done is a constant “+ 1”.

The total work done can be calculated by summing up the work at each level:

\[\begin{split}\begin{align} Total work &= 1 + 1 + 1 + ... + 1\ (log_2\ (n) times) \\ &= log_2\ (n) \\ \end{align}\end{split}\]

Therefore, the time complexity of the recurrence relation \(T(n) = T(\frac{n}{2}) + 1\), analyzed using the recursion tree method, is \(O(log\ n)\).

The recursion tree method provides a direct visualization of the recursive calls and the corresponding work done at each level, allowing us to analyze the time complexity of the recurrence relation.

Master Theorem

The Master Theorem provides a framework for solving recurrence relations of the form \(T(n) = aT(\frac{n}{b}) + f(n)\), where \(a ≥ 1\), \(b > 1\), and \(f(n)\) is a function.

In our case, \(T(n) = T(\frac{n}{2}) + 1\), where \(a = 1\), \(b = 2\), and \(f(n) = 1\).

Comparing \(f(n) = 1\) with \(n^{log_b\ (a)}\):

\(f(n) = 1\), which is equal to \(n^{log_2\ (1)} = n^0 = 1\).

According to the Master Theorem:

If \(f(n) = \Theta(n^{log_b\ (a)} * log^k\ (n))\), where \(k ≥ 0\), then \(T(n) = \Theta(n^{log_b\ (a)} * log^{k+1}\ (n))\). In our case, \(f(n) = \Theta(1)\), which falls under the second case with \(k = 0\). Therefore, the time complexity of the recurrence relation \(T(n) = T(\frac{n}{2}) + 1\) is \(\Theta(n^{log_2\ (1)} * log^{0+1}\ (n)) = \Theta(log\ (n))\).

Thus, according to the Master Theorem, the time complexity of the recurrence relation \(T(n) = T(\frac{n}{2}) + 1\) is \(\Theta(log\ (n))\), which aligns with the results obtained from unrolling, guessing, and the recursion tree.

All four methods provide consistent solutions, indicating that the time complexity of the recurrence relation \(T(n) = T(\frac{n}{2}) + 1\) is \(O(log\ n)\).

Recurrence Relation: Linear

void Test(int n) {       // = T(n)
  if (n > 0) {
    printf("%d", n);     // = 1
    Test(n - 1);         // = T(n - 1)
  }
}
\[\begin{split} T(n) = \begin{cases} 1, & \text{$n\ = 0$} \\[2ex] T(n - 1) + 1, & \text{$n\ \gt 0$} \end{cases} \end{split}\]
Unrolling

Let’s unroll the recurrence relation step by step:

\[\begin{split}\begin{align} Step\ 1 : \ & T(n) = T(n - 1) + 1 \\ Step\ 2 : \ & T(n) = [T(n - 2) + 1] + 1 \\ & = T(n - 2) + 2 \\ Step\ 3 : \ & T(n) = [T(n - 3) + 1] + 2 \\ & = T(n - 3) + 3 \\ \end{align}\end{split}\]

Continuing this process, we can unroll the recurrence relation up to k steps:

Step k:

\[ \ \ T(n) = T(n - k) + k \ \ \]

We keep unrolling until we reach the base case, which occurs when \(n - k = 0\). Solving for \(k\), we find that \(k = n\).

Now, let’s substitute this value of \(k\) into the unrolled relation:

\[ \ \ T(n) = T(0) + n \ \ \]

Since \(T(0)\) represents the time complexity for the base case, which is a constant, we can replace it with a constant, say \(c\):

\[ \ \ \ T(n) = c + n \ \ \ \]

Finally, we can simplify this expression as:

\[ \ \ T(n) = O(n) \ \ \]

Therefore, the solution obtained through unrolling suggests that the time complexity of the recurrence relation \(T(n) = T(n - 1) + 1\) is \(O(n)\).

Guessing

Let’s make a guess or hypothesis about the form of the solution based on the recurrence relation \(T(n) = T(n - 1) + 1\).

Let’s assume that \(T(n) = O(n)\).

We assume that \(T(k) ≤ ck\) for some constant \(c\), where \(k < n\). Now, let’s substitute this assumption into the recurrence relation:

\[\begin{split}\begin{align} T(n) & = T(n - 1) + 1 \\ & ≤ c(n - 1) + 1 \\ & = cn - c + 1 \\ \end{align}\end{split}\]

To ensure that \(T(n) ≤ cn\) holds, we need to find a value of \(c\) such that \((-c + 1) ≤ 0\). By choosing \(c ≥ 1\), we can guarantee that \(T(n) ≤ cn\).

Hence, the solution \(T(n) = O(n)\) satisfies the recurrence relation.

Recursion Tree

Let’s draw a recursion tree to represent the recursive calls made in the recurrence relation \(T(n) = T(n - 1) + 1\). Each level of the tree corresponds to a recursive call, and we analyze the work done at each level.

The recursion tree will have n levels since we subtract \(1\) from \(n\) at each level. At each level, the work done is \(1\), and there is only one node at each level. Therefore, the total work at each level is \(1\).

Summing up the work done at each level, we have:

\[\begin{split}\begin{align} Total work & = 1 + 1 + 1 + ... + 1\ (n\ times) \\ & = n \\ \end{align}\end{split}\]

Therefore, the time complexity of the recurrence relation is \(O(n)\).

Master Theorem

The Master Theorem is not applicable to the recurrence relation \(T(n) = T(n - 1) + 1\) because it is not in the required form of \(T(n) = aT(\frac{n}{b}) + f(n)\).

Therefore, the Master Theorem cannot be directly used to solve this recurrence relation.

To summarize, the analysis using unrolling, guessing, recursion tree, and the Master Theorem (where applicable) all yield a time complexity of \(O(n)\) for the recurrence relation \(T(n) = T(n - 1) + 1\).
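Since the recursive call is in tail position, the same computation can be written as a loop, which makes the \(O(n)\) bound immediate; a sketch of ours, equivalent to the Test function above:

void TestIter(int n) {
  for (int i = n; i > 0; i--)   // one iteration per recursive level: n levels
    printf("%d", i);            // = 1 unit of work per level
}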

Recurrence Relation: Divide & Conquer

void Test(int n) {                // = T(n)
  if (n > 1) {
    for (int i = 0; i < n; i++) { // = n
      // some statement
    }
    Test(n / 2);                  // = T(n/2)
    Test(n / 2);                  // = T(n/2)
  }
}
\[\begin{split} T(n) = \begin{cases} 1, & \text{$n\ = 1$} \\[2ex] 2T(\frac{n}{2}) + n, & \text{$n\ \gt 1$} \end{cases} \end{split}\]
Unrolling

Let’s unroll the recurrence relation step by step:

\[\begin{split}\begin{align} Step\ 1 : \ & T(n) = 2T \bigg(\frac{n}{2} \bigg) + n \\ Step\ 2 : \ & T(n) = 2 \bigg[2T \bigg(\frac{n}{4} \bigg) + \frac{n}{2} \bigg] + n \\ & = 4T \bigg(\frac{n}{4} \bigg) + 2n \\ Step\ 3 : \ & T(n) = 4 \bigg[2T \bigg(\frac{n}{8} \bigg) + \frac{n}{4} \bigg] + 2n \\ & = 8T \bigg(\frac{n}{8} \bigg) + 3n \\ \end{align}\end{split}\]

Continuing this process, we can unroll the recurrence relation up to k steps:

\[\begin{split}\begin{align} Step\ k : \ & T(n) = 2^{k}\ T \bigg( \frac{n}{2^k} \bigg) + kn \\ \end{align}\end{split}\]

We keep unrolling until we reach the base case, which occurs when \( \frac{n}{2^k} = 1 \). Solving for \(k\), we find that \(k = log_2\ (n)\).

Now, let’s substitute this value of k into the unrolled relation:

\[\begin{split}\begin{align} T(n) &= 2^{log_2\ (n)}\ T \bigg(\frac{n}{2^{log_2\ (n)}} \bigg) + n\ log_2\ (n) \\ &= nT(1) + n\ log_2\ (n) \\ &= n + n\ log_2\ (n) \\ \end{align}\end{split}\]

Since the base case gives \(T(1) = 1\), the first term is simply \(n\). Dropping the lower-order term, we can simplify this expression as:

\[ \ \ T(n) = O(n\ log\ (n)) \ \ \]

Therefore, the solution obtained through unrolling suggests that the time complexity of the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\) is \(O(n\ log\ (n))\).

Guessing

Let’s make a guess or hypothesis about the form of the solution based on the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\).

Let’s assume that \(T(n) = O(n\ log\ (n))\).

We assume that \(T(k) ≤ ck\ log\ (k)\) for some constant \(c\), where \(k < n\). Now, let’s substitute this assumption into the recurrence relation:

\[\begin{split}\begin{align} T(n) &= 2T \bigg(\frac{n}{2} \bigg) + n \\ &≤ 2c \bigg(\frac{n}{2} \bigg)\ log\ \bigg(\frac{n}{2} \bigg) + n \\ &= cn\ log\ \bigg(\frac{n}{2} \bigg) + n \\ &= cn\ log\ (n) - cn\ log\ (2) + n \\ &= cn\ log\ (n) - cn + n \\ \end{align}\end{split}\]

To ensure that \(T(n) ≤ cn\ log\ (n)\) holds, we need to find a value of \(c\) such that \((-cn + n) ≤ 0\). By choosing \(c ≥ 1\), we can guarantee that \(T(n) ≤ cn\ log\ (n)\).

Hence, the solution \(T(n) = O(n\ log\ (n))\) satisfies the recurrence relation.

Recursion Tree

Let’s draw a recursion tree to represent the recursive calls made in the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\). Each level of the tree corresponds to a recursive call, and we analyze the work done at each level.

(Figure: recursion tree for \(T(n) = 2T(\frac{n}{2}) + n\): each level splits every node in two, and the per-node work at level \(i\) is \(\frac{n}{2^i}\). Source: https://algorithmtutor.com/images/recursion_tree1_final.png)

The recursion tree will have \(log_2\ (n)\) levels since we divide the problem size by \(2\) at each level. There are \(2^i\) nodes at level \(i\), but each node works on a subproblem of size \(\frac{n}{2^i}\), so the total work at each level is \(2^i * \frac{n}{2^i} = n\).

Summing up the work done at each level, we have:

\[\begin{split}\begin{align} Total\ work &= n + n + n + ... + n\ \ (log_2\ (n)\ times) \\ &= n\ log_2\ (n) \\ \end{align}\end{split}\]

Therefore, the time complexity of the recurrence relation is \(O(n\ log\ (n))\), in agreement with the unrolling and guessing methods.

Master Theorem

The Master Theorem is applicable to the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\). We can express it in the form \(T(n) = aT(\frac{n}{b}) + f(n)\) with \(a = 2\), \(b = 2\), and \(f(n) = n\).

Comparing \(f(n) = n\) with \(n^{log_b\ (a)}\):

\(f(n) = n\), which is equal to \(n^{log_2\ (2)} = n\).

According to the Master Theorem:

If \(f(n) = \Theta(n^{log_b\ (a)}\ log^k\ (n))\) for some \(k ≥ 0\), then \(T(n) = \Theta(n^{log_b\ (a)}\ log^{k+1}\ (n))\). In our case, \(f(n) = \Theta(n) = \Theta(n^{log_2\ (2)}\ log^0\ (n))\), which falls under the second case with \(k = 0\). Therefore, the time complexity of the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\) is \(\Theta(n^{log_2\ (2)}\ log\ (n)) = \Theta(n\ log\ (n))\).

Thus, according to the Master Theorem, the time complexity of the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\) is \(\Theta(n\ log\ (n))\), which aligns with the results obtained from unrolling, guessing, and the recursion tree.

All four methods provide consistent solutions, indicating that the time complexity of the recurrence relation \(T(n) = 2T(\frac{n}{2}) + n\) is \(O(n\ log\ n)\).
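Merge sort is the classic algorithm governed by exactly this recurrence; a compact sketch (our own) makes the correspondence concrete: two half-size recursive calls plus a linear-time merge.

#include <string.h>

// Sort a[lo..hi) using buf as scratch space (buf must be as large as a).
// Cost: T(n) = 2T(n/2) + n, two recursive calls plus a linear merge.
void merge_sort(int a[], int buf[], int lo, int hi) {
  if (hi - lo <= 1) return;          // base case: T(1) = 1
  int mid = lo + (hi - lo) / 2;
  merge_sort(a, buf, lo, mid);       // T(n/2)
  merge_sort(a, buf, mid, hi);       // T(n/2)
  int i = lo, j = mid, k = lo;
  while (i < mid && j < hi)          // the "+ n" merge pass
    buf[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
  while (i < mid) buf[k++] = a[i++];
  while (j < hi)  buf[k++] = a[j++];
  memcpy(a + lo, buf + lo, (size_t)(hi - lo) * sizeof(int));
}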

Recurrence Relation: Tower of Hanoi

void Test(int n) {      // = T(n)
  if (n > 0) {
    printf("%d", n);    // = 1
    Test(n - 1);        // = T(n - 1)
    Test(n - 1);        // = T(n - 1)
  }
}
\[\begin{split} T(n) = \begin{cases} 1, & \text{$n\ = 0$} \\[2ex] 2T(n - 1) + 1, & \text{$n\ \gt 0$} \end{cases} \end{split}\]
Unrolling

Let’s unroll the recurrence relation step by step:

\[\begin{split}\begin{align} Step\ 1 : \ & T(n) = 2T(n - 1) + 1 \\ Step\ 2 : \ & T(n) = 2[2T(n - 2) + 1] + 1 \\ & = 4T(n - 2) + 3 \\ Step\ 3 : \ & T(n) = 2[2[2T(n - 3) + 1] + 1] + 1 \\ & = 8T(n - 3) + 7 \\ \end{align}\end{split}\]

Continuing this process, we can unroll the recurrence relation up to k steps:

\[\begin{align} Step\ k: & T(n) = 2^{k}\ T(n - k) + (2^k - 1) \end{align}\]

We keep unrolling until we reach the base case, which occurs when \(n - k = 0\). Solving for \(k\), we find that \(k = n\).

Now, let’s substitute this value of \(k\) into the unrolled relation:

\[\begin{split}\begin{align} T(n) &= 2^{n}\ T(0) + (2^{n} - 1) \\ &= 2^{n} + 2^{n} - 1 \\ &= 2^{n + 1} - 1 \\ \end{align}\end{split}\]

Therefore, the solution obtained through unrolling suggests that the time complexity of the recurrence relation \(T(n) = 2T(n - 1) + 1\) is \(O(2^n)\).

Guessing

Let’s make a guess or hypothesis about the form of the solution based on the recurrence relation \(T(n) = 2T(n - 1) + 1\).

Let’s assume that \(T(n) = O(2^n)\).

We assume that \(T(k) ≤ c * 2^k\) for some constant \(c\), where \(k < n\). Now, let’s substitute this assumption into the recurrence relation:

\[\begin{split}\begin{align} T(n) &= 2T(n - 1) + 1 \\ &≤ 2(c * 2^{n - 1}) + 1 \\ &= c * 2^n + 1 \\ \end{align}\end{split}\]

To ensure that \(T(n) ≤ c * 2^n\) holds, we would need \(c * 2^n + 1 ≤ c * 2^n\), which is impossible, so the naive guess fails.

The standard remedy is to strengthen the hypothesis by subtracting a lower-order term: assume \(T(k) ≤ c * 2^k - 1\). Substituting, \(T(n) = 2T(n - 1) + 1 ≤ 2(c * 2^{n - 1} - 1) + 1 = c * 2^n - 1\), which holds for any \(c ≥ 1\).

Hence, the solution \(T(n) = O(2^n)\) does satisfy the recurrence relation, provided we guess a bound tight enough for the induction to go through.

Recursion Tree

Let’s draw a recursion tree to represent the recursive calls made in the recurrence relation \(T(n) = 2T(n - 1) + 1\). Each level of the tree corresponds to a recursive call, and we analyze the work done at each level.

(Figure: recursion tree for \(T(n) = 2T(n - 1) + 1\): a full binary tree of depth \(n\), each node doing constant work. Source: https://frederick-s.github.io/Introduction-to-Algorithms-Notes/04-Divide-and-Conquer/4.4-4.png)

The recursion tree will have \(n\) levels since we subtract \(1\) from \(n\) at each level. Each node does \(1\) unit of work, and there are \(2^i\) nodes at level \(i\). Therefore, the total work at level \(i\) is \(1 * 2^i = 2^i\).

Summing up the work done at each level, we have:

\[ \ \ Total work = 1 + 2 + 4 + ... + 2^{n - 1} \ \ \]

This is a geometric series with a common ratio of \(2\) and the first term being \(1\). The sum of this series is given by:

\[ \ \ Total work = 2^n - 1 \ \ \]

Therefore, the time complexity of the recurrence relation is \(O(2^n)\).

Master Theorem

The Master Theorem is not directly applicable to the recurrence relation \(T(n) = 2T(n - 1) + 1\) because it is not in the required form of \(T(n) = aT(\frac{n}{b}) + f(n)\).

Therefore, the Master Theorem cannot be used to directly solve this recurrence relation.

To summarize, the analysis using unrolling, guessing, recursion tree, and the Master Theorem (where applicable) all yield a time complexity of \(O(2^n)\) for the recurrence relation \(T(n) = 2T(n - 1) + 1\).
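For completeness, here is the actual Tower of Hanoi procedure behind this recurrence (a standard textbook formulation, not from the text above): moving \(n\) disks issues exactly \(2^n - 1\) moves, one per call that reaches the printf.

#include <stdio.h>

// Move n disks from peg 'from' to peg 'to', using peg 'via' as a spare.
// Each call makes one move plus two recursive calls: T(n) = 2T(n - 1) + 1.
void hanoi(int n, char from, char to, char via) {
  if (n == 0) return;                               // base case
  hanoi(n - 1, from, via, to);                      // T(n - 1)
  printf("move disk %d: %c -> %c\n", n, from, to);  // 1 move
  hanoi(n - 1, via, to, from);                      // T(n - 1)
}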