Here is a recursive definition that you might see when analyzing the running time of a "divide and conquer" algorithm. (We'll see algorithm analysis in a few weeks.)
Let's try to work out the closed form using unrolling. For simplicity, we'll assume that n is a power of 2, so that all the divisions produce integers. In an actual algorithms context, the running times for other inputs would lie somewhere between the values for the nearest powers of two. So this is a good practical simplification.
First, we substitute the recursive equation into itself, twice. So we have three applications of the recursive definition.
\(t(n) = 2t(n/2) + dn \)
\( = 2(2t(n/4) + dn/2) + dn\)
\( = 2(2(2t(n/8) + dn/4) + dn/2) + dn\)
Now clean up the equation so we can see what's going on:
\(t(n) = 2(2(2t(n/8) + dn/4) + dn/2) + dn\)
\( = 2^3 t(n/2^3) + dn + dn + dn\)
\( = 2^3 t(n/2^3) + 3dn\)
Now let's generalize the pattern from 3 applications to k applications:
\(t(n) = 2^3 t(n/2^3) + 3dn\)
\( = 2^k t(n/2^k) + kdn\)
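We can spot-check this generalized pattern numerically. Here's a quick sketch (not part of the original derivation) that evaluates the recurrence directly and confirms that after k substitutions we get \(2^k t(n/2^k) + kdn\); the constants c and d are arbitrary values chosen just for the test.

```python
# Check the unrolled pattern t(n) = 2^k * t(n/2^k) + k*d*n against
# the recurrence t(n) = 2*t(n/2) + d*n with base case t(4) = c.
c, d = 5.0, 3.0  # arbitrary constants for illustration

def t(n):
    # Direct recursive evaluation of the definition (n a power of 2, n >= 4).
    return c if n == 4 else 2 * t(n // 2) + d * n

n = 1024
# n / 2^k must stay at least 4, so k can range up to log2(n) - 2 = 8.
for k in range(1, 9):
    assert abs(t(n) - (2**k * t(n // 2**k) + k * d * n)) < 1e-6
```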
Now we need to figure out when we'll hit the base case. The base case of our definition is t(4) = c. The expression in our unrolling is \(t(n/2^k)\). Setting the two inputs equal gives us \(n/2^k = 4\).
To solve this for k, first put the powers of 2 all on the same side: \(n = 2^{k+2}\). Then take the log (base 2) of both sides: \(k+2 = \log n\). So \(k = \log n -2\). Substituting this into our equation gives us
\(t(n) = 2^k t(n/2^k) + kdn\)
\( = 2^{\log n - 2} t(4) + (\log n - 2) d n\)
Substitute the value of t(4) and simplify the term with the log in the exponent. Remember that \(2^{\log n}\) is just n.
\(t(n) = 2^{\log n - 2} t(4) + (\log n - 2) d n\)
\( = 1/4 (2^{\log n}) c + dn \log n - 2 d n\)
\( = 1/4 n c + dn \log n - 2 d n\)
\( = (c/4 -2d)n+ dn \log n \)
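As a final sanity check, we can compare this closed form against the recursive definition for several powers of 2. This is a small sketch, not part of the notes; c and d are again arbitrary constants.

```python
# Compare the recurrence t(n) = 2*t(n/2) + d*n, t(4) = c, against the
# derived closed form t(n) = (c/4 - 2d)*n + d*n*log2(n).
from math import log2

c, d = 5.0, 3.0  # arbitrary constants for illustration

def t_recursive(n):
    if n == 4:
        return c
    return 2 * t_recursive(n // 2) + d * n

def t_closed(n):
    return (c / 4 - 2 * d) * n + d * n * log2(n)

for n in [4, 8, 16, 32, 1024]:
    assert abs(t_recursive(n) - t_closed(n)) < 1e-6
```

For example, with c = 5 and d = 3, both formulas give t(8) = 2(5) + 3(8) = 34.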