Prove f(n) is always O(f(n-1)) [closed]

Tag: math , big-o , asymptotic-complexity Author: walgd Date: 2013-09-15

Assume that f(n) goes to infinity as n goes to infinity.

This is a homework problem and I would appreciate an idea/guidance instead of the complete answer.

Best Answer

This isn't true. Consider the function f(n) = n! as a counterexample, which definitely goes toward infinity as n goes to infinity. We can prove, though, that n! ∉ O((n - 1)!).

The proof is by contradiction. Suppose that n! = O((n - 1)!). Then there exists some n0 and c such that for any n ≥ n0, we have n! ≤ c(n - 1)!. This means that for any n ≥ n0, we have that n! / (n - 1)! ≤ c, or that n ≤ c. But if we pick n = max{n0, c} + 1, then we know that n ≥ n0 and that n ≥ c + 1, contradicting that n ≤ c. Since we have a contradiction, the assumption must be wrong and therefore n! ∉ O((n - 1)!).
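Not part of the original answer, but a quick numeric sketch of the key step: the ratio n! / (n - 1)! is exactly n, so it eventually exceeds any fixed constant c you might pick.

```python
import math

# The ratio n! / (n - 1)! equals n, so no fixed constant c can
# satisfy n! <= c * (n - 1)! for all large n -- it fails as soon
# as n exceeds c, which is why n! is not O((n - 1)!).
for n in [5, 10, 100, 1000]:
    ratio = math.factorial(n) // math.factorial(n - 1)
    assert ratio == n  # the ratio grows without bound along with n
    print(n, ratio)
```

For instance, with c = 100 the inequality already breaks at n = 101.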

In case you're wondering how I came up with this: my idea was to find a function that grows so rapidly that no matter what constant you picked, the ratio between f(n + 1) and f(n) would eventually get so large that it would exceed that constant. It just happened to be the case that n! fit the bill. In retrospect, I should have remembered that n! ∉ O((n - 1)!) because many algorithms have runtimes like O((n + 1)!), which doesn't simplify down to O(n!).

Hope this helps!

comments:

Shouldn't it be n ≥ c + 1?
@ArunKumar- Ah, yes, that's right. Fixed!
Thank you so much @templatetypedef.