== Short version ==
The main piece of math (written in its ugliest form here thanks to the wiki's lack of math markup -- see SideChatter) is:
f(n) is O(g(n)) iff \exists c, n_0_ s.t. \forall n > n_0_, f(n) <= c * g(n)
In other words, ignoring small values of n, g(n) times a fixed constant is always at least as large as f(n).
For example, f(n) = 2n + 100 is O(n) because \forall n > 100, f(n) <= 3*n (take c = 3 and n_0_ = 100).
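To see the definition in action, here is a minimal Python sketch (ours, not from this page -- the function name check_big_oh and the sampling cutoff are made up for illustration) that spot-checks a claimed witness pair (c, n_0_) over a finite range. This is empirical evidence only, not a proof, since the definition quantifies over every n > n_0_:

    def check_big_oh(f, g, c, n0, n_max=10**4):
        """Spot-check f(n) <= c * g(n) for all sampled n > n0.

        Only evidence over a finite range, not a proof: the real
        definition requires the inequality for every n > n0.
        """
        return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max))

    # The example above: f(n) = 2n + 100 is O(n) with c = 3, n0 = 100.
    print(check_big_oh(lambda n: 2*n + 100, lambda n: n, c=3, n0=100))  # True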
\Omega is a bound from below:
f(n) is \Omega(g(n)) iff \exists c, n_0_ s.t. \forall n > n_0_, c * g(n) <= f(n)
\Theta involves a bound from below and a bound from above:
f(n) is \Theta(g(n)) iff \exists c_1_, c_2_, n_0_ s.t. \forall n > n_0_, c_1_ * g(n) <= f(n) <= c_2_ * g(n)
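The same kind of spot-check works for \Theta; note that the left inequality alone is exactly the \Omega condition. Again a sketch of ours, with witnesses c_1_ = 2, c_2_ = 3, n_0_ = 100 chosen to fit the running example:

    def check_theta(f, g, c1, c2, n0, n_max=10**4):
        """Spot-check c1 * g(n) <= f(n) <= c2 * g(n) for sampled n > n0.

        Dropping the upper bound gives the Omega check; dropping the
        lower bound gives the big-Oh check. Empirical only, not a proof.
        """
        return all(c1 * g(n) <= f(n) <= c2 * g(n)
                   for n in range(n0 + 1, n_max))

    # f(n) = 2n + 100 is Theta(n): 2n <= 2n + 100 <= 3n once n > 100.
    print(check_theta(lambda n: 2*n + 100, lambda n: n, c1=2, c2=3, n0=100))  # True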
== Where to Look (Books) ==
There is a section on Orders of Growth (sometimes called big-Oh notation, though we're actually using \Theta as much as big-Oh, and sometimes called Asymptotic Analysis of Algorithms) in virtually every Algorithms or Theory of Computer Science book.
- Your Theory Textbook
- Your Algorithms Textbook
- Cormen et al., Introduction to Algorithms, 2e: chapter 3 (see CCL or Olin library)