
The main piece of math (written in its ugliest form here thanks to the wiki's lack of math -- see SideChatter) is:

f(n) = O(g(n)) iff there exist constants c > 0 and n0 such that for all n > n0, f(n) <= c*g(n)

In other words, ignoring small values of n, g(n) times a fixed constant is always at least as large as f(n).

For example, f(n) = 2n + 100 is O(n) because \forall n > 100, f(n) <= 3*n (take c = 3 and n0 = 100).
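The example above can be spot-checked numerically. This is a finite check over a range of n, not a proof, and the helper name is ours, but it shows the constants c = 3 and n0 = 100 doing their job:

```python
def f(n):
    # The example function from the text: f(n) = 2n + 100.
    return 2 * n + 100

c, n0 = 3, 100

# Check f(n) <= c*n for every n in a finite window above n0.
# A single counterexample would falsify the claimed constants.
assert all(f(n) <= c * n for n in range(n0 + 1, 100_000))
print("f(n) <= 3n holds for all checked n > 100")
```

Note that the bound fails for small n (e.g. f(1) = 102 > 3), which is exactly why the definition only demands the inequality beyond some threshold n0.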

\Omega is a bound below:

f(n) = \Omega(g(n)) iff there exist constants c > 0 and n0 such that for all n > n0, f(n) >= c*g(n)

\Theta involves a bound below and a bound above:

f(n) = \Theta(g(n)) iff f(n) is both O(g(n)) and \Omega(g(n)); that is, c1*g(n) <= f(n) <= c2*g(n) for some constants c1, c2 > 0 and all sufficiently large n
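Continuing with the same f(n) = 2n + 100 from above, we can spot-check that it is \Theta(n) by sandwiching it between two linear functions. Again this is a finite numerical check under assumed constants (c1 = 2, c2 = 3, n0 = 100), not a proof:

```python
def f(n):
    # Same example function: f(n) = 2n + 100.
    return 2 * n + 100

c1, c2, n0 = 2, 3, 100

# Lower bound: 2n <= 2n + 100 holds for every n.
# Upper bound: 2n + 100 <= 3n holds once n >= 100.
assert all(c1 * n <= f(n) <= c2 * n for n in range(n0 + 1, 100_000))
print("2n <= f(n) <= 3n holds for all checked n > 100")
```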

There is a section on Orders of Growth (sometimes called big-Oh notation, though we're actually using \Theta as much as big-Oh; sometimes called Asymptotic Analysis of Algorithms) in virtually every Algorithms or Theory of Computer Science book.


2013-07-17 10:43