## Big-O, Little-O, Theta, Omega

Big-O, little-o, Omega, and Theta are formal notations for stating the growth of the resource needs (running time and storage) of an algorithm. There are four basic notations used when describing resource needs. These are: $O(f(n))$, $o(f(n))$, $\Omega(f(n))$, and $\Theta(f(n))$ (pronounced "big-O", "little-o", "Omega", and "Theta" respectively).

### Formally:

"$T(n)$ is $O(f(n))$" iff for some constants $c$ and $n_0$, $T(n) <=c f(n)$ for all $n >= n_0$

"$T(n)$ is $\Omega(f(n))$" iff for some constants $c$ and $n_0$, $T(n)>=cf(n)$ for all $n >= n_0$

"$T(n)$ is $\Theta(f(n))$" iff $T(n)$ is $O(f(n))$ AND $T(n)$ is $\Omega(f(n))$

"$T(n)$ is $o(f(n))$" iff $T(n)$ is $O(f(n))$ AND $T(n)$ is NOT $\Theta(f(n))$

### Informally:

"$T(n)$ is $O(f(n))$" basically means that $f(n)$ describes the upper bound for $T(n)$

"$T(n)$ is $\Omega(f(n))$" basically means that $f(n)$ describes the lower bound for $T(n)$

"$T(n)$ is $\Theta(f(n))$" basically means that $f(n)$ describes the exact bound for $T(n)$

"$T(n)$ is $o(f(n))$" basically means that $f(n)$ is the upper bound for $T(n)$ but that $T(n)$ can never be equal to $f(n)$

#### Another way of saying this:

"$T(n)$ is $O(f(n))$" growth rate of $T(n)$ <= growth rate of $f(n)$

"$T(n)$ is $\Omega(f(n))$" growth rate of $T(n)$ >= growth rate of $f(n)$

"$T(n)$ is $\Theta(f(n))$" growth rate of $T(n)$ == growth rate of $f(n)$

"$T(n)$ is $o(f(n))$" growth rate of $T(n)$ < growth rate of $f(n)$

### An easy way to think about big-O

The math in big-O analysis can often intimidate students. One of the simplest ways to think about big-O analysis is that it is basically a rating system for your algorithms (like movie ratings). It tells you the kind of resource needs you can expect the algorithm to exhibit as your data gets bigger and bigger. From best (least resource requirements) to worst, the rankings are: $O(1)$, $O(\log n)$, $O(n)$, $O(n \log n)$, $O(n^2)$, $O(n^3)$, $O(2^n)$. Think about the graphs in the growth rate section and the way each curve looks. That is the most important thing to understand about algorithm analysis.
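Tabulating each of these classes for a few input sizes makes the "rating system" concrete (the sample sizes 8, 16, 32 are arbitrary picks for illustration):

```python
# Evaluate each complexity class at a few input sizes; the gaps between
# rows explode as n grows, which is what the ranking captures.
import math

classes = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n**2),
    ("O(n^3)",     lambda n: n**3),
    ("O(2^n)",     lambda n: 2**n),
]

for name, g in classes:
    row = "  ".join(f"{g(n):>12.0f}" for n in (8, 16, 32))
    print(f"{name:<11}{row}")
```

Even at $n = 32$ the table spans from 1 step to over four billion, which is why the shape of the curve matters far more than any constant factor.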

### What all this means

Let's take a closer look at the formal definition for big-O analysis:

"$T(n)$ is $O(f(n))$" if for some constants $c$ and $n_0$, $T(n)<=cf(n)$ for all $n >= n_0$

The way to read the above statement is as follows.

- $n$ is the size of the data set.
- $f(n)$ is a function that is calculated using $n$ as the parameter.
- $O(f(n))$ means that the curve described by $f(n)$ is an upper bound for the resource needs of the algorithm.

This means that if we were to draw a graph of the resource needs of a particular algorithm, it would fall under the curve described by $f(n)$. What's more, it doesn't need to be under the exact curve described by $f(n)$. It could be under a constant *scaled* curve for $f(n)$... so instead of having to be under the $n^2$ curve, it can be under the $10n^2$ curve or the $200n^2$ curve. In fact, the scaling factor can be any constant. A constant is simply a number that does not change with $n$: as $n$ gets bigger, the constant stays fixed. The actual value of the constant does not matter, though.

The other portion of the statement, $n \ge n_0$, means that $T(n) \le c f(n)$ does not need to be true for all values of $n$. As long as you can find a value $n_0$ for which $T(n) \le c f(n)$ is true, and it stays true for all $n$ larger than $n_0$, then you have met the criteria for the statement "$T(n)$ is $O(f(n))$".
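The role of $n_0$ can be sketched with a small numeric example (the choice of $T(n) = n^2 + 100n$, $f(n) = n^2$, $c = 2$, and $n_0 = 100$ is mine, picked so the crossover point is visible):

```python
# The bound T(n) <= c*f(n) can fail for small n yet hold past some n0.
# Here T(n) = n^2 + 100n and f(n) = n^2, with c = 2 and n0 = 100.

def T(n):
    return n**2 + 100 * n

def f(n):
    return n**2

c, n0 = 2, 100

print(T(50) <= c * f(50))    # False: 7500 > 5000, before the crossover
print(T(100) <= c * f(100))  # True: 20000 <= 20000, exactly at n0
holds_beyond = all(T(n) <= c * f(n) for n in range(n0, 100_000))
print(holds_beyond)  # True
```

Algebraically, $n^2 + 100n \le 2n^2$ reduces to $100n \le n^2$, i.e. $n \ge 100$, so $n_0 = 100$ is precisely where the inequality starts holding, and $T(n)$ is still $O(n^2)$ despite failing the bound for small inputs.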

In summary, when we are looking at big-O, we are in essence looking for a description of the growth rate of the resource increase. The exact numbers do not actually matter in the end.