Big Oh Notation:
- Definition: A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying $f(n) = O(g(n))$ means that, for sufficiently large n, f(n) is at most some constant multiple of g(n).
Consider an algorithm whose efficiency is
$f(n) = 8.023n^2 + 7n$
At very large values of n, we may write
$f(n) \propto n^2$
If we designate $n^2$ as $g(n)$, we may say that as n grows,
$f(n) \propto g(n)$
- The same fact can also be stated as f(n) is O(g(n)), read as "f(n) is Big-Oh of g(n)".
Mathematically, it means that there exists a positive real constant c such that
$f(n) \le c \cdot g(n)$ at very large values of n.
Consider the following example, where
$f(n) = n^2 + 100n \\ n^2 + 100n \le 2n^2, \; n \ge 100 \\ f(n) \text{ is } O(n^2) \text{ with } g(n) = n^2, \; c = 2$
- Hence, as the value of n keeps increasing, f(n) falls further and further below this constant multiple of g(n).
We may even say that f(n) is bounded by g(n) from above,
i.e., f(n) is asymptotically bounded above by g(n).
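The bound can be checked numerically. Below is a minimal sketch in Python, where f and g are the $f(n) = n^2 + 100n$ and $g(n) = n^2$ from the example above (the specific test values of n are just illustrative):

```python
# Sketch: numerically verify f(n) <= c*g(n) for the worked example,
# where f(n) = n^2 + 100n, g(n) = n^2, c = 2; the bound holds for n >= 100.
def f(n):
    return n**2 + 100*n

def g(n):
    return n**2

c = 2
for n in [100, 1000, 10_000, 1_000_000]:
    assert f(n) <= c * g(n)      # the Big-Oh inequality
    print(n, f(n) / g(n))        # ratio approaches 1, staying below c = 2
```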
Consider an algorithm whose efficiency is
$f(n) = 8.023n^2 + 7n + 2.863n^{2/5} - 4.981n^{5/9} + 2.87n^{8/9}$
- Now, instead of expressing efficiency using this complex mathematical expression, we may simply write:
- Efficiency (complexity): $f(n)$ is $O(n^2)$.
- The simplification of the analysis from f(n) to g(n) is called Big Oh analysis.
- Hence, simplification is an advantage of Big Oh notation.
Omega Notation:
- Definition: f(n) is said to be Ω(g(n)) if and only if there exists a positive real constant c such that $f(n) \ge c \cdot g(n)$ for infinitely many values of n. The notation describes asymptotic lower bounds.
- The Big Omega notation always specifies a lower bound on the amount of resources required. Hence, if f(n) is Ω(g(n)), then mathematically there exists a positive real constant c such that $f(n) \ge c \cdot g(n)$ at very large values of n.
Consider an algorithm whose efficiency is
$f(n) = 3n^2 - 100n$
Because $3n^2 - 100n \ge 2n^2$ for $n \ge 100$, f(n) is $Ω(n^2)$.
- Hence the amount of resources required will never be less than $2n^2$ for sufficiently large n.
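Analogously, a minimal numeric sketch of this Ω bound (same example, with $f(n) = 3n^2 - 100n$, $g(n) = n^2$, $c = 2$):

```python
# Sketch: verify f(n) >= c*g(n) for f(n) = 3n^2 - 100n, g(n) = n^2, c = 2,
# which holds once n >= 100.
def f(n):
    return 3 * n**2 - 100 * n

c = 2
for n in [100, 1000, 10_000, 1_000_000]:
    assert f(n) >= c * n**2      # the Big-Omega inequality
    print(n, f(n) / n**2)        # ratio approaches 3, staying above c = 2
```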
Problems:
1) $T(n) = 4T(n/2) + n^2$

Substituting $n \to n/2$ and multiplying by 4:
$4T(n/2) = 4^2 T(n/2^2) + 4(n^2/4)$

Substituting $n \to n/2^2$ and multiplying by $4^2$:
$4^2 T(n/2^2) = 4^3 T(n/2^3) + 4^2(n^2/4^2)$

In general, at the k-th step:
$4^{k-1} T(n/2^{k-1}) = 4^k T(n/2^k) + n^2$

Adding these equations up to $k = \log n$ (i.e., $n = 2^k$), the intermediate terms telescope and each of the $\log n$ steps contributes one $n^2$:
$T(n) = 4^{\log n}\, T(1) + n^2 \log n = c\, n^2 + n^2 \log n$
Hence $T(n) = O(n^2 \log n)$.
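This closed form can be sanity-checked by evaluating the recurrence directly. A minimal sketch, assuming the base case $T(1) = 1$ and restricting n to powers of two:

```python
import math

# Sketch: T(n) = 4T(n/2) + n^2 with assumed base case T(1) = 1,
# evaluated for powers of two; the closed form is T(n) = n^2 + n^2*log2(n).
def T(n):
    if n == 1:
        return 1
    return 4 * T(n // 2) + n * n

for k in range(1, 11):
    n = 2**k
    closed = n * n + n * n * math.log2(n)  # c*n^2 + n^2*log n with c = T(1) = 1
    print(n, T(n), closed)                 # the recurrence and closed form coincide
```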
2) $T(n) = 2T(n/2) + n^3$
Let the base-two representation of n be
$n = \sum_{k=0}^{\log n} d_k 2^k$
Then we can unroll the recurrence to obtain the following exact formula for $n \ge 1$:
$T(n) = \sum_{j=0}^{\log n} 2^j \left( \sum_{k=j}^{\log n} d_k 2^{k-j} \right)^3$
Now, to get an upper bound, consider a binary string of all ones, which yields
$T(n) \le \sum_{j=0}^{\log n} 2^j \left( \sum_{k=j}^{\log n} 2^{k-j} \right)^3$
Note that this bound is attained and cannot be improved. It simplifies to
$\frac{32}{3} 8^{\log n} - 24 \cdot 4^{\log n} + \left( 6 \log n + \frac{40}{3} \right) 2^{\log n} + 1$
The lower bound is for the case of a one digit followed by a string of zeros, so the inner sum reduces to its single leading term, which yields
$T(n) \ge \sum_{j=0}^{\log n} 2^j \left( 2^{\log n - j} \right)^3$
It simplifies to
$\frac{4}{3} 8^{\log n} - \frac{1}{3} 2^{\log n}$
Joining the dominant terms of the upper and lower bounds, we obtain the asymptotics
$T(n) \in \Theta(8^{\log n}) = \Theta(2^{3 \log n}) = \Theta(n^3)$
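As a final sanity check, a short sketch evaluating the recurrence directly (assuming a base case $T(0) = 0$ and using floor division for odd n) shows the ratio $T(n)/n^3$ staying bounded, consistent with the Θ(n^3) result:

```python
# Sketch: T(n) = 2*T(floor(n/2)) + n^3, base case T(0) = 0 assumed.
def T(n):
    if n == 0:
        return 0
    return 2 * T(n // 2) + n**3

for n in [10, 100, 1000, 10_000, 100_000]:
    print(n, T(n) / n**3)   # ratio stays bounded near 4/3: Theta(n^3)
```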