Algorithm Efficiency - 2
Powers
To find the time complexity of an algorithm, look at its loops:
count = 0
for i = 1 to x do
    for j = 1 to y do
        for k = 1 to z do
            count = count + 1
For this loop the time complexity is $O(xyz)$: each loop bound contributes one multiplicative factor to the total number of times the innermost statement runs.
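As a quick check, here is a minimal Python sketch of the same triple loop (the bounds x, y and z are example values chosen purely for illustration); the final count equals x * y * z, matching the analysis:

x, y, z = 3, 4, 5   # example bounds, not part of the original pseudocode
count = 0
for i in range(1, x + 1):
    for j in range(1, y + 1):
        for k in range(1, z + 1):
            count += 1   # the body runs once per (i, j, k) combination
print(count)        # 60
print(x * y * z)    # 60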
Here is another example:
i = 1
while i <= n do
begin
    j = i
    while j <= n do
        j = j + 1
    i = i + 1
end
The time complexity for this loop is $O(n^2)$. For each value of $i$, the inner loop runs $n - i + 1$ times, so the total number of iterations is
\[\frac{n\times(n+1)}{2}\]
The highest-order term here is $\frac{n^2}{2}$, and dropping the constant factor gives $O(n^2)$.
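A short Python sketch of the same loops (with a counter added purely for illustration, and an example value of n) confirms the sum: after the loops finish, count equals $\frac{n(n+1)}{2}$:

n = 10                    # example value
i = 1
count = 0                 # counts inner-loop iterations
while i <= n:
    j = i
    while j <= n:
        j += 1
        count += 1
    i += 1
print(count)              # 55
print(n * (n + 1) // 2)   # 55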
Logs
Here is another algorithm:
i = 1
count = 0
while i < n do
begin
    i = i * 2
    count = count + 1
end
output count
Here are trace tables for this algorithm with two values of $n$:
Suppose $n=8$:

Iteration     i before    i after    count
Before loop   -           1          0
1             1           2          1
2             2           4          2
3             4           8          3
Suppose $n=32$:

Iteration     i before    i after    count
Before loop   -           1          0
1             1           2          1
2             2           4          2
3             4           8          3
4             8           16         4
5             16          32         5
Comparing $n$ with the number of iterations shows that this algorithm is not linear: quadrupling $n$ from 8 to 32 adds only two iterations.
To find the count produced by a given $n$, we would need to solve:
\[2^{\text{count}}=n\]
that is, $\text{count} = \log_2 n$. Therefore this algorithm has logarithmic time complexity:
\[O(\log n)\]
Don’t presume, when you see a loop, that the time complexity is polynomial.
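A minimal Python version of the doubling loop (the test values of n are chosen for illustration) shows that count matches $\log_2 n$ whenever $n$ is a power of two:

import math

def doubling_count(n):
    # i doubles on every pass, so the loop body runs about log2(n) times
    i = 1
    count = 0
    while i < n:
        i *= 2
        count += 1
    return count

for n in (8, 32, 1024):
    print(n, doubling_count(n), math.log2(n))
# 8 3 3.0
# 32 5 5.0
# 1024 10 10.0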