CSCI 2824 Lecture 21: Growth of Functions

Studying the growth of functions is an important part of the analysis of algorithms. Take a look at the two pieces of code below:

Linear Loop
for (i = 0; i < n; ++i){
    sleep(1); /*-- Sleep for one second --*/
}

For an input of size n, the program takes roughly n seconds to run. Contrast the program above with this one:

Quadratic Loop
for (i = 0; i < n; ++i){
    for (j = 0; j < n; ++j){
        usleep(10000); /*-- Sleep for 10 milliseconds --*/
    }
}

(Note that sleep takes a whole number of seconds, so a 10 ms delay needs usleep, whose argument is in microseconds.) For an input of size n, the program above takes roughly 0.01 n² seconds to run. Comparing the expected running times of the two programs:

n        Linear loop    Quadratic loop
10       10 s           1 s
100      100 s          100 s
1000     1000 s         10,000 s
As n increases, the quadratic loop becomes slower and slower compared to the linear loop. Let us now increase the delay in each linear loop iteration and decrease the delay in each quadratic loop iteration to one millisecond (0.001 s). How does the table above change?

Answer
If the new linear delay is c seconds per iteration, the linear loop's running time for an input of size n is roughly c·n seconds, while the quadratic loop's is 0.001·n² seconds. The quadratic loop is now faster for small n, but for n > 1000·c the linear loop will once again be faster than the quadratic one, and it remains faster for all larger values of n.

Running Time Complexity of Algorithms

There are two basic rules we follow when comparing or analyzing the running times of algorithms: we ignore constant factors, and we focus on how the running time grows as the input size becomes large.
In terms of trends, we say that the linear loop in the example above has a running time complexity of O(n) (pronounced "Big-Oh of n" or "order n"), and the quadratic loop above has a complexity of O(n²). Informally, O(n) can refer to running times that could in reality be any number of functions, including:

  2n + 5
  10n + log₂(n) + 2
  0.01n + 25
  n + √n
All we care about in these functions is that, as n grows larger and larger beyond some large number n₀, the dominant term in each function is n, with a constant factor in front of it. The remaining terms matter less and less as n increases, and are eventually dwarfed by the dominant term. Let us now define the Big-Oh notation.

Big-Oh Notation
Given functions f(n) and g(n), we write f(n) ∈ O(g(n)) if and only if there exist a constant c > 0 and a number n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. Note that O(g(n)) denotes a set: the set of all functions f that satisfy the definition above. In other words, asymptotically as n → ∞, g grows at least as fast as f (if not faster), up to the constant factor c. E.g., the function 3n + 2 is in the set O(n): choosing c = 4 and n₀ = 2, beyond n ≥ 2 we see that 3n + 2 ≤ 4n.

Questions for class
Let us now answer these questions. Answers
Example 2

Is f(n) ∈ O(g(n))?

Answer

This is an interesting question. Can we find a constant c such that f(n) ≤ c·g(n) for all n ≥ n₀? To understand, let us call the ratio of the two functions h(n) = f(n)/g(n). We are asking whether h(n) ≤ c for all large enough n. Here h(n) grows without bound, so no constant c can work: the answer is no.

Is g(n) ∈ O(f(n))?

Answer

Once again, let us call the ratio h(n) = g(n)/f(n). We are now asking whether h(n) is bounded above by a constant, and the answer is yes.

What about the remaining pair of functions?

Answer

Again, let us call the ratio of the two functions h(n). We are asking whether h(n) is bounded above by a constant, and here, too, the answer is yes.

Big-Omega and Big-Theta

Just like Big-Oh, we can define Big-Omega to denote lower bounds.

Big-Omega
We say that f(n) ∈ Ω(g(n)) if and only if there exist a number n₀ and a constant c > 0 such that f(n) ≥ c·g(n) for all n ≥ n₀. In other words, informally, we say that the function f is lower bounded asymptotically by g (up to a constant factor).

Examples

1. 3n + 2 ∈ Ω(n): we can choose c = 3 and n₀ = 1, and we conclude (trivially) that 3n + 2 ≥ 3n for all n ≥ 1.

2. n² ∈ Ω(n): we can choose c = 1 and n₀ = 1, and we conclude that n² ≥ n for all n ≥ 1.

3. What are all the functions in Ω(n)? Give a succinct description. Any function whose growth is at least linear (up to a constant factor). Examples include n, n log n, n², and 2ⁿ.

4. Consider the function f(n) that equals n² when n is odd and n when n is even. Is it in O(n²)? Is it in Ω(n)? Since we are concerned about the asymptotics of the function, for the upper bound we should focus on the case when n is odd. Therefore, f(n) ∈ O(n²). But for the lower bound, we should look at the case when n is even. Therefore f(n) ∈ Ω(n).

We say that f(n) ∈ Θ(g(n)) if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).

Note
It is common to write f(n) = O(g(n)), or often f = O(g), instead of f(n) ∈ O(g(n)). But the latter is more sound, since strictly speaking O(g(n)) is not a function but a set of functions that are all upper bounded by g.