COM1201 Winter 2000 Midterm Exam Information

What you need to know and be able to do on the Midterm

Created 2/12/2000; revised 2/15/2000

This is a closed book, no notes, no calculators exam.

The material I describe below is quite narrowly targeted to specific topics in the textbook, with specific sections mentioned. However, any additional reading you do will help you raise the quality of your answers, answer extra-credit questions, and earn a higher score, and most importantly, it will help you learn more(!)

All topics on Quiz #1, plus the material through Monday, February 14th, can be on the Midterm.

There are two topics that will definitely be included: (1) estimating complexity from numerical values and (2) the insert and remove maximum operations in heaps using the tree representation.

(1) Estimating the complexity of an algorithm from measured run times:

This can best be done by comparing the ratio of the complexity measures to the ratio of the run times for two or more values of N.

Consider two different algorithms run for various values of N:

For the first algorithm, we'll assume that it is run for N=1024 and N=2048, giving times t(1024) = 10 msec and t(2048) = 21 msec. The ratio of these times is approximately 2:1. Consider the complexity O(N^2). The ratio of 2048^2 to 1024^2 is (2048/1024)^2 = 2^2 = 4. So the 2:1 ratio of the times and the 4:1 ratio of the complexities are in serious disagreement. On the other hand, consider the complexity O(N). The ratio here is 2048/1024, exactly 2:1. So complexity O(N) matches the observed times well.

The situation is more subtle for O(N lg N), which for modest changes in N looks essentially linear, O(N). But larger N ratios are adequate to show the difference. So consider another algorithm run for N = 128 = 2^7 and N = 4096 = 2^12. Assume that the run times are t(128) = 10 msec and t(4096) = 556 msec. The ratio of the N's is 32 and the ratio of the times is about 56, and these numbers are quite different. So let's try the ratio of the complexities N lg N. This gives 4096(lg 4096) / (128(lg 128)) = 32 * 12 / 7, which is about 55, quite close to 56. So we can reasonably conclude that this second algorithm is O(N lg N).

We must be careful in all this to look at large values of N because the true complexity may not make itself evident for small values of N. Bottom line: Be prepared on the Midterm to do some estimation similar to the calculations above.
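
For reference, here is a minimal C++ sketch of this ratio test, using the timing numbers from the second example above; the function names are illustrative and not from the textbook.

    #include <cmath>
    #include <cstdio>

    // Ratio of a candidate complexity function f evaluated at n2 and n1.
    double ratio(double (*f)(double), double n1, double n2) {
        return f(n2) / f(n1);
    }

    double linear(double n)    { return n; }
    double nlogn(double n)     { return n * std::log2(n); }
    double quadratic(double n) { return n * n; }

    int main() {
        // Measured times from the second example: t(128) = 10 msec, t(4096) = 556 msec.
        double timeRatio = 556.0 / 10.0;                                         // about 56
        std::printf("observed time ratio: %.1f\n", timeRatio);
        std::printf("O(N)      predicts:  %.1f\n", ratio(linear,    128, 4096)); // 32
        std::printf("O(N lg N) predicts:  %.1f\n", ratio(nlogn,     128, 4096)); // about 55
        std::printf("O(N^2)    predicts:  %.1f\n", ratio(quadratic, 128, 4096)); // 1024
        // The candidate whose predicted ratio is closest to the observed time
        // ratio -- here O(N lg N) -- is the best estimate of the complexity.
        return 0;
    }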

(2) Insert and remove maximum operations for heaps and the tree manipulations.

Excerpted from the Syllabus for Monday the 14th: "Reading due today: Skim through Sec. 9.1. Carefully read Secs. 9.2, 9.3 and 9.4. You should understand how to do the operations fixUp and fixDown and be able to do them on paper for the Midterm, how the tree changes as insert and remove maximum operations are done."

I will give you a tree to start with and you will be asked to insert two elements and restructure the tree each time to yield a correctly formed tree. The values at the nodes will be integers, rather than letters, to make the task easier. This operation involves placing the new element in the first open position, at the "end" of the tree, and then using the fixUp operation to readjust the tree.
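
Here is a minimal C++ sketch of insert and fixUp, assuming the usual 1-indexed array storage for the heap (parent of node k at k/2, children at 2k and 2k+1); it is an illustration of the idea, not necessarily the textbook's exact code.

    #include <utility>
    #include <vector>

    // Max-heap stored in a 1-indexed array: a[1] is the root, the parent of
    // a[k] is a[k/2], and the children of a[k] are a[2k] and a[2k+1].
    // The vector holds a dummy value at index 0 so the indexing works out.

    // fixUp: move the element at position k up until its parent is no smaller.
    void fixUp(std::vector<int>& a, int k) {
        while (k > 1 && a[k / 2] < a[k]) {
            std::swap(a[k / 2], a[k]);
            k /= 2;
        }
    }

    // insert: place the new element in the first open position at the "end"
    // of the tree, then restore heap order along the path back to the root.
    void insert(std::vector<int>& a, int v) {
        a.push_back(v);
        fixUp(a, static_cast<int>(a.size()) - 1);
    }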

Starting with the *original* tree, you will be asked to apply the remove maximum operation twice, again adjusting the tree so that it ends up with the correct structure, reached by the correct sequence of operations. In this case the top element is removed, the last element is moved to the empty top position, and fixDown is used to move that element down until the tree structure is again correct.
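
And a companion sketch of remove maximum and fixDown, using the same 1-indexed layout as the insert example above (again an illustration, not the textbook's exact code).

    #include <utility>
    #include <vector>

    // fixDown: move the element at position k down, always swapping with the
    // larger child, until both children are no larger (or the bottom is reached).
    void fixDown(std::vector<int>& a, int k) {
        int n = static_cast<int>(a.size()) - 1;    // index of the last element
        while (2 * k <= n) {
            int j = 2 * k;                         // left child
            if (j < n && a[j] < a[j + 1]) j++;     // pick the larger child
            if (a[k] >= a[j]) break;               // heap order is restored
            std::swap(a[k], a[j]);
            k = j;
        }
    }

    // removeMax: take the root, move the last element to the empty top position,
    // shrink the tree by one, and fix the heap order from the top down.
    int removeMax(std::vector<int>& a) {
        int max = a[1];
        a[1] = a.back();
        a.pop_back();
        fixDown(a, 1);
        return max;
    }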

I will be going over all this in class on Monday, when we discuss Priority Queues and Heapsort. Be there!

Other material to be covered, with a more detailed reading guide --

Chap. 2, Algorithmic Analysis: Pay special attention to using numerical values to estimate complexity. Know your powers of two, how to approximate numbers with them, and how to estimate ratios. Realize that if N = 2^6 = 64, then N^2 = 2^12 = 4 x 2^10 = 4 x 1024, or about 4000, and other such values and relations. Reading focus through Sec. 2.4 of Chap. 2.

You can do simple problems on your own by assuming a dependence such as C_N = a + bN and then trying to estimate the complexity from the numbers you generate for various values of N.
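
For instance, a small sketch along these lines, with arbitrarily chosen constants a and b:

    #include <cstdio>

    int main() {
        // Hypothetical cost model C_N = a + b*N (the constants are made up).
        double a = 5.0, b = 0.01;
        for (int n = 1024; n <= 16384; n *= 2) {
            double cost = a + b * n;
            std::printf("N = %5d   C_N = %8.2f\n", n, cost);
        }
        // Once b*N dominates the constant a, doubling N roughly doubles C_N,
        // which is how you would recognize O(N) behavior from the numbers.
        return 0;
    }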

Graphs and adjacency lists: Be sure you clearly understand the difference between the links (edges) in a graph, Fig. 3.14, and the links in the adjacency-list representation of the graph's connectivity, Fig. 3.15.
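
As a rough illustration of that distinction (using arrays of vectors rather than the linked lists shown in the figures):

    #include <vector>

    // A minimal adjacency-list representation of an undirected graph:
    // adj[v] holds the vertices adjacent to v. The links of the graph itself
    // are its edges; each edge v-w appears as two list links, w on v's list
    // and v on w's list.
    struct Graph {
        std::vector<std::vector<int>> adj;

        explicit Graph(int vertexCount) : adj(vertexCount) {}

        void addEdge(int v, int w) {
            adj[v].push_back(w);   // link in v's list
            adj[w].push_back(v);   // link in w's list
        }
    };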

Sorting: Be able to do an insertion sort on a small array by hand, showing the array after each additional element is processed as the sort proceeds. I may give you the source code as a reminder, but don't count on it. Sections 6.1 and 6.3 of Chap. 6.
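
For practice, here is a compact insertion sort in C++; the textbook's version may be organized a bit differently, but the hand trace is the same.

    #include <cstddef>
    #include <vector>

    // Insertion sort: each new element is moved left past larger elements
    // until it is in place, so a[0..i] is sorted after the i-th step.
    void insertionSort(std::vector<int>& a) {
        for (std::size_t i = 1; i < a.size(); ++i) {
            int v = a[i];                 // the element being inserted
            std::size_t j = i;
            while (j > 0 && a[j - 1] > v) {
                a[j] = a[j - 1];          // shift larger elements right
                --j;
            }
            a[j] = v;
            // On paper, write out the whole array here, once per element.
        }
    }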

Also, be able to write out what a stable sort looks like for a set of records I give you, similar to Fig. 6.1, and what a non-stable sort would look like.
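
A small self-made illustration of stability (these are not the records from Fig. 6.1): after a stable sort by key, records with equal keys keep their original relative order.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    struct Record { int key; char tag; };   // tag remembers the original order

    int main() {
        std::vector<Record> r = {{3,'a'}, {1,'b'}, {3,'c'}, {1,'d'}, {2,'e'}};
        std::stable_sort(r.begin(), r.end(),
                         [](const Record& x, const Record& y) { return x.key < y.key; });
        for (const Record& rec : r)
            std::printf("%d%c ", rec.key, rec.tag);   // prints: 1b 1d 2e 3a 3c
        std::printf("\n");
        // A non-stable sort is allowed to print, say, 1d 1b 2e 3c 3a instead.
        return 0;
    }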

Quicksort: Be able to do a partition of an array by hand, showing which exchanges are made and showing that the result is properly partitioned and that one element is in its final position. Reading, Sec. 7.1 of Chap. 7.
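
Here is a partition sketch in the style of the textbook's quicksort, with the rightmost element a[r] as the pivot; treat it as a guide for the hand trace (details such as pivot choice can vary).

    #include <utility>
    #include <vector>

    // Partition a[l..r]: scan from both ends, exchanging out-of-place pairs,
    // then put the pivot into its final position and return that position.
    int partition(std::vector<int>& a, int l, int r) {
        int v = a[r];                                  // pivot value
        int i = l - 1, j = r;
        for (;;) {
            while (a[++i] < v) {}                      // element >= pivot from the left
            while (v < a[--j]) { if (j == l) break; }  // element <= pivot from the right
            if (i >= j) break;
            std::swap(a[i], a[j]);                     // exchange the out-of-place pair
        }
        std::swap(a[i], a[r]);                         // pivot reaches its final position
        return i;                                      // a[i] will never move again
    }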