
Notes for Lecture 5 - Friday September 17th, 2004

Parlor Trick

Lynn gives Steve Shannon an incredibly large, thick dictionary and says that in 11 guesses or fewer she will pinpoint the exact page, provided Steve tells her whether each guess is high or low. Lynn proceeds to use binary search (with some help from the class) and makes it in exactly 11. "Today," says Lynn, "we are going to talk about the amazing power of logarithms."
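The trick can be sketched in a few lines of Python (mine, not the lecture's): each high/low answer halves the remaining range of pages, so a book of up to 2^11 = 2048 pages falls in at most 11 guesses.

```python
def guesses_needed(pages):
    """Worst-case number of high/low guesses to pin down one page
    out of `pages` candidates, i.e. ceil(log2(pages))."""
    guesses = 0
    remaining = pages
    while remaining > 1:
        remaining = (remaining + 1) // 2  # worst half left after one guess
        guesses += 1
    return guesses

# A 2048-page dictionary needs exactly 11 guesses; one more page needs 12.
```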

Big Idea: Logarithms

Logarithms turn up again and again in computer science, and they extend far beyond trees and searches: think about how integers are stored in the computer. 10110 is only 5 bits long, but 5 bits can represent any of 2^5 = 32 distinct values. See where else you can apply them.

Quick Review of Arboreal Mathematics

            O                  A binary tree of depth 3
          /   \                15 total nodes (nodes = 2^(depth+1) - 1)
        /       \              8 leaves (2^depth)
     O              O
    /  \          /  \
   /    \        /    \
  O      O      O      O
 / \    / \    / \    / \
O   O  O   O  O   O  O   O

Last time we spoke about node datums, left and right children, list representations of trees, and caddaddadadaaaddadddaddadaddr (shorthand for composing multiple car and cdr calls in Scheme, e.g. cadr = (car (cdr ...))).

In Scheme the node datum is the car of the node, the left child is the cadr, and the right child is the caddr.
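In Python terms (a rough analogue of the Scheme list representation, not code from the lecture), a node is a three-element list and car/cadr/caddr become indices 0, 1, and 2:

```python
# A node is [datum, left, right]; an empty child is None.
def datum(node): return node[0]   # Scheme: (car node)
def left(node):  return node[1]   # Scheme: (cadr node)
def right(node): return node[2]   # Scheme: (caddr node)

# A tiny tree: 1 at the root, with children 2 and 3.
tree = [1, [2, None, None], [3, None, None]]
```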

Representing a tree in an array

For each node at location [i], place node.left at [2i+1] and node.right at [2i+2]. For a tree of depth 2, it looks something like...

Datum               Located At
root                [0]
root.left           [1]
root.right          [2]
root.left.left      [3]
root.left.right     [4]
root.right.left     [5]
root.right.right    [6]

      0
    /   \
  1       2          A pictorial representation of the above array -
 / \     / \         numbers correspond to positions in the array
3   4   5   6
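The index arithmetic can be packaged as little helpers (a sketch; the names are mine):

```python
# Navigating a binary tree stored in a flat array, as described above.
def left(i):   return 2 * i + 1     # left child of the node at index i
def right(i):  return 2 * i + 2     # right child of the node at index i
def parent(i): return (i - 1) // 2  # parent of any non-root node
```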

Binary trees can be used in a manner that reduces computation (n -> log n) or in one that increases it (n -> 2n). NOTE: an unbalanced tree is a very wasteful data structure.

For additional information concerning trees, check out the Wikipedia Binary Tree entry.

New Sorts

Refresh your memory about Bubble, Insertion, and Selection sorts at the bottom of NotesSeptember14, or check out the applets here (tiny) or here (more detail).

Why Are These Sorts Cool?

Each node, when given a problem, whaps the problem into two parts and hands it down to its children. And they pass it on to their children, and their children's children, and yea, all the children of the tree, which are as numerous as the stars in the sky, do break down the data of their ancestors. And verily, saith the Lynn, I shalt cause the answers to traverse back up the tree, each parent taking the burdens of its children and passing them upwards to its parents, and so is the tree sorted. Thus spake the Lynn. And so shall the depth of the tree be log(n), where n is the number of elements in the tree, and with n work done across each of the log(n) levels, the Big-O shall be n log(n).

Binary Tree Sort

The tree is arranged such that, for each node, everything in its left subtree is lower than it and everything in its right subtree is higher than it. For instance,

      7
    /   \
  3       9          binary tree sort - numbers are actual data in the tree
 / \     / \
1   5   8   12
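A minimal binary tree sort sketch in Python (names and details are mine, not the lecture's): insert each element so that smaller datums go left and larger go right, then read the tree back in left-datum-right order.

```python
# Nodes are [datum, left, right], as in the list representation above.
def insert(node, x):
    if node is None:
        return [x, None, None]
    if x < node[0]:
        node[1] = insert(node[1], x)   # smaller: go left
    else:
        node[2] = insert(node[2], x)   # larger or equal: go right
    return node

def inorder(node):
    """Left subtree, then datum, then right subtree: sorted order."""
    if node is None:
        return []
    return inorder(node[1]) + [node[0]] + inorder(node[2])

def tree_sort(items):
    root = None
    for x in items:
        root = insert(root, x)
    return inorder(root)
```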

Heap Sort

The tree is arranged such that the smallest element is on top. New elements are unceremoniously tacked onto the end of the array, then bubbled up along the path to the root to restore the heap property that every element x satisfies x >= parent(x). Since that path has at most log n nodes, this verify-fix shebang always takes O(log n) steps, where n is the size of the current heap. Not bad.
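The insert step can be sketched as follows (a rough Python rendering of the idea, using the array layout from earlier): append, then swap upward while the new element is smaller than its parent.

```python
def heap_insert(heap, x):
    """Insert x into a min-heap stored in a flat array: tack it onto the
    end, then bubble it up toward the root. At most one swap per level,
    hence O(log n) steps."""
    heap.append(x)
    i = len(heap) - 1
    while i > 0 and heap[i] < heap[(i - 1) // 2]:
        heap[i], heap[(i - 1) // 2] = heap[(i - 1) // 2], heap[i]
        i = (i - 1) // 2
```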

Merge Sort

Start with groups made of individual elements. On each iteration, pair up groups and merge them: compare the smallest remaining element of each group, move the smaller of the two into the new group, leaving the larger one in place, and repeat until both groups are exhausted and the merged group is sorted.

Example: Merge sort the two lists (1, 3, 4, 8) and (2, 5, 6, 7)

Left        Right       Merged so far
1, 3, 4, 8  2, 5, 6, 7  -                     Smallest two are 1 and 2, and 1 is smaller than 2.
3, 4, 8     2, 5, 6, 7  1                     Smallest two are 3 and 2, and 2 is smaller than 3.
3, 4, 8     5, 6, 7     1, 2                  Smallest two are 3 and 5, and 3 is smaller than 5.
4, 8        5, 6, 7     1, 2, 3               Smallest two are 4 and 5, and 4 is smaller than 5.
8           5, 6, 7     1, 2, 3, 4            Smallest two are 8 and 5, and 5 is smaller than 8.
8           6, 7        1, 2, 3, 4, 5         Smallest two are 8 and 6, and 6 is smaller than 8.
8           7           1, 2, 3, 4, 5, 6      Smallest two are 8 and 7, and 7 is smaller than 8.
8           -           1, 2, 3, 4, 5, 6, 7   Smallest two are 8 and - whoops, just 8. Done.
-           -           1, 2, 3, 4, 5, 6, 7, 8
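The merge step traced above can be sketched in Python (mine, not from the lecture):

```python
def merge(a, b):
    """Merge two already-sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:          # compare the two smallest remaining...
            out.append(a[i]); i += 1
        else:                     # ...and move the smaller one over
            out.append(b[j]); j += 1
    out.extend(a[i:])             # whatever is left is already sorted
    out.extend(b[j:])
    return out
```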

Sort Running-Time Comparisons So Far

Sort          Running Time
Bubble        n^2
Insertion     n^2
Selection     n^2
Binary tree   n log(n)  (n^2 if the tree goes unbalanced)
Heap          n log(n)
Merge         n log(n)

The Origin Of N Log N aka the Information Theoretic Lower Bound

In an array (tree, heap, whatever) with n elements, there are n! different orders to arrange those elements in (since there are n possible first elements, then n-1 possible second ones, then n-2, and so forth, and (n)(n-1)(n-2)...(1) = n!). A comparison-based sort has to distinguish among all n! possible orderings, so it needs at least log2(n!) comparisons - and log(n!) is approximately n log(n). Someone else can explain this more coherently than me, right?

Let's see: each comparison has only two outcomes, so k comparisons can tell apart at most 2^k orderings. To distinguish all n! orderings we need 2^k >= n!, i.e. k >= log2(n!). And log(n!) = log(n) + log(n-1) + ... + log(1), which is at most n log(n), while the first n/2 terms alone contribute at least (n/2) log(n/2) - so log(n!) really does grow like n log(n).

Any better?
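A quick numeric sanity check (mine, not the notes'): compute log2(n!) as a sum of logs and compare it against n log2(n).

```python
import math

def log2_factorial(n):
    """log2(n!) computed as log2(2) + log2(3) + ... + log2(n)."""
    return sum(math.log2(k) for k in range(2, n + 1))

# For large n, log2(n!) is a constant fraction of n*log2(n),
# confirming the n log(n) lower bound's order of growth.
```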

Thanks to the spiffiness that is Katie Rivard, we also have This Massive Index Page Of Sorts to check out. Code is even included.



Received Chapter 11 (Sorting) from Data Structures and Their Algorithms by Lewis & Denenberg, which describes different sorting algorithms and their speeds.

What's Next

We had hoped to get to the Paul Graham essay on why Lisp rocks, but that'll wait yet again until the ever-ephemeral "next time".


...will absolutely positively be up Tuesday, and will not be due until the Tuesday after that.
