Notes for Lecture 5, Friday September 17th, 2004
 Scribe = Mel Chua
 Editor = Matt Colyer
Parlor Trick
Lynn gives Steve Shannon an incredibly large, thick dictionary and says that in 11 guesses or fewer she will pinpoint the exact page, as long as Steve tells her whether each guess is high or low. Lynn proceeds to use binary search (with some help from the class) and makes it in exactly 11. "Today," says Lynn, "we are going to talk about the amazing power of logarithms."
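The trick is binary search: every high/low answer cuts the remaining pages in half, so about log2(number of pages) guesses suffice. A minimal sketch in Python (the 2000-page dictionary size is illustrative, not from lecture):

```python
def guess_page(target, num_pages):
    """Binary-search for target among pages 1..num_pages.
    Returns the number of guesses needed."""
    lo, hi = 1, num_pages
    guesses = 0
    while True:
        guesses += 1
        mid = (lo + hi) // 2      # guess the middle page
        if mid == target:
            return guesses
        elif mid < target:        # "too low": search the upper half
            lo = mid + 1
        else:                     # "too high": search the lower half
            hi = mid - 1

# For a 2000-page dictionary, the worst case over every page:
worst = max(guess_page(p, 2000) for p in range(1, 2001))
```

Since log2(2000) is just under 11, the worst case works out to exactly 11 guesses, matching the parlor trick.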
Big Idea: Logarithms
Logarithms can be found again and again in computer science. They extend far beyond trees and searches; think about how integers are stored in the computer. 10110 is only 5 bits long, but 5 bits can store 2^{5} = 32 distinct values. See where else you can apply them.
Quick Review of Arboreal Mathematics
                O                  A binary tree of depth 3
            /       \              15 total nodes (nodes = 2^(depth+1) - 1)
          O           O            8 leaves (2^depth)
        /   \       /   \
       O     O     O     O
      / \   / \   / \   / \
     O   O O   O O   O O   O
Last time we spoke about node datums, left and right children, list representations of trees, and caddaddadadaaaddadddaddadaddr (shorthand for chaining multiple car and cdr calls in Scheme).
In Scheme, the node datum is the car of the node, the left child is the cadr, and the right child is the caddr.
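To make that concrete, here is the same list representation mimicked in Python (car/cdr and the tree below are illustrative stand-ins for the Scheme from lecture, not the course's actual code):

```python
# A node is a three-element list: (datum left-child right-child),
# mimicking the Scheme list representation from lecture.
def car(lst):   return lst[0]
def cdr(lst):   return lst[1:]
def cadr(lst):  return car(cdr(lst))        # second element
def caddr(lst): return car(cdr(cdr(lst)))   # third element

def datum(node): return car(node)    # the node's datum
def left(node):  return cadr(node)   # left child
def right(node): return caddr(node)  # right child

# The tree   6
#           / \
#          3   9
tree = [6, [3, None, None], [9, None, None]]
```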
Representing a tree in an array
For each node at location [i], place node.left at [2i+1] and node.right at [2i+2]. For a tree of depth 2, it looks something like...
Datum located at:   0   1   2
datum.left at:      1   3   5
datum.right at:     2   4   6
A pictorial representation of the above array (numbers correspond to positions in the array):

         0
       /   \
      1     2
     / \   / \
    3   4 5   6
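The index arithmetic above can be sketched directly (pure arithmetic from the 0-based layout in the notes; the helper names are made up):

```python
def left_index(i):   return 2 * i + 1     # left child of the node at [i]
def right_index(i):  return 2 * i + 2     # right child of the node at [i]
def parent_index(i): return (i - 1) // 2  # inverse mapping, back up the tree
```

This matches the table: node 0 has children at 1 and 2, node 1 at 3 and 4, node 2 at 5 and 6.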
Binary trees can be used in a manner that reduces computation (n down to log n) or one that increases it (n up to 2n). NOTE: If a tree is unbalanced, it is a very wasteful data structure.
For additional information concerning trees, check out the Wikipedia Binary Tree entry.
New Sorts
Refresh your memory about Bubble, Insertion, and Selection sorts at the bottom of NotesSeptember14, or check out the applets here (tiny) or here (more detail).
Why Are These Sorts Cool?
Each node, when given a problem, whaps the problem into two parts and hands it down to its children. And they pass it onto their children, and their children's children, and yea, all the children of the tree, which are as numerous as the stars in the sky, do break down the data of their ancestors. And verily, saith the Lynn, I shalt cause the answers to traverse back up the tree, each parent taking the burdens of its children and passing them upwards to its parents, and so is the tree sorted. Thus spake the Lynn. And so shall the depth of the tree be log(n), and the BigO of such sorts n log(n), where n is the number of elements in the tree.
Binary Tree Sort
The tree is arranged such that, for each node, everything in its left subtree is lower than it and everything in its right subtree is higher than it. For instance,
Binary tree sort (numbers are actual data in the tree):

         6
       /   \
      3     9
     / \   / \
    1   5 8   12
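A minimal binary-tree-sort sketch in Python, reusing the three-element-list node shape from earlier (the helper names are illustrative):

```python
def insert(node, x):
    """Insert x into a search tree where a node is [datum, left, right]."""
    if node is None:
        return [x, None, None]
    if x < node[0]:
        node[1] = insert(node[1], x)  # smaller values go left
    else:
        node[2] = insert(node[2], x)  # larger (or equal) values go right
    return node

def in_order(node):
    """An in-order walk of such a tree yields its datums in sorted order."""
    if node is None:
        return []
    return in_order(node[1]) + [node[0]] + in_order(node[2])

tree = None
for x in [6, 3, 9, 1, 5, 8, 12]:  # builds exactly the tree pictured above
    tree = insert(tree, x)
```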
Heap Sort
The tree is arranged such that the smallest element is on top. A new element is unceremoniously tacked onto the end of the array, then bubbled up toward the root, swapping with its parent as needed, to verify/fix the heap property of x > x_parent (every child is bigger than its parent). This verify/fix shebang takes at most O(log n) steps, the depth of the tree, where n is the size of the current heap. Not bad.
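Python's standard heapq module implements exactly this min-heap in an array: heappush sifts the new element up in O(log n), and popping everything back out gives a sort. A short sketch:

```python
import heapq

def heap_sort(items):
    """Heap sort: push everything onto a min-heap, then pop it back out."""
    heap = []
    for x in items:
        heapq.heappush(heap, x)  # sift-up: O(log n) per push
    # Each pop removes the current smallest, also in O(log n).
    return [heapq.heappop(heap) for _ in range(len(heap))]
```

n pushes plus n pops, each O(log n), gives the O(n log n) total in the table below.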
Merge Sort
Start with groups made of individual elements. On each iteration, pair up groups and merge them: compare the smallest remaining element of each group, place the smaller of the two in the new group (leaving the larger one in place), and repeat until both groups are exhausted and the new, merged group is sorted.
Example: Merge sort the two lists (1, 3, 4, 8) and (2, 5, 6, 7)
Step                                                 List1      List2       Sorted
Smallest two are 1 and 2, and 1 is smaller than 2.   3, 4, 8    2, 5, 6, 7  1
Smallest two are 3 and 2, and 2 is smaller than 3.   3, 4, 8    5, 6, 7     1, 2
Smallest two are 3 and 5, and 3 is smaller than 5.   4, 8       5, 6, 7     1, 2, 3
Smallest two are 4 and 5, and 4 is smaller than 5.   8          5, 6, 7     1, 2, 3, 4
Smallest two are 8 and 5, and 5 is smaller than 8.   8          6, 7        1, 2, 3, 4, 5
Smallest two are 8 and 6, and 6 is smaller than 8.   8          7           1, 2, 3, 4, 5, 6
Smallest two are 8 and 7, and 7 is smaller than 8.   8                      1, 2, 3, 4, 5, 6, 7
Smallest two are 8 and... whoops, just 8. Done.                             1, 2, 3, 4, 5, 6, 7, 8
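The walk-through above is a single merge step; a sketch in Python, with the full recursive sort on top of it (function names are illustrative):

```python
def merge(list1, list2):
    """Merge two already-sorted lists by repeatedly taking the smaller
    front element, exactly as in the table above."""
    result = []
    i = j = 0
    while i < len(list1) and j < len(list2):
        if list1[i] <= list2[j]:
            result.append(list1[i]); i += 1
        else:
            result.append(list2[j]); j += 1
    return result + list1[i:] + list2[j:]  # one side may have leftovers

def merge_sort(items):
    """Full merge sort: split in half, sort each half, merge them."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))
```

There are about log(n) levels of merging and O(n) work per level, hence the n log(n) in the table below.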
Sort Running-Time Comparisons So Far
Sort        Running Time
Bubble      n^{2}
Insertion   n^{2}
Selection   n^{2}
Heap        n log(n)
Merge       n log(n)
Quicksort   Whoaaaa.
The Origin Of N Log N aka the Information Theoretic Lower Bound
In an array (tree, heap, whatever) with n elements, there are n! different orders in which to arrange those elements (there are n possible first elements, then n-1 possible second ones, then n-2, and so forth, and (n)(n-1)(n-2)...(1) = n!). So we can decide the sort in log(n!) comparisons, which is approximately n log(n). Someone else can explain this more coherently than me, right?
Let's see:
We know there are n! permutations of any data set of size n (if not, take Applied Math or Discrete, and you will know it all too well.)
If we make these permutations be the tail ends (leaves) of a binary tree, that tree has log(n!) levels or decisions in order to go from the root of the tree all the way down.
This means that for any of n! permutations, we can figure out which one we've got, or "decide the sort", in log(n!) decisions.
We do math magic (Stirling's approximation, which says log(n!) is about n log(n) - n log(e)), and "it turns out" that log(n!) is real close to n log(n). Thus, good comparison sorts can't get a whole heck of a lot better than n log(n).
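A quick numeric sanity check of that math magic (the choice of n = 1000 is arbitrary):

```python
import math

n = 1000
log_fact = sum(math.log2(k) for k in range(1, n + 1))  # log2(n!)
n_log_n = n * math.log2(n)

# log2(n!) is a bit below n*log2(n) (Stirling's correction term),
# but they have the same order of growth.
ratio = log_fact / n_log_n
```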
Any better?
Cool Sorting Link
Thanks to the spiffiness that is Katie Rivard, we also have This Massive Index Page Of Sorts to check out. Code is even included.
NonLecture
Handouts
Received Chapter 11 (Sorting) from Data Structures and Their Algorithms by Lewis & Denenberg, which describes different sorting algorithms and their speeds.
What's Next
We had hoped to get to the Paul Graham essay on why Lisp rocks, but that'll wait yet again until the ever-ephemeral "next time".
Homework
...will absolutely positively be up Tuesday, and will not be due until the Tuesday after that.