Lecture 21: Termination, Making Choices



  1. Termination: with generative recursion, programs no longer necessarily terminate. What does it mean to make a termination argument? Let's look at three cases:
    1. Quick-Sort
       
      ;; quick-sort : (listof number) -> (listof number)
      ;; Purpose: sort a list of numbers in ascending order
      ;; (smaller-items and larger-items extract the items below/above the pivot)
      (define (quick-sort alon)
        (cond
          ((empty? alon) empty)
          (else (append
                  (quick-sort (smaller-items alon (first alon)))
                  (list (first alon))
                  (quick-sort (larger-items alon (first alon)))))))
      
      Here is the graph of applications:

                      (quick-sort (list ....))
                               |
                        _______|________
                       |                |
                 (quick-sort ...)   (quick-sort ...)
                   ...   ...          ...   ...
      
      We need to make sure that each "path" in this graph is finite. If we can argue that each recursive application receives a smaller input than the one before, we are set -- why? -- induction -> 280.
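      The argument hinges on each recursive application receiving a strictly shorter list than alon. A minimal sketch of the two helpers, assuming smaller-items and larger-items keep only the items strictly below/above the pivot (so the pivot itself -- and, with strict comparisons, its duplicates -- never reaches a recursive call):

```scheme
;; smaller-items : (listof number) number -> (listof number)
;; Purpose: all items of alon strictly smaller than threshold
(define (smaller-items alon threshold)
  (filter (lambda (x) (< x threshold)) alon))

;; larger-items : (listof number) number -> (listof number)
;; Purpose: all items of alon strictly larger than threshold
(define (larger-items alon threshold)
  (filter (lambda (x) (> x threshold)) alon))
```

      Since neither result contains (first alon), each argument handed to quick-sort is shorter than alon; induction on list length then gives termination.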
    2. Tabulate
      ;; tabulate : (num -> num) num num -> (listof (list num num))
      ;; Purpose: list of (list x (f x)) for all x in start, start + 1.0, ..., end
      ;; Example: (tabulate sin 4. 6.)
      ;;  = (list (list 4. (sin 4.)) (list 5. (sin 5.)) (list 6. (sin 6.)))
      (define (tabulate f start end)
        (cond
          ((= start end) empty)
          (else (cons (list start (f start)) (tabulate f (+ start 1.0) end)))))
      
      Won't terminate -- ouch: with inexact numbers, (+ start 1.0) may step right past end without (= start end) ever holding. Need a comparison based on < (here, >) instead of = to do a good job.
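      One way to repair the test, replacing = with a comparison as just suggested; stopping as soon as start has stepped past end also makes the result include end when it is hit exactly, as the example demands:

```scheme
;; tabulate : (num -> num) num num -> (listof (list num num))
;; Purpose: list of (list x (f x)) for all x in start, start + 1.0, ..., end
(define (tabulate f start end)
  (cond
    ((> start end) empty)  ; stop once start passes end, exactly hit or not
    (else (cons (list start (f start)) (tabulate f (+ start 1.0) end)))))
```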
    3. Newton: nothing is guaranteed to decrease from one recursive call to the next, so no termination argument is possible -- say so in the documentation!
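      For contrast, here is a hypothetical sketch of such a Newton-style root finder (the name find-root-newton appears in the choices below; fprime and the 0.0001 tolerance are illustrative, not from the lecture). Nothing shrinks from one application to the next, so the purpose statement must warn about non-termination:

```scheme
;; find-root-newton : (num -> num) (num -> num) num -> num
;; Purpose: approximate a root of f, given its derivative fprime and an
;; initial guess. WARNING: nothing is guaranteed to decrease from one
;; application to the next, so this function may not terminate.
(define (find-root-newton f fprime guess)
  (cond
    ((<= (abs (f guess)) 0.0001) guess)
    (else (find-root-newton f fprime
            (- guess (/ (f guess) (fprime guess)))))))
```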


  2. Choices: now that we have "generative" recursion, we face a choice as we design programs.

    Example 1: sort and quick-sort both produce sorted lists of numbers from lists of numbers

    Example 2: find-root and find-root2 both produce small intervals in which some given function f has a root; so does find-root-newton

    What choice should we make? If a structural solution exists, develop it first. If it performs well enough, fine. If not, think about a generative approach.
    Consider find-root versus find-root2. Suppose the root is close to 0 and we start the search at 1024. How many recursive steps does find-root take, and how many does find-root2?

     
      (find-root f 1024)              (find-root2 f 0 1024)
    = (find-root f 1023)            = (find-root2 f 0 512)
    = ...                           = ...
    = (find-root f 0)               = (find-root2 f 0 1)
    = 0                             = 0

         ~ 1024 steps [O(i)]             ~ 10 steps [O(log2 i)]
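    The halving column can be sketched as a bisection routine. A minimal version, assuming f is continuous and (f left) and (f right) have opposite signs on the initial interval:

```scheme
;; find-root2 : (num -> num) num num -> num
;; Purpose: the left end of an interval of width <= 1 in which f has a root,
;; assuming (f left) and (f right) have opposite signs
(define (find-root2 f left right)
  (cond
    ((<= (- right left) 1) left)
    (else
     (local ((define mid (/ (+ left right) 2)))
       (cond
         ((<= (* (f left) (f mid)) 0) (find-root2 f left mid))
         (else (find-root2 f mid right)))))))
```

    Each call halves the width of the interval, so the width drops below 1 after about log2 of the initial width steps -- the termination argument and the step count in one.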
    

    We will cover this topic in Lab again with a similarly dramatic example.

  3. Coding up Algorithms versus Creating Algorithms
  4. The spectrum: There is a whole spectrum of generative programs:
    Newton          Bi-section        Quick             Floating-Point loops
     * --------------- * --------------- * ------------------- * 
    
    Some are difficult and not expected from a plain old undergraduate. Others are easy and belong in your standard repertoire.

    The role of mathematics: in many cases, new algorithms are discovered by studying the underlying math. You often need to know or study continuous mathematics (Zippel) for scientific problems, and deep discrete mathematics for structural problems. A mere programmer may never be asked to do so, but a computer scientist must understand what's going on.





Matthias Felleisen. This page was generated on Fri Apr 9 09:17:38 CDT 1999.