A Funny Function, Mergesort, Finding a root of a function, Homework hints
Generative recursion is simply a catch-all term for recursion that doesn't necessarily follow from the data definition. When our functions do follow the data definition, life is good: half our work is done by simply writing the template, plus we are guaranteed that our function will terminate (by the mathematical principle of induction -- see Comp 280). But sometimes life (and programming) is more complicated.
Consider the function f : positive -> positive,
    f(n) = 1       if n = 1
           n/2     if n is even
           3n+1    if n is odd
; threes : positive -> positive
(define threes
  (lambda (n)
    (if (= n 1)
        1
        (threes (if (even? n)
                    (/ n 2)
                    (+ 1 (* n 3)))))))
An example: if we start with (say) 5, we find that the sequence is 5, 16, 8, 4, 2, 1 -- that is, (threes 5) reaches 1 after five steps.
Write a function stops? which takes a number n and returns #t if n, f(n), f(f(n)), ... ever reaches the number 1, i.e., if (threes n) stops. (What does your program do if this doesn't happen?) It should not call threes, but it will resemble it. This function is an example of generative recursion: the answer to (stops? n) might depend on generating a new instance, 3n+1 or n/2, and recursing on that. Notice that unlike structural recursion, there is no guarantee that our generative recursion is going to finish.
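For reference, here is a minimal sketch of one way stops? might look, following the shape of threes (whether it actually returns on every input is exactly the open question below):

; stops? : positive -> boolean
; Returns #t if repeatedly applying f to n ever reaches 1.
; Like threes itself, this may run forever if the sequence never reaches 1.
(define stops?
  (lambda (n)
    (if (= n 1)
        #t
        (stops? (if (even? n)
                    (/ n 2)
                    (+ 1 (* n 3)))))))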
Why do we choose to stop at f(1)?
Does threes stop for all inputs? (Don't feel bad if you can't answer this. No one else has yet.)
Now, write a modified function time-til-stop (or tts if you hate typing), which, instead of returning #t, returns the number of steps until 1 is reached. That is, (time-til-stop 5) = 5.
(This function can be written without using an accumulator. If you want, you could also use an accumulator for it. Really, you should have enough knowledge to write it either way.)
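Here is a minimal sketch of the non-accumulator version; an accumulator version would carry the count along instead of adding 1 on the way back out:

; time-til-stop : positive -> natural
; Counts how many applications of f it takes to reach 1.
(define time-til-stop
  (lambda (n)
    (if (= n 1)
        0
        (+ 1 (time-til-stop (if (even? n)
                                (/ n 2)
                                (+ 1 (* n 3))))))))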
Test time-til-stop on some large numbers, say 1001, 2349872349247990, and 2349872349247991.
Here's a more standard version of mergesort than the one you have seen:
; mergesort : list-of-numbers -> list-of-numbers
(define mergesort
  (lambda (lon)
    (if (has-zero-or-one-elements? lon)
        lon
        (local [(define len (length lon))
                (define len-half (truncate (/ len 2)))]
          (merge (mergesort (take lon len-half))
                 (mergesort (drop lon len-half)))))))

(The full code is in ~comp210/public_html/Labs/lab07/mergesort.ss.)
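The helpers has-zero-or-one-elements?, take, drop, and merge are defined in that file. As a reminder of the idea, merging two already-sorted lists might look something like this sketch:

; merge : list-of-numbers list-of-numbers -> list-of-numbers
; Combine two sorted lists into one sorted list (sketch; see the file for the real code).
(define merge
  (lambda (lon1 lon2)
    (cond [(empty? lon1) lon2]
          [(empty? lon2) lon1]
          [(< (first lon1) (first lon2))
           (cons (first lon1) (merge (rest lon1) lon2))]
          [else
           (cons (first lon2) (merge lon1 (rest lon2)))])))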
As mergesort recurs, clearly the length of the argument lon decreases, and we eventually get to our base case. That is how we convince ourselves that this function always terminates.
To be more accurate, we have two ways of describing what is decreasing:
(Labbies: Omit this if you don't have time.)
As a different example of generative recursion, consider using Newton's Method to find a root of a function--that is, where the graph crosses the x-axis.
Newton's method will be described by the TA on the board. Roughly, you start with an initial guess x0 as to where a function f crosses the x-axis. The guess is probably wrong, so generate a next guess by the following method: take the tangent line to f at x0 and see where that line crosses the x-axis; that crossing point, x1 = x0 - f(x0)/f'(x0), is the new guess. Repeat until the guess is close enough to a root.
Newton's method often converges very quickly to a root, but you can also think of cases where it is foiled: it might divide by 0, it might loop or diverge, and even if it gives you a root, it might not be the root you want.
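As a rough illustration of the generative recursion involved, here is a hypothetical sketch, assuming we are handed both f and its derivative df, and using an arbitrarily chosen tolerance to decide when a guess is good enough. (Note that, like threes, it carries no termination guarantee.)

; close-enough? : number (number -> number) -> boolean
; Is f at this guess within an (arbitrarily chosen) tolerance of zero?
(define close-enough?
  (lambda (guess f)
    (< (abs (f guess)) 0.0001)))

; find-root : (number -> number) (number -> number) number -> number
; Generate the next guess from the current one, and recur.
(define find-root
  (lambda (f df guess)
    (if (close-enough? guess f)
        guess
        (find-root f df (- guess (/ (f guess) (df guess)))))))

For example, (find-root (lambda (x) (- (* x x) 2)) (lambda (x) (* 2 x)) 1) should converge to an approximation of the square root of 2.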
A hint and test data are available.
In lab, we will discuss some ideas about what states are for the Missionaries & Cannibals problem. The following are some generalities to consider:
E.g., if we are representing only right triangles, the data might be the lengths of the three sides, but having two side lengths is sufficient. We may want to keep track of only the minimal amount of information (here, two sides' lengths), or all of the data (here, all three sides' lengths). The latter choice makes it easier to get at all of the data, but we may have to worry about having inconsistent data (here, three lengths that don't make a right triangle).
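For instance, a hypothetical minimal-information representation of an M&C state might keep only the counts on one bank plus the boat's position, since the other bank's counts follow from the totals:

; A state records how many missionaries and cannibals are on the left
; bank, and whether the boat is there too.  (Hypothetical sketch only;
; you should design and justify your own representation.)
(define-struct state (missionaries-left cannibals-left boat-on-left?))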
Your M&C program needs to build up a list of moves to the result. That sounds sort of similar to our in-class example that built up a list of 'mother and 'father for a path in a family tree.