I was going through some old drafts of posts when I stumbled across one about primitive recursive functions and register machines. Here’s how that one began:

So in class, we were talking a bit about primitive recursive functions, and it was mentioned in passing that every primitive recursive function can be converted to an equivalent register machine. This is fairly obvious if you think about it, seeing that register machines are computationally universal.^{1}

It was an “exercise for the reader” to implement such a compiler from primitive recursive functions to register machines. Naturally, having many better things to do over the weekend but not wanting to do them, I decided to try my hand at it. I’m well aware that this is precisely the sort of blog post that no one ever reads. I’ll take full advantage of that and write the code in Standard ML, a language which no one ever runs. Maybe more people would run it if I wrote it in Haskell, but oh well.

I never finished that program—I encountered some difficult-to-debug issues with getting recursion to work. But seeing it made me realize that primitive recursion would make an excellent topic for a problem for 15-150 (Principles of Functional Programming, a class for which I was once a TA). It involves:

- thinking more carefully about functions and recursion,
- representing functions in a different way,
- defining functions in a “point-free” manner, and
- using a recursive datatype.

Quite possibly someone in the past has written such a problem, but in case not, I thought that I’d note this here. Some of this material is drawn from Klaus Sutner’s excellent lecture notes at CMU, but presented in a way that is more in line with what 15-150 emphasizes.

Admittedly, the issue with adapting this as a 15-150 problem is that it requires a rather long-winded introduction in order to be comprehensible. I think that our writeups are long enough already (and I certainly felt that way as a student!), so unfortunately perhaps there is little room for additional cool theory.

Let’s get slightly technical about functions. A primitive recursive function is one that can be computed using a very limited sense of recursion, with only a single, decreasing parameter. Despite the limitation, this is a surprisingly powerful class of functions, and you’d be hard-pressed to find an everyday function that is not primitive recursive. In fact, you can show that this forms exactly the class of things you can compute using only a single for-loop with a *fixed* bound.
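One way to picture that “for-loop with a fixed bound” characterization in functional clothing is a bounded-iteration combinator: apply a step function exactly $n$ times. This is only an illustrative sketch (the names `iter` and `plus` are mine, not anything from the class), but it conveys the flavor:

```sml
(* iter step n x: apply step to x exactly n times (assuming n >= 0).
   The bound n is fixed before the "loop" starts, which is the whole point. *)
fun iter step 0 x = x
  | iter step n x = step (iter step (n - 1) x)

(* For instance, addition: increment y exactly x times. *)
fun plus (x, y) = iter (fn v => v + 1) x y
```

With pairing tricks, this kind of bounded iteration can simulate the full primitive recursion scheme below, since the step function can carry the loop counter along in its state.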

To be a little more formal, we’ll define some ground set $A$ and consider functions of the form $f : A^n \to A$. Call $n$ the *arity* of $f$. The most relevant case to us will be when $A$ is the set $\mathbb{N}$ of natural numbers, in which case these functions are the familiar arithmetic functions.

There are three types of arithmetic functions that are primitive recursive by definition:

- The constant function $0$. We can treat $0$ as a function by thinking of it as a constant, nullary function $c_0$, *sc*. one with no arguments. The function $c_0 : \mathbb{N}^0 \to \mathbb{N}$ is a funny one to think about, but it technically checks out.
- The successor function, $S : \mathbb{N} \to \mathbb{N}$ via $S(x) = x + 1$.
- All (infinitely many) *projection* functions, of the form $P_i^n : A^n \to A$. These functions allow us to mix-and-match with different arities; the projection $P_i^n$ is just the function $P_i^n(x_1, \ldots, x_n) = x_i.$ In this way, we can define, for instance, a ternary addition function in terms of a binary one: $\mathsf{add}^{(3)} = \mathsf{add}^{(2)} \circ (P_1^3, \mathsf{add}^{(2)} \circ (P_2^3, P_3^3)).$

Then there are two ways that we can make a new primitive recursive function out of existing ones:

**Function composition.** Given functions $g_i : A^m \to A$, for $i \in [n]$, and some function $h : A^n \to A$, their composition is the function $f : A^m \to A$, via $f(\bm{x}) = h(g_1(\bm{x}), \ldots, g_n(\bm{x})).$ Notationally, this is $f = h \circ (g_1, \ldots, g_n)$.

**Primitive recursion.** Given functions $h : \mathbb{N}^{n+2} \to \mathbb{N}$ and $g : \mathbb{N}^n \to \mathbb{N}$, define the function $f : \mathbb{N}^{n+1} \to \mathbb{N}$ via $\begin{aligned}f(0, \bm{y}) &= g(\bm{y}) \\ f(x+1, \bm{y}) &= h(x,f(x,\bm{y}),\bm{y}),\end{aligned}$ written as $f = \mathsf{Prec}[h,g]$. As promised, this is just a limited version of recursion, with a “base case” $g$ and a “recursive case” $h$. The limitation here is that there is only a *single* parameter, namely $x$, that is allowed to change between recursive calls, and it must decrease each time.

Again, this is presented with a very 15-150 flavor. It is also interesting to think about primitive recursive functions in a more algebraic way, as a *function algebra* defined by zero and successor and closed under primitive recursion.

Primitive recursive functions are interesting in many respects and turn out to be particularly important when trying to formalize arithmetic, *à la* Gödel. One may reasonably ask whether *all* arithmetic functions are primitive recursive. It turns out that this is not the case; one good example is the evaluation of (suitably encoded) primitive recursive functions. Just like general recursive functions,^{2} primitive recursive functions are not powerful enough to evaluate themselves. An even more interesting example is the well-known Ackermann function, a *bona fide* arithmetic function that is obviously computable^{3} yet provably grows faster than any primitive recursive function.
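For the curious, the Ackermann function itself is short to transcribe with Standard ML’s unrestricted recursion. The nested recursion, in which the *second* argument can grow while we recurse on the first, is exactly what escapes the single-decreasing-parameter scheme above:

```sml
(* The two-argument Ackermann function, written with general recursion.
   Note the inner call ack (m, n - 1) appearing as an argument to the
   outer call: no single parameter decreases at every step. *)
fun ack (0, n) = n + 1
  | ack (m, 0) = ack (m - 1, 1)
  | ack (m, n) = ack (m - 1, ack (m, n - 1))
```

Even tiny inputs blow up quickly: `ack (4, 2)`, for example, already has tens of thousands of digits, so don’t try to print it.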

The way we’ve defined them, primitive recursive functions are essentially just a recursive datatype. It’s quite straightforward to represent them as such in Standard ML, everyone’s favorite programming language:

```
datatype pr = zero                  (* constant 0 *)
            | succ                  (* successor S *)
            | proj of int * int     (* projection P_i^n *)
            | comp of pr * pr list  (* f o (g1, ..., gn) *)
            | prec of pr * pr       (* Prec[h, g] *)
```

As an example, the addition function $\mathsf{add} : \mathbb{N}^2 \to \mathbb{N}$ is primitive recursive, for it can be written as $\begin{aligned} \mathsf{add}(0,y) &= y \\ \mathsf{add}(x+1,y) &= S(\mathsf{add}(x,y)), \end{aligned}$ or more formally as $\mathsf{add} = \mathsf{Prec}[S \circ P_2^3, P_1^1]$. In Standard ML:

`val add = prec (comp (succ, [proj (2,3)]), proj (1,1))`

As an exercise, perhaps try something slightly more complex, like subtraction or factorial, in code. If you’re truly bored, you can try primality testing or something.
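If you’d like a nudge before tackling subtraction, here is one more small value in this style (assuming the `pr` datatype above): the predecessor function, $\mathsf{pred} = \mathsf{Prec}[P_1^2, c_0]$, a common stepping stone.

```sml
(* predecessor: pred(0) = 0, pred(x+1) = x.
   The recursive case h = P_1^2 just returns the counter x,
   discarding the recursive result; the base case g = c_0 is zero. *)
val pred = prec (proj (1, 2), zero)
```

Truncated subtraction then comes from iterating `pred`, but I’ll leave that part to you.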

Why can’t we just use actual functions in Standard ML to represent primitive recursive functions? One might be tempted to write something like:

`type pr' = int list -> int`

For one, it’s quite difficult to get information out of a function this way; indeed, it’s in general impossible to extract nontrivial information about an arbitrary function.^{4} Additionally, we lose an important guarantee that the `pr` representation gives us: termination. Every primitive recursive function must terminate, but not every Standard ML function must.

But still, this leads to some good exercises. For instance: write a function

`convert : pr -> pr'`

which produces an honest-to-goodness Standard ML function that evaluates the given primitive recursive function, so that

`val 5 = (convert add) [2, 3]`

It’s not too difficult, but I won’t spoil the fun here, in case this actually does end up as a 15-150 problem some day.

On the proofs side, I would be hesitant to assign problems of the form “show that function $f$ is primitive recursive,” since lots of hand-waving is involved for all but the simplest of functions, and I’m not sure that I want to encourage that in an introductory class.

1. Actually, the claim in class was slightly stronger, namely that the set of partial functions computable by a register machine forms a “clone” (also called a *function algebra*, which I feel makes one sound smarter), and that this clone contains the clone of primitive recursive functions. ↩︎

2. Famously, the class of general recursive functions is exactly the same as those functions which are computable on a Turing machine, a central result in computability theory. ↩︎

3. And for the 15-150 students out there, total! ↩︎

4. This is related to Rice’s Theorem, I think. ↩︎