Church’s Thesis and Functional Programming

Church’s Thesis and Functional Programming – Turner 2006

One of a collection of papers celebrating the 70th anniversary of Church’s thesis in 2006, as recently recommended by Erik Meijer on Twitter.

Both the thesis and the lambda calculus have been of seminal influence on the development of Computing Science.

There were three independently developed definitions for the computable functions of the natural numbers in the early-to-mid 1930s. These were proved to be equivalent, and then…

A few months later Turing (1936) introduced his concept of a logical computing machine – a finite automaton with an unbounded tape divided into squares…

In reviewing the Turing paper in 1937, Church wrote:

there is involved here the equivalence of three different notions: computability by a Turing machine, general recursiveness . . . and lambda-definability . . . The first has the advantage of making the identification with effectiveness in the ordinary sense evident immediately … The second and third have the advantage of suitability for embodiment in a system of symbolic logic

The Turing machine ultimately led to the development of the Turing/von-Neumann computer. The second and third notions led to the development of functional programming.

Of the various convergent notions of computability Church’s lambda calculus is distinguished by its combination of simplicity with remarkable expressive power.

The lambda calculus has three kinds of expression: variables, function application, and function abstraction (a lambda expression, for example λx.x). The essential substitution rule, called beta reduction, tells us how a function is applied to an argument: every occurrence of the function’s parameter in the expression body is replaced by the supplied argument. In other words, it defines what it means to provide an argument to a function.
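Beta reduction can be mimicked directly with Python lambdas. This is only an illustrative sketch (the names and the integer arguments are mine, not the paper’s; the pure lambda calculus has no built-in numbers):

```python
# λx.x — applying it to an argument beta-reduces to that argument:
# every occurrence of x in the body is replaced by the argument.
identity = lambda x: x

# λf.λx.f (f x) — abstraction over two parameters, applied one at a time.
apply_twice = lambda f: lambda x: f(f(x))

print(identity(42))                      # 42
print(apply_twice(lambda n: n + 1)(0))   # 2
```

Python evaluates an application eagerly, but the substitution behaviour matches beta reduction for these simple terms.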

Also note that it is from the lambda calculus that we get the term ‘combinator’. A combinator is simply a lambda expression with no free variables (a ‘closed term’). Famous combinators are named by single letters. For example, the identity combinator is called ‘I’:

I = λx.x

And the constant combinator is called ‘K’:

K = λx.λy.x

(Constant because K z returns a function that always returns z, whatever argument you give it…).
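Both combinators translate straightforwardly into Python (a sketch; the string and integer arguments are just test values, since pure lambda-calculus terms contain nothing but variables, applications, and abstractions):

```python
I = lambda x: x             # I = λx.x, the identity combinator
K = lambda x: lambda y: x   # K = λx.λy.x, the constant combinator

print(I("hello"))   # hello
print(K(1)(2))      # 1 -- K 1 is a function that ignores its argument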

Combinators can also be used to support recursive functions:

The master-stroke, which shows every recursive function to be λ-definable, is to find a universal fixpoint operator, that is a term Y with the property that for any term F, Y F ⇔ F (Y F). There are many such terms, of which the simplest is due to H.B. Curry: Y = λf.(λx.f(xx))(λx.f(xx))

By convention, this combinator is represented by the letter Y. Yes HN, that’s the ‘Y combinator’.
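A caveat worth knowing if you try this yourself: Curry’s Y loops forever under strict evaluation, as in Python, because x(x) is evaluated before f ever gets a chance to stop the recursion. The standard workaround is the eta-expanded variant, often called the Z combinator, which delays the self-application behind an extra lambda. A sketch (the factorial definition is my example, not the paper’s):

```python
# Z = λf.(λx.f (λv.(x x) v)) (λx.f (λv.(x x) v)) — a strict-language
# fixpoint operator; the inner λv delays evaluation of x(x).
Z = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial defined without explicit recursion: Z ties the knot,
# passing the function to itself as `self`.
fact = Z(lambda self: lambda n: 1 if n == 0 else n * self(n - 1))

print(fact(5))  # 120
```

In a lazy language such as Haskell, Curry’s original Y works as written.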

Church showed how to represent the natural numbers as function iterators (a, f a, f (f a), f (f (f a)), …). As the paper goes on to explain, the same scheme extends to any data type:

Moreover we are not limited to arithmetic. The idea behind the Church numerals is very general and allows any data type — pairs, lists, trees and so on — to be represented in a purely functional way. Each datum is encoded as a function that captures its elimination operation, that is the way in which information is extracted from it during computation. It is also possible to represent codata, such as infinite lists, infinitary trees and so on.
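The Church encodings sketch nicely in Python. The helper names below (succ, to_int, pair, fst) are my own labels for illustration; each datum really is just a function, and decoding a numeral means applying it to an ordinary successor:

```python
# Church numerals: n is the function that applies f to a, n times.
zero = lambda f: lambda a: a
succ = lambda n: lambda f: lambda a: f(n(f)(a))

def to_int(n):
    """Decode a Church numeral by iterating integer increment from 0."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
print(to_int(three))  # 3

# Addition: (add m n) applies f n times, then m more times.
add = lambda m: lambda n: lambda f: lambda a: m(f)(n(f)(a))
print(to_int(add(three)(three)))  # 6

# A pair is encoded by its elimination: a function awaiting a selector.
pair = lambda x: lambda y: lambda sel: sel(x)(y)
fst = lambda p: p(lambda x: lambda y: x)  # the selector here is just K
print(fst(pair("a")("b")))  # a
```

Note how fst extracts information by handing the pair a selector function, which is exactly the “elimination operation” view described in the quotation above.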

There follows a wonderful section on functional programming, contrasted with imperative programming:

Imperative programming languages, from the earliest such as FORTRAN and COBOL which emerged in the 1950’s to current “object-oriented” ones such as C++ and Java have certain features in common. Their basic action is the assignment command, which changes the content of a location in memory and they have an explicit flow of control by which these state changes are ordered. This reflects more or less directly the structure of the Turing/von Neumann computer, as a central processing unit operating on a passive store. Backus (1978) calls them “von Neumann languages”. Functional programming languages offer a radical alternative — they are descriptive rather than imperative, have no assignment command and no explicit flow of control — sub-computations are ordered only partially, by data dependency.

(We’ve encountered this idea earlier in the morning paper series too, at #themorningpaper no. 20).

Now comes an important point about what happens when you try to add functional support to imperative languages, echoed recently by Erik Meijer in The Curse of the Excluded Middle (#themorningpaper no. 41):

The disadvantages of functional programming within a language that includes imperative features are two. First, you are not forced to explore the limits of the functional style, since you can escape at will into an imperative idiom. Second, the presence of side effects, exceptions etc., even if they are rarely used, invalidate important theorems on which the benefits of the style rest.

There are many other good passages within the paper which we do not have space here to enumerate. I leave you with an extended quotation from the conclusions:

Church’s Thesis played a founding role in computing theory by providing a single notion of effective computability. Without this foundation we might have been stuck with a plethora of notions of computability depending on computer architecture, programming language etc.: we might have Motorola-computable versus Intel-computable, Java-computable versus C-computable and so on. The λ-calculus, which Church developed during the period of convergence from which the Thesis emerged, has influenced almost every aspect of the development of programming and programming languages. It is the basis of functional programming, which after a long infancy is entering adulthood as a practical alternative to traditional ad-hoc imperative programming languages. Many important ideas in mainstream programming languages—recursion, procedures as parameters, linked lists and trees, garbage collectors — came by cross fertilization from functional programming. Moreover the main schools of both operational and denotational semantics are λ-calculus based and amount to using functional programming to explain other programming systems.