There's another sense of the word ``closure'' within computing that is different from the one you've described so far. Your understanding, and the sense used by Matthias Felleisen and David Storrs, is that of Peter Landin in 1964.
In ``[t]he mechanical evaluation of expressions'' (1964), Peter Landin defines ``closure'' in the sense now popular in computing. (This is the earliest use of the word in this sense, as far as I know.)

``Also we represent the value of a lambda-expression by a bundle of information called a "closure", comprising the lambda-expression and the environment relative to which it was evaluated. We must therefore arrange that such a bundle is correctly interpreted whenever it has to be applied to some argument. More precisely: a closure has an /environment part/ which is a list whose two items are: (1) an environment (2) an identifier or a list of identifiers, and a /control part/ which consists of a list whose sole item is [a lambda expression].''

This is how most computer scientists use the word. Mathematicians, on the other hand, usually mean something else by ``closure''. For example:

``[A] group is an algebraic structure consisting of a set of elements equipped with an operation that combines any two elements to form a third element. The operation satisfies four conditions called the group axioms, namely /closure/, associativity, identity and invertibility.'' -- Wikipedia

But this mathematical sense is quite relevant in computing too. The best example I know comes from Harold Abelson (and Gerald Sussman) in the Lisp lectures of 1986. It is Abelson who uses the term in the mathematical sense, applying it to the means of combination in a language. Here is every mention of the word in the lectures.

``So I almost took it for granted when I said that cons allows you to put things together. But it's very easy to not appreciate that, because notice, some of the things I can put together can themselves be pairs. And let me introduce a word that I'll talk about more next time, it's one of my favorite words, called closure.
And by closure I mean that the means of combination in your system are such that when you put things together using them, like we make a pair, you can then put those together with the same means of combination. So I can have not only a pair of numbers, but I can have a pair of pairs.'' -- Harold Abelson, lecture 2B, ``Compound Data'', 1986

Then again in the next lecture:

``And again just to remind you, there was this notion of closure. See, closure was the thing that allowed us to start building up complexity, that didn't trap us in pairs. Particularly what I mean is the things that we make, having combined things using cons to get a pair, those things themselves can be combined using cons to make more complicated things. Or as a mathematician might say, the set of data objects in Lisp is closed under the operation of forming pairs. That's the thing that allows us to build complexity. And that seems obvious, but remember, a lot of the things in the computer languages that people use are not closed. So for example, forming arrays in BASIC and Fortran is not a closed operation, because you can make an array of numbers or character strings or something, but you can't make an array of arrays.'' -- Harold Abelson, lecture 3A, ``Henderson Escher Example'', 1986

A few lectures later, Gerald Sussman uses the term, but in the sense of Peter Landin.

``First of all, one thing we see is that things become a little simpler. If I don't have to have the environment be the environment of definition for a procedure, the procedure need not capture the environment at the time it's defined. And so if we look here at this slide, we see that the clause for a lambda expression, which is the way a procedure is defined, does not make up a thing which has a type closure and an attached environment structure. It's just the expression itself. And we'll decompose that some other way somewhere else.'' -- Gerald Sussman, lecture 7B, ``Metacircular Evaluator'', 1986
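Landin's ``bundle of information'' is easy to picture concretely. Here is a minimal sketch in Python rather than in a Lisp (the function names and the environment-as-dictionary representation are mine, not Landin's): a closure bundles the identifiers and control part of a lambda-expression with the environment in which it was evaluated, and applying it extends that environment with the arguments.

```python
def make_closure(params, body, env):
    # Landin's bundle: an environment part (env + identifiers)
    # and a control part (here, a function standing in for the
    # body of the lambda-expression).
    return {"params": params, "body": body, "env": env}

def apply_closure(closure, args):
    # Extend the *defining* environment with the arguments bound
    # to the parameters, then evaluate the body in that extension.
    frame = dict(zip(closure["params"], args))
    env = {**closure["env"], **frame}
    return closure["body"](env)

# (lambda (x) (+ x y)), evaluated where y = 10:
add_y = make_closure(["x"],
                     lambda env: env["x"] + env["y"],
                     {"y": 10})
print(apply_closure(add_y, [32]))  # prints 42
```

The point of the bundle is exactly what Landin says: the body is interpreted relative to the environment captured at evaluation time, not whatever environment happens to exist at the call.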
In the last two appearances of the word, Abelson calls our attention to the importance of the idea.

``The major point to notice here, and it's a major point we've looked at before, is this idea of closure. The things that we build as a means of combination have the same overall structure as the primitive things that we're combining. So the AND of two things, when looked at from the outside, has the same shape. And what that means is that this box here could be an AND or an OR or a NOT or something, because it has the same shape to interface to the larger things.''

``It's the same thing that allowed us to get complexity in the Escher picture language, or allows you to immediately build up these complicated structures just out of pairs. It's closure. And that's the thing that allowed me to do what by now you took for granted when I said, gee, there's a query which is AND of job and salary, and I said, oh, there's another one, which is AND of job, a NOT of something. The fact that I can do that is a direct consequence of this closure principle.'' -- Harold Abelson, lecture 8B, ``Logic Programming'', 1986

In the book, ``The Structure and Interpretation of Computer Programs'', Abelson and Sussman make the difference between these two meanings explicit in chapter 2, in particular in footnote 6:

The ability to create pairs whose elements are pairs is the essence of list structure's importance as a representational tool. We refer to this ability as the closure property of cons. In general, an operation for combining data objects satisfies the closure property if the results of combining things with that operation can themselves be combined using the same operation.[6] Closure is the key to power in any means of combination because it permits us to create hierarchical structures -- structures made up of parts, which themselves are made up of parts, and so on. [...] In this section, we take up the consequences of closure for compound data.
We describe some conventional techniques for using pairs to represent sequences and trees, and we exhibit a graphics language that illustrates closure in a vivid way.[7]

Footnotes:

[6] The use of the word ``closure'' here comes from abstract algebra, where a set of elements is said to be closed under an operation if applying the operation to elements in the set produces an element that is again an element of the set. The Lisp community also (unfortunately) uses the word ``closure'' to describe a totally unrelated concept: A closure is an implementation technique for representing procedures with free variables. We do not use the word ``closure'' in this second sense in this book.

[7] The notion that a means of combination should satisfy closure is a straightforward idea. Unfortunately, the data combiners provided in many popular programming languages do not satisfy closure, or make closure cumbersome to exploit. In Fortran or Basic, one typically combines data elements by assembling them into arrays -- but one cannot form arrays whose elements are themselves arrays. Pascal and C admit structures whose elements are structures. However, this requires that the programmer manipulate pointers explicitly, and adhere to the restriction that each field of a structure can contain only elements of a prespecified form. Unlike Lisp with its pairs, these languages have no built-in general-purpose glue that makes it easy to manipulate compound data in a uniform way. This limitation lies behind Alan Perlis's comment in his foreword to this book: ``In Pascal the plethora of declarable data structures induces a specialization within functions that inhibits and penalizes casual cooperation. It is better to have 100 functions operate on one data structure than to have 10 functions operate on 10 data structures.''

-- You received this message because you are subscribed to the Google Groups "Racket Users" group.
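The two senses even meet: SICP (section 2.1.3) shows that pairs themselves can be represented by procedures, so closures in Landin's sense can provide the closure property in the algebraic sense. A minimal sketch of that idea, in Python rather than Scheme (the representation is illustrative; no real Lisp stores pairs this way):

```python
def cons(x, y):
    # Landin's sense: the returned function closes over x and y,
    # carrying its defining environment with it.
    return lambda pick: x if pick == 0 else y

def car(p):
    return p(0)

def cdr(p):
    return p(1)

# The algebraic sense: cons is closed -- a pair can hold pairs.
pair_of_pairs = cons(cons(1, 2), cons(3, 4))
print(car(cdr(pair_of_pairs)))  # prints 3
```

Nothing about `cons` cares whether its arguments are numbers or pairs built by `cons` itself, which is precisely what Abelson means by the means of combination being closed.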