Table of Contents for Programming Languages: a survey

Lisps (in general)

Lisp is a language family, not a single language. In lists of popular or beloved languages, "Lisp" often comes up, but it's unclear which "Lisp" is the most popular or beloved among the Lisps.

Common Lisp and Scheme are sometimes identified as the two most popular Lisp dialects (e.g. "Today, the most widely known general-purpose Lisp dialects are Common Lisp and Scheme." -- [1]).

In online discussions about Lisp, I get the impression that the following four variants or subfamilies are the most popular or talked about:

Common Lisp is a committee-based standardization effort that was intended to unify a handful of (but not all) existing Lisp projects and dialects in the early 80s (to place this in context, Lisp was invented in 1958, so there were already a number of Lisp variants and descendants by 1980). It gained a lot of momentum and many commercial implementations of it were written. It is standardized but does not have a canonical implementation. It is said to be a 'large' language, and a good practical choice for getting real projects done in Lisp.

Scheme is a more minimal Lisp invented in 1975 (or, some would say, between 1975 and 1980), which also supports some features Common Lisp doesn't, such as a language primitive for first-class continuations (also, tail call optimization is required by the Scheme spec but not by the Common Lisp spec). It permits a functional style of programming. It is said to be small and elegant. It is standardized but does not have a canonical implementation. One Scheme variant, Racket, is said to have advanced facilities for metaprogramming and for prototyping new programming language features.

Clojure is a more recent (2007) Lisp variant. It runs on the JVM and is known for its embrace of immutable data structures and its focus on concurrency constructs. It has a canonical implementation but is not standardized.

Elisp (started around 1984 or 1985 [2] [3]) is a language purpose-built for Emacs, a popular text editor software application. Most of Emacs is implemented in Elisp; Elisp is exposed to the user, who can use it as a scripting language to create Emacs extensions or to modify the behavior of Emacs. Because Emacs is used by many people, Elisp is used by many people; however, Elisp is not much used for anything outside of Emacs. Furthermore, Elisp's creator now promotes GNU Guile, a Scheme implementation, over Elisp (although Guile can also run Elisp) [4].

I decided to choose a couple of these to focus on. I chose Racket and Clojure. I chose Racket because it is perhaps the most popular Scheme implementation, and because its facilities for metaprogramming intrigued me; I wanted to cover a Scheme because Scheme is one of the two (supposedly) "most widely known general-purpose Lisp dialects", because it is small, and because it is said to be elegant.

I ruled out Elisp because it seems that its popularity is due not to its own merits, but to the popularity of Emacs.

Between Common Lisp and Clojure, I chose Clojure because I am intrigued by its concurrency constructs and because it is the most recently designed popular Lisp (and so I hope that it has improved upon what came before it), and also because, as my purpose is to learn the 'essence' of Lisp and to describe it in a book surveying many programming languages, I am frightened by Common Lisp's reputation as a large language.

So, because Lisp is a beloved language family, there are three separate chapters about it. In this chapter, I talk about Lisp in general, about the various variants of Lisp and comparisons between them, and about Lisps other than Racket and Clojure, although we don't spend much time on details of Lisp itself. In the chapter on Racket, we explore the language of Scheme, and also Racket-specific constructs. In the chapter on Clojure, we explore the language of Clojure.

todo: am i sure i dont want to cover CL also?

todo: Paul Graham's list of attributes of lisp; Peter Norvig's list; the one from that book about lisp (games? aliens?)


Two things about Lisp:

1) Many Lisp commands evaluate their arguments before applying some further transformation, but quote (abbreviated as the prefix character ') does not; the result of evaluating a quote form is its argument, unevaluated and unchanged. This is usually used to separate data from code. Note that this is different from saying "consider my argument as a single thing"; to do that you would enclose the argument in parentheses; for example (car '(1 2 3)) = 1, but (car '((1 2 3))) = (1 2 3).
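The car examples above can be sketched in Python, modeling quoted Lisp lists as plain Python lists (so '(1 2 3) becomes [1, 2, 3]); this is just an analogy to show the extra level of nesting, not a real Lisp reader:

```python
# Sketch of the quote examples above, modeling quoted Lisp lists as
# Python lists: '(1 2 3) becomes [1, 2, 3]; '((1 2 3)) becomes [[1, 2, 3]].

def car(lst):
    """First element of a list, like Lisp's car."""
    return lst[0]

assert car([1, 2, 3]) == 1            # (car '(1 2 3))   = 1
assert car([[1, 2, 3]]) == [1, 2, 3]  # (car '((1 2 3))) = (1 2 3)
```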

2) "Cons cell" is just another name for the building block of linked lists. A cons cell is a struct with two pointers, call them payload and next: the first (payload) points to the data contained in this cons cell, and the second (next) points to the next cons cell in the linked list. You can think of a Lisp list structure in two ways: as a binary tree (representing how the pointers between the cons cells are actually structured: the first cons cell in a list is the root of the tree, with two children, its payload and the next cons cell, and so on), or alternately as a nested structure of lists (a non-uniform tree; this is the way that the Lisp language semantically interprets these things). However, the underlying nature of cons cells as pairs of pointers (as opposed to flat lists) comes into play in that (a) to get the first element in a list you do car, but to get the second element in a list you do (car (cdr ...)), rather than just cdr; and (b) the last cons cell in a list has a cdr of '(), rather than the value of the last item in the list.
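A minimal Python sketch of cons cells as described above (the Cons class and the NIL stand-in are illustrative, not any particular Lisp's internals):

```python
# Illustrative cons cell: a struct with two pointers, payload and next.

class Cons:
    def __init__(self, payload, next):
        self.payload = payload  # car: the data held in this cell
        self.next = next        # cdr: the rest of the list

NIL = None  # stand-in for the list terminator '()

def cons(a, d): return Cons(a, d)
def car(c): return c.payload
def cdr(c): return c.next

# The list (1 2 3) is a chain of three cons cells:
lst = cons(1, cons(2, cons(3, NIL)))

assert car(lst) == 1              # first element: car
assert car(cdr(lst)) == 2         # second element: car of cdr (cadr), not just cdr
assert cdr(cdr(cdr(lst))) is NIL  # the last cell's cdr is '(), not the last item
```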

recommended reading:

Best practices:


Good at:


Popularity of various lisps:


common lisp implementations: short answer: SBCL


Lisp is a language family, not a single language. What defines a Lisp?

McCarthy's 15 features

Quoted from McCarthy, "LISP - Notes on its past and future", 1980:

Norvig's 9 features

Quoted from

Graham's 9 features

Quoted and paraphrased from :

Pitman: Lisp is a community, not a feature set

Comparison of various Lisps is found in the first part of the essay More Than Just Words: Lambda, the Ultimate Political Party by Kent Pitman. That essay makes the point that, aside from CONS, there is little or no "common core of operators that were present throughout the family with the same name and semantics". He examines some candidates, and rejects them all:

landoflisp's 10 reasons why lisp reduces bugs


didibus's 6 features


Which Lisp

The short answer is 'racket or clojure or common lisp'.


Lisp Links



Because Scheme is sufficiently simple, there are many variant "Schemes" and many Scheme implementations. I've chosen to focus on the variant "Racket", which has its own chapter.


people: Guy Steele

History of Scheme

Some of Scheme's history is described in the R7RS-small Scheme standard document.

Scheme started out as an implementation, described in a series of published memos/technical reports and papers (the Lambda Papers, published between 1975 and 1980). The first of these, MIT AI Lab Memo AIM-349: Scheme: An Interpreter for Extended Lambda Calculus, is now sometimes retrospectively referred to as "R0RS". Another memo, MIT AI Lab Memo AIM-452: The Revised Report on SCHEME: A Dialect of LISP, published in 1978, is sometimes retrospectively referred to as "R1RS". Starting at least as early as 1981, separate projects to create separate implementations of Scheme sprang up outside of MIT, and diverged, and in 1984 a committee began meeting to standardize Scheme [36] (apparently using consensus decision-making [37]). The result of their work was The revised revised report on Scheme, or an uncommon Lisp, published in 1985 and retrospectively referred to as R2RS. In 1986 the committee issued a revision, Revised(3) Report on the Algorithmic Language Scheme, or R3RS, which is apparently when the RnRS naming convention started. R4RS was issued in 1991, and became the basis for an IEEE standard (IEEE 1178-1990). In 1998, R5RS added high-level hygienic macros, multiple return values, a 'load' command, and eval [38]; [38] claims that R5RS is still the most widely implemented version of Scheme today. 1998 is also when the SRFI ("Scheme Requests for Implementation") process was begun; an SRFI is a proposal/standard for a Scheme extension or library.

In 2002-2003 a Steering Committee was set up to supervise future language revisions. In 2007, R6RS was produced, breaking from the consensus decision-making process, and was ratified by a vote of over 60%. R6RS "specifies a much broader language" [39]; it added Unicode, various standard library functions, modules, exceptions, a metaprogramming tool called syntax-case, and requirements to support more numeric types [40].

However, according to the Background section of the R7RS-small spec, "most existing R5RS implementations (even excluding those which are essentially unmaintained) did not adopt R6RS, or adopted only selected parts of it." Therefore, in 2009 the Scheme Steering Committee decided to divide Scheme into two "separate but compatible" languages, Scheme-small and Scheme-large. In 2013 R7RS-small was published, and as of this writing, R7RS-large is still being written.

See also


Retrospectives and rationales




Lisp vs Scheme (and Common Lisp vs Scheme) (toread):


Because it is a small language, there are many Scheme implementations written for pedagogical purposes, for practice, or for fun, and reading these is sometimes said to be a good way to learn about programming language implementation.

Racket vs. standard Scheme



" Data and procedures and the values they amass,
Higher-order functions to combine and mix and match,
Objects with their local state, the messages they pass,
A property, a package, the control point for a catch --
In the Lambda Order they are all first-class.
One Thing to name them all, One Thing to define them,
One Thing to place them in environments and bind them,
In the Lambda order they are all first-class. " -- Scheme R2RS aka MIT AI Memo 848


The feature list in the 2009 Steering Committee Position Statement

The Scheme Steering Committee Position Statement (2009) says

"It is almost misleading to call Scheme a "programming language;" it would be more accurate to characterise Scheme as a family of dialects, all loosely related by the common features of:

going on to note that the following additional features are often necessary yet are typically implemented in a non-standardized way:

Because of the need to use implementation-dependent variants of these latter features, they call Scheme "the world's most unportable programming language".

and they also make note of "two key technologies" that should (in the future) "enable great portability":


Chicken Scheme

Tiny Scheme


Because it is so well-liked, Clojure gets its own chapter.

See the chapter on Clojure.


Because the Lisp family is so well-liked, Racket gets its own chapter (as a representative of Scheme/Lisp).

See the chapter on Racket.

Common Lisp

A committee-defined effort to standardize various MACLISP variants into one language.




Best practices and style guides and tooling and libraries:





Less popular Lisps





To support the small memory sizes of many machines of the time, Elisp initially focused on being small [48].





Descended from Scheme




Not as popular or well-regarded as Racket, although it seems to be more portable and closer to 'embeddable'.


See also Guile section of proj-plbook-plChImplementationCaseStudies, [[proj-plbook-plChMiscIntermedLang?]], proj-plbook-plChSingleIntermedLangs, proj-plbook-plChTargetsImpl.

Introductory links:


A language for 'procedural reflective' metaprogramming. See its section in proj-plbook-plChMetaprogramming for more.


see also

Operatives vs. applicatives

Applicatives are ordinary functions, which take as input the values of their arguments.

Operatives are fexprs: functions whose arguments are passed in without being evaluated. (todo: is this precisely correct? Shutt never says fexprs == operatives; update: in chapter 1 of his thesis it seems like he says yes, operatives are fexprs.)

todo: how is this related to non-strict/lazy evaluation? With operatives, are you guaranteed to be able to inspect/reflect upon the thunk for the argument passed in (in contrast to most lazy languages, in which thunks are opaque; all you can do is pass them around or force their evaluation)? One reply says yes, this is it, but Shutt's thesis says that most of what he is talking about is orthogonal to lazy/eager.
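To make the applicative/operative distinction concrete, here is a hypothetical mini-evaluator in Python (none of this is Kernel's actual API or semantics; expressions are nested Python lists and symbols are strings). An operative receives its argument expressions unevaluated, while an applicative receives their values:

```python
# Toy sketch of applicatives vs operatives (invented for illustration).

def evaluate(expr, env):
    if isinstance(expr, str):           # symbol lookup
        return env[expr]
    if not isinstance(expr, list):      # self-evaluating literal
        return expr
    op = evaluate(expr[0], env)
    if getattr(op, "operative", False):
        # operative: receives its argument expressions unevaluated
        return op(env, *expr[1:])
    # applicative: arguments are evaluated first
    return op(*[evaluate(a, env) for a in expr[1:]])

def operative(f):
    f.operative = True                  # mark f as an operative
    return f

@operative
def quote_op(env, arg):
    return arg                          # hands back the expression itself

env = {"add": lambda x, y: x + y, "quote": quote_op, "x": 10}

assert evaluate(["add", "x", 1], env) == 11                      # applicative
assert evaluate(["quote", ["add", "x", 1]], env) == ["add", "x", 1]  # operative
```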


Kernel Links




First non-prototype implementation of Scheme (?):


" Somewhere in the T 2 effort, Kent Pitman, another Lisp wizard, came down to Yale from MIT. He and Jonathan poured an immense amount of design effort into the language, and it was just really, really *clean*. Small (but difficult) things: they chose a standard set of lexemes and a regular way of assembling them into the names of the standard procedures, so that you could easily remember or reconstruct names when you were coding. (I have followed this example in the development of the SRFIs I've done for the Scheme community. It is not an easy task.)

Larger, deeper things: they designed a beautiful object system that was integrated into the assignment machinery -- just as Common Lisp's SETF lets you assign using accessors, e.g., in Common Lisp (setf (car x) y) is equivalent to (rplaca x y) in T, (set! (car x) y) was shorthand for ((setter car) x y) Accessor functions like CAR handled "generic functions" or "messages" like SETTER -- CAR returned the SET-CAR! procedure when sent the SETTER message. The compiler was capable of optimising this into the single Vax store instruction that implements the SET-CAR! operation, but the semantic machinery was completely general -- you could define your own accessor procedures, give them SETTER methods, and then use them in SET! forms.

(This turned out to be very nice in the actual implementation of the compiler. The AST was a tree of objects, connected together in both directions -- parents knew their children; children also had links to their parents. If the optimiser changed the else-part of an if-node N with something like this (set! (if-node:else n) new-child) which was really ((setter if-node:else) n new-child) the if-node:else's SETTER method did a lot of work for you -- it disconnected the old child, installed NEW-CHILD as N's else field, and set NEW-CHILD's parent field to be N. So you could never forget to keep all the links consistent; it was all handled for you just by the single SET! assignment.) " -- [54]
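The setter idea in the quote above can be sketched in Python (all names here are invented for illustration; this is not T's actual machinery): an accessor function carries its own setter, so the equivalent of (set! (car x) y) dispatches to ((setter car) x y):

```python
# Sketch of T-style accessors that carry their own SETTER method.

def defaccessor(getter, setter):
    getter.setter = setter              # attach the SETTER "method"
    return getter

class Pair:
    def __init__(self, car, cdr):
        self._car, self._cdr = car, cdr

# car knows how to read a field AND how to write it:
car = defaccessor(lambda p: p._car,
                  lambda p, v: setattr(p, "_car", v))

def set_bang(accessor, obj, value):
    """Like T's (set! (accessor obj) value) == ((setter accessor) obj value)."""
    accessor.setter(obj, value)

p = Pair(1, 2)
set_bang(car, p, 99)
assert car(p) == 99
```

The point of the design, as the quote says, is that any user-defined accessor can be given a setter method and then used on the left-hand side of an assignment.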



Wasp Lisp


"A Clojure inspired Lisp implemented in less than 1024 bytes of JavaScript?...with JSON source, macros, TCO, interop and exception handling."


R5RS Scheme.


Scheme 79

Scheme-79: a LISP chip.

Items in memory were 32-bit tagged elements: 24 data bits, 7 type bits, and 1 garbage-collection bit. The 24 data bits could be 'immediates' or pointers.
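A sketch of that word layout in Python bit arithmetic (the ordering of the GC bit, type bits, and data bits within the word is an assumption for illustration; the Scheme-79 paper specifies the actual layout):

```python
# Illustrative packing of a 32-bit tagged word: 1 GC bit, 7 type bits,
# 24 data bits (bit positions assumed, not taken from the Scheme-79 paper).

DATA_BITS, TYPE_BITS = 24, 7

def pack(gc, typ, data):
    assert typ < (1 << TYPE_BITS) and data < (1 << DATA_BITS)
    return (gc << (TYPE_BITS + DATA_BITS)) | (typ << DATA_BITS) | data

def unpack(word):
    data = word & ((1 << DATA_BITS) - 1)
    typ = (word >> DATA_BITS) & ((1 << TYPE_BITS) - 1)
    gc = word >> (TYPE_BITS + DATA_BITS)
    return gc, typ, data

w = pack(1, 5, 0x00ABCD)
assert unpack(w) == (1, 5, 0x00ABCD)
assert w < (1 << 32)   # fits in 32 bits
```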

Norvig's Python toy Lisp implementation

Tiddlylisp toy

Based on Norvig's Python toy Lisp implementation.


A lisp that compiles to Python bytecode.



" femtolisp is about 150kb, is very self-contained, and has the following features: is fast, ranking among the fastest non-native-compiled Scheme implementations. It achieves this level of speed even though many primitives (e.g. filter and for-each) are written in the language instead of C. femtolisp uses a bytecode compiler and VM, with the compiler written in femtolisp. Bytecode is first-class, can be printed and read, and is "human readable" (the representation is a string of normal low-ASCII characters).

femtolisp is a simple, elegant Scheme dialect. It is a lisp-1 with lexical scope. The core is 12 builtin special forms and 33 builtin functions. ... too often I see people omit some of the obscure but critical features that make lisp uniquely wonderful. These include read macros like #. and backreferences, gensyms, and properly escaped symbol names ... Another design goal is to avoid spurious novelties. Many others offering their own "shiny new" lisp dialects get carried away and change anything that strikes their fancy. These changes have no effect except incompatibility, and often make the language worse because the new design was not as carefully thought out and has not stood the test of time. For example, how does it help to remove backquote? One design changes the syntax of quote. Some systems disallow dotted lists. ... I would like to refute the idea that...tail call optimization...makes interpreters slow. Look at the "tiny" subdirectory or the "interpreter" branch to see a pure s-expr interpreter with efficient TCO. All you have to do is keep track of whether you're in tail position, which can be done very cheaply. These interpreters are difficult to beat for speed, yet they have lexical scope and TCO. " [56]
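The "keep track of whether you're in tail position" trick can be sketched with a toy s-expression evaluator in Python (in the style of Norvig's toy Lisp, not femtolisp's actual code): tail positions update expr/env and loop instead of recursing, so deep tail recursion doesn't grow the host stack:

```python
# Toy s-expr evaluator with TCO via a loop (illustrative only).
# Expressions are nested Python lists; symbols are strings;
# closures are plain tuples.

def evaluate(expr, env):
    while True:                          # this loop replaces tail recursion
        if isinstance(expr, str):        # symbol lookup
            return env[expr]
        if not isinstance(expr, list):   # self-evaluating literal
            return expr
        head = expr[0]
        if head == "if":
            _, test, conseq, alt = expr
            expr = conseq if evaluate(test, env) else alt
            continue                     # chosen branch is in tail position
        if head == "lambda":
            _, params, body = expr
            return ("closure", params, body, env)
        f = evaluate(head, env)
        args = [evaluate(a, env) for a in expr[1:]]
        if callable(f):                  # builtin
            return f(*args)
        _, params, body, cenv = f        # user closure: call in tail position
        env = dict(cenv, **dict(zip(params, args)))
        expr = body                      # loop instead of recursing

env = {"<": lambda a, b: a < b, "+": lambda a, b: a + b}
# A countdown that makes 100000 tail calls without growing the host stack:
env["count"] = evaluate(
    ["lambda", ["n"], ["if", ["<", "n", 1], 0, ["count", ["+", "n", -1]]]], env)
assert evaluate(["count", 100000], env) == 0
```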

Used in the Julia language's implementation to write the parser.

The "33 builtin functions" are listed near the top of [57], and are:

[58] also lists the following; perhaps that page is out of date and these are no longer primitive:

and [59] also lists the following 'nonstandard builtin functions' in addition to the above:

According to [60], the 12 builtin special forms are:

[61] notes that the following of these could be derived from the others: cond, and, or, for, prog1

The page [62] also contains a listing of standard library functions.

In various files in the 'tiny' directory (which I think contains one or more forks of the main code?), the following 10 special forms are named:

these files also contain the following list of 23 builtin functions (different from the previous list):

one of the files in the tiny directory (tiny/lisp2.c) contains a few extra builtin functions:

The file [63] begins with a table of instructions including:


Other misc uses of Lisp uses sexprs


Scheme48, Lisp Machine Lisp, EuLisp, and Kawa.



"hygiene: The principle that macros adopt the lexical context of their definition site (which is known in advance to the macro writer) rather than their calling site (which is not). This eliminates the risk of name collisions between identifiers introduced by the macro and identifiers at the calling site. (A problem also known as identifier capture.) One can override hygiene by using an unhygienic identifier. See also the hygiene explainer." [64]

Lists or pairs?

When reading about lists, you sometimes get the idea that everything is built out of (nested) lists of atoms; but then other times you get the idea that everything is built out of nested cons cells, that is, binary trees with atom leaf nodes, or equivalently 'pairs', or equivalently '(nested) lists of length 2', or equivalently '2-ary nested lists'.

It's actually nested cons cells (pairs, i.e. "lists" of fixed length 2); n-ary lists are represented using a terminator, NIL, and e.g. (1 2 3 4) is represented as the nested pairs (1 . (2 . (3 . (4 . NIL)))).
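A Python sketch of this representation, using 2-tuples for pairs and None for NIL (an analogy, not a real Lisp's storage layout):

```python
# The n-ary list (1 2 3 4) as nested pairs (1 . (2 . (3 . (4 . NIL)))).

NIL = None

def to_pairs(items):
    """Build the cons-cell representation of a flat list."""
    result = NIL
    for x in reversed(items):
        result = (x, result)   # cons x onto the front
    return result

def to_list(pair):
    """Flatten nested pairs back into a Python list."""
    out = []
    while pair is not NIL:
        head, pair = pair
        out.append(head)
    return out

p = to_pairs([1, 2, 3, 4])
assert p == (1, (2, (3, (4, None))))
assert to_list(p) == [1, 2, 3, 4]
```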

See also Elisp manual: Lists and Cons Cells.

Lisp-1 vs Lisp-2