Table of Contents for Programming Languages: a survey

Chapter: metaprogramming: hacking the call stack

first-class call stacks
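For a concrete point of comparison: Python exposes the live call stack as first-class (read-only) frame objects via the `inspect` module. A minimal sketch:

```python
import inspect

def callee():
    # Walk the live call stack as first-class frame objects,
    # innermost frame first.
    return [frame.function for frame in inspect.stack()]

def caller():
    return callee()

print(caller()[:3])  # e.g. ['callee', 'caller', '<module>'] when run as a script
```

This is introspection only; full first-class call stacks (as in continuations) also allow the stack to be captured, resumed, and duplicated.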

Chapter: metaprogramming: hacking the ENV

hacking the ENV

ruby's method_missing, respond_to?, instance_eval
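For comparison, Python's `__getattr__` hook plays a role similar to Ruby's `method_missing`: it is invoked only when ordinary attribute lookup fails. A hedged sketch (Python analog, not Ruby itself):

```python
class Proxy:
    """Intercept unknown method calls, akin to Ruby's method_missing."""
    def __getattr__(self, name):
        # Called only when normal attribute lookup fails.
        def missing(*args, **kwargs):
            return f"called {name} with {args}"
        return missing

p = Proxy()
print(p.greet("world"))  # -> called greet with ('world',)
```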

Chapter: metaprogramming: operations


object protocols

Chapter: metaprogramming: hacking classes

Chapter: metaprogramming: syntax

custom operator precedence

syntax rules


Chapter: metaprogramming: eval (todo generalize)


interpreter tower

up and down

Chapter: metaprogramming: misc

by model of computation

the models of computation also suggest extension mechanisms:

turing/imperative: gotos, self-modifying code, mutable state, call stack manipulation (including continuations)

lambda calc/functional: higher-order functions

combinatorial: reductions

grammar? concatenative? logic? mu-recursive? relational?

where do macros come from? grammar? combinatorial? none of these?

by evasion of constraints

see Martin's [1]. Martin defines programming paradigms by constraints, and it seems to me that by evading these constraints (e.g. by using GOTO) we get metaprogrammy stuff.


" There is a reasonable fear that attributes will be used to create language dialects. The recommendation is to use attributes to only control things that do not affect the meaning of a program but might help detect errors (e.g. [[noreturn]]) or help optimizers (e.g. [[carries_dependency]]) " --


todo: put this section elsewhere




The SELL paper also criticizes three other approaches to metaprogramming:

compiler options and pragmas: not flexible enough.

preprocessed languages (distinguished from compilers by the criterion that with a true compiler, you can never get an error from the target language's compiler): (i) the preprocessor toolchain must be maintained/released in lockstep with the host language; (ii) it is difficult to remove undesirable features from the host language; (iii) there is an impedance mismatch with the host language, especially its type system, a symptom of which is often seen in trouble with error detection and reporting.

dialects: (i) difficult to remove undesirable features from the host language; (ii) similar but lesser problems as making a new language, w/r/t the expense of tooling implementation and maintenance, and the network effects of users.



" Wyvern: at CMU, Jonathan Aldrich's group has been working on the Wyvern language, which incorporates a novel language feature called Type-Specific Languages (TSLs) developed by Cyrus Omar. TSLs are a means of extending the syntax of Wyvern by defining parsers as a first-class member of the language and allowing users to create their own mini-languages inside of Wyvern: This is similar to procedural macros in Rust which are essentially functions from TokenStream → AST that define an alternate syntax for the language. TSLs, however, are notable in that they are unambiguous and composable. The general idea is that a TSL returns a value of a known type, and the compiler can do type inference to determine the expected type of a value, so it can resolve ambiguity between languages based on their types. Additionally, the TSLs can be used within each other. When you define a new TSL, you get interoperability with every other TSL defined in Wyvern for free! Imagine if you could freely mix C, OCaml, Javascript and HTML with type-safe interop. This style of composition is the future of front-end, or syntax level, interop.

Figure 3: HTML templating with SQL and CSS mixed in using Wyvern.

" --
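The type-directed disambiguation idea can be caricatured in a few lines: keep a table from expected result type to parser, and let the expected type select which mini-language interprets a literal. A toy sketch (the names `parsers` and `parse_literal` are invented here; nothing about this is Wyvern-specific):

```python
# Toy model of type-directed literal parsing: the expected type picks the parser.
parsers = {
    int: lambda s: int(s, 0),        # numeric mini-language (base-prefixed literals)
    list: lambda s: s.split(","),    # comma-separated-list mini-language
}

def parse_literal(text, expected_type):
    # The "compiler" resolves ambiguity between mini-languages
    # using the type expected at the use site.
    return parsers[expected_type](text)

print(parse_literal("0x10", int))    # 16
print(parse_literal("a,b,c", list))  # ['a', 'b', 'c']
```

The real mechanism is far richer (parsers are first-class, composable, and run at compile time), but the dispatch-on-expected-type core is the same shape.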


3-Lisp augments Lisp with 'reflective procedures', which can be thought of as 'the hook to end all hooks' -- they are called from the context of the interpreter, and are passed their own arguments, the environment, and finally a continuation. They are supposed to compute their result, and then return it by calling the continuation that they were passed (with the result as argument). They have access to interpreter functions 'normalize' and 'reduce' as well as 'set' (or 'rebind'), and also 'up' and 'down'; 'down' is similar to unquote and can be used to refer to the 'program level' interpretation of a term, rather than the 'interpreter level' interpretation, and 'up' is similar to quoting (todo: not sure if i got that right, especially with 'up' and 'down').

Note that 3-Lisp executes the program as if it were at the highest level of an infinite tower of interpreters; e.g. you can have a 2nd-order reflective procedure, etc.

Note that in 3-Lisp fexprs can be implemented because a reflective procedure has control over when, if at all, each of its arguments is evaluated.
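The shape of this mechanism can be sketched in a toy CPS evaluator: a "reflective" procedure receives its unevaluated operands, the environment, and the continuation, which is exactly what makes fexpr-like control over evaluation possible. This is an illustrative toy (all names invented), not 3-Lisp syntax or semantics:

```python
# Toy CPS evaluator: "reflective" procedures receive their unevaluated
# operands, the environment, and the continuation, in the spirit of 3-Lisp.

def evaluate(expr, env, k):
    if isinstance(expr, str):          # variable reference
        return k(env[expr])
    if not isinstance(expr, list):     # literal
        return k(expr)
    op = env[expr[0]]
    if getattr(op, 'reflective', False):
        # Hand the operator its raw operands, the env, and the continuation.
        return op(expr[1:], env, k)
    # Ordinary application (unary, for brevity): evaluate operand, then apply.
    return evaluate(expr[1], env, lambda v: k(op(v)))

def reflective(f):
    f.reflective = True
    return f

@reflective
def if_(operands, env, k):
    # A fexpr-style 'if': only the taken branch is ever evaluated.
    test, then, alt = operands
    return evaluate(test, env, lambda t: evaluate(then if t else alt, env, k))

env = {'if': if_, 'not': lambda v: not v, 'x': True}
print(evaluate(['if', ['not', 'x'], 1, 2], env, lambda v: v))  # 2
```

Unlike this toy, a 3-Lisp reflective procedure runs one level up a (conceptually infinite) interpreter tower, with 'up' and 'down' mediating between levels.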

"(Apparently, 3-Lisp is quite similar to Kernel: every operative receives not only its operand tree and the lexical environment in which it is called, but also its continuation. I still don't understand it. Why would you pass the continuation to an operative, when it can easily obtain it using e.g. call/cc? Apparently because 3-Lisp considers the continuation to exist on the next meta-level, not the current one.)" --


deep vs. shallow DSL embedding:

"As far as I'm aware, while you can use the strict definition (is there, in the metalanguage, the creation of an AST?) they're often discussed as more of a continuum. HOAS is a great example. There is the creation of an AST, but perhaps the most important and tricky part of any AST, the binding system, is left to the metalanguage. For that reason exactly I'm happy to say that HOAS is "shallower" than, say, a de Bruijn indexed binding system. "
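The continuum can be seen in miniature: a HOAS embedding reuses the metalanguage's lambdas for binding, while a de Bruijn embedding makes binding part of the object-language AST, which the embedding must then implement itself. A sketch (all names invented):

```python
# Shallow ("HOAS") end: the metalanguage's lambdas do the binding for us.
def hoas_lam(f):
    return ('lam', f)          # f is an ordinary Python function

def hoas_apply(fn, arg):
    return fn[1](arg)

identity = hoas_lam(lambda x: x)
print(hoas_apply(identity, 42))               # 42

# Deeper end: explicit de Bruijn indices; binding is now our job.
# ('var', n) refers to the n-th enclosing binder.
def eval_db(term, env):
    if not isinstance(term, tuple):           # literal
        return term
    tag = term[0]
    if tag == 'var':
        return env[term[1]]
    if tag == 'lam':
        return lambda arg: eval_db(term[1], [arg] + env)
    if tag == 'app':
        return eval_db(term[1], env)(eval_db(term[2], env))

identity_db = ('lam', ('var', 0))
print(eval_db(('app', identity_db, 42), []))  # 42
```

The de Bruijn version exposes the binding structure as inspectable data (you can analyze or transform terms under binders), which is exactly what the HOAS version gives up in exchange for simplicity.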


macros like C's __FILE__ and __LINE__
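In C these are preprocessor macros expanded textually at the use site. Python has no macro system, but a runtime analog can be sketched by inspecting the caller's frame (`where` is an invented helper, not a standard function):

```python
import inspect

def where():
    # Report the *caller's* file and line -- roughly what __FILE__ and
    # __LINE__ would expand to at the call site in C, but computed at
    # runtime rather than substituted by a preprocessor.
    frame = inspect.stack()[1]
    return frame.filename, frame.lineno

filename, lineno = where()
print(filename, lineno)
```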