----------------
The wait Programming Language Specification. Version 1.0.

Each implementation must choose some Turing-complete programming language as a "reference language", and some function mapping the natural numbers to the programs in the reference language. Each program must be produced by at least one natural number. The reverse need not be the case, but it seems desirable that a reasonable percentage of smallish numbers translate to programs. (Machine languages will work reasonably well for this purpose, as would Java byte code or compressed brainfuck. High-level languages, translated via ASCII, would work very poorly, but would still be acceptable.)

A valid wait program consists of an empty text file; any character in a wait program constitutes a syntax error, which is fatal and must be reported. (An implementation which allows C-style comments, or even a file containing a single newline character, is not strictly conforming. These are features of wait++.)

When a wait program is compiled or interpreted, the resulting program has the same semantics as the program in the reference language which is the value of the mapping function when applied to the time of compilation or interpretation, expressed in seconds since the beginning of AD 1970, UTC. (This last is commonly known as the time_t format.) The time used must be within the half-second after the command to run the compiler or interpreter is given. When the mapping function has an undefined value, the behaviour of the compiler or interpreter is also undefined. Of course, if it maliciously deletes totally unrelated files, its author is morally reprehensible.

--------------------------
Notes for compiler writers.

Unlike many languages, wait looks vastly easier to compile than to interpret, at least when the machine language of the machine it's run on is used as the reference language, which was the original thought. Some small problems that arise in such a case:

-Leading zeroes don't change what natural number it is, so they shouldn't change what program is produced either. One answer is to ignore this theoretical difficulty; another is to gerrymander the definition of either the reference language or the mapping function so as to legitimize the behaviour of your implementation.

-Practically speaking: what should happen if the last instruction in the time is cut off halfway through, because instructions are of varying lengths? Depending on your memory layout, it might get completed with stray data, or you might get some kind of memory protection fault. Which one happens is left undefined by the spec, because the sliced instruction means the program is not a legitimate machine-language program.

-Is the reference-language program responsible for providing the code for a normal termination? My inclination is to be merciful and have the compiler provide that code right after the time. Of course, if the time contains a jump instruction to some other place, it's on its own.
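
--------------------------
Illustrative sketches (non-normative).

The spec fixes neither the reference language nor the mapping function, so the following is only one possible shape for an interpreter. It assumes brainfuck as the reference language and a mapping that writes the natural number in bijective base 8, reading each digit as one of the eight brainfuck commands; with bijective numeration every brainfuck program is produced by exactly one natural number, which incidentally sidesteps the leading-zero question from the notes above. The file name wait.py and every helper name are made up for the sketch.

    #!/usr/bin/env python3
    # wait.py -- an illustrative, unofficial wait interpreter sketch.
    # Assumptions not fixed by the spec: brainfuck is the reference language,
    # and the mapping function writes the natural number in bijective base 8,
    # reading each digit as one brainfuck command.

    import sys
    import time

    COMMANDS = "><+-.,[]"                 # digit 1 -> '>', ..., digit 8 -> ']'

    def number_to_program(n):
        """Map a natural number to a brainfuck program via bijective base 8."""
        out = []
        while n > 0:
            n, digit = divmod(n - 1, 8)   # digits run 1..8 rather than 0..7
            out.append(COMMANDS[digit])
        return "".join(reversed(out))     # most significant digit first

    def run_brainfuck(program):
        """A small brainfuck interpreter; on EOF, ',' leaves the cell unchanged."""
        # Match the brackets up front.  Unmatched brackets mean the mapped
        # program is not a legitimate brainfuck program, which the spec leaves
        # undefined; this sketch simply stops.
        jumps, stack = {}, []
        for i, c in enumerate(program):
            if c == "[":
                stack.append(i)
            elif c == "]":
                if not stack:
                    return
                j = stack.pop()
                jumps[i], jumps[j] = j, i
        if stack:
            return

        tape, ptr, pc = [0] * 30000, 0, 0
        while pc < len(program):
            c = program[pc]
            if c == ">":
                ptr = (ptr + 1) % len(tape)   # wrap the pointer (a design choice)
            elif c == "<":
                ptr = (ptr - 1) % len(tape)
            elif c == "+":
                tape[ptr] = (tape[ptr] + 1) % 256
            elif c == "-":
                tape[ptr] = (tape[ptr] - 1) % 256
            elif c == ".":
                sys.stdout.write(chr(tape[ptr]))
            elif c == ",":
                ch = sys.stdin.read(1)
                if ch:
                    tape[ptr] = ord(ch) % 256
            elif c == "[" and tape[ptr] == 0:
                pc = jumps[pc]                # jump to the matching ']'
            elif c == "]" and tape[ptr] != 0:
                pc = jumps[pc]                # jump back to the matching '['
            pc += 1

    def main():
        if len(sys.argv) != 2:
            sys.exit("usage: wait.py program.wait")
        # Capture the time at once, so it falls within the required half-second.
        now = int(time.time())                # seconds since 1970-01-01 UTC (time_t)
        with open(sys.argv[1], "rb") as f:
            source = f.read()
        if source:                            # any character at all is fatal
            sys.exit("syntax error: wait programs must be empty")
        run_brainfuck(number_to_program(now))

    if __name__ == "__main__":
        main()

Running this on an empty file twice, a second apart, will almost always hand two different programs to the brainfuck interpreter; at current time_t values they are around ten or eleven commands long, and most of them either contain an unmatched bracket (so the sketch stops) or print nothing, so the usual observable behaviour is silence.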
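
The "merciful compiler" idea from the last note can be sketched the same way. The fragment below assumes x86-64 Linux and little-endian time bytes: the eight bytes of the compilation time are taken verbatim as the code image, and the standard exit(0) syscall stub is appended immediately after them. Wrapping the image in a real executable format (ELF headers and so on) is left out; the sketch only writes the raw bytes.

    # makeimage.py -- illustrative only; the names are made up for the sketch.
    # Assumptions: x86-64 Linux, little-endian time bytes, bare code image.
    import sys
    import time

    # exit(0) on x86-64 Linux: mov eax, 60 ; xor edi, edi ; syscall
    EXIT_STUB = bytes.fromhex("b83c00000031ff0f05")

    def compile_wait(source_path, out_path):
        with open(source_path, "rb") as f:
            if f.read():                      # any character is a fatal syntax error
                raise SystemExit("syntax error: wait programs must be empty")
        now = int(time.time())                # time_t at the moment of compilation
        code = now.to_bytes(8, "little")      # the time's bytes are the program text
        with open(out_path, "wb") as f:
            f.write(code + EXIT_STUB)         # termination code right after the time

    if __name__ == "__main__":
        if len(sys.argv) != 3:
            raise SystemExit("usage: makeimage.py program.wait program.bin")
        compile_wait(sys.argv[1], sys.argv[2])

One side effect of appending the stub is that a final instruction sliced in half gets completed with stub bytes rather than with stray data; whether that counts as mercy is left to the implementer, since the spec leaves the sliced case undefined either way.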