<programming> (Or "linear analysis", "scanning") The first stage of processing a language. The stream of characters making up the source program or other input is read one character at a time and grouped into lexemes (or "tokens") - word-like pieces such as keywords, identifiers, literals and punctuation. The lexemes are then passed to the parser.
["Compilers - Principles, Techniques and Tools", by Alfred V. Aho, Ravi Sethi and Jeffrey D. Ullman, pp. 4-5]
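The grouping described above can be sketched with a minimal lexer. This is an illustrative example, not from the cited text; the token set (NUMBER, IDENT, PUNCT) and the keyword list are hypothetical choices for the sketch.

```python
import re

# Hypothetical token classes for the sketch, each with a regular
# expression describing its lexemes.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),          # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"), # identifiers (and keywords)
    ("PUNCT",  r"[=+\-*/();]"),  # punctuation and operators
    ("SKIP",   r"\s+"),          # whitespace, discarded
]
KEYWORDS = {"if", "while", "return"}  # assumed keyword set
MASTER = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

def lex(source):
    """Read the character stream and group it into (kind, lexeme) tokens."""
    tokens = []
    for m in MASTER.finditer(source):
        kind, lexeme = m.lastgroup, m.group()
        if kind == "SKIP":
            continue  # whitespace separates lexemes but is not one
        if kind == "IDENT" and lexeme in KEYWORDS:
            kind = "KEYWORD"  # reclassify reserved words
        tokens.append((kind, lexeme))
    return tokens
```

For example, `lex("if x1 = 42;")` yields `[("KEYWORD", "if"), ("IDENT", "x1"), ("PUNCT", "="), ("NUMBER", "42"), ("PUNCT", ";")]` - the token stream that would then be passed to the parser.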
Last updated: 1995-04-05