A function that accepts input in its original form and separates it into tokens is called a lexical analyzer. These analyzers are functional: call one with the original input, and it returns a function of zero arguments (with mutable internal state) that returns the next token each time it is called, until none are left. More precisely, each call on the returned function yields a pair (pos,token), where pos indicates the position at which token was found in the input. A position may be any value that can be converted to a string with toString (for printing error messages) and that can be compared, so that positions can be sorted.
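As an illustration, here is a minimal sketch of such an analyzer in Macaulay2. The name wordAnalyzer is hypothetical and not part of the Parsing package; it splits a string on spaces and uses the word's index as its position, signalling exhaustion by returning null.

```
-- hypothetical example, not part of the Parsing package:
-- wordAnalyzer takes a string and returns a function of 0 arguments
-- that yields (pos,token) pairs, one word at a time, with the word's
-- index serving as the position; it returns null when no tokens remain
wordAnalyzer = s -> (
    toks := separate(" ", s);  -- split the input on spaces
    i := -1;                   -- mutable state captured by the closure
    () -> if i < #toks - 1 then (
        i = i + 1;
        (i, toks#i)))
f = wordAnalyzer "parse this input"
f()  -- first (pos,token) pair
f()  -- next pair; eventually null when the input is exhausted
```

The returned closure carries its own copy of i, so several analyzers produced by wordAnalyzer can be consumed independently.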
The object Analyzer is a self-initializing type, with ancestor classes FunctionClosure < Function < Thing.
The source of this document is in /build/reproducible-path/macaulay2-1.25.05+ds/M2/Macaulay2/packages/Parsing.m2:233:0.