A token is a piece of text, typically a word or punctuation mark, identified by a lexical analyzer (lexer) and passed on to a parser.
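As an illustration, a minimal regex-based tokenizer (a sketch, not a full lexer) can split source text into word and punctuation tokens like so:

```python
import re

# Match either a run of word characters or a single
# non-word, non-space character (punctuation/operator).
TOKEN_RE = re.compile(r"\w+|[^\w\s]")

def tokenize(text):
    """Split text into a flat list of word and punctuation tokens."""
    return TOKEN_RE.findall(text)

print(tokenize("x = y + 1;"))
# → ['x', '=', 'y', '+', '1', ';']
```

A real lexer would additionally classify each token (identifier, operator, literal, and so on) before handing it to the parser.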

Older interpreters would tokenize the program, translating the programmer's instructions into bytecodes that were more easily processed by a virtual machine (VM). Doing so gained a modest increase in execution speed.
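This technique is still in use today: CPython, for instance, compiles source code to bytecode before its VM executes it, and the standard library exposes this step directly. A small sketch using the built-in `compile()` and the `dis` module:

```python
import dis

# compile() turns source text into a code object; its co_code
# attribute holds the raw bytecode the CPython VM will execute.
code = compile("a + b", "<expr>", "eval")

print(type(code.co_code))   # the instruction stream is raw bytes
dis.dis(code)               # print a human-readable disassembly

# The VM can then run the precompiled code object repeatedly
# without re-parsing the source text.
print(eval(code, {"a": 1, "b": 2}))
# → 3
```

The exact opcodes printed by `dis.dis` vary between Python versions, but the pipeline — tokenize, compile to bytecode, execute on the VM — is the same one described above.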

Some examples are O-code, P-code, the Z-machine, and the bytecodes used in various VM systems.