Wiktionary
tokenizer
n. (computing) A system that parses an input stream into its component tokens.
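An illustrative sketch of the definition in Python (the token categories and regex here are invented for this example, not part of the entry): a function that parses an input string into its component tokens.

```python
import re

# Hypothetical token grammar for a tiny arithmetic language:
# integers, identifiers, and single-character operators/parentheses.
TOKEN_RE = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|([+\-*/()]))")

def tokenize(text):
    """Parse an input stream (a string) into its component tokens."""
    tokens = []
    pos = 0
    while pos < len(text):
        match = TOKEN_RE.match(text, pos)
        if not match:
            raise ValueError(f"unexpected character at position {pos}")
        number, name, op = match.groups()
        if number:
            tokens.append(("NUMBER", number))
        elif name:
            tokens.append(("NAME", name))
        else:
            tokens.append(("OP", op))
        pos = match.end()
    return tokens

# tokenize("x + 42") yields NAME, OP, and NUMBER tokens in order.
```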