Tokenizer
Turn lexemes into tokens.
Subtopics
- Tokenize-pfcs
- Lexes the bytes of pfcs-string into a list of tokens (see the usage sketch after this list).
- Filter-and-reduce-lexeme-tree-to-subtoken-trees
- Sees through lexeme and token trees to return token subtype trees.
- Check-and-deref-tree-token?
- Checks whether the ABNF tree is a nonleaf for rule "token",
extracting its component tree (identifier, integer, operator, or
separator) if successful; if not, returns a reserrp.
- Check-and-deref-tree-lexeme?
- Checks whether the ABNF tree is a nonleaf for rule "lexeme",
extracting its component tree (token or whitespace) if successful;
if not, returns a reserrp.
- Tokenize-pfcs-bytes
- Lexes the bytes of a pfcs system into a list of tokens.
- Is-tree-rulename?
- True if tree is a nonleaf for rule rulename-string.
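
A minimal usage sketch in ACL2, for illustration only (not taken from the library's documentation): it assumes the PFCS tokenizer books are loaded, that tokenize-pfcs maps a PFCS source string to a list of token trees, and that lexing and checking failures are reported as values satisfying reserrp. The constant, the function name, and the sample text below are hypothetical.

(defconst *example-pfcs-text*
  "x * y == z") ; illustrative input; assumed, not necessarily well-formed PFCS

(defun example-first-subtoken ()
  (b* (;; Lex the sample text into token trees (or an error value).
       (tokens (tokenize-pfcs *example-pfcs-text*))
       ((when (reserrp tokens)) tokens)
       ((when (endp tokens)) (reserrf :no-tokens))
       ;; Check that the first tree is a nonleaf for rule "token" and
       ;; extract its component tree (identifier, integer, operator,
       ;; or separator).
       (subtoken (check-and-deref-tree-token? (car tokens)))
       ((when (reserrp subtoken)) subtoken))
    subtoken))

On success this sketch yields the subtype tree of the first token; any error value from tokenize-pfcs or from the check is returned unchanged.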