Tokenizer
An executable tokenizer of Yul.
This is a simple tokenizer for Yul code. The tokenizer lexes the input into lexemes and then discards the comment and whitespace lexemes, keeping only the tokens.
The primary API functions for tokenizing are
tokenize-yul and tokenize-yul-bytes.
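The lex-then-filter approach described above can be sketched in Python. This is a hedged illustration only, not the ACL2 implementation: the regular expressions, the keyword set, and the token classes (keyword, literal, identifier, symbol) are simplified assumptions rather than the actual Yul grammar rules.

```python
import re

# Every lexeme is matched, but comment and whitespace lexemes are
# discarded, leaving only tokens. The patterns below are illustrative
# assumptions, not the real ABNF grammar of Yul.
LEXEME_PATTERNS = [
    ("whitespace", r"[ \t\r\n]+"),
    ("comment",    r"//[^\n]*|/\*.*?\*/"),
    ("literal",    r"0x[0-9a-fA-F]+|\d+|\"[^\"]*\""),
    ("identifier", r"[A-Za-z_$][A-Za-z0-9_$.]*"),
    ("symbol",     r"->|:=|[{}(),:]"),
]
MASTER = re.compile(
    "|".join(f"(?P<{name}>{pat})" for name, pat in LEXEME_PATTERNS),
    re.DOTALL)
KEYWORDS = {"function", "let", "if", "for", "switch", "case", "default",
            "break", "continue", "leave"}  # assumed subset

def tokenize_yul(source: str):
    """Lex `source` into (kind, text) pairs, dropping comments and whitespace."""
    tokens = []
    pos = 0
    while pos < len(source):
        m = MASTER.match(source, pos)
        if m is None:
            raise ValueError(f"cannot lex at position {pos}")
        kind, text, pos = m.lastgroup, m.group(), m.end()
        if kind in ("whitespace", "comment"):
            continue  # lexed, then discarded
        if kind == "identifier" and text in KEYWORDS:
            kind = "keyword"
        tokens.append((kind, text))
    return tokens
```

For example, `tokenize_yul("let x := 1 // init")` yields `[("keyword", "let"), ("identifier", "x"), ("symbol", ":="), ("literal", "1")]`; the trailing comment is lexed but never appears in the output.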
Subtopics
- Tokenize-yul: Lexes the bytes of yul-string into a list of tokens.
- Filter-and-reduce-lexeme-tree-to-subtoken-trees: Sees through the lexeme and token rules to return a list of keyword, literal, identifier, and symbol trees.
- Check-and-deref-tree-token?: Checks whether the ABNF tree is a nonleaf for rule "token", extracting its component tree (keyword, literal, identifier, or symbol) if successful; if not successful, returns a reserrp.
- Check-and-deref-tree-lexeme?: Checks whether the ABNF tree is a nonleaf for rule "lexeme", extracting its component tree (token, comment, or whitespace) if successful; if not successful, returns a reserrp.
- Tokenize-yul-bytes: Lexes the bytes of a Yul source program into a list of tokens.
- Is-tree-rulename?: True if the tree is a nonleaf for rule rulename-string.
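The check-and-deref operations above follow a common pattern: verify that a tree is a nonleaf for a given rule, then extract its single component subtree, otherwise produce an error. A hedged Python sketch of that pattern, assuming a hypothetical `(rulename, children)` tuple representation that only loosely mirrors the ABNF trees of the ACL2 formalization:

```python
# Hypothetical representation: a nonleaf tree is ("rulename", [children]);
# a leaf is a plain string. In the ACL2 code, failures are returned as
# reserrp error values; here an exception stands in for them.

class ReserrP(Exception):
    """Stand-in for a reserrp error result."""

def is_tree_rulename(tree, rulename):
    """True if `tree` is a nonleaf for rule `rulename`."""
    return isinstance(tree, tuple) and tree[0] == rulename

def check_and_deref_tree(tree, rulename):
    """If `tree` is a nonleaf for `rulename` with exactly one subtree,
    return that subtree; otherwise signal an error."""
    if not is_tree_rulename(tree, rulename):
        raise ReserrP(f"not a {rulename} tree")
    _, children = tree
    if len(children) != 1:
        raise ReserrP(f"malformed {rulename} tree")
    return children[0]

# A "lexeme" tree wraps a "token" tree, which wraps e.g. an "identifier":
lexeme = ("lexeme", [("token", [("identifier", ["x"])])])
token = check_and_deref_tree(lexeme, "lexeme")
inner = check_and_deref_tree(token, "token")  # the "identifier" tree
```

Chaining the two derefs is how the tokenizer sees through the lexeme and token wrapper rules down to the keyword, literal, identifier, or symbol trees it actually collects.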