
Commit

Codegen for stateful lexer.
This uses a fairly naive recursive solution, but even with that caveat it gives
significant gains: roughly 10x performance, with constant GC overhead per
tokenisation call. The latter is possible because the tokens return
slices into the input substring.
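The zero-copy claim above can be sketched in a few lines. This is a simplified stand-in, not participle's actual `Token` type: the point is only that slicing a Go string shares the backing bytes with the input, so emitting a token allocates nothing for the token text itself.

```go
package main

import "fmt"

// Token is a hypothetical, simplified token type; the real lexer's
// token carries more fields (type, position, etc.).
type Token struct {
	Value string // aliases the original input, not a copy
}

// lexWord returns a token whose Value is input[start:end]. Go string
// slicing shares the underlying bytes, so no per-token copy is made.
func lexWord(input string, start, end int) Token {
	return Token{Value: input[start:end]}
}

func main() {
	input := "hello world"
	tok := lexWord(input, 0, 5)
	fmt.Println(tok.Value) // hello
}
```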
alecthomas committed Oct 9, 2020
1 parent 37cfcfa commit e2b420f
Showing 5 changed files with 958 additions and 27 deletions.
2 changes: 1 addition & 1 deletion lexer/lexer.go
@@ -63,7 +63,7 @@ func Must(def Definition, err error) Definition {

 // ConsumeAll reads all tokens from a Lexer.
 func ConsumeAll(lexer Lexer) ([]Token, error) {
-	tokens := []Token{}
+	tokens := make([]Token, 0, 1024)
 	for {
 		token, err := lexer.Next()
 		if err != nil {
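The one-line change in the diff is a standard pre-allocation pattern: giving the slice an initial capacity means the first 1024 appends reuse a single allocation instead of repeatedly growing and copying the backing array. A minimal sketch of the effect (the 1024 figure is from the diff; the rest is illustrative):

```go
package main

import "fmt"

func main() {
	// Without a capacity hint, append must reallocate and copy the
	// backing array each time it outgrows the current capacity.
	grown := []int{}
	for i := 0; i < 2000; i++ {
		grown = append(grown, i)
	}

	// With make([]T, 0, 1024), appends up to 1024 elements reuse the
	// one initial allocation; growth only resumes past that capacity.
	hinted := make([]int, 0, 1024)
	for i := 0; i < 1000; i++ {
		hinted = append(hinted, i)
	}

	fmt.Println(len(grown), len(hinted), cap(hinted)) // 2000 1000 1024
}
```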
