krsnik02/regex-lexer


regex-lexer


A regex-based lexer (tokenizer) in Rust.

Basic Usage

```rust
enum Tok {
    Num,
    // ...
}

let lexer = regex_lexer::LexerBuilder::new()
    .token(r"[0-9]+", Tok::Num)
    .ignore(r"\s+") // skip whitespace
    // ...
    .build()?; // building compiles the regexes, which can fail

let tokens = lexer.tokens(/* source */);
```
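To illustrate the mechanics behind a lexer like this (this is not the crate's implementation, just a std-only sketch), the loop below repeatedly matches a "pattern" at the current position, emits the corresponding token, and silently skips whitespace, mirroring `.ignore(r"\s+")`. Simple character-class checks stand in for compiled regexes:

```rust
// Illustrative sketch only: mimics what a regex-based lexer does,
// using plain character checks in place of compiled regexes.
#[derive(Debug, PartialEq)]
enum Tok {
    Num(String),
    Ident(String),
}

// Scan `source` left to right; at each position take the longest run
// matching a pattern (digits -> Num, letters -> Ident), and skip
// whitespace without emitting anything.
fn tokens(source: &str) -> Vec<Tok> {
    let mut out = Vec::new();
    let mut rest = source;
    while let Some(c) = rest.chars().next() {
        if c.is_whitespace() {
            rest = rest.trim_start();
        } else if c.is_ascii_digit() {
            let end = rest.find(|c: char| !c.is_ascii_digit()).unwrap_or(rest.len());
            out.push(Tok::Num(rest[..end].to_string()));
            rest = &rest[end..];
        } else if c.is_alphabetic() {
            let end = rest.find(|c: char| !c.is_alphanumeric()).unwrap_or(rest.len());
            out.push(Tok::Ident(rest[..end].to_string()));
            rest = &rest[end..];
        } else {
            panic!("unexpected character: {c:?}");
        }
    }
    out
}

fn main() {
    let toks = tokens("foo 42 bar7");
    println!("{toks:?}");
    // [Ident("foo"), Num("42"), Ident("bar7")]
}
```

The real crate generalizes this by compiling each `.token(...)` pattern into a regex and resolving overlaps by match length and registration order.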

License

Licensed under either of

- Apache License, Version 2.0 (LICENSE-APACHE)
- MIT license (LICENSE-MIT)

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
