Note
This library is not compliant with any Markdown specification such as CommonMark or GitHub Flavored Markdown. If you need spec compliance, please use goldmark instead.
A simple, zero-dependency library that tokenizes raw Markdown and serializes the resulting tokens to JSON. The approach is heavily inspired by Rob Pike's talk on Lexical Scanning in Go. Expect the occasional bug and some messy code; this was a quick project done over a few weekends!
Demo: gomd.mov
The demo above uses naive file watching, so it appears much slower than the library is in practice.
go-markdown can process large Markdown files (2,000+ lines) in under 3.5 ms:
```go
package main

import (
	"fmt"
	"log"
	"os"
	// plus this module's lexer and serializer packages
)

func main() {
	input, err := os.ReadFile("myFile.md")
	if err != nil {
		log.Fatalf("Error reading file: %v", err)
	}

	// Run the lexer in its own goroutine; tokens are streamed over a channel.
	l := lexer.NewLexer(string(input))
	go l.Run()

	var tokens []lexer.Token
	for token := range l.GetTokens() {
		tokens = append(tokens, token)
	}

	// Serialize the collected tokens to JSON.
	tokenSlice := serializer.TokenSlice(tokens)
	tokenJson, err := tokenSlice.ToJson()
	if err != nil {
		log.Fatalf("Error serializing tokens: %v", err)
	}
	fmt.Println(tokenJson)
}
```
Supported elements can be found in lexer/token.go.