Source Maps performance #18

Open
Jarred-Sumner opened this issue Sep 28, 2022 · 3 comments

Comments

@Jarred-Sumner commented Sep 28, 2022

Have there been discussions about the performance of source maps for (1) tools generating sourcemaps and (2) tools consuming sourcemaps?

I think there is a lot of room to make source maps faster.

Two specific ideas:

  • A binary format: JSON is nice for readability, but it means that source text must be escaped and then unescaped. It also means that VLQ must be base64-encoded (and later decoded). A binary format could fix both of those things. It would be smaller, both when sending over the network and in memory for tools reading the source map. Smaller source maps mean browser devtools load faster and use less memory.

  • SIMD-friendly alternative to VLQ: Something like Stream VByte: Faster Byte-Oriented Integer Compression (github) should make it faster for tooling to generate and parse mappings (see the sketch just below this list).
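
To make the decoding-cost point concrete, here is a minimal sketch (my own illustration, not a proposed format): today's base64 VLQ decode does a text-to-digit table lookup for every 5 payload bits, while a hypothetical byte-oriented varint gets 7 payload bits per byte with no lookup. The function names (decodeBase64Vlq, decodeVarint) are illustrative only.

```ts
// Minimal sketch (illustrative only, not a spec): decoding one field from
// today's base64 VLQ versus from a hypothetical byte-oriented varint.

const BASE64 =
  "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
const CHAR_TO_DIGIT = new Map<number, number>();
for (let i = 0; i < BASE64.length; i++) CHAR_TO_DIGIT.set(BASE64.charCodeAt(i), i);

// Today: each base64 char yields only 5 payload bits and needs a table lookup.
function decodeBase64Vlq(mappings: string, pos: number): { value: number; pos: number } {
  let result = 0;
  let shift = 0;
  let digit: number;
  do {
    digit = CHAR_TO_DIGIT.get(mappings.charCodeAt(pos++))!;
    result |= (digit & 0b011111) << shift; // low 5 bits are payload
    shift += 5;
  } while (digit & 0b100000); // bit 5 is the continuation flag
  const negative = result & 1; // lowest payload bit is the sign
  result >>>= 1;
  return { value: negative ? -result : result, pos };
}

// Hypothetical binary alternative: LEB128-style varints over raw bytes give
// 7 payload bits per byte and skip the text lookup and base64 round trip.
function decodeVarint(bytes: Uint8Array, pos: number): { value: number; pos: number } {
  let result = 0;
  let shift = 0;
  let byte: number;
  do {
    byte = bytes[pos++];
    result |= (byte & 0x7f) << shift;
    shift += 7;
  } while (byte & 0x80);
  // zigzag-style sign handling, analogous to VLQ's low sign bit
  return { value: (result >>> 1) ^ -(result & 1), pos };
}
```

A Stream VByte layout would go further by separating control bytes from data bytes so the control stream can be processed with SIMD, but even a plain varint avoids the escape/unescape and base64 encode/decode steps.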

Bun generates source maps for browsers to consume with bun dev and for bun itself to consume for stack traces (bun transpiles every file). It would be straightforward to experiment with other sourcemap formats internally with bun.

I think the big question with suggestions like these is ecosystem compatibility. I don't have a good answer for that right now. I think it's worth discussing regardless.

@mitsuhiko (Contributor)

I'm somewhat okay with an inefficient format because tools can always convert it into something else before working with it. At Sentry we already convert source maps into our own format before processing to make them work for what we do. That said, I quite welcome the idea of experimentation here on the side.

@jkup (Collaborator) commented Jun 24, 2024

Some other performance ideas that came up in meeting:

  1. The Base64 encoding is not efficient; dropping it would be a win.
  2. It would be great if we could support partial decoding on the consumer side (a sketch of why this is hard today follows below).
  3. @jridgewell has some prior work on switching from relative indexes to a marker concept, which would allow for partial parsing.

It would be great to talk more about performance wins.
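
On point 2, a small sketch of why partial decoding is awkward with the current encoding (the helper name is hypothetical): jumping to one generated line's chunk of `mappings` is cheap, but the VLQ fields in that chunk are deltas relative to every earlier segment, which is exactly what the marker idea in point 3 would address.

```ts
// Sketch only: locating a generated line's chunk of `mappings` is cheap
// because groups are ';'-separated and need no VLQ decoding to find.
function mappingsTextForLine(mappings: string, generatedLine: number): string {
  return mappings.split(";")[generatedLine] ?? "";
}

// ...but the source index, original line/column, and names index inside that
// chunk are encoded as deltas from the last segment of the preceding lines,
// so a correct decode still has to replay everything before it.
```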

@jridgewell (Member)

> @jridgewell has some prior work on switching from relative indexes to a marker concept, which would allow for partial parsing.

My idea here was something like a video's key frames: periodically compute the relative values into absolute values and record the offset index into `mappings`. Using that, we could scan the key frames to find the closest one preceding the requested position and resume decoding from that point in `mappings`.
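
A rough sketch of that idea (the KeyFrame shape and names below are illustrative, not an agreed format): snapshot the absolute decoder state together with its offset into `mappings`, binary-search the snapshots, and resume VLQ decoding from the nearest preceding one.

```ts
// Illustrative sketch of the key-frame idea: each snapshot records the
// absolute decoder state plus where to resume in the `mappings` string.
interface KeyFrame {
  generatedLine: number;   // absolute generated line at this snapshot
  mappingsOffset: number;  // index into `mappings` to resume decoding from
  sourceIndex: number;     // absolute values of the otherwise-relative fields
  originalLine: number;
  originalColumn: number;
  namesIndex: number;
}

// Binary search for the closest key frame at or before the requested line.
// Assumes `frames` is non-empty and frames[0] covers the start of the map.
function findKeyFrame(frames: KeyFrame[], generatedLine: number): KeyFrame {
  let lo = 0;
  let hi = frames.length - 1;
  while (lo < hi) {
    const mid = (lo + hi + 1) >> 1;
    if (frames[mid].generatedLine <= generatedLine) lo = mid;
    else hi = mid - 1;
  }
  return frames[lo];
}

// A consumer would seed its VLQ decoder with the frame's absolute state and
// decode `mappings` from frame.mappingsOffset, stopping past the target.
```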
