
Consider using Zstandard and/or LZ4 instead of Deflate #39

Open
richgel999 opened this issue Nov 30, 2021 · 10 comments

@richgel999

richgel999 commented Nov 30, 2021

One of the issues we have with .PNG is slow read/write times. There are now newer lossless, open-source codecs without patent concerns, such as Zstandard (maintained by Facebook) and LZ4:

https://facebook.github.io/zstd/
https://github.com/lz4/lz4

Zstandard is used by the new Khronos KTX2 GPU texture format specification. I propose that it be added as an option to a future version of .PNG. The possible speedups are quite significant, and for users that read and write a lot of .PNGs as part of their data processing pipelines the speedups will be high-value improvements.
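For context, the pipeline this proposal would change is a per-scanline filter pass followed by one Deflate call over the whole stream. A minimal sketch, using zlib from the Python standard library for Deflate; a Zstandard-based PNG would, under this proposal, replace only the final compression call (the helper names here are illustrative, not from any PNG library):

```python
import zlib

def filter_sub(row: bytes, bpp: int) -> bytes:
    """PNG 'Sub' filter (type 1): each byte minus the byte bpp positions left."""
    out = bytearray(row)
    for i in range(bpp, len(row)):
        out[i] = (row[i] - row[i - bpp]) & 0xFF
    return bytes(out)

def png_style_compress(rows, bpp=3, level=6):
    """Serialize rows as PNG does: one filter-type byte, then the filtered row.
    A Zstandard-based PNG would swap only the zlib.compress call below."""
    raw = b"".join(b"\x01" + filter_sub(r, bpp) for r in rows)  # 0x01 = Sub
    return zlib.compress(raw, level)

rows = [bytes(range(30))] * 4       # toy 10x4 RGB image, smooth gradient
blob = png_style_compress(rows)
assert zlib.decompress(blob)        # round-trips through Deflate
```

The point of the sketch is that the filtering stage and the entropy-coding stage are independent, which is why the format could in principle keep its filters and change only the compressor.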

There are also far simpler but even faster codecs being developed, such as QOI, though using one would likely require changing the filtering, or not filtering the image at all, before compression:
https://news.ycombinator.com/item?id=29328750

@svgeesus
Contributor

That is an interesting possibility, but it would be a significant work item, as it would be completely incompatible with all existing image creation and display software.

> I propose that it be added as an option to a future version of .PNG. The possible speedups are quite significant, and for users that read and write a lot of .PNGs as part of their data processing pipelines the speedups will be high-value improvements.

Do you have any data on that speedup, for PNG?

@randy408

randy408 commented Dec 1, 2021

The existing file format does allow for additional compression/filter methods and new ones could be added to a PNG 2.0 standard.

The situation would be similar to JPEG XL, where JPEG-only decoders cannot read .jxl files but the two can be transcoded back and forth without loss of information. Websites serve the older format if the browser cannot read the new one.

> There are also far simpler but even faster codecs being developed, such as QOI, though using one would likely require changing the filtering, or not filtering the image at all, before compression:

There is a 'None' filter type, but you're still left with an extra leading 0 byte on each row. If that's a problem, a new 'null' filter method could do away with the filter byte entirely.
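The two layouts described above can be sketched as follows; the byte-free 'null' filter method is the hypothetical one proposed in this comment, not part of any PNG specification:

```python
def serialize_filter_none(rows):
    """Current PNG filter method 0: every scanline carries a filter-type
    byte, 0x00 meaning 'None' (pixels stored unmodified)."""
    return b"".join(b"\x00" + r for r in rows)

def serialize_null_method(rows):
    """Hypothetical new filter method (name assumed): no per-row byte,
    leaving untouched pixel data for codecs like QOI."""
    return b"".join(rows)

rows = [b"\x10\x20\x30"] * 2
assert serialize_filter_none(rows) == b"\x00\x10\x20\x30\x00\x10\x20\x30"
assert serialize_null_method(rows) == b"\x10\x20\x30\x10\x20\x30"
```

The overhead being discussed is exactly one byte per scanline, which matters most for narrow images with many rows.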

@vrubleg

vrubleg commented Dec 1, 2021

Brotli is also a popular choice: https://github.com/google/brotli
It is used for WOFF2 fonts and already supported as Content-Encoding in modern browsers.

It is also worth considering making this extension available as a new Content-Encoding, similar to jxl for JPEG. A browser would then receive a recompressed file, but to the user it would look like a standard PNG. A bit more information would need to be stored to make it possible to restore the original PNG (as JPEG XL does for recompressed JPEG files).
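A minimal sketch of that server-side negotiation, assuming a hypothetical `zstd-png` Content-Encoding token (not a registered encoding; the registered ones today include `gzip`, `br`, and `zstd`):

```python
def pick_encoding(accept_encoding: str) -> str:
    """Serve the recompressed representation only when the client
    advertises support in Accept-Encoding, mirroring how br/zstd
    negotiation works today. 'zstd-png' is a made-up token."""
    offered = {tok.strip().split(";")[0] for tok in accept_encoding.split(",")}
    return "zstd-png" if "zstd-png" in offered else "identity"

assert pick_encoding("gzip, br, zstd-png") == "zstd-png"
assert pick_encoding("gzip, br") == "identity"
```

Clients that don't understand the token fall through to the untouched PNG bytes, which is what makes this path backwards compatible.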

@birdie-github

Zstandard/ZSTD looks like a better choice since it is already supported by Chrome and Firefox and features extremely fast decompression; in fact, it is faster than the gzip that has been used for decades.

@wisp3rwind

Here's a datapoint on the potential performance benefit: https://github.com/catid/Zpng

@ProgramMax
Collaborator

I met with @bitbank2, who seemed interested in this (including adding less powerful compression algorithms). Roping them in here.

@birdie-github

> including adding less powerful compression algorithms

ZSTD officially has 22 compression levels, and its low presets are both faster and stronger than Deflate. In my experience, ZSTD is the most versatile compressor (not the best in terms of compression ratio, of course), and almost nothing beats its decompression speed, which is paramount for image access.

Please first make sure you've compared whatever you had in mind with ZSTD at its various compression levels. Also, version 1.5.7 was released a couple of days ago and is even faster: https://github.com/facebook/zstd/releases/tag/v1.5.7
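The level/size tradeoff described here can be illustrated with zlib, which ships in the Python standard library and exposes the same kind of knob (levels 1-9, versus zstd's 22; the third-party `zstandard` binding would be the real comparison target):

```python
import zlib

# Repetitive sample data; real image payloads vary, this only shows the knob.
data = (b"the quick brown fox jumps over the lazy dog. ") * 200

# Higher levels spend more CPU searching for matches in exchange for
# equal-or-smaller output; decompression speed is largely unaffected.
sizes = {level: len(zlib.compress(data, level)) for level in (1, 6, 9)}
assert sizes[9] <= sizes[1]
```

Any fair benchmark against zstd should sweep its levels the same way, since "zstd at level 1" and "zstd at level 19" are effectively different speed/ratio operating points.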

@vrubleg

vrubleg commented Feb 21, 2025

PNG with any new compression algorithm would effectively be a new, incompatible format. It should probably use a different name and MIME type (e.g. PNG2 and image/png2), just to make it distinguishable from standard PNG. It should also be better than the already existing and supported WebP lossless; otherwise there won't be enough reason to support the new PNG.

@ProgramMax
Collaborator

We discussed this in today's meeting.
We're open to the idea, but we need to see user interest: an implementation being used either widely or by a major player.

That might already exist.
I know of several implementations, including one linked above.
So this is a call to action for you all: Show us it being used [widely/by a major player].

@palemieux
Contributor

If the objective is to increase decode/encode throughput at the expense of compatibility with existing decoders, then the field of candidate codecs should be broadened. Below is a throughput/compression-ratio analysis across lossless image codecs:

https://www.lossless-benchmarks.com/
