
Compress (in transport) BQM/CQM/DQM data during multipart upload #532

Open
randomir opened this issue Aug 16, 2022 · 4 comments
@randomir (Member)
Since serialized BQM et al. data is no longer compressed in dimod, look into the benefits of compressing it in transit.

@arcondello (Member) commented Aug 16, 2022

DQM.to_file() supports compress; it would be pretty easy to add it to CQM/BQM if desired. We would need to measure the performance.
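For context, the kind of in-transit compression being discussed can be sketched with the standard library alone. This is a minimal illustration using zlib on a synthetic byte payload, not the dimod API itself; real BQM/CQM/DQM bytes would come from the model's to_file() stream.

```python
import zlib

# Synthetic stand-in for a serialized model payload; serialized biases
# tend to contain a lot of redundancy, so they compress well.
payload = b"linear biases and quadratic biases repeat a lot " * 2000

compressed = zlib.compress(payload, level=6)
print(f"raw: {len(payload)} bytes, compressed: {len(compressed)} bytes")

# Compression must be lossless: the receiving end of the upload
# should recover the exact original bytes.
assert zlib.decompress(compressed) == payload
```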

@randomir (Member, Author)

That would be nice. If all of them supported a compress flag, we could use it by default from the client.

@arcondello (Member)
Sure, made an issue: dwavesystems/dimod#1235.
I do wonder what the memory/time cost of compressing the overall model, rather than the individual parts, would be in a multipart upload. Compressing the overall object should obviously yield better compression, but I would also expect it to be slower.
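The two schemes being weighed can be sketched as follows. This is a toy comparison on a synthetic, highly repetitive payload (zlib again standing in for whatever codec the client would use), not a measurement of real model data:

```python
import zlib

# Synthetic, repetitive payload standing in for a serialized model.
payload = b"0123456789abcdef" * 4096  # 64 KiB
part_size = 8192

# Per-part compression: each multipart chunk is compressed
# independently, so chunks can be prepared and uploaded in parallel.
parts = [payload[i:i + part_size] for i in range(0, len(payload), part_size)]
per_part = [zlib.compress(p) for p in parts]

# Whole-object compression: one stream over the entire payload, which
# lets the codec exploit redundancy across part boundaries.
whole = zlib.compress(payload)

print("per-part total:", sum(len(c) for c in per_part), "bytes")
print("whole-object:  ", len(whole), "bytes")

# Both schemes are lossless.
assert b"".join(zlib.decompress(c) for c in per_part) == payload
assert zlib.decompress(whole) == payload
```

On a redundant payload like this the single stream tends to win on size, since each independent stream pays its own header/checksum overhead and resets the match window; but per-part compression parallelizes, which is exactly the trade-off raised above.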

@randomir (Member, Author)

Benefits also heavily depend on network speed. On (very) fast networks, using anything beyond the simplest/fastest stream compression is probably not worth it. But since we upload parts in parallel, that tips the balance back toward compression... Hard to tell without some benchmarks; the "optimal" upload parameters will definitely have to be informed by them.
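The shape of such a benchmark can be sketched like this: compare total time (compress + send compressed bytes) against sending the raw bytes, at a few compression levels. The bandwidth figure and the payload are hypothetical placeholders, and timings will vary by machine, so treat the numbers as illustrative only:

```python
import time
import zlib

# Synthetic compressible payload standing in for serialized model data.
payload = b"coefficients: " + b"0.12345 -0.6789 " * 50_000  # ~800 KB
NETWORK_BPS = 1_000_000_000 / 8  # hypothetical 1 Gb/s uplink, in bytes/s

plain_ms = len(payload) / NETWORK_BPS * 1e3  # time to send uncompressed

for level in (1, 6, 9):
    t0 = time.perf_counter()
    compressed = zlib.compress(payload, level)
    cpu_ms = (time.perf_counter() - t0) * 1e3
    # Total time = compression time + time to push the compressed bytes.
    total_ms = cpu_ms + len(compressed) / NETWORK_BPS * 1e3
    print(f"level {level}: {len(compressed):>8} bytes, "
          f"compress {cpu_ms:.2f} ms, "
          f"compress+send {total_ms:.2f} ms vs plain {plain_ms:.2f} ms")
```

On a fast enough link the plain transfer can beat the higher levels, which is the point made above; parallel per-part uploads shift that break-even by hiding compression time behind other transfers.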
