
[Bug] Investigate in-memory caching of blocks #4557

Closed
lutter opened this issue Apr 19, 2023 · 5 comments
Labels
bug Something isn't working

Comments

lutter (Collaborator) commented Apr 19, 2023

Bug report

We recently added a way to cache blocks in memory (#4215), though it looks like that did not have the desired effect. We also added metrics (#4440) to better understand how that cache works.

We need to figure out why this cache isn't effective. The underlying issue is that reading from the block cache in the database can require a lot of IO bandwidth, and we need to reduce that.
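
For orientation, here is a minimal sketch of the kind of in-memory block cache under discussion: an LRU map keyed by block hash, with hit/miss counters roughly in the spirit of the metrics added in #4440. All names here (`BlockCache`, `BlockHash`, etc.) are hypothetical and do not reflect graph-node's actual implementation.

```rust
use std::collections::{HashMap, VecDeque};

// Hypothetical block hash type; graph-node has its own block identifier types.
type BlockHash = [u8; 32];

/// A small LRU cache for raw block bytes, with hit/miss counters so the
/// effectiveness of the cache can be observed.
struct BlockCache {
    capacity: usize,
    entries: HashMap<BlockHash, Vec<u8>>,
    /// Most recently used keys at the back; evict from the front.
    order: VecDeque<BlockHash>,
    hits: u64,
    misses: u64,
}

impl BlockCache {
    fn new(capacity: usize) -> Self {
        BlockCache {
            capacity,
            entries: HashMap::with_capacity(capacity),
            order: VecDeque::with_capacity(capacity),
            hits: 0,
            misses: 0,
        }
    }

    /// Look up a block; on a miss the caller falls back to the database.
    fn get(&mut self, hash: &BlockHash) -> Option<Vec<u8>> {
        match self.entries.get(hash).cloned() {
            Some(block) => {
                self.hits += 1;
                self.touch(hash);
                Some(block)
            }
            None => {
                self.misses += 1;
                None
            }
        }
    }

    /// Insert a block fetched from the database, evicting the least
    /// recently used entry once the cache is full.
    fn insert(&mut self, hash: BlockHash, block: Vec<u8>) {
        if self.entries.len() >= self.capacity && !self.entries.contains_key(&hash) {
            if let Some(oldest) = self.order.pop_front() {
                self.entries.remove(&oldest);
            }
        }
        self.entries.insert(hash, block);
        self.touch(&hash);
    }

    /// Move a key to the most recently used position.
    fn touch(&mut self, hash: &BlockHash) {
        self.order.retain(|h| h != hash);
        self.order.push_back(*hash);
    }

    /// The number to watch: if the hit ratio stays low, the cache is not
    /// absorbing read traffic and the database still sees the IO load.
    fn hit_ratio(&self) -> f64 {
        let total = self.hits + self.misses;
        if total == 0 {
            0.0
        } else {
            self.hits as f64 / total as f64
        }
    }
}

fn main() {
    let mut cache = BlockCache::new(2);
    cache.insert([1; 32], vec![0xde, 0xad]);
    cache.insert([2; 32], vec![0xbe, 0xef]);
    assert!(cache.get(&[1; 32]).is_some()); // hit
    cache.insert([3; 32], vec![0xca, 0xfe]); // evicts [2; 32]
    assert!(cache.get(&[2; 32]).is_none()); // miss
    println!("hit ratio: {:.2}", cache.hit_ratio()); // 0.50
}
```

Whatever shape the real cache takes, the hit ratio under production traffic is the key signal: if it stays low, reads keep falling through to the database and the IO load stays where it was.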

lutter added the bug label Apr 19, 2023
github-actions (bot) commented Oct 24, 2023

Looks like this issue has been open for 6 months with no activity. Is it still relevant? If not, please remember to close it.

github-actions bot added the Stale label Oct 24, 2023
azf20 (Contributor) commented Oct 25, 2023

@lutter is this still valid?

lutter (Collaborator, Author) commented Oct 25, 2023

Honestly, I am not sure where this ended up; the block cache shards still seem to experience high IO load though. Maybe @neysofu remembers the details of what still needed to be done?

github-actions bot removed the Stale label Oct 26, 2023
neysofu (Member) commented Oct 26, 2023

Not really, I'm afraid. The high IO load came from a different set of queries after the deployment, that much I remember, which probably means some problematic queries were optimized away but not enough to meaningfully reduce the IO load. We never started an actual follow-up investigation beyond that.

azf20 (Contributor) commented Oct 30, 2023

OK - closing for now, and we can revisit if this surfaces again

azf20 closed this as not planned Oct 30, 2023