
Bound the max size of the validator set to scan per withdrawals processing #3079

Closed
ralexstokes opened this issue Nov 4, 2022 · 4 comments · Fixed by #3095
ralexstokes (Member) commented Nov 4, 2022

After #3068, the worst case behavior during withdrawals processing implies a complete scan of the full validator set on each block.

Discussion in that PR suggests that, even as is, this worst-case scenario is not a blocker, but it would be nice to bound the length of the scan so we have hard numbers on performance here.

To implement, we need to introduce a constant MAX_VALIDATORS_PER_WITHDRAWALS_SWEEP and use it in get_expected_withdrawals in that PR, on the line that scans the validator set:

for _ in range(MAX_VALIDATORS_PER_WITHDRAWALS_SWEEP):

I wanted to open this issue for discussion of this approach and also to get agreement on a concrete number for this new constant.

I suggest 2**17 as it is a nice power of two and is around 1/4 of the validator set against current mainnet numbers.

edit: It also seems fairer to advance the pointer to the last-checked validator by MAX_VALIDATORS_PER_WITHDRAWALS_SWEEP in the event we don't have any withdrawals during a single sweep, so we would need another line or two of code along with the above to handle that
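To make the proposal concrete, here is a minimal sketch (not the actual spec code) of a bounded circular sweep, assuming a simplified state: `bounded_withdrawal_sweep`, its parameters, and the `is_withdrawable` predicate are all hypothetical names for illustration. It caps the scan at MAX_VALIDATORS_PER_WITHDRAWALS_SWEEP validators and returns the advanced pointer, so the pointer still moves forward even when a sweep finds no withdrawals:

```python
# Hypothetical sketch of a bounded withdrawals sweep; not the spec's
# get_expected_withdrawals, just an illustration of the proposed bound.

MAX_VALIDATORS_PER_WITHDRAWALS_SWEEP = 2**17  # proposed constant
MAX_WITHDRAWALS_PER_PAYLOAD = 16  # existing Capella constant

def bounded_withdrawal_sweep(balances, next_index, is_withdrawable,
                             sweep_bound=MAX_VALIDATORS_PER_WITHDRAWALS_SWEEP):
    """Scan at most sweep_bound validators, starting at next_index and
    wrapping around; return (withdrawal_indices, new_next_index)."""
    withdrawals = []
    num_validators = len(balances)
    # Never scan more validators than exist, and never more than the bound.
    bound = min(num_validators, sweep_bound)
    index = next_index
    for _ in range(bound):
        if is_withdrawable(index, balances[index]):
            withdrawals.append(index)
            if len(withdrawals) == MAX_WITHDRAWALS_PER_PAYLOAD:
                break
        index = (index + 1) % num_validators
    # index has advanced past every validator checked, so a sweep with no
    # withdrawals still moves the pointer forward by the bound.
    return withdrawals, index
```

With this shape, the worst-case work per block is fixed by the constant rather than growing with the validator set, which is the "hard numbers on performance" goal above.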

ajsutton (Contributor) commented Nov 7, 2022

Agree it would be nice to have a fixed upper limit so we have one less bit of performance that potentially degrades as there are more validators. Personally I'd be happy to have quite a low limit but Potuz seems to feel much more strongly about having a high limit than I do about a low one, so 2**17 is fine.

djrtwo (Contributor) commented Nov 10, 2022

How critical is this and is there consensus on some bound going in?

If so, we should push this out in the next couple of days to get to something closer to frozen, to unblock devnets and the associated 4844 work (since it is based on Capella).

potuz (Contributor) commented Nov 10, 2022

My personal interpretation:

  1. not critical at all
  2. there seems to be consensus that a constant bound is better than no bound

To phrase 2) differently: there are people in favor of a bound, and no one has voiced opposition to it.

twoeths commented Nov 11, 2022

> Agree it would be nice to have a fixed upper limit so we have one less bit of performance that potentially degrades as there are more validators. Personally I'd be happy to have quite a low limit but Potuz seems to feel much more strongly about having a high limit than I do about a low one, so 2**17 is fine.

I totally agree with this. In Lodestar, looping through validators is not cheap, so we would also want a low limit here.
