fix: return correct prune_target_block when syncing #14181

Merged 2 commits into main from klkvr/fix-pruning on Feb 4, 2025

Conversation

@klkvr klkvr (Member) commented Feb 3, 2025

Right now we are likely to fail here when starting, e.g., a Holesky full node and trying to insert the genesis block:

let contract_log_pruner = self.prune_modes.receipts_log_filter.group_by_block(tip, None)?;

The reason is that this match returns Err for Self::Before(0) when tip == 0:

Self::Before(n) if *n == tip + 1 && purpose.is_static_file() => Some((tip, *self)),
Self::Before(n) if *n > tip => None, // Nothing to prune yet
Self::Before(n) if tip - n >= segment.min_blocks(purpose) => {
    Some(((*n).saturating_sub(1), *self))
}
_ => return Err(PruneSegmentError::Configuration(segment)),

Instead, we should return None if tip - n < segment.min_blocks(purpose), because that condition may no longer hold once the node syncs to a higher tip.
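
For reference, here is a minimal, self-contained sketch of the behavioral change. It is not reth's actual API: Mode stands in for PruneMode, min_blocks for segment.min_blocks(purpose), the static-file arm is omitted, and the 64-block minimum pruning distance is an illustrative value.

```rust
// Simplified stand-in for PruneMode; only the Before variant matters here.
#[derive(Clone, Copy)]
enum Mode {
    Before(u64),
}

/// Old behaviour: anything that is neither ahead of the tip nor far enough
/// behind it falls through to an error, which is exactly what happens for
/// Before(0) at genesis (tip == 0).
fn target_old(mode: Mode, tip: u64, min_blocks: u64) -> Result<Option<u64>, &'static str> {
    match mode {
        Mode::Before(n) if n > tip => Ok(None), // nothing to prune yet
        Mode::Before(n) if tip - n >= min_blocks => Ok(Some(n.saturating_sub(1))),
        _ => Err("prune configuration error"),
    }
}

/// Fixed behaviour: the same case returns None, since tip - n >= min_blocks
/// can still become true once the node syncs to a higher tip.
fn target_fixed(mode: Mode, tip: u64, min_blocks: u64) -> Option<u64> {
    match mode {
        Mode::Before(n) if n > tip => None, // nothing to prune yet
        Mode::Before(n) if tip - n >= min_blocks => Some(n.saturating_sub(1)),
        Mode::Before(_) => None, // not far enough behind the tip yet; wait
    }
}

fn main() {
    // Holesky full node inserting the genesis block: Before(0) with tip == 0.
    assert!(target_old(Mode::Before(0), 0, 64).is_err());
    assert_eq!(target_fixed(Mode::Before(0), 0, 64), None);

    // Once the tip is far enough ahead, both variants prune up to block n - 1.
    assert_eq!(target_old(Mode::Before(10), 100, 64), Ok(Some(9)));
    assert_eq!(target_fixed(Mode::Before(10), 100, 64), Some(9));
}
```

With the old arms, Before(0) at genesis falls through to the configuration error; with the fix, the same case simply returns None and pruning kicks in once the tip has advanced far enough.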

@mattsse mattsse (Collaborator) left a comment

ah, I see, makes sense

the failing test needs a closer look from @joshieDo?

@klkvr klkvr (Member, Author) commented Feb 4, 2025

I think the test was incorrect, but I'd still appreciate a sanity review here @joshieDo

@klkvr klkvr added this pull request to the merge queue Feb 4, 2025
Merged via the queue into main with commit 740bf04 Feb 4, 2025
44 checks passed
@klkvr klkvr deleted the klkvr/fix-pruning branch February 4, 2025 11:31