
Run subsystem-benchmark without network latency #4068

Merged — 4 commits merged Apr 11, 2024

Conversation

AndreiEres
Contributor

@AndreiEres AndreiEres commented Apr 10, 2024

Implements the idea from #3899

  • Removed latencies.
  • Reduced the number of runs from 50 to 5; according to local runs, this is quite enough.
  • The network message is now always sent in a spawned task, even if latency is zero. Without this, CPU time sometimes spikes.
  • Removed the testnet profile, since we probably don't need those debug additions.
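The spawn-even-at-zero-latency idea from the third bullet can be sketched with std threads and a channel standing in for the benchmark's spawn_handle and network layer (the `send_with_latency` helper and the `"message"` payload are hypothetical illustrations, not code from this PR):

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Always hand the send off to a spawned task, even when the emulated
// latency is zero, so the caller never blocks on message delivery.
// This mirrors the PR's change, where spawn_handle.spawn(...) is used
// unconditionally rather than only when latency is configured.
fn send_with_latency(tx: mpsc::Sender<&'static str>, latency_ms: u64) {
    thread::spawn(move || {
        if latency_ms > 0 {
            thread::sleep(Duration::from_millis(latency_ms));
        }
        let _ = tx.send("message");
    });
}

fn main() {
    let (tx, rx) = mpsc::channel();
    // Zero latency still goes through the spawned task.
    send_with_latency(tx, 0);
    assert_eq!(rx.recv().unwrap(), "message");
}
```

The point of the uniform code path is that zero-latency and nonzero-latency sends exercise the same scheduling machinery, so the benchmark does not measure two different delivery mechanisms.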

After local tests, I can't say this brings a significant improvement in the stability of the results. However, I believe it is worth trying and watching the results over time.

@AndreiEres AndreiEres added R0-silent Changes should not be mentioned in any release notes T12-benchmarks This PR/Issue is related to benchmarking and weights. labels Apr 10, 2024
@AndreiEres AndreiEres requested a review from a team as a code owner April 10, 2024 12:55
@paritytech-review-bot paritytech-review-bot bot requested a review from a team April 10, 2024 14:06
let latency_ms = std::time::Duration::from_millis(self.latency_ms as u64);

self.spawn_handle
.spawn("peer-latency-emulator", "test-environment", async move {
Contributor

I'm not sure I understand how this should help, we would actually get scheduling noise even if latency is zero, so this change would work against making the test more deterministic by setting latency to None.

Contributor

I have the same question. How much of an improvement are we seeing, and do we have any clue why that is?

Contributor Author

You are right. I used to receive weird results when sending without spawning, but I checked again and everything is OK now. I suppose I just had some "noise" on my laptop because of rust-analyzer.

Contributor

I would use a VM with reference hardware specs to run the tests and establish gold values for the asserts.

Contributor Author

Yep, I reverted that.

@AndreiEres AndreiEres added this pull request to the merge queue Apr 11, 2024
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to a conflict with the base branch Apr 11, 2024
@AndreiEres AndreiEres enabled auto-merge April 11, 2024 16:21
@AndreiEres AndreiEres added this pull request to the merge queue Apr 11, 2024
Merged via the queue into master with commit 25f038a Apr 11, 2024
116 of 136 checks passed
@AndreiEres AndreiEres deleted the AndreiEres/subsystem-bench-wo-latency branch April 11, 2024 17:26
4 participants