Inspect flaky test-net-bytes-per-incoming-chunk-overhead #23740

Closed
Trott opened this issue Oct 18, 2018 · 3 comments
Labels: flaky-test (Issues and PRs related to tests with unstable failures on the CI), net (Issues and PRs related to the net subsystem)

Comments

Trott (Member) commented Oct 18, 2018

https://ci.nodejs.org/job/node-test-commit-linux/22484/nodes=ubuntu1804-docker/console

10:55:55 not ok 2374 sequential/test-net-bytes-per-incoming-chunk-overhead
10:55:55   ---
10:55:55   duration_ms: 120.72
10:55:55   severity: fail
10:55:55   exitcode: -15
10:55:55   stack: |-
10:55:55     timeout
10:55:55   ...
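For context (an inference from the test's name only, since the thread doesn't describe it): sequential/test-net-bytes-per-incoming-chunk-overhead appears to measure roughly how much memory each incoming net socket chunk retains, which means pushing many small writes through a socket and sampling heap usage, the kind of workload that stretches out badly on an overloaded host. A rough JavaScript sketch of that style of measurement follows; it is not the actual test file, and the file name and constants are placeholders.

'use strict';
// Hypothetical sketch, not the actual test file: approximate the per-chunk
// memory overhead of data received on a net socket.
// Run with: node --expose-gc sketch.js
const net = require('net');

const SENT_WRITES = 10000;        // number of small writes pushed through the socket
const PAYLOAD = Buffer.alloc(16); // 16-byte payload per write

const received = [];

const server = net.createServer((socket) => {
  socket.on('data', (chunk) => received.push(chunk));
  socket.on('end', () => {
    const chunks = received.length;  // may be fewer than SENT_WRITES due to coalescing
    global.gc();
    const withChunks = process.memoryUsage().heapUsed;
    received.length = 0;             // drop references to every received chunk
    global.gc();
    const withoutChunks = process.memoryUsage().heapUsed;
    const perChunk = (withChunks - withoutChunks) / chunks;
    console.log(`~${perChunk.toFixed(1)} bytes retained per incoming chunk`);
    socket.destroy();
    server.close();
  });
});

server.listen(0, () => {
  const client = net.connect(server.address().port);
  for (let i = 0; i < SENT_WRITES; i++) client.write(PAYLOAD);
  client.end();
});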
Trott added the net and flaky-test labels on Oct 18, 2018
Trott (Member, Author) commented Oct 21, 2018

https://ci.nodejs.org/job/node-test-commit-linux/22513/nodes=ubuntu1804-docker/console

19:34:03 not ok 2375 sequential/test-net-bytes-per-incoming-chunk-overhead
19:34:03   ---
19:34:03   duration_ms: 120.106
19:34:03   severity: fail
19:34:03   exitcode: -15
19:34:03   stack: |-
19:34:03     timeout
19:34:03   ...

Looks like this is a problem specific to the Docker container. Maybe @nodejs/docker would have some insight.

Trott (Member, Author) commented Oct 21, 2018

It does seem like something is slowing the Docker host down during these timeouts.

Example run with failure: https://ci.nodejs.org/job/node-test-commit-linux/22513/nodes=ubuntu1804-docker/console

sequential/test-net-bytes-per-incoming-chunk-overhead times out after 120 seconds
sequential/test-net-GH-5504 4.806 seconds
sequential/test-net-listen-shared-ports 4.474 seconds

Example run with success: https://ci.nodejs.org/job/node-test-commit-linux/22514/nodes=ubuntu1804-docker/console

sequential/test-net-bytes-per-incoming-chunk-overhead 6.522 seconds
sequential/test-net-GH-5504 0.510 seconds
sequential/test-net-listen-shared-ports 0.509 seconds

The failing run was on test-digitalocean-ubuntu1804_container-x64-1 while the successful run was on test-digitalocean-ubuntu1804_container-x64-2, so maybe that's significant?

Why yes, it probably is. Here's a successful (but slow) run on test-digitalocean-ubuntu1804_container-x64-1, https://ci.nodejs.org/job/node-test-commit-linux/22516/nodes=ubuntu1804-docker/console:

sequential/test-net-bytes-per-incoming-chunk-overhead 89.54 seconds
sequential/test-net-GH-5504 2.655 seconds
sequential/test-net-listen-shared-ports 2.926 seconds

It seems to be sustained across multiple builds on test-digitalocean-ubuntu1804_container-x64-1. Here's https://ci.nodejs.org/job/node-test-commit-linux/22518/nodes=ubuntu1804-docker/console:

sequential/test-net-bytes-per-incoming-chunk-overhead 49.683 seconds

And in another build on the same host:

sequential/test-net-bytes-per-incoming-chunk-overhead 97.632 seconds

/ping @nodejs/build: is test-digitalocean-ubuntu1804_container-x64-1 provisioned less generously than test-digitalocean-ubuntu1804_container-x64-2? Or maybe it has some resource-intensive stale processes running on it that need to be terminated?

Trott (Member, Author) commented Nov 23, 2018

This doesn't seem to be happening anymore. Closing; it can be re-opened if it surfaces again.

Trott closed this as completed on Nov 23, 2018