test_file_descriptors_dont_leak
On macOS, 3.8: https://github.com/dask/distributed/pull/4925/checks?check_run_id=2888102705#step:10:2325
```
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

s = <Scheduler: "tcp://127.0.0.1:60661" workers: 0 cores: 0, tasks: 0>

    @pytest.mark.skipif(
        sys.platform.startswith("win"), reason="file descriptors not really a thing"
    )
    @gen_cluster(nthreads=[])
    async def test_file_descriptors_dont_leak(s):
        psutil = pytest.importorskip("psutil")
        proc = psutil.Process()
        before = proc.num_fds()

        w = await Worker(s.address)
        await w.close()

        during = proc.num_fds()

        start = time()
        while proc.num_fds() > before:
            await asyncio.sleep(0.01)
>           assert time() < start + 5
E           assert 1624391425.688429 < (1624391420.652608 + 5)
E            +  where 1624391425.688429 = time()

distributed/tests/test_scheduler.py:677: AssertionError
```
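The failing assertion is the 5-second deadline on the polling loop: the file descriptor count never dropped back to `before` in time. The same wait-until-true-or-timeout pattern, sketched here as a stand-alone synchronous helper (hypothetical name `wait_for`; the test itself polls inside an `async` loop with `asyncio.sleep`):

```python
import time


def wait_for(predicate, timeout=5.0, interval=0.01):
    """Poll `predicate` until it returns True; give up after `timeout` seconds.

    Returns True if the condition was met within the deadline, False otherwise.
    """
    deadline = time.monotonic() + timeout
    while not predicate():
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)
    return True
```

Unlike the test's bare `assert time() < start + 5`, returning a boolean lets the caller decide whether a timeout is a hard failure or just a retry.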
Why doesn't `during = proc.num_fds()` come before `await w.close()`?
I'm not sure, but it looks like `during` isn't actually used in this test. We can probably remove it altogether.
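For context, `before` and `during` are snapshots of the process's open file descriptor count; the real test takes them with `psutil.Process().num_fds()`. A rough stand-alone illustration of what such a snapshot measures, using a hypothetical `open_fds` helper that reads the fd listing directly (Linux `/proc` or macOS `/dev/fd`, so no `psutil` needed):

```python
import os


def open_fds():
    """Count this process's open file descriptors via the OS fd listing."""
    for path in ("/proc/self/fd", "/dev/fd"):  # Linux, then macOS
        if os.path.isdir(path):
            return len(os.listdir(path))
    raise OSError("no file descriptor listing available on this platform")


before = open_fds()
r, w = os.pipe()      # open two new descriptors
during = open_fds()   # analogous to the test's `during` snapshot
os.close(r)
os.close(w)           # release them again; count should return to `before`
```

If the two descriptors were never closed, `open_fds()` would stay at `before + 2` forever, which is exactly the leak the test's polling loop is waiting to rule out.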
FWIW, I ran into some leaky file descriptors and could trace them back to our connection pool (see #4951), but on another test.
Do you think this might also be related? #4045
No failures in the last 30 days. Closing.