GitHub Actions / Unit Test Results failed Mar 18, 2024 in 0s

25 fail, 109 skipped, 3 921 pass in 11h 11m 26s

    29 files      29 suites   11h 11m 26s ⏱️
 4 055 tests  3 921 ✅   109 💤  25 ❌
54 888 runs  52 231 ✅ 2 410 💤 247 ❌

Results for commit d9ba4ca.

Annotations

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh

See this annotation in the file changed.

@github-actions github-actions / Unit Test Results

All 11 runs failed: test_simple (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 2s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 1s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 1s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 1s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
tornado.httpclient.HTTPClientError: HTTP 500: Internal Server Error
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:40137', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:38769', name: 0, status: closed, stored: 0, running: 1/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:41461', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True, scheduler_kwargs={"dashboard": True})
    async def test_simple(c, s, a, b):
        port = s.http_server.port
        ev = Event()
        future = c.submit(block_on_event, ev)
        await asyncio.sleep(0.1)
    
        http_client = AsyncHTTPClient()
        for suffix in applications:
            if suffix in blocklist_apps:
                continue
>           response = await http_client.fetch(f"http://localhost:{port}{suffix}")
E           tornado.httpclient.HTTPClientError: HTTP 500: Internal Server Error

distributed/dashboard/tests/test_scheduler_bokeh.py:92: HTTPClientError
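Every dashboard route answering HTTP 500 is consistent with the Bokeh deprecation warnings seen in the failures below: if the test run escalates warnings to errors (as pytest's `filterwarnings = error` does), a deprecated Bokeh call made while rendering the page raises inside the handler and the server returns 500 instead of the page. A minimal stdlib sketch of that mechanism (the handler and page-rendering functions here are hypothetical stand-ins, not distributed's actual code):

```python
import warnings

def render_dashboard_page():
    # Hypothetical stand-in for a dashboard handler that still calls a
    # deprecated Bokeh API while building the page.
    warnings.warn(
        "'circle() method with size value' was deprecated in Bokeh 3.4.0",
        DeprecationWarning,
    )
    return "<html>...</html>"

def handle_request():
    # With warnings escalated to errors, the warn() above raises, so the
    # handler falls into the error path and answers 500.
    try:
        body = render_dashboard_page()
        return 200, body
    except DeprecationWarning:
        return 500, "Internal Server Error"

with warnings.catch_warnings():
    warnings.simplefilter("error", DeprecationWarning)
    status, body = handle_request()

print(status)  # prints 500 once the warning is escalated to an error
```

Under the default warning filters the same handler would log the deprecation and return 200, which is why such failures often appear only in CI configurations that force warnings to be errors.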

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_stealing_events (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:46009', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:40429', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:34089', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_stealing_events(c, s, a, b):
>       se = StealingEvents(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:142: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:1925: in __init__
    self.root.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning
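The failing call site is `glyph_api.py:144`, which invokes `deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")`. The message wording below is reconstructed from the two messages visible in these tracebacks (the doubled "instead" appears because the replacement string itself ends in "instead" and the template appends another); it is an inference, not Bokeh's actual source:

```python
import warnings

class BokehDeprecationWarning(DeprecationWarning):
    """Stand-in for bokeh.util.warnings.BokehDeprecationWarning."""

def deprecated(since, old, new):
    # Reconstruction of bokeh.util.deprecation.deprecated(), inferred from
    # the messages in the tracebacks above. {!r} explains the quoting: repr()
    # of a plain string yields single quotes, matching the observed output.
    version = ".".join(str(part) for part in since)
    message = f"{old!r} was deprecated in Bokeh {version} and will be removed, use {new!r} instead."
    warnings.warn(message, BokehDeprecationWarning, stacklevel=3)

# The call made by glyph_api.py:144 when circle() is given a size value:
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")

print(caught[0].message)
```

The fix on the distributed side is mechanical: each `circle(size=...)` call in `distributed/dashboard/components/scheduler.py` (lines 1925, 2019, and 4201 in these tracebacks) becomes a `scatter(size=...)` call with otherwise identical arguments.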

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_events (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:39073', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:46359', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:33719', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_events(c, s, a, b):
>       e = Events(s, "all")

distributed/dashboard/tests/test_scheduler_bokeh.py:157: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:2019: in __init__
    self.root.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_WorkerTable (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:43447', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:34499', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:35119', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_WorkerTable(c, s, a, b):
>       wt = WorkerTable(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:549: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:4201: in __init__
    mem_plot.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_WorkerTable_custom_metrics (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:41157', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:34311', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:38043', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_WorkerTable_custom_metrics(c, s, a, b):
        def metric_port(worker):
            return worker.port
    
        def metric_address(worker):
            return worker.address
    
        metrics = {"metric_port": metric_port, "metric_address": metric_address}
    
        for w in [a, b]:
            for name, func in metrics.items():
                w.metrics[name] = func
    
        await asyncio.gather(a.heartbeat(), b.heartbeat())
    
        for w in [a, b]:
            assert s.workers[w.address].metrics["metric_port"] == w.port
            assert s.workers[w.address].metrics["metric_address"] == w.address
    
>       wt = WorkerTable(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:586: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:4201: in __init__
    mem_plot.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_WorkerTable_different_metrics (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:45429', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:39009', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:40447', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_WorkerTable_different_metrics(c, s, a, b):
        def metric_port(worker):
            return worker.port
    
        a.metrics["metric_a"] = metric_port
        b.metrics["metric_b"] = metric_port
        await asyncio.gather(a.heartbeat(), b.heartbeat())
    
        assert s.workers[a.address].metrics["metric_a"] == a.port
        assert s.workers[b.address].metrics["metric_b"] == b.port
    
>       wt = WorkerTable(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:612: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:4201: in __init__
    mem_plot.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_WorkerTable_metrics_with_different_metric_2 (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:35171', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:35301', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:46821', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_WorkerTable_metrics_with_different_metric_2(c, s, a, b):
        def metric_port(worker):
            return worker.port
    
        a.metrics["metric_a"] = metric_port
        await asyncio.gather(a.heartbeat(), b.heartbeat())
    
>       wt = WorkerTable(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:633: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:4201: in __init__
    mem_plot.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_WorkerTable_add_and_remove_metrics (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:34991', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:37387', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:46783', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True, worker_kwargs={"metrics": {"my_port": lambda w: w.port}})
    async def test_WorkerTable_add_and_remove_metrics(c, s, a, b):
        def metric_port(worker):
            return worker.port
    
        a.metrics["metric_a"] = metric_port
        b.metrics["metric_b"] = metric_port
        await asyncio.gather(a.heartbeat(), b.heartbeat())
    
        assert s.workers[a.address].metrics["metric_a"] == a.port
        assert s.workers[b.address].metrics["metric_b"] == b.port
    
>       wt = WorkerTable(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:656: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:4201: in __init__
    mem_plot.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_WorkerTable_with_memory_limit_as_0 (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:44153', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:46017', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:44505', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True, worker_kwargs={"memory_limit": 0})
    async def test_WorkerTable_with_memory_limit_as_0(c, s, a, b):
>       wt = WorkerTable(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:692: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:4201: in __init__
    mem_plot.circle(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/glyph_api.py:144: in circle
    deprecated((3, 4, 0), "circle() method with size value", "scatter(size=...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = "'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead."
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'circle() method with size value' was deprecated in Bokeh 3.4.0 and will be removed, use 'scatter(size=...) instead' instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_TaskGraph (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:36865', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:40369', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:43745', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_TaskGraph(c, s, a, b):
>       gp = TaskGraph(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:822: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:2305: in __init__
    rect = self.root.square(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/_decorators.py:58: in wrapped
    deprecated((3, 4, 0), f"{func.__name__}() method", f"scatter(marker={func.__name__!r}, ...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = '\'square() method\' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker=\'square\', ...) instead" instead.'
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning
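The `square()` failures go through a different path than the `circle()` ones: `bokeh/plotting/_decorators.py:58` wraps each marker-named glyph method and builds the replacement hint from the method name, via `deprecated((3, 4, 0), f"{func.__name__}() method", f"scatter(marker={func.__name__!r}, ...) instead")`. A stdlib sketch of that shim (class and decorator names are illustrative, inferred from the traceback, not Bokeh's source):

```python
import warnings

class BokehDeprecationWarning(DeprecationWarning):
    """Stand-in for bokeh.util.warnings.BokehDeprecationWarning."""

def deprecated_glyph(func):
    # Sketch of the wrapper applied in bokeh/plotting/_decorators.py:58:
    # warn with a message derived from the method name, then forward to
    # scatter() with the matching marker.
    def wrapped(self, *args, **kwargs):
        old = f"{func.__name__}() method"
        new = f"scatter(marker={func.__name__!r}, ...) instead"
        message = f"{old!r} was deprecated in Bokeh 3.4.0 and will be removed, use {new!r} instead."
        warnings.warn(message, BokehDeprecationWarning, stacklevel=2)
        return self.scatter(*args, marker=func.__name__, **kwargs)
    return wrapped

class Figure:
    """Minimal stand-in for a bokeh figure."""
    def scatter(self, *args, marker="circle", **kwargs):
        return ("scatter", marker)

    @deprecated_glyph
    def square(self, *args, **kwargs):
        ...

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    glyph = Figure().square([1, 2], [3, 4], size=10)
```

Accordingly, the `self.root.square(...)` call at `distributed/dashboard/components/scheduler.py:2305` would become `self.root.scatter(..., marker="square")` with the other arguments unchanged.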

Check warning on line 0 in distributed.dashboard.tests.test_scheduler_bokeh


All 11 runs failed: test_TaskGraph_clear (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:36417', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:36419', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:36807', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_TaskGraph_clear(c, s, a, b):
>       gp = TaskGraph(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:864: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:2305: in __init__
    rect = self.root.square(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/_decorators.py:58: in wrapped
    deprecated((3, 4, 0), f"{func.__name__}() method", f"scatter(marker={func.__name__!r}, ...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = '\'square() method\' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker=\'square\', ...) instead" instead.'
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning


All 11 runs failed: test_TaskGraph_limit (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:36261', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:40847', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:40173', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True, config={"distributed.dashboard.graph-max-items": 2})
    async def test_TaskGraph_limit(c, s, a, b):
>       gp = TaskGraph(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:888: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:2305: in __init__
    rect = self.root.square(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/_decorators.py:58: in wrapped
    deprecated((3, 4, 0), f"{func.__name__}() method", f"scatter(marker={func.__name__!r}, ...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = '\'square() method\' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker=\'square\', ...) instead" instead.'
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning


All 11 runs failed: test_TaskGraph_complex (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:37743', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:46313', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:36247', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_TaskGraph_complex(c, s, a, b):
        da = pytest.importorskip("dask.array")
>       gp = TaskGraph(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:911: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:2305: in __init__
    rect = self.root.square(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/_decorators.py:58: in wrapped
    deprecated((3, 4, 0), f"{func.__name__}() method", f"scatter(marker={func.__name__!r}, ...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = '\'square() method\' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker=\'square\', ...) instead" instead.'
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning


All 11 runs failed: test_TaskGraph_order (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:39987', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:40797', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:34083', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_TaskGraph_order(c, s, a, b):
        x = c.submit(inc, 1)
        y = c.submit(div, 1, 0)
        await wait(y)
    
>       gp = TaskGraph(s)

distributed/dashboard/tests/test_scheduler_bokeh.py:945: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
distributed/dashboard/components/scheduler.py:2305: in __init__
    rect = self.root.square(
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/plotting/_decorators.py:58: in wrapped
    deprecated((3, 4, 0), f"{func.__name__}() method", f"scatter(marker={func.__name__!r}, ...) instead")
../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/deprecation.py:73: in deprecated
    warn(message, BokehDeprecationWarning)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

message = '\'square() method\' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker=\'square\', ...) instead" instead.'
category = <class 'bokeh.util.warnings.BokehDeprecationWarning'>, stacklevel = 4

    def warn(message: str, category: type[Warning] | None = None, stacklevel: int | None = None) -> None:
        if stacklevel is None:
            stacklevel = find_stack_level()
    
>       warnings.warn(message, category, stacklevel=stacklevel)
E       bokeh.util.warnings.BokehDeprecationWarning: 'square() method' was deprecated in Bokeh 3.4.0 and will be removed, use "scatter(marker='square', ...) instead" instead.

../../../miniconda3/envs/dask-distributed/lib/python3.9/site-packages/bokeh/util/warnings.py:64: BokehDeprecationWarning


All 11 runs failed: test_https_support (distributed.dashboard.tests.test_scheduler_bokeh)

artifacts/macos-latest-3.12-default-notci1/pytest.xml [took 1s]
artifacts/ubuntu-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-default-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_expr-notci1/pytest.xml [took 0s]
artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.10-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.11-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.12-default-notci1/pytest.xml [took 0s]
artifacts/windows-latest-3.9-default-notci1/pytest.xml [took 0s]
Raw output
tornado.httpclient.HTTPClientError: HTTP 500: Internal Server Error
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:43423', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:46329', name: 0, status: closed, stored: 0, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:35311', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(
        client=True,
        scheduler_kwargs={"dashboard": True},
        config={
            "distributed.scheduler.dashboard.tls.key": get_cert("tls-key.pem"),
            "distributed.scheduler.dashboard.tls.cert": get_cert("tls-cert.pem"),
            "distributed.scheduler.dashboard.tls.ca-file": get_cert("tls-ca-cert.pem"),
        },
    )
    async def test_https_support(c, s, a, b):
        port = s.http_server.port
    
        assert (
            format_dashboard_link("localhost", port) == "https://localhost:%d/status" % port
        )
    
        ctx = ssl.create_default_context()
        ctx.load_verify_locations(get_cert("tls-ca-cert.pem"))
    
        http_client = AsyncHTTPClient()
        response = await http_client.fetch(
            "https://localhost:%d/individual-plots.json" % port, ssl_options=ctx
        )
        response = json.loads(response.body.decode())
    
        for suffix in [
            "system",
            "counters",
            "workers",
            "status",
            "tasks",
            "stealing",
            "graph",
        ] + [url.strip("/") for url in response.values()]:
            req = HTTPRequest(
                url="https://localhost:%d/%s" % (port, suffix), ssl_options=ctx
            )
>           response = await http_client.fetch(req)
E           tornado.httpclient.HTTPClientError: HTTP 500: Internal Server Error

distributed/dashboard/tests/test_scheduler_bokeh.py:1165: HTTPClientError
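The HTTP 500 in `test_https_support` (and in `test_simple`) is the same Bokeh deprecation surfacing indirectly: assuming the test suite promotes warnings to errors (a common pytest/CI configuration), the `BokehDeprecationWarning` raised while rendering a dashboard page becomes an exception inside the HTTP handler, which Tornado reports as an Internal Server Error. The mechanism can be sketched with a stand-in warning class (`FakeDeprecationWarning` is hypothetical, not Bokeh's):

```python
import warnings


class FakeDeprecationWarning(UserWarning):
    """Stand-in for bokeh.util.warnings.BokehDeprecationWarning."""


caught = None
with warnings.catch_warnings():
    # An "error" filter turns a mere warning into a raised exception,
    # which a web framework's request handler then maps to HTTP 500.
    warnings.simplefilter("error", FakeDeprecationWarning)
    try:
        warnings.warn("'square() method' was deprecated", FakeDeprecationWarning)
    except FakeDeprecationWarning as exc:
        caught = exc
```

This is why fixing the single deprecated `square()` call should clear both the direct `BokehDeprecationWarning` failures and the HTTP 500 failures.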


1 out of 7 runs failed: test_jupyter_cli (distributed.tests.test_jupyter)

artifacts/ubuntu-latest-3.9-no_queue-notci1/pytest.xml [took 37s]
Raw output
subprocess.TimeoutExpired: Command '['/home/runner/miniconda3/envs/dask-distributed/bin/dask', 'scheduler', '--jupyter', '--no-dashboard', '--host', '127.0.0.1:41825']' timed out after 30 seconds
loop = <tornado.platform.asyncio.AsyncIOMainLoop object at 0x7c27ad6a19a0>

    @pytest.mark.slow
    def test_jupyter_cli(loop):
        port = open_port()
        with popen(
            [
                "dask",
                "scheduler",
                "--jupyter",
                "--no-dashboard",
                "--host",
                f"127.0.0.1:{port}",
            ],
            capture_output=True,
        ):
            with Client(f"127.0.0.1:{port}", loop=loop):
                response = requests.get("http://127.0.0.1:8787/jupyter/api/status")
>               assert response.status_code == 200

distributed/tests/test_jupyter.py:63: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
../../../miniconda3/envs/dask-distributed/lib/python3.9/contextlib.py:126: in __exit__
    next(self.gen)
distributed/utils_test.py:1251: in popen
    _terminate_process(proc, terminate_timeout)
distributed/utils_test.py:1177: in _terminate_process
    proc.communicate(timeout=terminate_timeout)
../../../miniconda3/envs/dask-distributed/lib/python3.9/subprocess.py:1134: in communicate
    stdout, stderr = self._communicate(input, endtime, timeout)
../../../miniconda3/envs/dask-distributed/lib/python3.9/subprocess.py:1996: in _communicate
    self._check_timeout(endtime, orig_timeout, stdout, stderr)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <Popen: returncode: -9 args: ['/home/runner/miniconda3/envs/dask-distributed...>
endtime = 1174.772667536, orig_timeout = 30
stdout_seq = [b"2024-03-18 15:22:00,447 - distributed.scheduler - INFO - -----------------------------------------------\n[W 2024-0...er - INFO - Scheduler closing all comms\n', b'[I 2024-03-18 15:22:05.047 ServerApp] Shutting down 5 extensions\n', ...]
stderr_seq = None, skip_check_and_raise = False

    def _check_timeout(self, endtime, orig_timeout, stdout_seq, stderr_seq,
                       skip_check_and_raise=False):
        """Convenience for checking if a timeout has expired."""
        if endtime is None:
            return
        if skip_check_and_raise or _time() > endtime:
>           raise TimeoutExpired(
                    self.args, orig_timeout,
                    output=b''.join(stdout_seq) if stdout_seq else None,
                    stderr=b''.join(stderr_seq) if stderr_seq else None)
E           subprocess.TimeoutExpired: Command '['/home/runner/miniconda3/envs/dask-distributed/bin/dask', 'scheduler', '--jupyter', '--no-dashboard', '--host', '127.0.0.1:41825']' timed out after 30 seconds

../../../miniconda3/envs/dask-distributed/lib/python3.9/subprocess.py:1178: TimeoutExpired
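The `test_jupyter_cli` failure is a teardown timeout: the spawned `dask scheduler --jupyter` child did not exit within the 30-second `communicate()` deadline, and the run ends with returncode -9 (SIGKILL). A common pattern for this situation, sketched below with the standard library (this is illustrative, not `distributed.utils_test`'s actual `_terminate_process` helper), is to try a graceful terminate first and escalate to `kill()` when the child ignores it:

```python
import subprocess
import sys


def terminate_with_deadline(proc, timeout=5.0):
    """Ask the child to exit; escalate to kill() if it misses the deadline."""
    proc.terminate()  # SIGTERM on POSIX, TerminateProcess on Windows
    try:
        return proc.communicate(timeout=timeout)
    except subprocess.TimeoutExpired:
        proc.kill()  # SIGKILL cannot be caught or ignored
        return proc.communicate()


# Child that would outlive a short graceful-shutdown window:
proc = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(60)"],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)
terminate_with_deadline(proc, timeout=5.0)
```

Note that on Windows `terminate()` and `kill()` are the same hard stop, so the escalation only matters on POSIX, where a process may install a slow or stuck SIGTERM handler (here, Jupyter's extension shutdown visible in the captured stdout).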


10 out of 12 runs failed: test_merge[True-inner] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 11s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 6s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50129', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'inner', disk = True

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning
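The `test_merge` failures all trace to `dask_expr/_merge.py:775`, where `index["_index"] = df.index` assigns into a frame that pandas considers a view of another DataFrame, triggering `SettingWithCopyWarning` (again fatal under warnings-as-errors). A minimal reproduction of the pattern and its usual fix is below; the frame shapes are illustrative, not dask_expr's actual data:

```python
import pandas as pd

df = pd.DataFrame({"x": [1, 2, 3], "y": [10, 20, 30]})

# The failing code does roughly: index = df[["y"]]; index["_index"] = df.index
# df[["y"]] can be flagged as a slice of df, so the assignment warns under
# pandas 1.x/2.x without copy-on-write. An explicit copy() makes ownership
# unambiguous and silences the warning:
index = df[["y"]].copy()
index["_index"] = df.index
```

Enabling pandas copy-on-write (`pd.set_option("mode.copy_on_write", True)`) also removes this class of warning, since every selection then behaves as a copy by construction.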


10 out of 12 runs failed: test_merge[True-outer] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 10s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 8s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 8s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 7s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50170', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'outer', disk = True

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning


10 out of 12 runs failed: test_merge[True-left] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 9s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 6s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 6s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50220', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'left', disk = True

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning
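The SettingWithCopyWarning above comes from a chained assignment (`index["_index"] = df.index`) on a frame pandas suspects is a copy of a slice. A minimal sketch of the pattern and the fix the warning message suggests (the variable names are illustrative, not the dask_expr code):

```python
import pandas as pd

df = pd.DataFrame({"x": [1, 2, 3], "y": [4, 5, 6]})

# Selecting columns yields an object pandas may flag as a copy of a slice;
# assigning into it is what triggers SettingWithCopyWarning.
index = df[["x"]]

# Fix 1: take explicit ownership with .copy() before assigning.
index = df[["x"]].copy()
index["_index"] = df.index

# Fix 2: assign through .loc on the original frame, as the warning suggests.
df.loc[:, "_index"] = df.index
```

Under pandas copy-on-write semantics the chained form stops working entirely, so the `.copy()` variant is the portable one.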

Check warning on line 0 in distributed.shuffle.tests.test_merge


10 out of 12 runs failed: test_merge[True-right] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 11s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 6s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 8s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 6s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50263', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'right', disk = True

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning

Check warning on line 0 in distributed.shuffle.tests.test_merge


10 out of 12 runs failed: test_merge[False-inner] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 10s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 6s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 5s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 6s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50311', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'inner', disk = False

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning

Check warning on line 0 in distributed.shuffle.tests.test_merge


10 out of 12 runs failed: test_merge[False-outer] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 13s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 6s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 8s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 7s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50353', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'outer', disk = False

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning

Check warning on line 0 in distributed.shuffle.tests.test_merge


10 out of 12 runs failed: test_merge[False-left] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 11s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 6s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 6s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50403', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'left', disk = False

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning

Check warning on line 0 in distributed.shuffle.tests.test_merge


10 out of 12 runs failed: test_merge[False-right] (distributed.shuffle.tests.test_merge)

artifacts/macos-latest-3.12-default-ci1/pytest.xml [took 12s]
artifacts/ubuntu-latest-3.10-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.11-default-ci1/pytest.xml [took 4s]
artifacts/ubuntu-latest-3.12-default-ci1/pytest.xml [took 5s]
artifacts/ubuntu-latest-3.9-default-ci1/pytest.xml [took 6s]
artifacts/ubuntu-latest-3.9-no_queue-ci1/pytest.xml [took 6s]
artifacts/windows-latest-3.10-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.11-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.12-default-ci1/pytest.xml [took 7s]
artifacts/windows-latest-3.9-default-ci1/pytest.xml [took 7s]
Raw output
pandas.errors.SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame.
Try using .loc[row_indexer,col_indexer] = value instead

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:50447', workers: 0, cores: 0, tasks: 0>
a = Dask DataFrame Structure:
                   x      y
npartitions=2              
0              int64  int64
4                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
b = Dask DataFrame Structure:
                   y      z
npartitions=2              
0              int64  int64
2                ...    ...
5                ...    ...
Dask Name: from_pd_divs, 1 expression
Expr=df
how = 'right', disk = False

    @pytest.mark.parametrize("how", ["inner", "outer", "left", "right"])
    @pytest.mark.parametrize("disk", [True, False])
    @gen_cluster(client=True)
    async def test_merge(c, s, a, b, how, disk):
        A = pd.DataFrame({"x": [1, 2, 3, 4, 5, 6], "y": [1, 1, 2, 2, 3, 4]})
        a = dd.repartition(A, [0, 4, 5])
    
        B = pd.DataFrame({"y": [1, 3, 4, 4, 5, 6], "z": [6, 5, 4, 3, 2, 1]})
        b = dd.repartition(B, [0, 2, 5])
    
        with dask.config.set({"dataframe.shuffle.method": "p2p"}):
            with dask.config.set({"distributed.p2p.disk": disk}):
                joined = dd.merge(a, b, left_index=True, right_index=True, how=how)
            res = await c.compute(joined)
            assert_eq(
                res,
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            joined = dd.merge(a, b, on="y", how=how)
            await list_eq(joined, pd.merge(A, B, on="y", how=how))
            assert all(d is None for d in joined.divisions)
    
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how),
                pd.merge(A, B, left_on="x", right_on="z", how=how),
            )
            await list_eq(
                dd.merge(a, b, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
                pd.merge(A, B, left_on="x", right_on="z", how=how, suffixes=("1", "2")),
            )
    
            await list_eq(dd.merge(a, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(a, B, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, b, how=how), pd.merge(A, B, how=how))
            await list_eq(dd.merge(A, B, how=how), pd.merge(A, B, how=how))
            await list_eq(
                dd.merge(a, b, left_index=True, right_index=True, how=how),
                pd.merge(A, B, left_index=True, right_index=True, how=how),
            )
            await list_eq(
                dd.merge(
                    a, b, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
                pd.merge(
                    A, B, left_index=True, right_index=True, how=how, suffixes=("1", "2")
                ),
            )
    
>           await list_eq(
                dd.merge(a, b, left_on="x", right_index=True, how=how),
                pd.merge(A, B, left_on="x", right_index=True, how=how),
            )

distributed\shuffle\tests\test_merge.py:218: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
distributed\shuffle\tests\test_merge.py:36: in list_eq
    a = await c.compute(a) if isinstance(a, dd.DataFrame) else a
distributed\client.py:336: in _result
    raise exc.with_traceback(tb)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\dask_expr\_merge.py:775: in assign_index_merge_transfer
    index["_index"] = df.index
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4299: in __setitem__
    self._set_item(key, value)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4526: in _set_item
    self._set_item_mgr(key, value, refs)
C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\frame.py:4484: in _set_item_mgr
    self._check_setitem_copy()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   warnings.warn(t, SettingWithCopyWarning, stacklevel=find_stack_level())
E   pandas.errors.SettingWithCopyWarning: 
E   A value is trying to be set on a copy of a slice from a DataFrame.
E   Try using .loc[row_indexer,col_indexer] = value instead
E   
E   See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy

C:\Users\runneradmin\miniconda3\envs\dask-distributed\lib\site-packages\pandas\core\generic.py:4472: SettingWithCopyWarning

Check warning on line 0 in distributed.tests.test_scheduler


1 out of 14 runs failed: test_tell_workers_when_peers_have_left (distributed.tests.test_scheduler)

artifacts/ubuntu-latest-mindeps-pandas-ci1/pytest.xml [took 5s]
Raw output
assert 1710775322.494782 < (1710775317.4777527 + 5)
 +  where 1710775322.494782 = time()
c = <Client: No scheduler connected>
s = <Scheduler 'tcp://127.0.0.1:43601', workers: 0, cores: 0, tasks: 0>
a = <Worker 'tcp://127.0.0.1:34767', name: 0, status: closed, stored: 1, running: 0/1, ready: 0, comm: 0, waiting: 0>
b = <Worker 'tcp://127.0.0.1:41411', name: 1, status: closed, stored: 0, running: 0/2, ready: 0, comm: 0, waiting: 0>

    @gen_cluster(client=True)
    async def test_tell_workers_when_peers_have_left(c, s, a, b):
        f = (await c.scatter({"f": 1}, workers=[a.address, b.address], broadcast=True))["f"]
    
        workers = {a.address: a, b.address: b}
        connect_timeout = parse_timedelta(
            dask.config.get("distributed.comm.timeouts.connect"), default="seconds"
        )
    
        class BrokenGatherDep(Worker):
            async def gather_dep(self, worker, *args, **kwargs):
                w = workers.pop(worker, None)
                if w is not None and workers:
                    w.listener.stop()
                    s.stream_comms[worker].abort()
    
                return await super().gather_dep(worker, *args, **kwargs)
    
        async with BrokenGatherDep(s.address, nthreads=1) as w3:
            start = time()
            g = await c.submit(inc, f, key="g", workers=[w3.address])
            # fails over to the second worker in less than the connect timeout
>           assert time() < start + connect_timeout
E           assert 1710775322.494782 < (1710775317.4777527 + 5)
E            +  where 1710775322.494782 = time()

distributed/tests/test_scheduler.py:4625: AssertionError
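The failing assertion compares wall-clock elapsed time against the configured connect timeout, which is why a loaded CI runner can trip it. The shape of the check, sketched with a stand-in operation (the timeout value and workload here are illustrative, not the distributed test itself):

```python
import time

# Stand-in for distributed.comm.timeouts.connect (5 s in the failure above).
connect_timeout = 5.0

start = time.time()
# Stand-in for the computation that should fail over to the second worker.
result = sum(range(1000))
elapsed = time.time() - start

# The real test asserts failover finishes inside the connect timeout; on a
# slow runner the genuine operation can exceed it, making the test flaky.
assert elapsed < connect_timeout
```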