Commit
On Windows, enable successful test of opening a dataset containing a cftime index (#6305)

* Exit cluster context before deleting temporary directory

Previously, on Windows, the scheduler in the outer context still held the file
open when the inner context exited, which prevented deletion of the temporary
directory. That caused the test to fail and left the temporary directory and
file behind.

* Use fixture instead of context manager for temporary directory

* Edit whats-new entry
stanwest authored Feb 28, 2022
1 parent 4292bde commit 613a8fd
Showing 2 changed files with 7 additions and 10 deletions.
2 changes: 1 addition & 1 deletion doc/whats-new.rst
@@ -52,7 +52,7 @@ Bug fixes

 - Variables which are chunked using dask in larger (but aligned) chunks than the target zarr chunk size
   can now be stored using `to_zarr()` (:pull:`6258`) By `Tobias Kölling <https://github.com/d70-t>`_.
-- Multi-file datasets containing encoded :py:class:`cftime.datetime` objects can be read in parallel again (:issue:`6226`, :pull:`6249`). By `Martin Bergemann <https://github.com/antarcticrainforest>`_.
+- Multi-file datasets containing encoded :py:class:`cftime.datetime` objects can be read in parallel again (:issue:`6226`, :pull:`6249`, :pull:`6305`). By `Martin Bergemann <https://github.com/antarcticrainforest>`_ and `Stan West <https://github.com/stanwest>`_.

Documentation
~~~~~~~~~~~~~
15 changes: 6 additions & 9 deletions xarray/tests/test_distributed.py
@@ -1,8 +1,6 @@
""" isort:skip_file """
import os
import pickle
import numpy as np
import tempfile

import pytest
from packaging.version import Version
@@ -113,19 +111,18 @@ def test_dask_distributed_netcdf_roundtrip(

 @requires_cftime
 @requires_netCDF4
-def test_open_mfdataset_can_open_files_with_cftime_index():
+def test_open_mfdataset_can_open_files_with_cftime_index(tmp_path):
     T = xr.cftime_range("20010101", "20010501", calendar="360_day")
     Lon = np.arange(100)
     data = np.random.random((T.size, Lon.size))
     da = xr.DataArray(data, coords={"time": T, "Lon": Lon}, name="test")
+    file_path = tmp_path / "test.nc"
+    da.to_netcdf(file_path)
     with cluster() as (s, [a, b]):
         with Client(s["address"]):
-            with tempfile.TemporaryDirectory() as td:
-                data_file = os.path.join(td, "test.nc")
-                da.to_netcdf(data_file)
-                for parallel in (False, True):
-                    with xr.open_mfdataset(data_file, parallel=parallel) as tf:
-                        assert_identical(tf["test"], da)
+            for parallel in (False, True):
+                with xr.open_mfdataset(file_path, parallel=parallel) as tf:
+                    assert_identical(tf["test"], da)


@pytest.mark.parametrize("engine,nc_format", ENGINES_AND_FORMATS)
