update pre-commit hooks #4874

Merged: keewis merged 5 commits into pydata:master from the pre-commit branch on Feb 8, 2021
Conversation

@keewis (Collaborator) commented Feb 7, 2021

I issued a bugfix release for the issue detected in #4810 (comment), so we should use that. `pre-commit autoupdate` also tried to update to mypy v0.800, but this fails because mypy does not like the redefinition of `dask_array_type` (and the other `*_type` variables) in `xarray.core.pycompat`:

xarray/core/pycompat.py:19: error: Incompatible types in assignment (expression has type "Tuple[]", variable has type "Tuple[Any]")
xarray/core/pycompat.py:29: error: Incompatible types in assignment (expression has type "Tuple[]", variable has type "Tuple[Any]")
xarray/core/pycompat.py:37: error: Incompatible types in assignment (expression has type "Tuple[]", variable has type "Tuple[Any]")

Does anyone know how to fix that?

  • Passes `pre-commit run --all-files`
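
For context, the pattern mypy is objecting to looks roughly like the sketch below. This is a simplified stand-in for `xarray.core.pycompat`, not the actual code; giving the variable an explicit annotation is one common way such try/except redefinitions are made to type-check.

```python
# Simplified sketch of a pycompat-style fallback (illustrative, not the real module).
# Without the annotation, mypy infers Tuple[Any] from the first assignment and then
# rejects the empty-tuple fallback ("Tuple[]" vs "Tuple[Any]").
from typing import Any, Tuple, Type

dask_array_type: Tuple[Type[Any], ...]

try:
    import dask.array

    dask_array_type = (dask.array.Array,)
except ImportError:
    dask_array_type = ()
```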

@mathause (Collaborator) commented Feb 7, 2021

I am super confused... These errors should be ignored, I think:

xarray/setup.cfg, lines 232-233 at 110c857:

[mypy-xarray.core.pycompat]
ignore_errors = True

Also, I get other errors when running `mypy xarray` (a lot of them):

xarray/core/pycompat.py:10: error: Skipping analyzing 'dask.array': found module but no type hints or library stubs

Somehow it does not recognize the wildcard - don't you get this?

xarray/setup.cfg, lines 182-183 at 110c857:

[mypy-dask.*]
ignore_missing_imports = True

(this is with mypy 0.800 and numpy 1.19)


However, if I update numpy to 1.20, I get a number of errors that (as far as I can tell) have nothing to do with numpy (for both versions of mypy).

edit: not true, see below


On a related note: should we add a full mypy run again? pre-commit only checks the files that have changed, doesn't it?

@mathause (Collaborator) commented Feb 7, 2021

Sorry, I was wrong about (2): that does have to do with numpy.

edit: see #4878

@keewis (Collaborator, Author) commented Feb 7, 2021

pre-commit is run as `pre-commit run --all-files`, which will run the hooks on all files (unless we explicitly skip files using a pattern).

I do get the errors if I run `python -m mypy xarray` but not if I run `python -m mypy .` or `pre-commit run --all-files`, so I guess mypy cannot find the configuration file when invoked as `mypy xarray`. With `mypy .`, however, mypy will complain about a duplicated conftest.py.
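
As an illustration only (not part of this PR), a small driver script along the following lines can be used to compare the two invocations side by side; it assumes mypy is installed and that the script is run from the repository root.

```python
# Hypothetical reproduction helper: run the two mypy invocations discussed above
# and report their exit codes (0 means mypy found no errors).
import subprocess
import sys

for target in ("xarray", "."):
    result = subprocess.run(
        [sys.executable, "-m", "mypy", target],
        capture_output=True,
        text=True,
    )
    print(f"python -m mypy {target} -> exit code {result.returncode}")
    # Show the first few reported lines, if any, for a quick comparison.
    for line in result.stdout.splitlines()[:5]:
        print("   ", line)
```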

keewis merged commit 4e97b33 into pydata:master on Feb 8, 2021
keewis deleted the pre-commit branch on February 8, 2021 at 00:18
@mathause (Collaborator) commented Feb 8, 2021

Thanks, maybe we have to wait for mypy 0.810 then (or however they count). I am still a bit confused why this does not fail for numpy 1.20, though... or does this not install numpy etc?

dcherian added a commit to DWesl/xarray that referenced this pull request Feb 11, 2021
…_and_bounds_as_coords

* upstream/master: (51 commits)
  Ensure maximum accuracy when encoding and decoding cftime.datetime values (pydata#4758)
  Fix `bounds_error=True` ignored with 1D interpolation (pydata#4855)
  add a drop_conflicts strategy for merging attrs (pydata#4827)
  update pre-commit hooks (mypy) (pydata#4883)
  ensure warnings cannot become errors in assert_ (pydata#4864)
  update pre-commit hooks (pydata#4874)
  small fixes for the docstrings of swap_dims and integrate (pydata#4867)
  Modify _encode_datetime_with_cftime for compatibility with cftime > 1.4.0 (pydata#4871)
  vélin (pydata#4872)
  don't skip the doctests CI (pydata#4869)
  fix da.pad example for numpy 1.20 (pydata#4865)
  temporarily pin dask (pydata#4873)
  Add units if "unit" is in the attrs. (pydata#4850)
  speed up the repr for big MultiIndex objects (pydata#4846)
  dim -> coord in DataArray.integrate (pydata#3993)
  WIP: backend interface, now it uses subclassing  (pydata#4836)
  weighted: small improvements (pydata#4818)
  Update related-projects.rst (pydata#4844)
  iris update doc url (pydata#4845)
  Faster unstacking (pydata#4746)
  ...
dcherian added a commit to dcherian/xarray that referenced this pull request Feb 12, 2021
* upstream/master: (24 commits)
  Compatibility with dask 2021.02.0 (pydata#4884)
  Ensure maximum accuracy when encoding and decoding cftime.datetime values (pydata#4758)
  Fix `bounds_error=True` ignored with 1D interpolation (pydata#4855)
  add a drop_conflicts strategy for merging attrs (pydata#4827)
  update pre-commit hooks (mypy) (pydata#4883)
  ensure warnings cannot become errors in assert_ (pydata#4864)
  update pre-commit hooks (pydata#4874)
  small fixes for the docstrings of swap_dims and integrate (pydata#4867)
  Modify _encode_datetime_with_cftime for compatibility with cftime > 1.4.0 (pydata#4871)
  vélin (pydata#4872)
  don't skip the doctests CI (pydata#4869)
  fix da.pad example for numpy 1.20 (pydata#4865)
  temporarily pin dask (pydata#4873)
  Add units if "unit" is in the attrs. (pydata#4850)
  speed up the repr for big MultiIndex objects (pydata#4846)
  dim -> coord in DataArray.integrate (pydata#3993)
  WIP: backend interface, now it uses subclassing  (pydata#4836)
  weighted: small improvements (pydata#4818)
  Update related-projects.rst (pydata#4844)
  iris update doc url (pydata#4845)
  ...