Tests/numpy 2.0 #44

Merged · 10 commits · Jul 17, 2024
.github/workflows/python-package.yml (29 changes: 14 additions, 15 deletions)
@@ -1,7 +1,7 @@
 # This workflow will install Python dependencies, run tests and lint with a variety of Python versions
 # For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python
 
-name: Python package
+name: Run pytests
 
 on:
   push:
@@ -10,7 +10,7 @@ on:
     branches: [ "master" ]
 
 jobs:
-  build:
+  test:
 
     runs-on: ubuntu-latest
     strategy:
@@ -19,31 +19,30 @@ jobs:
         python-version: ["3.9", "3.10", "3.11", "3.12"]
 
     steps:
-    - uses: actions/checkout@v3
+    - uses: actions/checkout@v4
     - name: Set up Python ${{ matrix.python-version }}
-      uses: actions/setup-python@v3
+      uses: actions/setup-python@v5
      with:
        python-version: ${{ matrix.python-version }}
-    - name: Install dependencies
-      run: |
-        "${SHELL}" <(curl -L micro.mamba.pm/install.sh)
-        eval "$(micromamba shell hook --shell bash)"
-        micromamba activate base
-        micromamba install pytest-cov pytest-mpl xarray netcdf4 pandas numpy scikit-learn scipy pyproj cartopy metpy ipywidgets python=${{ matrix.python-version }} -c conda-forge
-        python -m pip install .
+    - name: Install environment with micromamba
+      uses: mamba-org/setup-micromamba@v1
+      with:
+        environment-file: tests/test-env.yml
+        create-args: python=${{ matrix.python-version }}
+        init-shell: bash
     - name: Test with pytest
       run: |
         eval "$(micromamba shell hook --shell bash)"
-        micromamba activate base
+        micromamba activate pyxlma-tests
         coverage run --source=pyxlma -m pytest --mpl --mpl-baseline-path=tests/truth/images/ --mpl-generate-summary=html,json --mpl-results-path=tests/mpl-results/ tests/
         coverage xml
     - name: Upload coverage reports to Codecov
-      uses: codecov/codecov-action@v3
+      uses: codecov/codecov-action@v4
      env:
        CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
     - name: Upload matplotlib test results
      if: always()
-      uses: actions/upload-artifact@v3
+      uses: actions/upload-artifact@v4
      with:
-        name: matplotlib-results
+        name: matplotlib-results-${{ matrix.python-version }}
        path: tests/mpl-results/
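
For local debugging, the "Test with pytest" step above can be approximated from Python instead of the shell. The sketch below is an assumption, not part of the PR: it presumes the pyxlma-tests environment from tests/test-env.yml is already created and activated, and it uses coverage's and pytest's programmatic APIs in place of the coverage run / coverage xml commands.

```python
# Rough local stand-in for the CI "Test with pytest" step (an assumption,
# not part of the PR). Run from the repository root inside the
# pyxlma-tests environment defined in tests/test-env.yml.
import coverage
import pytest

cov = coverage.Coverage(source=["pyxlma"])  # mirrors `coverage run --source=pyxlma`
cov.start()
exit_code = pytest.main([
    "--mpl",
    "--mpl-baseline-path=tests/truth/images/",
    "--mpl-generate-summary=html,json",
    "--mpl-results-path=tests/mpl-results/",
    "tests/",
])
cov.stop()
cov.save()
cov.xml_report()  # mirrors `coverage xml`
raise SystemExit(exit_code)
```

Measuring coverage in-process like this can miss import-time lines that `coverage run` would catch, so it is only a convenience for local iteration, not a replacement for the CI step.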
tests/test-env.yml (18 changes: 18 additions, 0 deletions)
@@ -0,0 +1,18 @@
+name: pyxlma-tests
+channels:
+  - conda-forge
+dependencies:
+  - pytest-cov
+  - pytest-mpl
+  - xarray
+  - netcdf4
+  - pandas
+  - numpy
+  - scipy
+  - scikit-learn
+  - pyproj
+  - metpy
+  - ipywidgets
+  - pip:
+    - git+https://github.com/deeplycloudy/lmatools
+    - -e ../
tests/test_flash.py (6 changes: 3 additions, 3 deletions)
@@ -7,14 +7,14 @@
 from pyxlma.lmalib.flash.properties import *
 
 
-def compare_dataarrays(tocheck, truth, var):
+def compare_dataarrays(tocheck, truth, var, rtol=1.e-5, atol=1.e-8):
     """Compare two dataarrays"""
     if truth[var].data.dtype == 'datetime64[ns]' or truth[var].data.dtype == 'timedelta64[ns]':
         if tocheck[var].data.dtype == 'float64':
             truth[var].data = truth[var].data.astype(float)/1e9
-        np.testing.assert_allclose(tocheck[var].data.astype(float), truth[var].data.astype(float))
+        np.testing.assert_allclose(tocheck[var].data.astype(float), truth[var].data.astype(float), rtol=rtol, atol=atol, equal_nan=True)
     else:
-        np.testing.assert_allclose(tocheck[var].data, truth[var].data)
+        np.testing.assert_allclose(tocheck[var].data, truth[var].data, rtol=rtol, atol=atol, equal_nan=True)
 
 
 def test_cluster_flashes():
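
As a usage note, the relaxed comparison can be exercised as in the sketch below. This is a hypothetical example: the flash_area variable and the toy datasets are stand-ins rather than data from the PR, and compare_dataarrays is assumed to be the helper from the diff above, in scope.

```python
# Hypothetical example of the relaxed comparison; compare_dataarrays is
# assumed to be the helper defined in tests/test_flash.py above.
import numpy as np
import xarray as xr

truth = xr.Dataset(
    {"flash_area": ("number_of_flashes", np.array([1.0, np.nan, 3.0]))}
)
tocheck = xr.Dataset(
    {"flash_area": ("number_of_flashes", np.array([1.0 + 1e-7, np.nan, 3.0]))}
)

# Passes: the values agree within rtol/atol, and NaNs in matching
# positions are treated as equal because of equal_nan=True.
compare_dataarrays(tocheck, truth, "flash_area", rtol=1e-5, atol=1e-8)
```

The explicit tolerances and equal_nan=True presumably keep the truth-file comparisons tolerant of small numerical and NaN-handling differences once the test stack moves to numpy 2.0, which is what the PR title suggests.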