Running `pytest nmma/tests/*.py` should ideally pass all tests locally in addition to GitHub Actions. Caching the svdmodels in the workflow file, e.g. `nmma/.github/workflows/continous_integration.yml` (line 80 at `0395d1e`), currently leads to the analysis pytest failing locally, since `local_only` is set to `True` in `nmma/tests/analysis.py`.

The cached `Bu2019lm` model on the Potsdam `enlil` machine is not necessarily up to date with gitlab/zenodo (which are also currently inconsistent, see #317). There are a couple of ways we could address this issue:

- Point the workflow to gitlab or zenodo and download the model from there at the beginning of each workflow
- Include `Bu2019lm.pkl` and a specific filter file in the `nmma/tests/data` directory, using that path for the analysis test's `svdmodels` argument (see the sketch below)
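For the second option, the test could resolve the bundled model directory relative to the test file instead of relying on a cache. A minimal sketch, assuming the model file has been committed to `nmma/tests/data`; the helper and test names here are hypothetical, not the actual contents of `nmma/tests/analysis.py`:

```python
import os

# Hypothetical helper (not part of nmma/tests/analysis.py): resolve the
# bundled SVD model directory relative to this test file, so the analysis
# test no longer depends on a cached or freshly downloaded model.
def bundled_svdmodel_dir():
    return os.path.join(os.path.dirname(__file__), "data")


def test_bundled_model_is_present():
    svd_dir = bundled_svdmodel_dir()
    model_file = os.path.join(svd_dir, "Bu2019lm.pkl")
    # Bu2019lm.pkl and the filter file would be committed to nmma/tests/data;
    # svd_dir would then be passed as the analysis test's svdmodels argument.
    assert os.path.isfile(model_file)
```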
The cache is not stored on Potsdam machines. These are the original models. Caches are stored on GitHub servers (and now I cannot see any of the caches under the Actions tab). Caching was done in order to prevent timeouts (because of Zenodo). But yes, it would make sense to add the filter-specific model file in the data directory (I am not sure why I didn't do that previously).
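If the workflow instead downloads the model from gitlab/zenodo at the start of each run, the Zenodo timeout problem can be softened with retries. A rough sketch, using only `requests` and a placeholder URL (the real gitlab/zenodo location for `Bu2019lm` is not filled in here):

```python
import time

import requests

# Placeholder URL; the actual gitlab/zenodo record for Bu2019lm
# would be substituted here.
MODEL_URL = "https://example.org/Bu2019lm.pkl"


def download_model(url=MODEL_URL, dest="Bu2019lm.pkl", attempts=3, timeout=60):
    """Fetch the SVD model with simple retries to reduce timeout failures."""
    for attempt in range(1, attempts + 1):
        try:
            response = requests.get(url, timeout=timeout)
            response.raise_for_status()
            with open(dest, "wb") as f:
                f.write(response.content)
            return dest
        except requests.RequestException:
            if attempt == attempts:
                raise
            time.sleep(2 ** attempt)  # back off before retrying


if __name__ == "__main__":
    download_model()
```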