
[REVIEW]: GCM-Filters: A Python Package for Diffusion-based Spatial Filtering of Gridded Data #3947

Closed
60 tasks done
whedon opened this issue Nov 23, 2021 · 70 comments
Labels: accepted, Jupyter Notebook, published, Python, recommend-accept, review, TeX


whedon commented Nov 23, 2021

Submitting author: @NoraLoose (Nora Loose)
Repository: https://github.com/ocean-eddy-cpt/gcm-filters
Version: v0.2.3
Editor: @elbeejay
Reviewer: @callumrollo, @AleksiNummelin, @isgiddy
Archive: 10.5281/zenodo.6039860

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status


Status badge code:

HTML: <a href="https://joss.theoj.org/papers/bc8ad806627f0d754347686e21f00d40"><img src="https://joss.theoj.org/papers/bc8ad806627f0d754347686e21f00d40/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/bc8ad806627f0d754347686e21f00d40/status.svg)](https://joss.theoj.org/papers/bc8ad806627f0d754347686e21f00d40)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@callumrollo & @AleksiNummelin & @isgiddy, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @elbeejay know.

Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest.

Review checklist for @callumrollo

✨ Important: Please do not use the Convert to issue functionality when working through this checklist; instead, please open any new issues associated with your review in the software repository associated with the submission. ✨

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@NoraLoose) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @AleksiNummelin

✨ Important: Please do not use the Convert to issue functionality when working through this checklist; instead, please open any new issues associated with your review in the software repository associated with the submission. ✨

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@NoraLoose) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @isgiddy

✨ Important: Please do not use the Convert to issue functionality when working through this checklist; instead, please open any new issues associated with your review in the software repository associated with the submission. ✨

Conflict of interest

  • I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • Repository: Is the source code for this software available at the repository url?
  • License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • Contribution and authorship: Has the submitting author (@NoraLoose) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • Installation: Does installation proceed as outlined in the documentation?
  • Functionality: Have the functional claims of the software been confirmed?
  • Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • A statement of need: Does the paper have a section titled 'Statement of Need' that clearly states what problems the software is designed to solve and who the target audience is?
  • State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

whedon commented Nov 23, 2021

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @callumrollo, @AleksiNummelin, @isgiddy it looks like you're currently assigned to review this paper 🎉.

⚠️ JOSS reduced service mode ⚠️

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

⭐ Important ⭐

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, with GitHub's default behaviour, you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf


whedon commented Nov 23, 2021

Wordcount for paper.md is 1131


whedon commented Nov 23, 2021

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1029/2021MS002552 is OK
- 10.5334/jors.148 is OK
- 10.25080/majora-7b98e3ed-013 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1029/2019MS001726 is OK
- 1721.1/117188 is OK
- 10.5281/zenodo.4968496 is OK

MISSING DOIs

- 10.25080/majora-7b98e3ed-013 may be a valid DOI for title:  Dask: Parallel Computation with Blocked algorithms and Task Scheduling 

INVALID DOIs

- None


whedon commented Nov 23, 2021

Software report (experimental):

github.com/AlDanial/cloc v 1.88  T=0.15 s (233.5 files/s, 178551.3 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          10            428            366           1502
Jupyter Notebook                 8              0          23079            879
reStructuredText                 6            219            182            259
TeX                              1             20              0            204
YAML                             7             12             15            184
Markdown                         2             35              0            114
TOML                             1              2              0             11
make                             1              4              7              9
-------------------------------------------------------------------------------
SUM:                            36            720          23649           3162
-------------------------------------------------------------------------------


Statistical information for the repository '2bd90ce08a4de5d1191d58ac' was
gathered on 2021/11/23.
The following historical commit information, by author, was found:

Author                     Commits    Insertions      Deletions    % of changes
Andrew Ross                      8            89             20            1.10
Arthur                          24           483            420            9.10
Elizabeth A Yankovsk             6           132             41            1.74
Gustavo Marques                 10           632            198            8.37
Ian Grooms                      30           456            305            7.67
Julius Busecke                   6           104            209            3.16
NoraLoose                      116          3111           1557           47.06
Ryan Abernathey                 28          1461            686           21.64
Scott Bachman                    2             8              8            0.16

Below are the number of rows from each author that have survived and are still
intact in the current revision:

Author                     Rows      Stability          Age       % in comments
Andrew Ross                  70           78.7          0.9               14.29
Arthur                        5            1.0          7.9                0.00
Elizabeth A Yankovsk         25           18.9          1.0               16.00
Gustavo Marques             249           39.4          5.6               10.84
Ian Grooms                  201           44.1          5.7                5.97
Julius Busecke               83           79.8          9.2               39.76
NoraLoose                   936           30.1          4.6               10.36
Ryan Abernathey             727           49.8          5.9                8.12


whedon commented Nov 23, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@elbeejay

@callumrollo, @AleksiNummelin, and @isgiddy, thanks for agreeing to review this submission to JOSS. We are currently asking reviewers to try and complete their reviews in 6 weeks.

The JOSS review process is entirely open and transparent, and takes place on GitHub. Review comments can be made as issues in the GCM-Filters repository; please link this review issue when doing so (paste https://github.com/openjournals/joss-reviews/issues/3947 into the issue).

For reference, here are links to the JOSS documentation that may be helpful as you conduct your reviews:

Please feel free to ping me (@elbeejay) if you have any questions/concerns. Thanks again for agreeing to review for JOSS.

@NoraLoose

@whedon generate pdf

I have added one sentence to the acknowledgments of the paper - hope that was okay!


whedon commented Nov 30, 2021

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@callumrollo

Hi Nora, this looks like a solid submission for JOSS. I'm not a modeller, but will review the code as best I can. I have a question about authorship. Most of the authors have made contributions to the codebase, corresponding to the author order. However, I could not find code contributions by Laure Zanna. Could you please outline their contribution to GCM_Filters?

@NoraLoose

Hi @callumrollo, thanks so much for reviewing our submission!

We defined the authorship criteria here, following the JOSS recommendations.

The GCM-Filters package is being developed by the Ocean Eddy Climate Process Team (CPT). (The CPT is also the GitHub organization that owns the GCM-Filters repo.) Laure Zanna not only leads the CPT, but has also contributed through discussions of code and package ideas and through the implementation of the MOM5 Laplacians (partly out of GitHub's view, in our CPT meetings as well as in her research group meetings with Elizabeth Yankovsky and Arthur Guillaumin). She is invited as a co-author because her contributions clearly satisfy the following JOSS criterion:

Purely financial (such as being named on an award) and organizational (such as general supervision of a research group) contributions are not considered sufficient for co-authorship of JOSS submissions, but active project direction and other forms of non-code contributions are.

I hope this explanation helps! I'm happy to answer any further questions and concerns.


whedon commented Dec 7, 2021

👋 @AleksiNummelin, please update us on how your review is going (this is an automated reminder).


whedon commented Dec 7, 2021

👋 @isgiddy, please update us on how your review is going (this is an automated reminder).


whedon commented Dec 7, 2021

👋 @callumrollo, please update us on how your review is going (this is an automated reminder).

@callumrollo

There is a notebooks folder in the top level of the repo, with a Jupyter notebook enticingly named tutorial.ipynb. However, this appears to be a demo for a lesson on packaging or similar? I'm not sure what it's doing here. I'd suggest removing it lest potential users mistake it for a demo of the package functionality.

The documentation contains some great examples of package functionality. Perhaps these could be dumped into a notebook for ease of experimentation by potential users?

@callumrollo

Reading the API documentation, I am unclear on the difference between these two methods of the Filter class:

apply(self, field_or_dataset, dims):
"""Filter a field or xarray dataset with scalar Laplacian across the
dimensions specified by dims."""

apply_to_field(self, field, dims):
"""Filter a field with scalar Laplacian across the dimensions specified by dims."""

If apply can process a field or dataset, apply_to_field seems superfluous. If they perform different functions, this could be better explained in their docstrings.
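
For reference, here is a minimal sketch of how the two methods are invoked, based on the docstrings quoted above and the documented Filter constructor; the grid type, filter shape, parameter values, and the synthetic DataArray are illustrative assumptions, not a prescription:

import numpy as np
import xarray as xr
import gcm_filters

# Synthetic scalar field on a regular Cartesian grid (illustrative data only).
unfiltered = xr.DataArray(np.random.rand(64, 64), dims=["y", "x"])

# A Gaussian filter with a fixed filter scale on a regular grid.
gauss_filter = gcm_filters.Filter(
    filter_scale=4,   # target filter scale, in grid-cell units here
    dx_min=1,         # smallest grid spacing
    filter_shape=gcm_filters.FilterShape.GAUSSIAN,
    grid_type=gcm_filters.GridType.REGULAR,
)

# apply() accepts either a single field or a whole xarray Dataset ...
filtered = gauss_filter.apply(unfiltered, dims=["y", "x"])

# ... while apply_to_field(), per its docstring, takes a single field.
filtered_field = gauss_filter.apply_to_field(unfiltered, dims=["y", "x"])

If the two methods do serve distinct purposes (e.g. Dataset dispatch versus single-field filtering), a one-line cross-reference in each docstring would resolve the ambiguity.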

@callumrollo

While not essential, the maintainers might consider copying the contributing information from the docs website into a CONTRIBUTING.md in the repo so GitHub detects it and users can find it more easily.

@callumrollo

Similarly, issue and PR templates could make it easier for people to submit bug reports, feature requests, etc. Again, this is not an essential recommendation; the project already appears to be dealing well with issues and PRs.

@callumrollo

Overall a very impressive package. Some of the workings are a bit beyond me, but I managed to run some of the example code and mess around with filter kernels. Great testing suite! Once the small issues above are addressed, I'm happy for this to go to publication.

@NoraLoose

@whedon generate pdf

To be consistent between zenodo archive and JOSS paper, I changed v0.2 to v0.2.3 in the paper.


whedon commented Feb 10, 2022

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@elbeejay

@whedon set v0.2.3 as version


whedon commented Feb 11, 2022

OK. v0.2.3 is the version.

@elbeejay

@whedon set 10.5281/zenodo.6039860 as archive


whedon commented Feb 11, 2022

OK. 10.5281/zenodo.6039860 is the archive.

@elbeejay

@whedon generate pdf

@elbeejay

@whedon check references


whedon commented Feb 11, 2022

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1029/2021MS002552 is OK
- 10.5334/jors.148 is OK
- 10.25080/majora-7b98e3ed-013 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1029/2019MS001726 is OK
- 1721.1/117188 is OK
- 10.5281/zenodo.4968496 is OK

MISSING DOIs

- 10.25080/majora-7b98e3ed-013 may be a valid DOI for title:  Dask: Parallel Computation with Blocked algorithms and Task Scheduling 

INVALID DOIs

- None


whedon commented Feb 11, 2022

👉📄 Download article proof 📄 View article proof on GitHub 📄 👈

@elbeejay

Hi @NoraLoose, this looks good to me, I'll be recommending this for publication in JOSS; the final step is just going to involve one of the Editors-in-Chief looking over this review issue and the paper which should happen over the next few days. Thanks again to @AleksiNummelin, @callumrollo, and @isgiddy for volunteering your time and peer reviewing this submission.

@elbeejay

@whedon recommend-accept


whedon commented Feb 11, 2022

Attempting dry run of processing paper acceptance...

whedon added the recommend-accept label on Feb 11, 2022

whedon commented Feb 11, 2022

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1029/2021MS002552 is OK
- 10.5334/jors.148 is OK
- 10.25080/majora-7b98e3ed-013 is OK
- 10.1038/s41592-019-0686-2 is OK
- 10.1038/s41586-020-2649-2 is OK
- 10.1109/MCSE.2007.55 is OK
- 10.1029/2019MS001726 is OK
- 1721.1/117188 is OK
- 10.5281/zenodo.4968496 is OK

MISSING DOIs

- 10.25080/majora-7b98e3ed-013 may be a valid DOI for title:  Dask: Parallel Computation with Blocked algorithms and Task Scheduling 

INVALID DOIs

- None


whedon commented Feb 11, 2022

👋 @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof 👉 openjournals/joss-papers#2944

If the paper PDF and Crossref deposit XML look good in openjournals/joss-papers#2944, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.

@whedon accept deposit=true

@elbeejay

I'll note for the EiC who looks into this that there is no DOI associated with Dask; the reference used in this paper matches the citation recommended by Dask here.

@NoraLoose

Thanks so much @elbeejay!


kthyng commented Feb 11, 2022

Everything looks perfect and ready to go!


kthyng commented Feb 11, 2022

@whedon accept deposit=true


whedon commented Feb 11, 2022

Doing it live! Attempting automated processing of paper acceptance...

whedon added the accepted and published labels on Feb 11, 2022

whedon commented Feb 11, 2022

🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦


whedon commented Feb 11, 2022

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited 👉 Creating pull request for 10.21105.joss.03947 joss-papers#2948
  2. Wait a couple of minutes, then verify that the paper DOI resolves https://doi.org/10.21105/joss.03947
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

Any issues? Notify your editorial technical team...


kthyng commented Feb 11, 2022

Congratulations on your new publication @NoraLoose! Many thanks to editor @elbeejay and reviewers @callumrollo, @AleksiNummelin, and @isgiddy for your time, hard work, and expertise!!

kthyng closed this as completed on Feb 11, 2022

whedon commented Feb 11, 2022

🎉🎉🎉 Congratulations on your paper acceptance! 🎉🎉🎉

If you would like to include a link to your paper from your README, use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.03947/status.svg)](https://doi.org/10.21105/joss.03947)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.03947">
  <img src="https://joss.theoj.org/papers/10.21105/joss.03947/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.03947/status.svg
   :target: https://doi.org/10.21105/joss.03947

This is how it will look in your documentation:

(DOI badge image)

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:

@NoraLoose

Thanks @kthyng, @elbeejay, @isgiddy, @AleksiNummelin, and @callumrollo for your time, reviews, and editing. This has been a fun and great experience!
