
Improve package's doctests #1061

Open
1 of 17 tasks
bonjourmauko opened this issue Oct 7, 2021 · 1 comment
Labels
kind:improvement Refactoring and code cleanup kind:roadmap A group of issues, constituting a delivery roadmap kind:theme A group of issues, directly tied to an OKR

Comments

@bonjourmauko
Member

bonjourmauko commented Oct 7, 2021

Hi there!

I really enjoy OpenFisca, but I recently encountered an issue.

Here is what I did:

pytest openfisca_core openfisca_web_api --maxfail 0 --continue-on-collection-errors --quiet

Here is what I expected to happen:

XXX passed, X xfailed, XXX warnings in X.XXs

Here is what actually happened:

========================================================================================= short test summary info ==========================================================================================
FAILED openfisca_core/commons/formulas.py::openfisca_core.commons.formulas.apply_thresholds
FAILED openfisca_core/commons/formulas.py::openfisca_core.commons.formulas.switch
FAILED openfisca_core/holders/holder.py::openfisca_core.holders.holder.Holder.get_memory_usage
...
ERROR openfisca_core/scripts/measure_numpy_condition_notations.py::test_switch_select

Context:

Python provides doctests to help keep documentation testable.

They provide several benefits:

  • They help maintainers and contributors spot bugs and improvement opportunities.
  • They help reusers understand the code quickly.
  • They help separate testing contexts (unit, functional, ...).
  • They help produce the official documentation.
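As an illustration of the mechanism (a hypothetical function, not taken from the OpenFisca codebase), a doctest embeds runnable examples directly in a docstring:

```python
def apply_rate(amount, rate):
    """Apply a flat rate to an amount.

    Examples:
        >>> apply_rate(100.0, 0.5)
        50.0
        >>> apply_rate(0.0, 0.5)
        0.0
    """
    return amount * rate


if __name__ == "__main__":
    # Run the examples above and report any mismatch between
    # the expected output and the actual result.
    import doctest
    doctest.testmod()
```

Because the examples live next to the code they document, a reader sees expected behaviour at a glance, and a test runner can verify it never drifts.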

Complications:

Today, doctest coverage of the code is modest, the doctests are not run, and most of them are broken.

Some observed consequences:

  • Overall, the code is hard to grasp without a significant time investment, and thus hard to contribute to.
  • The code is hard to unit-test due to its current design.
  • As a result, there are very few unit tests.
  • The documentation was broken.

Proposal:

Fix, complete, and run doctests systematically!
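For context, a common way to run doctests systematically is pytest's built-in doctest collection. A sketch (the project's actual configuration and flags may differ):

```ini
; setup.cfg (or pytest.ini) — a sketch, not the project's actual config
[tool:pytest]
addopts = --doctest-modules
doctest_optionflags = ELLIPSIS NORMALIZE_WHITESPACE
```

With something like this in place, `pytest openfisca_core --quiet` would collect and run every doctest alongside the regular test suite, which is how failures like those listed above get surfaced on every run.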

TODO

@MattiSG
Member

MattiSG commented Oct 7, 2021

Thanks @maukoquiroga for providing a bit more background to this impressive series of PRs 🙂

Could you explain what “to work” and “it didn't” means for pytest? Did you not get failures when you expected to get them? Did the command simply not run?
