This repository has been archived by the owner on Mar 14, 2024. It is now read-only.

Add lint and coverage report on ci builds #65

Open · wants to merge 1 commit into main

Conversation

@simonepri commented Jun 15, 2019

Types of changes

  • Docs change / refactoring / dependency upgrade
  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Motivation and Context / Related issue

The code needs to be linted during CI builds.

Please note that the build now fails due to some lint errors.
[Screenshot: CI build failing with lint errors]
Please share your preferred flake8 configs, if you have any.

How Has This Been Tested (if it applies)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document and completed the CLA (see CONTRIBUTING).
  • All tests passed, and additional code has been covered with new tests.

@facebook-github-bot added the CLA Signed label on Jun 15, 2019
@lw (Contributor) commented Jun 15, 2019

To be honest, I don't like this too much. While I agree that it would be good to have lint messages and code coverage information, I don't think that having them printed buried in the logs of our continuous integration is ideal. In another project I am maintaining we're using "GitHub apps" like Codacy and Codecov that add comments/checks to the GitHub pull request page. (This means that lint messages appear on the line they are about, and that a decreasing coverage causes a warning or error).

I'm not exceedingly happy with those two specific tools (Codacy has quite a few false positives and negatives, and Codecov seems to be a bit noisy). I'd love to hear suggestions for better alternatives. In any case, I like their "approach" more than what is introduced by this pull request...

What do you think?

PS: I'm also a bit wary of adding code coverage information because I expect our score there to be awful...

@simonepri (Author)

I don't think that having them printed buried in the logs of our continuous integration is ideal

I don't see any drawback in having them logged. The tools you mentioned can be seen as a visual aid to what's inside the build log, but that's just my point of view.

Instead, I believe it is important for the project to have a script inside setup.cfg that allows people to run the lint and test scripts locally on their machine before opening a PR. (The CI would then run the same command as well.)

I like their "approach" more than what is introduced by this pull request...

If we want to use Codecov, the changes introduced here are necessary anyway.
Indeed, to make Codecov work, we have to send the report generated by pytest-cov to them using their API. (And this is as easy as running the codecov binary after the tests; see the sketch below.)
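For illustration, a minimal sketch of what that CI step could look like, assuming the package is called torchbiggraph and that pytest-cov and the codecov uploader are installed (the names and flags are my assumptions, not something already in this PR):

    # run the test suite and write a coverage report to coverage.xml
    pytest --cov=torchbiggraph --cov-report=xml

    # upload the report to Codecov (the uploader picks up coverage.xml automatically)
    codecov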

@lw (Contributor) commented Jun 16, 2019

Well, I see two "drawbacks" in having the lint done as part of our CI: it slows the CI down (delaying the time-to-signal) and it adds noise (e.g., the CI for this PR is failing because of lint warnings and style issues, not real errors). I wouldn't mind this if it gave a useful report but, as I said, the fact that it is hidden in the logs makes it hard to find and use.

I would first explore external apps like Codacy (not necessarily that one, I'm just naming it because it's the only one I know). If they don't suit our needs then, well, we'll have to roll our own solution, and the easiest will be to have it be part of our CI. And you're right that even if we use Codecov we'll have to collect coverage data during our CI runs.

I do agree that it's good to have a simple way to run all tests, both by end users and inside the CI. I searched for that and found a setuptools solution that I implemented in #66. A plus of it is that it doesn't require pytest or other additional dependencies.
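For reference, such a setuptools-based test command is invoked like this from the repository root (the exact setup depends on what #66 does):

    # run the test suite through setuptools, no pytest or other extra dependency needed
    ./setup.py test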

In conclusion, let me think about it for a bit and I'll get back to you.

@simonepri (Author)

I wouldn't mind this if it gave a useful report but, as I said, the fact that it is hidden in the logs makes it hard to find and use.

I personally don't consider this hard to find and use. You click on details and it tells you exactly where and what the problem is. Indeed it's quite common for open source projects to have this kind of setup.
[Screenshot: failing check on a GitHub PR with a "Details" link to the lint output]

I searched for that and found a setuptools solution that I implemented in #66.

Great, I like it!
We should also add a lint command to run flake8 inside setup.cfg. Even though we may not want to run it during the CI build, contributors still need a way to check whether their code satisfies the code style used by this project before opening a PR.

@lw (Contributor) commented Jun 16, 2019

We should also add a lint command to run flake8 inside setup.cfg. Even though we may not want to run it during the CI build, contributors still need a way to check whether their code satisfies the code style used by this project before opening a PR.

Unlike unittest, there is no official linter in the standard library (Black is the closest, but I'm not ready to embrace it yet), and I don't want to introduce new dependencies or endorse one linter over another. Also, if one wants flake8 (which I've nothing against), one can already use it to lint PBG by running ./setup.py flake8 (flake8 installs a custom global distutils command).
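For example, assuming flake8 is installed in the active environment, from the repository root:

    # install the linter once; it registers a `flake8` setuptools command
    pip install flake8

    # lint the project, using the [flake8] configuration in setup.cfg or tox.ini if present
    ./setup.py flake8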

@simonepri (Author)

I don't want to introduce new dependencies or endorse one linter over another.

I was talking about flake8 in particular because it is mentioned in CONTRIBUTING, and I thought it was the one adopted by the project.

@lw (Contributor) commented Jun 16, 2019

Oops, I forgot about that. Maybe I copied it from another project when setting up this repo? I guess at this point we can stick to it... What do you think of the ./setup.py flake8 solution? I like it because it works out of the box and doesn't require us to do anything. (In particular, we're not adding a dependency: if people don't have flake8 they don't have that command but everything else works)

@simonepri (Author)

if people don't have flake8 they don't have that command but everything else works

I expect a proper dev setup to take care of installing dev dependencies for me. (I think extras_require can be used for this.)
Also, if we want to use it, we must define at least some basic configs to tell flake8 what to lint and what to ignore.

E.g.:

[options.extras_require]
testing =
    flake8

[flake8]
max-line-length = #some number
exclude =
    __pycache__,
    .git,
    .circleci,
    .github,
    docs,
    torch_extensions
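With something like this in setup.cfg, a contributor's local setup could then look as follows (the "testing" extra name is just the one used in the example above):

    # install the project in editable mode along with the testing extras (flake8 included)
    pip install -e ".[testing]"

    # run the linter before opening a PR
    ./setup.py flake8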
