
Track and compile tests in webpack #1209

Closed · wants to merge 6 commits from the tests-in-webpack branch

Conversation

@rricard (Contributor) commented Dec 8, 2016

Should fix #1169.

[Screenshot: terminal output of `yarn start`]

Why would I do that?

Right now CRA surfaces lint (and soon Flow) errors through the webpack compilation process (for example, the ESLint checks run in a loader). The issue is that this approach never checks files that are not imported by the main app, such as tests.
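For context, this is roughly what that setup looks like: ESLint runs as a webpack preloader, so only modules reached from the app's entry point ever pass through it. A simplified sketch in webpack 1 syntax (the paths shown are assumptions, not CRA's exact config):

```js
const path = require('path');

// Simplified sketch of the lint-via-loader setup (webpack 1 syntax).
// Because linting happens during module resolution, a file that is never
// imported from the entry point is never linted, tests included.
module.exports = {
  entry: './src/index.js',
  module: {
    preLoaders: [
      {
        test: /\.(js|jsx)$/,
        loader: 'eslint',
        include: path.resolve('src'),
      },
    ],
  },
};
```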

How does this PR solve the problem?

This PR adds files matching a glob pattern (representing the tests) to the list of files webpack watches, and triggers an independent child compilation (the same technique html-webpack-plugin uses) for those files. The child compilation runs their contents through the available loaders (including ESLint) and emits the output into a temporary directory.
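A minimal sketch of the technique (illustrative only, not the PR's actual code; the plugin name, glob pattern, and output path are all assumptions), against the webpack 1/2 plugin API of the time:

```js
const path = require('path');
const glob = require('glob');
const SingleEntryPlugin = require('webpack/lib/SingleEntryPlugin');

class CompileTestsPlugin {
  constructor(pattern) {
    this.pattern = pattern; // e.g. 'src/**/*.test.js'
  }

  apply(compiler) {
    compiler.plugin('make', (compilation, callback) => {
      // Compile every matching test file in an independent child
      // compilation, the same way html-webpack-plugin compiles its
      // template, so the files pass through all configured loaders
      // (including eslint-loader) without touching the main bundle.
      const childCompiler = compilation.createChildCompiler(
        'compile-tests',
        { filename: '.tmp-tests/[name].js' } // throwaway output dir
      );
      glob.sync(this.pattern).forEach((file, i) => {
        childCompiler.apply(
          new SingleEntryPlugin(compiler.context, `./${file}`, `test-${i}`)
        );
      });
      childCompiler.runAsChild(callback);
    });

    compiler.plugin('after-compile', (compilation, callback) => {
      // Register the test files as watch dependencies so edits to them
      // retrigger the dev-server build.
      glob.sync(this.pattern).forEach((file) => {
        compilation.fileDependencies.push(path.resolve(file));
      });
      callback();
    });
  }
}

module.exports = CompileTestsPlugin;
```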

Shouldn't we add this to the test script?

Doing so would require adding a new way to run ESLint on top of the Jest watcher. I don't even know whether that's possible; webpack plugins, however, are perfect for this kind of thing!

This will not process them for now...
This result will not be used for serving the pages, but it will be used for tracking lint/type errors in those files.
@gaearon (Contributor) commented Dec 8, 2016

Thanks for looking into this. However, people most often write tests while in npm test mode, so they won't see those messages.

@rricard (Contributor, author) commented Dec 8, 2016

@gaearon I completely agree, but at the moment I don't really know how we could integrate this with Jest correctly; I have to admit I lack the knowledge to extend it. For now, I would say this is a seamless temporary solution to the problem. It is indeed not perfect, but it will at least do two things:

  • Show lint errors in tests on CI
  • If you are running both a dev server and a test watcher, the errors will still show up correctly

I'll take a look at Jest to see how we can extend it with ESLint (maybe during the transform?), but I think both solutions are complementary: we want to see those errors everywhere, consistently.

@gaearon (Contributor) commented Dec 8, 2016

@cpojer Do you have any thoughts on integrating lint output in Jest? Is this something you'd take contributions for? For example, transforms could get an API to emit custom messages.
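For illustration, one shape this could take with no new Jest API at all: a custom transform that lints the file before delegating to babel-jest. A hypothetical sketch, using ESLint's CLIEngine (the API of that era) and assuming babel-jest's then-exported process():

```js
// Hypothetical Jest transform: lint each file as it is transformed, then
// hand the actual transformation off to babel-jest. Output simply goes to
// the console; a real integration would need the message API discussed here.
const { CLIEngine } = require('eslint');
const babelJest = require('babel-jest');

const cli = new CLIEngine();
const formatter = cli.getFormatter();

module.exports = {
  process(src, filename, config, options) {
    const report = cli.executeOnText(src, filename);
    if (report.errorCount > 0 || report.warningCount > 0) {
      console.warn(formatter(report.results));
    }
    return babelJest.process(src, filename, config, options);
  },
};
```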

@rricard (Contributor, author) commented Dec 8, 2016

@gaearon what do you think about having the warnings in both webpack and Jest?

@gaearon (Contributor) commented Dec 8, 2016

My only concern with doing it in webpack is that it will slow down the dev server, because it needs to track more files.

@rricard (Contributor, author) commented Dec 8, 2016

@gaearon well, that's indeed one of my concerns too. If we can get Jest to show those errors, I'll need to think about the output we want for the Flow checking, so that it's consistent: not project-wide but only on the concerned files, in both places (tests and server).

@cpojer (Contributor) commented Dec 8, 2016

I don't think this fits into Jest, to be honest. Maybe you can convince me. The watch mode needs to be refactored; I'm not completely opposed to adding more information and running a script for it if you also do the refactor (basically, split the watch feature out of https://github.com/facebook/jest/blob/master/packages/jest-cli/src/jest.js into a separate package).

@rricard (Contributor, author) commented Dec 8, 2016

To sum up, the choice is:

Solution 1

Webpack and Jest consistently show the same errors for the whole project.

Potential con: slower webpack builds

Solution 2

Webpack will only show errors related to its own bundle. Jest will only show errors related to the files it saw.

Potential con: Flow errors silently invisible from one point of view or the other (for instance, a mistake in a part of the bundle could trigger an error in a test, and we won't see it unless we run the tests ...)

@rricard (Contributor, author) commented Dec 8, 2016

@cpojer I could take a look if you are open to it. The idea is just to add some error-reporting features to the transform process. We don't need more than that; from there, CRA would handle running ESLint/Flow.

@cpojer (Contributor) commented Dec 8, 2016

Honestly, I don't think it would be a good idea to slow down test runs through the use of external tools. I'm worried Jest would be blamed for it.

@rricard (Contributor, author) commented Dec 8, 2016

@cpojer @gaearon 🤔 maybe we could create another script (static-checks?) for that? That way no one would be blamed except ESLint and Flow themselves...
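Concretely, that could be as small as a new entry in package.json (the static-checks name and the exact commands here are only the hypothetical shape of the idea, not anything CRA ships):

```json
{
  "scripts": {
    "start": "react-scripts start",
    "test": "react-scripts test --env=jsdom",
    "static-checks": "eslint src && flow check"
  }
}
```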

@rricard (Contributor, author) commented Dec 8, 2016

I mean, that's not the best solution from a DX perspective, but I agree that slowing down either Jest or webpack could be bad... Well, if I had to choose, I would rather penalise webpack...

@rricard (Contributor, author) commented Dec 8, 2016

@gaearon I'll try to think about it tonight. Maybe I'll have some other solution tomorrow!

@gaearon (Contributor) commented Feb 12, 2017

I don't think we'll be moving forward with this. Thanks for taking a look!

@gaearon closed this on Feb 12, 2017
@rricard deleted the tests-in-webpack branch on February 22, 2017 at 17:30
The lock bot locked and limited conversation to collaborators on Jan 21, 2019
Successfully merging this pull request may close issue #1169: "ESLint checks (and the upcoming Flow integration) is missing in tests"