
Memory leak when using --coverage #5837

Closed
philipp-spiess opened this issue Mar 20, 2018 · 14 comments

@philipp-spiess
Contributor

Do you want to request a feature or report a bug?

This is a bug report.

What is the current behavior?

I was able to isolate an example where adding the --coverage option causes a memory leak, whereas the same run without it works just fine.

More specifically, the issue occurs when using a very large JavaScript file (in our case, the asm.js artifact of PSPDFKit for Web). To work with JavaScript files of that size without issues, we have to add the file to the transformIgnorePatterns configuration option. As soon as we do this, the test passes as expected.

This was working fine in v22.2.2, and without the --coverage option it still works fine.

However, with the --coverage option this test started to leak memory when we updated from v22.2.2 to v22.3.0, so I assume the change that introduced the issue is somewhere in this diff.
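
For context, excluding the large artifact from transformation looks roughly like this (a minimal sketch; the file name below is a placeholder, the real configuration is in the repository linked further down):

// jest.config.js (sketch only; the actual pattern lives in the linked repo)
module.exports = {
  transformIgnorePatterns: [
    '/node_modules/',
    // Placeholder pattern for the very large asm.js artifact that
    // should not be run through babel-jest.
    'pspdfkit[.]asm[.]js$',
  ],
};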

If the current behavior is a bug, please provide the steps to reproduce and
either a repl.it demo through https://repl.it/languages/jest or a minimal
repository on GitHub that we can yarn install and yarn test.

I set up a repo that reproduces this behavior here.

What is the expected behavior?

yarn jest

and

yarn jest --coverage

Both finish in a reasonable time without causing the heap to explode.

Please provide your exact Jest configuration

https://github.com/PSPDFKit-labs/jest-leak/blob/master/package.json

Run npx envinfo --preset jest in your project directory and paste the
results here

  System:
    OS: macOS High Sierra 10.13.1
    CPU: x64 Intel(R) Core(TM) i7-4790K CPU @ 4.00GHz
  Binaries:
    Node: 9.7.1
    Yarn: 1.3.2
    npm: 5.6.0
  npmPackages:
    jest:
      wanted: ^22.4.2
      installed: 22.4.2

However, I was also able to reproduce this on other machines (e.g. our CI servers).

I also tried different versions of Node without seeing a difference in this behavior.

Please let me know if I can provide further details for you.

@philipp-spiess
Contributor Author

I just found out about coveragePathIgnorePatterns; adding it makes the example pass. I still have no idea why this worked before, though.
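
For anyone else hitting this, the workaround looks roughly like this (a sketch; the pattern below is a placeholder for the large artifact):

// jest.config.js (sketch; the pattern below is a placeholder)
module.exports = {
  coveragePathIgnorePatterns: [
    '/node_modules/',
    // Exclude the huge asm.js file from coverage instrumentation.
    'pspdfkit[.]asm[.]js$',
  ],
};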

@jy95

jy95 commented Mar 20, 2018

Thanks @philipp-spiess for this issue: I also have a memory leak with my TypeScript repo: https://github.com/jy95/mediaScan/blob/master/jest.config.js

Even when I use the compiled version of the lib, and without the collectCoverage flag, the tests take a lot longer (I'd estimate 3x-4x the regular time compared to v22.2.0).

As far as I know, v22.2.0 introduced a bug in coverage collection that was fixed in 22.4.x. Maybe it's related?

  System:
    OS: Windows 10
    CPU: x64 Intel(R) Core(TM) i7-4500U CPU @ 1.80GHz
  Binaries:
    Node: 8.9.3
    Yarn: Not Found
    npm: 5.6.0
  npmPackages:
    jest:
      wanted: ^22.4.2
      installed: 22.4.2

@mnquintana

Just ran into this issue too, and it's sadly preventing us from using Jest for code coverage at all 😢 (the coveragePathIgnorePatterns workaround didn't work for us; we were already using it and still getting the memory leak). Is there any info I could provide that would help the maintainers with debugging?

@SimenB
Member

SimenB commented May 4, 2018

@philipp-spiess We use babel to set up code coverage (babel-plugin-istanbul), which is why it runs even though you use transformIgnorePatterns, and the reason you need coveragePathIgnorePatterns.

@mnquintana a reproduction would be nice 🙂

@JaSpr

JaSpr commented Jan 11, 2019

We have coveragePathIgnorePatterns and we're still getting massive memory usage when --coverage is on.

For us, without --coverage each thread ends up at about 500 MB of memory usage (max) over the course of our 2652 tests. With --coverage, some threads sometimes crash because they hit the default max_old_space_size (it seems to be around 1550 MB when they crash).

Increasing the max_old_space_size did resolve the crashes, but we don't want to be doing that on CI.

@SimenB
Member

SimenB commented May 1, 2020

Is this a leak, or just high memory usage when collecting coverage?

You can use v8 code coverage if you want; it uses way less memory: https://jestjs.io/docs/en/configuration#coverageprovider-string
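
For reference, switching providers is a one-line config change (sketch):

// jest.config.js
module.exports = {
  // Use V8's built-in coverage instead of Babel-based instrumentation.
  coverageProvider: 'v8',
};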

@omgoshjosh

Hi @SimenB, I have been unable to get v8 coverage to run without a "heap out of memory" error. Is there any advice or an existing ticket for that? I have been following all the performance and memory issues, but every time I see v8 coverage mentioned I try it, it fails, and I stick with the default. Do you have any insight into how I could start an investigation, e.g. which commands and flags to run?

@HiptJo

HiptJo commented Apr 18, 2021

I've got the same issue with one of my projects.

It probably has something to do with Chrome / Puppeteer, as the same config works with other projects.
Switching to v8 code coverage (defined in jest.config.js) solves my issue, though...

@Duncan-Brain

Duncan-Brain commented Dec 28, 2021

Maybe this will help someone.

I was getting a heap out of memory error only in my GitHub workflows, and only when using --coverage. Locally everything seemed to work fine. I made the mistake of running my tests in workflows with Docker, like below.

docker build -t temp-image -f ./frontend/Dockerfile.dev ./frontend
docker run temp-image npm run test         

where npm run test runs jest --ci --coverage.

Coverage was hitting an /opt/ folder that I believe the node:alpine image generated in my container. More specifically, the file '/opt/yarn-v1.22.17/lib/toSubscriber.js.map' could not be found. I discovered this when switching to the v8 coverage provider, because it has better error messaging.

My first solution was to add coveragePathIgnorePatterns: ['/opt/'] in jest.config.js, which worked great.

A slightly better solution when using Docker would be to use a working directory (WORKDIR /app) in your Dockerfile and then reference it in jest.config.js with roots: ['../', '/app']. A sketch of this follows below.
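
A minimal sketch of those two workarounds combined, assuming the app lives in /app (paths are illustrative):

// jest.config.js (sketch; paths are illustrative)
module.exports = {
  // Restrict Jest's file scanning to the application directory so it
  // never walks into image-level folders such as /opt/.
  roots: ['/app'],
  // Explicitly exclude /opt/ from coverage collection as well.
  coveragePathIgnorePatterns: ['/opt/'],
};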

But my best solution was to not use Docker at all for running my Jest tests; it was completely unnecessary for me, and I had missed this obvious fix.

@SimenB
Member

SimenB commented Mar 3, 2022

The reproduction in the OP is explained here: #5837 (comment)

Beyond that I don't think there have been any reproductions? I'll close this, but feel free to open up a new issue if it's still a problem for people.


Your example is interesting, @Duncan-Brain. Do you think Jest could have surfaced the issue better?

@SimenB SimenB closed this as completed Mar 3, 2022
@Duncan-Brain

@SimenB It's possible; I never did confirm my suspicion about what was going on. But I'm a bit of a noob at debugging, and it's my first year using this stack, so I'm out of my depth there.

If I recall correctly, I had some trouble creating a repro repo. But let me try again, and if I can reproduce it I'll open a new issue with a link to the reproduction.

@SimenB
Member

SimenB commented Mar 3, 2022

Yes please, that'd be great 👍

@Duncan-Brain

Okay @SimenB, please see #12541 (comment).

I was mistaken: I had been able to reproduce my own issue, but I figured I was doing a few things wrong and that it was not worth pursuing further.

@github-actions

github-actions bot commented Apr 3, 2022

This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Apr 3, 2022