
Memory leak of ts_library #1803

Closed
ouchuji opened this issue Apr 8, 2020 · 2 comments · Fixed by #1812

@ouchuji commented Apr 8, 2020

🐞 bug report

Affected Rule

ts_library

Is this a regression?

No.

Description

Building a lot of `ts_library` targets via a command such as `bazel build src/...` causes a memory leak.

Bazel creates a large number of node processes to compile the TypeScript.
[screenshot: excessive node processes]
Then all of the machine's memory is eaten by these node processes.
[screenshot: memory exhausted by node processes]
I found that this commit introduced the problem:
feat(typescript): use run_node helper to execute tsc 066a52c
I changed the code as below and the problem was solved; the number of node processes went back to normal.
[screenshot: modified code]
[screenshot: normal node process count]

Digging deeper, I found that the extra argument `--bazel_node_modules_manifest=%s` passed to the tsc_wrapper caused this problem; commenting it out also solves the problem.
[screenshot: the argument commented out]
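For context, here is a rough Starlark sketch of the kind of change described above. It is illustrative only: the real file layout and helper names in rules_nodejs (e.g. `write_node_modules_manifest`) may differ from what is shown, and this is not the actual patch from commit 066a52c.

```python
# Illustrative sketch only, NOT the actual rules_nodejs source.
# Approximates how the run_node helper (commit 066a52c) passes a node_modules
# manifest to every spawned tool; names and paths are assumptions.
def run_node(ctx, inputs, arguments, executable, **kwargs):
    node_tool = getattr(ctx.executable, executable)

    # Hypothetical helper name: writes a manifest of linked node_modules.
    modules_manifest = write_node_modules_manifest(ctx)

    # The extra argument below is what was identified as the trigger:
    # commenting it out kept the number of node processes normal.
    # arguments.append("--bazel_node_modules_manifest=%s" % modules_manifest.path)

    ctx.actions.run(
        inputs = inputs + [modules_manifest],
        arguments = arguments,
        executable = node_tool,
        **kwargs
    )
```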

🔬 Minimal Reproduction

Building a single `ts_library` target works fine, but building lots of ts targets via `bazel build ...` does not.
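A minimal sketch of the kind of setup that reproduces this. The load path assumes rules_nodejs 1.x with the `@bazel/typescript` npm package installed under the generated `@npm` workspace; adjust it to your setup.

```python
# BUILD.bazel sketch: many ts_library targets built in one invocation.
# Load path is an assumption for a rules_nodejs 1.x / @npm workspace setup.
load("@npm//@bazel/typescript:index.bzl", "ts_library")

# Each target spawns its own tsc_wrapped / node processes; with enough
# targets, memory usage grows until the machine is exhausted.
[
    ts_library(
        name = "lib_%d" % i,
        srcs = ["lib_%d.ts" % i],
    )
    for i in range(50)
]
```

Building them all at once (e.g. `bazel build src/...`) shows the growth; building one target at a time does not.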

🔥 Exception or Error


No exception, but the machine's memory is exhausted.

🌍 Your Environment

Operating System:

Ubuntu 18.04 LTS

Output of `bazel version`:

2.1.0

Rules_nodejs version:

(Please check that you have matching versions between WORKSPACE file and @bazel/* npm packages.)

1.5.0

Anything else relevant?

@alexeagle (Collaborator)

That's fantastic debugging! Thank you for digging into this and finding the root cause; that should help us find the time to fix it.

@chris-codaio

Just guessing here, but it's probably the root cause of recent reports in #1027.

alexeagle pushed a commit to alexeagle/rules_nodejs that referenced this issue Apr 9, 2020
This causes hard-to-reproduce resource leaks as demonstrated in bazel-contrib#1803
Also we didn't fully understand this before, for example if a worker process stayed running we'd never re-link under it when deps change
For now make the conservative fix to keep these two capabilities distinct

Fixes bazel-contrib#1803
gregmagolan added a commit to gregmagolan/rules_nodejs that referenced this issue Apr 9, 2020
Issue bazel-contrib#1803 was fixed by bazel-contrib#1800, which has already landed. bazel-contrib#1805 also fixes it via a different mechanism and will land soon.
gregmagolan pushed a commit to alexeagle/rules_nodejs that referenced this issue Apr 10, 2020
This causes hard-to-reproduce resource leaks as demonstrated in bazel-contrib#1803
Also we didn't fully understand this before, for example if a worker process stayed running we'd never re-link under it when deps change
For now make the conservative fix to keep these two capabilities distinct

Fixes bazel-contrib#1803
alexeagle pushed a commit that referenced this issue Apr 10, 2020
This causes hard-to-reproduce resource leaks as demonstrated in #1803
Also we didn't fully understand this before, for example if a worker process stayed running we'd never re-link under it when deps change
For now make the conservative fix to keep these two capabilities distinct

Fixes #1803