Node 12 uses ~30% extra memory #28205
Comments
/cc @ulan
I can reproduce the regression. At first glance it seems to be caused by slow progress in the marking phase of the GC. Investigating.
Actually, this looks like a memory leak in V8. I bisected to [deps: update V8 to 7.4]: #26685. The flag that controls the feature is read-only and can be set only at build time. I checked that setting track_constant_fields to false in v8/src/flag-definitions.h on the v12.x branch fixes the issue locally. @ofrobots, @hashseed do we want to disable the feature in Node 12? Unfortunately, the flag no longer exists in upstream V8, so it is not possible to disable the feature on the master branch. I will file an issue for upstream V8 and start an investigation with the compiler folks.
The plot thickens! Thanks for looking at this @ulan 👍
Upstream issue: https://bugs.chromium.org/p/v8/issues/detail?id=9507
@filipesilva you're welcome, and thank you for filing a bug with detailed repro steps.
track_constant_fields exposes a large regression in memory usage due to a suspected memory leak. Disable this until a proper fix is ready. Ref: https://bugs.chromium.org/p/v8/issues/detail?id=9507 Ref: nodejs#28205
I investigated, and it's triggered by a bug in Angular that used to get very lucky with an optimization we removed. The culprit is a `private _clone(obj: any): any` method (a sketch of the pattern follows below). Its for-in loop also iterates over the prototype chain, copying many functions from the prototypes onto the cloned object as own properties. This doesn't make sense, since the cloned object also has the right prototype, so in a way it has all these functions twice now: on the prototype chain and as own properties. The reason this was better before is that older versions of V8 had an optimization that detected function properties that are always the same and shared them in the hidden class instead of actually putting them on the object. This is V8 doing something like prototypes without the user using prototypes. When we enabled constant field tracking, which serves a similar purpose in other cases, we removed this old optimization. This results in us always putting own properties into the object, so they always consume memory, as one might actually expect.
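To illustrate the pattern being described, here is a minimal, hypothetical reconstruction (not the actual Angular source; the names are made up, and an ES5-style prototype object is used because methods assigned that way are enumerable, which is what lets for-in see them):

```ts
// Hypothetical reconstruction of the problematic clone pattern.
const widgetProto = {
  render(): string { return 'rendered'; },
  destroy(): void { /* tear-down logic */ },
};

function _clone(obj: any): any {
  const copy: any = Object.create(Object.getPrototypeOf(obj));
  // for-in iterates inherited enumerable properties as well, so the
  // prototype methods `render` and `destroy` are copied onto `copy`
  // as own properties, even though `copy` already inherits them.
  for (const key in obj) {
    copy[key] = obj[key];
  }
  return copy;
}

const original: any = Object.create(widgetProto);
original.id = 1;

const cloned = _clone(original);
console.log(Object.keys(cloned)); // ['id', 'render', 'destroy'] (methods duplicated as own properties)
```

With constant field tracking enabled, those duplicated function-valued own properties take real space on every cloned object instead of being folded into the hidden class.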
@tebbi thank you so much for pinpointing what was happening here! Can you tell me how you figured out that it was exactly in that clone that things were going awry? I'm trying to identify other places where we use too much memory and would appreciate better techniques for doing so.
I used --trace-gc-object-stats to find out that the memory regression came from JavaScript objects getting bigger and going to dictionary mode. Then I instrumented the V8 code to print a stack trace whenever an object goes to dictionary mode because of too many properties (that's in Map::TransitionToDataProperty() in src/objects/map.cc). These stack traces pointed to the _clone function.
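For cases where rebuilding V8 with instrumentation isn't practical, a rough JS-level check (an editorial suggestion, not a technique from this thread) is to run the suspect code path in a loop and watch heap growth with process.memoryUsage(); the `measure` helper below is hypothetical:

```ts
// Rough JS-level heuristic: allocate through a suspect code path many times
// and compare retained heap before and after. Run with `node --expose-gc`
// so that global.gc() is available and the numbers are more stable.
function measure(label: string, makeOne: () => unknown): void {
  const retained: unknown[] = [];
  if ((global as any).gc) (global as any).gc();
  const before = process.memoryUsage().heapUsed;
  for (let i = 0; i < 100_000; i++) {
    retained.push(makeOne());
  }
  if ((global as any).gc) (global as any).gc();
  const after = process.memoryUsage().heapUsed;
  console.log(`${label}: ~${((after - before) / 1024 / 1024).toFixed(1)} MB retained`);
}

// Example (using the clone sketch above):
//   measure('for-in clone', () => _clone(original));
```

This won't explain why objects got bigger the way --trace-gc-object-stats does, but it can narrow down which code path is responsible.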
This is fixed in Angular upstream now. Does this resolve the issue for Node.js?
@tebbi I'll re-run benchmarks for the original test cases when Angular releases the fix, which should be today I think.
I tried on Windows again, same steps as my original comment, but using a checked-out commit that includes the Angular fix:
Node 12.4.0:
Node 10.16.0:
So if we compare average/peak with my original results, we see:
Node 12.4.0 shows roughly the same numbers as Node 10.16.0 after angular/angular@24ca582 (the change that @tebbi suggested). I think this issue is resolved: the 30% to 70% increase in memory usage was indeed caused by that loop. I wonder if there's a good way to identify these things in the future.
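For context, the general shape of such a fix is to copy own properties only; the sketch below is illustrative and is not the actual diff in angular/angular@24ca582:

```ts
// Illustrative sketch of the general fix shape (not the actual Angular change):
// copy own enumerable properties only, so inherited methods stay on the prototype.
function _cloneOwnOnly(obj: any): any {
  const copy: any = Object.create(Object.getPrototypeOf(obj));
  for (const key of Object.keys(obj)) {
    copy[key] = obj[key];
  }
  return copy;
}
```

Each clone then carries only its own data fields, while the methods remain shared on the prototype.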
Over at Angular CLI we recently instructed our users to update to Node 12. In angular/angular-cli#13734 we started getting reports that projects that used to build correctly were now hitting the memory limit.
I looked a bit further into these reports in https://github.com/filipesilva/angular-cli-perf-benchmark and found that projects were using more memory in Node 12 than they were using in Node 10. In one case the memory usage went up by 70%.
Reproduction steps:
- git clone https://github.com/filipesilva/clarity && cd clarity && git checkout d19039902 && npm i
- npm run populate-build-cache to populate the disk cache.
- npm run benchmark to benchmark the build over 5 builds and save the results (this step was run once per Node version, 10.16.0 and 12.4.0).

Results on my Windows dev machine were:

Results on a CircleCI Linux machine were:
Please note that in the Linux results I did two things differently: I added --max_old_space_size=2400 on the Node 12.4.0 benchmark because Node 10.16.0 didn't need it. Without increasing the max heap size, the Node 12.4.0 build fails by hitting the memory limit (which is how we found that the Node 12 dynamic heap size maxes out at 2048MB, #28202).
The numbers that matter most here are Average Memory usage and Peak Memory usage. On Windows these are around 1900/2670 MB for Node 12.4.0 and 1450/1980 MB for Node 10. On Linux these are around 1950/2800 MB for Node 12.4.0 and 1150/1650 MB for Node 10.
This indicates Node 12.4.0 is using a lot more memory in some cases.
The Windows and Linux results for Node 10.16.0 are a bit different. I think the difference is because on the Linux test I didn't increase the default memory limit, which forced more aggressive GC. But it might also be that the machine specs influenced it. I don't feel that changes the conclusion though.
https://github.com/filipesilva/angular-cli-perf-benchmark contains more examples, and in all of them Node 12.4.0 used more memory. Node 12 does have a dynamic memory limit, but in this reproduction we can see a difference even when setting the memory limit manually.
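As a side note on the heap-limit discussion above (an editorial suggestion, not something from the thread): the effective limit a given Node binary applies can be read from v8.getHeapStatistics(), which makes it easy to confirm whether --max_old_space_size took effect:

```ts
import * as v8 from 'v8';

// Print the current heap size limit in MB. Comparing a plain run against one
// started with --max_old_space_size=2400 shows whether the flag took effect.
const limitMb = v8.getHeapStatistics().heap_size_limit / 1024 / 1024;
console.log(`heap_size_limit: ${limitMb.toFixed(0)} MB`);
```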