
Significant slowdown of createContext and runInContext #29842

Closed
yrsh opened this issue Oct 4, 2019 · 9 comments
Labels
performance Issues and PRs related to the performance of Node.js. vm Issues and PRs related to the vm subsystem.

Comments

@yrsh

yrsh commented Oct 4, 2019

In the latest release, v12.11.0, these methods are significantly slower than in v10.16.3.
Here is a simple example where contexts are created and then a script is run:
https://gist.github.com/yrsh/28d274727c1e67f88d0cdf2e2091b6c2

For example, on my local machine execution time with v12 is 21 ms, but with v10 it is 7 ms.

@bnoordhuis bnoordhuis added performance Issues and PRs related to the performance of Node.js. vm Issues and PRs related to the vm subsystem. labels Oct 4, 2019
@bnoordhuis
Member

Reproducing here for posterity:

const vm = require('vm')

const ctxFn = new vm.Script(`
  var b = Math.random();
  var c = a + b;
`);

function run(m) {
  const ctx = {a: 'a'};
  vm.createContext(ctx);
  for (let i = 0; i < m; i++) {
    ctxFn.runInContext(ctx);
  }
}

function test(n, m) {
  const s = Date.now();
  for (let i = 0; i < m; i++) {
    run(m);
  }
  console.log(Date.now() - s);
}

test(1000, 10);

Are you sure you're testing what you think you're testing? n is never used in test().

@joyeecheung
Member

This is what I get from our vm benchmarks using

node benchmark/compare.js --new /Users/joyee/.nvm/versions/node/v12.5.0/bin/node --old /Users/joyee/.nvm/versions/node/v10.16.0/bin/node vm > out.csv
                                                                    confidence improvement accuracy (*)   (**)  (***)
 vm/run-in-context.js withSigintListener=0 breakOnSigint=0 n=1             ***     53.32 %       ±4.14% ±5.50% ±7.16%
 vm/run-in-context.js withSigintListener=0 breakOnSigint=1 n=1             ***     40.84 %       ±2.24% ±2.99% ±3.90%
 vm/run-in-context.js withSigintListener=1 breakOnSigint=0 n=1             ***     51.84 %       ±2.00% ±2.67% ±3.48%
 vm/run-in-context.js withSigintListener=1 breakOnSigint=1 n=1             ***     45.36 %       ±2.52% ±3.37% ±4.41%
 vm/run-in-this-context.js withSigintListener=0 breakOnSigint=0 n=1        ***     41.50 %       ±2.72% ±3.64% ±4.77%
 vm/run-in-this-context.js withSigintListener=0 breakOnSigint=1 n=1        ***     32.68 %       ±3.90% ±5.19% ±6.76%
 vm/run-in-this-context.js withSigintListener=1 breakOnSigint=0 n=1        ***     41.44 %       ±4.25% ±5.67% ±7.40%
 vm/run-in-this-context.js withSigintListener=1 breakOnSigint=1 n=1        ***     34.43 %       ±4.27% ±5.69% ±7.41%

The run-in-context numbers actually improved, so the slowdown may come from createContext instead.

@joyeecheung
Member

joyeecheung commented Oct 4, 2019

I tried with this benchmark

'use strict';

const common = require('../common.js');

const bench = common.createBenchmark(main, {
  n: [100]
});

const vm = require('vm');

const ctxFn = new vm.Script(`
  var b = Math.random();
  var c = a + b;
`);

function main({ n }) {
  bench.start();
  let context;
  for (var i = 0; i < n; i++) {
    context = vm.createContext({ a: 'a' });
  }
  bench.end(n);
  ctxFn.runInContext(context);
}

There is a -75.07% performance regression on master with --without-node-snapshot, and the result is similar when comparing v12.5.0 (with the embedded snapshot) against v10.16.0.

joyeecheung added a commit that referenced this issue Oct 10, 2019
PR-URL: #29845
Refs: #29842
Reviewed-By: Rich Trott <rtrott@gmail.com>
Reviewed-By: Ruben Bridgewater <ruben@bridgewater.de>
@joyeecheung
Member

joyeecheung commented Oct 10, 2019

I profiled the benchmark (extracted into a separate script).

Output of --prof:

Statistical profiling result from isolate-0x1040dd000-5581-v8.log, (194 ticks, 0 unaccounted, 0 excluded).

 [Shared libraries]:
   ticks  total  nonlib   name
      3    1.5%          /usr/lib/system/libsystem_platform.dylib
      1    0.5%          /usr/lib/system/libsystem_kernel.dylib
      1    0.5%          /usr/lib/libc++abi.dylib
      1    0.5%          /usr/lib/libc++.1.dylib

 [JavaScript]:
   ticks  total  nonlib   name

 [C++]:
   ticks  total  nonlib   name
     89   45.9%   47.3%  T __ZNK2v818PropertyDescriptor3setEv
     44   22.7%   23.4%  T node::contextify::ContextifyContext::MakeContext(v8::FunctionCallbackInfo<v8::Value> const&)
     19    9.8%   10.1%  T __kernelrpc_vm_remap
     18    9.3%    9.6%  T node::native_module::NativeModuleEnv::CompileFunction(v8::FunctionCallbackInfo<v8::Value> const&)
      2    1.0%    1.1%  t _tiny_malloc_should_clear
      2    1.0%    1.1%  T _read
      1    0.5%    0.5%  t _rack_region_insert
      1    0.5%    0.5%  t _mvm_aslr_init
      1    0.5%    0.5%  T node::contextify::ContextifyContext::PropertyGetterCallback(v8::Local<v8::Name>, v8::PropertyCallbackInfo<v8::Value> const&)
      1    0.5%    0.5%  T node::binding::GetInternalBinding(v8::FunctionCallbackInfo<v8::Value> const&)
      1    0.5%    0.5%  T node::PatchProcessObject(v8::FunctionCallbackInfo<v8::Value> const&)
      1    0.5%    0.5%  T _semaphore_create
      1    0.5%    0.5%  T _proc_set_dirty
      1    0.5%    0.5%  T _malloc_zone_malloc
      1    0.5%    0.5%  T _malloc_set_zone_name
      1    0.5%    0.5%  T _fstatfs64
      1    0.5%    0.5%  T _fscanf
      1    0.5%    0.5%  T _freopen
      1    0.5%    0.5%  T _free
      1    0.5%    0.5%  T ___pthread_init

 [Summary]:
   ticks  total  nonlib   name
      0    0.0%    0.0%  JavaScript
    188   96.9%  100.0%  C++
      8    4.1%    4.3%  GC
      6    3.1%          Shared libraries

 [C++ entry points]:
   ticks    cpp   total   name
    165  100.0%   85.1%  T __ZNK2v818PropertyDescriptor3setEv

And of --cpu-prof

[Screenshot: --cpu-prof profile, 2019-10-11]

I suspect this is related to #29766. One possibility is to put the primordials behind a configure-time flag until we fix the performance issues.

@joyeecheung
Member

Oh, wait, I realized that the screenshot in #29842 (comment) should not contain frames of primordials.js at all - these can be deserialized instead.

targos pushed a commit that referenced this issue Nov 8, 2019
PR-URL: #29845
Refs: #29842
Reviewed-By: Rich Trott <rtrott@gmail.com>
Reviewed-By: Ruben Bridgewater <ruben@bridgewater.de>
@joyeecheung
Member

joyeecheung commented Jan 6, 2020

Reposted from #29925

I realized that we cannot actually deserialize the default context from the snapshot if the user passes a sandbox argument to vm.createContext, since that context would differ from the one we use for Node.js' own instances. We could still do this for contexts created without a sandbox argument, though, and users could work around the limitation by setting up the sandbox through property setters after the call to vm.createContext.

@joyeecheung
Member

Hmm, or maybe we could include one more context, with the interceptors, in the embedded snapshot (though we would have to register the interceptors as external references first), and use it in ContextifyContext::CreateV8Context. We would then fix up the context object's Symbol.toStringTag property and set up the embedded data slot.

@spasam

spasam commented Feb 10, 2020

This is also locking up the event loop in version 12. Any more info on the availability of a configure-time flag until this is fixed? Or is the only option to revert to Node 8/10? Thanks

@joyeecheung
Member

joyeecheung commented Feb 11, 2020

It seems rather difficult to put the primordials behind a configure-time toggle at this point. Another idea would be to lazily initialize the primordials only when they are necessary for vm contexts; in most cases there should be no need, because they are used by builtin modules that will probably have their own set of primordials when passed into a context. I am still looking into whether this is safe to do, since it imposes some requirements on cross-context support for builtins (e.g. worker_threads.moveMessagePortToContext(), which @addaleax is working on).
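Purely as an illustration of the lazy-initialization idea (the real change lives in Node's C++ contextify code; every name below is hypothetical):

```javascript
// Hypothetical sketch: defer the expensive primordials setup until a
// consumer actually needs cross-context builtin support, so plain
// context creation stays on a fast path.
let primordials = null;
let initCount = 0;

function initializePrimordials() {
  initCount++; // expensive work, done at most once
  // Stand-in for the real primordials: uncurried builtin methods.
  return Object.freeze({
    ArrayPrototypeMap: Function.call.bind(Array.prototype.map),
  });
}

function getPrimordials() {
  if (primordials === null) primordials = initializePrimordials();
  return primordials;
}

function createContextCheap(sandbox = {}) {
  // Fast path: nothing primordials-related happens at creation time.
  return { sandbox, getPrimordials };
}
```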

codebytere pushed a commit that referenced this issue Feb 27, 2020
Lazily initialize primordials when cross-context support for
builtins is needed to fix the performance regression in context
creation.

PR-URL: #31738
Fixes: #29842
Reviewed-By: Gus Caplan <me@gus.host>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: David Carlier <devnexen@gmail.com>
targos pushed a commit to targos/node that referenced this issue Apr 25, 2020
Lazily initialize primordials when cross-context support for
builtins is needed to fix the performance regression in context
creation.

PR-URL: nodejs#31738
Fixes: nodejs#29842
Reviewed-By: Gus Caplan <me@gus.host>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: David Carlier <devnexen@gmail.com>
targos pushed a commit that referenced this issue Apr 28, 2020
Lazily initialize primordials when cross-context support for
builtins is needed to fix the performance regression in context
creation.

PR-URL: #31738
Fixes: #29842
Reviewed-By: Gus Caplan <me@gus.host>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: David Carlier <devnexen@gmail.com>

4 participants