vm: add createCacheForFunction #24069

Closed
wants to merge 4 commits into from
16 changes: 13 additions & 3 deletions lib/vm.js
@@ -25,7 +25,8 @@ const {
ContextifyScript,
makeContext,
isContext: _isContext,
compileFunction: _compileFunction
compileFunction: _compileFunction,
createCacheForFunction
} = internalBinding('contextify');
const { callbackMap } = internalBinding('module_wrap');
const {
@@ -390,17 +391,26 @@ function compileFunction(code, params, options = {}) {
}
});

return _compileFunction(
const fn = _compileFunction(
code,
filename,
lineOffset,
columnOffset,
cachedData,
produceCachedData,
parsingContext,
contextExtensions,
params
);

if (produceCachedData) {
Member
i would rather they could just call this manually. we've learned from vm.Script that produceCachedData isn't a very good model
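
For illustration, a minimal sketch of what the manual flow could look like, assuming the binding added here were exposed on the vm module; the public name vm.createCacheForFunction is hypothetical, not something this PR adds to the API:

```js
'use strict';
const vm = require('vm');

// Compile without opting into cache production up front.
const fn = vm.compileFunction('return a + b;', ['a', 'b']);

// Produce the code cache explicitly, whenever the caller wants it.
// vm.createCacheForFunction is a hypothetical public wrapper around the
// binding added in this PR; it may return undefined if V8 declines.
const cachedData = vm.createCacheForFunction(fn);
```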

Contributor Author

@devsnek that could be done. A couple of things:

  1. That'll force us to make this an API method. vm.compileFunction used to do that, and people should be allowed to fall back to the original without breakage.

  2. That'll be a breaking change for vm.compileFunction.

Perhaps I could do that in a separate semver-major PR? That said, I'm still not 100% convinced that this would be a good idea in the first place.

Member
The returned values of vm.compileFunction are neither documented nor tested, and it has only been out since 10.11 (10.10?). I would say it should probably be safe to deprecate the option in 11.
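
For what it's worth, a standalone sketch of what a runtime deprecation of the option could look like, using the generic process.emitWarning() pattern; the helper and the deprecation code DEPXXXX are placeholders, not part of any PR:

```js
'use strict';

// Hypothetical helper: warn when the deprecated option is passed.
function warnProduceCachedData(options = {}) {
  if (options.produceCachedData !== undefined) {
    process.emitWarning(
      'The produceCachedData option of vm.compileFunction() is deprecated.',
      'DeprecationWarning',
      'DEPXXXX');  // placeholder deprecation code
  }
}

warnProduceCachedData({ produceCachedData: true });
```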

Member
@joyeecheung joyeecheung Nov 5, 2018

Also, given how the returned values are used, does it actually break anyone if the cache is not generated? If we only return the function as-is, then cachedDataProduced and cachedData will always be undefined, which is not that different from "the cache generation always fails", and users (if they exist) should be prepared to handle that case in the first place (provided they even guess what those values are when there is no documentation). The only exception I can think of is our code cache generator, but that's just tooling, not production code, and I don't think this should be handled as an error in a non-tooling situation...
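
A sketch of the defensive pattern this implies for consumers, assuming they use the undocumented properties at all; the file name is only illustrative:

```js
'use strict';
const vm = require('vm');
const fs = require('fs');

const fn = vm.compileFunction('return x * 2;', ['x'], {
  produceCachedData: true
});

// Whether or not the option was honored, callers must be prepared for the
// cache to be absent: V8 may decline to generate one, which looks the same
// as cache generation always failing.
if (fn.cachedDataProduced && fn.cachedData !== undefined) {
  fs.writeFileSync('fn.cache', fn.cachedData);  // e.g. persist for a later run
}
// Otherwise: carry on without a cache.
```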

Member
(The comments above are orthogonal to this PR, BTW, but if vm.compileFunction does not use it, and it's not exposed as an API either, I can't see why we are adding a binding that is not used...)


The key question is how we should handle the absence of cached data. Should it be seen as an error or a normal state? My inclination is to treat it as a normal state for non-tooling use cases, as it's unclear how extensively these values have been used in production code.

I support the idea of a separate semver-major PR to address this and consider making produceCachedData an API method. This approach provides clarity and a smoother transition for users, especially in scenarios where the use of these values is minimal.

const cache = createCacheForFunction(fn);
fn.cachedDataProduced = cache !== undefined;
if (fn.cachedDataProduced) {
fn.cachedData = cache;
Member
@devsnek devsnek Feb 20, 2019

i can imagine this making people need to delete the prop after grabbing it if they want to pass the resulting function around. i'm not 100% sure what a better interface would be but i think we should brainstorm more.
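
Roughly the clean-up pattern being worried about, sketched under the current produceCachedData behaviour:

```js
'use strict';
const vm = require('vm');

const fn = vm.compileFunction('return n + 1;', ['n'], {
  produceCachedData: true
});

// Grab the cache, then scrub the extra properties so that code receiving
// the function later sees a plain function again.
const cachedData = fn.cachedDataProduced ? fn.cachedData : undefined;
delete fn.cachedData;
delete fn.cachedDataProduced;
```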

Member
@joyeecheung joyeecheung Feb 20, 2019

This has been out since v10, though it has been neither tested nor documented. We'll probably still need a deprecation cycle if we want to replace it with something else.

Member

i think we also need to allow this to run after the function has been called. we might need to do something like vm.createCodeCacheForFunction(fn)
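
What that shape could look like, treating vm.createCodeCacheForFunction as a purely hypothetical name taken from the comment above:

```js
'use strict';
const vm = require('vm');

const fn = vm.compileFunction('return a * b;', ['a', 'b']);

// Run the function first; the cache can then be produced after the fact,
// rather than only at compile time.
fn(6, 7);

// Hypothetical API; may return undefined if V8 cannot produce a cache.
const cache = vm.createCodeCacheForFunction(fn);
```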

Contributor Author

So @devsnek and @jdalton support exposing this as part of the vm API, unless I'm wrong? That is completely fine by me; I was actually planning to propose that myself.

As for removing the cache production part inside vm.compileFunction, I honestly don't know. I'm not sure going through the deprecation cycle is worth it, but if everyone else feels it's a good idea to drop the prop, I have no real objections.

}
}

return fn;
}


67 changes: 31 additions & 36 deletions src/node_contextify.cc
@@ -212,6 +212,7 @@ void ContextifyContext::Init(Environment* env, Local<Object> target) {
env->SetMethod(target, "makeContext", MakeContext);
env->SetMethod(target, "isContext", IsContext);
env->SetMethod(target, "compileFunction", CompileFunction);
env->SetMethod(target, "createCacheForFunction", CreateCacheForFunction);
}


@@ -1000,35 +1001,31 @@ void ContextifyContext::CompileFunction(
cached_data_buf = args[4].As<Uint8Array>();
}

// Argument 6: produce cache data
CHECK(args[5]->IsBoolean());
bool produce_cached_data = args[5]->IsTrue();

// Argument 7: parsing context (optional)
// Argument 6: parsing context (optional)
Local<Context> parsing_context;
if (!args[6]->IsUndefined()) {
CHECK(args[6]->IsObject());
if (!args[5]->IsUndefined()) {
CHECK(args[5]->IsObject());
ContextifyContext* sandbox =
ContextifyContext::ContextFromContextifiedSandbox(
env, args[6].As<Object>());
env, args[5].As<Object>());
CHECK_NOT_NULL(sandbox);
parsing_context = sandbox->context();
} else {
parsing_context = context;
}

// Argument 8: context extensions (optional)
// Argument 7: context extensions (optional)
Local<Array> context_extensions_buf;
if (!args[7]->IsUndefined()) {
CHECK(args[7]->IsArray());
context_extensions_buf = args[7].As<Array>();
if (!args[6]->IsUndefined()) {
CHECK(args[6]->IsArray());
context_extensions_buf = args[6].As<Array>();
}

// Argument 9: params for the function (optional)
// Argument 8: params for the function (optional)
Local<Array> params_buf;
if (!args[8]->IsUndefined()) {
CHECK(args[8]->IsArray());
params_buf = args[8].As<Array>();
if (!args[7]->IsUndefined()) {
CHECK(args[7]->IsArray());
params_buf = args[7].As<Array>();
}

// Read cache from cached data buffer
@@ -1085,29 +1082,27 @@ void ContextifyContext::CompileFunction(
return;
}

if (produce_cached_data) {
const std::unique_ptr<ScriptCompiler::CachedData> cached_data(
ScriptCompiler::CreateCodeCacheForFunction(fun));
bool cached_data_produced = cached_data != nullptr;
if (cached_data_produced) {
MaybeLocal<Object> buf = Buffer::Copy(
env,
reinterpret_cast<const char*>(cached_data->data),
cached_data->length);
if (fun->Set(
parsing_context,
env->cached_data_string(),
buf.ToLocalChecked()).IsNothing()) return;
}
if (fun->Set(
parsing_context,
env->cached_data_produced_string(),
Boolean::New(isolate, cached_data_produced)).IsNothing()) return;
}

args.GetReturnValue().Set(fun);
}

void ContextifyContext::CreateCacheForFunction(
const FunctionCallbackInfo<Value>& args) {
Environment* env = Environment::GetCurrent(args);

// Argument 1: target function
Local<Function> function = args[0].As<Function>();

const std::unique_ptr<ScriptCompiler::CachedData> cache(
ScriptCompiler::CreateCodeCacheForFunction(function));

if (cache != nullptr) {
args.GetReturnValue().Set(
Buffer::Copy(
env, reinterpret_cast<const char*>(cache->data), cache->length)
.ToLocalChecked());
}
}


void Initialize(Local<Object> target,
Local<Value> unused,
2 changes: 2 additions & 0 deletions src/node_contextify.h
@@ -61,6 +61,8 @@ class ContextifyContext {
static void IsContext(const v8::FunctionCallbackInfo<v8::Value>& args);
static void CompileFunction(
const v8::FunctionCallbackInfo<v8::Value>& args);
static void CreateCacheForFunction(
const v8::FunctionCallbackInfo<v8::Value>& args);
static void WeakCallback(
const v8::WeakCallbackInfo<ContextifyContext>& data);
static void PropertyGetterCallback(