From 87605f0ed3c07c2825cba2182720c378b4704a3a Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Juan=20Jos=C3=A9=20Arboleda?= Date: Fri, 17 Apr 2020 13:48:30 -0500 Subject: [PATCH 01/93] doc: add juanarbol as collaborator PR-URL: https://github.com/nodejs/node/pull/32906 Reviewed-By: Anna Henningsen Reviewed-By: Rich Trott Reviewed-By: Colin Ihrig --- README.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/README.md b/README.md index 5b42f731b791e5..0048be118c988c 100644 --- a/README.md +++ b/README.md @@ -331,6 +331,8 @@ For information about the governance of the Node.js project, see **João Reis** <reis@janeasystems.com> * [joyeecheung](https://github.com/joyeecheung) - **Joyee Cheung** <joyeec9h3@gmail.com> (she/her) +* [juanarbol](https://github.com/juanarbol) - +**Juan José Arboleda** <soyjuanarbol@gmail.com> (he/him) * [JungMinu](https://github.com/JungMinu) - **Minwoo Jung** <nodecorelab@gmail.com> (he/him) * [kfarnung](https://github.com/kfarnung) - From fc71a85c499f1ceacfdd3e037cae597a008815fb Mon Sep 17 00:00:00 2001 From: Michael Dawson Date: Mon, 13 Apr 2020 17:31:05 -0400 Subject: [PATCH 02/93] doc: add N-API version 6 to table We missed adding version 6 to the compatibility table when we defined version 6. Add it along with the versions that we know will include version 6. PR-URL: https://github.com/nodejs/node/pull/32829 Reviewed-By: Chengzhong Wu Reviewed-By: James M Snell Reviewed-By: Richard Lau Reviewed-By: Gabriel Schulhof --- doc/api/n-api.md | 19 ++++++++++--------- 1 file changed, 10 insertions(+), 9 deletions(-) diff --git a/doc/api/n-api.md b/doc/api/n-api.md index 71796504b19da0..c31eb17eeb7126 100644 --- a/doc/api/n-api.md +++ b/doc/api/n-api.md @@ -241,15 +241,16 @@ from version 3 with some additions. This means that it is not necessary to recompile for new versions of Node.js which are listed as supporting a later version. 
-| | 1 | 2 | 3 | 4 | 5 | -|-------|---------|----------|----------|----------|-----------| -| v6.x | | | v6.14.2* | | | -| v8.x | v8.0.0* | v8.10.0* | v8.11.2 | v8.16.0 | | -| v9.x | v9.0.0* | v9.3.0* | v9.11.0* | | | -| v10.x | v10.0.0 | v10.0.0 | v10.0.0 | v10.16.0 | v10.17.0 | -| v11.x | v11.0.0 | v11.0.0 | v11.0.0 | v11.8.0 | | -| v12.x | v12.0.0 | v12.0.0 | v12.0.0 | v12.0.0 | v12.11.0 | -| v13.x | v13.0.0 | v13.0.0 | v13.0.0 | v13.0.0 | v13.0.0 | +| | 1 | 2 | 3 | 4 | 5 | 6 | +|-------|---------|----------|----------|----------|-----------|-----------| +| v6.x | | | v6.14.2* | | | | +| v8.x | v8.0.0* | v8.10.0* | v8.11.2 | v8.16.0 | | | +| v9.x | v9.0.0* | v9.3.0* | v9.11.0* | | | | +| v10.x | v10.0.0 | v10.0.0 | v10.0.0 | v10.16.0 | v10.17.0 | v10.20.0 | +| v11.x | v11.0.0 | v11.0.0 | v11.0.0 | v11.8.0 | | | +| v12.x | v12.0.0 | v12.0.0 | v12.0.0 | v12.0.0 | v12.11.0 | | +| v13.x | v13.0.0 | v13.0.0 | v13.0.0 | v13.0.0 | v13.0.0 | | +| v14.x | v14.0.0 | v14.0.0 | v14.0.0 | v14.0.0 | v14.0.0 | v14.0.0 | \* Indicates that the N-API version was released as experimental From ca7e0a226e0ccb86cccd80d6f2b5facb7ad32d5d Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Juan=20Jos=C3=A9=20Arboleda?= Date: Tue, 14 Apr 2020 17:16:40 -0500 Subject: [PATCH 03/93] src: remove redundant v8::HeapSnapshot namespace PR-URL: https://github.com/nodejs/node/pull/32854 Reviewed-By: Anna Henningsen Reviewed-By: Colin Ihrig Reviewed-By: Gireesh Punathil Reviewed-By: James M Snell Reviewed-By: Ruben Bridgewater --- src/heap_utils.cc | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/heap_utils.cc b/src/heap_utils.cc index c21ff8c80062a8..efdd68fde9d160 100644 --- a/src/heap_utils.cc +++ b/src/heap_utils.cc @@ -328,7 +328,7 @@ inline bool WriteSnapshot(Isolate* isolate, const char* filename) { } // namespace -void DeleteHeapSnapshot(const v8::HeapSnapshot* snapshot) { +void DeleteHeapSnapshot(const HeapSnapshot* snapshot) { const_cast(snapshot)->Delete(); } From 22ccf2ba1f37ed2457d9017de85d424d20310ea6 Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Wed, 15 Apr 2020 20:40:02 +0200 Subject: [PATCH 04/93] tools: decrease timeout in test.py This fixes the following crash on Windows for me. I don't know why this I only started to see this now, but anyway, the new timeout value is still longer than a week and a half. 
File "tools/test.py", line 1725, in sys.exit(Main()) File "tools/test.py", line 1701, in Main if RunTestCases(cases_to_run, options.progress, \ options.j, options.flaky_tests): File "tools/test.py", line 923, in RunTestCases return progress.Run(tasks) File "tools/test.py", line 145, in Run thread.join(timeout=10000000) File "C:\Users\anna\AppData\Local\Programs\Python\Python38-32\ \ lib\threading.py", line 1015, in join self._wait_for_tstate_lock(timeout=max(timeout, 0)) File "C:\Users\anna\AppData\Local\Programs\Python\Python38-32\ \ lib\threading.py", line 1027, in _wait_for_tstate_lock elif lock.acquire(block, timeout): OverflowError: timeout value is too large PR-URL: https://github.com/nodejs/node/pull/32868 Reviewed-By: James M Snell Reviewed-By: Jeremiah Senkpiel Reviewed-By: Colin Ihrig Reviewed-By: Richard Lau Reviewed-By: Gireesh Punathil Reviewed-By: Ruben Bridgewater Reviewed-By: Zeyu Yang Reviewed-By: Bartosz Sosnowski --- tools/test.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/tools/test.py b/tools/test.py index 5eb1fa767219ce..dde7c7b2f8e466 100755 --- a/tools/test.py +++ b/tools/test.py @@ -142,7 +142,7 @@ def Run(self, tasks): # Wait for the remaining threads for thread in threads: # Use a timeout so that signals (ctrl-c) will be processed. - thread.join(timeout=10000000) + thread.join(timeout=1000000) except (KeyboardInterrupt, SystemExit): self.shutdown_event.set() except Exception: From 307e43da4d17b07a6dfbd7cc34459df32df3ef63 Mon Sep 17 00:00:00 2001 From: Nimit Date: Thu, 16 Apr 2020 01:51:51 +0530 Subject: [PATCH 05/93] src: elevate v8 namespaces elevate v8 namespaces. Leverage `using` semantics for repeated usage of v8 artifacts. PR-URL: https://github.com/nodejs/node/pull/32872 Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca Reviewed-By: Colin Ihrig Reviewed-By: Gireesh Punathil Reviewed-By: Ruben Bridgewater Reviewed-By: Zeyu Yang --- src/cares_wrap.cc | 3 ++- src/env.cc | 4 ++-- 2 files changed, 4 insertions(+), 3 deletions(-) diff --git a/src/cares_wrap.cc b/src/cares_wrap.cc index ff050cc2dc9502..8d1e3bc8794dfe 100644 --- a/src/cares_wrap.cc +++ b/src/cares_wrap.cc @@ -66,6 +66,7 @@ using v8::Int32; using v8::Integer; using v8::Isolate; using v8::Local; +using v8::NewStringType; using v8::Null; using v8::Object; using v8::String; @@ -1929,7 +1930,7 @@ void CanonicalizeIP(const FunctionCallbackInfo& args) { const int af = (rc == 4 ? AF_INET : AF_INET6); CHECK_EQ(0, uv_inet_ntop(af, &result, canonical_ip, sizeof(canonical_ip))); Local val = String::NewFromUtf8(isolate, canonical_ip, - v8::NewStringType::kNormal).ToLocalChecked(); + NewStringType::kNormal).ToLocalChecked(); args.GetReturnValue().Set(val); } diff --git a/src/env.cc b/src/env.cc index 444d9c0368ed78..f966bfba1ee7e6 100644 --- a/src/env.cc +++ b/src/env.cc @@ -174,10 +174,10 @@ void IsolateData::CreateProperties() { #define V(Provider) \ async_wrap_providers_[AsyncWrap::PROVIDER_ ## Provider].Set( \ isolate_, \ - v8::String::NewFromOneByte( \ + String::NewFromOneByte( \ isolate_, \ reinterpret_cast(#Provider), \ - v8::NewStringType::kInternalized, \ + NewStringType::kInternalized, \ sizeof(#Provider) - 1).ToLocalChecked()); NODE_ASYNC_PROVIDER_TYPES(V) #undef V From b12204e27e2e68f223160cec6489388b46a8616e Mon Sep 17 00:00:00 2001 From: Nimit Date: Thu, 16 Apr 2020 11:20:39 +0530 Subject: [PATCH 06/93] test: changed function to arrow function Convert callback functions that are anonymous to arrow functions for better readability. 
Also adjusted the `this` object in places where that was required. PR-URL: https://github.com/nodejs/node/pull/32875 Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Colin Ihrig Reviewed-By: Gireesh Punathil Reviewed-By: Ruben Bridgewater --- test/parallel/test-net-after-close.js | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/test/parallel/test-net-after-close.js b/test/parallel/test-net-after-close.js index 641e61bd04c5d0..7d49780d001d6e 100644 --- a/test/parallel/test-net-after-close.js +++ b/test/parallel/test-net-after-close.js @@ -24,14 +24,14 @@ const common = require('../common'); const assert = require('assert'); const net = require('net'); -const server = net.createServer(common.mustCall(function(s) { +const server = net.createServer(common.mustCall((s) => { console.error('SERVER: got connection'); s.end(); })); -server.listen(0, common.mustCall(function() { - const c = net.createConnection(this.address().port); - c.on('close', common.mustCall(function() { +server.listen(0, common.mustCall(() => { + const c = net.createConnection(server.address().port); + c.on('close', common.mustCall(() => { console.error('connection closed'); assert.strictEqual(c._handle, null); // Calling functions / accessing properties of a closed socket should not From ccf6d3e5ed22c506d64ab3eac65d704e6b4af945 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Mon, 13 Apr 2020 16:56:16 -0700 Subject: [PATCH 07/93] doc: add `tsc-agenda` to onboarding labels list PR-URL: https://github.com/nodejs/node/pull/32832 Reviewed-By: Anna Henningsen Reviewed-By: Myles Borins Reviewed-By: James M Snell Reviewed-By: Michael Dawson Reviewed-By: Trivikram Kamat Reviewed-By: Ruben Bridgewater --- doc/guides/onboarding-extras.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/doc/guides/onboarding-extras.md b/doc/guides/onboarding-extras.md index a44a55987fc50a..26060a48022221 100644 --- a/doc/guides/onboarding-extras.md +++ b/doc/guides/onboarding-extras.md @@ -21,6 +21,8 @@ request. * `feature request` - Any issue that requests a new feature (usually not PRs) * `good first issue` - Issues suitable for newcomers to process * `meta` - For issues whose topic is governance, policies, procedures, etc. +* `tsc-agenda` - Open issues and pull requests with this label will be added to + the Technical Steering Committee meeting agenda -- From 2ab4ebc8bf86457b0b37e224f231ee90ca75a0d3 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Thu, 16 Apr 2020 17:25:22 +0200 Subject: [PATCH 08/93] stream: simplify Writable.end() MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Simplifies Writable.end() by inlining and de-duplicating code. PR-URL: https://github.com/nodejs/node/pull/32882 Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca Reviewed-By: Gerhard Stöbich Reviewed-By: Ruben Bridgewater Reviewed-By: Matteo Collina --- lib/_stream_writable.js | 33 +++++++++++++-------------------- 1 file changed, 13 insertions(+), 20 deletions(-) diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 14a7ede27611c4..ec263d8f578fe5 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -588,21 +588,26 @@ Writable.prototype.end = function(chunk, encoding, cb) { this.uncork(); } - if (typeof cb !== 'function') - cb = nop; - // This is forgiving in terms of unnecessary calls to end() and can hide // logic errors. However, usually such errors are harmless and causing a // hard error can be disproportionately destructive. 
It is not always // trivial for the user to determine whether end() needs to be called or not. + let err; if (!state.errored && !state.ending) { - endWritable(this, state, cb); + state.ending = true; + finishMaybe(this, state, true); + state.ended = true; } else if (state.finished) { - process.nextTick(cb, new ERR_STREAM_ALREADY_FINISHED('end')); + err = new ERR_STREAM_ALREADY_FINISHED('end'); } else if (state.destroyed) { - process.nextTick(cb, new ERR_STREAM_DESTROYED('end')); - } else if (cb !== nop) { - onFinished(this, state, cb); + err = new ERR_STREAM_DESTROYED('end'); + } + + if (typeof cb === 'function') { + if (err || state.finished) + process.nextTick(cb, err); + else + onFinished(this, state, cb); } return this; @@ -683,18 +688,6 @@ function finish(stream, state) { } } -function endWritable(stream, state, cb) { - state.ending = true; - finishMaybe(stream, state, true); - if (cb !== nop) { - if (state.finished) - process.nextTick(cb); - else - onFinished(stream, state, cb); - } - state.ended = true; -} - function onCorkedFinish(corkReq, state, err) { let entry = corkReq.entry; corkReq.entry = null; From 7f498125e4523bde026290142fab6626f293fd28 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Thu, 16 Apr 2020 20:48:41 +0200 Subject: [PATCH 09/93] stream: inline unbuffered _write PR-URL: https://github.com/nodejs/node/pull/32886 Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca Reviewed-By: Zeyu Yang Reviewed-By: Ruben Bridgewater --- lib/_stream_writable.js | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index ec263d8f578fe5..399c27617d17c8 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -364,7 +364,12 @@ function writeOrBuffer(stream, state, chunk, encoding, cb) { } state.bufferedRequestCount += 1; } else { - doWrite(stream, state, false, len, chunk, encoding, cb); + state.writelen = len; + state.writecb = cb; + state.writing = true; + state.sync = true; + stream._write(chunk, encoding, state.onwrite); + state.sync = false; } // Return false if errored or destroyed in order to break From 36a4f54d69d7c4e5114df5be083f650dd2cfda87 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Vadzim=20Zie=C5=84ka?= Date: Tue, 14 Apr 2020 16:25:55 +0300 Subject: [PATCH 10/93] stream: close iterator in Readable.from Call iterator.return() if not all of its values are consumed. Fixes: https://github.com/nodejs/node/issues/32842 PR-URL: https://github.com/nodejs/node/pull/32844 Reviewed-By: Robert Nagy Reviewed-By: Matteo Collina Reviewed-By: Zeyu Yang --- lib/internal/streams/from.js | 32 ++- .../test-readable-from-iterator-closing.js | 198 ++++++++++++++++++ 2 files changed, 229 insertions(+), 1 deletion(-) create mode 100644 test/parallel/test-readable-from-iterator-closing.js diff --git a/lib/internal/streams/from.js b/lib/internal/streams/from.js index ab6db00a125a0b..ca567914bbf0fe 100644 --- a/lib/internal/streams/from.js +++ b/lib/internal/streams/from.js @@ -34,21 +34,51 @@ function from(Readable, iterable, opts) { objectMode: true, ...opts }); + // Reading boolean to protect against _read // being called before last iteration completion. 
let reading = false; + + // needToClose boolean if iterator needs to be explicitly closed + let needToClose = false; + readable._read = function() { if (!reading) { reading = true; next(); } }; + + readable._destroy = function(error, cb) { + if (needToClose) { + needToClose = false; + close().then( + () => process.nextTick(cb, error), + (e) => process.nextTick(cb, error || e), + ); + } else { + cb(error); + } + }; + + async function close() { + if (typeof iterator.return === 'function') { + const { value } = await iterator.return(); + await value; + } + } + async function next() { try { + needToClose = false; const { value, done } = await iterator.next(); + needToClose = !done; + const resolved = await value; if (done) { readable.push(null); - } else if (readable.push(await value)) { + } else if (readable.destroyed) { + await close(); + } else if (readable.push(resolved)) { next(); } else { reading = false; diff --git a/test/parallel/test-readable-from-iterator-closing.js b/test/parallel/test-readable-from-iterator-closing.js new file mode 100644 index 00000000000000..0254ccfc163093 --- /dev/null +++ b/test/parallel/test-readable-from-iterator-closing.js @@ -0,0 +1,198 @@ +'use strict'; + +const { mustCall, mustNotCall } = require('../common'); +const { Readable } = require('stream'); +const { strictEqual } = require('assert'); + +async function asyncSupport() { + const finallyMustCall = mustCall(); + const bodyMustCall = mustCall(); + + async function* infiniteGenerate() { + try { + while (true) yield 'a'; + } finally { + finallyMustCall(); + } + } + + const stream = Readable.from(infiniteGenerate()); + + for await (const chunk of stream) { + bodyMustCall(); + strictEqual(chunk, 'a'); + break; + } +} + +async function syncSupport() { + const finallyMustCall = mustCall(); + const bodyMustCall = mustCall(); + + function* infiniteGenerate() { + try { + while (true) yield 'a'; + } finally { + finallyMustCall(); + } + } + + const stream = Readable.from(infiniteGenerate()); + + for await (const chunk of stream) { + bodyMustCall(); + strictEqual(chunk, 'a'); + break; + } +} + +async function syncPromiseSupport() { + const returnMustBeAwaited = mustCall(); + const bodyMustCall = mustCall(); + + function* infiniteGenerate() { + try { + while (true) yield Promise.resolve('a'); + } finally { + // eslint-disable-next-line no-unsafe-finally + return { then(cb) { + returnMustBeAwaited(); + cb(); + } }; + } + } + + const stream = Readable.from(infiniteGenerate()); + + for await (const chunk of stream) { + bodyMustCall(); + strictEqual(chunk, 'a'); + break; + } +} + +async function syncRejectedSupport() { + const returnMustBeAwaited = mustCall(); + const bodyMustNotCall = mustNotCall(); + const catchMustCall = mustCall(); + const secondNextMustNotCall = mustNotCall(); + + function* generate() { + try { + yield Promise.reject('a'); + secondNextMustNotCall(); + } finally { + // eslint-disable-next-line no-unsafe-finally + return { then(cb) { + returnMustBeAwaited(); + cb(); + } }; + } + } + + const stream = Readable.from(generate()); + + try { + for await (const chunk of stream) { + bodyMustNotCall(chunk); + } + } catch { + catchMustCall(); + } +} + +async function noReturnAfterThrow() { + const returnMustNotCall = mustNotCall(); + const bodyMustNotCall = mustNotCall(); + const catchMustCall = mustCall(); + const nextMustCall = mustCall(); + + const stream = Readable.from({ + [Symbol.asyncIterator]() { return this; }, + async next() { + nextMustCall(); + throw new Error('a'); + }, + async return() { + 
returnMustNotCall(); + return { done: true }; + }, + }); + + try { + for await (const chunk of stream) { + bodyMustNotCall(chunk); + } + } catch { + catchMustCall(); + } +} + +async function closeStreamWhileNextIsPending() { + const finallyMustCall = mustCall(); + const dataMustCall = mustCall(); + + let resolveDestroy; + const destroyed = + new Promise((resolve) => { resolveDestroy = mustCall(resolve); }); + let resolveYielded; + const yielded = + new Promise((resolve) => { resolveYielded = mustCall(resolve); }); + + async function* infiniteGenerate() { + try { + while (true) { + yield 'a'; + resolveYielded(); + await destroyed; + } + } finally { + finallyMustCall(); + } + } + + const stream = Readable.from(infiniteGenerate()); + + stream.on('data', (data) => { + dataMustCall(); + strictEqual(data, 'a'); + }); + + yielded.then(() => { + stream.destroy(); + resolveDestroy(); + }); +} + +async function closeAfterNullYielded() { + const finallyMustCall = mustCall(); + const dataMustCall = mustCall(3); + + function* infiniteGenerate() { + try { + yield 'a'; + yield 'a'; + yield 'a'; + while (true) yield null; + } finally { + finallyMustCall(); + } + } + + const stream = Readable.from(infiniteGenerate()); + + stream.on('data', (chunk) => { + dataMustCall(); + strictEqual(chunk, 'a'); + }); +} + +Promise.all([ + asyncSupport(), + syncSupport(), + syncPromiseSupport(), + syncRejectedSupport(), + noReturnAfterThrow(), + closeStreamWhileNextIsPending(), + closeAfterNullYielded(), +]).then(mustCall()); From b957895ff73f574b29e3ce49ed1dbb4a31124158 Mon Sep 17 00:00:00 2001 From: David Daza <03dazal@gmail.com> Date: Fri, 3 Apr 2020 12:53:55 -0500 Subject: [PATCH 11/93] lib: remove unnecesary else block MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit The if statement inside the _writeHostObject function returns an expression which makes the else block unnecessary. 
PR-URL: https://github.com/nodejs/node/pull/32644 Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Luigi Pinca Reviewed-By: Ruben Bridgewater Reviewed-By: Juan José Arboleda --- lib/internal/child_process/serialization.js | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/lib/internal/child_process/serialization.js b/lib/internal/child_process/serialization.js index 9f03a8e8446c43..df8a6ca67236c5 100644 --- a/lib/internal/child_process/serialization.js +++ b/lib/internal/child_process/serialization.js @@ -25,10 +25,9 @@ class ChildProcessSerializer extends v8.DefaultSerializer { if (isArrayBufferView(object)) { this.writeUint32(kArrayBufferViewTag); return super._writeHostObject(object); - } else { - this.writeUint32(kNotArrayBufferViewTag); - this.writeValue({ ...object }); } + this.writeUint32(kNotArrayBufferViewTag); + this.writeValue({ ...object }); } } From 03d02d74f3ba4874ee510ea9bc49b989a193396c Mon Sep 17 00:00:00 2001 From: Jesus Hernandez Date: Sat, 4 Apr 2020 17:53:29 -0500 Subject: [PATCH 12/93] fs: remove unnecessary else statement MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32662 Reviewed-By: Luigi Pinca Reviewed-By: Ruben Bridgewater Reviewed-By: James M Snell Reviewed-By: Juan José Arboleda --- lib/internal/fs/promises.js | 7 ++----- 1 file changed, 2 insertions(+), 5 deletions(-) diff --git a/lib/internal/fs/promises.js b/lib/internal/fs/promises.js index c93d89461ffa8a..960ea5492d5f8c 100644 --- a/lib/internal/fs/promises.js +++ b/lib/internal/fs/promises.js @@ -188,11 +188,8 @@ async function readFileHandle(filehandle, options) { } while (!endOfFile); const result = Buffer.concat(chunks); - if (options.encoding) { - return result.toString(options.encoding); - } else { - return result; - } + + return options.encoding ? result.toString(options.encoding) : result; } // All of the functions are defined as async in order to ensure that errors From f9b8988df6cd5527aa13a93fe05e443f57b7d78e Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Juan=20Jos=C3=A9=20Arboleda?= Date: Mon, 13 Apr 2020 10:42:13 -0500 Subject: [PATCH 13/93] src: remove validation of unreachable code Based on https://github.com/nodejs/help/issues/2600#issuecomment-612857153 the condition (amount < 0) won't be possible. 
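The reason the branch is unreachable: `uv_get_free_memory()` (and `uv_get_total_memory()`) is declared as returning `uint64_t`, so the value assigned to the `double` can never be negative. A minimal sketch of that, assuming libuv headers are available and the library is linked:

```c
#include <stdio.h>
#include <uv.h>

int main(void) {
  // uv_get_free_memory() returns uint64_t; converting an unsigned value
  // to double can never produce a negative number, so an (amount < 0)
  // guard after this assignment is dead code.
  double amount = uv_get_free_memory();
  printf("free memory: %.0f bytes\n", amount);
  return 0;
}
```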
PR-URL: https://github.com/nodejs/node/pull/32818 Reviewed-By: Anna Henningsen Reviewed-By: Yongsheng Zhang Reviewed-By: Zeyu Yang Reviewed-By: Shelley Vohr Reviewed-By: Ben Noordhuis Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Gus Caplan --- src/node_os.cc | 4 ---- 1 file changed, 4 deletions(-) diff --git a/src/node_os.cc b/src/node_os.cc index b64b75fa6b90be..8f1ca4f0c3ff77 100644 --- a/src/node_os.cc +++ b/src/node_os.cc @@ -132,16 +132,12 @@ static void GetCPUInfo(const FunctionCallbackInfo& args) { static void GetFreeMemory(const FunctionCallbackInfo& args) { double amount = uv_get_free_memory(); - if (amount < 0) - return; args.GetReturnValue().Set(amount); } static void GetTotalMemory(const FunctionCallbackInfo& args) { double amount = uv_get_total_memory(); - if (amount < 0) - return; args.GetReturnValue().Set(amount); } From 861eb39307d68640305ad8cb456ecfa8ed25ffa3 Mon Sep 17 00:00:00 2001 From: Gabriel Schulhof Date: Mon, 6 Apr 2020 10:16:15 -0700 Subject: [PATCH 14/93] n-api: detect deadlocks in thread-safe function We introduce status `napi_would_deadlock` to be used as a return status by `napi_call_threadsafe_function` if the call is made with `napi_tsfn_blocking` on the main thread and the queue is full. Fixes: https://github.com/nodejs/node/issues/32615 Signed-off-by: Gabriel Schulhof PR-URL: https://github.com/nodejs/node/pull/32860 Reviewed-By: Ben Noordhuis Reviewed-By: Michael Dawson Reviewed-By: Zeyu Yang --- doc/api/n-api.md | 28 +++++++++- src/js_native_api_types.h | 10 +++- src/js_native_api_v8.cc | 3 +- src/node_api.cc | 23 ++++++++ .../test_threadsafe_function/binding.c | 55 +++++++++++++++++++ .../test_threadsafe_function/binding.gyp | 5 +- .../node-api/test_threadsafe_function/test.js | 11 +++- 7 files changed, 125 insertions(+), 10 deletions(-) diff --git a/doc/api/n-api.md b/doc/api/n-api.md index c31eb17eeb7126..de588519849dde 100644 --- a/doc/api/n-api.md +++ b/doc/api/n-api.md @@ -458,6 +458,7 @@ typedef enum { napi_date_expected, napi_arraybuffer_expected, napi_detachable_arraybuffer_expected, + napi_would_deadlock, } napi_status; ``` @@ -5096,6 +5097,19 @@ preventing data from being successfully added to the queue. If set to `napi_call_threadsafe_function()` never blocks if the thread-safe function was created with a maximum queue size of 0. +As a special case, when `napi_call_threadsafe_function()` is called from a +JavaScript thread, it will return `napi_would_deadlock` if the queue is full +and it was called with `napi_tsfn_blocking`. The reason for this is that the +JavaScript thread is responsible for removing items from the queue, thereby +reducing their number. Thus, if it waits for room to become available on the +queue, then it will deadlock. + +`napi_call_threadsafe_function()` will also return `napi_would_deadlock` if the +thread-safe function created on one JavaScript thread is called from another +JavaScript thread. The reason for this is to prevent a deadlock arising from the +possibility that the two JavaScript threads end up waiting on one another, +thereby both deadlocking. + The actual call into JavaScript is controlled by the callback given via the `call_js_cb` parameter. `call_js_cb` is invoked on the main thread once for each value that was placed into the queue by a successful call to @@ -5232,6 +5246,12 @@ This API may be called from any thread which makes use of `func`. 
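A minimal sketch of how a native caller might react to the new status; the helper name and the fallback policy are illustrative assumptions, not part of the documented API:

```c
#include <node_api.h>

// Hypothetical helper: push an item onto a thread-safe function's queue,
// falling back to a non-blocking call if blocking would deadlock the
// JavaScript thread that is responsible for draining the queue.
static napi_status queue_item(napi_threadsafe_function tsfn, void* data) {
  napi_status status =
      napi_call_threadsafe_function(tsfn, data, napi_tsfn_blocking);
  if (status == napi_would_deadlock) {
    // We are on a JavaScript thread and the queue is full; waiting here
    // would block the very event loop that empties the queue, so retry
    // without blocking and let the caller handle napi_queue_full.
    status = napi_call_threadsafe_function(tsfn, data, napi_tsfn_nonblocking);
  }
  return status;
}
```

The non-blocking retry gives JavaScript threads the same behavior they would get by always passing `napi_tsfn_nonblocking`, while still allowing worker threads to block.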
```C @@ -5249,9 +5269,13 @@ napi_call_threadsafe_function(napi_threadsafe_function func, `napi_tsfn_nonblocking` to indicate that the call should return immediately with a status of `napi_queue_full` whenever the queue is full. +This API will return `napi_would_deadlock` if called with `napi_tsfn_blocking` +from the main thread and the queue is full. + This API will return `napi_closing` if `napi_release_threadsafe_function()` was -called with `abort` set to `napi_tsfn_abort` from any thread. The value is only -added to the queue if the API returns `napi_ok`. +called with `abort` set to `napi_tsfn_abort` from any thread. + +The value is only added to the queue if the API returns `napi_ok`. This API may be called from any thread which makes use of `func`. diff --git a/src/js_native_api_types.h b/src/js_native_api_types.h index 7a49fc9f719b30..c32c71c4d39334 100644 --- a/src/js_native_api_types.h +++ b/src/js_native_api_types.h @@ -82,11 +82,15 @@ typedef enum { napi_date_expected, napi_arraybuffer_expected, napi_detachable_arraybuffer_expected, + napi_would_deadlock } napi_status; // Note: when adding a new enum value to `napi_status`, please also update -// `const int last_status` in `napi_get_last_error_info()' definition, -// in file js_native_api_v8.cc. Please also update the definition of -// `napi_status` in doc/api/n-api.md to reflect the newly added value(s). +// * `const int last_status` in the definition of `napi_get_last_error_info()' +// in file js_native_api_v8.cc. +// * `const char* error_messages[]` in file js_native_api_v8.cc with a brief +// message explaining the error. +// * the definition of `napi_status` in doc/api/n-api.md to reflect the newly +// added value(s). typedef napi_value (*napi_callback)(napi_env env, napi_callback_info info); diff --git a/src/js_native_api_v8.cc b/src/js_native_api_v8.cc index 422eff6d7c5b68..ef25c92e060592 100644 --- a/src/js_native_api_v8.cc +++ b/src/js_native_api_v8.cc @@ -740,6 +740,7 @@ const char* error_messages[] = {nullptr, "A date was expected", "An arraybuffer was expected", "A detachable arraybuffer was expected", + "Main thread would deadlock", }; napi_status napi_get_last_error_info(napi_env env, @@ -751,7 +752,7 @@ napi_status napi_get_last_error_info(napi_env env, // message in the `napi_status` enum each time a new error message is added. // We don't have a napi_status_last as this would result in an ABI // change each time a message was added. - const int last_status = napi_detachable_arraybuffer_expected; + const int last_status = napi_would_deadlock; static_assert( NAPI_ARRAYSIZE(error_messages) == last_status + 1, diff --git a/src/node_api.cc b/src/node_api.cc index fad9cf72a972c2..cb8bd4b482365e 100644 --- a/src/node_api.cc +++ b/src/node_api.cc @@ -155,6 +155,29 @@ class ThreadSafeFunction : public node::AsyncResource { if (mode == napi_tsfn_nonblocking) { return napi_queue_full; } + + // Here we check if there is a Node.js event loop running on this thread. + // If there is, and our queue is full, we return `napi_would_deadlock`. We + // do this for two reasons: + // + // 1. If this is the thread on which our own event loop runs then we + // cannot wait here because that will prevent our event loop from + // running and emptying the very queue on which we are waiting. + // + // 2. If this is not the thread on which our own event loop runs then we + // still cannot wait here because that allows the following sequence of + // events: + // + // 1. 
JSThread1 calls JSThread2 and blocks while its queue is full and + // because JSThread2's queue is also full. + // + // 2. JSThread2 calls JSThread1 before it's had a chance to remove an + // item from its own queue and blocks because JSThread1's queue is + // also full. + v8::Isolate* isolate = v8::Isolate::GetCurrent(); + if (isolate != nullptr && node::GetCurrentEventLoop(isolate) != nullptr) + return napi_would_deadlock; + cond->Wait(lock); } diff --git a/test/node-api/test_threadsafe_function/binding.c b/test/node-api/test_threadsafe_function/binding.c index c9c526153804c6..9f2fa5f9bd21bc 100644 --- a/test/node-api/test_threadsafe_function/binding.c +++ b/test/node-api/test_threadsafe_function/binding.c @@ -267,6 +267,60 @@ static napi_value StartThreadNoJsFunc(napi_env env, napi_callback_info info) { /** block_on_full */true, /** alt_ref_js_cb */true); } +static void DeadlockTestDummyMarshaller(napi_env env, + napi_value empty0, + void* empty1, + void* empty2) {} + +static napi_value TestDeadlock(napi_env env, napi_callback_info info) { + napi_threadsafe_function tsfn; + napi_status status; + napi_value async_name; + napi_value return_value; + + // Create an object to store the returned information. + NAPI_CALL(env, napi_create_object(env, &return_value)); + + // Create a string to be used with the thread-safe function. + NAPI_CALL(env, napi_create_string_utf8(env, + "N-API Thread-safe Function Deadlock Test", + NAPI_AUTO_LENGTH, + &async_name)); + + // Create the thread-safe function with a single queue slot and a single thread. + NAPI_CALL(env, napi_create_threadsafe_function(env, + NULL, + NULL, + async_name, + 1, + 1, + NULL, + NULL, + NULL, + DeadlockTestDummyMarshaller, + &tsfn)); + + // Call the threadsafe function. This should succeed and fill the queue. + NAPI_CALL(env, napi_call_threadsafe_function(tsfn, NULL, napi_tsfn_blocking)); + + // Call the threadsafe function. This should not block, but return + // `napi_would_deadlock`. We save the resulting status in an object to be + // returned. + status = napi_call_threadsafe_function(tsfn, NULL, napi_tsfn_blocking); + add_returned_status(env, + "deadlockTest", + return_value, + "Main thread would deadlock", + napi_would_deadlock, + status); + + // Clean up the thread-safe function before returning. + NAPI_CALL(env, napi_release_threadsafe_function(tsfn, napi_tsfn_release)); + + // Return the result. 
+ return return_value; +} + // Module init static napi_value Init(napi_env env, napi_value exports) { size_t index; @@ -305,6 +359,7 @@ static napi_value Init(napi_env env, napi_value exports) { DECLARE_NAPI_PROPERTY("StopThread", StopThread), DECLARE_NAPI_PROPERTY("Unref", Unref), DECLARE_NAPI_PROPERTY("Release", Release), + DECLARE_NAPI_PROPERTY("TestDeadlock", TestDeadlock), }; NAPI_CALL(env, napi_define_properties(env, exports, diff --git a/test/node-api/test_threadsafe_function/binding.gyp b/test/node-api/test_threadsafe_function/binding.gyp index b60352e05af103..34587eed3dfb1f 100644 --- a/test/node-api/test_threadsafe_function/binding.gyp +++ b/test/node-api/test_threadsafe_function/binding.gyp @@ -2,7 +2,10 @@ 'targets': [ { 'target_name': 'binding', - 'sources': ['binding.c'] + 'sources': [ + 'binding.c', + '../../js-native-api/common.c' + ] } ] } diff --git a/test/node-api/test_threadsafe_function/test.js b/test/node-api/test_threadsafe_function/test.js index 3603d79ee6b5d3..f5afe225f07624 100644 --- a/test/node-api/test_threadsafe_function/test.js +++ b/test/node-api/test_threadsafe_function/test.js @@ -210,8 +210,13 @@ new Promise(function testWithoutJSMarshaller(resolve) { })) .then((result) => assert.strictEqual(result.indexOf(0), -1)) -// Start a child process to test rapid teardown +// Start a child process to test rapid teardown. .then(() => testUnref(binding.MAX_QUEUE_SIZE)) -// Start a child process with an infinite queue to test rapid teardown -.then(() => testUnref(0)); +// Start a child process with an infinite queue to test rapid teardown. +.then(() => testUnref(0)) + +// Test deadlock prevention. +.then(() => assert.deepStrictEqual(binding.TestDeadlock(), { + deadlockTest: 'Main thread would deadlock' +})); From a8ed8f5d0a973f00a17a5e702cbf3c055a22508a Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Fri, 17 Apr 2020 05:45:50 -0700 Subject: [PATCH 15/93] doc: synch SECURITY.md with website Refs: https://github.com/nodejs/nodejs.org/pull/3106#issuecomment-614258785 PR-URL: https://github.com/nodejs/node/pull/32903 Reviewed-By: Ruben Bridgewater Reviewed-By: Trivikram Kamat Reviewed-By: Luigi Pinca --- SECURITY.md | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/SECURITY.md b/SECURITY.md index 3196055ccb78e5..64714043db7e3b 100644 --- a/SECURITY.md +++ b/SECURITY.md @@ -11,8 +11,7 @@ handling your submission. After the initial reply to your report, the security team will endeavor to keep you informed of the progress being made towards a fix and full announcement, and may ask for additional information or guidance surrounding the reported -issue. These updates will be sent at least every five days; in practice, this -is more likely to be every 24-48 hours. +issue. 
### Node.js Bug Bounty Program From a133ac17eb846fbd2dcb59d4151d2150b3afa51b Mon Sep 17 00:00:00 2001 From: rickyes Date: Sat, 18 Apr 2020 14:24:46 +0800 Subject: [PATCH 16/93] perf_hooks: remove unnecessary assignment when name is undefined MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32910 Reviewed-By: Michaël Zasso Reviewed-By: Zeyu Yang Reviewed-By: Andrey Pechkurov Reviewed-By: Colin Ihrig Reviewed-By: Chengzhong Wu Reviewed-By: Luigi Pinca Reviewed-By: James M Snell --- lib/perf_hooks.js | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/lib/perf_hooks.js b/lib/perf_hooks.js index 11a9b5eba6a343..efc92ca8c4aee4 100644 --- a/lib/perf_hooks.js +++ b/lib/perf_hooks.js @@ -407,8 +407,8 @@ class Performance { } clearMarks(name) { - name = name !== undefined ? `${name}` : name; if (name !== undefined) { + name = `${name}`; this[kIndex][kMarks].delete(name); _clearMark(name); } else { From baa823172842754bfb5e8eb7646c6a751253c72f Mon Sep 17 00:00:00 2001 From: rickyes Date: Fri, 3 Apr 2020 20:30:14 +0800 Subject: [PATCH 17/93] fs: extract kWriteFileMaxChunkSize constant PR-URL: https://github.com/nodejs/node/pull/32640 Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Colin Ihrig Reviewed-By: David Carlier --- lib/internal/fs/promises.js | 11 ++++++----- 1 file changed, 6 insertions(+), 5 deletions(-) diff --git a/lib/internal/fs/promises.js b/lib/internal/fs/promises.js index 960ea5492d5f8c..05d82991552bad 100644 --- a/lib/internal/fs/promises.js +++ b/lib/internal/fs/promises.js @@ -4,6 +4,11 @@ // See https://github.com/libuv/libuv/pull/1501. const kIoMaxLength = 2 ** 31 - 1; +// Note: This is different from kReadFileBufferLength used for non-promisified +// fs.readFile. +const kReadFileMaxChunkSize = 2 ** 14; +const kWriteFileMaxChunkSize = 2 ** 14; + const { MathMax, MathMin, @@ -150,16 +155,12 @@ async function writeFileHandle(filehandle, data) { do { const { bytesWritten } = await write(filehandle, data, 0, - MathMin(16384, data.length)); + MathMin(kWriteFileMaxChunkSize, data.length)); remaining -= bytesWritten; data = data.slice(bytesWritten); } while (remaining > 0); } -// Note: This is different from kReadFileBufferLength used for non-promisified -// fs.readFile. 
-const kReadFileMaxChunkSize = 16384; - async function readFileHandle(filehandle, options) { const statFields = await binding.fstat(filehandle.fd, false, kUsePromises); From 57c170c75cf3351cf14b3f93a135dfc241d53df0 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?=E9=9B=A8=E5=A4=9C=E5=B8=A6=E5=88=80?= Date: Fri, 17 Apr 2020 16:00:13 +0800 Subject: [PATCH 18/93] doc: fix typo in zlib.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32901 Reviewed-By: Richard Lau Reviewed-By: Zeyu Yang Reviewed-By: Colin Ihrig Reviewed-By: Luigi Pinca Reviewed-By: Ruben Bridgewater Reviewed-By: Trivikram Kamat Reviewed-By: Juan José Arboleda Reviewed-By: James M Snell Reviewed-By: Michaël Zasso --- doc/api/zlib.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/zlib.md b/doc/api/zlib.md index 04465c11352269..195c874f8decbc 100644 --- a/doc/api/zlib.md +++ b/doc/api/zlib.md @@ -166,7 +166,7 @@ request.on('response', (response) => { pipeline(response, zlib.createGunzip(), output, onError); break; case 'deflate': - pipeline(response, zlib.createInflate(), outout, onError); + pipeline(response, zlib.createInflate(), output, onError); break; default: pipeline(response, output, onError); From 68b7c80a446662802fc3a35d50895f4f9e1ce1b9 Mon Sep 17 00:00:00 2001 From: karan singh virdi Date: Sun, 19 Apr 2020 01:32:54 +0530 Subject: [PATCH 19/93] doc: fix usage of folder and directory terms in fs.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit This commit fixes the interchangeably usage of "folder" and "directory" terms in fs.md Fixes: https://github.com/nodejs/node/issues/32902 PR-URL: https://github.com/nodejs/node/pull/32919 Reviewed-By: Juan José Arboleda Reviewed-By: Andrey Pechkurov Reviewed-By: Colin Ihrig Reviewed-By: Zeyu Yang Reviewed-By: David Carlier Reviewed-By: Trivikram Kamat --- doc/api/fs.md | 32 ++++++++++++++++---------------- 1 file changed, 16 insertions(+), 16 deletions(-) diff --git a/doc/api/fs.md b/doc/api/fs.md index 7bbb5cf75a7356..fa87f017d8cee2 100644 --- a/doc/api/fs.md +++ b/doc/api/fs.md @@ -2459,11 +2459,11 @@ changes: Asynchronously creates a directory. The callback is given a possible exception and, if `recursive` is `true`, the -first folder path created, `(err, [path])`. +first directory path created, `(err, [path])`. The optional `options` argument can be an integer specifying `mode` (permission and sticky bits), or an object with a `mode` property and a `recursive` -property indicating whether parent folders should be created. Calling +property indicating whether parent directories should be created. Calling `fs.mkdir()` when `path` is a directory that exists results in an error only when `recursive` is false. @@ -2509,7 +2509,7 @@ changes: * Returns: {string|undefined} Synchronously creates a directory. Returns `undefined`, or if `recursive` is -`true`, the first folder path created. +`true`, the first directory path created. This is the synchronous version of [`fs.mkdir()`][]. See also: mkdir(2). @@ -2536,7 +2536,7 @@ changes: * `encoding` {string} **Default:** `'utf8'` * `callback` {Function} * `err` {Error} - * `folder` {string} + * `directory` {string} Creates a unique temporary directory. @@ -2546,16 +2546,16 @@ inconsistencies, avoid trailing `X` characters in `prefix`. Some platforms, notably the BSDs, can return more than six random characters, and replace trailing `X` characters in `prefix` with random characters. 
-The created folder path is passed as a string to the callback's second +The created directory path is passed as a string to the callback's second parameter. The optional `options` argument can be a string specifying an encoding, or an object with an `encoding` property specifying the character encoding to use. ```js -fs.mkdtemp(path.join(os.tmpdir(), 'foo-'), (err, folder) => { +fs.mkdtemp(path.join(os.tmpdir(), 'foo-'), (err, directory) => { if (err) throw err; - console.log(folder); + console.log(directory); // Prints: /tmp/foo-itXde2 or C:\Users\...\AppData\Local\Temp\foo-itXde2 }); ``` @@ -2571,9 +2571,9 @@ must end with a trailing platform-specific path separator const tmpDir = os.tmpdir(); // This method is *INCORRECT*: -fs.mkdtemp(tmpDir, (err, folder) => { +fs.mkdtemp(tmpDir, (err, directory) => { if (err) throw err; - console.log(folder); + console.log(directory); // Will print something similar to `/tmpabc123`. // A new temporary directory is created at the file system root // rather than *within* the /tmp directory. @@ -2581,9 +2581,9 @@ fs.mkdtemp(tmpDir, (err, folder) => { // This method is *CORRECT*: const { sep } = require('path'); -fs.mkdtemp(`${tmpDir}${sep}`, (err, folder) => { +fs.mkdtemp(`${tmpDir}${sep}`, (err, directory) => { if (err) throw err; - console.log(folder); + console.log(directory); // Will print something similar to `/tmp/abc123`. // A new temporary directory is created within // the /tmp directory. @@ -2600,7 +2600,7 @@ added: v5.10.0 * `encoding` {string} **Default:** `'utf8'` * Returns: {string} -Returns the created folder path. +Returns the created directory path. For detailed information, see the documentation of the asynchronous version of this API: [`fs.mkdtemp()`][]. @@ -3465,7 +3465,7 @@ error raised if the file is not available. To check if a file exists without manipulating it afterwards, [`fs.access()`][] is recommended. -For example, given the following folder structure: +For example, given the following directory structure: ```fundamental - txtDir @@ -4972,11 +4972,11 @@ added: v10.0.0 * Returns: {Promise} Asynchronously creates a directory then resolves the `Promise` with either no -arguments, or the first folder path created if `recursive` is `true`. +arguments, or the first directory path created if `recursive` is `true`. The optional `options` argument can be an integer specifying `mode` (permission and sticky bits), or an object with a `mode` property and a `recursive` -property indicating whether parent folders should be created. Calling +property indicating whether parent directories should be created. Calling `fsPromises.mkdir()` when `path` is a directory that exists results in a rejection only when `recursive` is false. @@ -4991,7 +4991,7 @@ added: v10.0.0 * Returns: {Promise} Creates a unique temporary directory and resolves the `Promise` with the created -folder path. A unique directory name is generated by appending six random +directory path. A unique directory name is generated by appending six random characters to the end of the provided `prefix`. Due to platform inconsistencies, avoid trailing `X` characters in `prefix`. 
Some platforms, notably the BSDs, can return more than six random characters, and replace From c9ae385abf6b865a20cbbb06a862c0da28917ccc Mon Sep 17 00:00:00 2001 From: Andrey Pechkurov Date: Mon, 20 Apr 2020 20:25:50 +0300 Subject: [PATCH 20/93] test: mark test-child-process-fork-args as flaky on Windows MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32950 Refs: https://github.com/nodejs/node/issues/32863 Reviewed-By: Sam Roberts Reviewed-By: Richard Lau Reviewed-By: Luigi Pinca Reviewed-By: Juan José Arboleda --- test/parallel/parallel.status | 2 ++ 1 file changed, 2 insertions(+) diff --git a/test/parallel/parallel.status b/test/parallel/parallel.status index 9314874d3f9082..8703fea2b0b3a7 100644 --- a/test/parallel/parallel.status +++ b/test/parallel/parallel.status @@ -9,6 +9,8 @@ prefix parallel test-http2-reset-flood: PASS,FLAKY [$system==win32] +# https://github.com/nodejs/node/issues/32863 +test-child-process-fork-args: PASS,FLAKY # https://github.com/nodejs/node/issues/20750 test-http2-client-upload: PASS,FLAKY # https://github.com/nodejs/node/issues/20750 From fa710732bf32fceb376160ecaefc9dcde12ae8d3 Mon Sep 17 00:00:00 2001 From: William Armiros <54150514+willarmiros@users.noreply.github.com> Date: Tue, 14 Apr 2020 11:01:58 -0500 Subject: [PATCH 21/93] doc: corrected ERR_SOCKET_CANNOT_SEND message MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32847 Reviewed-By: Colin Ihrig Reviewed-By: Rich Trott Reviewed-By: Ruben Bridgewater Reviewed-By: Juan José Arboleda --- doc/api/errors.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/errors.md b/doc/api/errors.md index 7ed5e5b0bd9ece..5308490b88fe8c 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -2310,7 +2310,7 @@ added: v9.0.0 removed: v14.0.0 --> -Data could be sent on a socket. +Data could not be sent on a socket. ### `ERR_STDERR_CLOSE` From 4221b1c8c942912b14d05ad03c24f9009b26ba94 Mon Sep 17 00:00:00 2001 From: Matt Kulukundis Date: Thu, 16 Apr 2020 19:30:08 -0400 Subject: [PATCH 22/93] src: fix null deref in AllocatedBuffer::clear MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit An empty buffer can have a null environment. Previously, we were getting away with with this, but -fsanitize=null in clang caught it. 
PR-URL: https://github.com/nodejs/node/pull/32892 Reviewed-By: Anna Henningsen Reviewed-By: David Carlier Reviewed-By: Jan Krems Reviewed-By: James M Snell Reviewed-By: Juan José Arboleda --- src/env-inl.h | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/src/env-inl.h b/src/env-inl.h index 853c74f3e5b0f1..9ba5bebe00cb27 100644 --- a/src/env-inl.h +++ b/src/env-inl.h @@ -995,7 +995,10 @@ inline AllocatedBuffer::~AllocatedBuffer() { inline void AllocatedBuffer::clear() { uv_buf_t buf = release(); - env_->Free(buf.base, buf.len); + if (buf.base != nullptr) { + CHECK_NOT_NULL(env_); + env_->Free(buf.base, buf.len); + } } // It's a bit awkward to define this Buffer::New() overload here, but it From f6be140222176164faa98479c0b0908afdd9743a Mon Sep 17 00:00:00 2001 From: Edward Elric Date: Sun, 19 Apr 2020 16:40:18 +0800 Subject: [PATCH 23/93] doc: fix typo in security-release-process.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32926 Reviewed-By: David Carlier Reviewed-By: James M Snell Reviewed-By: Zeyu Yang Reviewed-By: Colin Ihrig Reviewed-By: Juan José Arboleda --- doc/guides/security-release-process.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/guides/security-release-process.md b/doc/guides/security-release-process.md index 55951f94d4ae1f..73d27ae7c60d8c 100644 --- a/doc/guides/security-release-process.md +++ b/doc/guides/security-release-process.md @@ -2,7 +2,7 @@ The security release process covers the steps required to plan/implement a security release. This document is copied into the description of the Next -Security Release, and used to track progess on the release. It contains ***TEXT +Security Release, and used to track progress on the release. It contains ***TEXT LIKE THIS*** which will be replaced during the release process with the information described. From 8663fd5f88730dff768da522cdbd1ae921597ca9 Mon Sep 17 00:00:00 2001 From: Myles Borins Date: Sat, 18 Apr 2020 11:27:54 -0400 Subject: [PATCH 24/93] module: partial doc removal of --experimental-modules MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit This removes `--experimental-modules` from showing up in `node -h` and also removes the documentation from the man pages. It will still work as a no-op, and is still included in cli.md Refs: https://github.com/nodejs/modules/issues/502 PR-URL: https://github.com/nodejs/node/pull/32915 Reviewed-By: Guy Bedford Reviewed-By: Geoffrey Booth Reviewed-By: David Carlier Reviewed-By: Michaël Zasso Reviewed-By: Franziska Hinkelmann Reviewed-By: James M Snell --- doc/node.1 | 3 --- src/node_options.cc | 2 +- 2 files changed, 1 insertion(+), 4 deletions(-) diff --git a/doc/node.1 b/doc/node.1 index 9bb344fdf27b2d..ec2642170aee6f 100644 --- a/doc/node.1 +++ b/doc/node.1 @@ -132,9 +132,6 @@ Specify the .Ar module to use as a custom module loader. . -.It Fl -experimental-modules -Enable experimental latest experimental modules features. -. .It Fl -experimental-policy Use the specified file as a security policy. . 
diff --git a/src/node_options.cc b/src/node_options.cc index 4148cfc04eb2ac..3b9142c19e98a8 100644 --- a/src/node_options.cc +++ b/src/node_options.cc @@ -285,7 +285,7 @@ EnvironmentOptionsParser::EnvironmentOptionsParser() { kAllowedInEnvironment); AddAlias("--loader", "--experimental-loader"); AddOption("--experimental-modules", - "experimental modules features", + "", &EnvironmentOptions::experimental_modules, kAllowedInEnvironment); AddOption("--experimental-wasm-modules", From 6fc4d174b5a59f45048d79698fc73f164a996424 Mon Sep 17 00:00:00 2001 From: James M Snell Date: Wed, 15 Apr 2020 15:21:32 -0700 Subject: [PATCH 25/93] http2: refactor and cleanup http2 * cleanup constants in http2 binding The error constants were just doing some weird things. Cleanup and improve maintainability. * simplify settings to reduce duplicate code * improve style consistency and correctness Use snake_case for getters and setters at c++ level, avoid unnecessary use of enums, use consistent style for exported vs. internal constants, avoid unnecessary memory info reporting, use setters/getters for flags_ for improved code readability * make EmitStatistics function private * un-nest Http2Settings and Http2Ping * refactoring and cleanup of Http2Settings and Http2Ping * avoid ** syntax for readability The **session and **stream syntax for getting the underlying nghttp2 pointers is not ideal for readability * use const references for Http2Priority * remove unnecessary GetStream function * refactor Http2Scope to use BaseObjectPtr * move utility function to anonymous namespace * refactor and simplify Origins * Use an AllocatedBuffer instead of MaybeStackBuffer * Use a const reference instead of pointer * use BaseObjectPtr for Http2Streams map * move MemoryInfo impl to cc Signed-off-by: James M Snell PR-URL: https://github.com/nodejs/node/pull/32884 Reviewed-By: Matteo Collina Reviewed-By: Franziska Hinkelmann --- src/node_http2.cc | 994 +++++++++--------- src/node_http2.h | 500 +++++---- test/parallel/test-http2-getpackedsettings.js | 11 +- 3 files changed, 799 insertions(+), 706 deletions(-) diff --git a/src/node_http2.cc b/src/node_http2.cc index 7477bfbb6d8c10..385a2352040c4e 100644 --- a/src/node_http2.cc +++ b/src/node_http2.cc @@ -21,6 +21,7 @@ using v8::ArrayBuffer; using v8::ArrayBufferView; using v8::Boolean; using v8::Context; +using v8::EscapableHandleScope; using v8::Float64Array; using v8::Function; using v8::FunctionCallbackInfo; @@ -30,7 +31,6 @@ using v8::Integer; using v8::Isolate; using v8::Local; using v8::MaybeLocal; -using v8::NewStringType; using v8::Number; using v8::Object; using v8::ObjectTemplate; @@ -48,15 +48,9 @@ namespace { const char zero_bytes_256[256] = {}; -inline Http2Stream* GetStream(Http2Session* session, - int32_t id, - nghttp2_data_source* source) { - Http2Stream* stream = static_cast(source->ptr); - if (stream == nullptr) - stream = session->FindStream(id); - CHECK_NOT_NULL(stream); - CHECK_EQ(id, stream->id()); - return stream; +bool HasHttp2Observer(Environment* env) { + AliasedUint32Array& observers = env->performance_state()->observers; + return observers[performance::NODE_PERFORMANCE_ENTRY_TYPE_HTTP2] != 0; } } // anonymous namespace @@ -75,36 +69,27 @@ const Http2Session::Callbacks Http2Session::callback_struct_saved[2] = { // For example: // // Http2Scope h2scope(session); -// nghttp2_submit_ping(**session, ... ); +// nghttp2_submit_ping(session->session(), ... 
); // // When the Http2Scope passes out of scope and is deconstructed, it will // call Http2Session::MaybeScheduleWrite(). Http2Scope::Http2Scope(Http2Stream* stream) : Http2Scope(stream->session()) {} -Http2Scope::Http2Scope(Http2Session* session) { - if (session == nullptr) - return; +Http2Scope::Http2Scope(Http2Session* session) : session_(session) { + if (!session_) return; - if (session->flags_ & (SESSION_STATE_HAS_SCOPE | - SESSION_STATE_WRITE_SCHEDULED)) { - // There is another scope further below on the stack, or it is already - // known that a write is scheduled. In either case, there is nothing to do. + // If there is another scope further below on the stack, or + // a write is already scheduled, there's nothing to do. + if (session_->is_in_scope() || session_->is_write_scheduled()) { + session_.reset(); return; } - session->flags_ |= SESSION_STATE_HAS_SCOPE; - session_ = session; - - // Always keep the session object alive for at least as long as - // this scope is active. - session_handle_ = session->object(); - CHECK(!session_handle_.IsEmpty()); + session_->set_in_scope(); } Http2Scope::~Http2Scope() { - if (session_ == nullptr) - return; - - session_->flags_ &= ~SESSION_STATE_HAS_SCOPE; + if (!session_) return; + session_->set_in_scope(false); session_->MaybeScheduleWrite(); } @@ -112,7 +97,7 @@ Http2Scope::~Http2Scope() { // instances to configure an appropriate nghttp2_options struct. The class // uses a single TypedArray instance that is shared with the JavaScript side // to more efficiently pass values back and forth. -Http2Options::Http2Options(Http2State* http2_state, nghttp2_session_type type) { +Http2Options::Http2Options(Http2State* http2_state, SessionType type) { nghttp2_option* option; CHECK_EQ(nghttp2_option_new(&option), 0); CHECK_NOT_NULL(option); @@ -171,10 +156,10 @@ Http2Options::Http2Options(Http2State* http2_state, nghttp2_session_type type) { // this is set on a per-session basis, but eventually we may switch to // a per-stream setting, giving users greater control if (flags & (1 << IDX_OPTIONS_PADDING_STRATEGY)) { - padding_strategy_type strategy = - static_cast( + PaddingStrategy strategy = + static_cast( buffer.GetValue(IDX_OPTIONS_PADDING_STRATEGY)); - SetPaddingStrategy(strategy); + set_padding_strategy(strategy); } // The max header list pairs option controls the maximum number of @@ -182,7 +167,7 @@ Http2Options::Http2Options(Http2State* http2_state, nghttp2_session_type type) { // if the remote peer sends more than this amount, the stream will be // automatically closed with an RST_STREAM. if (flags & (1 << IDX_OPTIONS_MAX_HEADER_LIST_PAIRS)) - SetMaxHeaderPairs(buffer[IDX_OPTIONS_MAX_HEADER_LIST_PAIRS]); + set_max_header_pairs(buffer[IDX_OPTIONS_MAX_HEADER_LIST_PAIRS]); // The HTTP2 specification places no limits on the number of HTTP2 // PING frames that can be sent. In order to prevent PINGS from being @@ -190,7 +175,7 @@ Http2Options::Http2Options(Http2State* http2_state, nghttp2_session_type type) { // on the number of unacknowledged PINGS that can be sent at any given // time. if (flags & (1 << IDX_OPTIONS_MAX_OUTSTANDING_PINGS)) - SetMaxOutstandingPings(buffer[IDX_OPTIONS_MAX_OUTSTANDING_PINGS]); + set_max_outstanding_pings(buffer[IDX_OPTIONS_MAX_OUTSTANDING_PINGS]); // The HTTP2 specification places no limits on the number of HTTP2 // SETTINGS frames that can be sent. 
In order to prevent PINGS from being @@ -198,7 +183,7 @@ Http2Options::Http2Options(Http2State* http2_state, nghttp2_session_type type) { // on the number of unacknowledged SETTINGS that can be sent at any given // time. if (flags & (1 << IDX_OPTIONS_MAX_OUTSTANDING_SETTINGS)) - SetMaxOutstandingSettings(buffer[IDX_OPTIONS_MAX_OUTSTANDING_SETTINGS]); + set_max_outstanding_settings(buffer[IDX_OPTIONS_MAX_OUTSTANDING_SETTINGS]); // The HTTP2 specification places no limits on the amount of memory // that a session can consume. In order to prevent abuse, we place a @@ -209,131 +194,133 @@ Http2Options::Http2Options(Http2State* http2_state, nghttp2_session_type type) { // Important: The maxSessionMemory option in javascript is expressed in // terms of MB increments (i.e. the value 1 == 1 MB) if (flags & (1 << IDX_OPTIONS_MAX_SESSION_MEMORY)) - SetMaxSessionMemory(buffer[IDX_OPTIONS_MAX_SESSION_MEMORY] * 1000000); + set_max_session_memory(buffer[IDX_OPTIONS_MAX_SESSION_MEMORY] * 1000000); } -void Http2Session::Http2Settings::Init(Http2State* http2_state) { +#define GRABSETTING(entries, count, name) \ + do { \ + if (flags & (1 << IDX_SETTINGS_ ## name)) { \ + uint32_t val = buffer[IDX_SETTINGS_ ## name]; \ + entries[count++] = \ + nghttp2_settings_entry {NGHTTP2_SETTINGS_ ## name, val}; \ + } } while (0) + +size_t Http2Settings::Init( + Http2State* http2_state, + nghttp2_settings_entry* entries) { AliasedUint32Array& buffer = http2_state->settings_buffer; uint32_t flags = buffer[IDX_SETTINGS_COUNT]; - size_t n = 0; - -#define GRABSETTING(N, trace) \ - if (flags & (1 << IDX_SETTINGS_##N)) { \ - uint32_t val = buffer[IDX_SETTINGS_##N]; \ - if (session_ != nullptr) \ - Debug(session_, "setting " trace ": %d\n", val); \ - entries_[n++] = \ - nghttp2_settings_entry {NGHTTP2_SETTINGS_##N, val}; \ - } + size_t count = 0; - GRABSETTING(HEADER_TABLE_SIZE, "header table size"); - GRABSETTING(MAX_CONCURRENT_STREAMS, "max concurrent streams"); - GRABSETTING(MAX_FRAME_SIZE, "max frame size"); - GRABSETTING(INITIAL_WINDOW_SIZE, "initial window size"); - GRABSETTING(MAX_HEADER_LIST_SIZE, "max header list size"); - GRABSETTING(ENABLE_PUSH, "enable push"); - GRABSETTING(ENABLE_CONNECT_PROTOCOL, "enable connect protocol"); - -#undef GRABSETTING +#define V(name) GRABSETTING(entries, count, name); + HTTP2_SETTINGS(V) +#undef V - count_ = n; + return count; } +#undef GRABSETTING // The Http2Settings class is used to configure a SETTINGS frame that is // to be sent to the connected peer. The settings are set using a TypedArray // that is shared with the JavaScript side. -Http2Session::Http2Settings::Http2Settings(Http2State* http2_state, - Http2Session* session, - Local obj, - uint64_t start_time) - : AsyncWrap(http2_state->env(), obj, PROVIDER_HTTP2SETTINGS), +Http2Settings::Http2Settings(Http2Session* session, + Local obj, + Local callback, + uint64_t start_time) + : AsyncWrap(session->env(), obj, PROVIDER_HTTP2SETTINGS), session_(session), startTime_(start_time) { - Init(http2_state); + callback_.Reset(env()->isolate(), callback); + count_ = Init(session->http2_state(), entries_); +} + +Local Http2Settings::callback() const { + return callback_.Get(env()->isolate()); +} + +void Http2Settings::MemoryInfo(MemoryTracker* tracker) const { + tracker->TrackField("callback", callback_); } // Generates a Buffer that contains the serialized payload of a SETTINGS // frame. This can be used, for instance, to create the Base64-encoded // content of an Http2-Settings header field. 
-Local Http2Session::Http2Settings::Pack() { - const size_t len = count_ * 6; - Local buf = Buffer::New(env(), len).ToLocalChecked(); +Local Http2Settings::Pack() { + return Pack(session_->env(), count_, entries_); +} + +Local Http2Settings::Pack(Http2State* state) { + nghttp2_settings_entry entries[IDX_SETTINGS_COUNT]; + size_t count = Init(state, entries); + return Pack(state->env(), count, entries); +} + +Local Http2Settings::Pack( + Environment* env, + size_t count, + const nghttp2_settings_entry* entries) { + EscapableHandleScope scope(env->isolate()); + const size_t size = count * 6; + AllocatedBuffer buffer = env->AllocateManaged(size); ssize_t ret = nghttp2_pack_settings_payload( - reinterpret_cast(Buffer::Data(buf)), len, - &entries_[0], count_); - if (ret >= 0) - return buf; - else - return Undefined(env()->isolate()); + reinterpret_cast(buffer.data()), + size, + entries, + count); + Local buf = Undefined(env->isolate()); + if (ret >= 0) buf = buffer.ToBuffer().ToLocalChecked(); + return scope.Escape(buf); } // Updates the shared TypedArray with the current remote or local settings for // the session. -void Http2Session::Http2Settings::Update(Http2Session* session, - get_setting fn) { +void Http2Settings::Update(Http2Session* session, get_setting fn) { AliasedUint32Array& buffer = session->http2_state()->settings_buffer; - buffer[IDX_SETTINGS_HEADER_TABLE_SIZE] = - fn(**session, NGHTTP2_SETTINGS_HEADER_TABLE_SIZE); - buffer[IDX_SETTINGS_MAX_CONCURRENT_STREAMS] = - fn(**session, NGHTTP2_SETTINGS_MAX_CONCURRENT_STREAMS); - buffer[IDX_SETTINGS_INITIAL_WINDOW_SIZE] = - fn(**session, NGHTTP2_SETTINGS_INITIAL_WINDOW_SIZE); - buffer[IDX_SETTINGS_MAX_FRAME_SIZE] = - fn(**session, NGHTTP2_SETTINGS_MAX_FRAME_SIZE); - buffer[IDX_SETTINGS_MAX_HEADER_LIST_SIZE] = - fn(**session, NGHTTP2_SETTINGS_MAX_HEADER_LIST_SIZE); - buffer[IDX_SETTINGS_ENABLE_PUSH] = - fn(**session, NGHTTP2_SETTINGS_ENABLE_PUSH); - buffer[IDX_SETTINGS_ENABLE_CONNECT_PROTOCOL] = - fn(**session, NGHTTP2_SETTINGS_ENABLE_CONNECT_PROTOCOL); + +#define V(name) \ + buffer[IDX_SETTINGS_ ## name] = \ + fn(session->session(), NGHTTP2_SETTINGS_ ## name); + HTTP2_SETTINGS(V) +#undef V } // Initializes the shared TypedArray with the default settings values. 
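RefreshDefaults() in the next hunk relies on the same HTTP2_SETTINGS(V) X-macro to write each default value and accumulate the flags bitmask in a single pass. The following standalone sketch shows the general shape of that pattern; the three-entry DEMO_SETTINGS list and the IDX_*/DEFAULT_* names are hypothetical stand-ins, not the definitions from node_http2.h:

#include <cstdint>
#include <cstdio>

// Hypothetical three-entry stand-in for the real HTTP2_SETTINGS(V) list.
#define DEMO_SETTINGS(V)  \
  V(HEADER_TABLE_SIZE)    \
  V(ENABLE_PUSH)          \
  V(MAX_FRAME_SIZE)

enum { IDX_HEADER_TABLE_SIZE, IDX_ENABLE_PUSH, IDX_MAX_FRAME_SIZE, IDX_COUNT };

constexpr uint32_t DEFAULT_HEADER_TABLE_SIZE = 4096;
constexpr uint32_t DEFAULT_ENABLE_PUSH = 1;
constexpr uint32_t DEFAULT_MAX_FRAME_SIZE = 16384;

int main() {
  // The last slot plays the role of IDX_SETTINGS_COUNT in the shared buffer.
  uint32_t buffer[IDX_COUNT + 1] = {0};
  uint32_t flags = 0;

  // One X-macro expansion writes every default and records which slots were
  // filled, the same shape as the RefreshDefaults() hunk below.
#define V(name)                          \
  buffer[IDX_##name] = DEFAULT_##name;   \
  flags |= 1u << IDX_##name;
  DEMO_SETTINGS(V)
#undef V

  buffer[IDX_COUNT] = flags;

  printf("flags bitmask: 0x%x\n", static_cast<unsigned>(flags));
  for (int i = 0; i < IDX_COUNT; i++)
    printf("setting %d = %u\n", i, static_cast<unsigned>(buffer[i]));
  return 0;
}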
-void Http2Session::Http2Settings::RefreshDefaults(Http2State* http2_state) { +void Http2Settings::RefreshDefaults(Http2State* http2_state) { AliasedUint32Array& buffer = http2_state->settings_buffer; + uint32_t flags = 0; + +#define V(name) \ + do { \ + buffer[IDX_SETTINGS_ ## name] = DEFAULT_SETTINGS_ ## name; \ + flags |= 1 << IDX_SETTINGS_ ## name; \ + } while (0); + HTTP2_SETTINGS(V) +#undef V + + buffer[IDX_SETTINGS_COUNT] = flags; +} - buffer[IDX_SETTINGS_HEADER_TABLE_SIZE] = - DEFAULT_SETTINGS_HEADER_TABLE_SIZE; - buffer[IDX_SETTINGS_ENABLE_PUSH] = - DEFAULT_SETTINGS_ENABLE_PUSH; - buffer[IDX_SETTINGS_MAX_CONCURRENT_STREAMS] = - DEFAULT_SETTINGS_MAX_CONCURRENT_STREAMS; - buffer[IDX_SETTINGS_INITIAL_WINDOW_SIZE] = - DEFAULT_SETTINGS_INITIAL_WINDOW_SIZE; - buffer[IDX_SETTINGS_MAX_FRAME_SIZE] = - DEFAULT_SETTINGS_MAX_FRAME_SIZE; - buffer[IDX_SETTINGS_MAX_HEADER_LIST_SIZE] = - DEFAULT_SETTINGS_MAX_HEADER_LIST_SIZE; - buffer[IDX_SETTINGS_ENABLE_CONNECT_PROTOCOL] = - DEFAULT_SETTINGS_ENABLE_CONNECT_PROTOCOL; - buffer[IDX_SETTINGS_COUNT] = - (1 << IDX_SETTINGS_HEADER_TABLE_SIZE) | - (1 << IDX_SETTINGS_ENABLE_PUSH) | - (1 << IDX_SETTINGS_MAX_CONCURRENT_STREAMS) | - (1 << IDX_SETTINGS_INITIAL_WINDOW_SIZE) | - (1 << IDX_SETTINGS_MAX_FRAME_SIZE) | - (1 << IDX_SETTINGS_MAX_HEADER_LIST_SIZE) | - (1 << IDX_SETTINGS_ENABLE_CONNECT_PROTOCOL); -} - - -void Http2Session::Http2Settings::Send() { - Http2Scope h2scope(session_); - CHECK_EQ(nghttp2_submit_settings(**session_, NGHTTP2_FLAG_NONE, - &entries_[0], count_), 0); -} - -void Http2Session::Http2Settings::Done(bool ack) { + +void Http2Settings::Send() { + Http2Scope h2scope(session_.get()); + CHECK_EQ(nghttp2_submit_settings( + session_->session(), + NGHTTP2_FLAG_NONE, + &entries_[0], + count_), 0); +} + +void Http2Settings::Done(bool ack) { uint64_t end = uv_hrtime(); double duration = (end - startTime_) / 1e6; Local argv[] = { - Boolean::New(env()->isolate(), ack), + ack ? v8::True(env()->isolate()) : v8::False(env()->isolate()), Number::New(env()->isolate(), duration) }; - MakeCallback(env()->ondone_string(), arraysize(argv), argv); + MakeCallback(callback(), arraysize(argv), argv); } // The Http2Priority class initializes an appropriate nghttp2_priority_spec @@ -364,34 +351,32 @@ const char* Http2Session::TypeName() const { } } -Origins::Origins(Isolate* isolate, - Local context, - Local origin_string, - size_t origin_count) : count_(origin_count) { +Origins::Origins( + Environment* env, + Local origin_string, + size_t origin_count) + : count_(origin_count) { int origin_string_len = origin_string->Length(); if (count_ == 0) { CHECK_EQ(origin_string_len, 0); return; } - // Allocate a single buffer with count_ nghttp2_nv structs, followed - // by the raw header data as passed from JS. This looks like: - // | possible padding | nghttp2_nv | nghttp2_nv | ... | header contents | - buf_.AllocateSufficientStorage((alignof(nghttp2_origin_entry) - 1) + - count_ * sizeof(nghttp2_origin_entry) + - origin_string_len); + buf_ = env->AllocateManaged((alignof(nghttp2_origin_entry) - 1) + + count_ * sizeof(nghttp2_origin_entry) + + origin_string_len); // Make sure the start address is aligned appropriately for an nghttp2_nv*. 
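The Origins constructor continues below by rounding the start of the single allocation up so the array of entries is correctly aligned before the raw header bytes are copied in after it. A standalone sketch of that round-up arithmetic (RoundUpPtr and Entry are illustrative names, not the helpers used in the patch) might look like this:

#include <cstddef>
#include <cstdint>
#include <cstdio>

// Round p up to the next address correctly aligned for T.
template <typename T>
T* RoundUpPtr(char* p) {
  const uintptr_t align = alignof(T);
  const uintptr_t value = reinterpret_cast<uintptr_t>(p);
  return reinterpret_cast<T*>((value + (align - 1)) & ~(align - 1));
}

// Small stand-in for nghttp2_origin_entry (a pointer plus a length).
struct Entry {
  const uint8_t* origin;
  size_t origin_len;
};

int main() {
  alignas(16) char storage[64];
  // Start one byte past an aligned boundary to force an adjustment.
  Entry* first = RoundUpPtr<Entry>(storage + 1);
  printf("adjusted by %zu bytes\n",
         static_cast<size_t>(reinterpret_cast<char*>(first) - (storage + 1)));
  return 0;
}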
char* start = reinterpret_cast( - RoundUp(reinterpret_cast(*buf_), + RoundUp(reinterpret_cast(buf_.data()), alignof(nghttp2_origin_entry))); char* origin_contents = start + (count_ * sizeof(nghttp2_origin_entry)); nghttp2_origin_entry* const nva = reinterpret_cast(start); - CHECK_LE(origin_contents + origin_string_len, *buf_ + buf_.length()); + CHECK_LE(origin_contents + origin_string_len, buf_.data() + buf_.size()); CHECK_EQ(origin_string->WriteOneByte( - isolate, + env->isolate(), reinterpret_cast(origin_contents), 0, origin_string_len, @@ -469,7 +454,7 @@ void Http2Session::DecreaseAllocatedSize(size_t size) { Http2Session::Http2Session(Http2State* http2_state, Local wrap, - nghttp2_session_type type) + SessionType type) : AsyncWrap(http2_state->env(), wrap, AsyncWrap::PROVIDER_HTTP2SESSION), js_fields_(http2_state->env()->isolate()), session_type_(type), @@ -480,18 +465,18 @@ Http2Session::Http2Session(Http2State* http2_state, // Capture the configuration options for this session Http2Options opts(http2_state, type); - max_session_memory_ = opts.GetMaxSessionMemory(); + max_session_memory_ = opts.max_session_memory(); - uint32_t maxHeaderPairs = opts.GetMaxHeaderPairs(); + uint32_t maxHeaderPairs = opts.max_header_pairs(); max_header_pairs_ = type == NGHTTP2_SESSION_SERVER ? GetServerMaxHeaderPairs(maxHeaderPairs) : GetClientMaxHeaderPairs(maxHeaderPairs); - max_outstanding_pings_ = opts.GetMaxOutstandingPings(); - max_outstanding_settings_ = opts.GetMaxOutstandingSettings(); + max_outstanding_pings_ = opts.max_outstanding_pings(); + max_outstanding_settings_ = opts.max_outstanding_settings(); - padding_strategy_ = opts.GetPaddingStrategy(); + padding_strategy_ = opts.padding_strategy(); bool hasGetPaddingCallback = padding_strategy_ != PADDING_STRATEGY_NONE; @@ -525,7 +510,7 @@ Http2Session::Http2Session(Http2State* http2_state, } Http2Session::~Http2Session() { - CHECK_EQ(flags_ & SESSION_STATE_HAS_SCOPE, 0); + CHECK(!is_in_scope()); Debug(this, "freeing nghttp2 session"); // Explicitly reset session_ so the subsequent // current_nghttp2_memory_ check passes. 
@@ -533,16 +518,23 @@ Http2Session::~Http2Session() { CHECK_EQ(current_nghttp2_memory_, 0); } +void Http2Session::MemoryInfo(MemoryTracker* tracker) const { + tracker->TrackField("streams", streams_); + tracker->TrackField("outstanding_pings", outstanding_pings_); + tracker->TrackField("outstanding_settings", outstanding_settings_); + tracker->TrackField("outgoing_buffers", outgoing_buffers_); + tracker->TrackFieldWithSize("stream_buf", stream_buf_.len); + tracker->TrackFieldWithSize("outgoing_storage", outgoing_storage_.size()); + tracker->TrackFieldWithSize("pending_rst_streams", + pending_rst_streams_.size() * sizeof(int32_t)); + tracker->TrackFieldWithSize("nghttp2_memory", current_nghttp2_memory_); +} + std::string Http2Session::diagnostic_name() const { return std::string("Http2Session ") + TypeName() + " (" + std::to_string(static_cast(get_async_id())) + ")"; } -inline bool HasHttp2Observer(Environment* env) { - AliasedUint32Array& observers = env->performance_state()->observers; - return observers[performance::NODE_PERFORMANCE_ENTRY_TYPE_HTTP2] != 0; -} - void Http2Stream::EmitStatistics() { CHECK_NOT_NULL(session()); if (!HasHttp2Observer(env())) @@ -615,13 +607,13 @@ void Http2Session::EmitStatistics() { void Http2Session::Close(uint32_t code, bool socket_closed) { Debug(this, "closing session"); - if (flags_ & SESSION_STATE_CLOSING) + if (is_closing()) return; - flags_ |= SESSION_STATE_CLOSING; + set_closing(); // Stop reading on the i/o stream if (stream_ != nullptr) { - flags_ |= SESSION_STATE_READING_STOPPED; + set_reading_stopped(); stream_->ReadStop(); } @@ -637,10 +629,10 @@ void Http2Session::Close(uint32_t code, bool socket_closed) { stream_->RemoveStreamListener(this); } - flags_ |= SESSION_STATE_CLOSED; + set_destroyed(); // If we are writing we will get to make the callback in OnStreamAfterWrite. - if ((flags_ & SESSION_STATE_WRITE_IN_PROGRESS) == 0) { + if (!is_write_in_progress()) { Debug(this, "make done session callback"); HandleScope scope(env()->isolate()); MakeCallback(env()->ondone_string(), 0, nullptr); @@ -663,12 +655,12 @@ void Http2Session::Close(uint32_t code, bool socket_closed) { // Locates an existing known stream by ID. nghttp2 has a similar method // but this is faster and does not fail if the stream is not found. -inline Http2Stream* Http2Session::FindStream(int32_t id) { +BaseObjectPtr Http2Session::FindStream(int32_t id) { auto s = streams_.find(id); - return s != streams_.end() ? s->second : nullptr; + return s != streams_.end() ? 
s->second : BaseObjectPtr(); } -inline bool Http2Session::CanAddStream() { +bool Http2Session::CanAddStream() { uint32_t maxConcurrentStreams = nghttp2_session_get_local_settings( session_.get(), NGHTTP2_SETTINGS_MAX_CONCURRENT_STREAMS); @@ -677,12 +669,12 @@ inline bool Http2Session::CanAddStream() { // We can add a new stream so long as we are less than the current // maximum on concurrent streams and there's enough available memory return streams_.size() < maxSize && - IsAvailableSessionMemory(sizeof(Http2Stream)); + has_available_session_memory(sizeof(Http2Stream)); } -inline void Http2Session::AddStream(Http2Stream* stream) { +void Http2Session::AddStream(Http2Stream* stream) { CHECK_GE(++statistics_.stream_count, 0); - streams_[stream->id()] = stream; + streams_[stream->id()] = BaseObjectPtr(stream); size_t size = streams_.size(); if (size > statistics_.max_concurrent_streams) statistics_.max_concurrent_streams = size; @@ -690,11 +682,16 @@ inline void Http2Session::AddStream(Http2Stream* stream) { } -inline void Http2Session::RemoveStream(Http2Stream* stream) { - if (streams_.empty() || stream == nullptr) - return; // Nothing to remove, item was never added? - streams_.erase(stream->id()); - DecrementCurrentSessionMemory(sizeof(*stream)); +BaseObjectPtr Http2Session::RemoveStream(int32_t id) { + BaseObjectPtr stream; + if (streams_.empty()) + return stream; + stream = FindStream(id); + if (stream) { + streams_.erase(id); + DecrementCurrentSessionMemory(sizeof(*stream)); + } + return stream; } // Used as one of the Padding Strategy functions. Will attempt to ensure @@ -738,7 +735,7 @@ ssize_t Http2Session::ConsumeHTTP2Data() { Debug(this, "receiving %d bytes [wants data? %d]", read_len, nghttp2_session_want_read(session_.get())); - flags_ &= ~SESSION_STATE_NGHTTP2_RECV_PAUSED; + set_receive_paused(false); ssize_t ret = nghttp2_session_mem_recv(session_.get(), reinterpret_cast(stream_buf_.base) + @@ -746,8 +743,8 @@ ssize_t Http2Session::ConsumeHTTP2Data() { read_len); CHECK_NE(ret, NGHTTP2_ERR_NOMEM); - if (flags_ & SESSION_STATE_NGHTTP2_RECV_PAUSED) { - CHECK_NE(flags_ & SESSION_STATE_READING_STOPPED, 0); + if (is_receive_paused()) { + CHECK(is_reading_stopped()); CHECK_GT(ret, 0); CHECK_LE(static_cast(ret), read_len); @@ -770,14 +767,14 @@ ssize_t Http2Session::ConsumeHTTP2Data() { return ret; // Send any data that was queued up while processing the received data. - if (!IsDestroyed()) { + if (!is_destroyed()) { SendPendingData(); } return ret; } -inline int32_t GetFrameID(const nghttp2_frame* frame) { +int32_t GetFrameID(const nghttp2_frame* frame) { // If this is a push promise, we want to grab the id of the promised stream return (frame->hd.type == NGHTTP2_PUSH_PROMISE) ? frame->push_promise.promised_stream_id : @@ -796,10 +793,10 @@ int Http2Session::OnBeginHeadersCallback(nghttp2_session* handle, int32_t id = GetFrameID(frame); Debug(session, "beginning headers for stream %d", id); - Http2Stream* stream = session->FindStream(id); + BaseObjectPtr stream = session->FindStream(id); // The common case is that we're creating a new stream. 
The less likely // case is that we're receiving a set of trailers - if (LIKELY(stream == nullptr)) { + if (LIKELY(!stream)) { if (UNLIKELY(!session->CanAddStream() || Http2Stream::New(session, id, frame->headers.cat) == nullptr)) { @@ -807,13 +804,16 @@ int Http2Session::OnBeginHeadersCallback(nghttp2_session* handle, session->js_fields_->max_rejected_streams) return NGHTTP2_ERR_CALLBACK_FAILURE; // Too many concurrent streams being opened - nghttp2_submit_rst_stream(**session, NGHTTP2_FLAG_NONE, id, - NGHTTP2_ENHANCE_YOUR_CALM); + nghttp2_submit_rst_stream( + session->session(), + NGHTTP2_FLAG_NONE, + id, + NGHTTP2_ENHANCE_YOUR_CALM); return NGHTTP2_ERR_TEMPORAL_CALLBACK_FAILURE; } session->rejected_stream_count_ = 0; - } else if (!stream->IsDestroyed()) { + } else if (!stream->is_destroyed()) { stream->StartHeaders(frame->headers.cat); } return 0; @@ -830,15 +830,15 @@ int Http2Session::OnHeaderCallback(nghttp2_session* handle, void* user_data) { Http2Session* session = static_cast(user_data); int32_t id = GetFrameID(frame); - Http2Stream* stream = session->FindStream(id); + BaseObjectPtr stream = session->FindStream(id); // If stream is null at this point, either something odd has happened // or the stream was closed locally while header processing was occurring. // either way, do not proceed and close the stream. - if (UNLIKELY(stream == nullptr)) + if (UNLIKELY(!stream)) return NGHTTP2_ERR_TEMPORAL_CALLBACK_FAILURE; // If the stream has already been destroyed, ignore. - if (!stream->IsDestroyed() && !stream->AddHeader(name, value, flags)) { + if (!stream->is_destroyed() && !stream->AddHeader(name, value, flags)) { // This will only happen if the connected peer sends us more // than the allowed number of header items at any given time stream->SubmitRstStream(NGHTTP2_ENHANCE_YOUR_CALM); @@ -976,10 +976,10 @@ int Http2Session::OnStreamClose(nghttp2_session* handle, Local context = env->context(); Context::Scope context_scope(context); Debug(session, "stream %d closed with code: %d", id, code); - Http2Stream* stream = session->FindStream(id); + BaseObjectPtr stream = session->FindStream(id); // Intentionally ignore the callback if the stream does not exist or has // already been destroyed - if (stream == nullptr || stream->IsDestroyed()) + if (!stream || stream->is_destroyed()) return 0; stream->Close(code); @@ -991,6 +991,7 @@ int Http2Session::OnStreamClose(nghttp2_session* handle, MaybeLocal answer = stream->MakeCallback(env->http2session_on_stream_close_function(), 1, &arg); + Local def = v8::False(env->isolate()); if (answer.IsEmpty() || answer.ToLocalChecked()->IsFalse()) { // Skip to destroy stream->Destroy(); @@ -1037,9 +1038,10 @@ int Http2Session::OnDataChunkReceived(nghttp2_session* handle, // so that it can send a WINDOW_UPDATE frame. This is a critical part of // the flow control process in http2 CHECK_EQ(nghttp2_session_consume_connection(handle, len), 0); - Http2Stream* stream = session->FindStream(id); + BaseObjectPtr stream = session->FindStream(id); + // If the stream has been destroyed, ignore this chunk - if (stream->IsDestroyed()) + if (!stream || stream->is_destroyed()) return 0; stream->statistics_.received_bytes += len; @@ -1071,7 +1073,7 @@ int Http2Session::OnDataChunkReceived(nghttp2_session* handle, // If the stream owner (e.g. the JS Http2Stream) wants more data, just // tell nghttp2 that all data has been consumed. Otherwise, defer until // more data is being requested. 
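The conditional that follows implements exactly this choice: stream-level consumption is acknowledged immediately while the JS side is reading, and is otherwise accumulated in inbound_consumed_data_while_paused_ until reading resumes. A standalone sketch of that deferral idea, using hypothetical names rather than the session code above, could be:

#include <cstddef>
#include <cstdio>

class PausableConsumer {
 public:
  void OnData(size_t len) {
    if (reading_) {
      acknowledged_ += len;            // consume (acknowledge) immediately
    } else {
      deferred_while_paused_ += len;   // remember it for when reading resumes
    }
  }
  void PauseReading() { reading_ = false; }
  void ResumeReading() {
    reading_ = true;
    acknowledged_ += deferred_while_paused_;  // flush the deferred amount
    deferred_while_paused_ = 0;
  }
  size_t acknowledged() const { return acknowledged_; }

 private:
  bool reading_ = true;
  size_t deferred_while_paused_ = 0;
  size_t acknowledged_ = 0;
};

int main() {
  PausableConsumer c;
  c.OnData(100);        // acknowledged right away
  c.PauseReading();
  c.OnData(50);         // deferred
  c.ResumeReading();    // the 50 bytes are acknowledged here
  printf("acknowledged: %zu bytes\n", c.acknowledged());
  return 0;
}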
- if (stream->IsReading()) + if (stream->is_reading()) nghttp2_session_consume_stream(handle, id, avail); else stream->inbound_consumed_data_while_paused_ += avail; @@ -1085,9 +1087,9 @@ int Http2Session::OnDataChunkReceived(nghttp2_session* handle, // If we are currently waiting for a write operation to finish, we should // tell nghttp2 that we want to wait before we process more input data. - if (session->flags_ & SESSION_STATE_WRITE_IN_PROGRESS) { - CHECK_NE(session->flags_ & SESSION_STATE_READING_STOPPED, 0); - session->flags_ |= SESSION_STATE_NGHTTP2_RECV_PAUSED; + if (session->is_write_in_progress()) { + CHECK(session->is_reading_stopped()); + session->set_receive_paused(); return NGHTTP2_ERR_PAUSE; } @@ -1194,10 +1196,10 @@ void Http2Session::HandleHeadersFrame(const nghttp2_frame* frame) { int32_t id = GetFrameID(frame); Debug(this, "handle headers frame for stream %d", id); - Http2Stream* stream = FindStream(id); + BaseObjectPtr stream = FindStream(id); // If the stream has already been destroyed, ignore. - if (stream->IsDestroyed()) + if (!stream || stream->is_destroyed()) return; // The headers are stored as a vector of Http2Header instances. @@ -1263,9 +1265,11 @@ void Http2Session::HandlePriorityFrame(const nghttp2_frame* frame) { int Http2Session::HandleDataFrame(const nghttp2_frame* frame) { int32_t id = GetFrameID(frame); Debug(this, "handling data frame for stream %d", id); - Http2Stream* stream = FindStream(id); + BaseObjectPtr stream = FindStream(id); - if (!stream->IsDestroyed() && frame->hd.flags & NGHTTP2_FLAG_END_STREAM) { + if (stream && + !stream->is_destroyed() && + frame->hd.flags & NGHTTP2_FLAG_END_STREAM) { stream->EmitRead(UV_EOF); } else if (frame->hd.length == 0) { return 1; // Consider 0-length frame without END_STREAM an error. @@ -1292,6 +1296,9 @@ void Http2Session::HandleGoawayFrame(const nghttp2_frame* frame) { size_t length = goaway_frame.opaque_data_len; if (length > 0) { + // If the copy fails for any reason here, we just ignore it. + // The additional goaway data is completely optional and we + // shouldn't fail if we're not able to process it. 
argv[2] = Buffer::Copy(isolate, reinterpret_cast(goaway_frame.opaque_data), length).ToLocalChecked(); @@ -1317,14 +1324,8 @@ void Http2Session::HandleAltSvcFrame(const nghttp2_frame* frame) { Local argv[3] = { Integer::New(isolate, id), - String::NewFromOneByte(isolate, - altsvc->origin, - NewStringType::kNormal, - altsvc->origin_len).ToLocalChecked(), - String::NewFromOneByte(isolate, - altsvc->field_value, - NewStringType::kNormal, - altsvc->field_value_len).ToLocalChecked(), + OneByteString(isolate, altsvc->origin, altsvc->origin_len), + OneByteString(isolate, altsvc->field_value, altsvc->field_value_len) }; MakeCallback(env()->http2session_on_altsvc_function(), @@ -1347,10 +1348,7 @@ void Http2Session::HandleOriginFrame(const nghttp2_frame* frame) { for (size_t i = 0; i < nov; ++i) { const nghttp2_origin_entry& entry = origin->ov[i]; - origin_v[i] = - String::NewFromOneByte( - isolate, entry.origin, NewStringType::kNormal, entry.origin_len) - .ToLocalChecked(); + origin_v[i] = OneByteString(isolate, entry.origin, entry.origin_len); } Local holder = Array::New(isolate, origin_v.data(), origin_v.size()); MakeCallback(env()->http2session_on_origin_function(), 1, &holder); @@ -1384,9 +1382,10 @@ void Http2Session::HandlePingFrame(const nghttp2_frame* frame) { if (!(js_fields_->bitfield & (1 << kSessionHasPingListeners))) return; // Notify the session that a ping occurred - arg = Buffer::Copy(env(), - reinterpret_cast(frame->ping.opaque_data), - 8).ToLocalChecked(); + arg = Buffer::Copy( + env(), + reinterpret_cast(frame->ping.opaque_data), + 8).ToLocalChecked(); MakeCallback(env()->http2session_on_ping_function(), 1, &arg); } @@ -1430,20 +1429,20 @@ void Http2Session::HandleSettingsFrame(const nghttp2_frame* frame) { void Http2Session::OnStreamAfterWrite(WriteWrap* w, int status) { Debug(this, "write finished with status %d", status); - CHECK_NE(flags_ & SESSION_STATE_WRITE_IN_PROGRESS, 0); - flags_ &= ~SESSION_STATE_WRITE_IN_PROGRESS; + CHECK(is_write_in_progress()); + set_write_in_progress(false); // Inform all pending writes about their completion. ClearOutgoing(status); - if ((flags_ & SESSION_STATE_READING_STOPPED) && - !(flags_ & SESSION_STATE_WRITE_IN_PROGRESS) && + if (is_reading_stopped() && + !is_write_in_progress() && nghttp2_session_want_read(session_.get())) { - flags_ &= ~SESSION_STATE_READING_STOPPED; + set_reading_stopped(false); stream_->ReadStart(); } - if ((flags_ & SESSION_STATE_CLOSED) != 0) { + if (is_destroyed()) { HandleScope scope(env()->isolate()); MakeCallback(env()->ondone_string(), 0, nullptr); return; @@ -1454,7 +1453,7 @@ void Http2Session::OnStreamAfterWrite(WriteWrap* w, int status) { ConsumeHTTP2Data(); } - if (!(flags_ & SESSION_STATE_WRITE_SCHEDULED)) { + if (!is_write_scheduled()) { // Schedule a new write if nghttp2 wants to send data. MaybeScheduleWrite(); } @@ -1465,17 +1464,17 @@ void Http2Session::OnStreamAfterWrite(WriteWrap* w, int status) { // on the next iteration of the Node.js event loop (using the SetImmediate // queue), but only if a write has not already been scheduled. 
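MaybeScheduleWrite(), shown next, enforces that at most one write is scheduled at a time and defers the actual flush to the SetImmediate queue. As a rough standalone illustration of the same schedule-once coalescing (a plain callback queue stands in for SetImmediate, and all names here are hypothetical):

#include <cstdio>
#include <functional>
#include <queue>

class Writer {
 public:
  void QueueData(const char* chunk) {
    pending_.push(chunk);
    MaybeScheduleWrite();
  }
  void MaybeScheduleWrite() {
    if (write_scheduled_) return;      // a flush is already queued
    write_scheduled_ = true;
    tasks_.push([this] { Flush(); });
  }
  void RunScheduledTasks() {           // stands in for one event-loop turn
    while (!tasks_.empty()) {
      auto task = std::move(tasks_.front());
      tasks_.pop();
      task();
    }
  }

 private:
  void Flush() {
    write_scheduled_ = false;
    while (!pending_.empty()) {
      printf("write: %s\n", pending_.front());
      pending_.pop();
    }
  }
  bool write_scheduled_ = false;
  std::queue<const char*> pending_;
  std::queue<std::function<void()>> tasks_;
};

int main() {
  Writer w;
  w.QueueData("a");
  w.QueueData("b");       // coalesced: no second flush is scheduled
  w.RunScheduledTasks();  // both chunks go out in one flush
  return 0;
}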
void Http2Session::MaybeScheduleWrite() { - CHECK_EQ(flags_ & SESSION_STATE_WRITE_SCHEDULED, 0); + CHECK(!is_write_scheduled()); if (UNLIKELY(!session_)) return; if (nghttp2_session_want_write(session_.get())) { HandleScope handle_scope(env()->isolate()); Debug(this, "scheduling write"); - flags_ |= SESSION_STATE_WRITE_SCHEDULED; + set_write_scheduled(); BaseObjectPtr strong_ref{this}; env()->SetImmediate([this, strong_ref](Environment* env) { - if (!session_ || !(flags_ & SESSION_STATE_WRITE_SCHEDULED)) { + if (!session_ || !is_write_scheduled()) { // This can happen e.g. when a stream was reset before this turn // of the event loop, in which case SendPendingData() is called early, // or the session was destroyed in the meantime. @@ -1492,11 +1491,11 @@ void Http2Session::MaybeScheduleWrite() { } void Http2Session::MaybeStopReading() { - if (flags_ & SESSION_STATE_READING_STOPPED) return; + if (is_reading_stopped()) return; int want_read = nghttp2_session_want_read(session_.get()); Debug(this, "wants read? %d", want_read); - if (want_read == 0 || (flags_ & SESSION_STATE_WRITE_IN_PROGRESS)) { - flags_ |= SESSION_STATE_READING_STOPPED; + if (want_read == 0 || is_write_in_progress()) { + set_reading_stopped(); stream_->ReadStop(); } } @@ -1504,9 +1503,9 @@ void Http2Session::MaybeStopReading() { // Unset the sending state, finish up all current writes, and reset // storage for data and metadata that was associated with these writes. void Http2Session::ClearOutgoing(int status) { - CHECK_NE(flags_ & SESSION_STATE_SENDING, 0); + CHECK(is_sending()); - flags_ &= ~SESSION_STATE_SENDING; + set_sending(false); if (outgoing_buffers_.size() > 0) { outgoing_storage_.clear(); @@ -1534,8 +1533,8 @@ void Http2Session::ClearOutgoing(int status) { SendPendingData(); for (int32_t stream_id : current_pending_rst_streams) { - Http2Stream* stream = FindStream(stream_id); - if (LIKELY(stream != nullptr)) + BaseObjectPtr stream = FindStream(stream_id); + if (LIKELY(stream)) stream->FlushRstStream(); } } @@ -1573,15 +1572,15 @@ uint8_t Http2Session::SendPendingData() { // Do not attempt to send data on the socket if the destroying flag has // been set. That means everything is shutting down and the socket // will not be usable. - if (IsDestroyed()) + if (is_destroyed()) return 0; - flags_ &= ~SESSION_STATE_WRITE_SCHEDULED; + set_write_scheduled(false); // SendPendingData should not be called recursively. - if (flags_ & SESSION_STATE_SENDING) + if (is_sending()) return 1; // This is cleared by ClearOutgoing(). - flags_ |= SESSION_STATE_SENDING; + set_sending(); ssize_t src_length; const uint8_t* src; @@ -1635,11 +1634,11 @@ uint8_t Http2Session::SendPendingData() { chunks_sent_since_last_write_++; - CHECK_EQ(flags_ & SESSION_STATE_WRITE_IN_PROGRESS, 0); - flags_ |= SESSION_STATE_WRITE_IN_PROGRESS; + CHECK(!is_write_in_progress()); + set_write_in_progress(); StreamWriteResult res = underlying_stream()->Write(*bufs, count); if (!res.async) { - flags_ &= ~SESSION_STATE_WRITE_IN_PROGRESS; + set_write_in_progress(false); ClearOutgoing(res.err); } @@ -1661,7 +1660,8 @@ int Http2Session::OnSendData( nghttp2_data_source* source, void* user_data) { Http2Session* session = static_cast(user_data); - Http2Stream* stream = GetStream(session, frame->hd.stream_id, source); + BaseObjectPtr stream = session->FindStream(frame->hd.stream_id); + if (!stream) return 0; // Send the frame header + a byte that indicates padding length. 
session->CopyDataIntoOutgoing(framehd, 9); @@ -1707,7 +1707,7 @@ int Http2Session::OnSendData( // Creates a new Http2Stream and submits a new http2 request. Http2Stream* Http2Session::SubmitRequest( - nghttp2_priority_spec* prispec, + const Http2Priority& priority, const Http2Headers& headers, int32_t* ret, int options) { @@ -1717,7 +1717,7 @@ Http2Stream* Http2Session::SubmitRequest( Http2Stream::Provider::Stream prov(options); *ret = nghttp2_submit_request( session_.get(), - prispec, + &priority, headers.data(), headers.length(), *prov, @@ -1768,6 +1768,9 @@ void Http2Session::OnStreamRead(ssize_t nread, const uv_buf_t& buf_) { // The data in stream_buf_ is already accounted for, add nread received // bytes to session memory but remove the already processed // stream_buf_offset_ bytes. + // TODO(@jasnell): There are some cases where nread is < stream_buf_offset_ + // here but things still work. Those need to be investigated. + // CHECK_GE(nread, stream_buf_offset_); IncrementCurrentSessionMemory(nread - stream_buf_offset_); buf = std::move(new_buf); @@ -1846,7 +1849,7 @@ Http2Stream::Http2Stream(Http2Session* session, statistics_.start_time = uv_hrtime(); // Limit the number of header pairs - max_header_pairs_ = session->GetMaxHeaderPairs(); + max_header_pairs_ = session->max_header_pairs(); if (max_header_pairs_ == 0) { max_header_pairs_ = DEFAULT_MAX_HEADER_LIST_PAIRS; } @@ -1861,7 +1864,7 @@ Http2Stream::Http2Stream(Http2Session* session, MAX_MAX_HEADER_LIST_SIZE); if (options & STREAM_OPTION_GET_TRAILERS) - flags_ |= NGHTTP2_STREAM_FLAG_TRAILERS; + set_has_trailers(); PushStreamListener(&stream_listener_); @@ -1871,11 +1874,12 @@ Http2Stream::Http2Stream(Http2Session* session, } Http2Stream::~Http2Stream() { - if (!session_) - return; Debug(this, "tearing down stream"); - session_->DecrementCurrentSessionMemory(current_headers_length_); - session_->RemoveStream(this); +} + +void Http2Stream::MemoryInfo(MemoryTracker* tracker) const { + tracker->TrackField("current_headers", current_headers_); + tracker->TrackField("queue", queue_); } std::string Http2Stream::diagnostic_name() const { @@ -1887,7 +1891,7 @@ std::string Http2Stream::diagnostic_name() const { // Notify the Http2Stream that a new block of HEADERS is being processed. 
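A recurring change in these hunks is that direct bit manipulation of flags_ (for example flags_ |= NGHTTP2_STREAM_FLAG_TRAILERS) is replaced by named helpers such as set_has_trailers() and is_destroyed(), backed by the kStreamState* and kSessionState* constants added to node_http2.h further down. A minimal standalone sketch of that accessor-over-bitfield pattern, using made-up flag names rather than the real ones:

#include <cstdio>

constexpr int kStateNone = 0x0;
constexpr int kStateClosed = 0x1;
constexpr int kStateDestroyed = 0x2;
constexpr int kStateTrailers = 0x4;

class StreamState {
 public:
  bool is_closed() const { return flags_ & kStateClosed; }
  bool is_destroyed() const { return flags_ & kStateDestroyed; }
  bool has_trailers() const { return flags_ & kStateTrailers; }

  // The set_* helpers take an optional bool so one call site can both set
  // and clear a flag, mirroring calls such as set_has_trailers(false).
  void set_closed(bool on = true) { Toggle(kStateClosed, on); }
  void set_destroyed(bool on = true) { Toggle(kStateDestroyed, on); }
  void set_has_trailers(bool on = true) { Toggle(kStateTrailers, on); }

 private:
  void Toggle(int flag, bool on) {
    if (on) flags_ |= flag; else flags_ &= ~flag;
  }
  int flags_ = kStateNone;
};

int main() {
  StreamState s;
  s.set_has_trailers();
  s.set_has_trailers(false);
  printf("trailers? %d destroyed? %d\n", s.has_trailers(), s.is_destroyed());
  return 0;
}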
void Http2Stream::StartHeaders(nghttp2_headers_category category) { Debug(this, "starting headers, category: %d", category); - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); session_->DecrementCurrentSessionMemory(current_headers_length_); current_headers_length_ = 0; current_headers_.clear(); @@ -1895,13 +1899,15 @@ void Http2Stream::StartHeaders(nghttp2_headers_category category) { } -nghttp2_stream* Http2Stream::operator*() { - return nghttp2_session_find_stream(**session_, id_); +nghttp2_stream* Http2Stream::operator*() const { return stream(); } + +nghttp2_stream* Http2Stream::stream() const { + return nghttp2_session_find_stream(session_->session(), id_); } void Http2Stream::Close(int32_t code) { - CHECK(!this->IsDestroyed()); - flags_ |= NGHTTP2_STREAM_FLAG_CLOSED; + CHECK(!this->is_destroyed()); + set_closed(); code_ = code; Debug(this, "closed with code %d", code); } @@ -1913,14 +1919,15 @@ ShutdownWrap* Http2Stream::CreateShutdownWrap(v8::Local object) { } int Http2Stream::DoShutdown(ShutdownWrap* req_wrap) { - if (IsDestroyed()) + if (is_destroyed()) return UV_EPIPE; { Http2Scope h2scope(this); - flags_ |= NGHTTP2_STREAM_FLAG_SHUT; - CHECK_NE(nghttp2_session_resume_data(**session_, id_), - NGHTTP2_ERR_NOMEM); + set_not_writable(); + CHECK_NE(nghttp2_session_resume_data( + session_->session(), id_), + NGHTTP2_ERR_NOMEM); Debug(this, "writable side shutdown"); } return 1; @@ -1931,36 +1938,40 @@ int Http2Stream::DoShutdown(ShutdownWrap* req_wrap) { // using the SetImmediate queue. void Http2Stream::Destroy() { // Do nothing if this stream instance is already destroyed - if (IsDestroyed()) + if (is_destroyed()) return; - if (session_->HasPendingRstStream(id_)) + if (session_->has_pending_rststream(id_)) FlushRstStream(); - flags_ |= NGHTTP2_STREAM_FLAG_DESTROYED; + set_destroyed(); Debug(this, "destroying stream"); // Wait until the start of the next loop to delete because there // may still be some pending operations queued for this stream. - BaseObjectPtr strong_ref{this}; - env()->SetImmediate([this, strong_ref](Environment* env) { - // Free any remaining outgoing data chunks here. This should be done - // here because it's possible for destroy to have been called while - // we still have queued outbound writes. - while (!queue_.empty()) { - NgHttp2StreamWrite& head = queue_.front(); - if (head.req_wrap != nullptr) - head.req_wrap->Done(UV_ECANCELED); - queue_.pop(); - } + BaseObjectPtr strong_ref = session_->RemoveStream(id_); + if (strong_ref) { + env()->SetImmediate([this, strong_ref = std::move(strong_ref)]( + Environment* env) { + // Free any remaining outgoing data chunks here. This should be done + // here because it's possible for destroy to have been called while + // we still have queued outbound writes. + while (!queue_.empty()) { + NgHttp2StreamWrite& head = queue_.front(); + if (head.req_wrap != nullptr) + head.req_wrap->Done(UV_ECANCELED); + queue_.pop(); + } - // We can destroy the stream now if there are no writes for it - // already on the socket. Otherwise, we'll wait for the garbage collector - // to take care of cleaning up. - if (session() == nullptr || !session()->HasWritesOnSocketForStream(this)) { - // Delete once strong_ref goes out of scope. - Detach(); - } - }); + // We can destroy the stream now if there are no writes for it + // already on the socket. Otherwise, we'll wait for the garbage collector + // to take care of cleaning up. 
+ if (session() == nullptr || + !session()->HasWritesOnSocketForStream(this)) { + // Delete once strong_ref goes out of scope. + Detach(); + } + }); + } statistics_.end_time = uv_hrtime(); session_->statistics_.stream_average_duration = @@ -1973,18 +1984,18 @@ void Http2Stream::Destroy() { // Initiates a response on the Http2Stream using data provided via the // StreamBase Streams API. int Http2Stream::SubmitResponse(const Http2Headers& headers, int options) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); Http2Scope h2scope(this); Debug(this, "submitting response"); if (options & STREAM_OPTION_GET_TRAILERS) - flags_ |= NGHTTP2_STREAM_FLAG_TRAILERS; + set_has_trailers(); - if (!IsWritable()) + if (!is_writable()) options |= STREAM_OPTION_EMPTY_PAYLOAD; Http2Stream::Provider::Stream prov(this, options); int ret = nghttp2_submit_response( - **session_, + session_->session(), id_, headers.data(), headers.length(), @@ -1996,11 +2007,11 @@ int Http2Stream::SubmitResponse(const Http2Headers& headers, int options) { // Submit informational headers for a stream. int Http2Stream::SubmitInfo(const Http2Headers& headers) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); Http2Scope h2scope(this); Debug(this, "sending %d informational headers", headers.length()); int ret = nghttp2_submit_headers( - **session_, + session_->session(), NGHTTP2_FLAG_NONE, id_, nullptr, @@ -2013,18 +2024,18 @@ int Http2Stream::SubmitInfo(const Http2Headers& headers) { void Http2Stream::OnTrailers() { Debug(this, "let javascript know we are ready for trailers"); - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); Isolate* isolate = env()->isolate(); HandleScope scope(isolate); Local context = env()->context(); Context::Scope context_scope(context); - flags_ &= ~NGHTTP2_STREAM_FLAG_TRAILERS; + set_has_trailers(false); MakeCallback(env()->http2session_on_stream_trailers_function(), 0, nullptr); } // Submit informational headers for a stream. int Http2Stream::SubmitTrailers(const Http2Headers& headers) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); Http2Scope h2scope(this); Debug(this, "sending %d trailers", headers.length()); int ret; @@ -2033,10 +2044,14 @@ int Http2Stream::SubmitTrailers(const Http2Headers& headers) { // to indicate that the stream is ready to be closed. if (headers.length() == 0) { Http2Stream::Provider::Stream prov(this, 0); - ret = nghttp2_submit_data(**session_, NGHTTP2_FLAG_END_STREAM, id_, *prov); + ret = nghttp2_submit_data( + session_->session(), + NGHTTP2_FLAG_END_STREAM, + id_, + *prov); } else { ret = nghttp2_submit_trailer( - **session_, + session_->session(), id_, headers.data(), headers.length()); @@ -2046,17 +2061,20 @@ int Http2Stream::SubmitTrailers(const Http2Headers& headers) { } // Submit a PRIORITY frame to the connected peer. -int Http2Stream::SubmitPriority(nghttp2_priority_spec* prispec, +int Http2Stream::SubmitPriority(const Http2Priority& priority, bool silent) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); Http2Scope h2scope(this); Debug(this, "sending priority spec"); int ret = silent ? 
- nghttp2_session_change_stream_priority(**session_, - id_, prispec) : - nghttp2_submit_priority(**session_, - NGHTTP2_FLAG_NONE, - id_, prispec); + nghttp2_session_change_stream_priority( + session_->session(), + id_, + &priority) : + nghttp2_submit_priority( + session_->session(), + NGHTTP2_FLAG_NONE, + id_, &priority); CHECK_NE(ret, NGHTTP2_ERR_NOMEM); return ret; } @@ -2064,7 +2082,7 @@ int Http2Stream::SubmitPriority(nghttp2_priority_spec* prispec, // Closes the Http2Stream by submitting an RST_STREAM frame to the connected // peer. void Http2Stream::SubmitRstStream(const uint32_t code) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); code_ = code; // If possible, force a purge of any currently pending data here to make sure // it is sent before closing the stream. If it returns non-zero then we need @@ -2079,11 +2097,14 @@ void Http2Stream::SubmitRstStream(const uint32_t code) { } void Http2Stream::FlushRstStream() { - if (IsDestroyed()) + if (is_destroyed()) return; Http2Scope h2scope(this); - CHECK_EQ(nghttp2_submit_rst_stream(**session_, NGHTTP2_FLAG_NONE, - id_, code_), 0); + CHECK_EQ(nghttp2_submit_rst_stream( + session_->session(), + NGHTTP2_FLAG_NONE, + id_, + code_), 0); } @@ -2091,11 +2112,11 @@ void Http2Stream::FlushRstStream() { Http2Stream* Http2Stream::SubmitPushPromise(const Http2Headers& headers, int32_t* ret, int options) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); Http2Scope h2scope(this); Debug(this, "sending push promise"); *ret = nghttp2_submit_push_promise( - **session_, + session_->session(), NGHTTP2_FLAG_NONE, id_, headers.data(), @@ -2115,17 +2136,17 @@ Http2Stream* Http2Stream::SubmitPushPromise(const Http2Headers& headers, // out to JS land. int Http2Stream::ReadStart() { Http2Scope h2scope(this); - CHECK(!this->IsDestroyed()); - flags_ |= NGHTTP2_STREAM_FLAG_READ_START; - flags_ &= ~NGHTTP2_STREAM_FLAG_READ_PAUSED; + CHECK(!this->is_destroyed()); + set_reading(); Debug(this, "reading starting"); // Tell nghttp2 about our consumption of the data that was handed // off to JS land. - nghttp2_session_consume_stream(**session_, - id_, - inbound_consumed_data_while_paused_); + nghttp2_session_consume_stream( + session_->session(), + id_, + inbound_consumed_data_while_paused_); inbound_consumed_data_while_paused_ = 0; return 0; @@ -2133,10 +2154,10 @@ int Http2Stream::ReadStart() { // Switch the StreamBase into paused mode. int Http2Stream::ReadStop() { - CHECK(!this->IsDestroyed()); - if (!IsReading()) + CHECK(!this->is_destroyed()); + if (!is_reading()) return 0; - flags_ |= NGHTTP2_STREAM_FLAG_READ_PAUSED; + set_paused(); Debug(this, "reading stopped"); return 0; } @@ -2157,7 +2178,7 @@ int Http2Stream::DoWrite(WriteWrap* req_wrap, uv_stream_t* send_handle) { CHECK_NULL(send_handle); Http2Scope h2scope(this); - if (!IsWritable() || IsDestroyed()) { + if (!is_writable() || is_destroyed()) { req_wrap->Done(UV_EOF); return 0; } @@ -2171,7 +2192,9 @@ int Http2Stream::DoWrite(WriteWrap* req_wrap, }); IncrementAvailableOutboundLength(bufs[i].len); } - CHECK_NE(nghttp2_session_resume_data(**session_, id_), NGHTTP2_ERR_NOMEM); + CHECK_NE(nghttp2_session_resume_data( + session_->session(), + id_), NGHTTP2_ERR_NOMEM); return 0; } @@ -2183,7 +2206,7 @@ int Http2Stream::DoWrite(WriteWrap* req_wrap, bool Http2Stream::AddHeader(nghttp2_rcbuf* name, nghttp2_rcbuf* value, uint8_t flags) { - CHECK(!this->IsDestroyed()); + CHECK(!this->is_destroyed()); if (Http2RcBufferPointer::IsZeroLength(name)) return true; // Ignore empty headers. 
@@ -2192,7 +2215,7 @@ bool Http2Stream::AddHeader(nghttp2_rcbuf* name, size_t length = header.length() + 32; // A header can only be added if we have not exceeded the maximum number // of headers and the session has memory available for it. - if (!session_->IsAvailableSessionMemory(length) || + if (!session_->has_available_session_memory(length) || current_headers_.size() == max_header_pairs_ || current_headers_length_ + length > max_header_length_) { return false; @@ -2210,7 +2233,7 @@ bool Http2Stream::AddHeader(nghttp2_rcbuf* name, // A Provider is the thing that provides outbound DATA frame data. Http2Stream::Provider::Provider(Http2Stream* stream, int options) { - CHECK(!stream->IsDestroyed()); + CHECK(!stream->is_destroyed()); provider_.source.ptr = stream; empty_ = options & STREAM_OPTION_EMPTY_PAYLOAD; } @@ -2245,7 +2268,8 @@ ssize_t Http2Stream::Provider::Stream::OnRead(nghttp2_session* handle, void* user_data) { Http2Session* session = static_cast(user_data); Debug(session, "reading outbound data for stream %d", id); - Http2Stream* stream = GetStream(session, id, source); + BaseObjectPtr stream = session->FindStream(id); + if (!stream) return 0; if (stream->statistics_.first_byte_sent == 0) stream->statistics_.first_byte_sent = uv_hrtime(); CHECK_EQ(id, stream->id()); @@ -2275,21 +2299,21 @@ ssize_t Http2Stream::Provider::Stream::OnRead(nghttp2_session* handle, } } - if (amount == 0 && stream->IsWritable()) { + if (amount == 0 && stream->is_writable()) { CHECK(stream->queue_.empty()); Debug(session, "deferring stream %d", id); stream->EmitWantsWrite(length); - if (stream->available_outbound_length_ > 0 || !stream->IsWritable()) { + if (stream->available_outbound_length_ > 0 || !stream->is_writable()) { // EmitWantsWrite() did something interesting synchronously, restart: return OnRead(handle, id, buf, length, flags, source, user_data); } return NGHTTP2_ERR_DEFERRED; } - if (stream->queue_.empty() && !stream->IsWritable()) { + if (stream->queue_.empty() && !stream->is_writable()) { Debug(session, "no more data for stream %d", id); *flags |= NGHTTP2_DATA_FLAG_EOF; - if (stream->HasTrailers()) { + if (stream->has_trailers()) { *flags |= NGHTTP2_DATA_FLAG_NO_END_STREAM; stream->OnTrailers(); } @@ -2299,12 +2323,12 @@ ssize_t Http2Stream::Provider::Stream::OnRead(nghttp2_session* handle, return amount; } -inline void Http2Stream::IncrementAvailableOutboundLength(size_t amount) { +void Http2Stream::IncrementAvailableOutboundLength(size_t amount) { available_outbound_length_ += amount; session_->IncrementCurrentSessionMemory(amount); } -inline void Http2Stream::DecrementAvailableOutboundLength(size_t amount) { +void Http2Stream::DecrementAvailableOutboundLength(size_t amount) { available_outbound_length_ -= amount; session_->DecrementCurrentSessionMemory(amount); } @@ -2318,10 +2342,9 @@ void HttpErrorString(const FunctionCallbackInfo& args) { Environment* env = Environment::GetCurrent(args); uint32_t val = args[0]->Uint32Value(env->context()).ToChecked(); args.GetReturnValue().Set( - String::NewFromOneByte( + OneByteString( env->isolate(), - reinterpret_cast(nghttp2_strerror(val)), - NewStringType::kInternalized).ToLocalChecked()); + reinterpret_cast(nghttp2_strerror(val)))); } @@ -2330,16 +2353,7 @@ void HttpErrorString(const FunctionCallbackInfo& args) { // output for an HTTP2-Settings header field. 
void PackSettings(const FunctionCallbackInfo& args) { Http2State* state = Unwrap(args.Data()); - Environment* env = state->env(); - // TODO(addaleax): We should not be creating a full AsyncWrap for this. - Local obj; - if (!env->http2settings_constructor_template() - ->NewInstance(env->context()) - .ToLocal(&obj)) { - return; - } - Http2Session::Http2Settings settings(state, nullptr, obj); - args.GetReturnValue().Set(settings.Pack()); + args.GetReturnValue().Set(Http2Settings::Pack(state)); } // A TypedArray instance is shared between C++ and JS land to contain the @@ -2347,7 +2361,7 @@ void PackSettings(const FunctionCallbackInfo& args) { // default values. void RefreshDefaultSettings(const FunctionCallbackInfo& args) { Http2State* state = Unwrap(args.Data()); - Http2Session::Http2Settings::RefreshDefaults(state); + Http2Settings::RefreshDefaults(state); } // Sets the next stream ID the Http2Session. If successful, returns true. @@ -2356,7 +2370,7 @@ void Http2Session::SetNextStreamID(const FunctionCallbackInfo& args) { Http2Session* session; ASSIGN_OR_RETURN_UNWRAP(&session, args.Holder()); int32_t id = args[0]->Int32Value(env->context()).ToChecked(); - if (nghttp2_session_set_next_stream_id(**session, id) < 0) { + if (nghttp2_session_set_next_stream_id(session->session(), id) < 0) { Debug(session, "failed to set next stream id to %d", id); return args.GetReturnValue().Set(false); } @@ -2385,7 +2399,7 @@ void Http2Session::RefreshState(const FunctionCallbackInfo& args) { AliasedFloat64Array& buffer = session->http2_state()->session_state_buffer; - nghttp2_session* s = **session; + nghttp2_session* s = session->session(); buffer[IDX_SESSION_STATE_EFFECTIVE_LOCAL_WINDOW_SIZE] = nghttp2_session_get_effective_local_window_size(s); @@ -2413,8 +2427,9 @@ void Http2Session::New(const FunctionCallbackInfo& args) { Http2State* state = Unwrap(args.Data()); Environment* env = state->env(); CHECK(args.IsConstructCall()); - int32_t val = args[0]->Int32Value(env->context()).ToChecked(); - nghttp2_session_type type = static_cast(val); + SessionType type = + static_cast( + args[0]->Int32Value(env->context()).ToChecked()); Http2Session* session = new Http2Session(state, args.This(), type); session->get_async_id(); // avoid compiler warning Debug(session, "session created"); @@ -2450,14 +2465,13 @@ void Http2Session::Request(const FunctionCallbackInfo& args) { Local headers = args[0].As(); int32_t options = args[1]->Int32Value(env->context()).ToChecked(); - Http2Priority priority(env, args[2], args[3], args[4]); Debug(session, "request submitted"); int32_t ret = 0; Http2Stream* stream = session->Http2Session::SubmitRequest( - &priority, + Http2Priority(env, args[2], args[3], args[4]), Http2Headers(env, headers), &ret, static_cast(options)); @@ -2478,7 +2492,7 @@ void Http2Session::Goaway(uint32_t code, int32_t lastStreamID, const uint8_t* data, size_t len) { - if (IsDestroyed()) + if (is_destroyed()) return; Http2Scope h2scope(this); @@ -2629,10 +2643,9 @@ void Http2Stream::Priority(const FunctionCallbackInfo& args) { Http2Stream* stream; ASSIGN_OR_RETURN_UNWRAP(&stream, args.Holder()); - Http2Priority priority(env, args[0], args[1], args[2]); - bool silent = args[3]->IsTrue(); - - CHECK_EQ(stream->SubmitPriority(&priority, silent), 0); + CHECK_EQ(stream->SubmitPriority( + Http2Priority(env, args[0], args[1], args[2]), + args[3]->IsTrue()), 0); Debug(stream, "priority submitted"); } @@ -2649,8 +2662,8 @@ void Http2Stream::RefreshState(const FunctionCallbackInfo& args) { AliasedFloat64Array& buffer = 
stream->session()->http2_state()->stream_state_buffer; - nghttp2_stream* str = **stream; - nghttp2_session* s = **(stream->session()); + nghttp2_stream* str = stream->stream(); + nghttp2_session* s = stream->session()->session(); if (str == nullptr) { buffer[IDX_STREAM_STATE] = NGHTTP2_STREAM_STATE_IDLE; @@ -2685,12 +2698,13 @@ void Http2Session::AltSvc(int32_t id, origin, origin_len, value, value_len), 0); } -void Http2Session::Origin(nghttp2_origin_entry* ov, size_t count) { +void Http2Session::Origin(const Origins& origins) { Http2Scope h2scope(this); CHECK_EQ(nghttp2_submit_origin( session_.get(), NGHTTP2_FLAG_NONE, - ov, count), 0); + *origins, + origins.length()), 0); } // Submits an AltSvc frame to be sent to the connected peer. @@ -2705,6 +2719,9 @@ void Http2Session::AltSvc(const FunctionCallbackInfo& args) { Local origin_str = args[1]->ToString(env->context()).ToLocalChecked(); Local value_str = args[2]->ToString(env->context()).ToLocalChecked(); + if (origin_str.IsEmpty() || value_str.IsEmpty()) + return; + size_t origin_len = origin_str->Length(); size_t value_len = value_str->Length(); @@ -2728,15 +2745,9 @@ void Http2Session::Origin(const FunctionCallbackInfo& args) { ASSIGN_OR_RETURN_UNWRAP(&session, args.Holder()); Local origin_string = args[0].As(); - int32_t count = args[1]->Int32Value(context).ToChecked(); - + size_t count = args[1]->Int32Value(context).ToChecked(); - Origins origins(env->isolate(), - env->context(), - origin_string, - static_cast(count)); - - session->Origin(*origins, origins.length()); + session->Origin(Origins(env, origin_string, count)); } // Submits a PING frame to be sent to the connected peer. @@ -2753,28 +2764,9 @@ void Http2Session::Ping(const FunctionCallbackInfo& args) { CHECK_EQ(payload.length(), 8); } - Local obj; - if (!env->http2ping_constructor_template() - ->NewInstance(env->context()) - .ToLocal(&obj)) { - return; - } - if (obj->Set(env->context(), env->ondone_string(), args[1]).IsNothing()) - return; - - Http2Ping* ping = session->AddPing( - MakeDetachedBaseObject(session, obj)); - // To prevent abuse, we strictly limit the number of unacknowledged PING - // frames that may be sent at any given time. This is configurable in the - // Options when creating a Http2Session. - if (ping == nullptr) return args.GetReturnValue().Set(false); - - // The Ping itself is an Async resource. When the acknowledgement is received, - // the callback will be invoked and a notification sent out to JS land. The - // notification will include the duration of the ping, allowing the round - // trip to be measured. 
- ping->Send(payload.data()); - args.GetReturnValue().Set(true); + CHECK(args[1]->IsFunction()); + args.GetReturnValue().Set( + session->AddPing(payload.data(), args[1].As())); } // Submits a SETTINGS frame for the Http2Session @@ -2782,26 +2774,11 @@ void Http2Session::Settings(const FunctionCallbackInfo& args) { Environment* env = Environment::GetCurrent(args); Http2Session* session; ASSIGN_OR_RETURN_UNWRAP(&session, args.Holder()); - - Local obj; - if (!env->http2settings_constructor_template() - ->NewInstance(env->context()) - .ToLocal(&obj)) { - return; - } - if (obj->Set(env->context(), env->ondone_string(), args[0]).IsNothing()) - return; - - Http2Settings* settings = session->AddSettings( - MakeDetachedBaseObject( - session->http2_state(), session, obj, 0)); - if (settings == nullptr) return args.GetReturnValue().Set(false); - - settings->Send(); - args.GetReturnValue().Set(true); + CHECK(args[0]->IsFunction()); + args.GetReturnValue().Set(session->AddSettings(args[0].As())); } -BaseObjectPtr Http2Session::PopPing() { +BaseObjectPtr Http2Session::PopPing() { BaseObjectPtr ping; if (!outstanding_pings_.empty()) { ping = std::move(outstanding_pings_.front()); @@ -2811,19 +2788,36 @@ BaseObjectPtr Http2Session::PopPing() { return ping; } -Http2Session::Http2Ping* Http2Session::AddPing( - BaseObjectPtr ping) { +bool Http2Session::AddPing(const uint8_t* payload, Local callback) { + Local obj; + if (!env()->http2ping_constructor_template() + ->NewInstance(env()->context()) + .ToLocal(&obj)) { + return false; + } + + BaseObjectPtr ping = + MakeDetachedBaseObject(this, obj, callback); + if (!ping) + return false; + if (outstanding_pings_.size() == max_outstanding_pings_) { ping->Done(false); - return nullptr; + return false; } - Http2Ping* ptr = ping.get(); - outstanding_pings_.emplace(std::move(ping)); + IncrementCurrentSessionMemory(sizeof(*ping)); - return ptr; + // The Ping itself is an Async resource. When the acknowledgement is received, + // the callback will be invoked and a notification sent out to JS land. The + // notification will include the duration of the ping, allowing the round + // trip to be measured. 
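The duration reported to the callback is simply the elapsed time between sending the PING and receiving its acknowledgement, measured from startTime_ (captured with uv_hrtime() when the Http2Ping is constructed). A standalone sketch of that measurement, with std::chrono standing in for uv_hrtime() and hypothetical class and method names:

#include <chrono>
#include <cstdio>
#include <thread>

class PingTimer {
 public:
  void Send() { start_ = std::chrono::steady_clock::now(); }
  double DoneMs() const {
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start_).count();
  }

 private:
  std::chrono::steady_clock::time_point start_;
};

int main() {
  PingTimer ping;
  ping.Send();
  std::this_thread::sleep_for(std::chrono::milliseconds(5));  // pretend ack latency
  printf("ping rtt: %.3f ms\n", ping.DoneMs());
  return 0;
}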
+ ping->Send(payload); + + outstanding_pings_.emplace(std::move(ping)); + return true; } -BaseObjectPtr Http2Session::PopSettings() { +BaseObjectPtr Http2Session::PopSettings() { BaseObjectPtr settings; if (!outstanding_settings_.empty()) { settings = std::move(outstanding_settings_.front()); @@ -2833,60 +2827,88 @@ BaseObjectPtr Http2Session::PopSettings() { return settings; } -Http2Session::Http2Settings* Http2Session::AddSettings( - BaseObjectPtr settings) { +bool Http2Session::AddSettings(Local callback) { + Local obj; + if (!env()->http2settings_constructor_template() + ->NewInstance(env()->context()) + .ToLocal(&obj)) { + return false; + } + + BaseObjectPtr settings = + MakeDetachedBaseObject(this, obj, callback, 0); + if (!settings) + return false; + if (outstanding_settings_.size() == max_outstanding_settings_) { settings->Done(false); - return nullptr; + return false; } - Http2Settings* ptr = settings.get(); - outstanding_settings_.emplace(std::move(settings)); + IncrementCurrentSessionMemory(sizeof(*settings)); - return ptr; + settings->Send(); + outstanding_settings_.emplace(std::move(settings)); + return true; } -Http2Session::Http2Ping::Http2Ping(Http2Session* session, Local obj) +Http2Ping::Http2Ping( + Http2Session* session, + Local obj, + Local callback) : AsyncWrap(session->env(), obj, AsyncWrap::PROVIDER_HTTP2PING), session_(session), startTime_(uv_hrtime()) { + callback_.Reset(env()->isolate(), callback); +} + +void Http2Ping::MemoryInfo(MemoryTracker* tracker) const { + tracker->TrackField("callback", callback_); +} + +Local Http2Ping::callback() const { + return callback_.Get(env()->isolate()); } -void Http2Session::Http2Ping::Send(const uint8_t* payload) { - CHECK_NOT_NULL(session_); +void Http2Ping::Send(const uint8_t* payload) { + CHECK(session_); uint8_t data[8]; if (payload == nullptr) { memcpy(&data, &startTime_, arraysize(data)); payload = data; } - Http2Scope h2scope(session_); - CHECK_EQ(nghttp2_submit_ping(**session_, NGHTTP2_FLAG_NONE, payload), 0); + Http2Scope h2scope(session_.get()); + CHECK_EQ(nghttp2_submit_ping( + session_->session(), + NGHTTP2_FLAG_NONE, + payload), 0); } -void Http2Session::Http2Ping::Done(bool ack, const uint8_t* payload) { +void Http2Ping::Done(bool ack, const uint8_t* payload) { uint64_t duration_ns = uv_hrtime() - startTime_; double duration_ms = duration_ns / 1e6; - if (session_ != nullptr) session_->statistics_.ping_rtt = duration_ns; + if (session_) session_->statistics_.ping_rtt = duration_ns; - HandleScope handle_scope(env()->isolate()); + Isolate* isolate = env()->isolate(); + HandleScope handle_scope(isolate); Context::Scope context_scope(env()->context()); - Local buf = Undefined(env()->isolate()); + Local buf = Undefined(isolate); if (payload != nullptr) { - buf = Buffer::Copy(env()->isolate(), + buf = Buffer::Copy(isolate, reinterpret_cast(payload), 8).ToLocalChecked(); } Local argv[] = { - Boolean::New(env()->isolate(), ack), - Number::New(env()->isolate(), duration_ms), + ack ? 
v8::True(isolate) : v8::False(isolate),
+ Number::New(isolate, duration_ms),
buf };
- MakeCallback(env()->ondone_string(), arraysize(argv), argv);
+ MakeCallback(callback(), arraysize(argv), argv);
}
-void Http2Session::Http2Ping::DetachFromSession() {
- session_ = nullptr;
+void Http2Ping::DetachFromSession() {
+ session_.reset();
}
void NgHttp2StreamWrite::MemoryInfo(MemoryTracker* tracker) const {
@@ -2970,6 +2992,9 @@ void Initialize(Local target,
// Method to fetch the nghttp2 string description of an nghttp2 error code
env->SetMethod(target, "nghttp2ErrorString", HttpErrorString);
+ env->SetMethod(target, "refreshDefaultSettings", RefreshDefaultSettings);
+ env->SetMethod(target, "packSettings", PackSettings);
+ env->SetMethod(target, "setCallbackFunctions", SetCallbackFunctions);
Local http2SessionClassName = FIXED_ONE_BYTE_STRING(isolate, "Http2Session");
@@ -2978,7 +3003,7 @@ void Initialize(Local target,
ping->SetClassName(FIXED_ONE_BYTE_STRING(env->isolate(), "Http2Ping"));
ping->Inherit(AsyncWrap::GetConstructorTemplate(env));
Local pingt = ping->InstanceTemplate();
- pingt->SetInternalFieldCount(Http2Session::Http2Ping::kInternalFieldCount);
+ pingt->SetInternalFieldCount(Http2Ping::kInternalFieldCount);
env->set_http2ping_constructor_template(pingt);
Local setting = FunctionTemplate::New(env->isolate());
@@ -3038,113 +3063,52 @@ void Initialize(Local target,
session->GetFunction(env->context()).ToLocalChecked()).Check();
Local constants = Object::New(isolate);
- Local name_for_error_code = Array::New(isolate);
-
-#define NODE_NGHTTP2_ERROR_CODES(V) \
- V(NGHTTP2_SESSION_SERVER); \
- V(NGHTTP2_SESSION_CLIENT); \
- V(NGHTTP2_STREAM_STATE_IDLE); \
- V(NGHTTP2_STREAM_STATE_OPEN); \
- V(NGHTTP2_STREAM_STATE_RESERVED_LOCAL); \
- V(NGHTTP2_STREAM_STATE_RESERVED_REMOTE); \
- V(NGHTTP2_STREAM_STATE_HALF_CLOSED_LOCAL); \
- V(NGHTTP2_STREAM_STATE_HALF_CLOSED_REMOTE); \
- V(NGHTTP2_STREAM_STATE_CLOSED); \
- V(NGHTTP2_NO_ERROR); \
- V(NGHTTP2_PROTOCOL_ERROR); \
- V(NGHTTP2_INTERNAL_ERROR); \
- V(NGHTTP2_FLOW_CONTROL_ERROR); \
- V(NGHTTP2_SETTINGS_TIMEOUT); \
- V(NGHTTP2_STREAM_CLOSED); \
- V(NGHTTP2_FRAME_SIZE_ERROR); \
- V(NGHTTP2_REFUSED_STREAM); \
- V(NGHTTP2_CANCEL); \
- V(NGHTTP2_COMPRESSION_ERROR); \
- V(NGHTTP2_CONNECT_ERROR); \
- V(NGHTTP2_ENHANCE_YOUR_CALM); \
- V(NGHTTP2_INADEQUATE_SECURITY); \
- V(NGHTTP2_HTTP_1_1_REQUIRED); \
-
-#define V(name) \
- NODE_DEFINE_CONSTANT(constants, name); \
- name_for_error_code->Set(env->context(), \
- static_cast(name), \
- FIXED_ONE_BYTE_STRING(isolate, \
- #name)).Check();
- NODE_NGHTTP2_ERROR_CODES(V)
+
+ // This does allocate one more slot than needed, but it's not used.
+#define V(name) FIXED_ONE_BYTE_STRING(isolate, #name), + Local error_code_names[] = { + HTTP2_ERROR_CODES(V) + }; #undef V - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_HCAT_REQUEST); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_HCAT_RESPONSE); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_HCAT_PUSH_RESPONSE); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_HCAT_HEADERS); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_NV_FLAG_NONE); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_NV_FLAG_NO_INDEX); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_ERR_DEFERRED); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_ERR_STREAM_ID_NOT_AVAILABLE); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_ERR_INVALID_ARGUMENT); - NODE_DEFINE_HIDDEN_CONSTANT(constants, NGHTTP2_ERR_STREAM_CLOSED); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_ERR_FRAME_SIZE_ERROR); - - NODE_DEFINE_HIDDEN_CONSTANT(constants, STREAM_OPTION_EMPTY_PAYLOAD); - NODE_DEFINE_HIDDEN_CONSTANT(constants, STREAM_OPTION_GET_TRAILERS); - - NODE_DEFINE_CONSTANT(constants, NGHTTP2_FLAG_NONE); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_FLAG_END_STREAM); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_FLAG_END_HEADERS); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_FLAG_ACK); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_FLAG_PADDED); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_FLAG_PRIORITY); - - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_HEADER_TABLE_SIZE); - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_ENABLE_PUSH); - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_MAX_CONCURRENT_STREAMS); - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_INITIAL_WINDOW_SIZE); - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_MAX_FRAME_SIZE); - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_MAX_HEADER_LIST_SIZE); - NODE_DEFINE_CONSTANT(constants, DEFAULT_SETTINGS_ENABLE_CONNECT_PROTOCOL); - NODE_DEFINE_CONSTANT(constants, MAX_MAX_FRAME_SIZE); - NODE_DEFINE_CONSTANT(constants, MIN_MAX_FRAME_SIZE); - NODE_DEFINE_CONSTANT(constants, MAX_INITIAL_WINDOW_SIZE); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_DEFAULT_WEIGHT); + Local name_for_error_code = + Array::New( + isolate, + error_code_names, + arraysize(error_code_names)); + + target->Set(context, + FIXED_ONE_BYTE_STRING(isolate, "nameForErrorCode"), + name_for_error_code).Check(); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_HEADER_TABLE_SIZE); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_ENABLE_PUSH); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_MAX_CONCURRENT_STREAMS); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_INITIAL_WINDOW_SIZE); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_MAX_FRAME_SIZE); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_MAX_HEADER_LIST_SIZE); - NODE_DEFINE_CONSTANT(constants, NGHTTP2_SETTINGS_ENABLE_CONNECT_PROTOCOL); +#define V(constant) NODE_DEFINE_HIDDEN_CONSTANT(constants, constant); + HTTP2_HIDDEN_CONSTANTS(V) +#undef V - NODE_DEFINE_CONSTANT(constants, PADDING_STRATEGY_NONE); - NODE_DEFINE_CONSTANT(constants, PADDING_STRATEGY_ALIGNED); - NODE_DEFINE_CONSTANT(constants, PADDING_STRATEGY_MAX); - NODE_DEFINE_CONSTANT(constants, PADDING_STRATEGY_CALLBACK); +#define V(constant) NODE_DEFINE_CONSTANT(constants, constant); + HTTP2_CONSTANTS(V) +#undef V -#define STRING_CONSTANT(NAME, VALUE) \ + // NGHTTP2_DEFAULT_WEIGHT is a macro and not a regular define + // it won't be set properly on the constants object if included + // in the HTTP2_CONSTANTS macro. 
+ NODE_DEFINE_CONSTANT(constants, NGHTTP2_DEFAULT_WEIGHT); + +#define V(NAME, VALUE) \ NODE_DEFINE_STRING_CONSTANT(constants, "HTTP2_HEADER_" # NAME, VALUE); -HTTP_KNOWN_HEADERS(STRING_CONSTANT) -#undef STRING_CONSTANT + HTTP_KNOWN_HEADERS(V) +#undef V -#define STRING_CONSTANT(NAME, VALUE) \ +#define V(NAME, VALUE) \ NODE_DEFINE_STRING_CONSTANT(constants, "HTTP2_METHOD_" # NAME, VALUE); -HTTP_KNOWN_METHODS(STRING_CONSTANT) -#undef STRING_CONSTANT + HTTP_KNOWN_METHODS(V) +#undef V #define V(name, _) NODE_DEFINE_CONSTANT(constants, HTTP_STATUS_##name); -HTTP_STATUS_CODES(V) + HTTP_STATUS_CODES(V) #undef V - env->SetMethod(target, "refreshDefaultSettings", RefreshDefaultSettings); - env->SetMethod(target, "packSettings", PackSettings); - env->SetMethod(target, "setCallbackFunctions", SetCallbackFunctions); - - target->Set(context, - env->constants_string(), - constants).Check(); - target->Set(context, - FIXED_ONE_BYTE_STRING(isolate, "nameForErrorCode"), - name_for_error_code).Check(); + target->Set(context, env->constants_string(), constants).Check(); } } // namespace http2 } // namespace node diff --git a/src/node_http2.h b/src/node_http2.h index 1e5f99acfacec9..6b11535f84e121 100644 --- a/src/node_http2.h +++ b/src/node_http2.h @@ -21,6 +21,10 @@ namespace node { namespace http2 { +// Constants in all caps are exported as user-facing constants +// in JavaScript. Constants using the kName pattern are internal +// only. + // We strictly limit the number of outstanding unacknowledged PINGS a user // may send in order to prevent abuse. The current default cap is 10. The // user may set a different limit using a per Http2Session configuration @@ -34,16 +38,62 @@ constexpr size_t kDefaultMaxSettings = 10; constexpr uint64_t kDefaultMaxSessionMemory = 10000000; // These are the standard HTTP/2 defaults as specified by the RFC -#define DEFAULT_SETTINGS_HEADER_TABLE_SIZE 4096 -#define DEFAULT_SETTINGS_ENABLE_PUSH 1 -#define DEFAULT_SETTINGS_MAX_CONCURRENT_STREAMS 0xffffffffu -#define DEFAULT_SETTINGS_INITIAL_WINDOW_SIZE 65535 -#define DEFAULT_SETTINGS_MAX_FRAME_SIZE 16384 -#define DEFAULT_SETTINGS_MAX_HEADER_LIST_SIZE 65535 -#define DEFAULT_SETTINGS_ENABLE_CONNECT_PROTOCOL 0 -#define MAX_MAX_FRAME_SIZE 16777215 -#define MIN_MAX_FRAME_SIZE DEFAULT_SETTINGS_MAX_FRAME_SIZE -#define MAX_INITIAL_WINDOW_SIZE 2147483647 +constexpr uint32_t DEFAULT_SETTINGS_HEADER_TABLE_SIZE = 4096; +constexpr uint32_t DEFAULT_SETTINGS_ENABLE_PUSH = 1; +constexpr uint32_t DEFAULT_SETTINGS_MAX_CONCURRENT_STREAMS = 0xffffffffu; +constexpr uint32_t DEFAULT_SETTINGS_INITIAL_WINDOW_SIZE = 65535; +constexpr uint32_t DEFAULT_SETTINGS_MAX_FRAME_SIZE = 16384; +constexpr uint32_t DEFAULT_SETTINGS_MAX_HEADER_LIST_SIZE = 65535; +constexpr uint32_t DEFAULT_SETTINGS_ENABLE_CONNECT_PROTOCOL = 0; +constexpr uint32_t MAX_MAX_FRAME_SIZE = 16777215; +constexpr uint32_t MIN_MAX_FRAME_SIZE = DEFAULT_SETTINGS_MAX_FRAME_SIZE; +constexpr uint32_t MAX_INITIAL_WINDOW_SIZE = 2147483647; + +// Stream is not going to have any DATA frames +constexpr int STREAM_OPTION_EMPTY_PAYLOAD = 0x1; + +// Stream might have trailing headers +constexpr int STREAM_OPTION_GET_TRAILERS = 0x2; + +// Http2Stream internal states +constexpr int kStreamStateNone = 0x0; +constexpr int kStreamStateShut = 0x1; +constexpr int kStreamStateReadStart = 0x2; +constexpr int kStreamStateReadPaused = 0x4; +constexpr int kStreamStateClosed = 0x8; +constexpr int kStreamStateDestroyed = 0x10; +constexpr int kStreamStateTrailers = 0x20; + +// Http2Session internal states +constexpr int 
kSessionStateNone = 0x0; +constexpr int kSessionStateHasScope = 0x1; +constexpr int kSessionStateWriteScheduled = 0x2; +constexpr int kSessionStateClosed = 0x4; +constexpr int kSessionStateClosing = 0x8; +constexpr int kSessionStateSending = 0x10; +constexpr int kSessionStateWriteInProgress = 0x20; +constexpr int kSessionStateReadingStopped = 0x40; +constexpr int kSessionStateReceivePaused = 0x80; + +// The Padding Strategy determines the method by which extra padding is +// selected for HEADERS and DATA frames. These are configurable via the +// options passed in to a Http2Session object. +enum PaddingStrategy { + // No padding strategy. This is the default. + PADDING_STRATEGY_NONE, + // Attempts to ensure that the frame is 8-byte aligned + PADDING_STRATEGY_ALIGNED, + // Padding will ensure all data frames are maxFrameSize + PADDING_STRATEGY_MAX, + // Removed and turned into an alias because it is unreasonably expensive for + // very little benefit. + PADDING_STRATEGY_CALLBACK = PADDING_STRATEGY_ALIGNED +}; + +enum SessionType { + NGHTTP2_SESSION_SERVER, + NGHTTP2_SESSION_CLIENT +}; template struct Nghttp2Deleter { @@ -93,37 +143,6 @@ struct Http2RcBufferPointerTraits { using Http2Headers = NgHeaders; using Http2RcBufferPointer = NgRcBufPointer; - -enum nghttp2_session_type { - NGHTTP2_SESSION_SERVER, - NGHTTP2_SESSION_CLIENT -}; - -enum nghttp2_stream_flags { - NGHTTP2_STREAM_FLAG_NONE = 0x0, - // Writable side has ended - NGHTTP2_STREAM_FLAG_SHUT = 0x1, - // Reading has started - NGHTTP2_STREAM_FLAG_READ_START = 0x2, - // Reading is paused - NGHTTP2_STREAM_FLAG_READ_PAUSED = 0x4, - // Stream is closed - NGHTTP2_STREAM_FLAG_CLOSED = 0x8, - // Stream is destroyed - NGHTTP2_STREAM_FLAG_DESTROYED = 0x10, - // Stream has trailers - NGHTTP2_STREAM_FLAG_TRAILERS = 0x20, - // Stream has received all the data it can - NGHTTP2_STREAM_FLAG_EOS = 0x40 -}; - -enum nghttp2_stream_options { - // Stream is not going to have any DATA frames - STREAM_OPTION_EMPTY_PAYLOAD = 0x1, - // Stream might have trailing headers - STREAM_OPTION_GET_TRAILERS = 0x2, -}; - struct NgHttp2StreamWrite : public MemoryRetainer { WriteWrap* req_wrap = nullptr; uv_buf_t buf; @@ -137,38 +156,14 @@ struct NgHttp2StreamWrite : public MemoryRetainer { SET_SELF_SIZE(NgHttp2StreamWrite) }; -// The Padding Strategy determines the method by which extra padding is -// selected for HEADERS and DATA frames. These are configurable via the -// options passed in to a Http2Session object. -enum padding_strategy_type { - // No padding strategy. This is the default. - PADDING_STRATEGY_NONE, - // Attempts to ensure that the frame is 8-byte aligned - PADDING_STRATEGY_ALIGNED, - // Padding will ensure all data frames are maxFrameSize - PADDING_STRATEGY_MAX, - // Removed and turned into an alias because it is unreasonably expensive for - // very little benefit. 
- PADDING_STRATEGY_CALLBACK = PADDING_STRATEGY_ALIGNED -}; - -enum session_state_flags { - SESSION_STATE_NONE = 0x0, - SESSION_STATE_HAS_SCOPE = 0x1, - SESSION_STATE_WRITE_SCHEDULED = 0x2, - SESSION_STATE_CLOSED = 0x4, - SESSION_STATE_CLOSING = 0x8, - SESSION_STATE_SENDING = 0x10, - SESSION_STATE_WRITE_IN_PROGRESS = 0x20, - SESSION_STATE_READING_STOPPED = 0x40, - SESSION_STATE_NGHTTP2_RECV_PAUSED = 0x80 -}; - typedef uint32_t(*get_setting)(nghttp2_session* session, nghttp2_settings_id id); +class Http2Ping; class Http2Session; +class Http2Settings; class Http2Stream; +class Origins; // This scope should be present when any call into nghttp2 that may schedule // data to be written to the underlying transport is made, and schedules @@ -180,8 +175,7 @@ class Http2Scope { ~Http2Scope(); private: - Http2Session* session_ = nullptr; - v8::Local session_handle_; + BaseObjectPtr session_; }; // The Http2Options class is used to parse the options object passed in to @@ -191,7 +185,7 @@ class Http2Scope { class Http2Options { public: Http2Options(Http2State* http2_state, - nghttp2_session_type type); + SessionType type); ~Http2Options() = default; @@ -199,43 +193,43 @@ class Http2Options { return options_.get(); } - void SetMaxHeaderPairs(uint32_t max) { + void set_max_header_pairs(uint32_t max) { max_header_pairs_ = max; } - uint32_t GetMaxHeaderPairs() const { + uint32_t max_header_pairs() const { return max_header_pairs_; } - void SetPaddingStrategy(padding_strategy_type val) { + void set_padding_strategy(PaddingStrategy val) { padding_strategy_ = val; } - padding_strategy_type GetPaddingStrategy() const { + PaddingStrategy padding_strategy() const { return padding_strategy_; } - void SetMaxOutstandingPings(size_t max) { + void set_max_outstanding_pings(size_t max) { max_outstanding_pings_ = max; } - size_t GetMaxOutstandingPings() const { + size_t max_outstanding_pings() const { return max_outstanding_pings_; } - void SetMaxOutstandingSettings(size_t max) { + void set_max_outstanding_settings(size_t max) { max_outstanding_settings_ = max; } - size_t GetMaxOutstandingSettings() const { + size_t max_outstanding_settings() const { return max_outstanding_settings_; } - void SetMaxSessionMemory(uint64_t max) { + void set_max_session_memory(uint64_t max) { max_session_memory_ = max; } - uint64_t GetMaxSessionMemory() const { + uint64_t max_session_memory() const { return max_session_memory_; } @@ -243,7 +237,7 @@ class Http2Options { Nghttp2OptionPointer options_; uint64_t max_session_memory_ = kDefaultMaxSessionMemory; uint32_t max_header_pairs_ = DEFAULT_MAX_HEADER_LIST_PAIRS; - padding_strategy_type padding_strategy_ = PADDING_STRATEGY_NONE; + PaddingStrategy padding_strategy_ = PADDING_STRATEGY_NONE; size_t max_outstanding_pings_ = kDefaultMaxPings; size_t max_outstanding_settings_ = kDefaultMaxSettings; }; @@ -282,13 +276,13 @@ class Http2Stream : public AsyncWrap, int options = 0); ~Http2Stream() override; - nghttp2_stream* operator*(); + nghttp2_stream* operator*() const; + + nghttp2_stream* stream() const; Http2Session* session() { return session_.get(); } const Http2Session* session() const { return session_.get(); } - void EmitStatistics(); - // Required for StreamBase int ReadStart() override; @@ -312,7 +306,7 @@ class Http2Stream : public AsyncWrap, void OnTrailers(); // Submit a PRIORITY frame for this stream - int SubmitPriority(nghttp2_priority_spec* prispec, bool silent = false); + int SubmitPriority(const Http2Priority& priority, bool silent = false); // Submits an RST_STREAM frame 
using the given code void SubmitRstStream(const uint32_t code); @@ -331,42 +325,74 @@ class Http2Stream : public AsyncWrap, // Destroy this stream instance and free all held memory. void Destroy(); - inline bool IsDestroyed() const { - return flags_ & NGHTTP2_STREAM_FLAG_DESTROYED; + bool is_destroyed() const { + return flags_ & kStreamStateDestroyed; + } + + bool is_writable() const { + return !(flags_ & kStreamStateShut); + } + + bool is_paused() const { + return flags_ & kStreamStateReadPaused; + } + + bool is_closed() const { + return flags_ & kStreamStateClosed; + } + + bool has_trailers() const { + return flags_ & kStreamStateTrailers; + } + + void set_has_trailers(bool on = true) { + if (on) + flags_ |= kStreamStateTrailers; + else + flags_ &= ~kStreamStateTrailers; } - inline bool IsWritable() const { - return !(flags_ & NGHTTP2_STREAM_FLAG_SHUT); + void set_closed() { + flags_ |= kStreamStateClosed; } - inline bool IsPaused() const { - return flags_ & NGHTTP2_STREAM_FLAG_READ_PAUSED; + void set_destroyed() { + flags_ |= kStreamStateDestroyed; } - inline bool IsClosed() const { - return flags_ & NGHTTP2_STREAM_FLAG_CLOSED; + void set_not_writable() { + flags_ |= kStreamStateShut; } - inline bool HasTrailers() const { - return flags_ & NGHTTP2_STREAM_FLAG_TRAILERS; + void set_reading(bool on = true) { + if (on) { + flags_ |= kStreamStateReadStart; + set_paused(false); + } else {} + } + + void set_paused(bool on = true) { + if (on) + flags_ |= kStreamStateReadPaused; + else + flags_ &= ~kStreamStateReadPaused; } // Returns true if this stream is in the reading state, which occurs when - // the NGHTTP2_STREAM_FLAG_READ_START flag has been set and the - // NGHTTP2_STREAM_FLAG_READ_PAUSED flag is *not* set. - inline bool IsReading() const { - return flags_ & NGHTTP2_STREAM_FLAG_READ_START && - !(flags_ & NGHTTP2_STREAM_FLAG_READ_PAUSED); + // the kStreamStateReadStart flag has been set and the + // kStreamStateReadPaused flag is *not* set. 
+ bool is_reading() const { + return flags_ & kStreamStateReadStart && !is_paused(); } // Returns the RST_STREAM code used to close this stream - inline int32_t code() const { return code_; } + int32_t code() const { return code_; } // Returns the stream identifier for this stream - inline int32_t id() const { return id_; } + int32_t id() const { return id_; } - inline void IncrementAvailableOutboundLength(size_t amount); - inline void DecrementAvailableOutboundLength(size_t amount); + void IncrementAvailableOutboundLength(size_t amount); + void DecrementAvailableOutboundLength(size_t amount); bool AddHeader(nghttp2_rcbuf* name, nghttp2_rcbuf* value, uint8_t flags); @@ -382,7 +408,7 @@ class Http2Stream : public AsyncWrap, return current_headers_.size(); } - inline nghttp2_headers_category headers_category() const { + nghttp2_headers_category headers_category() const { return current_headers_category_; } @@ -403,11 +429,7 @@ class Http2Stream : public AsyncWrap, int DoWrite(WriteWrap* w, uv_buf_t* bufs, size_t count, uv_stream_t* send_handle) override; - void MemoryInfo(MemoryTracker* tracker) const override { - tracker->TrackField("current_headers", current_headers_); - tracker->TrackField("queue", queue_); - } - + void MemoryInfo(MemoryTracker* tracker) const override; SET_MEMORY_INFO_NAME(Http2Stream) SET_SELF_SIZE(Http2Stream) @@ -445,10 +467,12 @@ class Http2Stream : public AsyncWrap, nghttp2_headers_category category, int options); + void EmitStatistics(); + BaseObjectWeakPtr session_; // The Parent HTTP/2 Session int32_t id_ = 0; // The Stream Identifier int32_t code_ = NGHTTP2_NO_ERROR; // The RST_STREAM code (if any) - int flags_ = NGHTTP2_STREAM_FLAG_NONE; // Internal state flags + int flags_ = kStreamStateNone; // Internal state flags uint32_t max_header_pairs_ = DEFAULT_MAX_HEADER_LIST_PAIRS; uint32_t max_header_length_ = DEFAULT_SETTINGS_MAX_HEADER_LIST_SIZE; @@ -545,29 +569,28 @@ class Http2Session : public AsyncWrap, public: Http2Session(Http2State* http2_state, v8::Local wrap, - nghttp2_session_type type = NGHTTP2_SESSION_SERVER); + SessionType type = NGHTTP2_SESSION_SERVER); ~Http2Session() override; - class Http2Ping; - class Http2Settings; - - void EmitStatistics(); - - inline StreamBase* underlying_stream() { + StreamBase* underlying_stream() { return static_cast(stream_); } void Close(uint32_t code = NGHTTP2_NO_ERROR, bool socket_closed = false); + void Consume(v8::Local stream); + void Goaway(uint32_t code, int32_t lastStreamID, const uint8_t* data, size_t len); + void AltSvc(int32_t id, uint8_t* origin, size_t origin_len, uint8_t* value, size_t value_len); - void Origin(nghttp2_origin_entry* ov, size_t count); + + void Origin(const Origins& origins); uint8_t SendPendingData(); @@ -575,25 +598,48 @@ class Http2Session : public AsyncWrap, // will be a pointer to the Http2Stream instance assigned. // This only works if the session is a client session. 
Http2Stream* SubmitRequest( - nghttp2_priority_spec* prispec, + const Http2Priority& priority, const Http2Headers& headers, int32_t* ret, int options = 0); - inline nghttp2_session_type type() const { return session_type_; } + SessionType type() const { return session_type_; } - inline nghttp2_session* session() const { return session_.get(); } + nghttp2_session* session() const { return session_.get(); } - inline nghttp2_session* operator*() { return session_.get(); } + nghttp2_session* operator*() { return session_.get(); } - inline uint32_t GetMaxHeaderPairs() const { return max_header_pairs_; } + uint32_t max_header_pairs() const { return max_header_pairs_; } + + const char* TypeName() const; + + bool is_destroyed() { + return (flags_ & kSessionStateClosed) || session_ == nullptr; + } - inline const char* TypeName() const; + void set_destroyed() { + flags_ |= kSessionStateClosed; + } - inline bool IsDestroyed() { - return (flags_ & SESSION_STATE_CLOSED) || session_ == nullptr; +#define IS_FLAG(name, flag) \ + bool is_##name() const { return flags_ & flag; } \ + void set_##name(bool on = true) { \ + if (on) \ + flags_ |= flag; \ + else \ + flags_ &= ~flag; \ } + IS_FLAG(in_scope, kSessionStateHasScope) + IS_FLAG(write_scheduled, kSessionStateWriteScheduled) + IS_FLAG(closing, kSessionStateClosing) + IS_FLAG(sending, kSessionStateSending) + IS_FLAG(write_in_progress, kSessionStateWriteInProgress) + IS_FLAG(reading_stopped, kSessionStateReadingStopped) + IS_FLAG(receive_paused, kSessionStateReceivePaused) + +#undef IS_FLAG + // Schedule a write if nghttp2 indicates it wants to write to the socket. void MaybeScheduleWrite(); @@ -601,15 +647,15 @@ class Http2Session : public AsyncWrap, void MaybeStopReading(); // Returns pointer to the stream, or nullptr if stream does not exist - inline Http2Stream* FindStream(int32_t id); + BaseObjectPtr FindStream(int32_t id); - inline bool CanAddStream(); + bool CanAddStream(); // Adds a stream instance to this session - inline void AddStream(Http2Stream* stream); + void AddStream(Http2Stream* stream); // Removes a stream instance from this session - inline void RemoveStream(Http2Stream* stream); + BaseObjectPtr RemoveStream(int32_t id); // Indicates whether there currently exist outgoing buffers for this stream. bool HasWritesOnSocketForStream(Http2Stream* stream); @@ -617,32 +663,22 @@ class Http2Session : public AsyncWrap, // Write data from stream_buf_ to the session ssize_t ConsumeHTTP2Data(); - void MemoryInfo(MemoryTracker* tracker) const override { - tracker->TrackField("streams", streams_); - tracker->TrackField("outstanding_pings", outstanding_pings_); - tracker->TrackField("outstanding_settings", outstanding_settings_); - tracker->TrackField("outgoing_buffers", outgoing_buffers_); - tracker->TrackFieldWithSize("stream_buf", stream_buf_.len); - tracker->TrackFieldWithSize("outgoing_storage", outgoing_storage_.size()); - tracker->TrackFieldWithSize("pending_rst_streams", - pending_rst_streams_.size() * sizeof(int32_t)); - tracker->TrackFieldWithSize("nghttp2_memory", current_nghttp2_memory_); - } - + void MemoryInfo(MemoryTracker* tracker) const override; SET_MEMORY_INFO_NAME(Http2Session) SET_SELF_SIZE(Http2Session) std::string diagnostic_name() const override; // Schedule an RstStream for after the current write finishes. 
- inline void AddPendingRstStream(int32_t stream_id) { + void AddPendingRstStream(int32_t stream_id) { pending_rst_streams_.emplace_back(stream_id); } - inline bool HasPendingRstStream(int32_t stream_id) { - return pending_rst_streams_.end() != std::find(pending_rst_streams_.begin(), - pending_rst_streams_.end(), - stream_id); + bool has_pending_rststream(int32_t stream_id) { + return pending_rst_streams_.end() != + std::find(pending_rst_streams_.begin(), + pending_rst_streams_.end(), + stream_id); } // Handle reads/writes from the underlying network transport. @@ -676,14 +712,13 @@ class Http2Session : public AsyncWrap, return env()->event_loop(); } - Http2State* http2_state() { - return http2_state_.get(); - } + Http2State* http2_state() const { return http2_state_.get(); } + BaseObjectPtr PopPing(); - Http2Ping* AddPing(BaseObjectPtr ping); + bool AddPing(const uint8_t* data, v8::Local callback); BaseObjectPtr PopSettings(); - Http2Settings* AddSettings(BaseObjectPtr settings); + bool AddSettings(v8::Local callback); void IncrementCurrentSessionMemory(uint64_t amount) { current_session_memory_ += amount; @@ -700,7 +735,7 @@ class Http2Session : public AsyncWrap, // Returns the current session memory including memory allocated by nghttp2, // the current outbound storage queue, and pending writes. - uint64_t GetCurrentSessionMemory() { + uint64_t current_session_memory() const { uint64_t total = current_session_memory_ + sizeof(Http2Session); total += current_nghttp2_memory_; total += outgoing_storage_.size(); @@ -708,8 +743,8 @@ class Http2Session : public AsyncWrap, } // Return true if current_session_memory + amount is less than the max - bool IsAvailableSessionMemory(uint64_t amount) { - return GetCurrentSessionMemory() + amount <= max_session_memory_; + bool has_available_session_memory(uint64_t amount) const { + return current_session_memory() + amount <= max_session_memory_; } struct Statistics { @@ -728,6 +763,8 @@ class Http2Session : public AsyncWrap, Statistics statistics_ = {}; private: + void EmitStatistics(); + // Frame Padding Strategies ssize_t OnDWordAlignedPadding(size_t frameLength, size_t maxPayloadLen); @@ -812,7 +849,7 @@ class Http2Session : public AsyncWrap, void* user_data); struct Callbacks { - inline explicit Callbacks(bool kHasGetPaddingCallback); + explicit Callbacks(bool kHasGetPaddingCallback); Nghttp2SessionCallbacksPointer callbacks; }; @@ -827,7 +864,7 @@ class Http2Session : public AsyncWrap, AliasedStruct js_fields_; // The session type: client or server - nghttp2_session_type session_type_; + SessionType session_type_; // The maximum number of header pairs permitted for streams on this session uint32_t max_header_pairs_ = DEFAULT_MAX_HEADER_LIST_PAIRS; @@ -839,12 +876,12 @@ class Http2Session : public AsyncWrap, uint64_t current_nghttp2_memory_ = 0; // The collection of active Http2Streams associated with this session - std::unordered_map streams_; + std::unordered_map> streams_; - int flags_ = SESSION_STATE_NONE; + int flags_ = kSessionStateNone; // The StreamBase instance being used for i/o - padding_strategy_type padding_strategy_ = PADDING_STRATEGY_NONE; + PaddingStrategy padding_strategy_ = PADDING_STRATEGY_NONE; // use this to allow timeout tracking during long-lasting writes uint32_t chunks_sent_since_last_write_ = 0; @@ -890,7 +927,7 @@ class Http2SessionPerformanceEntry : public performance::PerformanceEntry { Http2SessionPerformanceEntry( Http2State* http2_state, const Http2Session::Statistics& stats, - nghttp2_session_type type) : + 
SessionType type) : performance::PerformanceEntry( http2_state->env(), "Http2Session", "http2", stats.start_time, @@ -914,7 +951,7 @@ class Http2SessionPerformanceEntry : public performance::PerformanceEntry { int32_t stream_count() const { return stream_count_; } size_t max_concurrent_streams() const { return max_concurrent_streams_; } double stream_average_duration() const { return stream_average_duration_; } - nghttp2_session_type type() const { return session_type_; } + SessionType type() const { return session_type_; } Http2State* http2_state() const { return http2_state_.get(); } void Notify(v8::Local obj) { @@ -930,7 +967,7 @@ class Http2SessionPerformanceEntry : public performance::PerformanceEntry { int32_t stream_count_; size_t max_concurrent_streams_; double stream_average_duration_; - nghttp2_session_type session_type_; + SessionType session_type_; BaseObjectPtr http2_state_; }; @@ -975,14 +1012,14 @@ class Http2StreamPerformanceEntry BaseObjectPtr http2_state_; }; -class Http2Session::Http2Ping : public AsyncWrap { +class Http2Ping : public AsyncWrap { public: - explicit Http2Ping(Http2Session* session, v8::Local obj); - - void MemoryInfo(MemoryTracker* tracker) const override { - tracker->TrackField("session", session_); - } + explicit Http2Ping( + Http2Session* session, + v8::Local obj, + v8::Local callback); + void MemoryInfo(MemoryTracker* tracker) const override; SET_MEMORY_INFO_NAME(Http2Ping) SET_SELF_SIZE(Http2Ping) @@ -990,34 +1027,38 @@ class Http2Session::Http2Ping : public AsyncWrap { void Done(bool ack, const uint8_t* payload = nullptr); void DetachFromSession(); + v8::Local callback() const; + private: - Http2Session* session_; + BaseObjectWeakPtr session_; + v8::Global callback_; uint64_t startTime_; }; // The Http2Settings class is used to parse the settings passed in for // an Http2Session, converting those into an array of nghttp2_settings_entry // structs. 
-class Http2Session::Http2Settings : public AsyncWrap { +class Http2Settings : public AsyncWrap { public: - Http2Settings(Http2State* http2_state, - Http2Session* session, + Http2Settings(Http2Session* session, v8::Local obj, + v8::Local callback, uint64_t start_time = uv_hrtime()); - void MemoryInfo(MemoryTracker* tracker) const override { - tracker->TrackField("session", session_); - } - + void MemoryInfo(MemoryTracker* tracker) const override; SET_MEMORY_INFO_NAME(Http2Settings) SET_SELF_SIZE(Http2Settings) void Send(); void Done(bool ack); + v8::Local callback() const; + // Returns a Buffer instance with the serialized SETTINGS payload v8::Local Pack(); + static v8::Local Pack(Http2State* state); + // Resets the default values in the settings buffer static void RefreshDefaults(Http2State* http2_state); @@ -1026,8 +1067,17 @@ class Http2Session::Http2Settings : public AsyncWrap { get_setting fn); private: - void Init(Http2State* http2_state); - Http2Session* session_; + static size_t Init( + Http2State* http2_state, + nghttp2_settings_entry* entries); + + static v8::Local Pack( + Environment* env, + size_t count, + const nghttp2_settings_entry* entries); + + BaseObjectWeakPtr session_; + v8::Global callback_; uint64_t startTime_; size_t count_ = 0; nghttp2_settings_entry entries_[IDX_SETTINGS_COUNT]; @@ -1035,14 +1085,13 @@ class Http2Session::Http2Settings : public AsyncWrap { class Origins { public: - Origins(v8::Isolate* isolate, - v8::Local context, + Origins(Environment* env, v8::Local origin_string, size_t origin_count); ~Origins() = default; - nghttp2_origin_entry* operator*() { - return reinterpret_cast(*buf_); + const nghttp2_origin_entry* operator*() const { + return reinterpret_cast(buf_.data()); } size_t length() const { @@ -1051,9 +1100,88 @@ class Origins { private: size_t count_; - MaybeStackBuffer buf_; + AllocatedBuffer buf_; }; +#define HTTP2_HIDDEN_CONSTANTS(V) \ + V(NGHTTP2_HCAT_REQUEST) \ + V(NGHTTP2_HCAT_RESPONSE) \ + V(NGHTTP2_HCAT_PUSH_RESPONSE) \ + V(NGHTTP2_HCAT_HEADERS) \ + V(NGHTTP2_NV_FLAG_NONE) \ + V(NGHTTP2_NV_FLAG_NO_INDEX) \ + V(NGHTTP2_ERR_DEFERRED) \ + V(NGHTTP2_ERR_STREAM_ID_NOT_AVAILABLE) \ + V(NGHTTP2_ERR_INVALID_ARGUMENT) \ + V(NGHTTP2_ERR_STREAM_CLOSED) \ + V(STREAM_OPTION_EMPTY_PAYLOAD) \ + V(STREAM_OPTION_GET_TRAILERS) + +#define HTTP2_ERROR_CODES(V) \ + V(NGHTTP2_NO_ERROR) \ + V(NGHTTP2_PROTOCOL_ERROR) \ + V(NGHTTP2_INTERNAL_ERROR) \ + V(NGHTTP2_FLOW_CONTROL_ERROR) \ + V(NGHTTP2_SETTINGS_TIMEOUT) \ + V(NGHTTP2_STREAM_CLOSED) \ + V(NGHTTP2_FRAME_SIZE_ERROR) \ + V(NGHTTP2_REFUSED_STREAM) \ + V(NGHTTP2_CANCEL) \ + V(NGHTTP2_COMPRESSION_ERROR) \ + V(NGHTTP2_CONNECT_ERROR) \ + V(NGHTTP2_ENHANCE_YOUR_CALM) \ + V(NGHTTP2_INADEQUATE_SECURITY) \ + V(NGHTTP2_HTTP_1_1_REQUIRED) \ + +#define HTTP2_CONSTANTS(V) \ + V(NGHTTP2_ERR_FRAME_SIZE_ERROR) \ + V(NGHTTP2_SESSION_SERVER) \ + V(NGHTTP2_SESSION_CLIENT) \ + V(NGHTTP2_STREAM_STATE_IDLE) \ + V(NGHTTP2_STREAM_STATE_OPEN) \ + V(NGHTTP2_STREAM_STATE_RESERVED_LOCAL) \ + V(NGHTTP2_STREAM_STATE_RESERVED_REMOTE) \ + V(NGHTTP2_STREAM_STATE_HALF_CLOSED_LOCAL) \ + V(NGHTTP2_STREAM_STATE_HALF_CLOSED_REMOTE) \ + V(NGHTTP2_STREAM_STATE_CLOSED) \ + V(NGHTTP2_FLAG_NONE) \ + V(NGHTTP2_FLAG_END_STREAM) \ + V(NGHTTP2_FLAG_END_HEADERS) \ + V(NGHTTP2_FLAG_ACK) \ + V(NGHTTP2_FLAG_PADDED) \ + V(NGHTTP2_FLAG_PRIORITY) \ + V(DEFAULT_SETTINGS_HEADER_TABLE_SIZE) \ + V(DEFAULT_SETTINGS_ENABLE_PUSH) \ + V(DEFAULT_SETTINGS_MAX_CONCURRENT_STREAMS) \ + V(DEFAULT_SETTINGS_INITIAL_WINDOW_SIZE) \ + V(DEFAULT_SETTINGS_MAX_FRAME_SIZE) \ + 
V(DEFAULT_SETTINGS_MAX_HEADER_LIST_SIZE) \ + V(DEFAULT_SETTINGS_ENABLE_CONNECT_PROTOCOL) \ + V(MAX_MAX_FRAME_SIZE) \ + V(MIN_MAX_FRAME_SIZE) \ + V(MAX_INITIAL_WINDOW_SIZE) \ + V(NGHTTP2_SETTINGS_HEADER_TABLE_SIZE) \ + V(NGHTTP2_SETTINGS_ENABLE_PUSH) \ + V(NGHTTP2_SETTINGS_MAX_CONCURRENT_STREAMS) \ + V(NGHTTP2_SETTINGS_INITIAL_WINDOW_SIZE) \ + V(NGHTTP2_SETTINGS_MAX_FRAME_SIZE) \ + V(NGHTTP2_SETTINGS_MAX_HEADER_LIST_SIZE) \ + V(NGHTTP2_SETTINGS_ENABLE_CONNECT_PROTOCOL) \ + V(PADDING_STRATEGY_NONE) \ + V(PADDING_STRATEGY_ALIGNED) \ + V(PADDING_STRATEGY_MAX) \ + V(PADDING_STRATEGY_CALLBACK) \ + HTTP2_ERROR_CODES(V) + +#define HTTP2_SETTINGS(V) \ + V(HEADER_TABLE_SIZE) \ + V(ENABLE_PUSH) \ + V(MAX_CONCURRENT_STREAMS) \ + V(INITIAL_WINDOW_SIZE) \ + V(MAX_FRAME_SIZE) \ + V(MAX_HEADER_LIST_SIZE) \ + V(ENABLE_CONNECT_PROTOCOL) \ + } // namespace http2 } // namespace node diff --git a/test/parallel/test-http2-getpackedsettings.js b/test/parallel/test-http2-getpackedsettings.js index 4aa5747a053bd1..a54ab4499e1f89 100644 --- a/test/parallel/test-http2-getpackedsettings.js +++ b/test/parallel/test-http2-getpackedsettings.js @@ -7,11 +7,11 @@ const assert = require('assert'); const http2 = require('http2'); const check = Buffer.from([0x00, 0x01, 0x00, 0x00, 0x10, 0x00, + 0x00, 0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x03, 0xff, 0xff, 0xff, 0xff, - 0x00, 0x05, 0x00, 0x00, 0x40, 0x00, 0x00, 0x04, 0x00, 0x00, 0xff, 0xff, + 0x00, 0x05, 0x00, 0x00, 0x40, 0x00, 0x00, 0x06, 0x00, 0x00, 0xff, 0xff, - 0x00, 0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x08, 0x00, 0x00, 0x00, 0x00]); const val = http2.getPackedSettings(http2.getDefaultSettings()); assert.deepStrictEqual(val, check); @@ -83,12 +83,13 @@ http2.getPackedSettings({ enablePush: false }); { const check = Buffer.from([ 0x00, 0x01, 0x00, 0x00, 0x00, 0x64, + 0x00, 0x02, 0x00, 0x00, 0x00, 0x01, 0x00, 0x03, 0x00, 0x00, 0x00, 0xc8, - 0x00, 0x05, 0x00, 0x00, 0x4e, 0x20, 0x00, 0x04, 0x00, 0x00, 0x00, 0x64, + 0x00, 0x05, 0x00, 0x00, 0x4e, 0x20, 0x00, 0x06, 0x00, 0x00, 0x00, 0x64, - 0x00, 0x02, 0x00, 0x00, 0x00, 0x01, - 0x00, 0x08, 0x00, 0x00, 0x00, 0x00]); + 0x00, 0x08, 0x00, 0x00, 0x00, 0x00 + ]); const packed = http2.getPackedSettings({ headerTableSize: 100, From 308681307faabd4e864dcb3afe169e144270da2d Mon Sep 17 00:00:00 2001 From: James M Snell Date: Sat, 18 Apr 2020 11:25:04 -0700 Subject: [PATCH 26/93] tls: move getAllowUnauthorized to internal/options Make it so that the allow unauthorized warning can be easily reused by the QUIC impl once that lands. 
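As an illustrative sketch only (not part of this patch), the user-visible behaviour that the shared helper encapsulates looks roughly like the following; the hostname is just a placeholder:

    'use strict';
    // Run as: NODE_TLS_REJECT_UNAUTHORIZED=0 node tls-warning-demo.js [host]
    const tls = require('tls');

    process.on('warning', (warning) => {
      // The insecurity warning is emitted at most once per process.
      console.log('process warning:', warning.message);
    });

    const host = process.argv[2] || 'example.org'; // placeholder host
    const socket = tls.connect(443, host, { servername: host }, () => {
      // With the variable set to '0', rejectUnauthorized defaults to false,
      // so a failed certificate check no longer aborts the handshake.
      console.log('connected, authorized:', socket.authorized);
      socket.end();
    });
    socket.on('error', (err) => console.error(err.message));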
Extracted from https://github.com/nodejs/node/pull/32379 Signed-off-by: James M Snell PR-URL: https://github.com/nodejs/node/pull/32917 Reviewed-By: Sam Roberts Reviewed-By: Colin Ihrig --- lib/_tls_wrap.js | 17 +++++------------ lib/internal/options.js | 19 ++++++++++++++++++- 2 files changed, 23 insertions(+), 13 deletions(-) diff --git a/lib/_tls_wrap.js b/lib/_tls_wrap.js index cf1d2c27b89fd4..82b43ea481fd5b 100644 --- a/lib/_tls_wrap.js +++ b/lib/_tls_wrap.js @@ -70,7 +70,10 @@ const { ERR_TLS_INVALID_STATE } = codes; const { onpskexchange: kOnPskExchange } = internalBinding('symbols'); -const { getOptionValue } = require('internal/options'); +const { + getOptionValue, + getAllowUnauthorized, +} = require('internal/options'); const { validateString, validateBuffer, @@ -1533,22 +1536,12 @@ function onConnectEnd() { } } -let warnOnAllowUnauthorized = true; - // Arguments: [port,] [host,] [options,] [cb] exports.connect = function connect(...args) { args = normalizeConnectArgs(args); let options = args[0]; const cb = args[1]; - const allowUnauthorized = process.env.NODE_TLS_REJECT_UNAUTHORIZED === '0'; - - if (allowUnauthorized && warnOnAllowUnauthorized) { - warnOnAllowUnauthorized = false; - process.emitWarning('Setting the NODE_TLS_REJECT_UNAUTHORIZED ' + - 'environment variable to \'0\' makes TLS connections ' + - 'and HTTPS requests insecure by disabling ' + - 'certificate verification.'); - } + const allowUnauthorized = getAllowUnauthorized(); options = { rejectUnauthorized: !allowUnauthorized, diff --git a/lib/internal/options.js b/lib/internal/options.js index e494787b96c088..03586f9dae6d76 100644 --- a/lib/internal/options.js +++ b/lib/internal/options.js @@ -3,6 +3,8 @@ const { getOptions } = internalBinding('options'); const { options, aliases } = getOptions(); +let warnOnAllowUnauthorized = true; + function getOptionValue(option) { const result = options.get(option); if (!result) { @@ -11,8 +13,23 @@ function getOptionValue(option) { return result.value; } +function getAllowUnauthorized() { + const allowUnauthorized = process.env.NODE_TLS_REJECT_UNAUTHORIZED === '0'; + + if (allowUnauthorized && warnOnAllowUnauthorized) { + warnOnAllowUnauthorized = false; + process.emitWarning( + 'Setting the NODE_TLS_REJECT_UNAUTHORIZED ' + + 'environment variable to \'0\' makes TLS connections ' + + 'and HTTPS requests insecure by disabling ' + + 'certificate verification.'); + } + return allowUnauthorized; +} + module.exports = { options, aliases, - getOptionValue + getOptionValue, + getAllowUnauthorized, }; From ca5ebcfb67321a71b33b10119bc98a304a4217bf Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Tue, 14 Apr 2020 19:01:29 +0200 Subject: [PATCH 27/93] tools: fix mkcodecache when run with ASAN MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fixes: https://github.com/nodejs/node/issues/32835 PR-URL: https://github.com/nodejs/node/pull/32850 Reviewed-By: Matheus Marchini Reviewed-By: Richard Lau Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Juan José Arboleda Reviewed-By: Jiawen Geng --- tools/code_cache/mkcodecache.cc | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/tools/code_cache/mkcodecache.cc b/tools/code_cache/mkcodecache.cc index 34af7bc61ba374..23df0f69317b8b 100644 --- a/tools/code_cache/mkcodecache.cc +++ b/tools/code_cache/mkcodecache.cc @@ -49,8 +49,8 @@ int main(int argc, char* argv[]) { // Create a new Isolate and make it the current one. 
Isolate::CreateParams create_params; - create_params.array_buffer_allocator = - ArrayBuffer::Allocator::NewDefaultAllocator(); + create_params.array_buffer_allocator_shared.reset( + ArrayBuffer::Allocator::NewDefaultAllocator()); Isolate* isolate = Isolate::New(create_params); { Isolate::Scope isolate_scope(isolate); @@ -65,6 +65,7 @@ int main(int argc, char* argv[]) { out << cache; out.close(); } + isolate->Dispose(); v8::V8::ShutdownPlatform(); return 0; From eaf841d585f431e5f2fea6490bb6ff4ae9e74f10 Mon Sep 17 00:00:00 2001 From: Myles Borins Date: Fri, 17 Apr 2020 17:13:16 -0400 Subject: [PATCH 28/93] module: improve error for invalid package targets For targets that are strings that do not start with `./` or `/` the error will now have additional information about what the programming error is. Closes: https://github.com/nodejs/node/issues/32034 PR-URL: https://github.com/nodejs/node/pull/32052 Fixes: https://github.com/nodejs/node/issues/32034 Reviewed-By: Geoffrey Booth Reviewed-By: Ruben Bridgewater Reviewed-By: Jan Krems Reviewed-By: Guy Bedford Signed-off-by: Myles Borins --- lib/internal/errors.js | 17 ++++++++++++++--- test/es-module/test-esm-exports.mjs | 4 ++++ .../node_modules/pkgexports/package.json | 1 + 3 files changed, 19 insertions(+), 3 deletions(-) diff --git a/lib/internal/errors.js b/lib/internal/errors.js index e0668e2f827e5f..6fa230349072d6 100644 --- a/lib/internal/errors.js +++ b/lib/internal/errors.js @@ -20,6 +20,7 @@ const { ObjectDefineProperty, ObjectKeys, StringPrototypeSlice, + StringPrototypeStartsWith, Symbol, SymbolFor, WeakMap, @@ -1104,18 +1105,28 @@ E('ERR_INVALID_PACKAGE_CONFIG', (path, message, hasMessage = true) => { }, Error); E('ERR_INVALID_PACKAGE_TARGET', (pkgPath, key, subpath, target, base = undefined) => { + const relError = typeof target === 'string' && + target.length && !StringPrototypeStartsWith(target, './'); if (key === null) { if (subpath !== '') { return `Invalid "exports" target ${JSONStringify(target)} defined ` + `for '${subpath}' in the package config ${pkgPath} imported from ` + - base; + `${base}.${relError ? '; targets must start with "./"' : ''}`; } else { return `Invalid "exports" main target ${target} defined in the ` + - `package config ${pkgPath} imported from ${base}.`; + `package config ${pkgPath} imported from ${base}${relError ? + '; targets must start with "./"' : ''}`; } } else if (key === '.') { return `Invalid "exports" main target ${JSONStringify(target)} defined ` + - `in the package config ${pkgPath}${sep}package.json`; + `in the package config ${pkgPath}${sep}package.json${relError ? 
+ '; targets must start with "./"' : ''}`; + } else if (typeof target === 'string' && target !== '' && + !StringPrototypeStartsWith(target, './')) { + return `Invalid "exports" target ${JSONStringify(target)} defined for '${ + StringPrototypeSlice(key, 0, -subpath.length || key.length)}' in the ` + + `package config ${pkgPath}${sep}package.json; ` + + 'targets must start with "./"'; } else { return `Invalid "exports" target ${JSONStringify(target)} defined for '${ StringPrototypeSlice(key, 0, -subpath.length || key.length)}' in the ` + diff --git a/test/es-module/test-esm-exports.mjs b/test/es-module/test-esm-exports.mjs index f71e2172951843..2b58f28344522c 100644 --- a/test/es-module/test-esm-exports.mjs +++ b/test/es-module/test-esm-exports.mjs @@ -78,6 +78,7 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; ['pkgexports/null', './null'], ['pkgexports/invalid2', './invalid2'], ['pkgexports/invalid3', './invalid3'], + ['pkgexports/invalid5', 'invalid5'], // Missing / invalid fallbacks ['pkgexports/nofallback1', './nofallback1'], ['pkgexports/nofallback2', './nofallback2'], @@ -106,6 +107,9 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; strictEqual(err.code, 'ERR_INVALID_PACKAGE_TARGET'); assertStartsWith(err.message, 'Invalid "exports"'); assertIncludes(err.message, subpath); + if (!subpath.startsWith('./')) { + assertIncludes(err.message, 'targets must start with'); + } })); } diff --git a/test/fixtures/node_modules/pkgexports/package.json b/test/fixtures/node_modules/pkgexports/package.json index f3ec20c49b2b91..43ccf7795b978e 100644 --- a/test/fixtures/node_modules/pkgexports/package.json +++ b/test/fixtures/node_modules/pkgexports/package.json @@ -13,6 +13,7 @@ "./invalid2": 1234, "./invalid3": "", "./invalid4": {}, + "./invalid5": "invalid5.js", "./fallbackdir/": [[], null, {}, "builtin:x/", "./fallbackfile", "./"], "./fallbackfile": [[], null, {}, "builtin:x", "./asdf.js"], "./nofallback1": [], From b5b3efeb90e78840a03f67297a9380794a94ce1c Mon Sep 17 00:00:00 2001 From: Gus Caplan Date: Tue, 21 Apr 2020 13:30:47 -0500 Subject: [PATCH 29/93] doc: add more info to v14 changelog PR-URL: https://github.com/nodejs/node/pull/32979 Reviewed-By: Colin Ihrig Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Ruben Bridgewater Reviewed-By: Jiawen Geng --- doc/changelogs/CHANGELOG_V14.md | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/doc/changelogs/CHANGELOG_V14.md b/doc/changelogs/CHANGELOG_V14.md index 37a678820fcdd7..c3b6aa0b48d06f 100644 --- a/doc/changelogs/CHANGELOG_V14.md +++ b/doc/changelogs/CHANGELOG_V14.md @@ -79,6 +79,10 @@ interact with `std::shared_ptr`. 
This is expected to be fixed in a later version #### Update to V8 8.1 * **(SEMVER-MAJOR)** **deps**: update V8 to 8.1.307.20 (Matheus Marchini) [#32116](https://github.com/nodejs/node/pull/32116) + * Enables Optional Chaining by default ([MDN](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Optional_chaining), [v8.dev](https://v8.dev/features/optional-chaining)) + * Enables Nullish Coalescing by default ([MDN](https://wiki.developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Nullish_Coalescing_Operator), [v8.dev](https://v8.dev/features/nullish-coalescing)) + * Enables `Intl.DisplayNames` by default ([MDN](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DisplayNames), [v8.dev](https://v8.dev/features/intl-displaynames)) + * Enables `calendar` and `numberingSystem` options for `Intl.DateTimeFormat` by default ([MDN](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DateTimeFormat)) #### Other Notable Changes: From 4dd3a7ddc9a610607522b0ec7e1f1531bc9c0e51 Mon Sep 17 00:00:00 2001 From: Gerhard Stoebich <18708370+Flarna@users.noreply.github.com> Date: Tue, 21 Apr 2020 19:25:58 +0200 Subject: [PATCH 30/93] doc: set module version 83 to node 14 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Clearly state the modules version 83 is official node 14, similar as it is done for other major node versions. PR-URL: https://github.com/nodejs/node/pull/32975 Reviewed-By: Beth Griggs Reviewed-By: Richard Lau Reviewed-By: Michaël Zasso Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Jiawen Geng --- doc/abi_version_registry.json | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/abi_version_registry.json b/doc/abi_version_registry.json index 98c1522a8dd94e..c4de6b8719b032 100644 --- a/doc/abi_version_registry.json +++ b/doc/abi_version_registry.json @@ -1,6 +1,6 @@ { "NODE_MODULE_VERSION": [ - { "modules": 83, "runtime": "node", "variant": "v8_8.1", "versions": "14.0.0-pre" }, + { "modules": 83, "runtime": "node", "variant": "v8_8.1", "versions": "14.0.0" }, { "modules": 82, "runtime": "electron", "variant": "electron", "versions": "10" }, { "modules": 81, "runtime": "node", "variant": "v8_7.9", "versions": "14.0.0-pre" }, { "modules": 80, "runtime": "electron", "variant": "electron", "versions": "9" }, From 21d314e7fca2a401a050a7f055f8054e8a72a258 Mon Sep 17 00:00:00 2001 From: Guy Bedford Date: Mon, 13 Apr 2020 22:36:15 -0700 Subject: [PATCH 31/93] module: exports not exported for null resolutions PR-URL: https://github.com/nodejs/node/pull/32838 Reviewed-By: Jan Krems --- doc/api/esm.md | 4 +++- lib/internal/modules/cjs/loader.js | 14 +++++++++----- lib/internal/modules/esm/resolve.js | 7 +++++-- test/es-module/test-esm-exports.mjs | 9 ++++++--- test/fixtures/node_modules/pkgexports/package.json | 1 + 5 files changed, 24 insertions(+), 11 deletions(-) diff --git a/doc/api/esm.md b/doc/api/esm.md index 97124486cc2bf5..49c467effbc3cd 100644 --- a/doc/api/esm.md +++ b/doc/api/esm.md @@ -1661,13 +1661,15 @@ The resolver can throw the following errors: > loop on any _Package Path Not Exported_ error. > 1. Throw a _Package Path Not Exported_ error. > 1. Otherwise, if _target_ is an Array, then -> 1. If _target.length is zero, throw an _Invalid Package Target_ error. +> 1. If _target.length is zero, throw a _Package Path Not Exported_ error. > 1. For each item _targetValue_ in _target_, do > 1. 
If _targetValue_ is an Array, continue the loop. > 1. Return the result of **PACKAGE_EXPORTS_TARGET_RESOLVE**(_packageURL_, > _targetValue_, _subpath_, _env_), continuing the loop on any > _Package Path Not Exported_ or _Invalid Package Target_ error. > 1. Throw the last fallback resolution error. +> 1. Otherwise, if _target_ is _null_, throw a _Package Path Not Exported_ +> error. > 1. Otherwise throw an _Invalid Package Target_ error. **ESM_FORMAT**(_url_) diff --git a/lib/internal/modules/cjs/loader.js b/lib/internal/modules/cjs/loader.js index 9db2b1e6c1c317..e09cc437068d6b 100644 --- a/lib/internal/modules/cjs/loader.js +++ b/lib/internal/modules/cjs/loader.js @@ -552,21 +552,22 @@ function resolveExportsTarget(baseUrl, target, subpath, mappingKey) { , 0, -1), mappingKey); } else if (ArrayIsArray(target)) { if (target.length === 0) - throw new ERR_INVALID_PACKAGE_TARGET(StringPrototypeSlice(baseUrl.pathname - , 0, -1), mappingKey, subpath, target); + throw new ERR_PACKAGE_PATH_NOT_EXPORTED( + StringPrototypeSlice(baseUrl.pathname, 0, -1), mappingKey + subpath); + let lastException; for (const targetValue of target) { try { return resolveExportsTarget(baseUrl, targetValue, subpath, mappingKey); } catch (e) { + lastException = e; if (e.code !== 'ERR_PACKAGE_PATH_NOT_EXPORTED' && e.code !== 'ERR_INVALID_PACKAGE_TARGET') throw e; } } // Throw last fallback error - resolveExportsTarget(baseUrl, target[target.length - 1], subpath, - mappingKey); - assert(false); + assert(lastException !== undefined); + throw lastException; } else if (typeof target === 'object' && target !== null) { const keys = ObjectKeys(target); if (keys.some(isArrayIndex)) { @@ -595,6 +596,9 @@ function resolveExportsTarget(baseUrl, target, subpath, mappingKey) { } throw new ERR_PACKAGE_PATH_NOT_EXPORTED( StringPrototypeSlice(baseUrl.pathname, 0, -1), mappingKey + subpath); + } else if (target === null) { + throw new ERR_PACKAGE_PATH_NOT_EXPORTED( + StringPrototypeSlice(baseUrl.pathname, 0, -1), mappingKey + subpath); } throw new ERR_INVALID_PACKAGE_TARGET( StringPrototypeSlice(baseUrl.pathname, 0, -1), mappingKey, subpath, target); diff --git a/lib/internal/modules/esm/resolve.js b/lib/internal/modules/esm/resolve.js index 04c6abe54269f6..5bb2f37e53eeb5 100644 --- a/lib/internal/modules/esm/resolve.js +++ b/lib/internal/modules/esm/resolve.js @@ -16,7 +16,7 @@ const { StringPrototypeStartsWith, StringPrototypeSubstr, } = primordials; - +const assert = require('internal/assert'); const internalFS = require('internal/fs/utils'); const { NativeModule } = require('internal/bootstrap/loaders'); const { @@ -345,7 +345,7 @@ function resolveExportsTarget( return finalizeResolution(resolved, base); } else if (ArrayIsArray(target)) { if (target.length === 0) - throwExportsInvalid(packageSubpath, target, packageJSONUrl, base); + throwExportsNotFound(packageSubpath, packageJSONUrl, base); let lastException; for (let i = 0; i < target.length; i++) { @@ -366,6 +366,7 @@ function resolveExportsTarget( return finalizeResolution(resolved, base); } + assert(lastException !== undefined); throw lastException; } else if (typeof target === 'object' && target !== null) { const keys = ObjectGetOwnPropertyNames(target); @@ -392,6 +393,8 @@ function resolveExportsTarget( } } throwExportsNotFound(packageSubpath, packageJSONUrl, base); + } else if (target === null) { + throwExportsNotFound(packageSubpath, packageJSONUrl, base); } throwExportsInvalid(packageSubpath, target, packageJSONUrl, base); } diff --git a/test/es-module/test-esm-exports.mjs 
b/test/es-module/test-esm-exports.mjs index 2b58f28344522c..74bf66be32c39a 100644 --- a/test/es-module/test-esm-exports.mjs +++ b/test/es-module/test-esm-exports.mjs @@ -64,6 +64,11 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; // Conditional exports with no match are "not exported" errors ['pkgexports/invalid1', './invalid1'], ['pkgexports/invalid4', './invalid4'], + // Null mapping + ['pkgexports/null', './null'], + ['pkgexports/null/subpath', './null/subpath'], + // Empty fallback + ['pkgexports/nofallback1', './nofallback1'], ]); const invalidExports = new Map([ @@ -74,13 +79,11 @@ import fromInside from '../fixtures/node_modules/pkgexports/lib/hole.js'; ['pkgexports/belowdir/pkgexports/asdf.js', './belowdir/'], // This target file steps below the package ['pkgexports/belowfile', './belowfile'], - // Invalid target handling - ['pkgexports/null', './null'], + // Invalid targets ['pkgexports/invalid2', './invalid2'], ['pkgexports/invalid3', './invalid3'], ['pkgexports/invalid5', 'invalid5'], // Missing / invalid fallbacks - ['pkgexports/nofallback1', './nofallback1'], ['pkgexports/nofallback2', './nofallback2'], // Reaching into nested node_modules ['pkgexports/nodemodules', './nodemodules'], diff --git a/test/fixtures/node_modules/pkgexports/package.json b/test/fixtures/node_modules/pkgexports/package.json index 43ccf7795b978e..b99e5c7b79f6a8 100644 --- a/test/fixtures/node_modules/pkgexports/package.json +++ b/test/fixtures/node_modules/pkgexports/package.json @@ -9,6 +9,7 @@ "./belowfile": "../belowfile", "./missingtrailer/": ".", "./null": null, + "./null/": null, "./invalid1": {}, "./invalid2": 1234, "./invalid3": "", From 4ef91a640e964b89b652d6bfdfcf6b4f57ffd3cf Mon Sep 17 00:00:00 2001 From: bcoe Date: Tue, 21 Apr 2020 15:55:54 -0700 Subject: [PATCH 32/93] http2: wait for secureConnect before initializing PR-URL: https://github.com/nodejs/node/pull/32958 Fixes: https://github.com/nodejs/node/issues/32922 Reviewed-By: James M Snell Reviewed-By: Anna Henningsen --- lib/_tls_wrap.js | 2 + lib/internal/http2/core.js | 2 +- test/internet/test-http2-issue-32922.js | 80 +++++++++++++++++++++++++ 3 files changed, 83 insertions(+), 1 deletion(-) create mode 100644 test/internet/test-http2-issue-32922.js diff --git a/lib/_tls_wrap.js b/lib/_tls_wrap.js index 82b43ea481fd5b..51650c760b4158 100644 --- a/lib/_tls_wrap.js +++ b/lib/_tls_wrap.js @@ -467,6 +467,7 @@ function TLSSocket(socket, opts) { this._securePending = false; this._newSessionPending = false; this._controlReleased = false; + this.secureConnecting = true; this._SNICallback = null; this.servername = null; this.alpnProtocol = null; @@ -1029,6 +1030,7 @@ function onServerSocketSecure() { if (!this.destroyed && this._releaseControl()) { debug('server emit secureConnection'); + this.secureConnecting = false; this._tlsOptions.server.emit('secureConnection', this); } } diff --git a/lib/internal/http2/core.js b/lib/internal/http2/core.js index 204e84556b0b3b..e673be6c349673 100644 --- a/lib/internal/http2/core.js +++ b/lib/internal/http2/core.js @@ -1152,7 +1152,7 @@ class Http2Session extends EventEmitter { socket.disableRenegotiation(); const setupFn = setupHandle.bind(this, socket, type, options); - if (socket.connecting) { + if (socket.connecting || socket.secureConnecting) { const connectEvent = socket instanceof tls.TLSSocket ? 
'secureConnect' : 'connect'; socket.once(connectEvent, () => { diff --git a/test/internet/test-http2-issue-32922.js b/test/internet/test-http2-issue-32922.js new file mode 100644 index 00000000000000..e11de0286eb7a4 --- /dev/null +++ b/test/internet/test-http2-issue-32922.js @@ -0,0 +1,80 @@ +'use strict'; +const common = require('../common'); +const assert = require('assert'); + +if (!common.hasCrypto) + common.skip('missing crypto'); + +const http2 = require('http2'); +const net = require('net'); + +const { + HTTP2_HEADER_PATH, +} = http2.constants; + +// Create a normal session, as a control case +function normalSession(cb) { + http2.connect('https://google.com', (clientSession) => { + let error = null; + const req = clientSession.request({ [HTTP2_HEADER_PATH]: '/' }); + req.on('error', (err) => { + error = err; + }); + req.on('response', (_headers) => { + req.on('data', (_chunk) => { }); + req.on('end', () => { + clientSession.close(); + return cb(error); + }); + }); + }); +} +normalSession(common.mustCall(function(err) { + assert.ifError(err); +})); + +// Create a session using a socket that has not yet finished connecting +function socketNotFinished(done) { + const socket2 = net.connect(443, 'google.com'); + http2.connect('https://google.com', { socket2 }, (clientSession) => { + let error = null; + const req = clientSession.request({ [HTTP2_HEADER_PATH]: '/' }); + req.on('error', (err) => { + error = err; + }); + req.on('response', (_headers) => { + req.on('data', (_chunk) => { }); + req.on('end', () => { + clientSession.close(); + socket2.destroy(); + return done(error); + }); + }); + }); +} +socketNotFinished(common.mustCall(function(err) { + assert.ifError(err); +})); + +// Create a session using a socket that has finished connecting +function socketFinished(done) { + const socket = net.connect(443, 'google.com', () => { + http2.connect('https://google.com', { socket }, (clientSession) => { + let error = null; + const req = clientSession.request({ [HTTP2_HEADER_PATH]: '/' }); + req.on('error', (err) => { + error = err; + }); + req.on('response', (_headers) => { + req.on('data', (_chunk) => { }); + req.on('end', () => { + clientSession.close(); + return done(error); + }); + }); + }); + }); +} +socketFinished(common.mustCall(function(err) { + assert.ifError(err); +})); From 4c643c0d424b743cf9daa2482b9508c3a11fe4b8 Mon Sep 17 00:00:00 2001 From: daemon1024 Date: Fri, 17 Apr 2020 12:13:53 +0530 Subject: [PATCH 33/93] fs: update validateOffsetLengthRead in utils.js MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32896 Fixes: https://github.com/nodejs/node/issues/32871 Reviewed-By: Anna Henningsen Reviewed-By: Zeyu Yang Reviewed-By: Juan José Arboleda Reviewed-By: Andrey Pechkurov --- lib/internal/fs/utils.js | 12 +++++++----- test/parallel/test-fs-read-type.js | 21 +++++++++++++++++---- 2 files changed, 24 insertions(+), 9 deletions(-) diff --git a/lib/internal/fs/utils.js b/lib/internal/fs/utils.js index 763a940f6e98a7..b91eaa006c44cf 100644 --- a/lib/internal/fs/utils.js +++ b/lib/internal/fs/utils.js @@ -539,13 +539,15 @@ function toUnixTimestamp(time, name = 'time') { const validateOffsetLengthRead = hideStackFrames( (offset, length, bufferLength) => { - if (offset < 0 || offset >= bufferLength) { - throw new ERR_OUT_OF_RANGE('offset', - `>= 0 && <= ${bufferLength}`, offset); + if (offset < 0) { + throw new ERR_OUT_OF_RANGE('offset', '>= 0', offset); } - if (length < 0 || offset + length > 
bufferLength) { + if (length < 0) { + throw new ERR_OUT_OF_RANGE('length', '>= 0', length); + } + if (offset + length > bufferLength) { throw new ERR_OUT_OF_RANGE('length', - `>= 0 && <= ${bufferLength - offset}`, length); + `<= ${bufferLength - offset}`, length); } } ); diff --git a/test/parallel/test-fs-read-type.js b/test/parallel/test-fs-read-type.js index dbe036794ceb56..0f9bdbab588661 100644 --- a/test/parallel/test-fs-read-type.js +++ b/test/parallel/test-fs-read-type.js @@ -44,7 +44,7 @@ assert.throws(() => { }, { code: 'ERR_OUT_OF_RANGE', name: 'RangeError', - message: 'The value of "offset" is out of range. It must be >= 0 && <= 4. ' + + message: 'The value of "offset" is out of range. It must be >= 0. ' + 'Received -1' }); @@ -73,7 +73,7 @@ assert.throws(() => { code: 'ERR_OUT_OF_RANGE', name: 'RangeError', message: 'The value of "length" is out of range. ' + - 'It must be >= 0 && <= 4. Received -1' + 'It must be >= 0. Received -1' }); @@ -110,7 +110,7 @@ assert.throws(() => { code: 'ERR_OUT_OF_RANGE', name: 'RangeError', message: 'The value of "offset" is out of range. ' + - 'It must be >= 0 && <= 4. Received -1' + 'It must be >= 0. Received -1' }); assert.throws(() => { @@ -136,5 +136,18 @@ assert.throws(() => { code: 'ERR_OUT_OF_RANGE', name: 'RangeError', message: 'The value of "length" is out of range. ' + - 'It must be >= 0 && <= 4. Received -1' + 'It must be >= 0. Received -1' +}); + +assert.throws(() => { + fs.readSync(fd, + Buffer.allocUnsafe(expected.length), + 0, + expected.length + 1, + 0); +}, { + code: 'ERR_OUT_OF_RANGE', + name: 'RangeError', + message: 'The value of "length" is out of range. ' + + 'It must be <= 4. Received 5' }); From 95e897edfc568023640740b96edbcc989f7c86be Mon Sep 17 00:00:00 2001 From: rickyes Date: Tue, 14 Apr 2020 20:53:45 +0800 Subject: [PATCH 34/93] src: use using NewStringType PR-URL: https://github.com/nodejs/node/pull/32843 Reviewed-By: Colin Ihrig Reviewed-By: Anna Henningsen Reviewed-By: Richard Lau Reviewed-By: James M Snell Reviewed-By: Ruben Bridgewater Reviewed-By: Gireesh Punathil Reviewed-By: Franziska Hinkelmann --- src/node_os.cc | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/src/node_os.cc b/src/node_os.cc index 8f1ca4f0c3ff77..bd61b4c87c0e1a 100644 --- a/src/node_os.cc +++ b/src/node_os.cc @@ -193,7 +193,7 @@ static void GetInterfaceAddresses(const FunctionCallbackInfo& args) { // they name the interface from any input that uses UTF-8, which should be // the most frequent case by far these days.) name = String::NewFromUtf8(isolate, raw_name, - v8::NewStringType::kNormal).ToLocalChecked(); + NewStringType::kNormal).ToLocalChecked(); snprintf(mac.data(), mac.size(), @@ -253,7 +253,7 @@ static void GetHomeDirectory(const FunctionCallbackInfo& args) { Local home = String::NewFromUtf8(env->isolate(), buf, - v8::NewStringType::kNormal, + NewStringType::kNormal, len).ToLocalChecked(); args.GetReturnValue().Set(home); } From fa7d96923741ce3ced74138d5068ccf01b3b3473 Mon Sep 17 00:00:00 2001 From: Rich Trott Date: Sat, 18 Apr 2020 05:22:07 -0700 Subject: [PATCH 35/93] tools: remove unused code in doc generation tool MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit tools/doc/html.js includes code that looks for comments with `start-include` and `end-include` for file inclusion. This seems to be legacy code that is no longer used. The code only appears in the table-of-contents generation function. 
The strings `start-include` and `end-include` appear nowhere else in our tools and nowhere at all in our docs. PR-URL: https://github.com/nodejs/node/pull/32913 Reviewed-By: Franziska Hinkelmann Reviewed-By: Luigi Pinca Reviewed-By: Juan José Arboleda Reviewed-By: Trivikram Kamat Reviewed-By: Michael Dawson Reviewed-By: James M Snell --- tools/doc/html.js | 14 +------------- 1 file changed, 1 insertion(+), 13 deletions(-) diff --git a/tools/doc/html.js b/tools/doc/html.js index acd62c3e8d0ddb..b58563045def8d 100644 --- a/tools/doc/html.js +++ b/tools/doc/html.js @@ -313,23 +313,11 @@ function versionSort(a, b) { function buildToc({ filename, apilinks }) { return (tree, file) => { - const startIncludeRefRE = /^\s*\s*$/; - const endIncludeRefRE = /^\s*\s*$/; - const realFilenames = [filename]; const idCounters = Object.create(null); let toc = ''; let depth = 0; visit(tree, null, (node) => { - // Keep track of the current filename for comment wrappers of inclusions. - if (node.type === 'html') { - const [, includedFileName] = node.value.match(startIncludeRefRE) || []; - if (includedFileName !== undefined) - realFilenames.unshift(includedFileName); - else if (endIncludeRefRE.test(node.value)) - realFilenames.shift(); - } - if (node.type !== 'heading') return; if (node.depth - depth > 1) { @@ -339,7 +327,7 @@ function buildToc({ filename, apilinks }) { } depth = node.depth; - const realFilename = path.basename(realFilenames[0], '.md'); + const realFilename = path.basename(filename, '.md'); const headingText = file.contents.slice( node.children[0].position.start.offset, node.position.end.offset).trim(); From 005c2bab2977e39d8d340df435ac31938af6b968 Mon Sep 17 00:00:00 2001 From: Gireesh Punathil Date: Wed, 25 Mar 2020 19:15:56 +0530 Subject: [PATCH 36/93] doc: elevate diagnostic report to tier1 diagnostic report qualifies for all the criteria for being in tier1. Classify it as such. 
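For context, a minimal sketch of producing the report artefact in question (illustrative only, not part of this change; older Node.js versions may require the --experimental-report flag):

    'use strict';
    // Writes a JSON diagnostic report to the current working directory
    // (or to the optional filename argument) and returns the file name.
    const filename = process.report.writeReport();
    console.log(`diagnostic report written to ${filename}`);

Reports can also be generated from the command line, for example with --report-uncaught-exception.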
PR-URL: https://github.com/nodejs/node/pull/32732 Refs: https://github.com/nodejs/diagnostics/issues/369 Reviewed-By: Richard Lau Reviewed-By: Michael Dawson Reviewed-By: Matheus Marchini Reviewed-By: Matteo Collina Reviewed-By: Chengzhong Wu Reviewed-By: James M Snell --- doc/guides/diagnostic-tooling-support-tiers.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/doc/guides/diagnostic-tooling-support-tiers.md b/doc/guides/diagnostic-tooling-support-tiers.md index 62bca48e1b7af7..254f90ecadd5fd 100644 --- a/doc/guides/diagnostic-tooling-support-tiers.md +++ b/doc/guides/diagnostic-tooling-support-tiers.md @@ -95,7 +95,8 @@ The tools are currently assigned to Tiers as follows: | Tool Type | Tool/API Name | Regular Testing in Node.js CI | Integrated with Node.js | Target Tier | |-----------|---------------------------|-------------------------------|-------------------------|-------------| - | | | | | | + | FFDC | diagnostic report | Yes | Yes | 1 | + | | | | | | ## Tier 2 From aec7bc754e3bbaccdf6f9a5cf6ddef82ea4adb32 Mon Sep 17 00:00:00 2001 From: Prosper Opara Date: Sun, 19 Apr 2020 17:45:15 +0100 Subject: [PATCH 37/93] doc: remove repeated word in modules.md MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32931 Reviewed-By: Colin Ihrig Reviewed-By: Zeyu Yang Reviewed-By: Rich Trott Reviewed-By: Juan José Arboleda Reviewed-By: James M Snell Reviewed-By: Trivikram Kamat --- doc/api/modules.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/modules.md b/doc/api/modules.md index 57d36392188058..aec88a936ba716 100644 --- a/doc/api/modules.md +++ b/doc/api/modules.md @@ -1031,7 +1031,7 @@ added: v13.7.0 > Stability: 1 - Experimental -Helpers for for interacting with the source map cache. This cache is +Helpers for interacting with the source map cache. This cache is populated when source map parsing is enabled and [source map include directives][] are found in a modules' footer. From 7647860000f07a3ed57cc78d92a6e9e6bcc64b58 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Tue, 21 Apr 2020 12:29:37 +0200 Subject: [PATCH 38/93] stream: finished should complete with read-only Duplex MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit If passed a Duplex where readable or writable has been explicitly disabled then don't assume 'close' will be emitted. 
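A minimal sketch of the case this targets, mirroring the shape of the added test (names are illustrative): a Duplex whose writable side never finishes, observed with `finished()` restricted to the readable side.

```js
'use strict';
const { Duplex, finished } = require('stream');

const d = new Duplex({
  read() {
    this.push(null); // Readable side ends immediately.
  },
  final(cb) {
    // Never call cb(), so the writable side never finishes and
    // 'close' is not guaranteed to be emitted.
  }
});

// With { readable: true, writable: false } the callback should fire once
// the readable side has ended, instead of waiting for 'close'.
finished(d, { readable: true, writable: false }, (err) => {
  console.log('readable side finished', err || '(no error)');
});

d.resume();
d.end();
```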
Fixes: https://github.com/nodejs/node/issues/32965 PR-URL: https://github.com/nodejs/node/pull/32967 Reviewed-By: Matteo Collina Reviewed-By: Mathias Buus Reviewed-By: Rich Trott Reviewed-By: Juan José Arboleda --- lib/internal/streams/end-of-stream.js | 4 +++- test/parallel/test-stream-finished.js | 34 ++++++++++++++++++++++++++- 2 files changed, 36 insertions(+), 2 deletions(-) diff --git a/lib/internal/streams/end-of-stream.js b/lib/internal/streams/end-of-stream.js index 4742391fd71a7a..f1f1b3e5a77877 100644 --- a/lib/internal/streams/end-of-stream.js +++ b/lib/internal/streams/end-of-stream.js @@ -76,7 +76,9 @@ function eos(stream, opts, callback) { state && state.autoDestroy && state.emitClose && - state.closed === false + state.closed === false && + isReadable(stream) === readable && + isWritable(stream) === writable ); let writableFinished = stream.writableFinished || diff --git a/test/parallel/test-stream-finished.js b/test/parallel/test-stream-finished.js index ab35d402e31cfd..17ab976c6136f4 100644 --- a/test/parallel/test-stream-finished.js +++ b/test/parallel/test-stream-finished.js @@ -1,7 +1,7 @@ 'use strict'; const common = require('../common'); -const { Writable, Readable, Transform, finished } = require('stream'); +const { Writable, Readable, Transform, finished, Duplex } = require('stream'); const assert = require('assert'); const EE = require('events'); const fs = require('fs'); @@ -352,3 +352,35 @@ testClosed((opts) => new Writable({ write() {}, ...opts })); r.push(null); r.destroy(); } + +{ + const d = new Duplex({ + final(cb) { }, // Never close writable side for test purpose + read() { + this.push(null); + } + }); + + d.on('end', common.mustCall()); + + finished(d, { readable: true, writable: false }, common.mustCall()); + + d.end(); + d.resume(); +} + +{ + const d = new Duplex({ + final(cb) { }, // Never close writable side for test purpose + read() { + this.push(null); + } + }); + + d.on('end', common.mustCall()); + + d.end(); + finished(d, { readable: true, writable: false }, common.mustCall()); + + d.resume(); +} From bbed1e56cd910e0a4f55290b4fa3dbf9b6157ca7 Mon Sep 17 00:00:00 2001 From: Milad Farazmand Date: Tue, 21 Apr 2020 17:16:53 +0000 Subject: [PATCH 39/93] deps: V8: cherry-pick e1eac1b16c96 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Original commit message: Fix compilation error with devtoolset-8 We are compiling V8 using devtoolset-8 and it is generating a new compilation error related to String Truncation: error: ‘char* strncpy(char*, const char*, size_t)’ output truncated copying between 1 and 15 bytes from a string of length 15 [-Werror=stringop-truncation] strncpy(buffer, unicode_utf8, i); Which basically means the null terminating character was not added to the end of the buffer: https://developers.redhat.com/blog/2018/05/24/detecting-string-truncation-with-gcc-8/ This CL will changes 2 uses of "strncpy" to "memcpy" as strings are being copied partially and `\n` being added at a later stage. 
Change-Id: I3656afb00463d70ddb8700a487a1978b793e1d09 Reviewed-on: https://chromium-review.googlesource.com/c/v8/v8/+/2155038 Reviewed-by: Andreas Haas Reviewed-by: Toon Verwaest Commit-Queue: Milad Farazmand Cr-Commit-Position: refs/heads/master@{#67277} Refs: https://github.com/v8/v8/commit/e1eac1b16c966879102c2310d7649637227eaa02 PR-URL: https://github.com/nodejs/node/pull/32974 Reviewed-By: Richard Lau Reviewed-By: Colin Ihrig Reviewed-By: Ujjwal Sharma --- common.gypi | 2 +- deps/v8/test/cctest/parsing/test-scanner-streams.cc | 8 ++++---- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/common.gypi b/common.gypi index 040d0d1593eb15..476cd2dbdd1606 100644 --- a/common.gypi +++ b/common.gypi @@ -35,7 +35,7 @@ # Reset this number to 0 on major V8 upgrades. # Increment by one for each non-official patch applied to deps/v8. - 'v8_embedder_string': '-node.30', + 'v8_embedder_string': '-node.31', ##### V8 defaults for Node.js ##### diff --git a/deps/v8/test/cctest/parsing/test-scanner-streams.cc b/deps/v8/test/cctest/parsing/test-scanner-streams.cc index 35b7048bb01e3b..28687cef5b4b25 100644 --- a/deps/v8/test/cctest/parsing/test-scanner-streams.cc +++ b/deps/v8/test/cctest/parsing/test-scanner-streams.cc @@ -331,8 +331,8 @@ TEST(Utf8AdvanceUntilOverChunkBoundaries) { for (size_t i = 1; i < len; i++) { // Copy source string into buffer, splitting it at i. // Then add three chunks, 0..i-1, i..strlen-1, empty. - strncpy(buffer, unicode_utf8, i); - strncpy(buffer + i + 1, unicode_utf8 + i, len - i); + memcpy(buffer, unicode_utf8, i); + memcpy(buffer + i + 1, unicode_utf8 + i, len - i); buffer[i] = '\0'; buffer[len + 1] = '\n'; buffer[len + 2] = '\0'; @@ -360,8 +360,8 @@ TEST(Utf8ChunkBoundaries) { for (size_t i = 1; i < len; i++) { // Copy source string into buffer, splitting it at i. // Then add three chunks, 0..i-1, i..strlen-1, empty. - strncpy(buffer, unicode_utf8, i); - strncpy(buffer + i + 1, unicode_utf8 + i, len - i); + memcpy(buffer, unicode_utf8, i); + memcpy(buffer + i + 1, unicode_utf8 + i, len - i); buffer[i] = '\0'; buffer[len + 1] = '\0'; buffer[len + 2] = '\0'; From 180b935b5890b7ef78cc65ab0dfd469654a12344 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Tue, 21 Apr 2020 13:29:43 +0200 Subject: [PATCH 40/93] stream: pipeline should only destroy un-finished streams This PR logically reverts https://github.com/nodejs/node/pull/31940 which has caused lots of unnecessary breakage in the ecosystem. This PR also aligns better with the actual documented behavior: `stream.pipeline()` will call `stream.destroy(err)` on all streams except: * `Readable` streams which have emitted `'end'` or `'close'`. * `Writable` streams which have emitted `'finish'` or `'close'`. The behavior introduced in https://github.com/nodejs/node/pull/31940 was much more aggressive in terms of destroying streams. This was good for avoiding potential resources leaks however breaks some common assumputions in legacy streams. Furthermore, it makes the code simpler and removes some hacks. 
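A small sketch of the documented contract restated above, with illustrative stream names (not taken from the patch): streams that complete on their own are not torn down with `destroy(err)` by `pipeline()`; only streams that are still un-finished when an error occurs are.

```js
'use strict';
const { pipeline, PassThrough, Writable } = require('stream');

const src = new PassThrough();
const dst = new Writable({
  write(chunk, encoding, cb) {
    cb();
  }
});

pipeline(src, dst, (err) => {
  // src emitted 'end' and dst emitted 'finish' before this callback runs,
  // so pipeline() has no un-finished stream left to destroy with an error.
  console.log('pipeline finished', err ? `with ${err.message}` : 'cleanly');
});

src.end('hello');
```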
Fixes: https://github.com/nodejs/node/issues/32954 Fixes: https://github.com/nodejs/node/issues/32955 PR-URL: https://github.com/nodejs/node/pull/32968 Reviewed-By: Matteo Collina Reviewed-By: Mathias Buus --- lib/internal/streams/pipeline.js | 56 +++++----------- test/parallel/test-stream-pipeline.js | 95 ++++++++++++++++++++++++++- 2 files changed, 112 insertions(+), 39 deletions(-) diff --git a/lib/internal/streams/pipeline.js b/lib/internal/streams/pipeline.js index cdd5bcb791f451..cf9d7868916a9e 100644 --- a/lib/internal/streams/pipeline.js +++ b/lib/internal/streams/pipeline.js @@ -25,43 +25,18 @@ let EE; let PassThrough; let createReadableStreamAsyncIterator; -function isIncoming(stream) { - return ( - stream.socket && - typeof stream.complete === 'boolean' && - ArrayIsArray(stream.rawTrailers) && - ArrayIsArray(stream.rawHeaders) - ); -} - -function isOutgoing(stream) { - return ( - stream.socket && - typeof stream.setHeader === 'function' - ); -} - -function destroyer(stream, reading, writing, final, callback) { - const _destroy = once((err) => { - if (!err && (isIncoming(stream) || isOutgoing(stream))) { - // http/1 request objects have a coupling to their response and should - // not be prematurely destroyed. Assume they will handle their own - // lifecycle. - return callback(); - } +function destroyer(stream, reading, writing, callback) { + callback = once(callback); - if (!err && reading && !writing && stream.writable) { - return callback(); - } - - if (err || !final || !stream.readable) { - destroyImpl.destroyer(stream, err); - } - callback(err); + let finished = false; + stream.on('close', () => { + finished = true; }); if (eos === undefined) eos = require('internal/streams/end-of-stream'); eos(stream, { readable: reading, writable: writing }, (err) => { + finished = !err; + const rState = stream._readableState; if ( err && @@ -78,14 +53,19 @@ function destroyer(stream, reading, writing, final, callback) { // eos will only fail with premature close on the reading side for // duplex streams. 
stream - .once('end', _destroy) - .once('error', _destroy); + .once('end', callback) + .once('error', callback); } else { - _destroy(err); + callback(err); } }); - return (err) => _destroy(err || new ERR_STREAM_DESTROYED('pipe')); + return (err) => { + if (finished) return; + finished = true; + destroyImpl.destroyer(stream, err); + callback(err || new ERR_STREAM_DESTROYED('pipe')); + }; } function popCallback(streams) { @@ -204,7 +184,7 @@ function pipeline(...streams) { if (isStream(stream)) { finishCount++; - destroys.push(destroyer(stream, reading, writing, !reading, finish)); + destroys.push(destroyer(stream, reading, writing, finish)); } if (i === 0) { @@ -262,7 +242,7 @@ function pipeline(...streams) { ret = pt; finishCount++; - destroys.push(destroyer(ret, false, true, true, finish)); + destroys.push(destroyer(ret, false, true, finish)); } } else if (isStream(stream)) { if (isReadable(ret)) { diff --git a/test/parallel/test-stream-pipeline.js b/test/parallel/test-stream-pipeline.js index b273fddfa3b613..453ac30b3f4d64 100644 --- a/test/parallel/test-stream-pipeline.js +++ b/test/parallel/test-stream-pipeline.js @@ -13,6 +13,7 @@ const { const assert = require('assert'); const http = require('http'); const { promisify } = require('util'); +const net = require('net'); { let finished = false; @@ -916,7 +917,7 @@ const { promisify } = require('util'); const src = new PassThrough({ autoDestroy: false }); const dst = new PassThrough({ autoDestroy: false }); pipeline(src, dst, common.mustCall(() => { - assert.strictEqual(src.destroyed, true); + assert.strictEqual(src.destroyed, false); assert.strictEqual(dst.destroyed, false); })); src.end(); @@ -1118,3 +1119,95 @@ const { promisify } = require('util'); assert.strictEqual(closed, true); })); } + +{ + const server = net.createServer(common.mustCall((socket) => { + // echo server + pipeline(socket, socket, common.mustCall()); + // 13 force destroys the socket before it has a chance to emit finish + socket.on('finish', common.mustCall(() => { + server.close(); + })); + })).listen(0, common.mustCall(() => { + const socket = net.connect(server.address().port); + socket.end(); + })); +} + +{ + const d = new Duplex({ + autoDestroy: false, + write: common.mustCall((data, enc, cb) => { + d.push(data); + cb(); + }), + read: common.mustCall(() => { + d.push(null); + }), + final: common.mustCall((cb) => { + setTimeout(() => { + assert.strictEqual(d.destroyed, false); + cb(); + }, 1000); + }), + destroy: common.mustNotCall() + }); + + const sink = new Writable({ + write: common.mustCall((data, enc, cb) => { + cb(); + }) + }); + + pipeline(d, sink, common.mustCall()); + + d.write('test'); + d.end(); +} + +{ + const server = net.createServer(common.mustCall((socket) => { + // echo server + pipeline(socket, socket, common.mustCall()); + socket.on('finish', common.mustCall(() => { + server.close(); + })); + })).listen(0, common.mustCall(() => { + const socket = net.connect(server.address().port); + socket.end(); + })); +} + +{ + const d = new Duplex({ + autoDestroy: false, + write: common.mustCall((data, enc, cb) => { + d.push(data); + cb(); + }), + read: common.mustCall(() => { + d.push(null); + }), + final: common.mustCall((cb) => { + setTimeout(() => { + assert.strictEqual(d.destroyed, false); + cb(); + }, 1000); + }), + // `destroy()` won't be invoked by pipeline since + // the writable side has not completed when + // the pipeline has completed. 
+ destroy: common.mustNotCall() + }); + + const sink = new Writable({ + write: common.mustCall((data, enc, cb) => { + cb(); + }) + }); + + pipeline(d, sink, common.mustCall()); + + d.write('test'); + d.end(); +} From 7abc61f668255fc8ac829a71afcbddcf02a1281c Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Sun, 15 Dec 2019 12:19:16 +0100 Subject: [PATCH 41/93] stream: refactor Writable buffering Refactors buffering in Writable to use an array instead of a linked list. PR-URL: https://github.com/nodejs/node/pull/31046 Reviewed-By: Ruben Bridgewater Reviewed-By: Denys Otrishko Reviewed-By: Matteo Collina --- lib/_stream_writable.js | 220 ++++++++++++++++------------------------ 1 file changed, 88 insertions(+), 132 deletions(-) diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 399c27617d17c8..1d02e2ff8e27f6 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -26,7 +26,6 @@ 'use strict'; const { - Array, FunctionPrototype, ObjectDefineProperty, ObjectDefineProperties, @@ -150,8 +149,7 @@ function WritableState(options, stream, isDuplex) { // synchronous _write() completion. this.afterWriteTickInfo = null; - this.bufferedRequest = null; - this.lastBufferedRequest = null; + resetBuffer(this); // Number of pending user-supplied write callbacks // this must be 0 before 'finish' can be emitted @@ -177,27 +175,25 @@ function WritableState(options, stream, isDuplex) { // Indicates whether the stream has finished destroying. this.closed = false; +} - // Count buffered requests - this.bufferedRequestCount = 0; - - // Allocate the first CorkedRequest, there is always - // one allocated and free to use, and we maintain at most two - const corkReq = { next: null, entry: null, finish: undefined }; - corkReq.finish = onCorkedFinish.bind(undefined, corkReq, this); - this.corkedRequestsFree = corkReq; +function resetBuffer(state) { + state.buffered = []; + state.bufferedIndex = 0; + state.allBuffers = true; + state.allNoop = true; } WritableState.prototype.getBuffer = function getBuffer() { - let current = this.bufferedRequest; - const out = []; - while (current) { - out.push(current); - current = current.next; - } - return out; + return this.buffered.slice(this.bufferedIndex); }; +ObjectDefineProperty(WritableState.prototype, 'bufferedRequestCount', { + get() { + return this.buffered.length - this.bufferedIndex; + } +}); + // Test _writableState for inheritance to account for Duplex streams, // whose prototype chain only points to Readable. let realHasInstance; @@ -318,10 +314,7 @@ Writable.prototype.uncork = function() { if (state.corked) { state.corked--; - if (!state.writing && - !state.corked && - !state.bufferProcessing && - state.bufferedRequest) + if (!state.writing) clearBuffer(this, state); } }; @@ -339,7 +332,7 @@ Writable.prototype.setDefaultEncoding = function setDefaultEncoding(encoding) { // If we're already writing something, then just put this // in the queue, and wait our turn. Otherwise, call _write // If we return false, then we need a drain event, so set that flag. -function writeOrBuffer(stream, state, chunk, encoding, cb) { +function writeOrBuffer(stream, state, chunk, encoding, callback) { const len = state.objectMode ? 
1 : chunk.length; state.length += len; @@ -350,22 +343,16 @@ function writeOrBuffer(stream, state, chunk, encoding, cb) { state.needDrain = true; if (state.writing || state.corked || state.errored) { - const last = state.lastBufferedRequest; - state.lastBufferedRequest = { - chunk, - encoding, - callback: cb, - next: null - }; - if (last) { - last.next = state.lastBufferedRequest; - } else { - state.bufferedRequest = state.lastBufferedRequest; + state.buffered.push({ chunk, encoding, callback }); + if (state.allBuffers && encoding !== 'buffer') { + state.allBuffers = false; + } + if (state.allNoop && callback !== nop) { + state.allNoop = false; } - state.bufferedRequestCount += 1; } else { state.writelen = len; - state.writecb = cb; + state.writecb = callback; state.writing = true; state.sync = true; stream._write(chunk, encoding, state.onwrite); @@ -427,30 +414,27 @@ function onwrite(stream, er) { onwriteError(stream, state, er, cb); } } else { - // Check if we're actually ready to finish, but don't emit yet - const finished = needFinish(state) || stream.destroyed; - - if (!finished && - !state.corked && - !state.bufferProcessing && - state.bufferedRequest) { + if (!state.destroyed) { clearBuffer(stream, state); } - - if (sync) { - // It is a common case that the callback passed to .write() is always - // the same. In that case, we do not schedule a new nextTick(), but rather - // just increase a counter, to improve performance and avoid memory - // allocations. - if (state.afterWriteTickInfo !== null && - state.afterWriteTickInfo.cb === cb) { - state.afterWriteTickInfo.count++; + if (state.needDrain || cb !== nop || state.ending || state.destroyed) { + if (sync) { + // It is a common case that the callback passed to .write() is always + // the same. In that case, we do not schedule a new nextTick(), but + // rather just increase a counter, to improve performance and avoid + // memory allocations. + if (state.afterWriteTickInfo !== null && + state.afterWriteTickInfo.cb === cb) { + state.afterWriteTickInfo.count++; + } else { + state.afterWriteTickInfo = { count: 1, cb, stream, state }; + process.nextTick(afterWriteTick, state.afterWriteTickInfo); + } } else { - state.afterWriteTickInfo = { count: 1, cb, stream, state }; - process.nextTick(afterWriteTick, state.afterWriteTickInfo); + afterWrite(stream, state, 1, cb); } } else { - afterWrite(stream, state, 1, cb); + state.pendingcb--; } } } @@ -482,83 +466,69 @@ function afterWrite(stream, state, count, cb) { // If there's something in the buffer waiting, then invoke callbacks. function errorBuffer(state, err) { - if (state.writing || !state.bufferedRequest) { + if (state.writing) { return; } - for (let entry = state.bufferedRequest; entry; entry = entry.next) { - const len = state.objectMode ? 1 : entry.chunk.length; + for (let n = state.bufferedIndex; n < state.buffered.length; ++n) { + const { chunk, callback } = state.buffered[n]; + const len = state.objectMode ? 
1 : chunk.length; state.length -= len; - entry.callback(err); + callback(err); } - state.bufferedRequest = null; - state.lastBufferedRequest = null; - state.bufferedRequestCount = 0; + + resetBuffer(state); } // If there's something in the buffer waiting, then process it function clearBuffer(stream, state) { + if (state.corked || state.bufferProcessing) { + return; + } + + const { buffered, bufferedIndex, objectMode } = state; + const bufferedLength = buffered.length - bufferedIndex; + + if (!bufferedLength) { + return; + } + + let i = bufferedIndex; + state.bufferProcessing = true; - let entry = state.bufferedRequest; - - if (stream._writev && entry && entry.next) { - // Fast case, write everything using _writev() - const l = state.bufferedRequestCount; - const buffer = new Array(l); - const holder = state.corkedRequestsFree; - holder.entry = entry; - - let count = 0; - let allBuffers = true; - while (entry) { - buffer[count] = entry; - if (entry.encoding !== 'buffer') - allBuffers = false; - entry = entry.next; - count += 1; - } - buffer.allBuffers = allBuffers; + if (bufferedLength > 1 && stream._writev) { + state.pendingcb -= bufferedLength - 1; + + const callback = state.allNoop ? nop : (err) => { + for (let n = i; n < buffered.length; ++n) { + buffered[n].callback(err); + } + }; + // Make a copy of `buffered` if it's going to be used by `callback` above, + // since `doWrite` will mutate the array. + const chunks = state.allNoop && i === 0 ? buffered : buffered.slice(i); + chunks.allBuffers = state.allBuffers; - doWrite(stream, state, true, state.length, buffer, '', holder.finish); + doWrite(stream, state, true, state.length, chunks, '', callback); - // doWrite is almost always async, defer these to save a bit of time - // as the hot path ends with doWrite - state.pendingcb++; - state.lastBufferedRequest = null; - if (holder.next) { - state.corkedRequestsFree = holder.next; - holder.next = null; - } else { - const corkReq = { next: null, entry: null, finish: undefined }; - corkReq.finish = onCorkedFinish.bind(undefined, corkReq, state); - state.corkedRequestsFree = corkReq; - } - state.bufferedRequestCount = 0; + resetBuffer(state); } else { - // Slow case, write chunks one-by-one - while (entry) { - const chunk = entry.chunk; - const encoding = entry.encoding; - const cb = entry.callback; - const len = state.objectMode ? 1 : chunk.length; - - doWrite(stream, state, false, len, chunk, encoding, cb); - entry = entry.next; - state.bufferedRequestCount--; - // If we didn't call the onwrite immediately, then - // it means that we need to wait until it does. - // also, that means that the chunk and cb are currently - // being processed, so move the buffer counter past them. - if (state.writing) { - break; - } + do { + const { chunk, encoding, callback } = buffered[i]; + buffered[i++] = null; + const len = objectMode ? 
1 : chunk.length; + doWrite(stream, state, false, len, chunk, encoding, callback); + } while (i < buffered.length && !state.writing); + + if (i === buffered.length) { + resetBuffer(state); + } else if (i > 256) { + buffered.splice(0, i); + state.bufferedIndex = 0; + } else { + state.bufferedIndex = i; } - - if (entry === null) - state.lastBufferedRequest = null; } - - state.bufferedRequest = entry; state.bufferProcessing = false; } @@ -622,7 +592,7 @@ function needFinish(state) { return (state.ending && state.length === 0 && !state.errored && - state.bufferedRequest === null && + state.buffered.length === 0 && !state.finished && !state.writing); } @@ -693,20 +663,6 @@ function finish(stream, state) { } } -function onCorkedFinish(corkReq, state, err) { - let entry = corkReq.entry; - corkReq.entry = null; - while (entry) { - const cb = entry.callback; - state.pendingcb--; - cb(err); - entry = entry.next; - } - - // Reuse the free corkReq. - state.corkedRequestsFree.next = corkReq; -} - // TODO(ronag): Avoid using events to implement internal logic. function onFinished(stream, state, cb) { function onerror(err) { From 1a2b3eb3a4744c2a802c3b9b7e0752e63b47099b Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Thu, 23 Apr 2020 21:36:32 +0200 Subject: [PATCH 42/93] stream: fix broken pipeline test An unfortunate overlap between two PR that by themselves pass CI but together pass a test. https://github.com/nodejs/node/pull/32967 changes so that pipeline does not wait for 'close'. https://github.com/nodejs/node/pull/32968 changed so that all streams are not destroyed. Which made one test fail when expected the stream to be destroyed during pipeline callback. PR-URL: https://github.com/nodejs/node/pull/33030 Reviewed-By: James M Snell Reviewed-By: Ruben Bridgewater --- test/parallel/test-stream-pipeline.js | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/test/parallel/test-stream-pipeline.js b/test/parallel/test-stream-pipeline.js index 453ac30b3f4d64..203a32b5cd74a1 100644 --- a/test/parallel/test-stream-pipeline.js +++ b/test/parallel/test-stream-pipeline.js @@ -982,7 +982,7 @@ const net = require('net'); dst.readable = false; pipeline(src, dst, common.mustCall((err) => { assert(!err); - assert.strictEqual(dst.destroyed, true); + assert.strictEqual(dst.destroyed, false); })); src.end(); } From b4ef06267d97eca0bfd734cf3be0ae31267b1716 Mon Sep 17 00:00:00 2001 From: Daniel Estiven Rico Posada Date: Thu, 2 Apr 2020 21:54:56 -0500 Subject: [PATCH 43/93] test: test-async-wrap-constructor prefer forEach MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32631 Reviewed-By: Juan José Arboleda Reviewed-By: Anna Henningsen --- test/parallel/test-async-wrap-constructor.js | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/test/parallel/test-async-wrap-constructor.js b/test/parallel/test-async-wrap-constructor.js index 8e96e9ce3021ef..e89bc49df02333 100644 --- a/test/parallel/test-async-wrap-constructor.js +++ b/test/parallel/test-async-wrap-constructor.js @@ -6,14 +6,15 @@ require('../common'); const assert = require('assert'); const async_hooks = require('async_hooks'); -for (const badArg of [0, 1, false, true, null, 'hello']) { +[0, 1, false, true, null, 'hello'].forEach((badArg) => { const hookNames = ['init', 'before', 'after', 'destroy', 'promiseResolve']; - for (const field of hookNames) { + hookNames.forEach((field) => { assert.throws(() => { async_hooks.createHook({ [field]: 
badArg }); }, { code: 'ERR_ASYNC_CALLBACK', name: 'TypeError', + message: `hook.${field} must be a function` }); - } -} + }); +}); From 71f90234f906e16bac0b1c423684a0c61c2598dd Mon Sep 17 00:00:00 2001 From: Nick Schonning Date: Sun, 5 Apr 2020 23:26:24 -0400 Subject: [PATCH 44/93] doc: add angle brackets around implicit links PR-URL: https://github.com/nodejs/node/pull/32676 Reviewed-By: Rich Trott --- BUILDING.md | 4 ++-- doc/api/deprecations.md | 2 +- doc/api/errors.md | 2 +- doc/api/os.md | 6 +++--- doc/guides/cve-management-process.md | 4 ++-- doc/guides/maintaining-openssl.md | 4 ++-- doc/guides/offboarding.md | 2 +- doc/guides/releases.md | 2 +- doc/guides/security-release-process.md | 4 ++-- doc/guides/using-symbols.md | 2 +- doc/guides/writing-and-running-benchmarks.md | 4 ++-- doc/guides/writing-tests.md | 2 +- glossary.md | 2 +- onboarding.md | 4 ++-- tools/icu/README.md | 2 +- 15 files changed, 23 insertions(+), 23 deletions(-) diff --git a/BUILDING.md b/BUILDING.md index a4a00a9f1ffe49..a7dc040f5f385e 100644 --- a/BUILDING.md +++ b/BUILDING.md @@ -204,7 +204,7 @@ For use of AVX2, * nasm version 2.10 or higher in Windows Please refer to - https://www.openssl.org/docs/man1.1.1/man3/OPENSSL_ia32cap.html for details. + for details. If compiling without one of the above, use `configure` with the `--openssl-no-asm` flag. Otherwise, `configure` will fail. @@ -277,7 +277,7 @@ $ make -j4 If you run into a `No module named 'distutils.spawn'` error when executing `./configure`, please try `python3 -m pip install --upgrade setuptools` or `sudo apt install python3-distutils -y`. -For more information, see https://github.com/nodejs/node/issues/30189. +For more information, see . The `-j4` option will cause `make` to run 4 simultaneous compilation jobs which may reduce build time. For more information, see the diff --git a/doc/api/deprecations.md b/doc/api/deprecations.md index 9bdb7c429eb5fa..9833e0c421a18c 100644 --- a/doc/api/deprecations.md +++ b/doc/api/deprecations.md @@ -2352,7 +2352,7 @@ The undocumented `net._setSimultaneousAccepts()` function was originally intended for debugging and performance tuning when using the `child_process` and `cluster` modules on Windows. The function is not generally useful and is being removed. See discussion here: -https://github.com/nodejs/node/issues/18391 + ### DEP0122: `tls` `Server.prototype.setOptions()` diff --git a/doc/api/errors.md b/doc/api/errors.md index 5308490b88fe8c..480bbdfbd23e42 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -1194,7 +1194,7 @@ is set for the `Http2Stream`. ### `ERR_INTERNAL_ASSERTION` There was a bug in Node.js or incorrect usage of Node.js internals. -To fix the error, open an issue at https://github.com/nodejs/node/issues. +To fix the error, open an issue at . ### `ERR_INCOMPATIBLE_OPTION_PAIR` diff --git a/doc/api/os.md b/doc/api/os.md index 27a7c3d549f78b..e2cce1f8a5580f 100644 --- a/doc/api/os.md +++ b/doc/api/os.md @@ -290,7 +290,7 @@ Returns the operating system as a string. On POSIX systems, the operating system release is determined by calling [uname(3)][]. On Windows, `GetVersionExW()` is used. See -https://en.wikipedia.org/wiki/Uname#Examples for more information. + for more information. ## `os.setPriority([pid, ]priority)` + diff --git a/doc/changelogs/CHANGELOG_IOJS.md b/doc/changelogs/CHANGELOG_IOJS.md index e736a70dedfef2..0a650253182b6d 100644 --- a/doc/changelogs/CHANGELOG_IOJS.md +++ b/doc/changelogs/CHANGELOG_IOJS.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V010.md b/doc/changelogs/CHANGELOG_V010.md index 65b6bef55bbd20..d4b579e6b5b88b 100644 --- a/doc/changelogs/CHANGELOG_V010.md +++ b/doc/changelogs/CHANGELOG_V010.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V012.md b/doc/changelogs/CHANGELOG_V012.md index 8a5ebf3e429fa6..d0485ffc58f652 100644 --- a/doc/changelogs/CHANGELOG_V012.md +++ b/doc/changelogs/CHANGELOG_V012.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V10.md b/doc/changelogs/CHANGELOG_V10.md index e347b51619e6d8..8a3baae0a53c57 100644 --- a/doc/changelogs/CHANGELOG_V10.md +++ b/doc/changelogs/CHANGELOG_V10.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V11.md b/doc/changelogs/CHANGELOG_V11.md index 960276df5e0293..be1cd8102e74a5 100644 --- a/doc/changelogs/CHANGELOG_V11.md +++ b/doc/changelogs/CHANGELOG_V11.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V12.md b/doc/changelogs/CHANGELOG_V12.md index 28e0c6ec306316..e3a457c39f18c6 100644 --- a/doc/changelogs/CHANGELOG_V12.md +++ b/doc/changelogs/CHANGELOG_V12.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V13.md b/doc/changelogs/CHANGELOG_V13.md index 5790e30d1d0a1b..a7853de9d4576b 100644 --- a/doc/changelogs/CHANGELOG_V13.md +++ b/doc/changelogs/CHANGELOG_V13.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V4.md b/doc/changelogs/CHANGELOG_V4.md index 2f5c81319025c7..69f36cf2cea9d9 100644 --- a/doc/changelogs/CHANGELOG_V4.md +++ b/doc/changelogs/CHANGELOG_V4.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V5.md b/doc/changelogs/CHANGELOG_V5.md index ac338886c858c6..c9cd28d692bfaf 100644 --- a/doc/changelogs/CHANGELOG_V5.md +++ b/doc/changelogs/CHANGELOG_V5.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V6.md b/doc/changelogs/CHANGELOG_V6.md index 763ff802884781..20c9370ff0d5ab 100644 --- a/doc/changelogs/CHANGELOG_V6.md +++ b/doc/changelogs/CHANGELOG_V6.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V7.md b/doc/changelogs/CHANGELOG_V7.md index 5a06890f58685b..a64b973c8c2192 100644 --- a/doc/changelogs/CHANGELOG_V7.md +++ b/doc/changelogs/CHANGELOG_V7.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V8.md b/doc/changelogs/CHANGELOG_V8.md index 33fa32948389b6..41e38efcf57206 100644 --- a/doc/changelogs/CHANGELOG_V8.md +++ b/doc/changelogs/CHANGELOG_V8.md @@ -2,6 +2,7 @@ +
diff --git a/doc/changelogs/CHANGELOG_V9.md b/doc/changelogs/CHANGELOG_V9.md index f58341bb34b52d..e4cdf50a9381b6 100644 --- a/doc/changelogs/CHANGELOG_V9.md +++ b/doc/changelogs/CHANGELOG_V9.md @@ -2,6 +2,7 @@ +
From f92b398c76d8273aa8e74358ca7b8ba70ec55cd4 Mon Sep 17 00:00:00 2001 From: Nick Schonning Date: Wed, 22 Apr 2020 22:46:00 -0400 Subject: [PATCH 46/93] doc: convert bare email addresses to mailto links reflowed for line length after increased url size PR-URL: https://github.com/nodejs/node/pull/32676 Reviewed-By: Rich Trott --- doc/guides/cve-management-process.md | 12 ++++++++---- 1 file changed, 8 insertions(+), 4 deletions(-) diff --git a/doc/guides/cve-management-process.md b/doc/guides/cve-management-process.md index 36c2b3a6ca38f0..eba94484b53877 100644 --- a/doc/guides/cve-management-process.md +++ b/doc/guides/cve-management-process.md @@ -18,17 +18,17 @@ of contact points. Email aliases have been setup for these as follows: * **Public contact points**. Email address to which people will be directed by Mitre when they are asked for a way to contact the Node.js team about - CVE-related issues. **cve-request@iojs.org** + CVE-related issues. **[cve-request@iojs.org][]** * **Private contact points**. Administrative contacts that Mitre can reach out to directly in case there are issues that require immediate attention. - **cve-mitre-contact@iojs.org** + **[cve-mitre-contact@iojs.org][]** * **Email addresses to add to the CNA email discussion list**. This address has been added to a closed mailing list that is used for announcements, sharing documents, or discussion relevant to the CNA community. The list rarely has more than ten messages a week. - **cna-discussion-list@iojs.org** + **[cna-discussion-list@iojs.org][]** ## CNA management processes @@ -72,7 +72,7 @@ of CVEs should then be requested using the steps listed above. ### External CVE request process -When a request for a CVE is received via the cve-request@iojs.org +When a request for a CVE is received via the [cve-request@iojs.org][] email alias the following process will be followed (likely updated after we get HackerOne up and running). @@ -135,3 +135,7 @@ following steps are used to assign, announce and report a CVE. * Move the CVE from the Pending section to the Announced section along with a link to the Node.js blog post announcing that releases are available. + +[cve-request@iojs.org]: mailto:cve-request@iojs.org +[cve-mitre-contact@iojs.org]: mailto:cve-mitre-contact@iojs.org +[cna-discussion-list@iojs.org]: mailto:cna-discussion-list@iojs.org From bf331b4e21f3a76c99e65c45668747aa7a3a0787 Mon Sep 17 00:00:00 2001 From: Nick Schonning Date: Wed, 22 Apr 2020 22:50:49 -0400 Subject: [PATCH 47/93] doc: ignore no-literal-urls in README Membership lists are currently formatted in a specific way for tooling PR-URL: https://github.com/nodejs/node/pull/32676 Reviewed-By: Rich Trott --- README.md | 1 + 1 file changed, 1 insertion(+) diff --git a/README.md b/README.md index 0048be118c988c..0542c5c7b26a11 100644 --- a/README.md +++ b/README.md @@ -1,3 +1,4 @@ +

Date: Wed, 22 Apr 2020 14:27:49 +0800 Subject: [PATCH 48/93] lib: simplify function process.emitWarning MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32992 Reviewed-By: Michaël Zasso Reviewed-By: Colin Ihrig Reviewed-By: Andrey Pechkurov Reviewed-By: Ruben Bridgewater Reviewed-By: Juan José Arboleda Reviewed-By: James M Snell --- lib/internal/process/warning.js | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/lib/internal/process/warning.js b/lib/internal/process/warning.js index 383bbd7e0fe79f..ebf4c932fa9c57 100644 --- a/lib/internal/process/warning.js +++ b/lib/internal/process/warning.js @@ -55,7 +55,7 @@ function writeToFile(message) { } function doEmitWarning(warning) { - return () => process.emit('warning', warning); + process.emit('warning', warning); } let traceWarningHelperShown = false; @@ -129,7 +129,7 @@ function emitWarning(warning, type, code, ctor) { if (process.throwDeprecation) throw warning; } - process.nextTick(doEmitWarning(warning)); + process.nextTick(doEmitWarning, warning); } function emitWarningSync(warning) { From 30c2b0f79895a269dc62d769764c4ad0b11b9b1a Mon Sep 17 00:00:00 2001 From: Anna Henningsen Date: Wed, 15 Apr 2020 01:54:43 +0200 Subject: [PATCH 49/93] src: deprecate embedder APIs with replacements MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Implement a number of TODO comments aiming at the eventual removal of some embedder APIs that now have replacements available. PR-URL: https://github.com/nodejs/node/pull/32858 Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Michael Dawson Reviewed-By: Juan José Arboleda Reviewed-By: Franziska Hinkelmann --- src/node.h | 40 ++++++++++++++++++++-------------------- 1 file changed, 20 insertions(+), 20 deletions(-) diff --git a/src/node.h b/src/node.h index 16d30e89d2a8b7..7fbbdb7ea23601 100644 --- a/src/node.h +++ b/src/node.h @@ -227,15 +227,14 @@ NODE_EXTERN int Start(int argc, char* argv[]); // in the loop and / or actively executing JavaScript code). NODE_EXTERN int Stop(Environment* env); -// TODO(addaleax): Officially deprecate this and replace it with something -// better suited for a public embedder API. // It is recommended to use InitializeNodeWithArgs() instead as an embedder. // Init() calls InitializeNodeWithArgs() and exits the process with the exit // code returned from it. -NODE_EXTERN void Init(int* argc, - const char** argv, - int* exec_argc, - const char*** exec_argv); +NODE_DEPRECATED("Use InitializeNodeWithArgs() instead", + NODE_EXTERN void Init(int* argc, + const char** argv, + int* exec_argc, + const char*** exec_argv)); // Set up per-process state needed to run Node.js. This will consume arguments // from argv, fill exec_argv, and possibly add errors resulting from parsing // the arguments to `errors`. The return value is a suggested exit code for the @@ -428,12 +427,13 @@ struct InspectorParentHandle { // Returns nullptr when the Environment cannot be created e.g. there are // pending JavaScript exceptions. // It is recommended to use the second variant taking a flags argument. 
-NODE_EXTERN Environment* CreateEnvironment(IsolateData* isolate_data, - v8::Local context, - int argc, - const char* const* argv, - int exec_argc, - const char* const* exec_argv); +NODE_DEPRECATED("Use overload taking a flags argument", + NODE_EXTERN Environment* CreateEnvironment(IsolateData* isolate_data, + v8::Local context, + int argc, + const char* const* argv, + int exec_argc, + const char* const* exec_argv)); NODE_EXTERN Environment* CreateEnvironment( IsolateData* isolate_data, v8::Local context, @@ -463,8 +463,8 @@ struct StartExecutionCallbackInfo { using StartExecutionCallback = std::function(const StartExecutionCallbackInfo&)>; -// TODO(addaleax): Deprecate this in favour of the MaybeLocal<> overload. -NODE_EXTERN void LoadEnvironment(Environment* env); +NODE_DEPRECATED("Use variants returning MaybeLocal<> instead", + NODE_EXTERN void LoadEnvironment(Environment* env)); // The `InspectorParentHandle` arguments here are ignored and not used. // For passing `InspectorParentHandle`, use `CreateEnvironment()`. NODE_EXTERN v8::MaybeLocal LoadEnvironment( @@ -495,18 +495,18 @@ NODE_EXTERN Environment* GetCurrentEnvironment(v8::Local context); // This returns the MultiIsolatePlatform used in the main thread of Node.js. // If NODE_USE_V8_PLATFORM has not been defined when Node.js was built, // it returns nullptr. -// TODO(addaleax): Deprecate in favour of GetMultiIsolatePlatform(). -NODE_EXTERN MultiIsolatePlatform* GetMainThreadMultiIsolatePlatform(); +NODE_DEPRECATED("Use GetMultiIsolatePlatform(env) instead", + NODE_EXTERN MultiIsolatePlatform* GetMainThreadMultiIsolatePlatform()); // This returns the MultiIsolatePlatform used for an Environment or IsolateData // instance, if one exists. NODE_EXTERN MultiIsolatePlatform* GetMultiIsolatePlatform(Environment* env); NODE_EXTERN MultiIsolatePlatform* GetMultiIsolatePlatform(IsolateData* env); // Legacy variants of MultiIsolatePlatform::Create(). -// TODO(addaleax): Deprecate in favour of the v8::TracingController variant. 
-NODE_EXTERN MultiIsolatePlatform* CreatePlatform( - int thread_pool_size, - node::tracing::TracingController* tracing_controller); +NODE_DEPRECATED("Use variant taking a v8::TracingController* pointer instead", + NODE_EXTERN MultiIsolatePlatform* CreatePlatform( + int thread_pool_size, + node::tracing::TracingController* tracing_controller)); NODE_EXTERN MultiIsolatePlatform* CreatePlatform( int thread_pool_size, v8::TracingController* tracing_controller); From 45e188b2e3d64bdc8e96396e27db73209f116ae1 Mon Sep 17 00:00:00 2001 From: Adrian Estrada Date: Sat, 11 Apr 2020 13:10:10 -0500 Subject: [PATCH 50/93] test: refactor events tests for invalid listeners MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32769 Reviewed-By: Anna Henningsen Reviewed-By: Gireesh Punathil Reviewed-By: Gerhard Stöbich Reviewed-By: Richard Lau --- .../test-event-emitter-add-listeners.js | 11 ---------- .../test-event-emitter-invalid-listener.js | 20 +++++++++++++++++++ test/parallel/test-event-emitter-once.js | 11 ---------- test/parallel/test-event-emitter-prepend.js | 11 ---------- .../test-event-emitter-remove-listeners.js | 11 ---------- 5 files changed, 20 insertions(+), 44 deletions(-) create mode 100644 test/parallel/test-event-emitter-invalid-listener.js diff --git a/test/parallel/test-event-emitter-add-listeners.js b/test/parallel/test-event-emitter-add-listeners.js index 4b0305c176bd14..f42d1f24878483 100644 --- a/test/parallel/test-event-emitter-add-listeners.js +++ b/test/parallel/test-event-emitter-add-listeners.js @@ -84,14 +84,3 @@ const EventEmitter = require('events'); // listeners were added. assert.deepStrictEqual(ee.listeners('hello'), [listen2, listen1]); } - -// Verify that the listener must be a function -assert.throws(() => { - const ee = new EventEmitter(); - ee.on('foo', null); -}, { - code: 'ERR_INVALID_ARG_TYPE', - name: 'TypeError', - message: 'The "listener" argument must be of type function. ' + - 'Received null' -}); diff --git a/test/parallel/test-event-emitter-invalid-listener.js b/test/parallel/test-event-emitter-invalid-listener.js new file mode 100644 index 00000000000000..3070d4ed7517de --- /dev/null +++ b/test/parallel/test-event-emitter-invalid-listener.js @@ -0,0 +1,20 @@ +'use strict'; + +require('../common'); +const assert = require('assert'); +const EventEmitter = require('events'); + +const eventsMethods = ['on', 'once', 'removeListener', 'prependOnceListener']; + +// Verify that the listener must be a function for events methods +for (const method of eventsMethods) { + assert.throws(() => { + const ee = new EventEmitter(); + ee[method]('foo', null); + }, { + code: 'ERR_INVALID_ARG_TYPE', + name: 'TypeError', + message: 'The "listener" argument must be of type function. ' + + 'Received null' + }, `event.${method}('foo', null) should throw the proper error`); +} diff --git a/test/parallel/test-event-emitter-once.js b/test/parallel/test-event-emitter-once.js index 1ad2de1da556bf..983f6141b9dc4c 100644 --- a/test/parallel/test-event-emitter-once.js +++ b/test/parallel/test-event-emitter-once.js @@ -49,17 +49,6 @@ e.once('e', common.mustCall()); e.emit('e'); -// Verify that the listener must be a function -assert.throws(() => { - const ee = new EventEmitter(); - ee.once('foo', null); -}, { - code: 'ERR_INVALID_ARG_TYPE', - name: 'TypeError', - message: 'The "listener" argument must be of type function. 
' + - 'Received null' -}); - { // once() has different code paths based on the number of arguments being // emitted. Verify that all of the cases are covered. diff --git a/test/parallel/test-event-emitter-prepend.js b/test/parallel/test-event-emitter-prepend.js index c65b6b8f780f5c..ffe8544911365c 100644 --- a/test/parallel/test-event-emitter-prepend.js +++ b/test/parallel/test-event-emitter-prepend.js @@ -18,17 +18,6 @@ myEE.prependOnceListener('foo', myEE.emit('foo'); -// Verify that the listener must be a function -assert.throws(() => { - const ee = new EventEmitter(); - ee.prependOnceListener('foo', null); -}, { - code: 'ERR_INVALID_ARG_TYPE', - name: 'TypeError', - message: 'The "listener" argument must be of type function. ' + - 'Received null' -}); - // Test fallback if prependListener is undefined. const stream = require('stream'); diff --git a/test/parallel/test-event-emitter-remove-listeners.js b/test/parallel/test-event-emitter-remove-listeners.js index 91e1f071046ac1..f37d26eb258c23 100644 --- a/test/parallel/test-event-emitter-remove-listeners.js +++ b/test/parallel/test-event-emitter-remove-listeners.js @@ -144,17 +144,6 @@ function listener2() {} assert.deepStrictEqual(ee, ee.removeListener('foo', () => {})); } -// Verify that the removed listener must be a function -assert.throws(() => { - const ee = new EventEmitter(); - ee.removeListener('foo', null); -}, { - code: 'ERR_INVALID_ARG_TYPE', - name: 'TypeError', - message: 'The "listener" argument must be of type function. ' + - 'Received null' -}); - { const ee = new EventEmitter(); const listener = () => {}; From 7e9f88e0053010c47923656d55ff59d09baddea9 Mon Sep 17 00:00:00 2001 From: Eileen Date: Sat, 11 Apr 2020 18:11:44 -0700 Subject: [PATCH 51/93] doc: updated directory entry information Fixes: https://github.com/nodejs/node/issues/25595 subdirectory updated def PR-URL: https://github.com/nodejs/node/pull/32791 Reviewed-By: Anna Henningsen --- doc/api/fs.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/doc/api/fs.md b/doc/api/fs.md index fa87f017d8cee2..a899686cae16a1 100644 --- a/doc/api/fs.md +++ b/doc/api/fs.md @@ -430,8 +430,9 @@ included in the iteration results. added: v10.10.0 --> -A representation of a directory entry, as returned by reading from an -[`fs.Dir`][]. +A representation of a directory entry, which can be a file or a subdirectory +within the directory, as returned by reading from an [`fs.Dir`][]. The +directory entry is a combination of the file name and file type pairs. 
Additionally, when [`fs.readdir()`][] or [`fs.readdirSync()`][] is called with the `withFileTypes` option set to `true`, the resulting array is filled with From 988c2fe287031261576aa8a11aba4e5d30459bae Mon Sep 17 00:00:00 2001 From: Adrian Estrada Date: Fri, 10 Apr 2020 15:04:27 -0500 Subject: [PATCH 52/93] test: better error validations for event-capture MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/32771 Reviewed-By: Anna Henningsen Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Juan José Arboleda --- test/parallel/test-event-capture-rejections.js | 12 ++++++++++-- 1 file changed, 10 insertions(+), 2 deletions(-) diff --git a/test/parallel/test-event-capture-rejections.js b/test/parallel/test-event-capture-rejections.js index dbe5deeefade99..233b6b35d55072 100644 --- a/test/parallel/test-event-capture-rejections.js +++ b/test/parallel/test-event-capture-rejections.js @@ -278,14 +278,22 @@ function resetCaptureOnThrowInError() { function argValidation() { function testType(obj) { + const received = obj.constructor.name !== 'Number' ? + `an instance of ${obj.constructor.name}` : + `type number (${obj})`; + assert.throws(() => new EventEmitter({ captureRejections: obj }), { code: 'ERR_INVALID_ARG_TYPE', - name: 'TypeError' + name: 'TypeError', + message: 'The "options.captureRejections" property must be of type ' + + `boolean. Received ${received}` }); assert.throws(() => EventEmitter.captureRejections = obj, { code: 'ERR_INVALID_ARG_TYPE', - name: 'TypeError' + name: 'TypeError', + message: 'The "EventEmitter.captureRejections" property must be of ' + + `type boolean. Received ${received}` }); } From 4c10b5f3787ad473c6c4203dfd7e4dd0637346b6 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Sun, 19 Apr 2020 23:37:00 +0200 Subject: [PATCH 53/93] stream: consistent punctuation MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Cleanup comments to use consistent punctuation. PR-URL: https://github.com/nodejs/node/pull/32934 Reviewed-By: Luigi Pinca Reviewed-By: Benjamin Gruenbaum Reviewed-By: Gerhard Stöbich Reviewed-By: Trivikram Kamat Reviewed-By: Ruben Bridgewater Reviewed-By: Juan José Arboleda Reviewed-By: Anna Henningsen --- lib/_stream_duplex.js | 2 +- lib/_stream_readable.js | 62 ++++++++++++++++++++--------------------- lib/_stream_writable.js | 32 ++++++++++----------- 3 files changed, 48 insertions(+), 48 deletions(-) diff --git a/lib/_stream_duplex.js b/lib/_stream_duplex.js index fe2281df471dc7..07a16175ffcd73 100644 --- a/lib/_stream_duplex.js +++ b/lib/_stream_duplex.js @@ -83,7 +83,7 @@ ObjectDefineProperties(Duplex.prototype, { }, set(value) { // Backward compatibility, the user is explicitly - // managing destroyed + // managing destroyed. if (this._readableState && this._writableState) { this._readableState.destroyed = value; this._writableState.destroyed = value; diff --git a/lib/_stream_readable.js b/lib/_stream_readable.js index ef847c110edc23..9c1441dd9ce405 100644 --- a/lib/_stream_readable.js +++ b/lib/_stream_readable.js @@ -94,7 +94,7 @@ function ReadableState(options, stream, isDuplex) { isDuplex = stream instanceof Stream.Duplex; // Object stream flag. Used to make read(n) ignore n and to - // make all the buffer merging and length checks go away + // make all the buffer merging and length checks go away. 
this.objectMode = !!(options && options.objectMode); if (isDuplex) @@ -109,7 +109,7 @@ function ReadableState(options, stream, isDuplex) { // A linked list is used to store data chunks instead of an array because the // linked list can remove elements from the beginning faster than - // array.shift() + // array.shift(). this.buffer = new BufferList(); this.length = 0; this.pipes = []; @@ -132,16 +132,16 @@ function ReadableState(options, stream, isDuplex) { this.resumeScheduled = false; this[kPaused] = null; - // True if the error was already emitted and should not be thrown again + // True if the error was already emitted and should not be thrown again. this.errorEmitted = false; // Should close be emitted on destroy. Defaults to true. this.emitClose = !options || options.emitClose !== false; - // Should .destroy() be called after 'end' (and potentially 'finish') + // Should .destroy() be called after 'end' (and potentially 'finish'). this.autoDestroy = !options || options.autoDestroy !== false; - // Has it been destroyed + // Has it been destroyed. this.destroyed = false; // Indicates whether the stream has errored. @@ -156,11 +156,11 @@ function ReadableState(options, stream, isDuplex) { this.defaultEncoding = (options && options.defaultEncoding) || 'utf8'; // Ref the piped dest which we need a drain event on it - // type: null | Writable | Set + // type: null | Writable | Set. this.awaitDrainWriters = null; this.multiAwaitDrain = false; - // If true, a maybeReadMore has been scheduled + // If true, a maybeReadMore has been scheduled. this.readingMore = false; this.decoder = null; @@ -179,7 +179,7 @@ function Readable(options) { return new Readable(options); // Checking for a Stream.Duplex instance is faster here instead of inside - // the ReadableState constructor, at least with V8 6.5 + // the ReadableState constructor, at least with V8 6.5. const isDuplex = this instanceof Stream.Duplex; this._readableState = new ReadableState(options, this, isDuplex); @@ -213,7 +213,7 @@ Readable.prototype.push = function(chunk, encoding) { return readableAddChunk(this, chunk, encoding, false); }; -// Unshift should *always* be something directly out of read() +// Unshift should *always* be something directly out of read(). Readable.prototype.unshift = function(chunk, encoding) { return readableAddChunk(this, chunk, encoding, true); }; @@ -228,7 +228,7 @@ function readableAddChunk(stream, chunk, encoding, addToFront) { encoding = encoding || state.defaultEncoding; if (addToFront && state.encoding && state.encoding !== encoding) { // When unshifting, if state.encoding is set, we have to save - // the string in the BufferList with the state encoding + // the string in the BufferList with the state encoding. chunk = Buffer.from(chunk, encoding).toString(state.encoding); } else if (encoding !== state.encoding) { chunk = Buffer.from(chunk, encoding); @@ -319,7 +319,7 @@ Readable.prototype.setEncoding = function(enc) { StringDecoder = require('string_decoder').StringDecoder; const decoder = new StringDecoder(enc); this._readableState.decoder = decoder; - // If setEncoding(null), decoder.encoding equals utf8 + // If setEncoding(null), decoder.encoding equals utf8. this._readableState.encoding = this._readableState.decoder.encoding; const buffer = this._readableState.buffer; @@ -335,7 +335,7 @@ Readable.prototype.setEncoding = function(enc) { return this; }; -// Don't raise the hwm > 1GB +// Don't raise the hwm > 1GB. 
const MAX_HWM = 0x40000000; function computeNewHighWaterMark(n) { if (n >= MAX_HWM) { @@ -343,7 +343,7 @@ function computeNewHighWaterMark(n) { n = MAX_HWM; } else { // Get the next highest power of 2 to prevent increasing hwm excessively in - // tiny amounts + // tiny amounts. n--; n |= n >>> 1; n |= n >>> 2; @@ -363,7 +363,7 @@ function howMuchToRead(n, state) { if (state.objectMode) return 1; if (NumberIsNaN(n)) { - // Only flow one buffer at a time + // Only flow one buffer at a time. if (state.flowing && state.length) return state.buffer.first().length; else @@ -446,7 +446,7 @@ Readable.prototype.read = function(n) { let doRead = state.needReadable; debug('need readable', doRead); - // If we currently have less than the highWaterMark, then also read some + // If we currently have less than the highWaterMark, then also read some. if (state.length === 0 || state.length - n < state.highWaterMark) { doRead = true; debug('length less than watermark', doRead); @@ -524,7 +524,7 @@ function onEofChunk(stream, state) { if (state.sync) { // If we are sync, wait until next tick to emit the data. // Otherwise we risk emitting data in the flow() - // the readable code triggers during a read() call + // the readable code triggers during a read() call. emitReadable(stream); } else { // Emit 'readable' now to make sure it gets picked up. @@ -558,7 +558,7 @@ function emitReadable_(stream) { state.emittedReadable = false; } - // The stream needs another readable event if + // The stream needs another readable event if: // 1. It is not flowing, as the flow mechanism will take // care of it. // 2. It is not ended. @@ -677,7 +677,7 @@ Readable.prototype.pipe = function(dest, pipeOpts) { let cleanedUp = false; function cleanup() { debug('cleanup'); - // Cleanup event handlers once the pipe is broken + // Cleanup event handlers once the pipe is broken. dest.removeListener('close', onclose); dest.removeListener('finish', onfinish); if (ondrain) { @@ -771,7 +771,7 @@ Readable.prototype.pipe = function(dest, pipeOpts) { src.unpipe(dest); } - // Tell the dest that it's being piped to + // Tell the dest that it's being piped to. dest.emit('pipe', src); // Start the flow if it hasn't been started already. @@ -841,7 +841,7 @@ Readable.prototype.unpipe = function(dest) { }; // Set up data events if they are asked for -// Ensure readable listeners eventually get something +// Ensure readable listeners eventually get something. Readable.prototype.on = function(ev, fn) { const res = Stream.prototype.on.call(this, ev, fn); const state = this._readableState; @@ -851,7 +851,7 @@ Readable.prototype.on = function(ev, fn) { // a few lines down. This is needed to support once('readable'). state.readableListening = this.listenerCount('readable') > 0; - // Try start flowing on next tick if stream isn't explicitly paused + // Try start flowing on next tick if stream isn't explicitly paused. if (state.flowing !== false) this.resume(); } else if (ev === 'readable') { @@ -914,7 +914,7 @@ function updateReadableListening(self) { // the upcoming resume will not flow. state.flowing = true; - // Crude way to check if we should resume + // Crude way to check if we should resume. } else if (self.listenerCount('data') > 0) { self.resume(); } else if (!state.readableListening) { @@ -935,7 +935,7 @@ Readable.prototype.resume = function() { debug('resume'); // We flow only if there is no one listening // for readable, but we still have to call - // resume() + // resume(). 
state.flowing = !state.readableListening; resume(this, state); } @@ -1003,7 +1003,7 @@ Readable.prototype.wrap = function(stream) { if (state.decoder) chunk = state.decoder.write(chunk); - // Don't skip over falsy values in objectMode + // Don't skip over falsy values in objectMode. if (state.objectMode && (chunk === null || chunk === undefined)) return; else if (!state.objectMode && (!chunk || !chunk.length)) @@ -1055,7 +1055,7 @@ Readable.prototype[SymbolAsyncIterator] = function() { // Making it explicit these properties are not enumerable // because otherwise some prototype manipulation in -// userland will fail +// userland will fail. ObjectDefineProperties(Readable.prototype, { readable: { get() { @@ -1132,13 +1132,13 @@ ObjectDefineProperties(Readable.prototype, { }, set(value) { // We ignore the value if the stream - // has not been initialized yet + // has not been initialized yet. if (!this._readableState) { return; } // Backward compatibility, the user is explicitly - // managing destroyed + // managing destroyed. this._readableState.destroyed = value; } }, @@ -1175,7 +1175,7 @@ Readable._fromList = fromList; // This function is designed to be inlinable, so please take care when making // changes to the function body. function fromList(n, state) { - // nothing buffered + // nothing buffered. if (state.length === 0) return null; @@ -1183,7 +1183,7 @@ function fromList(n, state) { if (state.objectMode) ret = state.buffer.shift(); else if (!n || n >= state.length) { - // Read it all, truncate the list + // Read it all, truncate the list. if (state.decoder) ret = state.buffer.join(''); else if (state.buffer.length === 1) @@ -1192,7 +1192,7 @@ function fromList(n, state) { ret = state.buffer.concat(state.length); state.buffer.clear(); } else { - // read part of list + // read part of list. ret = state.buffer.consume(n, state.decoder); } @@ -1221,7 +1221,7 @@ function endReadableNT(state, stream) { process.nextTick(endWritableNT, state, stream); } else if (state.autoDestroy) { // In case of duplex streams we need a way to detect - // if the writable side is ready for autoDestroy as well + // if the writable side is ready for autoDestroy as well. const wState = stream._writableState; const autoDestroy = !wState || ( wState.autoDestroy && diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 1d02e2ff8e27f6..57b1b9cc92fb32 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -83,21 +83,21 @@ function WritableState(options, stream, isDuplex) { // The point at which write() starts returning false // Note: 0 is a valid value, means that we always return false if - // the entire buffer is not flushed immediately on write() + // the entire buffer is not flushed immediately on write(). this.highWaterMark = options ? getHighWaterMark(this, options, 'writableHighWaterMark', isDuplex) : getDefaultHighWaterMark(false); - // if _final has been called + // if _final has been called. this.finalCalled = false; // drain event flag. this.needDrain = false; // At the start of calling end() this.ending = false; - // When end() has been called, and returned + // When end() has been called, and returned. this.ended = false; - // When 'finish' is emitted + // When 'finish' is emitted. this.finished = false; // Has it been destroyed @@ -122,7 +122,7 @@ function WritableState(options, stream, isDuplex) { // A flag to see when we're in the middle of a write. 
this.writing = false; - // When true all writes will be buffered until .uncork() call + // When true all writes will be buffered until .uncork() call. this.corked = 0; // A flag to be able to tell if the onwrite cb is called immediately, @@ -136,10 +136,10 @@ function WritableState(options, stream, isDuplex) { // end up in an overlapped onwrite situation. this.bufferProcessing = false; - // The callback that's passed to _write(chunk,cb) + // The callback that's passed to _write(chunk, cb). this.onwrite = onwrite.bind(undefined, stream); - // The callback that the user supplies to write(chunk,encoding,cb) + // The callback that the user supplies to write(chunk, encoding, cb). this.writecb = null; // The amount that is being written when _write is called. @@ -152,20 +152,20 @@ function WritableState(options, stream, isDuplex) { resetBuffer(this); // Number of pending user-supplied write callbacks - // this must be 0 before 'finish' can be emitted + // this must be 0 before 'finish' can be emitted. this.pendingcb = 0; // Emit prefinish if the only thing we're waiting for is _write cbs - // This is relevant for synchronous Transform streams + // This is relevant for synchronous Transform streams. this.prefinished = false; - // True if the error was already emitted and should not be thrown again + // True if the error was already emitted and should not be thrown again. this.errorEmitted = false; // Should close be emitted on destroy. Defaults to true. this.emitClose = !options || options.emitClose !== false; - // Should .destroy() be called after 'finish' (and potentially 'end') + // Should .destroy() be called after 'finish' (and potentially 'end'). this.autoDestroy = !options || options.autoDestroy !== false; // Indicates whether the stream has errored. When true all write() calls @@ -225,7 +225,7 @@ function Writable(options) { // `_writableState` that would lead to infinite recursion. // Checking for a Stream.Duplex instance is faster here instead of inside - // the WritableState constructor, at least with V8 6.5 + // the WritableState constructor, at least with V8 6.5. const isDuplex = (this instanceof Stream.Duplex); if (!isDuplex && !realHasInstance.call(Writable, this)) @@ -480,7 +480,7 @@ function errorBuffer(state, err) { resetBuffer(state); } -// If there's something in the buffer waiting, then process it +// If there's something in the buffer waiting, then process it. function clearBuffer(stream, state) { if (state.corked || state.bufferProcessing) { return; @@ -557,7 +557,7 @@ Writable.prototype.end = function(chunk, encoding, cb) { if (chunk !== null && chunk !== undefined) this.write(chunk, encoding); - // .end() fully uncorks + // .end() fully uncorks. if (state.corked) { state.corked = 1; this.uncork(); @@ -649,7 +649,7 @@ function finish(stream, state) { if (state.autoDestroy) { // In case of duplex streams we need a way to detect - // if the readable side is ready for autoDestroy as well + // if the readable side is ready for autoDestroy as well. const rState = stream._readableState; const autoDestroy = !rState || ( rState.autoDestroy && @@ -690,7 +690,7 @@ ObjectDefineProperties(Writable.prototype, { return this._writableState ? this._writableState.destroyed : false; }, set(value) { - // Backward compatibility, the user is explicitly managing destroyed + // Backward compatibility, the user is explicitly managing destroyed. 
if (this._writableState) { this._writableState.destroyed = value; } From bc755fc4c25c64b10f647cf48950d4e067a20d99 Mon Sep 17 00:00:00 2001 From: Daniel Bevenius Date: Thu, 23 Apr 2020 06:15:25 +0200 Subject: [PATCH 54/93] src: fix compiler warnings in node_http2.cc MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Currently, the following compiler warnings are generated: ../src/node_http2.cc: In static member function ‘static int node::http2::Http2Session::OnStreamClose(nghttp2_session*, int32_t, uint32_t, void*)’: ../src/node_http2.cc:994:16: warning: variable ‘def’ set but not used [-Wunused-but-set-variable] 994 | Local def = v8::False(env->isolate()); | ^~~ ../src/node_http2.cc: In static member function ‘static void node::http2::Http2Session::Ping( const v8::FunctionCallbackInfo&)’: ../src/node_http2.cc:2755:16: warning: unused variable ‘env’ [-Wunused-variable] 2755 | Environment* env = Environment::GetCurrent(args); | ^~~ ../src/node_http2.cc: In static member function ‘static void node::http2::Http2Session::Settings( const v8::FunctionCallbackInfo&)’: ../src/node_http2.cc:2774:16: warning: unused variable ‘env’ [-Wunused-variable] 2774 | Environment* env = Environment::GetCurrent(args); | ^~~ This commit removes these unused variables. PR-URL: https://github.com/nodejs/node/pull/33014 Reviewed-By: Zeyu Yang Reviewed-By: Richard Lau Reviewed-By: Juan José Arboleda Reviewed-By: Colin Ihrig Reviewed-By: James M Snell Reviewed-By: Sam Roberts --- src/node_http2.cc | 3 --- 1 file changed, 3 deletions(-) diff --git a/src/node_http2.cc b/src/node_http2.cc index 385a2352040c4e..189f1d50a29a0d 100644 --- a/src/node_http2.cc +++ b/src/node_http2.cc @@ -991,7 +991,6 @@ int Http2Session::OnStreamClose(nghttp2_session* handle, MaybeLocal answer = stream->MakeCallback(env->http2session_on_stream_close_function(), 1, &arg); - Local def = v8::False(env->isolate()); if (answer.IsEmpty() || answer.ToLocalChecked()->IsFalse()) { // Skip to destroy stream->Destroy(); @@ -2752,7 +2751,6 @@ void Http2Session::Origin(const FunctionCallbackInfo& args) { // Submits a PING frame to be sent to the connected peer. 
void Http2Session::Ping(const FunctionCallbackInfo& args) { - Environment* env = Environment::GetCurrent(args); Http2Session* session; ASSIGN_OR_RETURN_UNWRAP(&session, args.Holder()); @@ -2771,7 +2769,6 @@ void Http2Session::Ping(const FunctionCallbackInfo& args) { // Submits a SETTINGS frame for the Http2Session void Http2Session::Settings(const FunctionCallbackInfo& args) { - Environment* env = Environment::GetCurrent(args); Http2Session* session; ASSIGN_OR_RETURN_UNWRAP(&session, args.Holder()); CHECK(args[0]->IsFunction()); From d82c3c28de818119334bbe6484e5faa4dbf40146 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Juan=20Jos=C3=A9=20Arboleda?= Date: Tue, 14 Apr 2020 14:49:00 -0500 Subject: [PATCH 55/93] src: delete MicroTaskPolicy namespace PR-URL: https://github.com/nodejs/node/pull/32853 Reviewed-By: Colin Ihrig Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Michael Dawson Reviewed-By: Ruben Bridgewater Reviewed-By: Franziska Hinkelmann --- src/api/environment.cc | 1 - 1 file changed, 1 deletion(-) diff --git a/src/api/environment.cc b/src/api/environment.cc index 0d845344d4050e..b9ca6ca7451926 100644 --- a/src/api/environment.cc +++ b/src/api/environment.cc @@ -23,7 +23,6 @@ using v8::HandleScope; using v8::Isolate; using v8::Local; using v8::MaybeLocal; -using v8::MicrotasksPolicy; using v8::Null; using v8::Object; using v8::ObjectTemplate; From 737bd6205b8b21eb0e412bdd7860e1a08dc3b45c Mon Sep 17 00:00:00 2001 From: Yash Ladha Date: Tue, 21 Apr 2020 10:29:07 +0530 Subject: [PATCH 56/93] lib: unnecessary const assignment for class PR-URL: https://github.com/nodejs/node/pull/32962 Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Andrey Pechkurov --- lib/internal/fixed_queue.js | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/lib/internal/fixed_queue.js b/lib/internal/fixed_queue.js index d3ffbc2a1e154e..f6f3110d8ec5d4 100644 --- a/lib/internal/fixed_queue.js +++ b/lib/internal/fixed_queue.js @@ -56,7 +56,7 @@ const kMask = kSize - 1; // `top + 1 === bottom` it's full. This wastes a single space of storage // but allows much quicker checks. -const FixedCircularBuffer = class FixedCircularBuffer { +class FixedCircularBuffer { constructor() { this.bottom = 0; this.top = 0; @@ -85,7 +85,7 @@ const FixedCircularBuffer = class FixedCircularBuffer { this.bottom = (this.bottom + 1) & kMask; return nextItem; } -}; +} module.exports = class FixedQueue { constructor() { From 4b6aa077feb7b9526c059e5099f4709fa5435931 Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Tue, 21 Apr 2020 10:46:29 +0800 Subject: [PATCH 57/93] inspector: only write coverage in fully bootstrapped Environments The NODE_V8_COVERAGE folder and the source map computation are setup during pre-execution since they rely on environment variables as well as JS states. Therefore, we need to give up serialization of JS coverage profiles for Environments that have not go through pre-execution. Currently this is only possible for Environments created using the embedder API CreateEnvironment(). As a result we won't have JS coverage data for most cctests, but if that proves to be necessary we could just run lib/internal/main/environment.js for these Environments created for cctests. 
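For context, the profiles in question are the JSON coverage files V8 emits when `NODE_V8_COVERAGE` is set. A minimal user-land sketch of producing them (the `app.js` name and the `./coverage` directory are illustrative, not taken from this patch):

```js
// app.js: run as NODE_V8_COVERAGE=./coverage node app.js
// In a fully bootstrapped process, profiles named like
// coverage-<pid>-<timestamp>-<n>.json are written to ./coverage at exit;
// the guard added below skips this for half-baked embedder Environments.
function add(a, b) {
  return a + b;
}

console.log(add(2, 3));
```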
Fixes: https://github.com/nodejs/node/issues/32912 Refs: https://github.com/nodejs/node/commit/65e18a8e9f912dfa04a804124b6a19514bb45165 Refs: https://github.com/nodejs/node/commit/5bf43729a403b992cc90b5cdbbaaf505769d1107 https://github.com/nodejs/node/commit/8aa7ef7840ef5f7161f3195e51a3fa6783290160 PR-URL: https://github.com/nodejs/node/pull/32960 Refs: https://github.com/nodejs/node/commit/8aa7ef7840ef5f7161f3195e51a3fa6783290160 Reviewed-By: Richard Lau Reviewed-By: David Carlier Reviewed-By: Ruben Bridgewater Reviewed-By: Colin Ihrig Reviewed-By: Franziska Hinkelmann Reviewed-By: Anna Henningsen --- src/inspector_profiler.cc | 10 ++++++++++ 1 file changed, 10 insertions(+) diff --git a/src/inspector_profiler.cc b/src/inspector_profiler.cc index cc4c3091757683..03cf2f6e5ca76b 100644 --- a/src/inspector_profiler.cc +++ b/src/inspector_profiler.cc @@ -209,6 +209,16 @@ void V8CoverageConnection::WriteProfile(Local message) { HandleScope handle_scope(isolate); Context::Scope context_scope(context); + // This is only set up during pre-execution (when the environment variables + // becomes available in the JS land). If it's empty, we don't have coverage + // directory path (which is resolved in JS land at the moment) either, so + // the best we could to is to just discard the profile and do nothing. + // This should only happen in half-baked Environments created using the + // embedder API. + if (env_->source_map_cache_getter().IsEmpty()) { + return; + } + // Get message.result from the response. Local result; if (!ParseProfile(env_, message, type()).ToLocal(&result)) { From c21f1f03c5f2339abf8cc131697d8a0306b62ecb Mon Sep 17 00:00:00 2001 From: Jesus Hernandez Date: Sun, 19 Apr 2020 21:01:39 -0500 Subject: [PATCH 58/93] stream: removes unnecessary params MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Removes the state param in the onFinished function since it's never used within it. PR-URL: https://github.com/nodejs/node/pull/32936 Reviewed-By: Luigi Pinca Reviewed-By: Gerhard Stöbich --- lib/_stream_writable.js | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 57b1b9cc92fb32..7c40929d73c180 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -582,7 +582,7 @@ Writable.prototype.end = function(chunk, encoding, cb) { if (err || state.finished) process.nextTick(cb, err); else - onFinished(this, state, cb); + onFinished(this, cb); } return this; @@ -664,7 +664,7 @@ function finish(stream, state) { } // TODO(ronag): Avoid using events to implement internal logic. 
-function onFinished(stream, state, cb) { +function onFinished(stream, cb) { function onerror(err) { stream.removeListener('finish', onfinish); stream.removeListener('error', onerror); From 4143c747fc2b0ead1657a141c4b356c273a6fa3d Mon Sep 17 00:00:00 2001 From: Gus Caplan Date: Tue, 21 Apr 2020 16:43:58 -0500 Subject: [PATCH 59/93] vm: add importModuleDynamically option to compileFunction MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fixes: https://github.com/nodejs/node/issues/31860 PR-URL: https://github.com/nodejs/node/pull/32985 Reviewed-By: Anna Henningsen Reviewed-By: Michaël Zasso --- doc/api/vm.md | 16 ++++++++++- lib/internal/modules/cjs/loader.js | 40 ++++++++------------------- lib/vm.js | 17 ++++++++++++ test/parallel/test-vm-module-basic.js | 19 ++++++++++++- tools/doc/type-parser.js | 1 + 5 files changed, 63 insertions(+), 30 deletions(-) diff --git a/doc/api/vm.md b/doc/api/vm.md index ec731ffd36c6a0..e959ebc66ff24e 100644 --- a/doc/api/vm.md +++ b/doc/api/vm.md @@ -88,7 +88,7 @@ changes: This option is part of the experimental modules API, and should not be considered stable. * `specifier` {string} specifier passed to `import()` - * `module` {vm.Module} + * `script` {vm.Script} * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is recommended in order to take advantage of error tracking, and to avoid issues with namespaces that contain `then` function exports. @@ -773,6 +773,10 @@ const vm = require('vm'); ## `vm.compileFunction(code[, params[, options]])` * `code` {string} The body of the function to compile. @@ -795,6 +799,16 @@ added: v10.10.0 * `contextExtensions` {Object[]} An array containing a collection of context extensions (objects wrapping the current scope) to be applied while compiling. **Default:** `[]`. + * `importModuleDynamically` {Function} Called during evaluation of this module + when `import()` is called. If this option is not specified, calls to + `import()` will reject with [`ERR_VM_DYNAMIC_IMPORT_CALLBACK_MISSING`][]. + This option is part of the experimental modules API, and should not be + considered stable. + * `specifier` {string} specifier passed to `import()` + * `function` {Function} + * Returns: {Module Namespace Object|vm.Module} Returning a `vm.Module` is + recommended in order to take advantage of error tracking, and to avoid + issues with namespaces that contain `then` function exports. * Returns: {Function} Compiles the given code into the provided context (if no context is diff --git a/lib/internal/modules/cjs/loader.js b/lib/internal/modules/cjs/loader.js index e09cc437068d6b..96988a6a471aad 100644 --- a/lib/internal/modules/cjs/loader.js +++ b/lib/internal/modules/cjs/loader.js @@ -77,7 +77,6 @@ const preserveSymlinksMain = getOptionValue('--preserve-symlinks-main'); const manifest = getOptionValue('--experimental-policy') ? require('internal/process/policy').manifest : null; -const { compileFunction } = internalBinding('contextify'); // Whether any user-provided CJS modules had been loaded (executed). // Used for internal assertions. 
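As a usage sketch of the new option, closely mirroring the test added in test/parallel/test-vm-module-basic.js further down in this patch (a standalone script would need the `--experimental-vm-modules` flag for `SyntheticModule`):

```js
// Run with: node --experimental-vm-modules example.js
const assert = require('assert');
const { SyntheticModule, compileFunction } = require('vm');

const mod = new SyntheticModule([], () => {});
mod.link(() => {});

const f = compileFunction('return import("x")', [], {
  importModuleDynamically(specifier, referrer) {
    assert.strictEqual(specifier, 'x');
    assert.strictEqual(referrer, f);
    return mod;
  },
});

// The dynamic import() inside the compiled function resolves to the
// namespace of the module returned from the callback.
f().then((ns) => assert.strictEqual(ns, mod.namespace));
```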
@@ -1100,40 +1099,25 @@ function wrapSafe(filename, content, cjsModuleInstance) { }, }); } - let compiled; try { - compiled = compileFunction( - content, + return vm.compileFunction(content, [ + 'exports', + 'require', + 'module', + '__filename', + '__dirname', + ], { filename, - 0, - 0, - undefined, - false, - undefined, - [], - [ - 'exports', - 'require', - 'module', - '__filename', - '__dirname', - ] - ); + importModuleDynamically(specifier) { + const loader = asyncESM.ESMLoader; + return loader.import(specifier, normalizeReferrerURL(filename)); + }, + }); } catch (err) { if (process.mainModule === cjsModuleInstance) enrichCJSError(err); throw err; } - - const { callbackMap } = internalBinding('module_wrap'); - callbackMap.set(compiled.cacheKey, { - importModuleDynamically: async (specifier) => { - const loader = asyncESM.ESMLoader; - return loader.import(specifier, normalizeReferrerURL(filename)); - } - }); - - return compiled.function; } // Run the file contents in the correct scope or sandbox. Expose diff --git a/lib/vm.js b/lib/vm.js index c2d8908703b396..cffca355720dae 100644 --- a/lib/vm.js +++ b/lib/vm.js @@ -313,6 +313,7 @@ function compileFunction(code, params, options = {}) { produceCachedData = false, parsingContext = undefined, contextExtensions = [], + importModuleDynamically, } = options; validateString(filename, 'options.filename'); @@ -360,6 +361,22 @@ function compileFunction(code, params, options = {}) { result.function.cachedData = result.cachedData; } + if (importModuleDynamically !== undefined) { + if (typeof importModuleDynamically !== 'function') { + throw new ERR_INVALID_ARG_TYPE('options.importModuleDynamically', + 'function', + importModuleDynamically); + } + const { importModuleDynamicallyWrap } = + require('internal/vm/module'); + const { callbackMap } = internalBinding('module_wrap'); + const wrapped = importModuleDynamicallyWrap(importModuleDynamically); + const func = result.function; + callbackMap.set(result.cacheKey, { + importModuleDynamically: (s, _k) => wrapped(s, func), + }); + } + return result.function; } diff --git a/test/parallel/test-vm-module-basic.js b/test/parallel/test-vm-module-basic.js index 86e13f8b12bfa8..155524f7e7a176 100644 --- a/test/parallel/test-vm-module-basic.js +++ b/test/parallel/test-vm-module-basic.js @@ -8,7 +8,8 @@ const { Module, SourceTextModule, SyntheticModule, - createContext + createContext, + compileFunction, } = require('vm'); const util = require('util'); @@ -147,3 +148,19 @@ const util = require('util'); name: 'TypeError' }); } + +// Test compileFunction importModuleDynamically +{ + const module = new SyntheticModule([], () => {}); + module.link(() => {}); + const f = compileFunction('return import("x")', [], { + importModuleDynamically(specifier, referrer) { + assert.strictEqual(specifier, 'x'); + assert.strictEqual(referrer, f); + return module; + }, + }); + f().then((ns) => { + assert.strictEqual(ns, module.namespace); + }); +} diff --git a/tools/doc/type-parser.js b/tools/doc/type-parser.js index 02b59d37ffd278..b244564d8f8ebf 100644 --- a/tools/doc/type-parser.js +++ b/tools/doc/type-parser.js @@ -148,6 +148,7 @@ const customTypesMap = { 'URLSearchParams': 'url.html#url_class_urlsearchparams', 'vm.Module': 'vm.html#vm_class_vm_module', + 'vm.Script': 'vm.html#vm_class_vm_script', 'vm.SourceTextModule': 'vm.html#vm_class_vm_sourcetextmodule', 'MessagePort': 'worker_threads.html#worker_threads_class_messageport', From 44c157e45daffdef435fe835d13ce88de274fb64 Mon Sep 17 00:00:00 2001 From: Yash Ladha Date: Thu, 
16 Apr 2020 18:56:27 +0530 Subject: [PATCH 60/93] src: assignment to valid type We are converting the argument to a uint32_t value but the lvalue is not consistent with the casting. PR-URL: https://github.com/nodejs/node/pull/32879 Reviewed-By: James M Snell Reviewed-By: Anna Henningsen Reviewed-By: Gireesh Punathil --- src/tcp_wrap.cc | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/src/tcp_wrap.cc b/src/tcp_wrap.cc index 1aca3a5e6aeeee..619c9ef6196373 100644 --- a/src/tcp_wrap.cc +++ b/src/tcp_wrap.cc @@ -185,7 +185,7 @@ void TCPWrap::SetKeepAlive(const FunctionCallbackInfo& args) { Environment* env = wrap->env(); int enable; if (!args[0]->Int32Value(env->context()).To(&enable)) return; - unsigned int delay = args[1].As()->Value(); + unsigned int delay = static_cast(args[1].As()->Value()); int err = uv_tcp_keepalive(&wrap->handle_, enable, delay); args.GetReturnValue().Set(err); } @@ -278,7 +278,8 @@ void TCPWrap::Listen(const FunctionCallbackInfo& args) { void TCPWrap::Connect(const FunctionCallbackInfo& args) { CHECK(args[2]->IsUint32()); - int port = args[2].As()->Value(); + // explicit cast to fit to libuv's type expectation + int port = static_cast(args[2].As()->Value()); Connect(args, [port](const char* ip_address, sockaddr_in* addr) { return uv_ip4_addr(ip_address, port, addr); From 2bb4ac409bd107a1e6cb2fe6e59051868c72eee4 Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Thu, 16 Apr 2020 21:49:41 +0200 Subject: [PATCH 61/93] stream: avoid drain for sync streams Previously a sync writable receiving chunks larger than highwatermark would unecessarily ping pong needDrain. PR-URL: https://github.com/nodejs/node/pull/32887 Reviewed-By: Matteo Collina Reviewed-By: James M Snell --- benchmark/streams/writable-manywrites.js | 7 ++++--- lib/_stream_writable.js | 11 ++++++----- test/parallel/test-stream-big-packet.js | 6 +++++- test/parallel/test-stream-catch-rejections.js | 2 +- .../test-stream-pipe-await-drain-push-while-write.js | 2 +- test/parallel/test-stream-pipe-await-drain.js | 6 +++--- test/parallel/test-stream-writable-needdrain-state.js | 6 ++++-- test/parallel/test-stream2-finish-pipe.js | 2 +- 8 files changed, 25 insertions(+), 17 deletions(-) diff --git a/benchmark/streams/writable-manywrites.js b/benchmark/streams/writable-manywrites.js index e4ae9ab91e5f4a..025a5017ee6446 100644 --- a/benchmark/streams/writable-manywrites.js +++ b/benchmark/streams/writable-manywrites.js @@ -7,11 +7,12 @@ const bench = common.createBenchmark(main, { n: [2e6], sync: ['yes', 'no'], writev: ['yes', 'no'], - callback: ['yes', 'no'] + callback: ['yes', 'no'], + len: [1024, 32 * 1024] }); -function main({ n, sync, writev, callback }) { - const b = Buffer.allocUnsafe(1024); +function main({ n, sync, writev, callback, len }) { + const b = Buffer.allocUnsafe(len); const s = new Writable(); sync = sync === 'yes'; diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 7c40929d73c180..2dc949316a2dce 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -337,11 +337,6 @@ function writeOrBuffer(stream, state, chunk, encoding, callback) { state.length += len; - const ret = state.length < state.highWaterMark; - // We must ensure that previous needDrain will not be reset to false. 
- if (!ret) - state.needDrain = true; - if (state.writing || state.corked || state.errored) { state.buffered.push({ chunk, encoding, callback }); if (state.allBuffers && encoding !== 'buffer') { @@ -359,6 +354,12 @@ function writeOrBuffer(stream, state, chunk, encoding, callback) { state.sync = false; } + const ret = state.length < state.highWaterMark; + + // We must ensure that previous needDrain will not be reset to false. + if (!ret) + state.needDrain = true; + // Return false if errored or destroyed in order to break // any synchronous while(stream.write(data)) loops. return ret && !state.errored && !state.destroyed; diff --git a/test/parallel/test-stream-big-packet.js b/test/parallel/test-stream-big-packet.js index 0dca3391961a93..fdbe3cd21145ee 100644 --- a/test/parallel/test-stream-big-packet.js +++ b/test/parallel/test-stream-big-packet.js @@ -36,7 +36,11 @@ class TestStream extends stream.Transform { } } -const s1 = new stream.PassThrough(); +const s1 = new stream.Transform({ + transform(chunk, encoding, cb) { + process.nextTick(cb, null, chunk); + } +}); const s2 = new stream.PassThrough(); const s3 = new TestStream(); s1.pipe(s3); diff --git a/test/parallel/test-stream-catch-rejections.js b/test/parallel/test-stream-catch-rejections.js index 848c2ada130e64..81427c35757ca8 100644 --- a/test/parallel/test-stream-catch-rejections.js +++ b/test/parallel/test-stream-catch-rejections.js @@ -30,7 +30,7 @@ const assert = require('assert'); captureRejections: true, highWaterMark: 1, write(chunk, enc, cb) { - cb(); + process.nextTick(cb); } }); diff --git a/test/parallel/test-stream-pipe-await-drain-push-while-write.js b/test/parallel/test-stream-pipe-await-drain-push-while-write.js index 6dbf3c669bc177..a717291cda2b03 100644 --- a/test/parallel/test-stream-pipe-await-drain-push-while-write.js +++ b/test/parallel/test-stream-pipe-await-drain-push-while-write.js @@ -19,7 +19,7 @@ const writable = new stream.Writable({ }); } - cb(); + process.nextTick(cb); }, 3) }); diff --git a/test/parallel/test-stream-pipe-await-drain.js b/test/parallel/test-stream-pipe-await-drain.js index 3ae248e08b854f..90d418a09783e3 100644 --- a/test/parallel/test-stream-pipe-await-drain.js +++ b/test/parallel/test-stream-pipe-await-drain.js @@ -19,7 +19,7 @@ reader._read = () => {}; writer1._write = common.mustCall(function(chunk, encoding, cb) { this.emit('chunk-received'); - cb(); + process.nextTick(cb); }, 1); writer1.once('chunk-received', () => { @@ -42,7 +42,7 @@ writer2._write = common.mustCall((chunk, encoding, cb) => { reader._readableState.awaitDrainWriters.size, 1, 'awaitDrain should be 1 after first push, actual is ' + - reader._readableState.awaitDrainWriters + reader._readableState.awaitDrainWriters.size ); // Not calling cb here to "simulate" slow stream. // This should be called exactly once, since the first .write() call @@ -54,7 +54,7 @@ writer3._write = common.mustCall((chunk, encoding, cb) => { reader._readableState.awaitDrainWriters.size, 2, 'awaitDrain should be 2 after second push, actual is ' + - reader._readableState.awaitDrainWriters + reader._readableState.awaitDrainWriters.size ); // Not calling cb here to "simulate" slow stream. 
// This should be called exactly once, since the first .write() call diff --git a/test/parallel/test-stream-writable-needdrain-state.js b/test/parallel/test-stream-writable-needdrain-state.js index ea5617d997d5ed..0e72d832bc3ff0 100644 --- a/test/parallel/test-stream-writable-needdrain-state.js +++ b/test/parallel/test-stream-writable-needdrain-state.js @@ -10,8 +10,10 @@ const transform = new stream.Transform({ }); function _transform(chunk, encoding, cb) { - assert.strictEqual(transform._writableState.needDrain, true); - cb(); + process.nextTick(() => { + assert.strictEqual(transform._writableState.needDrain, true); + cb(); + }); } assert.strictEqual(transform._writableState.needDrain, false); diff --git a/test/parallel/test-stream2-finish-pipe.js b/test/parallel/test-stream2-finish-pipe.js index 1cee74063233b2..5e2969aad4f259 100644 --- a/test/parallel/test-stream2-finish-pipe.js +++ b/test/parallel/test-stream2-finish-pipe.js @@ -30,7 +30,7 @@ r._read = function(size) { const w = new stream.Writable(); w._write = function(data, encoding, cb) { - cb(null); + process.nextTick(cb, null); }; r.pipe(w); From e07c4ffc395d83bbae3f440b372593a068fb02df Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Thu, 23 Apr 2020 21:48:30 +0200 Subject: [PATCH 62/93] stream: fix sync write perf regression While https://github.com/nodejs/node/pull/31046 did make async writes faster it at the same time made sync writes slower. This PR corrects this while maintaining performance improvements. PR-URL: https://github.com/nodejs/node/pull/33032 Reviewed-By: Ruben Bridgewater Reviewed-By: James M Snell Reviewed-By: Brian White Reviewed-By: Zeyu Yang --- lib/_stream_writable.js | 31 ++++++++++++++----------------- 1 file changed, 14 insertions(+), 17 deletions(-) diff --git a/lib/_stream_writable.js b/lib/_stream_writable.js index 2dc949316a2dce..4b353a52665efa 100644 --- a/lib/_stream_writable.js +++ b/lib/_stream_writable.js @@ -415,27 +415,24 @@ function onwrite(stream, er) { onwriteError(stream, state, er, cb); } } else { - if (!state.destroyed) { + if (state.buffered.length > state.bufferedIndex) { clearBuffer(stream, state); } - if (state.needDrain || cb !== nop || state.ending || state.destroyed) { - if (sync) { - // It is a common case that the callback passed to .write() is always - // the same. In that case, we do not schedule a new nextTick(), but - // rather just increase a counter, to improve performance and avoid - // memory allocations. - if (state.afterWriteTickInfo !== null && - state.afterWriteTickInfo.cb === cb) { - state.afterWriteTickInfo.count++; - } else { - state.afterWriteTickInfo = { count: 1, cb, stream, state }; - process.nextTick(afterWriteTick, state.afterWriteTickInfo); - } + + if (sync) { + // It is a common case that the callback passed to .write() is always + // the same. In that case, we do not schedule a new nextTick(), but + // rather just increase a counter, to improve performance and avoid + // memory allocations. + if (state.afterWriteTickInfo !== null && + state.afterWriteTickInfo.cb === cb) { + state.afterWriteTickInfo.count++; } else { - afterWrite(stream, state, 1, cb); + state.afterWriteTickInfo = { count: 1, cb, stream, state }; + process.nextTick(afterWriteTick, state.afterWriteTickInfo); } } else { - state.pendingcb--; + afterWrite(stream, state, 1, cb); } } } @@ -483,7 +480,7 @@ function errorBuffer(state, err) { // If there's something in the buffer waiting, then process it. 
function clearBuffer(stream, state) { - if (state.corked || state.bufferProcessing) { + if (state.corked || state.bufferProcessing || state.destroyed) { return; } From 8fad112d930139e2673a0112317cdf032289532f Mon Sep 17 00:00:00 2001 From: Jeremiah Senkpiel Date: Wed, 15 Apr 2020 11:48:52 -0700 Subject: [PATCH 63/93] test: remove timers-blocking-callback MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit If the bug this test is intented to catch is reintroduced, or if 5aac4c42da104c30d8f701f1042d61c2f06b7e6c is effectively reverted, many (50+) tests time out, rendering this test redundant and unnecessary. in particular, the following timer tests catch an effective revert of 5aac4c42da104c30d8f701f1042d61c2f06b7e6c: not ok 21 parallel/test-timers-api-refs not ok 22 parallel/test-timers-args not ok 23 parallel/test-timers-destroyed not ok 25 parallel/test-timers-nested not ok 26 parallel/test-timers-interval-throw not ok 28 parallel/test-timers-non-integer-delay not ok 32 parallel/test-timers-ordering not ok 33 parallel/test-timers-refresh not ok 34 parallel/test-timers-refresh-in-callback not ok 35 parallel/test-timers-reset-process-domain-on-throw not ok 40 parallel/test-timers-timeout-to-interval not ok 41 parallel/test-timers-uncaught-exception not ok 42 parallel/test-timers-timeout-with-non-integer not ok 43 parallel/test-timers-unenroll-unref-interval not ok 44 parallel/test-timers-unref not ok 45 parallel/test-timers-unref-active not ok 46 parallel/test-timers-unrefd-interval-still-fires not ok 47 parallel/test-timers-unrefed-in-callback not ok 48 parallel/test-timers-user-call not ok 49 parallel/test-timers-zero-timeout Refs: https://github.com/nodejs/node/issues/21781 PR-URL: https://github.com/nodejs/node/pull/32870 Reviewed-By: Anatoli Papirovski Reviewed-By: Michaël Zasso Reviewed-By: Rich Trott --- test/sequential/sequential.status | 2 - .../test-timers-blocking-callback.js | 114 ------------------ 2 files changed, 116 deletions(-) delete mode 100644 test/sequential/test-timers-blocking-callback.js diff --git a/test/sequential/sequential.status b/test/sequential/sequential.status index f73b428de5f390..fce8bd959f0326 100644 --- a/test/sequential/sequential.status +++ b/test/sequential/sequential.status @@ -17,8 +17,6 @@ test-worker-prof: PASS, FLAKY [$system==linux] [$system==macos] -# https://github.com/nodejs/node/issues/21781 -test-timers-blocking-callback: PASS, FLAKY [$system==solaris] # Also applies to SmartOS diff --git a/test/sequential/test-timers-blocking-callback.js b/test/sequential/test-timers-blocking-callback.js deleted file mode 100644 index a5e0f596a34b93..00000000000000 --- a/test/sequential/test-timers-blocking-callback.js +++ /dev/null @@ -1,114 +0,0 @@ -// Flags: --expose-internals -'use strict'; - -/* - * This is a regression test for - * https://github.com/nodejs/node-v0.x-archive/issues/15447 and - * https://github.com/nodejs/node-v0.x-archive/issues/9333. - * - * When a timer is added in another timer's callback, its underlying timer - * handle was started with a timeout that was actually incorrect. - * - * The reason was that the value that represents the current time was not - * updated between the time the original callback was called and the time - * the added timer was processed by timers.listOnTimeout. That led the - * logic in timers.listOnTimeout to do an incorrect computation that made - * the added timer fire with a timeout of scheduledTimeout + - * timeSpentInCallback. 
- * - * This test makes sure that a timer added by another timer's callback - * fires with the expected timeout. - * - * It makes sure that it works when the timers list for a given timeout is - * empty (see testAddingTimerToEmptyTimersList) and when the timers list - * is not empty (see testAddingTimerToNonEmptyTimersList). - */ - -const common = require('../common'); -const assert = require('assert'); -const { sleep } = require('internal/util'); - -const TIMEOUT = 100; - -let nbBlockingCallbackCalls; -let latestDelay; -let timeCallbackScheduled; - -// These tests are timing dependent so they may fail even when the bug is -// not present (if the host is sufficiently busy that the timers are delayed -// significantly). However, they fail 100% of the time when the bug *is* -// present, so to increase reliability, allow for a small number of retries. -let retries = 2; - -function initTest() { - nbBlockingCallbackCalls = 0; - latestDelay = 0; - timeCallbackScheduled = 0; -} - -function blockingCallback(retry, callback) { - ++nbBlockingCallbackCalls; - - if (nbBlockingCallbackCalls > 1) { - latestDelay = Date.now() - timeCallbackScheduled; - // Even if timers can fire later than when they've been scheduled - // to fire, they shouldn't generally be more than 100% late in this case. - // But they are guaranteed to be at least 100ms late given the bug in - // https://github.com/nodejs/node-v0.x-archive/issues/15447 and - // https://github.com/nodejs/node-v0.x-archive/issues/9333. - if (latestDelay >= TIMEOUT * 2) { - if (retries > 0) { - retries--; - return retry(callback); - } - assert.fail(`timeout delayed by more than 100% (${latestDelay}ms)`); - } - if (callback) - return callback(); - } else { - // Block by busy-looping to trigger the issue - sleep(TIMEOUT); - - timeCallbackScheduled = Date.now(); - setTimeout(blockingCallback.bind(null, retry, callback), TIMEOUT); - } -} - -function testAddingTimerToEmptyTimersList(callback) { - initTest(); - // Call setTimeout just once to make sure the timers list is - // empty when blockingCallback is called. - setTimeout( - blockingCallback.bind(null, testAddingTimerToEmptyTimersList, callback), - TIMEOUT - ); -} - -function testAddingTimerToNonEmptyTimersList() { - // If both timers fail and attempt a retry, only actually do anything for one - // of them. - let retryOK = true; - const retry = () => { - if (retryOK) - testAddingTimerToNonEmptyTimersList(); - retryOK = false; - }; - - initTest(); - // Call setTimeout twice with the same timeout to make - // sure the timers list is not empty when blockingCallback is called. - setTimeout( - blockingCallback.bind(null, retry), - TIMEOUT - ); - setTimeout( - blockingCallback.bind(null, retry), - TIMEOUT - ); -} - -// Run the test for the empty timers list case, and then for the non-empty -// timers list one. -testAddingTimerToEmptyTimersList( - common.mustCall(testAddingTimerToNonEmptyTimersList) -); From f116825d560b72c9348c02be83b7d7e06b9e31e4 Mon Sep 17 00:00:00 2001 From: Ishaan Jain <37652866+ishaanjain1898@users.noreply.github.com> Date: Wed, 22 Apr 2020 22:16:56 +0530 Subject: [PATCH 64/93] doc: avoid tautology in README MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Changed "UTC time" on Line 82 to "UTC" as it created a tautology. 
PR-URL: https://github.com/nodejs/node/pull/33005 Reviewed-By: James M Snell Reviewed-By: Michaël Zasso Reviewed-By: Robert Nagy --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 0542c5c7b26a11..af3a57ccaac992 100644 --- a/README.md +++ b/README.md @@ -80,7 +80,7 @@ contains the latest Carbon (Node.js 8) release. #### Nightly Releases -Each directory name and filename contains a date (in UTC time) and the commit +Each directory name and filename contains a date (in UTC) and the commit SHA at the HEAD of the release. #### API Documentation From 31c797cb116287e46087ba218056a70f67256fdd Mon Sep 17 00:00:00 2001 From: Robert Nagy Date: Mon, 13 Apr 2020 11:02:03 +0200 Subject: [PATCH 65/93] http: doc deprecate abort and improve docs Doc deprecates ClientRequest.abort in favor of ClientRequest.destroy. Also improves event order documentation for abort and destroy. Refs: https://github.com/nodejs/node/issues/32225 PR-URL: https://github.com/nodejs/node/pull/32807 Reviewed-By: Matteo Collina Reviewed-By: Zeyu Yang Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Trivikram Kamat Reviewed-By: Rich Trott --- doc/api/deprecations.md | 15 +++++++++ doc/api/http.md | 72 ++++++++++++++++++++++++++++++++++++++--- 2 files changed, 83 insertions(+), 4 deletions(-) diff --git a/doc/api/deprecations.md b/doc/api/deprecations.md index 9833e0c421a18c..0662e32295933f 100644 --- a/doc/api/deprecations.md +++ b/doc/api/deprecations.md @@ -2651,6 +2651,19 @@ written twice. This introduces a race condition between threads, and is a potential security vulnerability. There is no safe, cross-platform alternative API. + +### DEP0XXX: Use `request.destroy()` instead of `request.abort()` + + +Type: Documentation-only + +Use [`request.destroy()`][] instead of [`request.abort()`][]. + [`--pending-deprecation`]: cli.html#cli_pending_deprecation [`--throw-deprecation`]: cli.html#cli_throw_deprecation [`Buffer.allocUnsafeSlow(size)`]: buffer.html#buffer_class_method_buffer_allocunsafeslow_size @@ -2712,8 +2725,10 @@ API. [`punycode`]: punycode.html [`require.extensions`]: modules.html#modules_require_extensions [`require.main`]: modules.html#modules_accessing_the_main_module +[`request.abort()`]: http.html#http_request_abort [`request.socket`]: http.html#http_request_socket [`request.connection`]: http.html#http_request_connection +[`request.destroy()`]: http.html#http_request_destroy_error [`response.socket`]: http.html#http_response_socket [`response.connection`]: http.html#http_response_connection [`response.end()`]: http.html#http_response_end_data_encoding_callback diff --git a/doc/api/http.md b/doc/api/http.md index a683eebdab5d26..7f097b0bc3a7f4 100644 --- a/doc/api/http.md +++ b/doc/api/http.md @@ -568,6 +568,7 @@ server.listen(1337, '127.0.0.1', () => { ### `request.abort()` Marks the request as aborting. Calling this will cause remaining data @@ -623,6 +624,31 @@ If `data` is specified, it is equivalent to calling If `callback` is specified, it will be called when the request stream is finished. +### `request.destroy([error])` + + +* `error` {Error} Optional, an error to emit with `'error'` event. +* Returns: {this} + +Destroy the request. Optionally emit an `'error'` event, +and emit a `'close'` event. Calling this will cause remaining data +in the response to be dropped and the socket to be destroyed. + +See [`writable.destroy()`][] for further details. 
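A brief, hedged usage sketch of the method documented above (the URL and the one-second timeout are placeholders):

```js
const http = require('http');

const req = http.get('http://example.com/slow-endpoint', (res) => {
  res.resume();
});

// Prefer destroy() over the documentation-deprecated abort(): the optional
// error is emitted as 'error', followed by 'close'.
setTimeout(() => req.destroy(new Error('gave up waiting')), 1000);

req.on('error', (err) => console.error(err.message));
req.on('close', () => console.log('request closed'));
```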
+ +#### `request.destroyed` + + +* {boolean} + +Is `true` after [`request.destroy()`][] has been called. + +See [`writable.destroyed`][] for further details. + ### `request.finished` * `stream` {Stream} A readable and/or writable stream. @@ -1580,6 +1595,12 @@ changes: - version: v13.10.0 pr-url: https://github.com/nodejs/node/pull/31223 description: Add support for async generators. + - version: v14.0.0 + pr-url: https://github.com/nodejs/node/pull/32158 + description: The `pipeline(..., cb)` will wait for the `'close'` event + before invoking the callback. The implementation tries to + detect legacy streams and only apply this behavior to streams + which are expected to emit `'close'`. --> * `source` {Stream|Iterable|AsyncIterable|Function} From 59b64adb79434540157dd8d13883c0cacb908d05 Mon Sep 17 00:00:00 2001 From: Stephen Belanger Date: Fri, 24 Apr 2020 17:02:07 -0700 Subject: [PATCH 81/93] src: add AsyncWrapObject constructor template factory MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/33051 Reviewed-By: Anna Henningsen Reviewed-By: James M Snell Reviewed-By: Gerhard Stöbich Reviewed-By: David Carlier Reviewed-By: Juan José Arboleda --- src/async_wrap.cc | 33 ++++++++++++++++++--------------- 1 file changed, 18 insertions(+), 15 deletions(-) diff --git a/src/async_wrap.cc b/src/async_wrap.cc index b24c160156c280..42837e09818ec2 100644 --- a/src/async_wrap.cc +++ b/src/async_wrap.cc @@ -80,6 +80,20 @@ struct AsyncWrapObject : public AsyncWrap { inline AsyncWrapObject(Environment* env, Local object, ProviderType type) : AsyncWrap(env, object, type) {} + static Local GetConstructorTemplate(Environment* env) { + Local tmpl = env->async_wrap_object_ctor_template(); + if (tmpl.IsEmpty()) { + tmpl = env->NewFunctionTemplate(AsyncWrapObject::New); + tmpl->SetClassName( + FIXED_ONE_BYTE_STRING(env->isolate(), "AsyncWrap")); + tmpl->Inherit(AsyncWrap::GetConstructorTemplate(env)); + tmpl->InstanceTemplate()->SetInternalFieldCount( + AsyncWrapObject::kInternalFieldCount); + env->set_async_wrap_object_ctor_template(tmpl); + } + return tmpl; + } + SET_NO_MEMORY_INFO() SET_MEMORY_INFO_NAME(AsyncWrapObject) SET_SELF_SIZE(AsyncWrapObject) @@ -559,21 +573,10 @@ void AsyncWrap::Initialize(Local target, env->set_async_hooks_promise_resolve_function(Local()); env->set_async_hooks_binding(target); - // TODO(addaleax): This block might better work as a - // AsyncWrapObject::Initialize() or AsyncWrapObject::GetConstructorTemplate() - // function. 
- { - auto class_name = FIXED_ONE_BYTE_STRING(env->isolate(), "AsyncWrap"); - auto function_template = env->NewFunctionTemplate(AsyncWrapObject::New); - function_template->SetClassName(class_name); - function_template->Inherit(AsyncWrap::GetConstructorTemplate(env)); - auto instance_template = function_template->InstanceTemplate(); - instance_template->SetInternalFieldCount(AsyncWrap::kInternalFieldCount); - auto function = - function_template->GetFunction(env->context()).ToLocalChecked(); - target->Set(env->context(), class_name, function).Check(); - env->set_async_wrap_object_ctor_template(function_template); - } + target->Set(env->context(), + FIXED_ONE_BYTE_STRING(env->isolate(), "AsyncWrap"), + AsyncWrapObject::GetConstructorTemplate(env) + ->GetFunction(env->context()).ToLocalChecked()).Check(); } From 8a4de2ef255973660ab4160f6e80df6e9c228a90 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C3=ABl=20Zasso?= Date: Fri, 24 Apr 2020 19:10:35 +0200 Subject: [PATCH 82/93] doc: improve release documentation Extract the "Cherry-pick the Release Commit to master" part to its own section and be more precise about what should be done to handle conflicts. PR-URL: https://github.com/nodejs/node/pull/33042 Reviewed-By: Anna Henningsen Reviewed-By: Richard Lau Reviewed-By: Michael Dawson Reviewed-By: Myles Borins Reviewed-By: Anto Aravinth Reviewed-By: Beth Griggs --- doc/guides/releases.md | 60 +++++++++++++++++++++++++++--------------- 1 file changed, 39 insertions(+), 21 deletions(-) diff --git a/doc/guides/releases.md b/doc/guides/releases.md index 141a264d5123ab..68f4d0b8bacc6f 100644 --- a/doc/guides/releases.md +++ b/doc/guides/releases.md @@ -25,13 +25,15 @@ official release builds for Node.js, hosted on . * [10. Test the Build](#10-test-the-build) * [11. Tag and Sign the Release Commit](#11-tag-and-sign-the-release-commit) * [12. Set Up For the Next Release](#12-set-up-for-the-next-release) - * [13. Promote and Sign the Release Builds](#13-promote-and-sign-the-release-builds) - * [14. Check the Release](#14-check-the-release) - * [15. Create a Blog Post](#15-create-a-blog-post) - * [16. Create the release on GitHub](#16-create-the-release-on-github) - * [17. Cleanup](#17-cleanup) - * [18. Announce](#18-announce) - * [19. Celebrate](#19-celebrate) + * [13. Cherry-pick the Release Commit to `master`](#13-cherry-pick-the-release-commit-to-master) + * [14. Push the release tag](#14-push-the-release-tag) + * [15. Promote and Sign the Release Builds](#15-promote-and-sign-the-release-builds) + * [16. Check the Release](#16-check-the-release) + * [17. Create a Blog Post](#17-create-a-blog-post) + * [18. Create the release on GitHub](#18-create-the-release-on-github) + * [19. Cleanup](#19-cleanup) + * [20. Announce](#20-announce) + * [21. Celebrate](#21-celebrate) * [LTS Releases](#lts-releases) * [Major Releases](#major-releases) @@ -528,15 +530,31 @@ $ git rebase v1.x $ git push upstream v1.x-staging ``` -Cherry-pick the release commit to `master`. After cherry-picking, edit -`src/node_version.h` to ensure the version macros contain whatever values were -previously on `master`. `NODE_VERSION_IS_RELEASE` should be `0`. **Do not** -cherry-pick the "Working on vx.y.z" commit to `master`. +### 13. Cherry-pick the Release Commit to `master` -Run `make lint` before pushing to `master`, to make sure the Changelog -formatting passes the lint rules on `master`. +```console +$ git checkout master +$ git cherry-pick v1.x^ +``` + +Git should stop to let you fix conflicts. 
Revert all changes that were made to +`src/node_version.h`. If there are conflicts in `doc` due to updated `REPLACEME` +placeholders (that happens when a change previously landed on another release +branch), keep both version numbers. Convert the YAML field to an array if it is +not already one. + +Then finish cherry-picking and push the commit upstream: + +```console +$ git add src/node_version.h doc +$ git cherry-pick --continue +$ make lint +$ git push upstream master +``` + +**Do not** cherry-pick the "Working on vx.y.z" commit to `master`. -### 13. Push the release tag +### 14. Push the release tag Push the tag to the repo before you promote the builds. If you haven't pushed your tag first, then build promotion won't work properly. Push the tag using the @@ -549,7 +567,7 @@ $ git push *Note*: Please do not push the tag unless you are ready to complete the remainder of the release steps. -### 14. Promote and Sign the Release Builds +### 15. Promote and Sign the Release Builds **The same individual who signed the release tag must be the one to promote the builds as the `SHASUMS256.txt` file needs to be signed with the @@ -622,7 +640,7 @@ be prompted to re-sign `SHASUMS256.txt`. **It is possible to only sign a release by running `./tools/release.sh -s vX.Y.Z`.** -### 15. Check the Release +### 16. Check the Release Your release should be available at `https://nodejs.org/dist/vx.y.z/` and . Check that the appropriate files are in @@ -631,7 +649,7 @@ have the right internal version strings. Check that the API docs are available at . Check that the release catalog files are correct at and . -### 16. Create a Blog Post +### 17. Create a Blog Post There is an automatic build that is kicked off when you promote new builds, so within a few minutes nodejs.org will be listing your new version as the latest @@ -664,7 +682,7 @@ This script will use the promoted builds and changelog to generate the post. Run * Changes to `master` on the [nodejs.org repository][] will trigger a new build of nodejs.org so your changes should appear a few minutes after pushing. -### 17. Create the release on GitHub +### 18. Create the release on GitHub * Go to the [New release page](https://github.com/nodejs/node/releases/new). * Select the tag version you pushed earlier. @@ -672,11 +690,11 @@ This script will use the promoted builds and changelog to generate the post. Run * For the description, copy the rest of the changelog entry. * Click on the "Publish release" button. -### 18. Cleanup +### 19. Cleanup Close your release proposal PR and delete the proposal branch. -### 19. Announce +### 20. Announce The nodejs.org website will automatically rebuild and include the new version. To announce the build on Twitter through the official @nodejs account, email @@ -693,7 +711,7 @@ announcements. Ping the IRC ops and the other [Partner Communities][] liaisons. -### 20. Celebrate +### 21. 
Celebrate _In whatever form you do this..._ From cb4d8ce889b3c5f41d9937a750476a3d983536ec Mon Sep 17 00:00:00 2001 From: Myles Borins Date: Tue, 21 Apr 2020 23:35:48 -0400 Subject: [PATCH 83/93] module: refactor condition PR-URL: https://github.com/nodejs/node/pull/32989 Reviewed-By: Zeyu Yang Reviewed-By: Colin Ihrig Reviewed-By: Andrey Pechkurov --- lib/internal/errors.js | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/lib/internal/errors.js b/lib/internal/errors.js index 6fa230349072d6..e9e11bf0fb195b 100644 --- a/lib/internal/errors.js +++ b/lib/internal/errors.js @@ -1121,8 +1121,7 @@ E('ERR_INVALID_PACKAGE_TARGET', return `Invalid "exports" main target ${JSONStringify(target)} defined ` + `in the package config ${pkgPath}${sep}package.json${relError ? '; targets must start with "./"' : ''}`; - } else if (typeof target === 'string' && target !== '' && - !StringPrototypeStartsWith(target, './')) { + } else if (relError) { return `Invalid "exports" target ${JSONStringify(target)} defined for '${ StringPrototypeSlice(key, 0, -subpath.length || key.length)}' in the ` + `package config ${pkgPath}${sep}package.json; ` + From 14e559df879511597532e46356170caf9ead0ea2 Mon Sep 17 00:00:00 2001 From: Sam Roberts Date: Tue, 21 Apr 2020 10:39:39 -0700 Subject: [PATCH 84/93] doc: make openssl maintenance position independent MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit It used to have some `cd` commands that if done literally would invalidate the subsequent commands. Modify them to be more accurate, which also simplifies pasting them directly into the console from the guide while doing an update. PR-URL: https://github.com/nodejs/node/pull/32977 Reviewed-By: Richard Lau Reviewed-By: Colin Ihrig Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Gerhard Stöbich --- doc/guides/maintaining-openssl.md | 8 +++----- 1 file changed, 3 insertions(+), 5 deletions(-) diff --git a/doc/guides/maintaining-openssl.md b/doc/guides/maintaining-openssl.md index 5bfe01e0f61b56..af59486b0f3219 100644 --- a/doc/guides/maintaining-openssl.md +++ b/doc/guides/maintaining-openssl.md @@ -57,7 +57,7 @@ This updates all sources in deps/openssl/openssl by: Use `make` to regenerate all platform dependent files in `deps/openssl/config/archs/`: ```sh -% cd deps/openssl/config; make +% make -C deps/openssl/config ``` ## 3. Check diffs @@ -66,8 +66,7 @@ Check diffs if updates are right. Even if no updates in openssl sources, `buildinf.h` files will be updated for they have a timestamp data in them. ```sh -% cd deps/openssl/config -% git diff +% git diff -- deps/openssl ``` *Note*: On Windows, OpenSSL Configure generates `makefile` that can be @@ -95,8 +94,7 @@ The commit message can be (with the openssl version set to the relevant value): After an OpenSSL source update, all the config files need to be regenerated and committed by: - $ cd deps/openssl/config - $ make + $ make -C deps/openssl/config $ git add deps/openssl/config/archs $ git add deps/openssl/openssl/include/crypto/bn_conf.h $ git add deps/openssl/openssl/include/crypto/dso_conf.h From 91e30e35a1a69f11259ee1957d6071f40b1a709c Mon Sep 17 00:00:00 2001 From: Thomas Date: Sat, 4 Apr 2020 18:09:09 +0200 Subject: [PATCH 85/93] build: fix vcbuild error for missing Visual Studio The previous error was wrongly redirecting users to the ICU installation steps, which is unrelated to missing Visual Studio. 
PR-URL: https://github.com/nodejs/node/pull/32658 Reviewed-By: Anna Henningsen Reviewed-By: Bartosz Sosnowski Reviewed-By: Gireesh Punathil Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Jiawen Geng --- vcbuild.bat | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/vcbuild.bat b/vcbuild.bat index 3d8df205423ab5..22c3e237acf570 100644 --- a/vcbuild.bat +++ b/vcbuild.bat @@ -321,7 +321,7 @@ goto msbuild-found :msbuild-not-found echo Failed to find a suitable Visual Studio installation. echo Try to run in a "Developer Command Prompt" or consult -echo https://github.com/nodejs/node/blob/master/BUILDING.md#windows-1 +echo https://github.com/nodejs/node/blob/master/BUILDING.md#windows goto exit :msbuild-found From 794b8796dd7fa48aa25c22a904fd4c5427fdf057 Mon Sep 17 00:00:00 2001 From: Liran Tal Date: Sat, 11 Apr 2020 17:02:59 +0300 Subject: [PATCH 86/93] doc: improve WHATWG url constructor code example Currently, the URL docs for the WHATWG URL spec support are somewhat lacking in their code example of how to access the new URL constructor that lives inside the core url package. PR-URL: https://github.com/nodejs/node/pull/32782 Reviewed-By: Anna Henningsen --- doc/api/url.md | 7 +++++++ 1 file changed, 7 insertions(+) diff --git a/doc/api/url.md b/doc/api/url.md index 49f56509fe41e9..6950c92f0980c8 100644 --- a/doc/api/url.md +++ b/doc/api/url.md @@ -105,6 +105,13 @@ const myURL = new URL('/foo', 'https://example.org/'); // https://example.org/foo ``` +The URL constructor is accessible as a property on the global object. +It can also be imported from the built-in url module: + +```js +console.log(URL === require('url').URL); // Prints 'true'. +``` + A `TypeError` will be thrown if the `input` or `base` are not valid URLs. Note that an effort will be made to coerce the given values into strings. For instance: From 343c6ac63909f14f614e4bd694c74e48cc79a2cd Mon Sep 17 00:00:00 2001 From: Richard Lau Date: Mon, 27 Apr 2020 18:36:43 -0400 Subject: [PATCH 87/93] doc: assign missing deprecation code Signed-off-by: Richard Lau PR-URL: https://github.com/nodejs/node/pull/33109 Refs: https://github.com/nodejs/node/pull/32807 Reviewed-By: Anna Henningsen Reviewed-By: Beth Griggs --- doc/api/deprecations.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/doc/api/deprecations.md b/doc/api/deprecations.md index 0662e32295933f..0e1337ad60f990 100644 --- a/doc/api/deprecations.md +++ b/doc/api/deprecations.md @@ -2651,8 +2651,8 @@ written twice. This introduces a race condition between threads, and is a potential security vulnerability. There is no safe, cross-platform alternative API. - -### DEP0XXX: Use `request.destroy()` instead of `request.abort()` + +### DEP0140: Use `request.destroy()` instead of `request.abort()`

Gold Sponsors

-

Shopify Salesforce MagicLab Airbnb

Silver Sponsors

+

Shopify Salesforce Airbnb

Silver Sponsors

AMP Project

Bronze Sponsors

-

Kasinot.fi Pelisivut Nettikasinot.org BonusFinder Deutschland Top Web Design Agencies Bugsnag Stability Monitoring Mixpanel VPS Server Free Icons by Icons8 UI UX Design Agencies clay Discord ThemeIsle TekHattan Marfeel Fire Stick Tricks

+

CasinoTop.com Casino Topp Writers Per Hour Anagram Solver vpn netflix Kasinot.fi Pelisivut Nettikasinot.org BonusFinder Deutschland Top Web Design Agencies Bugsnag Stability Monitoring Mixpanel VPS Server Free Icons by Icons8 UI UX Design Agencies clay Discord ThemeIsle TekHattan Marfeel Fire Stick Tricks

## Technology Sponsors diff --git a/tools/node_modules/eslint/bin/eslint.js b/tools/node_modules/eslint/bin/eslint.js index a9f51f1d7d4c57..75b413148695e5 100755 --- a/tools/node_modules/eslint/bin/eslint.js +++ b/tools/node_modules/eslint/bin/eslint.js @@ -12,97 +12,135 @@ // to use V8's code cache to speed up instantiation time require("v8-compile-cache"); -//------------------------------------------------------------------------------ -// Helpers -//------------------------------------------------------------------------------ - -const useStdIn = process.argv.includes("--stdin"), - init = process.argv.includes("--init"), - debug = process.argv.includes("--debug"); - // must do this initialization *before* other requires in order to work -if (debug) { +if (process.argv.includes("--debug")) { require("debug").enable("eslint:*,-eslint:code-path"); } //------------------------------------------------------------------------------ -// Requirements +// Helpers //------------------------------------------------------------------------------ -// now we can safely include the other modules that use debug -const path = require("path"), - fs = require("fs"), - cli = require("../lib/cli"); - -//------------------------------------------------------------------------------ -// Execution -//------------------------------------------------------------------------------ +/** + * Read data from stdin til the end. + * + * Note: See + * - https://github.com/nodejs/node/blob/master/doc/api/process.md#processstdin + * - https://github.com/nodejs/node/blob/master/doc/api/process.md#a-note-on-process-io + * - https://lists.gnu.org/archive/html/bug-gnu-emacs/2016-01/msg00419.html + * - https://github.com/nodejs/node/issues/7439 (historical) + * + * On Windows using `fs.readFileSync(STDIN_FILE_DESCRIPTOR, "utf8")` seems + * to read 4096 bytes before blocking and never drains to read further data. + * + * The investigation on the Emacs thread indicates: + * + * > Emacs on MS-Windows uses pipes to communicate with subprocesses; a + * > pipe on Windows has a 4K buffer. So as soon as Emacs writes more than + * > 4096 bytes to the pipe, the pipe becomes full, and Emacs then waits for + * > the subprocess to read its end of the pipe, at which time Emacs will + * > write the rest of the stuff. + * @returns {Promise} The read text. + */ +function readStdin() { + return new Promise((resolve, reject) => { + let content = ""; + let chunk = ""; + + process.stdin + .setEncoding("utf8") + .on("readable", () => { + while ((chunk = process.stdin.read()) !== null) { + content += chunk; + } + }) + .on("end", () => resolve(content)) + .on("error", reject); + }); +} -process.once("uncaughtException", err => { +/** + * Get the error message of a given value. + * @param {any} error The value to get. + * @returns {string} The error message. + */ +function getErrorMessage(error) { - // lazy load + // Lazy loading because those are used only if error happened. + const fs = require("fs"); + const path = require("path"); + const util = require("util"); const lodash = require("lodash"); - if (typeof err.messageTemplate === "string" && err.messageTemplate.length > 0) { - const template = lodash.template(fs.readFileSync(path.resolve(__dirname, `../messages/${err.messageTemplate}.txt`), "utf-8")); - const pkg = require("../package.json"); + // Foolproof -- thirdparty module might throw non-object. + if (typeof error !== "object" || error === null) { + return String(error); + } + + // Use templates if `error.messageTemplate` is present. 
+ if (typeof error.messageTemplate === "string") { + try { + const templateFilePath = path.resolve( + __dirname, + `../messages/${error.messageTemplate}.txt` + ); + + // Use sync API because Node.js should exit at this tick. + const templateText = fs.readFileSync(templateFilePath, "utf-8"); + const template = lodash.template(templateText); + + return template(error.messageData || {}); + } catch { + + // Ignore template error then fallback to use `error.stack`. + } + } - console.error("\nOops! Something went wrong! :("); - console.error(`\nESLint: ${pkg.version}.\n\n${template(err.messageData || {})}`); - } else { - console.error(err.stack); + // Use the stacktrace if it's an error object. + if (typeof error.stack === "string") { + return error.stack; } + // Otherwise, dump the object. + return util.format("%o", error); +} + +/** + * Catch and report unexpected error. + * @param {any} error The thrown error object. + * @returns {void} + */ +function onFatalError(error) { process.exitCode = 2; -}); - -if (useStdIn) { - - /* - * Note: See - * - https://github.com/nodejs/node/blob/master/doc/api/process.md#processstdin - * - https://github.com/nodejs/node/blob/master/doc/api/process.md#a-note-on-process-io - * - https://lists.gnu.org/archive/html/bug-gnu-emacs/2016-01/msg00419.html - * - https://github.com/nodejs/node/issues/7439 (historical) - * - * On Windows using `fs.readFileSync(STDIN_FILE_DESCRIPTOR, "utf8")` seems - * to read 4096 bytes before blocking and never drains to read further data. - * - * The investigation on the Emacs thread indicates: - * - * > Emacs on MS-Windows uses pipes to communicate with subprocesses; a - * > pipe on Windows has a 4K buffer. So as soon as Emacs writes more than - * > 4096 bytes to the pipe, the pipe becomes full, and Emacs then waits for - * > the subprocess to read its end of the pipe, at which time Emacs will - * > write the rest of the stuff. - * - * Using the nodejs code example for reading from stdin. - */ - let contents = "", - chunk = ""; - - process.stdin.setEncoding("utf8"); - process.stdin.on("readable", () => { - - // Use a loop to make sure we read all available data. - while ((chunk = process.stdin.read()) !== null) { - contents += chunk; - } - }); - process.stdin.on("end", () => { - process.exitCode = cli.execute(process.argv, contents, "utf8"); - }); -} else if (init) { - const configInit = require("../lib/init/config-initializer"); - - configInit.initializeConfig().then(() => { - process.exitCode = 0; - }).catch(err => { - process.exitCode = 1; - console.error(err.message); - console.error(err.stack); - }); -} else { - process.exitCode = cli.execute(process.argv); + const { version } = require("../package.json"); + const message = getErrorMessage(error); + + console.error(` +Oops! Something went wrong! :( + +ESLint: ${version} + +${message}`); } + +//------------------------------------------------------------------------------ +// Execution +//------------------------------------------------------------------------------ + +(async function main() { + process.on("uncaughtException", onFatalError); + process.on("unhandledRejection", onFatalError); + + // Call the config initializer if `--init` is present. + if (process.argv.includes("--init")) { + await require("../lib/init/config-initializer").initializeConfig(); + return; + } + + // Otherwise, call the CLI. + process.exitCode = await require("../lib/cli").execute( + process.argv, + process.argv.includes("--stdin") ? 
await readStdin() : null + ); +}()).catch(onFatalError); diff --git a/tools/node_modules/eslint/lib/api.js b/tools/node_modules/eslint/lib/api.js index 40a5cc9fa5ccd4..e4b6643b44780a 100644 --- a/tools/node_modules/eslint/lib/api.js +++ b/tools/node_modules/eslint/lib/api.js @@ -6,6 +6,7 @@ "use strict"; const { CLIEngine } = require("./cli-engine"); +const { ESLint } = require("./eslint"); const { Linter } = require("./linter"); const { RuleTester } = require("./rule-tester"); const { SourceCode } = require("./source-code"); @@ -13,6 +14,7 @@ const { SourceCode } = require("./source-code"); module.exports = { Linter, CLIEngine, + ESLint, RuleTester, SourceCode }; diff --git a/tools/node_modules/eslint/lib/cli-engine/cascading-config-array-factory.js b/tools/node_modules/eslint/lib/cli-engine/cascading-config-array-factory.js index b53f67bd9dce6c..f54605c4db991e 100644 --- a/tools/node_modules/eslint/lib/cli-engine/cascading-config-array-factory.js +++ b/tools/node_modules/eslint/lib/cli-engine/cascading-config-array-factory.js @@ -279,6 +279,18 @@ class CascadingConfigArrayFactory { ); } + /** + * Set the config data to override all configs. + * Require to call `clearCache()` method after this method is called. + * @param {ConfigData} configData The config data to override all configs. + * @returns {void} + */ + setOverrideConfig(configData) { + const slots = internalSlotsMap.get(this); + + slots.cliConfigData = configData; + } + /** * Clear config cache. * @returns {void} diff --git a/tools/node_modules/eslint/lib/cli-engine/cli-engine.js b/tools/node_modules/eslint/lib/cli-engine/cli-engine.js index 72d1fa4d5dcd5d..b6aa995beef933 100644 --- a/tools/node_modules/eslint/lib/cli-engine/cli-engine.js +++ b/tools/node_modules/eslint/lib/cli-engine/cli-engine.js @@ -39,6 +39,7 @@ const validFixTypes = new Set(["problem", "suggestion", "layout"]); // For VSCode IntelliSense /** @typedef {import("../shared/types").ConfigData} ConfigData */ +/** @typedef {import("../shared/types").DeprecatedRuleInfo} DeprecatedRuleInfo */ /** @typedef {import("../shared/types").LintMessage} LintMessage */ /** @typedef {import("../shared/types").ParserOptions} ParserOptions */ /** @typedef {import("../shared/types").Plugin} Plugin */ @@ -50,29 +51,29 @@ const validFixTypes = new Set(["problem", "suggestion", "layout"]); /** * The options to configure a CLI engine with. * @typedef {Object} CLIEngineOptions - * @property {boolean} allowInlineConfig Enable or disable inline configuration comments. - * @property {ConfigData} baseConfig Base config object, extended by all configs used with this CLIEngine instance - * @property {boolean} cache Enable result caching. - * @property {string} cacheLocation The cache file to use instead of .eslintcache. - * @property {string} configFile The configuration file to use. - * @property {string} cwd The value to use for the current working directory. - * @property {string[]} envs An array of environments to load. - * @property {string[]|null} extensions An array of file extensions to check. - * @property {boolean|Function} fix Execute in autofix mode. If a function, should return a boolean. - * @property {string[]} fixTypes Array of rule types to apply fixes for. - * @property {string[]} globals An array of global variables to declare. - * @property {boolean} ignore False disables use of .eslintignore. - * @property {string} ignorePath The ignore file to use instead of .eslintignore. - * @property {string|string[]} ignorePattern One or more glob patterns to ignore. 
- * @property {boolean} useEslintrc False disables looking for .eslintrc - * @property {string} parser The name of the parser to use. - * @property {ParserOptions} parserOptions An object of parserOption settings to use. - * @property {string[]} plugins An array of plugins to load. - * @property {Record} rules An object of rules to use. - * @property {string[]} rulePaths An array of directories to load custom rules from. - * @property {boolean} reportUnusedDisableDirectives `true` adds reports for unused eslint-disable directives - * @property {boolean} globInputPaths Set to false to skip glob resolution of input file paths to lint (default: true). If false, each input file paths is assumed to be a non-glob path to an existing file. - * @property {string} resolvePluginsRelativeTo The folder where plugins should be resolved from, defaulting to the CWD + * @property {boolean} [allowInlineConfig] Enable or disable inline configuration comments. + * @property {ConfigData} [baseConfig] Base config object, extended by all configs used with this CLIEngine instance + * @property {boolean} [cache] Enable result caching. + * @property {string} [cacheLocation] The cache file to use instead of .eslintcache. + * @property {string} [configFile] The configuration file to use. + * @property {string} [cwd] The value to use for the current working directory. + * @property {string[]} [envs] An array of environments to load. + * @property {string[]|null} [extensions] An array of file extensions to check. + * @property {boolean|Function} [fix] Execute in autofix mode. If a function, should return a boolean. + * @property {string[]} [fixTypes] Array of rule types to apply fixes for. + * @property {string[]} [globals] An array of global variables to declare. + * @property {boolean} [ignore] False disables use of .eslintignore. + * @property {string} [ignorePath] The ignore file to use instead of .eslintignore. + * @property {string|string[]} [ignorePattern] One or more glob patterns to ignore. + * @property {boolean} [useEslintrc] False disables looking for .eslintrc + * @property {string} [parser] The name of the parser to use. + * @property {ParserOptions} [parserOptions] An object of parserOption settings to use. + * @property {string[]} [plugins] An array of plugins to load. + * @property {Record} [rules] An object of rules to use. + * @property {string[]} [rulePaths] An array of directories to load custom rules from. + * @property {boolean} [reportUnusedDisableDirectives] `true` adds reports for unused eslint-disable directives + * @property {boolean} [globInputPaths] Set to false to skip glob resolution of input file paths to lint (default: true). If false, each input file paths is assumed to be a non-glob path to an existing file. + * @property {string} [resolvePluginsRelativeTo] The folder where plugins should be resolved from, defaulting to the CWD */ /** @@ -88,13 +89,6 @@ const validFixTypes = new Set(["problem", "suggestion", "layout"]); * @property {string} [output] The source code of the file that was linted, with as many fixes applied as possible. */ -/** - * Information of deprecated rules. - * @typedef {Object} DeprecatedRuleInfo - * @property {string} ruleId The rule ID. - * @property {string[]} replacedBy The rule IDs that replace this deprecated rule. - */ - /** * Linting results. * @typedef {Object} LintReport @@ -821,16 +815,22 @@ class CLIEngine { lintResultCache.reconcile(); } - // Collect used deprecated rules. 
- const usedDeprecatedRules = Array.from( - iterateRuleDeprecationWarnings(lastConfigArrays) - ); - debug(`Linting complete in: ${Date.now() - startTime}ms`); + let usedDeprecatedRules; + return { results, ...calculateStatsPerRun(results), - usedDeprecatedRules + + // Initialize it lazily because CLI and `ESLint` API don't use it. + get usedDeprecatedRules() { + if (!usedDeprecatedRules) { + usedDeprecatedRules = Array.from( + iterateRuleDeprecationWarnings(lastConfigArrays) + ); + } + return usedDeprecatedRules; + } }; } @@ -858,9 +858,9 @@ class CLIEngine { const startTime = Date.now(); const resolvedFilename = filename && path.resolve(cwd, filename); + // Clear the last used config arrays. lastConfigArrays.length = 0; - if (resolvedFilename && this.isPathIgnored(resolvedFilename)) { if (warnIgnored) { results.push(createIgnoreResult(resolvedFilename, cwd)); @@ -892,16 +892,22 @@ class CLIEngine { })); } - // Collect used deprecated rules. - const usedDeprecatedRules = Array.from( - iterateRuleDeprecationWarnings(lastConfigArrays) - ); - debug(`Linting complete in: ${Date.now() - startTime}ms`); + let usedDeprecatedRules; + return { results, ...calculateStatsPerRun(results), - usedDeprecatedRules + + // Initialize it lazily because CLI and `ESLint` API don't use it. + get usedDeprecatedRules() { + if (!usedDeprecatedRules) { + usedDeprecatedRules = Array.from( + iterateRuleDeprecationWarnings(lastConfigArrays) + ); + } + return usedDeprecatedRules; + } }; } @@ -955,11 +961,10 @@ class CLIEngine { } /** - * Returns the formatter representing the given format or null if no formatter - * with the given name can be found. + * Returns the formatter representing the given format or null if the `format` is not a string. * @param {string} [format] The name of the format to load or the path to a * custom formatter. - * @returns {Function} The formatter function or null if not found. + * @returns {(Function|null)} The formatter function or null if the `format` is not a string. */ getFormatter(format) { diff --git a/tools/node_modules/eslint/lib/cli-engine/config-array-factory.js b/tools/node_modules/eslint/lib/cli-engine/config-array-factory.js index b1429af6ad95cf..fa3fdb3bedd89b 100644 --- a/tools/node_modules/eslint/lib/cli-engine/config-array-factory.js +++ b/tools/node_modules/eslint/lib/cli-engine/config-array-factory.js @@ -817,7 +817,7 @@ class ConfigArrayFactory { if (configData) { return this._normalizeConfigData(configData, { ...ctx, - filePath: plugin.filePath, + filePath: plugin.filePath || ctx.filePath, name: `${ctx.name} » plugin:${plugin.id}/${configName}` }); } @@ -978,7 +978,7 @@ class ConfigArrayFactory { if (plugin) { return new ConfigDependency({ definition: normalizePlugin(plugin), - filePath: ctx.filePath, + filePath: "", // It's unknown where the plugin came from. 
id, importerName: ctx.name, importerPath: ctx.filePath diff --git a/tools/node_modules/eslint/lib/cli-engine/config-array/config-array.js b/tools/node_modules/eslint/lib/cli-engine/config-array/config-array.js index b3434198b19201..42a7362737fc7c 100644 --- a/tools/node_modules/eslint/lib/cli-engine/config-array/config-array.js +++ b/tools/node_modules/eslint/lib/cli-engine/config-array/config-array.js @@ -107,7 +107,7 @@ function getMatchedIndices(elements, filePath) { for (let i = elements.length - 1; i >= 0; --i) { const element = elements[i]; - if (!element.criteria || element.criteria.test(filePath)) { + if (!element.criteria || (filePath && element.criteria.test(filePath))) { indices.push(i); } } diff --git a/tools/node_modules/eslint/lib/cli-engine/config-array/ignore-pattern.js b/tools/node_modules/eslint/lib/cli-engine/config-array/ignore-pattern.js index 92690b9f8ae342..6eaec4258e1ae4 100644 --- a/tools/node_modules/eslint/lib/cli-engine/config-array/ignore-pattern.js +++ b/tools/node_modules/eslint/lib/cli-engine/config-array/ignore-pattern.js @@ -71,7 +71,13 @@ function getCommonAncestorPath(sourcePaths) { } } - return result || path.sep; + let resolvedResult = result || path.sep; + + // if Windows common ancestor is root of drive must have trailing slash to be absolute. + if (resolvedResult && resolvedResult.endsWith(":") && process.platform === "win32") { + resolvedResult += path.sep; + } + return resolvedResult; } /** diff --git a/tools/node_modules/eslint/lib/cli.js b/tools/node_modules/eslint/lib/cli.js index 815ce68c22fe2e..ce11878008f108 100644 --- a/tools/node_modules/eslint/lib/cli.js +++ b/tools/node_modules/eslint/lib/cli.js @@ -17,105 +17,176 @@ const fs = require("fs"), path = require("path"), - { CLIEngine } = require("./cli-engine"), - options = require("./options"), + { promisify } = require("util"), + { ESLint } = require("./eslint"), + CLIOptions = require("./options"), log = require("./shared/logging"), RuntimeInfo = require("./shared/runtime-info"); const debug = require("debug")("eslint:cli"); +//------------------------------------------------------------------------------ +// Types +//------------------------------------------------------------------------------ + +/** @typedef {import("./eslint/eslint").ESLintOptions} ESLintOptions */ +/** @typedef {import("./eslint/eslint").LintMessage} LintMessage */ +/** @typedef {import("./eslint/eslint").LintResult} LintResult */ + //------------------------------------------------------------------------------ // Helpers //------------------------------------------------------------------------------ +const mkdir = promisify(fs.mkdir); +const stat = promisify(fs.stat); +const writeFile = promisify(fs.writeFile); + /** * Predicate function for whether or not to apply fixes in quiet mode. * If a message is a warning, do not apply a fix. - * @param {LintResult} lintResult The lint result. + * @param {LintMessage} message The lint result. * @returns {boolean} True if the lint message is an error (and thus should be * autofixed), false otherwise. */ -function quietFixPredicate(lintResult) { - return lintResult.severity === 2; +function quietFixPredicate(message) { + return message.severity === 2; } /** * Translates the CLI options into the options expected by the CLIEngine. * @param {Object} cliOptions The CLI options to translate. - * @returns {CLIEngineOptions} The options object for the CLIEngine. + * @returns {ESLintOptions} The options object for the CLIEngine. 
* @private */ -function translateOptions(cliOptions) { +function translateOptions({ + cache, + cacheFile, + cacheLocation, + config, + env, + errorOnUnmatchedPattern, + eslintrc, + ext, + fix, + fixDryRun, + fixType, + global, + ignore, + ignorePath, + ignorePattern, + inlineConfig, + parser, + parserOptions, + plugin, + quiet, + reportUnusedDisableDirectives, + resolvePluginsRelativeTo, + rule, + rulesdir +}) { return { - envs: cliOptions.env, - extensions: cliOptions.ext, - rules: cliOptions.rule, - plugins: cliOptions.plugin, - globals: cliOptions.global, - ignore: cliOptions.ignore, - ignorePath: cliOptions.ignorePath, - ignorePattern: cliOptions.ignorePattern, - configFile: cliOptions.config, - rulePaths: cliOptions.rulesdir, - useEslintrc: cliOptions.eslintrc, - parser: cliOptions.parser, - parserOptions: cliOptions.parserOptions, - cache: cliOptions.cache, - cacheFile: cliOptions.cacheFile, - cacheLocation: cliOptions.cacheLocation, - fix: (cliOptions.fix || cliOptions.fixDryRun) && (cliOptions.quiet ? quietFixPredicate : true), - fixTypes: cliOptions.fixType, - allowInlineConfig: cliOptions.inlineConfig, - reportUnusedDisableDirectives: cliOptions.reportUnusedDisableDirectives, - resolvePluginsRelativeTo: cliOptions.resolvePluginsRelativeTo, - errorOnUnmatchedPattern: cliOptions.errorOnUnmatchedPattern + allowInlineConfig: inlineConfig, + cache, + cacheLocation: cacheLocation || cacheFile, + errorOnUnmatchedPattern, + extensions: ext, + fix: (fix || fixDryRun) && (quiet ? quietFixPredicate : true), + fixTypes: fixType, + ignore, + ignorePath, + overrideConfig: { + env: env && env.reduce((obj, name) => { + obj[name] = true; + return obj; + }, {}), + globals: global && global.reduce((obj, name) => { + if (name.endsWith(":true")) { + obj[name.slice(0, -5)] = "writable"; + } else { + obj[name] = "readonly"; + } + return obj; + }, {}), + ignorePatterns: ignorePattern, + parser, + parserOptions, + plugins: plugin, + rules: rule + }, + overrideConfigFile: config, + reportUnusedDisableDirectives: reportUnusedDisableDirectives ? "error" : void 0, + resolvePluginsRelativeTo, + rulePaths: rulesdir, + useEslintrc: eslintrc }; } +/** + * Count error messages. + * @param {LintResult[]} results The lint results. + * @returns {{errorCount:number;warningCount:number}} The number of error messages. + */ +function countErrors(results) { + let errorCount = 0; + let warningCount = 0; + + for (const result of results) { + errorCount += result.errorCount; + warningCount += result.warningCount; + } + + return { errorCount, warningCount }; +} + +/** + * Check if a given file path is a directory or not. + * @param {string} filePath The path to a file to check. + * @returns {Promise} `true` if the given path is a directory. + */ +async function isDirectory(filePath) { + try { + return (await stat(filePath)).isDirectory(); + } catch (error) { + if (error.code === "ENOENT" || error.code === "ENOTDIR") { + return false; + } + throw error; + } +} + /** * Outputs the results of the linting. - * @param {CLIEngine} engine The CLIEngine to use. + * @param {ESLint} engine The ESLint instance to use. * @param {LintResult[]} results The results to print. * @param {string} format The name of the formatter to use or the path to the formatter. * @param {string} outputFile The path for the output file. - * @returns {boolean} True if the printing succeeds, false if not. + * @returns {Promise} True if the printing succeeds, false if not. 
* @private */ -function printResults(engine, results, format, outputFile) { +async function printResults(engine, results, format, outputFile) { let formatter; - let rulesMeta; try { - formatter = engine.getFormatter(format); + formatter = await engine.loadFormatter(format); } catch (e) { log.error(e.message); return false; } - const output = formatter(results, { - get rulesMeta() { - if (!rulesMeta) { - rulesMeta = {}; - for (const [ruleId, rule] of engine.getRules()) { - rulesMeta[ruleId] = rule.meta; - } - } - return rulesMeta; - } - }); + const output = formatter.format(results); if (output) { if (outputFile) { const filePath = path.resolve(process.cwd(), outputFile); - if (fs.existsSync(filePath) && fs.statSync(filePath).isDirectory()) { + if (await isDirectory(filePath)) { log.error("Cannot write to output file path, it is a directory: %s", outputFile); return false; } try { - fs.mkdirSync(path.dirname(filePath), { recursive: true }); - fs.writeFileSync(filePath, output); + await mkdir(path.dirname(filePath), { recursive: true }); + await writeFile(filePath, output); } catch (ex) { log.error("There was a problem writing the output file:\n%s", ex); return false; @@ -126,7 +197,6 @@ function printResults(engine, results, format, outputFile) { } return true; - } //------------------------------------------------------------------------------ @@ -143,28 +213,33 @@ const cli = { * Executes the CLI based on an array of arguments that is passed in. * @param {string|Array|Object} args The arguments to process. * @param {string} [text] The text to lint (used for TTY). - * @returns {int} The exit code for the operation. + * @returns {Promise} The exit code for the operation. */ - execute(args, text) { + async execute(args, text) { if (Array.isArray(args)) { debug("CLI args: %o", args.slice(2)); } - - let currentOptions; + let options; try { - currentOptions = options.parse(args); + options = CLIOptions.parse(args); } catch (error) { log.error(error.message); return 2; } - const files = currentOptions._; + const files = options._; const useStdin = typeof text === "string"; - if (currentOptions.version) { + if (options.help) { + log.info(CLIOptions.generateHelp()); + return 0; + } + if (options.version) { log.info(RuntimeInfo.version()); - } else if (currentOptions.envInfo) { + return 0; + } + if (options.envInfo) { try { log.info(RuntimeInfo.environment()); return 0; @@ -172,7 +247,9 @@ const cli = { log.error(err.message); return 2; } - } else if (currentOptions.printConfig) { + } + + if (options.printConfig) { if (files.length) { log.error("The --print-config option must be used with exactly one file name."); return 2; @@ -182,58 +259,67 @@ const cli = { return 2; } - const engine = new CLIEngine(translateOptions(currentOptions)); - const fileConfig = engine.getConfigForFile(currentOptions.printConfig); + const engine = new ESLint(translateOptions(options)); + const fileConfig = + await engine.calculateConfigForFile(options.printConfig); log.info(JSON.stringify(fileConfig, null, " ")); return 0; - } else if (currentOptions.help || (!files.length && !useStdin)) { - log.info(options.generateHelp()); - } else { - debug(`Running on ${useStdin ? 
"text" : "files"}`); - - if (currentOptions.fix && currentOptions.fixDryRun) { - log.error("The --fix option and the --fix-dry-run option cannot be used together."); - return 2; - } + } - if (useStdin && currentOptions.fix) { - log.error("The --fix option is not available for piped-in code; use --fix-dry-run instead."); - return 2; - } + debug(`Running on ${useStdin ? "text" : "files"}`); - if (currentOptions.fixType && !currentOptions.fix && !currentOptions.fixDryRun) { - log.error("The --fix-type option requires either --fix or --fix-dry-run."); - return 2; - } + if (options.fix && options.fixDryRun) { + log.error("The --fix option and the --fix-dry-run option cannot be used together."); + return 2; + } + if (useStdin && options.fix) { + log.error("The --fix option is not available for piped-in code; use --fix-dry-run instead."); + return 2; + } + if (options.fixType && !options.fix && !options.fixDryRun) { + log.error("The --fix-type option requires either --fix or --fix-dry-run."); + return 2; + } - const engine = new CLIEngine(translateOptions(currentOptions)); - const report = useStdin ? engine.executeOnText(text, currentOptions.stdinFilename, true) : engine.executeOnFiles(files); + const engine = new ESLint(translateOptions(options)); + let results; - if (currentOptions.fix) { - debug("Fix mode enabled - applying fixes"); - CLIEngine.outputFixes(report); - } + if (useStdin) { + results = await engine.lintText(text, { + filePath: options.stdinFilename, + warnIgnored: true + }); + } else { + results = await engine.lintFiles(files); + } - if (currentOptions.quiet) { - debug("Quiet mode enabled - filtering out warnings"); - report.results = CLIEngine.getErrorResults(report.results); - } + if (options.fix) { + debug("Fix mode enabled - applying fixes"); + await ESLint.outputFixes(results); + } - if (printResults(engine, report.results, currentOptions.format, currentOptions.outputFile)) { - const tooManyWarnings = currentOptions.maxWarnings >= 0 && report.warningCount > currentOptions.maxWarnings; + if (options.quiet) { + debug("Quiet mode enabled - filtering out warnings"); + results = ESLint.getErrorResults(results); + } - if (!report.errorCount && tooManyWarnings) { - log.error("ESLint found too many warnings (maximum: %s).", currentOptions.maxWarnings); - } + if (await printResults(engine, results, options.format, options.outputFile)) { + const { errorCount, warningCount } = countErrors(results); + const tooManyWarnings = + options.maxWarnings >= 0 && warningCount > options.maxWarnings; - return (report.errorCount || tooManyWarnings) ? 1 : 0; + if (!errorCount && tooManyWarnings) { + log.error( + "ESLint found too many warnings (maximum: %s).", + options.maxWarnings + ); } - return 2; + return (errorCount || tooManyWarnings) ? 
1 : 0; } - return 0; + return 2; } }; diff --git a/tools/node_modules/eslint/lib/eslint/eslint.js b/tools/node_modules/eslint/lib/eslint/eslint.js new file mode 100644 index 00000000000000..d195aab09f1918 --- /dev/null +++ b/tools/node_modules/eslint/lib/eslint/eslint.js @@ -0,0 +1,656 @@ +/** + * @fileoverview Main API Class + * @author Kai Cataldo + * @author Toru Nagashima + */ + +"use strict"; + +//------------------------------------------------------------------------------ +// Requirements +//------------------------------------------------------------------------------ + +const path = require("path"); +const fs = require("fs"); +const { promisify } = require("util"); +const { CLIEngine, getCLIEngineInternalSlots } = require("../cli-engine/cli-engine"); +const BuiltinRules = require("../rules"); +const { getRuleSeverity } = require("../shared/config-ops"); +const { version } = require("../../package.json"); + +//------------------------------------------------------------------------------ +// Typedefs +//------------------------------------------------------------------------------ + +/** @typedef {import("../cli-engine/cli-engine").LintReport} CLIEngineLintReport */ +/** @typedef {import("../shared/types").DeprecatedRuleInfo} DeprecatedRuleInfo */ +/** @typedef {import("../shared/types").ConfigData} ConfigData */ +/** @typedef {import("../shared/types").LintMessage} LintMessage */ +/** @typedef {import("../shared/types").Plugin} Plugin */ +/** @typedef {import("../shared/types").Rule} Rule */ +/** @typedef {import("./load-formatter").Formatter} Formatter */ + +/** + * The options with which to configure the ESLint instance. + * @typedef {Object} ESLintOptions + * @property {boolean} [allowInlineConfig] Enable or disable inline configuration comments. + * @property {ConfigData} [baseConfig] Base config object, extended by all configs used with this instance + * @property {boolean} [cache] Enable result caching. + * @property {string} [cacheLocation] The cache file to use instead of .eslintcache. + * @property {string} [cwd] The value to use for the current working directory. + * @property {boolean} [errorOnUnmatchedPattern] If `false` then `ESLint#lintFiles()` doesn't throw even if no target files found. Defaults to `true`. + * @property {string[]} [extensions] An array of file extensions to check. + * @property {boolean|Function} [fix] Execute in autofix mode. If a function, should return a boolean. + * @property {string[]} [fixTypes] Array of rule types to apply fixes for. + * @property {boolean} [globInputPaths] Set to false to skip glob resolution of input file paths to lint (default: true). If false, each input file paths is assumed to be a non-glob path to an existing file. + * @property {boolean} [ignore] False disables use of .eslintignore. + * @property {string} [ignorePath] The ignore file to use instead of .eslintignore. + * @property {ConfigData} [overrideConfig] Override config object, overrides all configs used with this instance + * @property {string} [overrideConfigFile] The configuration file to use. + * @property {Record} [plugins] An array of plugin implementations. + * @property {"error" | "warn" | "off"} [reportUnusedDisableDirectives] the severity to report unused eslint-disable directives. + * @property {string} [resolvePluginsRelativeTo] The folder where plugins should be resolved from, defaulting to the CWD. + * @property {string[]} [rulePaths] An array of directories to load custom rules from. 
+ * @property {boolean} [useEslintrc] False disables looking for .eslintrc.* files. + */ + +/** + * A rules metadata object. + * @typedef {Object} RulesMeta + * @property {string} id The plugin ID. + * @property {Object} definition The plugin definition. + */ + +/** + * A linting result. + * @typedef {Object} LintResult + * @property {string} filePath The path to the file that was linted. + * @property {LintMessage[]} messages All of the messages for the result. + * @property {number} errorCount Number of errors for the result. + * @property {number} warningCount Number of warnings for the result. + * @property {number} fixableErrorCount Number of fixable errors for the result. + * @property {number} fixableWarningCount Number of fixable warnings for the result. + * @property {string} [source] The source code of the file that was linted. + * @property {string} [output] The source code of the file that was linted, with as many fixes applied as possible. + * @property {DeprecatedRuleInfo[]} usedDeprecatedRules The list of used deprecated rules. + */ + +/** + * Private members for the `ESLint` instance. + * @typedef {Object} ESLintPrivateMembers + * @property {CLIEngine} cliEngine The wrapped CLIEngine instance. + * @property {ESLintOptions} options The options used to instantiate the ESLint instance. + */ + +//------------------------------------------------------------------------------ +// Helpers +//------------------------------------------------------------------------------ + +const writeFile = promisify(fs.writeFile); + +/** + * The map with which to store private class members. + * @type {WeakMap} + */ +const privateMembersMap = new WeakMap(); + +/** + * Check if a given value is a non-empty string or not. + * @param {any} x The value to check. + * @returns {boolean} `true` if `x` is a non-empty string. + */ +function isNonEmptyString(x) { + return typeof x === "string" && x.trim() !== ""; +} + +/** + * Check if a given value is an array of non-empty stringss or not. + * @param {any} x The value to check. + * @returns {boolean} `true` if `x` is an array of non-empty stringss. + */ +function isArrayOfNonEmptyString(x) { + return Array.isArray(x) && x.every(isNonEmptyString); +} + +/** + * Check if a given value is a valid fix type or not. + * @param {any} x The value to check. + * @returns {boolean} `true` if `x` is valid fix type. + */ +function isFixType(x) { + return x === "problem" || x === "suggestion" || x === "layout"; +} + +/** + * Check if a given value is an array of fix types or not. + * @param {any} x The value to check. + * @returns {boolean} `true` if `x` is an array of fix types. + */ +function isFixTypeArray(x) { + return Array.isArray(x) && x.every(isFixType); +} + +/** + * The error for invalid options. + */ +class ESLintInvalidOptionsError extends Error { + constructor(messages) { + super(`Invalid Options:\n- ${messages.join("\n- ")}`); + this.code = "ESLINT_INVALID_OPTIONS"; + Error.captureStackTrace(this, ESLintInvalidOptionsError); + } +} + +/** + * Validates and normalizes options for the wrapped CLIEngine instance. + * @param {ESLintOptions} options The options to process. + * @returns {ESLintOptions} The normalized options. + */ +function processOptions({ + allowInlineConfig = true, // ← we cannot use `overrideConfig.noInlineConfig` instead because `allowInlineConfig` has side-effect that suppress warnings that show inline configs are ignored. 
+ baseConfig = null, + cache = false, + cacheLocation = ".eslintcache", + cwd = process.cwd(), + errorOnUnmatchedPattern = true, + extensions = null, // ← should be null by default because if it's an array then it suppresses RFC20 feature. + fix = false, + fixTypes = null, // ← should be null by default because if it's an array then it suppresses rules that don't have the `meta.type` property. + globInputPaths = true, + ignore = true, + ignorePath = null, // ← should be null by default because if it's a string then it may throw ENOENT. + overrideConfig = null, + overrideConfigFile = null, + plugins = {}, + reportUnusedDisableDirectives = null, // ← should be null by default because if it's a string then it overrides the 'reportUnusedDisableDirectives' setting in config files. And we cannot use `overrideConfig.reportUnusedDisableDirectives` instead because we cannot configure the `error` severity with that. + resolvePluginsRelativeTo = null, // ← should be null by default because if it's a string then it suppresses RFC47 feature. + rulePaths = [], + useEslintrc = true, + ...unknownOptions +}) { + const errors = []; + const unknownOptionKeys = Object.keys(unknownOptions); + + if (unknownOptionKeys.length >= 1) { + errors.push(`Unknown options: ${unknownOptionKeys.join(", ")}`); + if (unknownOptionKeys.includes("cacheFile")) { + errors.push("'cacheFile' has been removed. Please use the 'cacheLocation' option instead."); + } + if (unknownOptionKeys.includes("configFile")) { + errors.push("'configFile' has been removed. Please use the 'overrideConfigFile' option instead."); + } + if (unknownOptionKeys.includes("envs")) { + errors.push("'envs' has been removed. Please use the 'overrideConfig.env' option instead."); + } + if (unknownOptionKeys.includes("globals")) { + errors.push("'globals' has been removed. Please use the 'overrideConfig.globals' option instead."); + } + if (unknownOptionKeys.includes("ignorePattern")) { + errors.push("'ignorePattern' has been removed. Please use the 'overrideConfig.ignorePatterns' option instead."); + } + if (unknownOptionKeys.includes("parser")) { + errors.push("'parser' has been removed. Please use the 'overrideConfig.parser' option instead."); + } + if (unknownOptionKeys.includes("parserOptions")) { + errors.push("'parserOptions' has been removed. Please use the 'overrideConfig.parserOptions' option instead."); + } + if (unknownOptionKeys.includes("rules")) { + errors.push("'rules' has been removed. 
Please use the 'overrideConfig.rules' option instead."); + } + } + if (typeof allowInlineConfig !== "boolean") { + errors.push("'allowInlineConfig' must be a boolean."); + } + if (typeof baseConfig !== "object") { + errors.push("'baseConfig' must be an object or null."); + } + if (typeof cache !== "boolean") { + errors.push("'cache' must be a boolean."); + } + if (!isNonEmptyString(cacheLocation)) { + errors.push("'cacheLocation' must be a non-empty string."); + } + if (!isNonEmptyString(cwd) || !path.isAbsolute(cwd)) { + errors.push("'cwd' must be an absolute path."); + } + if (typeof errorOnUnmatchedPattern !== "boolean") { + errors.push("'errorOnUnmatchedPattern' must be a boolean."); + } + if (!isArrayOfNonEmptyString(extensions) && extensions !== null) { + errors.push("'extensions' must be an array of non-empty strings or null."); + } + if (typeof fix !== "boolean" && typeof fix !== "function") { + errors.push("'fix' must be a boolean or a function."); + } + if (fixTypes !== null && !isFixTypeArray(fixTypes)) { + errors.push("'fixTypes' must be an array of any of \"problem\", \"suggestion\", and \"layout\"."); + } + if (typeof globInputPaths !== "boolean") { + errors.push("'globInputPaths' must be a boolean."); + } + if (typeof ignore !== "boolean") { + errors.push("'ignore' must be a boolean."); + } + if (!isNonEmptyString(ignorePath) && ignorePath !== null) { + errors.push("'ignorePath' must be a non-empty string or null."); + } + if (typeof overrideConfig !== "object") { + errors.push("'overrideConfig' must be an object or null."); + } + if (!isNonEmptyString(overrideConfigFile) && overrideConfigFile !== null) { + errors.push("'overrideConfigFile' must be a non-empty string or null."); + } + if (typeof plugins !== "object") { + errors.push("'plugins' must be an object or null."); + } else if (plugins !== null && Object.keys(plugins).includes("")) { + errors.push("'plugins' must not include an empty string."); + } + if (Array.isArray(plugins)) { + errors.push("'plugins' doesn't add plugins to configuration to load. Please use the 'overrideConfig.plugins' option instead."); + } + if ( + reportUnusedDisableDirectives !== "error" && + reportUnusedDisableDirectives !== "warn" && + reportUnusedDisableDirectives !== "off" && + reportUnusedDisableDirectives !== null + ) { + errors.push("'reportUnusedDisableDirectives' must be any of \"error\", \"warn\", \"off\", and null."); + } + if ( + !isNonEmptyString(resolvePluginsRelativeTo) && + resolvePluginsRelativeTo !== null + ) { + errors.push("'resolvePluginsRelativeTo' must be a non-empty string or null."); + } + if (!isArrayOfNonEmptyString(rulePaths)) { + errors.push("'rulePaths' must be an array of non-empty strings."); + } + if (typeof useEslintrc !== "boolean") { + errors.push("'useElintrc' must be a boolean."); + } + + if (errors.length > 0) { + throw new ESLintInvalidOptionsError(errors); + } + + return { + allowInlineConfig, + baseConfig, + cache, + cacheLocation, + configFile: overrideConfigFile, + cwd, + errorOnUnmatchedPattern, + extensions, + fix, + fixTypes, + globInputPaths, + ignore, + ignorePath, + reportUnusedDisableDirectives, + resolvePluginsRelativeTo, + rulePaths, + useEslintrc + }; +} + +/** + * Check if a value has one or more properties and that value is not undefined. + * @param {any} obj The value to check. + * @returns {boolean} `true` if `obj` has one or more properties that that value is not undefined. 
+ */ +function hasDefinedProperty(obj) { + if (typeof obj === "object" && obj !== null) { + for (const key in obj) { + if (typeof obj[key] !== "undefined") { + return true; + } + } + } + return false; +} + +/** + * Create rulesMeta object. + * @param {Map} rules a map of rules from which to generate the object. + * @returns {Object} metadata for all enabled rules. + */ +function createRulesMeta(rules) { + return Array.from(rules).reduce((retVal, [id, rule]) => { + retVal[id] = rule.meta; + return retVal; + }, {}); +} + +/** @type {WeakMap} */ +const usedDeprecatedRulesCache = new WeakMap(); + +/** + * Create used deprecated rule list. + * @param {CLIEngine} cliEngine The CLIEngine instance. + * @param {string} maybeFilePath The absolute path to a lint target file or `""`. + * @returns {DeprecatedRuleInfo[]} The used deprecated rule list. + */ +function getOrFindUsedDeprecatedRules(cliEngine, maybeFilePath) { + const { + configArrayFactory, + options: { cwd } + } = getCLIEngineInternalSlots(cliEngine); + const filePath = path.isAbsolute(maybeFilePath) + ? maybeFilePath + : path.join(cwd, "__placeholder__.js"); + const configArray = configArrayFactory.getConfigArrayForFile(filePath); + const config = configArray.extractConfig(filePath); + + // Most files use the same config, so cache it. + if (!usedDeprecatedRulesCache.has(config)) { + const pluginRules = configArray.pluginRules; + const retv = []; + + for (const [ruleId, ruleConf] of Object.entries(config.rules)) { + if (getRuleSeverity(ruleConf) === 0) { + continue; + } + const rule = pluginRules.get(ruleId) || BuiltinRules.get(ruleId); + const meta = rule && rule.meta; + + if (meta && meta.deprecated) { + retv.push({ ruleId, replacedBy: meta.replacedBy || [] }); + } + } + + usedDeprecatedRulesCache.set(config, Object.freeze(retv)); + } + + return usedDeprecatedRulesCache.get(config); +} + +/** + * Processes the linting results generated by a CLIEngine linting report to + * match the ESLint class's API. + * @param {CLIEngine} cliEngine The CLIEngine instance. + * @param {CLIEngineLintReport} report The CLIEngine linting report to process. + * @returns {LintResult[]} The processed linting results. + */ +function processCLIEngineLintReport(cliEngine, { results }) { + const descriptor = { + configurable: true, + enumerable: true, + get() { + return getOrFindUsedDeprecatedRules(cliEngine, this.filePath); + } + }; + + for (const result of results) { + Object.defineProperty(result, "usedDeprecatedRules", descriptor); + } + + return results; +} + +/** + * An Array.prototype.sort() compatible compare function to order results by their file path. + * @param {LintResult} a The first lint result. + * @param {LintResult} b The second lint result. + * @returns {number} An integer representing the order in which the two results should occur. + */ +function compareResultsByFilePath(a, b) { + if (a.filePath < b.filePath) { + return -1; + } + + if (a.filePath > b.filePath) { + return 1; + } + + return 0; +} + +class ESLint { + + /** + * Creates a new instance of the main ESLint API. + * @param {ESLintOptions} options The options for this instance. + */ + constructor(options = {}) { + const processedOptions = processOptions(options); + const cliEngine = new CLIEngine(processedOptions); + const { + additionalPluginPool, + configArrayFactory, + lastConfigArrays + } = getCLIEngineInternalSlots(cliEngine); + let updated = false; + + /* + * Address `plugins` to add plugin implementations. 
+ * Operate the `additionalPluginPool` internal slot directly to avoid + * using `addPlugin(id, plugin)` method that resets cache everytime. + */ + if (options.plugins) { + for (const [id, plugin] of Object.entries(options.plugins)) { + additionalPluginPool.set(id, plugin); + updated = true; + } + } + + /* + * Address `overrideConfig` to set override config. + * Operate the `configArrayFactory` internal slot directly because this + * functionality doesn't exist as the public API of CLIEngine. + */ + if (hasDefinedProperty(options.overrideConfig)) { + configArrayFactory.setOverrideConfig(options.overrideConfig); + updated = true; + } + + // Update caches. + if (updated) { + configArrayFactory.clearCache(); + lastConfigArrays[0] = configArrayFactory.getConfigArrayForFile(); + } + + // Initialize private properties. + privateMembersMap.set(this, { + cliEngine, + options: processedOptions + }); + } + + /** + * The version text. + * @type {string} + */ + static get version() { + return version; + } + + /** + * Outputs fixes from the given results to files. + * @param {LintResult[]} results The lint results. + * @returns {Promise} Returns a promise that is used to track side effects. + */ + static async outputFixes(results) { + if (!Array.isArray(results)) { + throw new Error("'results' must be an array"); + } + + await Promise.all( + results + .filter(result => { + if (typeof result !== "object" || result === null) { + throw new Error("'results' must include only objects"); + } + return ( + typeof result.output === "string" && + path.isAbsolute(result.filePath) + ); + }) + .map(r => writeFile(r.filePath, r.output)) + ); + } + + /** + * Returns results that only contains errors. + * @param {LintResult[]} results The results to filter. + * @returns {LintResult[]} The filtered results. + */ + static getErrorResults(results) { + return CLIEngine.getErrorResults(results); + } + + /** + * Executes the current configuration on an array of file and directory names. + * @param {string[]} patterns An array of file and directory names. + * @returns {Promise} The results of linting the file patterns given. + */ + async lintFiles(patterns) { + if (!isNonEmptyString(patterns) && !isArrayOfNonEmptyString(patterns)) { + throw new Error("'patterns' must be a non-empty string or an array of non-empty strings"); + } + const { cliEngine } = privateMembersMap.get(this); + + return processCLIEngineLintReport( + cliEngine, + cliEngine.executeOnFiles(patterns) + ); + } + + /** + * Executes the current configuration on text. + * @param {string} code A string of JavaScript code to lint. + * @param {Object} [options] The options. + * @param {string} [options.filePath] The path to the file of the source code. + * @param {boolean} [options.warnIgnored] When set to true, warn if given filePath is an ignored path. + * @returns {Promise} The results of linting the string of code given. 
+ */ + async lintText(code, options = {}) { + if (typeof code !== "string") { + throw new Error("'code' must be a string"); + } + if (typeof options !== "object") { + throw new Error("'options' must be an object, null, or undefined"); + } + const { + filePath, + warnIgnored = false, + ...unknownOptions + } = options || {}; + + for (const key of Object.keys(unknownOptions)) { + throw new Error(`'options' must not include the unknown option '${key}'`); + } + if (filePath !== void 0 && !isNonEmptyString(filePath)) { + throw new Error("'options.filePath' must be a non-empty string or undefined"); + } + if (typeof warnIgnored !== "boolean") { + throw new Error("'options.warnIgnored' must be a boolean or undefined"); + } + + const { cliEngine } = privateMembersMap.get(this); + + return processCLIEngineLintReport( + cliEngine, + cliEngine.executeOnText(code, filePath, warnIgnored) + ); + } + + /** + * Returns the formatter representing the given formatter name. + * @param {string} [name] The name of the formattter to load. + * The following values are allowed: + * - `undefined` ... Load `stylish` builtin formatter. + * - A builtin formatter name ... Load the builtin formatter. + * - A thirdparty formatter name: + * - `foo` → `eslint-formatter-foo` + * - `@foo` → `@foo/eslint-formatter` + * - `@foo/bar` → `@foo/eslint-formatter-bar` + * - A file path ... Load the file. + * @returns {Promise} A promise resolving to the formatter object. + * This promise will be rejected if the given formatter was not found or not + * a function. + */ + async loadFormatter(name = "stylish") { + if (typeof name !== "string") { + throw new Error("'name' must be a string"); + } + + const { cliEngine } = privateMembersMap.get(this); + const formatter = cliEngine.getFormatter(name); + + if (typeof formatter !== "function") { + throw new Error(`Formatter must be a function, but got a ${typeof formatter}.`); + } + + return { + + /** + * The main formatter method. + * @param {LintResults[]} results The lint results to format. + * @returns {string} The formatted lint results. + */ + format(results) { + let rulesMeta = null; + + results.sort(compareResultsByFilePath); + + return formatter(results, { + get rulesMeta() { + if (!rulesMeta) { + rulesMeta = createRulesMeta(cliEngine.getRules()); + } + + return rulesMeta; + } + }); + } + }; + } + + /** + * Returns a configuration object for the given file based on the CLI options. + * This is the same logic used by the ESLint CLI executable to determine + * configuration for each file it processes. + * @param {string} filePath The path of the file to retrieve a config object for. + * @returns {Promise} A configuration object for the file. + */ + async calculateConfigForFile(filePath) { + if (!isNonEmptyString(filePath)) { + throw new Error("'filePath' must be a non-empty string"); + } + const { cliEngine } = privateMembersMap.get(this); + + return cliEngine.getConfigForFile(filePath); + } + + /** + * Checks if a given path is ignored by ESLint. + * @param {string} filePath The path of the file to check. + * @returns {Promise} Whether or not the given path is ignored. 
+ */ + async isPathIgnored(filePath) { + if (!isNonEmptyString(filePath)) { + throw new Error("'filePath' must be a non-empty string"); + } + const { cliEngine } = privateMembersMap.get(this); + + return cliEngine.isPathIgnored(filePath); + } +} + +//------------------------------------------------------------------------------ +// Public Interface +//------------------------------------------------------------------------------ + +module.exports = { + ESLint, + + /** + * Get the private class members of a given ESLint instance for tests. + * @param {ESLint} instance The ESLint instance to get. + * @returns {ESLintPrivateMembers} The instance's private class members. + */ + getESLintPrivateMembers(instance) { + return privateMembersMap.get(instance); + } +}; diff --git a/tools/node_modules/eslint/lib/eslint/index.js b/tools/node_modules/eslint/lib/eslint/index.js new file mode 100644 index 00000000000000..c9185ee0eba0a5 --- /dev/null +++ b/tools/node_modules/eslint/lib/eslint/index.js @@ -0,0 +1,7 @@ +"use strict"; + +const { ESLint } = require("./eslint"); + +module.exports = { + ESLint +}; diff --git a/tools/node_modules/eslint/lib/init/autoconfig.js b/tools/node_modules/eslint/lib/init/autoconfig.js index 64be3d2a84f49b..2b0aa12ac13df6 100644 --- a/tools/node_modules/eslint/lib/init/autoconfig.js +++ b/tools/node_modules/eslint/lib/init/autoconfig.js @@ -301,7 +301,7 @@ class Registry { ruleSetIdx += 1; if (cb) { - cb(totalFilesLinting); // eslint-disable-line callback-return + cb(totalFilesLinting); // eslint-disable-line node/callback-return } }); @@ -316,10 +316,10 @@ class Registry { /** * Extract rule configuration into eslint:recommended where possible. * - * This will return a new config with `"extends": "eslint:recommended"` and + * This will return a new config with `["extends": [ ..., "eslint:recommended"]` and * only the rules which have configurations different from the recommended config. 
* @param {Object} config config object - * @returns {Object} config object using `"extends": "eslint:recommended"` + * @returns {Object} config object using `"extends": ["eslint:recommended"]` */ function extendFromRecommended(config) { const newConfig = Object.assign({}, config); @@ -333,7 +333,7 @@ function extendFromRecommended(config) { delete newConfig.rules[ruleId]; } }); - newConfig.extends = RECOMMENDED_CONFIG_NAME; + newConfig.extends.unshift(RECOMMENDED_CONFIG_NAME); return newConfig; } diff --git a/tools/node_modules/eslint/lib/init/config-initializer.js b/tools/node_modules/eslint/lib/init/config-initializer.js index 28dfad194a7e6f..70f0a250ad1dd6 100644 --- a/tools/node_modules/eslint/lib/init/config-initializer.js +++ b/tools/node_modules/eslint/lib/init/config-initializer.js @@ -15,6 +15,7 @@ const util = require("util"), inquirer = require("inquirer"), ProgressBar = require("progress"), semver = require("semver"), + espree = require("espree"), recConfig = require("../../conf/eslint-recommended"), ConfigOps = require("../shared/config-ops"), log = require("../shared/logging"), @@ -31,8 +32,6 @@ const debug = require("debug")("eslint:config-initializer"); // Private //------------------------------------------------------------------------------ -const DEFAULT_ECMA_VERSION = 2018; - /* istanbul ignore next: hard to test fs function */ /** * Create .eslintrc file in the current working directory @@ -265,8 +264,7 @@ function processAnswers(answers) { extends: [] }; - // set the latest ECMAScript version - config.parserOptions.ecmaVersion = DEFAULT_ECMA_VERSION; + config.parserOptions.ecmaVersion = espree.latestEcmaVersion; config.env.es6 = true; config.globals = { Atomics: "readonly", diff --git a/tools/node_modules/eslint/lib/init/source-code-utils.js b/tools/node_modules/eslint/lib/init/source-code-utils.js index dfc170a65cf71b..dca6541d1ed328 100644 --- a/tools/node_modules/eslint/lib/init/source-code-utils.js +++ b/tools/node_modules/eslint/lib/init/source-code-utils.js @@ -23,7 +23,7 @@ const { CLIEngine } = require("../cli-engine"); * TODO1: Expose the API that enumerates target files. * TODO2: Extract the creation logic of `SourceCode` from `Linter` class. 
*/ -const { getCLIEngineInternalSlots } = require("../cli-engine/cli-engine"); // eslint-disable-line no-restricted-modules +const { getCLIEngineInternalSlots } = require("../cli-engine/cli-engine"); // eslint-disable-line node/no-restricted-require const debug = require("debug")("eslint:source-code-utils"); @@ -97,7 +97,7 @@ function getSourceCodeOfFiles(patterns, options, callback) { sourceCodes[filename] = sourceCode; } if (callback) { - callback(filenames.length); // eslint-disable-line callback-return + callback(filenames.length); // eslint-disable-line node/callback-return } }); diff --git a/tools/node_modules/eslint/lib/options.js b/tools/node_modules/eslint/lib/options.js index 98dc04b6eb3968..1681f1dbd1d733 100644 --- a/tools/node_modules/eslint/lib/options.js +++ b/tools/node_modules/eslint/lib/options.js @@ -46,7 +46,6 @@ module.exports = optionator({ { option: "ext", type: "[String]", - default: ".js", description: "Specify JavaScript file extensions" }, { diff --git a/tools/node_modules/eslint/lib/rule-tester/rule-tester.js b/tools/node_modules/eslint/lib/rule-tester/rule-tester.js index 1c1737152c1b19..77df1def893ccc 100644 --- a/tools/node_modules/eslint/lib/rule-tester/rule-tester.js +++ b/tools/node_modules/eslint/lib/rule-tester/rule-tester.js @@ -563,7 +563,12 @@ class RuleTester { output = SourceCodeFixer.applyFixes(code, messages).output; const errorMessageInFix = linter.verify(output, config, filename).find(m => m.fatal); - assert(!errorMessageInFix, `A fatal parsing error occurred in autofix: ${errorMessageInFix && errorMessageInFix.message}`); + assert(!errorMessageInFix, [ + "A fatal parsing error occurred in autofix.", + `Error: ${errorMessageInFix && errorMessageInFix.message}`, + "Autofix output:", + output + ].join("\n")); } else { output = code; } diff --git a/tools/node_modules/eslint/lib/rules/array-callback-return.js b/tools/node_modules/eslint/lib/rules/array-callback-return.js index eb38965024f05d..62ba7b72d87257 100644 --- a/tools/node_modules/eslint/lib/rules/array-callback-return.js +++ b/tools/node_modules/eslint/lib/rules/array-callback-return.js @@ -29,22 +29,6 @@ function isReachable(segment) { return segment.reachable; } -/** - * Gets a readable location. - * - * - FunctionExpression -> the function name or `function` keyword. - * - ArrowFunctionExpression -> `=>` token. - * @param {ASTNode} node A function node to get. - * @param {SourceCode} sourceCode A source code to get tokens. - * @returns {ASTNode|Token} The node or the token of a location. - */ -function getLocation(node, sourceCode) { - if (node.type === "ArrowFunctionExpression") { - return sourceCode.getTokenBefore(node.body); - } - return node.id || node; -} - /** * Checks a given node is a MemberExpression node which has the specified name's * property. @@ -179,6 +163,7 @@ module.exports = { create(context) { const options = context.options[0] || { allowImplicit: false, checkForEach: false }; + const sourceCode = context.getSourceCode(); let funcInfo = { arrayMethodName: null, @@ -217,12 +202,12 @@ module.exports = { } if (messageId) { - let name = astUtils.getFunctionNameWithKind(funcInfo.node); + let name = astUtils.getFunctionNameWithKind(node); name = messageId === "expectedNoReturnValue" ? 
lodash.upperFirst(name) : name; context.report({ node, - loc: getLocation(node, context.getSourceCode()).loc.start, + loc: astUtils.getFunctionHeadLoc(node, sourceCode), messageId, data: { name } }); diff --git a/tools/node_modules/eslint/lib/rules/callback-return.js b/tools/node_modules/eslint/lib/rules/callback-return.js index c5263cde46b752..5df792d436341e 100644 --- a/tools/node_modules/eslint/lib/rules/callback-return.js +++ b/tools/node_modules/eslint/lib/rules/callback-return.js @@ -10,6 +10,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/callback-return"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/comma-style.js b/tools/node_modules/eslint/lib/rules/comma-style.js index bc22f05dd3892c..f1a23d63b786a0 100644 --- a/tools/node_modules/eslint/lib/rules/comma-style.js +++ b/tools/node_modules/eslint/lib/rules/comma-style.js @@ -146,10 +146,7 @@ module.exports = { // lone comma context.report({ node: reportItem, - loc: { - line: commaToken.loc.end.line, - column: commaToken.loc.start.column - }, + loc: commaToken.loc, messageId: "unexpectedLineBeforeAndAfterComma", fix: getFixerFunction(styleType, previousItemToken, commaToken, currentItemToken) }); @@ -158,6 +155,7 @@ module.exports = { context.report({ node: reportItem, + loc: commaToken.loc, messageId: "expectedCommaFirst", fix: getFixerFunction(style, previousItemToken, commaToken, currentItemToken) }); @@ -166,10 +164,7 @@ module.exports = { context.report({ node: reportItem, - loc: { - line: commaToken.loc.end.line, - column: commaToken.loc.end.column - }, + loc: commaToken.loc, messageId: "expectedCommaLast", fix: getFixerFunction(style, previousItemToken, commaToken, currentItemToken) }); diff --git a/tools/node_modules/eslint/lib/rules/func-call-spacing.js b/tools/node_modules/eslint/lib/rules/func-call-spacing.js index e2edd4282da578..dccdd0a40c6d76 100644 --- a/tools/node_modules/eslint/lib/rules/func-call-spacing.js +++ b/tools/node_modules/eslint/lib/rules/func-call-spacing.js @@ -63,7 +63,8 @@ module.exports = { }, messages: { - unexpected: "Unexpected newline between function name and paren.", + unexpectedWhitespace: "Unexpected whitespace between function name and paren.", + unexpectedNewline: "Unexpected newline between function name and paren.", missing: "Missing space between function name and paren." } }, @@ -116,7 +117,7 @@ module.exports = { context.report({ node, loc: leftToken.loc.start, - messageId: "unexpected", + messageId: "unexpectedWhitespace", fix(fixer) { /* @@ -143,7 +144,7 @@ module.exports = { context.report({ node, loc: leftToken.loc.start, - messageId: "unexpected", + messageId: "unexpectedNewline", fix(fixer) { return fixer.replaceTextRange([leftToken.range[1], rightToken.range[0]], " "); } diff --git a/tools/node_modules/eslint/lib/rules/getter-return.js b/tools/node_modules/eslint/lib/rules/getter-return.js index e1468a5b19f88d..c54ebfb4ffb8cd 100644 --- a/tools/node_modules/eslint/lib/rules/getter-return.js +++ b/tools/node_modules/eslint/lib/rules/getter-return.js @@ -25,17 +25,6 @@ function isReachable(segment) { return segment.reachable; } -/** - * Gets a readable location. - * - * - FunctionExpression -> the function name or `function` keyword. - * @param {ASTNode} node A function node to get. - * @returns {ASTNode|Token} The node or the token of a location. 
- */ -function getId(node) { - return node.id || node; -} - //------------------------------------------------------------------------------ // Rule Definition //------------------------------------------------------------------------------ @@ -75,6 +64,7 @@ module.exports = { create(context) { const options = context.options[0] || { allowImplicit: false }; + const sourceCode = context.getSourceCode(); let funcInfo = { upper: null, @@ -99,7 +89,7 @@ module.exports = { ) { context.report({ node, - loc: getId(node).loc.start, + loc: astUtils.getFunctionHeadLoc(node, sourceCode), messageId: funcInfo.hasReturn ? "expectedAlways" : "expected", data: { name: astUtils.getFunctionNameWithKind(funcInfo.node) diff --git a/tools/node_modules/eslint/lib/rules/global-require.js b/tools/node_modules/eslint/lib/rules/global-require.js index 4af3a6a4669a79..9bd073b88546d6 100644 --- a/tools/node_modules/eslint/lib/rules/global-require.js +++ b/tools/node_modules/eslint/lib/rules/global-require.js @@ -48,6 +48,10 @@ function isShadowed(scope, node) { module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/global-require"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/handle-callback-err.js b/tools/node_modules/eslint/lib/rules/handle-callback-err.js index 640946699e7bea..8ad63bbd53a77b 100644 --- a/tools/node_modules/eslint/lib/rules/handle-callback-err.js +++ b/tools/node_modules/eslint/lib/rules/handle-callback-err.js @@ -11,6 +11,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/handle-callback-err"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/key-spacing.js b/tools/node_modules/eslint/lib/rules/key-spacing.js index c405043794c7d1..57abb00b06e8a4 100644 --- a/tools/node_modules/eslint/lib/rules/key-spacing.js +++ b/tools/node_modules/eslint/lib/rules/key-spacing.js @@ -45,7 +45,7 @@ function isSingleLine(node) { /** * Checks whether the properties on a single line. * @param {ASTNode[]} properties List of Property AST nodes. - * @returns {boolean} True if all properies is on a single line. + * @returns {boolean} True if all properties is on a single line. 
*/ function isSingleLineProperties(properties) { const [firstProp] = properties, diff --git a/tools/node_modules/eslint/lib/rules/new-cap.js b/tools/node_modules/eslint/lib/rules/new-cap.js index 7cce968c5aed0e..0faf45efb92daf 100644 --- a/tools/node_modules/eslint/lib/rules/new-cap.js +++ b/tools/node_modules/eslint/lib/rules/new-cap.js @@ -235,7 +235,7 @@ module.exports = { callee = callee.property; } - context.report({ node, loc: callee.loc.start, messageId }); + context.report({ node, loc: callee.loc, messageId }); } //-------------------------------------------------------------------------- diff --git a/tools/node_modules/eslint/lib/rules/newline-per-chained-call.js b/tools/node_modules/eslint/lib/rules/newline-per-chained-call.js index 8ad88386c0f61d..4254fec185ef88 100644 --- a/tools/node_modules/eslint/lib/rules/newline-per-chained-call.js +++ b/tools/node_modules/eslint/lib/rules/newline-per-chained-call.js @@ -90,16 +90,19 @@ module.exports = { } if (depth > ignoreChainWithDepth && astUtils.isTokenOnSameLine(callee.object, callee.property)) { + const firstTokenAfterObject = sourceCode.getTokenAfter(callee.object, astUtils.isNotClosingParenToken); + context.report({ node: callee.property, - loc: callee.property.loc.start, + loc: { + start: firstTokenAfterObject.loc.start, + end: callee.loc.end + }, messageId: "expected", data: { callee: getPropertyText(callee) }, fix(fixer) { - const firstTokenAfterObject = sourceCode.getTokenAfter(callee.object, astUtils.isNotClosingParenToken); - return fixer.insertTextBefore(firstTokenAfterObject, "\n"); } }); diff --git a/tools/node_modules/eslint/lib/rules/no-buffer-constructor.js b/tools/node_modules/eslint/lib/rules/no-buffer-constructor.js index bf4c8891ad1adf..5dce047b92312e 100644 --- a/tools/node_modules/eslint/lib/rules/no-buffer-constructor.js +++ b/tools/node_modules/eslint/lib/rules/no-buffer-constructor.js @@ -10,6 +10,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-deprecated-api"], + type: "problem", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-empty-function.js b/tools/node_modules/eslint/lib/rules/no-empty-function.js index c74321158b3464..c512f8cd5f4500 100644 --- a/tools/node_modules/eslint/lib/rules/no-empty-function.js +++ b/tools/node_modules/eslint/lib/rules/no-empty-function.js @@ -151,7 +151,7 @@ module.exports = { ) { context.report({ node, - loc: node.body.loc.start, + loc: node.body.loc, messageId: "unexpected", data: { name } }); diff --git a/tools/node_modules/eslint/lib/rules/no-extra-parens.js b/tools/node_modules/eslint/lib/rules/no-extra-parens.js index a3dd5bab699da3..7cbb7522ebedfa 100644 --- a/tools/node_modules/eslint/lib/rules/no-extra-parens.js +++ b/tools/node_modules/eslint/lib/rules/no-extra-parens.js @@ -560,7 +560,11 @@ module.exports = { tokensToIgnore.add(secondToken); } - if (hasExcessParens(node)) { + const hasExtraParens = node.parent.type === "ExportDefaultDeclaration" + ? 
hasExcessParensWithPrecedence(node, PRECEDENCE_OF_ASSIGNMENT_EXPR) + : hasExcessParens(node); + + if (hasExtraParens) { report(node); } } diff --git a/tools/node_modules/eslint/lib/rules/no-inner-declarations.js b/tools/node_modules/eslint/lib/rules/no-inner-declarations.js index e1c29e0a3b4f3f..0768bc61149cec 100644 --- a/tools/node_modules/eslint/lib/rules/no-inner-declarations.js +++ b/tools/node_modules/eslint/lib/rules/no-inner-declarations.js @@ -5,10 +5,19 @@ "use strict"; +//------------------------------------------------------------------------------ +// Requirements +//------------------------------------------------------------------------------ + +const astUtils = require("./utils/ast-utils"); + //------------------------------------------------------------------------------ // Rule Definition //------------------------------------------------------------------------------ +const validParent = new Set(["Program", "ExportNamedDeclaration", "ExportDefaultDeclaration"]); +const validBlockStatementParent = new Set(["FunctionDeclaration", "FunctionExpression", "ArrowFunctionExpression"]); + module.exports = { meta: { type: "problem", @@ -33,54 +42,37 @@ module.exports = { create(context) { - /** - * Find the nearest Program or Function ancestor node. - * @returns {Object} Ancestor's type and distance from node. - */ - function nearestBody() { - const ancestors = context.getAncestors(); - let ancestor = ancestors.pop(), - generation = 1; - - while (ancestor && ["Program", "FunctionDeclaration", - "FunctionExpression", "ArrowFunctionExpression" - ].indexOf(ancestor.type) < 0) { - generation += 1; - ancestor = ancestors.pop(); - } - - return { - - // Type of containing ancestor - type: ancestor.type, - - // Separation between ancestor and node - distance: generation - }; - } - /** * Ensure that a given node is at a program or function body's root. * @param {ASTNode} node Declaration node to check. * @returns {void} */ function check(node) { - const body = nearestBody(), - valid = ((body.type === "Program" && body.distance === 1) || - body.distance === 2); - - if (!valid) { - context.report({ - node, - messageId: "moveDeclToRoot", - data: { - type: (node.type === "FunctionDeclaration" ? "function" : "variable"), - body: (body.type === "Program" ? "program" : "function body") - } - }); + const parent = node.parent; + + if ( + parent.type === "BlockStatement" && validBlockStatementParent.has(parent.parent.type) + ) { + return; + } + + if (validParent.has(parent.type)) { + return; } + + const upperFunction = astUtils.getUpperFunction(parent); + + context.report({ + node, + messageId: "moveDeclToRoot", + data: { + type: (node.type === "FunctionDeclaration" ? "function" : "variable"), + body: (upperFunction === null ? "program" : "function body") + } + }); } + return { FunctionDeclaration: check, diff --git a/tools/node_modules/eslint/lib/rules/no-lone-blocks.js b/tools/node_modules/eslint/lib/rules/no-lone-blocks.js index d7069887b8e460..290784b82ea2fb 100644 --- a/tools/node_modules/eslint/lib/rules/no-lone-blocks.js +++ b/tools/node_modules/eslint/lib/rules/no-lone-blocks.js @@ -49,7 +49,7 @@ module.exports = { } /** - * Checks for any ocurrence of a BlockStatement in a place where lists of statements can appear + * Checks for any occurrence of a BlockStatement in a place where lists of statements can appear * @param {ASTNode} node The node to check * @returns {boolean} True if the node is a lone block. 
*/ diff --git a/tools/node_modules/eslint/lib/rules/no-mixed-requires.js b/tools/node_modules/eslint/lib/rules/no-mixed-requires.js index 8e988e32c24f84..bfe9b7aa97858a 100644 --- a/tools/node_modules/eslint/lib/rules/no-mixed-requires.js +++ b/tools/node_modules/eslint/lib/rules/no-mixed-requires.js @@ -11,6 +11,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-mixed-requires"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-new-object.js b/tools/node_modules/eslint/lib/rules/no-new-object.js index f3e99c9bd13502..e9f915db5eaa91 100644 --- a/tools/node_modules/eslint/lib/rules/no-new-object.js +++ b/tools/node_modules/eslint/lib/rules/no-new-object.js @@ -5,6 +5,12 @@ "use strict"; +//------------------------------------------------------------------------------ +// Requirements +//------------------------------------------------------------------------------ + +const astUtils = require("./utils/ast-utils"); + //------------------------------------------------------------------------------ // Rule Definition //------------------------------------------------------------------------------ @@ -28,10 +34,17 @@ module.exports = { }, create(context) { - return { - NewExpression(node) { + const variable = astUtils.getVariableByName( + context.getScope(), + node.callee.name + ); + + if (variable && variable.identifiers.length > 0) { + return; + } + if (node.callee.name === "Object") { context.report({ node, @@ -40,6 +53,5 @@ module.exports = { } } }; - } }; diff --git a/tools/node_modules/eslint/lib/rules/no-new-require.js b/tools/node_modules/eslint/lib/rules/no-new-require.js index df12a424e3527e..7f81e83fd782c3 100644 --- a/tools/node_modules/eslint/lib/rules/no-new-require.js +++ b/tools/node_modules/eslint/lib/rules/no-new-require.js @@ -11,6 +11,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-new-require"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-path-concat.js b/tools/node_modules/eslint/lib/rules/no-path-concat.js index 9fa8b852fe8358..77a03a7f952b04 100644 --- a/tools/node_modules/eslint/lib/rules/no-path-concat.js +++ b/tools/node_modules/eslint/lib/rules/no-path-concat.js @@ -10,6 +10,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-path-concat"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-process-env.js b/tools/node_modules/eslint/lib/rules/no-process-env.js index 0f8d7f8a339d0e..24bb9f9971d5c9 100644 --- a/tools/node_modules/eslint/lib/rules/no-process-env.js +++ b/tools/node_modules/eslint/lib/rules/no-process-env.js @@ -10,6 +10,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-process-env"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-process-exit.js b/tools/node_modules/eslint/lib/rules/no-process-exit.js index 29871660cc6ee1..9c70ea8808b4c4 100644 --- a/tools/node_modules/eslint/lib/rules/no-process-exit.js +++ b/tools/node_modules/eslint/lib/rules/no-process-exit.js @@ -10,6 +10,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-process-exit"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-restricted-modules.js b/tools/node_modules/eslint/lib/rules/no-restricted-modules.js index abd8d5cbe29381..61834ceeb444d3 100644 --- a/tools/node_modules/eslint/lib/rules/no-restricted-modules.js +++ 
b/tools/node_modules/eslint/lib/rules/no-restricted-modules.js @@ -40,6 +40,10 @@ const arrayOfStringsOrObjects = { module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-restricted-require"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-sync.js b/tools/node_modules/eslint/lib/rules/no-sync.js index d8111059631734..9790d1f94bb20c 100644 --- a/tools/node_modules/eslint/lib/rules/no-sync.js +++ b/tools/node_modules/eslint/lib/rules/no-sync.js @@ -13,6 +13,10 @@ module.exports = { meta: { + deprecated: true, + + replacedBy: ["node/no-sync"], + type: "suggestion", docs: { diff --git a/tools/node_modules/eslint/lib/rules/no-unexpected-multiline.js b/tools/node_modules/eslint/lib/rules/no-unexpected-multiline.js index eb72008a2947e7..b5ec20de4b20a3 100644 --- a/tools/node_modules/eslint/lib/rules/no-unexpected-multiline.js +++ b/tools/node_modules/eslint/lib/rules/no-unexpected-multiline.js @@ -53,7 +53,11 @@ module.exports = { const nodeExpressionEnd = sourceCode.getTokenBefore(openParen); if (openParen.loc.start.line !== nodeExpressionEnd.loc.end.line) { - context.report({ node, loc: openParen.loc.start, messageId, data: { char: openParen.value } }); + context.report({ + node, + loc: openParen.loc, + messageId + }); } } @@ -71,18 +75,24 @@ module.exports = { }, TaggedTemplateExpression(node) { - if (node.tag.loc.end.line === node.quasi.loc.start.line) { - return; - } - - // handle generics type parameters on template tags - const tokenBefore = sourceCode.getTokenBefore(node.quasi); - - if (tokenBefore.loc.end.line === node.quasi.loc.start.line) { - return; + const { quasi } = node; + + // handles common tags, parenthesized tags, and typescript's generic type arguments + const tokenBefore = sourceCode.getTokenBefore(quasi); + + if (tokenBefore.loc.end.line !== quasi.loc.start.line) { + context.report({ + node, + loc: { + start: quasi.loc.start, + end: { + line: quasi.loc.start.line, + column: quasi.loc.start.column + 1 + } + }, + messageId: "taggedTemplate" + }); } - - context.report({ node, loc: node.loc.start, messageId: "taggedTemplate" }); }, CallExpression(node) { diff --git a/tools/node_modules/eslint/lib/rules/no-useless-concat.js b/tools/node_modules/eslint/lib/rules/no-useless-concat.js index aa46742abdd5ca..cfc60c8fb51a24 100644 --- a/tools/node_modules/eslint/lib/rules/no-useless-concat.js +++ b/tools/node_modules/eslint/lib/rules/no-useless-concat.js @@ -105,7 +105,7 @@ module.exports = { context.report({ node, - loc: operatorToken.loc.start, + loc: operatorToken.loc, messageId: "unexpectedConcat" }); } diff --git a/tools/node_modules/eslint/lib/rules/space-before-function-paren.js b/tools/node_modules/eslint/lib/rules/space-before-function-paren.js index af609c2e7c72fa..1021a110cfd3f0 100644 --- a/tools/node_modules/eslint/lib/rules/space-before-function-paren.js +++ b/tools/node_modules/eslint/lib/rules/space-before-function-paren.js @@ -127,7 +127,10 @@ module.exports = { if (hasSpacing && functionConfig === "never") { context.report({ node, - loc: leftToken.loc.end, + loc: { + start: leftToken.loc.end, + end: rightToken.loc.start + }, messageId: "unexpectedSpace", fix(fixer) { const comments = sourceCode.getCommentsBefore(rightToken); @@ -145,7 +148,7 @@ module.exports = { } else if (!hasSpacing && functionConfig === "always") { context.report({ node, - loc: leftToken.loc.end, + loc: rightToken.loc, messageId: "missingSpace", fix: fixer => fixer.insertTextAfter(leftToken, " ") }); diff --git 
a/tools/node_modules/eslint/lib/rules/yoda.js b/tools/node_modules/eslint/lib/rules/yoda.js index c4ff3f81938595..f1159e5255df79 100644 --- a/tools/node_modules/eslint/lib/rules/yoda.js +++ b/tools/node_modules/eslint/lib/rules/yoda.js @@ -20,7 +20,7 @@ const astUtils = require("./utils/ast-utils"); * @returns {boolean} Whether or not it is a comparison operator. */ function isComparisonOperator(operator) { - return (/^(==|===|!=|!==|<|>|<=|>=)$/u).test(operator); + return /^(==|===|!=|!==|<|>|<=|>=)$/u.test(operator); } /** @@ -29,7 +29,7 @@ function isComparisonOperator(operator) { * @returns {boolean} Whether or not it is an equality operator. */ function isEqualityOperator(operator) { - return (/^(==|===)$/u).test(operator); + return /^(==|===)$/u.test(operator); } /** @@ -50,10 +50,12 @@ function isRangeTestOperator(operator) { * real literal and should be treated as such. */ function isNegativeNumericLiteral(node) { - return (node.type === "UnaryExpression" && + return ( + node.type === "UnaryExpression" && node.operator === "-" && node.prefix && - astUtils.isNumericLiteral(node.argument)); + astUtils.isNumericLiteral(node.argument) + ); } /** @@ -71,25 +73,21 @@ function isStaticTemplateLiteral(node) { * @returns {boolean} True if the node should be treated as a single Literal node. */ function looksLikeLiteral(node) { - return isNegativeNumericLiteral(node) || - isStaticTemplateLiteral(node); + return isNegativeNumericLiteral(node) || isStaticTemplateLiteral(node); } /** * Attempts to derive a Literal node from nodes that are treated like literals. * @param {ASTNode} node Node to normalize. - * @param {number} [defaultValue] The default value to be returned if the node - * is not a Literal. * @returns {ASTNode} One of the following options. * 1. The original node if the node is already a Literal * 2. A normalized Literal node with the negative number as the value if the * node represents a negative number literal. * 3. A normalized Literal node with the string as the value if the node is * a Template Literal without expression. - * 4. The Literal node which has the `defaultValue` argument if it exists. - * 5. Otherwise `null`. + * 4. Otherwise `null`. */ -function getNormalizedLiteral(node, defaultValue) { +function getNormalizedLiteral(node) { if (node.type === "Literal") { return node; } @@ -110,14 +108,6 @@ function getNormalizedLiteral(node, defaultValue) { }; } - if (defaultValue) { - return { - type: "Literal", - value: defaultValue, - raw: String(defaultValue) - }; - } - return null; } @@ -183,7 +173,7 @@ module.exports = { type: "suggestion", docs: { - description: "require or disallow \"Yoda\" conditions", + description: 'require or disallow "Yoda" conditions', category: "Best Practices", recommended: false, url: "https://eslint.org/docs/rules/yoda" @@ -211,16 +201,19 @@ module.exports = { fixable: "code", messages: { - expected: "Expected literal to be on the {{expectedSide}} side of {{operator}}." + expected: + "Expected literal to be on the {{expectedSide}} side of {{operator}}." 
} }, create(context) { // Default to "never" (!always) if no option - const always = (context.options[0] === "always"); - const exceptRange = (context.options[1] && context.options[1].exceptRange); - const onlyEquality = (context.options[1] && context.options[1].onlyEquality); + const always = context.options[0] === "always"; + const exceptRange = + context.options[1] && context.options[1].exceptRange; + const onlyEquality = + context.options[1] && context.options[1].onlyEquality; const sourceCode = context.getSourceCode(); @@ -243,13 +236,23 @@ module.exports = { * @returns {boolean} Whether node is a "between" range test. */ function isBetweenTest() { - let leftLiteral, rightLiteral; + if (node.operator === "&&" && same(left.right, right.left)) { + const leftLiteral = getNormalizedLiteral(left.left); + const rightLiteral = getNormalizedLiteral(right.right); + + if (leftLiteral === null && rightLiteral === null) { + return false; + } - return (node.operator === "&&" && - (leftLiteral = getNormalizedLiteral(left.left)) && - (rightLiteral = getNormalizedLiteral(right.right, Number.POSITIVE_INFINITY)) && - leftLiteral.value <= rightLiteral.value && - same(left.right, right.left)); + if (rightLiteral === null || leftLiteral === null) { + return true; + } + + if (leftLiteral.value <= rightLiteral.value) { + return true; + } + } + return false; } /** @@ -257,13 +260,24 @@ module.exports = { * @returns {boolean} Whether node is an "outside" range test. */ function isOutsideTest() { - let leftLiteral, rightLiteral; + if (node.operator === "||" && same(left.left, right.right)) { + const leftLiteral = getNormalizedLiteral(left.right); + const rightLiteral = getNormalizedLiteral(right.left); + + if (leftLiteral === null && rightLiteral === null) { + return false; + } + + if (rightLiteral === null || leftLiteral === null) { + return true; + } + + if (leftLiteral.value <= rightLiteral.value) { + return true; + } + } - return (node.operator === "||" && - (leftLiteral = getNormalizedLiteral(left.right, Number.NEGATIVE_INFINITY)) && - (rightLiteral = getNormalizedLiteral(right.left)) && - leftLiteral.value <= rightLiteral.value && - same(left.left, right.right)); + return false; } /** @@ -276,13 +290,15 @@ module.exports = { return astUtils.isParenthesised(sourceCode, node); } - return (node.type === "LogicalExpression" && + return ( + node.type === "LogicalExpression" && left.type === "BinaryExpression" && right.type === "BinaryExpression" && isRangeTestOperator(left.operator) && isRangeTestOperator(right.operator) && (isBetweenTest() || isOutsideTest()) && - isParenWrapped()); + isParenWrapped() + ); } const OPERATOR_FLIP_MAP = { @@ -303,21 +319,52 @@ module.exports = { */ function getFlippedString(node) { const tokenBefore = sourceCode.getTokenBefore(node); - const operatorToken = sourceCode.getFirstTokenBetween(node.left, node.right, token => token.value === node.operator); - const textBeforeOperator = sourceCode.getText().slice(sourceCode.getTokenBefore(operatorToken).range[1], operatorToken.range[0]); - const textAfterOperator = sourceCode.getText().slice(operatorToken.range[1], sourceCode.getTokenAfter(operatorToken).range[0]); - const leftText = sourceCode.getText().slice(node.range[0], sourceCode.getTokenBefore(operatorToken).range[1]); + const operatorToken = sourceCode.getFirstTokenBetween( + node.left, + node.right, + token => token.value === node.operator + ); + const textBeforeOperator = sourceCode + .getText() + .slice( + sourceCode.getTokenBefore(operatorToken).range[1], + 
operatorToken.range[0] + ); + const textAfterOperator = sourceCode + .getText() + .slice( + operatorToken.range[1], + sourceCode.getTokenAfter(operatorToken).range[0] + ); + const leftText = sourceCode + .getText() + .slice( + node.range[0], + sourceCode.getTokenBefore(operatorToken).range[1] + ); const firstRightToken = sourceCode.getTokenAfter(operatorToken); - const rightText = sourceCode.getText().slice(firstRightToken.range[0], node.range[1]); + const rightText = sourceCode + .getText() + .slice(firstRightToken.range[0], node.range[1]); let prefix = ""; - if (tokenBefore && tokenBefore.range[1] === node.range[0] && - !astUtils.canTokensBeAdjacent(tokenBefore, firstRightToken)) { + if ( + tokenBefore && + tokenBefore.range[1] === node.range[0] && + !astUtils.canTokensBeAdjacent(tokenBefore, firstRightToken) + ) { prefix = " "; } - return prefix + rightText + textBeforeOperator + OPERATOR_FLIP_MAP[operatorToken.value] + textAfterOperator + leftText; + return ( + prefix + + rightText + + textBeforeOperator + + OPERATOR_FLIP_MAP[operatorToken.value] + + textAfterOperator + + leftText + ); } //-------------------------------------------------------------------------- @@ -331,8 +378,12 @@ module.exports = { // If `expectedLiteral` is not a literal, and `expectedNonLiteral` is a literal, raise an error. if ( - (expectedNonLiteral.type === "Literal" || looksLikeLiteral(expectedNonLiteral)) && - !(expectedLiteral.type === "Literal" || looksLikeLiteral(expectedLiteral)) && + (expectedNonLiteral.type === "Literal" || + looksLikeLiteral(expectedNonLiteral)) && + !( + expectedLiteral.type === "Literal" || + looksLikeLiteral(expectedLiteral) + ) && !(!isEqualityOperator(node.operator) && onlyEquality) && isComparisonOperator(node.operator) && !(exceptRange && isRangeTest(context.getAncestors().pop())) @@ -344,12 +395,11 @@ module.exports = { operator: node.operator, expectedSide: always ? "left" : "right" }, - fix: fixer => fixer.replaceText(node, getFlippedString(node)) + fix: fixer => + fixer.replaceText(node, getFlippedString(node)) }); } - } }; - } }; diff --git a/tools/node_modules/eslint/lib/shared/relative-module-resolver.js b/tools/node_modules/eslint/lib/shared/relative-module-resolver.js index fa6cca72361df5..80335c5cfca7c8 100644 --- a/tools/node_modules/eslint/lib/shared/relative-module-resolver.js +++ b/tools/node_modules/eslint/lib/shared/relative-module-resolver.js @@ -11,6 +11,7 @@ const Module = require("module"); * `Module.createRequire` is added in v12.2.0. It supports URL as well. * We only support the case where the argument is a filepath, not a URL. */ +// eslint-disable-next-line node/no-unsupported-features/node-builtins, node/no-deprecated-api const createRequire = Module.createRequire || Module.createRequireFromPath; module.exports = { diff --git a/tools/node_modules/eslint/lib/shared/types.js b/tools/node_modules/eslint/lib/shared/types.js index bf37327fa240ca..bbd95d1b37862f 100644 --- a/tools/node_modules/eslint/lib/shared/types.js +++ b/tools/node_modules/eslint/lib/shared/types.js @@ -141,3 +141,10 @@ module.exports = {}; * @property {Record} [processors] The definition of plugin processors. * @property {Record} [rules] The definition of plugin rules. */ + +/** + * Information of deprecated rules. + * @typedef {Object} DeprecatedRuleInfo + * @property {string} ruleId The rule ID. + * @property {string[]} replacedBy The rule IDs that replace this deprecated rule. 
+ */ diff --git a/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/lib/identifier.js b/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/lib/identifier.js index 92043ce6630710..51ec76370ccfc6 100644 --- a/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/lib/identifier.js +++ b/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/lib/identifier.js @@ -73,5 +73,5 @@ function isIdentifierName(name) { } } - return true; + return !isFirst; } \ No newline at end of file diff --git a/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/package.json b/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/package.json index bdc69e6c6f8926..da8c4e12d40ff7 100644 --- a/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/package.json +++ b/tools/node_modules/eslint/node_modules/@babel/helper-validator-identifier/package.json @@ -7,7 +7,7 @@ "unicode-13.0.0": "^0.8.0" }, "exports": "./lib/index.js", - "gitHead": "8d5e422be27251cfaadf8dd2536b31b4a5024b02", + "gitHead": "5b97e77e030cf3853a147fdff81844ea4026219d", "license": "MIT", "main": "./lib/index.js", "name": "@babel/helper-validator-identifier", @@ -18,5 +18,5 @@ "type": "git", "url": "https://github.com/babel/babel/tree/master/packages/babel-helper-validator-identifier" }, - "version": "7.9.0" + "version": "7.9.5" } \ No newline at end of file diff --git a/tools/node_modules/eslint/node_modules/ajv/README.md b/tools/node_modules/eslint/node_modules/ajv/README.md index 9bd1f571872a26..e13fdec2939c92 100644 --- a/tools/node_modules/eslint/node_modules/ajv/README.md +++ b/tools/node_modules/eslint/node_modules/ajv/README.md @@ -8,8 +8,49 @@ The fastest JSON Schema validator for Node.js and browser. Supports draft-04/06/ [![npm](https://img.shields.io/npm/v/ajv.svg)](https://www.npmjs.com/package/ajv) [![npm downloads](https://img.shields.io/npm/dm/ajv.svg)](https://www.npmjs.com/package/ajv) [![Coverage Status](https://coveralls.io/repos/epoberezkin/ajv/badge.svg?branch=master&service=github)](https://coveralls.io/github/epoberezkin/ajv?branch=master) -[![Greenkeeper badge](https://badges.greenkeeper.io/epoberezkin/ajv.svg)](https://greenkeeper.io/) [![Gitter](https://img.shields.io/gitter/room/ajv-validator/ajv.svg)](https://gitter.im/ajv-validator/ajv) +[![GitHub Sponsors](https://img.shields.io/badge/$-sponsors-brightgreen)](https://github.com/sponsors/epoberezkin) + +## Please [sponsor Ajv](https://github.com/sponsors/epoberezkin) + +Dear Ajv users! ❤️ + +I ask you to support the development of Ajv with donations. 🙏 + +Since 2015 Ajv has become widely used, thanks to your help and contributions: + +- **90** contributors 🏗 +- **5,000** dependent npm packages ⚙️ +- **7,000** github stars, from GitHub users [all over the world](https://www.google.com/maps/d/u/0/viewer?mid=1MGRV8ciFUGIbO1l0EKFWNJGYE7iSkDxP&ll=-3.81666561775622e-14%2C4.821737100000007&z=2) ⭐️ +- **5,000,000** dependent repositories on GitHub 🚀 +- **120,000,000** npm downloads per month! 💯 + +Your donations will fund futher development - small and large improvements, support of the next versions of JSON Schema specification, and, possibly, the code should be migrated to TypeScript to make it more maintainable. 
+ +I will greatly appreciate anything you can help with to make it happen: + +- a **personal** donation - from $2 ☕️ +- your **company** donation - from $10 🍔 +- a **sponsorship** to get promoted on Ajv or related packages - from $50 💰 +- an **introduction** to a sponsor who would benefit from the promotion on Ajv page 🤝 + +| Please [make donations via my GitHub sponsors page](https://github.com/sponsors/epoberezkin)
‼️ **GitHub will DOUBLE them** ‼️ | +|---| + +#### Open Collective sponsors + + + + + + + + + + + + + ## Using version 6 @@ -273,7 +314,7 @@ The following formats are implemented for string validation with "format" keywor __Please note__: JSON Schema draft-07 also defines formats `iri`, `iri-reference`, `idn-hostname` and `idn-email` for URLs, hostnames and emails with international characters. Ajv does not implement these formats. If you create Ajv plugin that implements them please make a PR to mention this plugin here. -There are two modes of format validation: `fast` and `full`. This mode affects formats `date`, `time`, `date-time`, `uri`, `uri-reference`, `email`, and `hostname`. See [Options](#options) for details. +There are two modes of format validation: `fast` and `full`. This mode affects formats `date`, `time`, `date-time`, `uri`, `uri-reference`, and `email`. See [Options](#options) for details. You can add additional formats and replace any of the formats above using [addFormat](#api-addformat) method. @@ -1340,7 +1381,7 @@ If you have published a useful plugin please submit a PR to add it to the next s - [ajv-keywords](https://github.com/epoberezkin/ajv-keywords) - plugin with custom validation keywords (select, typeof, etc.) - [ajv-merge-patch](https://github.com/epoberezkin/ajv-merge-patch) - plugin with keywords $merge and $patch - [ajv-pack](https://github.com/epoberezkin/ajv-pack) - produces a compact module exporting validation functions - +- [ajv-formats-draft2019](https://github.com/luzlab/ajv-formats-draft2019) - format validators for draft2019 that aren't already included in ajv (ie. `idn-hostname`, `idn-email`, `iri`, `iri-reference` and `duration`). ## Some packages using Ajv diff --git a/tools/node_modules/eslint/node_modules/ajv/dist/ajv.min.js b/tools/node_modules/eslint/node_modules/ajv/dist/ajv.min.js index 157af21a92cda5..1c72a75ecfc321 100644 --- a/tools/node_modules/eslint/node_modules/ajv/dist/ajv.min.js +++ b/tools/node_modules/eslint/node_modules/ajv/dist/ajv.min.js @@ -1,3 +1,3 @@ -/* ajv 6.12.0: Another JSON Schema Validator */ -!function(e){if("object"==typeof exports&&"undefined"!=typeof module)module.exports=e();else if("function"==typeof define&&define.amd)define([],e);else{("undefined"!=typeof window?window:"undefined"!=typeof global?global:"undefined"!=typeof self?self:this).Ajv=e()}}(function(){return function o(i,n,l){function c(r,e){if(!n[r]){if(!i[r]){var t="function"==typeof require&&require;if(!e&&t)return t(r,!0);if(u)return u(r,!0);var a=new Error("Cannot find module '"+r+"'");throw a.code="MODULE_NOT_FOUND",a}var s=n[r]={exports:{}};i[r][0].call(s.exports,function(e){return c(i[r][1][e]||e)},s,s.exports,o,i,n,l)}return n[r].exports}for(var u="function"==typeof 
require&&require,e=0;e%\\^`{|}]|%[0-9a-f]{2})|\{[+#./;?&=,!@|]?(?:[a-z0-9_]|%[0-9a-f]{2})+(?::[1-9][0-9]{0,3}|\*)?(?:,(?:[a-z0-9_]|%[0-9a-f]{2})+(?::[1-9][0-9]{0,3}|\*)?)*\})*$/i,u=/^(?:(?:http[s\u017F]?|ftp):\/\/)(?:(?:[\0-\x08\x0E-\x1F!-\x9F\xA1-\u167F\u1681-\u1FFF\u200B-\u2027\u202A-\u202E\u2030-\u205E\u2060-\u2FFF\u3001-\uD7FF\uE000-\uFEFE\uFF00-\uFFFF]|[\uD800-\uDBFF][\uDC00-\uDFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+(?::(?:[\0-\x08\x0E-\x1F!-\x9F\xA1-\u167F\u1681-\u1FFF\u200B-\u2027\u202A-\u202E\u2030-\u205E\u2060-\u2FFF\u3001-\uD7FF\uE000-\uFEFE\uFF00-\uFFFF]|[\uD800-\uDBFF][\uDC00-\uDFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])*)?@)?(?:(?!10(?:\.[0-9]{1,3}){3})(?!127(?:\.[0-9]{1,3}){3})(?!169\.254(?:\.[0-9]{1,3}){2})(?!192\.168(?:\.[0-9]{1,3}){2})(?!172\.(?:1[6-9]|2[0-9]|3[01])(?:\.[0-9]{1,3}){2})(?:[1-9][0-9]?|1[0-9][0-9]|2[01][0-9]|22[0-3])(?:\.(?:1?[0-9]{1,2}|2[0-4][0-9]|25[0-5])){2}(?:\.(?:[1-9][0-9]?|1[0-9][0-9]|2[0-4][0-9]|25[0-4]))|(?:(?:(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+-?)*(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+)(?:\.(?:(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+-?)*(?:[0-9KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])+)*(?:\.(?:(?:[KSa-z\xA1-\uD7FF\uE000-\uFFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF]){2,})))(?::[0-9]{2,5})?(?:\/(?:[\0-\x08\x0E-\x1F!-\x9F\xA1-\u167F\u1681-\u1FFF\u200B-\u2027\u202A-\u202E\u2030-\u205E\u2060-\u2FFF\u3001-\uD7FF\uE000-\uFEFE\uFF00-\uFFFF]|[\uD800-\uDBFF][\uDC00-\uDFFF]|[\uD800-\uDBFF](?![\uDC00-\uDFFF])|(?:[^\uD800-\uDBFF]|^)[\uDC00-\uDFFF])*)?$/i,h=/^(?:urn:uuid:)?[0-9a-f]{8}-(?:[0-9a-f]{4}-){3}[0-9a-f]{12}$/i,d=/^(?:\/(?:[^~/]|~0|~1)*)*$/,f=/^#(?:\/(?:[a-z0-9_\-.!$&'()*+,;:=@]|%[0-9a-f]{2}|~0|~1)*)*$/i,p=/^(?:0|[1-9][0-9]*)(?:#|(?:\/(?:[^~/]|~0|~1)*)*)$/;function m(e){return a.copy(m[e="full"==e?"full":"fast"])}function v(e){var r=e.match(o);if(!r)return!1;var t,a=+r[2],s=+r[3];return 1<=a&&a<=12&&1<=s&&s<=(2!=a||((t=+r[1])%4!=0||t%100==0&&t%400!=0)?i[a]:29)}function y(e,r){var t=e.match(n);if(!t)return!1;var 
a=t[1],s=t[2],o=t[3];return(a<=23&&s<=59&&o<=59||23==a&&59==s&&60==o)&&(!r||t[5])}(r.exports=m).fast={date:/^\d\d\d\d-[0-1]\d-[0-3]\d$/,time:/^(?:[0-2]\d:[0-5]\d:[0-5]\d|23:59:60)(?:\.\d+)?(?:z|[+-]\d\d(?::?\d\d)?)?$/i,"date-time":/^\d\d\d\d-[0-1]\d-[0-3]\d[t\s](?:[0-2]\d:[0-5]\d:[0-5]\d|23:59:60)(?:\.\d+)?(?:z|[+-]\d\d(?::?\d\d)?)$/i,uri:/^(?:[a-z][a-z0-9+-.]*:)(?:\/?\/)?[^\s]*$/i,"uri-reference":/^(?:(?:[a-z][a-z0-9+-.]*:)?\/?\/)?(?:[^\\\s#][^\s#]*)?(?:#[^\\\s]*)?$/i,"uri-template":c,url:u,email:/^[a-z0-9.!#$%&'*+/=?^_`{|}~-]+@[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?(?:\.[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?)*$/i,hostname:s,ipv4:/^(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(?:25[0-5]|2[0-4]\d|[01]?\d\d?)$/,ipv6:/^\s*(?:(?:(?:[0-9a-f]{1,4}:){7}(?:[0-9a-f]{1,4}|:))|(?:(?:[0-9a-f]{1,4}:){6}(?::[0-9a-f]{1,4}|(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(?:(?:[0-9a-f]{1,4}:){5}(?:(?:(?::[0-9a-f]{1,4}){1,2})|:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(?:(?:[0-9a-f]{1,4}:){4}(?:(?:(?::[0-9a-f]{1,4}){1,3})|(?:(?::[0-9a-f]{1,4})?:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?:(?:[0-9a-f]{1,4}:){3}(?:(?:(?::[0-9a-f]{1,4}){1,4})|(?:(?::[0-9a-f]{1,4}){0,2}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?:(?:[0-9a-f]{1,4}:){2}(?:(?:(?::[0-9a-f]{1,4}){1,5})|(?:(?::[0-9a-f]{1,4}){0,3}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?:(?:[0-9a-f]{1,4}:){1}(?:(?:(?::[0-9a-f]{1,4}){1,6})|(?:(?::[0-9a-f]{1,4}){0,4}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?::(?:(?:(?::[0-9a-f]{1,4}){1,7})|(?:(?::[0-9a-f]{1,4}){0,5}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(?:%.+)?\s*$/i,regex:w,uuid:h,"json-pointer":d,"json-pointer-uri-fragment":f,"relative-json-pointer":p},m.full={date:v,time:y,"date-time":function(e){var r=e.split(g);return 2==r.length&&v(r[0])&&y(r[1],!0)},uri:function(e){return 
P.test(e)&&l.test(e)},"uri-reference":/^(?:[a-z][a-z0-9+\-.]*:)?(?:\/?\/(?:(?:[a-z0-9\-._~!$&'()*+,;=:]|%[0-9a-f]{2})*@)?(?:\[(?:(?:(?:(?:[0-9a-f]{1,4}:){6}|::(?:[0-9a-f]{1,4}:){5}|(?:[0-9a-f]{1,4})?::(?:[0-9a-f]{1,4}:){4}|(?:(?:[0-9a-f]{1,4}:){0,1}[0-9a-f]{1,4})?::(?:[0-9a-f]{1,4}:){3}|(?:(?:[0-9a-f]{1,4}:){0,2}[0-9a-f]{1,4})?::(?:[0-9a-f]{1,4}:){2}|(?:(?:[0-9a-f]{1,4}:){0,3}[0-9a-f]{1,4})?::[0-9a-f]{1,4}:|(?:(?:[0-9a-f]{1,4}:){0,4}[0-9a-f]{1,4})?::)(?:[0-9a-f]{1,4}:[0-9a-f]{1,4}|(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(?:25[0-5]|2[0-4]\d|[01]?\d\d?))|(?:(?:[0-9a-f]{1,4}:){0,5}[0-9a-f]{1,4})?::[0-9a-f]{1,4}|(?:(?:[0-9a-f]{1,4}:){0,6}[0-9a-f]{1,4})?::)|[Vv][0-9a-f]+\.[a-z0-9\-._~!$&'()*+,;=:]+)\]|(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(?:25[0-5]|2[0-4]\d|[01]?\d\d?)|(?:[a-z0-9\-._~!$&'"()*+,;=]|%[0-9a-f]{2})*)(?::\d*)?(?:\/(?:[a-z0-9\-._~!$&'"()*+,;=:@]|%[0-9a-f]{2})*)*|\/(?:(?:[a-z0-9\-._~!$&'"()*+,;=:@]|%[0-9a-f]{2})+(?:\/(?:[a-z0-9\-._~!$&'"()*+,;=:@]|%[0-9a-f]{2})*)*)?|(?:[a-z0-9\-._~!$&'"()*+,;=:@]|%[0-9a-f]{2})+(?:\/(?:[a-z0-9\-._~!$&'"()*+,;=:@]|%[0-9a-f]{2})*)*)?(?:\?(?:[a-z0-9\-._~!$&'"()*+,;=:@/?]|%[0-9a-f]{2})*)?(?:#(?:[a-z0-9\-._~!$&'"()*+,;=:@/?]|%[0-9a-f]{2})*)?$/i,"uri-template":c,url:u,email:/^[a-z0-9!#$%&'*+/=?^_`{|}~-]+(?:\.[a-z0-9!#$%&'*+/=?^_`{|}~-]+)*@(?:[a-z0-9](?:[a-z0-9-]*[a-z0-9])?\.)+[a-z0-9](?:[a-z0-9-]*[a-z0-9])?$/i,hostname:s,ipv4:/^(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(?:25[0-5]|2[0-4]\d|[01]?\d\d?)$/,ipv6:/^\s*(?:(?:(?:[0-9a-f]{1,4}:){7}(?:[0-9a-f]{1,4}|:))|(?:(?:[0-9a-f]{1,4}:){6}(?::[0-9a-f]{1,4}|(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(?:(?:[0-9a-f]{1,4}:){5}(?:(?:(?::[0-9a-f]{1,4}){1,2})|:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(?:(?:[0-9a-f]{1,4}:){4}(?:(?:(?::[0-9a-f]{1,4}){1,3})|(?:(?::[0-9a-f]{1,4})?:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?:(?:[0-9a-f]{1,4}:){3}(?:(?:(?::[0-9a-f]{1,4}){1,4})|(?:(?::[0-9a-f]{1,4}){0,2}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?:(?:[0-9a-f]{1,4}:){2}(?:(?:(?::[0-9a-f]{1,4}){1,5})|(?:(?::[0-9a-f]{1,4}){0,3}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?:(?:[0-9a-f]{1,4}:){1}(?:(?:(?::[0-9a-f]{1,4}){1,6})|(?:(?::[0-9a-f]{1,4}){0,4}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(?::(?:(?:(?::[0-9a-f]{1,4}){1,7})|(?:(?::[0-9a-f]{1,4}){0,5}:(?:(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(?:\.(?:25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(?:%.+)?\s*$/i,regex:w,uuid:h,"json-pointer":d,"json-pointer-uri-fragment":f,"relative-json-pointer":p};var g=/t|\s/i;var P=/\/|:/;var E=/[^\\]\\Z/;function w(e){if(E.test(e))return!1;try{return new RegExp(e),!0}catch(e){return!1}}},{"./util":10}],5:[function(e,r,t){"use strict";var j=e("./resolve"),O=e("./util"),I=e("./error_classes"),A=e("fast-json-stable-stringify"),C=e("../dotjs/validate"),k=O.ucs2length,L=e("fast-deep-equal"),z=I.Validation;function T(e,r,t){var a=s.call(this,e,r,t);return 0<=a?{index:a,compiling:!0}:{index:a=this._compilations.length,compiling:!(this._compilations[a]={schema:e,root:r,baseId:t})}}function q(e,r,t){var a=s.call(this,e,r,t);0<=a&&this._compilations.splice(a,1)}function s(e,r,t){for(var a=0;a",y=d?">":"<",g=void 0;if(m){var P=e.util.getData(p.$data,o,e.dataPathArr),E="exclusive"+s,w="exclType"+s,b="exclIsNumber"+s,S="' + "+(x="op"+s)+" + '";a+=" var schemaExcl"+s+" = "+P+"; ";var 
_;g=f;(_=_||[]).push(a+=" var "+E+"; var "+w+" = typeof "+(P="schemaExcl"+s)+"; if ("+w+" != 'boolean' && "+w+" != 'undefined' && "+w+" != 'number') { "),a="",!1!==e.createErrors?(a+=" { keyword: '"+(g||"_exclusiveLimit")+"' , dataPath: (dataPath || '') + "+e.errorPath+" , schemaPath: "+e.util.toQuotedString(l)+" , params: {} ",!1!==e.opts.messages&&(a+=" , message: '"+f+" should be boolean' "),e.opts.verbose&&(a+=" , schema: validate.schema"+n+" , parentSchema: validate.schema"+e.schemaPath+" , data: "+u+" "),a+=" } "):a+=" {} ";var F=a;a=_.pop(),a+=!e.compositeRule&&c?e.async?" throw new ValidationError(["+F+"]); ":" validate.errors = ["+F+"]; return false; ":" var err = "+F+"; if (vErrors === null) vErrors = [err]; else vErrors.push(err); errors++; ",a+=" } else if ( ",h&&(a+=" ("+t+" !== undefined && typeof "+t+" != 'number') || "),a+=" "+w+" == 'number' ? ( ("+E+" = "+t+" === undefined || "+P+" "+v+"= "+t+") ? "+u+" "+y+"= "+P+" : "+u+" "+y+" "+t+" ) : ( ("+E+" = "+P+" === true) ? "+u+" "+y+"= "+t+" : "+u+" "+y+" "+t+" ) || "+u+" !== "+u+") { var op"+s+" = "+E+" ? '"+v+"' : '"+v+"='; ",void 0===i&&(l=e.errSchemaPath+"/"+(g=f),t=P,h=m)}else{S=v;if((b="number"==typeof p)&&h){var x="'"+S+"'";a+=" if ( ",h&&(a+=" ("+t+" !== undefined && typeof "+t+" != 'number') || "),a+=" ( "+t+" === undefined || "+p+" "+v+"= "+t+" ? "+u+" "+y+"= "+p+" : "+u+" "+y+" "+t+" ) || "+u+" !== "+u+") { "}else{b&&void 0===i?(E=!0,l=e.errSchemaPath+"/"+(g=f),t=p,y+="="):(b&&(t=Math[d?"min":"max"](p,i)),p===(!b||t)?(E=!0,l=e.errSchemaPath+"/"+(g=f),y+="="):(E=!1,S+="="));x="'"+S+"'";a+=" if ( ",h&&(a+=" ("+t+" !== undefined && typeof "+t+" != 'number') || "),a+=" "+u+" "+y+" "+t+" || "+u+" !== "+u+") { "}}g=g||r,(_=_||[]).push(a),a="",!1!==e.createErrors?(a+=" { keyword: '"+(g||"_limit")+"' , dataPath: (dataPath || '') + "+e.errorPath+" , schemaPath: "+e.util.toQuotedString(l)+" , params: { comparison: "+x+", limit: "+t+", exclusive: "+E+" } ",!1!==e.opts.messages&&(a+=" , message: 'should be "+S+" ",a+=h?"' + "+t:t+"'"),e.opts.verbose&&(a+=" , schema: ",a+=h?"validate.schema"+n:""+i,a+=" , parentSchema: validate.schema"+e.schemaPath+" , data: "+u+" "),a+=" } "):a+=" {} ";F=a;return a=_.pop(),a+=!e.compositeRule&&c?e.async?" throw new ValidationError(["+F+"]); ":" validate.errors = ["+F+"]; return false; ":" var err = "+F+"; if (vErrors === null) vErrors = [err]; else vErrors.push(err); errors++; ",a+=" } ",c&&(a+=" else { "),a}},{}],14:[function(e,r,t){"use strict";r.exports=function(e,r){var t,a=" ",s=e.level,o=e.dataLevel,i=e.schema[r],n=e.schemaPath+e.util.getProperty(r),l=e.errSchemaPath+"/"+r,c=!e.opts.allErrors,u="data"+(o||""),h=e.opts.$data&&i&&i.$data;t=h?(a+=" var schema"+s+" = "+e.util.getData(i.$data,o,e.dataPathArr)+"; ","schema"+s):i,a+="if ( ",h&&(a+=" ("+t+" !== undefined && typeof "+t+" != 'number') || ");var d=r,f=f||[];f.push(a+=" "+u+".length "+("maxItems"==r?">":"<")+" "+t+") { "),a="",!1!==e.createErrors?(a+=" { keyword: '"+(d||"_limitItems")+"' , dataPath: (dataPath || '') + "+e.errorPath+" , schemaPath: "+e.util.toQuotedString(l)+" , params: { limit: "+t+" } ",!1!==e.opts.messages&&(a+=" , message: 'should NOT have ",a+="maxItems"==r?"more":"fewer",a+=" than ",a+=h?"' + "+t+" + '":""+i,a+=" items' "),e.opts.verbose&&(a+=" , schema: ",a+=h?"validate.schema"+n:""+i,a+=" , parentSchema: validate.schema"+e.schemaPath+" , data: "+u+" "),a+=" } "):a+=" {} ";var p=a;return a=f.pop(),a+=!e.compositeRule&&c?e.async?" 
throw new ValidationError(["+p+"]); ":" validate.errors = ["+p+"]; return false; ":" var err = "+p+"; if (vErrors === null) vErrors = [err]; else vErrors.push(err); errors++; ",a+="} ",c&&(a+=" else { "),a}},{}],15:[function(e,r,t){"use strict";r.exports=function(e,r){var t,a=" ",s=e.level,o=e.dataLevel,i=e.schema[r],n=e.schemaPath+e.util.getProperty(r),l=e.errSchemaPath+"/"+r,c=!e.opts.allErrors,u="data"+(o||""),h=e.opts.$data&&i&&i.$data;t=h?(a+=" var schema"+s+" = "+e.util.getData(i.$data,o,e.dataPathArr)+"; ","schema"+s):i,a+="if ( ",h&&(a+=" ("+t+" !== undefined && typeof "+t+" != 'number') || "),a+=!1===e.opts.unicode?" "+u+".length ":" ucs2length("+u+") ";var d=r,f=f||[];f.push(a+=" "+("maxLength"==r?">":"<")+" "+t+") { "),a="",!1!==e.createErrors?(a+=" { keyword: '"+(d||"_limitLength")+"' , dataPath: (dataPath || '') + "+e.errorPath+" , schemaPath: "+e.util.toQuotedString(l)+" , params: { limit: "+t+" } ",!1!==e.opts.messages&&(a+=" , message: 'should NOT be ",a+="maxLength"==r?"longer":"shorter",a+=" than ",a+=h?"' + "+t+" + '":""+i,a+=" characters' "),e.opts.verbose&&(a+=" , schema: ",a+=h?"validate.schema"+n:""+i,a+=" , parentSchema: validate.schema"+e.schemaPath+" , data: "+u+" "),a+=" } "):a+=" {} ";var p=a;return a=f.pop(),a+=!e.compositeRule&&c?e.async?" throw new ValidationError(["+p+"]); ":" validate.errors = ["+p+"]; return false; ":" var err = "+p+"; if (vErrors === null) vErrors = [err]; else vErrors.push(err); errors++; ",a+="} ",c&&(a+=" else { "),a}},{}],16:[function(e,r,t){"use strict";r.exports=function(e,r){var t,a=" ",s=e.level,o=e.dataLevel,i=e.schema[r],n=e.schemaPath+e.util.getProperty(r),l=e.errSchemaPath+"/"+r,c=!e.opts.allErrors,u="data"+(o||""),h=e.opts.$data&&i&&i.$data;t=h?(a+=" var schema"+s+" = "+e.util.getData(i.$data,o,e.dataPathArr)+"; ","schema"+s):i,a+="if ( ",h&&(a+=" ("+t+" !== undefined && typeof "+t+" != 'number') || ");var d=r,f=f||[];f.push(a+=" Object.keys("+u+").length "+("maxProperties"==r?">":"<")+" "+t+") { "),a="",!1!==e.createErrors?(a+=" { keyword: '"+(d||"_limitProperties")+"' , dataPath: (dataPath || '') + "+e.errorPath+" , schemaPath: "+e.util.toQuotedString(l)+" , params: { limit: "+t+" } ",!1!==e.opts.messages&&(a+=" , message: 'should NOT have ",a+="maxProperties"==r?"more":"fewer",a+=" than ",a+=h?"' + "+t+" + '":""+i,a+=" properties' "),e.opts.verbose&&(a+=" , schema: ",a+=h?"validate.schema"+n:""+i,a+=" , parentSchema: validate.schema"+e.schemaPath+" , data: "+u+" "),a+=" } "):a+=" {} ";var p=a;return a=f.pop(),a+=!e.compositeRule&&c?e.async?" 
[Elided: machine-generated, single-line minified bundles of the ajv JSON Schema validator from the vendored dependency update under tools/node_modules; their diff headers, and the header and opening hunk of the chalk/package.json diff that follows, are missing.]
diff --git a/tools/node_modules/eslint/node_modules/chalk/package.json b/tools/node_modules/eslint/node_modules/chalk/package.json
--- a/tools/node_modules/eslint/node_modules/chalk/package.json
+++ b/tools/node_modules/eslint/node_modules/chalk/package.json
-    "node": ">=8"
+    "node": ">=10"
   },
   "files": [
     "source",
     "index.d.ts"
   ],
+  "funding": "https://github.com/chalk/chalk?sponsor=1",
   "homepage": "https://github.com/chalk/chalk#readme",
   "keywords": [
     "color",
@@ -62,11 +63,15 @@
     "bench": "matcha benchmark.js",
     "test": "xo && nyc ava && tsd"
   },
-  "version": "3.0.0",
+  "version": "4.0.0",
   "xo": {
     "rules": {
       "unicorn/prefer-string-slice": "off",
-      "unicorn/prefer-includes": "off"
+      "unicorn/prefer-includes": "off",
+      "@typescript-eslint/member-ordering": "off",
+      "no-redeclare": "off",
+      "unicorn/string-content": "off",
+      "unicorn/better-regex": "off"
     }
   }
 }
\ No newline at end of file
diff --git a/tools/node_modules/eslint/node_modules/chalk/readme.md b/tools/node_modules/eslint/node_modules/chalk/readme.md
index 877cb93b7861f7..a0ca245604033b 100644
--- a/tools/node_modules/eslint/node_modules/chalk/readme.md
+++ b/tools/node_modules/eslint/node_modules/chalk/readme.md
@@ -9,11 +9,10 @@
 
 > Terminal string styling done right
 
-[![Build Status](https://travis-ci.org/chalk/chalk.svg?branch=master)](https://travis-ci.org/chalk/chalk) [![Coverage Status](https://coveralls.io/repos/github/chalk/chalk/badge.svg?branch=master)](https://coveralls.io/github/chalk/chalk?branch=master) [![npm dependents](https://badgen.net/npm/dependents/chalk)](https://www.npmjs.com/package/chalk?activeTab=dependents) [![Downloads](https://badgen.net/npm/dt/chalk)](https://www.npmjs.com/package/chalk) [![](https://img.shields.io/badge/unicorn-approved-ff69b4.svg)](https://www.youtube.com/watch?v=9auOCbH5Ns4) [![XO code style](https://img.shields.io/badge/code_style-XO-5ed9c7.svg)](https://github.com/xojs/xo) ![TypeScript-ready](https://img.shields.io/npm/types/chalk.svg)
+[![Build Status](https://travis-ci.org/chalk/chalk.svg?branch=master)](https://travis-ci.org/chalk/chalk) [![Coverage Status](https://coveralls.io/repos/github/chalk/chalk/badge.svg?branch=master)](https://coveralls.io/github/chalk/chalk?branch=master) [![npm dependents](https://badgen.net/npm/dependents/chalk)](https://www.npmjs.com/package/chalk?activeTab=dependents) [![Downloads](https://badgen.net/npm/dt/chalk)](https://www.npmjs.com/package/chalk) [![](https://img.shields.io/badge/unicorn-approved-ff69b4.svg)](https://www.youtube.com/watch?v=9auOCbH5Ns4) [![XO code style](https://img.shields.io/badge/code_style-XO-5ed9c7.svg)](https://github.com/xojs/xo) ![TypeScript-ready](https://img.shields.io/npm/types/chalk.svg) [![run on repl.it](http://repl.it/badge/github/chalk/chalk)](https://repl.it/github/chalk/chalk)
 
-
 ## Highlights
 
 - Expressive API
@@ -24,8 +23,7 @@
 - Doesn't extend `String.prototype`
 - Clean and focused
 - Actively maintained
-- [Used by ~46,000 packages](https://www.npmjs.com/browse/depended/chalk) as of October 1, 2019
-
+- [Used by ~50,000 packages](https://www.npmjs.com/browse/depended/chalk) as of January 1, 2020
 
 ## Install
 
@@ -33,7 +31,6 @@
 $ npm install chalk
 ```
 
-
 ## Usage
 
 ```js
@@ -107,7 +104,6 @@ console.log(chalk.green('Hello %s'), name);
 //=> 'Hello Sindre'
 ```
 
-
 ## API
 
 ### chalk.`