postpack is not called after yarn pack #7924
Labels
fixed-in-modern
This issue has been fixed / implemented in Yarn 2+.
Comments
cc @haochuan
This issue appears to continue into
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 15, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
In some versions of Node.js (such as 12.16.2), `yarn pack` was no longer running the `postpack` hook. Debugging the code in question (following example) led us to notice that the following `await` never progressed: the stream's `error` and `close` handlers were never invoked. In fact, the process appeared to exit prematurely.

```js
// src/cli/commands/pack.js
await new Promise((resolve, reject) => {
  stream.pipe(fs2.createWriteStream(filename));
  stream.on('error', reject); // reject is never invoked
  stream.on('close', resolve); // resolve is never invoked
  // reached
}); // never reached
```

### What is going on here?

As it turns out, the above stream code is unsafe, and only appeared to work due to a specific implementation detail of `zlib.Gzip`. Once that implementation detail changed in Node.js, the process would exit while awaiting the above promise, leaving the code which triggers the `postpack` hook unreachable.

### The facts

1. A Node.js process exits once its event queue has been drained and it has no outstanding referenced handles (such as IO, timers, etc.).
2. `stream.pipe(…)` does not add a task to the event queue.
3. `new Promise(…)` does not add a task to the event queue.
4. Prior to Node 12.16, an implementation detail of `zlib.Gzip` added a task to the event queue.
5. nodejs/node@0e89b64 changed that behavior (confirmed via bisect).
6. In Node 12.16 (and several other versions), `yarn pack` would exit prior to invoking the `postpack` hook.

Mystery solved!

### That's a lot going on, so how can one safely use streams?

Luckily, Node.js has a newer utility method (Node >= 10): `const { pipeline } = require('stream');`. This higher-level utility has fewer caveats than `stream.pipe` / `stream.on(…)` and appears to be preferred: Node.js's own documentation has been reworked to use `pipeline` for all but the most specific low-level operations.

In short, rather than:

```js
stream.pipe(otherStream);
```

under most circumstances it is likely wise to use:

```js
const { pipeline } = require('stream');

pipeline(stream, otherStream, err => {
  // node-back
});
```

And if you are using streams with promises, consider first promisifying `pipeline`:

```js
const { promisify } = require('util');
const pipeline = promisify(require('stream').pipeline);
```
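To make facts 1-3 concrete, here is a minimal standalone sketch (not code from the issue; the `PassThrough` streams are stand-ins for the gzip and file streams). Nothing in it registers a handle with the event loop, so the process exits as soon as the synchronous work finishes and "after" is never printed.

```js
const { PassThrough } = require('stream');

const src = new PassThrough(); // stand-in for the tar/gzip stream
const dst = new PassThrough(); // stand-in for the file write stream

(async () => {
  console.log('before');
  await new Promise((resolve, reject) => {
    src.pipe(dst);            // does not add a task to the event queue
    src.on('error', reject);  // never fires here
    src.on('close', resolve); // never fires here
  });
  console.log('after'); // never printed: the process has nothing left queued
})();
```

And here is a sketch, under the same assumptions, of how the pack step could use the promisified `pipeline` so the awaited promise settles and the code after it (including the `postpack` hook) stays reachable. The `writeTarball` helper name and the plain `fs` import are illustrative; this is not the exact code in `src/cli/commands/pack.js`.

```js
const fs = require('fs');
const { promisify } = require('util');
const pipeline = promisify(require('stream').pipeline);

// Hypothetical helper: stream the packed tarball to disk.
// Per the analysis above, pipeline() is the safer way to wire these streams:
// the returned promise settles on completion or error, so code after the
// await (for example the postpack hook) remains reachable.
async function writeTarball(stream, filename) {
  await pipeline(stream, fs.createWriteStream(filename));
}
```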
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue on Jun 16, 2020
Fixed in v2
paul-soporan added the fixed-in-modern label on Jan 2, 2021
copybara-service bot pushed a commit to google/safevalues that referenced this issue on Jan 27, 2021
…ight place in the package postpack doesn't get invoke by yarn, but this has been fixed in the latest version: yarnpkg/yarn#7924 PiperOrigin-RevId: 354042434
copybara-service bot pushed a commit to google/safevalues that referenced this issue on Jan 27, 2021
…ight place in the package postpack doesn't get invoke by yarn, but this has been fixed in the latest version: yarnpkg/yarn#7924 PiperOrigin-RevId: 354055453
Bug description

`postpack` is not called after `yarn pack` has completed. Also, the success message about creating the tarball during `yarn pack` no longer shows up. This only happens on node 12.16.*.

What is the current behavior?

When executing the command `yarn pack`, yarn invokes the user-defined script `prepack` before packing but does not invoke `postpack`.

What is the expected behavior?

When executing the command `yarn pack`, yarn should invoke the user-defined script `prepack` before and `postpack` after the packaging.

Steps to Reproduce

1. Create a new package (`yarn init`)
2. Add `prepack` and `postpack` scripts to `package.json` (see the example below)
3. Run `yarn pack`
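As an illustration of step 2, a minimal `package.json` could look like the following; the package name and the echo commands are placeholders, not taken from the original report:

```json
{
  "name": "postpack-repro",
  "version": "1.0.0",
  "scripts": {
    "prepack": "echo running prepack",
    "postpack": "echo running postpack"
  }
}
```

Per the report, running `yarn pack` on Node 12.16.x prints only the prepack line, whereas both lines (and the tarball success message) are expected.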
Expected Output:
Actual Output:
Environment

Node: 12.16.0 & 12.16.1
Yarn: 1.22.0