
Uncatchable error when upload stream is used in failed pipeline. #2560

Open
thebenperson opened this issue Dec 3, 2024 · 1 comment


thebenperson commented Dec 3, 2024


A screenshot that you have tested with "Try this API".

N/A. This issue seems to be related to the client library's error handling ability and not the underlying API.

Link to the code that reproduces this issue. A link to a public Github Repository or gist with a minimal reproduction.

https://gist.github.com/thebenperson/c6c1e66653fdc8c05cab52726b2f0664

A step-by-step description of how to reproduce the issue, based on the linked reproduction.

  1. Create a readable input stream.
  2. Create a writable output stream by calling File.createWriteStream().
  3. Call Stream.promises.pipeline(input, output).
  4. Have the input stream emit() an error before it can provide any data (a minimal sketch of these steps follows below).
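
Below is a minimal sketch of the steps above. The bucket and file names are placeholders, and it assumes default application credentials; the linked gist is the authoritative reproduction.

```js
const { Readable } = require('node:stream');
const { pipeline } = require('node:stream/promises');
const { Storage } = require('@google-cloud/storage');

// 1. A readable input stream that never produces data.
const input = new Readable({ read() {} });

// 2. A writable output stream from File.createWriteStream().
const output = new Storage().bucket('my-bucket').file('my-file').createWriteStream();

// 3. Pipe the input into the output.
pipeline(input, output)
  .then(() => console.log('upload finished'))
  .catch((err) => console.error('caught:', err.message)); // expected: rejection lands here

// 4. Emit an error before any data has been provided.
input.emit('error', new Error('boom'));
// Actual behavior: the process exits with an unhandled 'error' event instead.
```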

A clear and concise description of what the bug is, and what you expected to happen.

Expected behavior: Stream.promises.pipeline() promise is rejected.
Actual behavior: Program ends due to an unhandled "error" event.

I might have found the problem.

The error from the input stream is passed on to the output stream, but no data has been provided by the input stream yet.

This means that this callback hasn't been called yet:

writeStream.once('writing', () => {

which means pipeline() hasn't been called here yet:

pipeline(

So there is nothing attached to handle the error when emitStream.destroy(e) is called here:

emitStream.destroy(e);
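
In other words (a generic Node.js illustration, not the library's actual code): destroying a stream that has no 'error' listener and is not yet part of a pipeline turns the error into an uncaught 'error' event that crashes the process, which appears to be what happens to the internal emitStream before the internal pipeline() has been started.

```js
const { PassThrough } = require('node:stream');

// A stream with no 'error' listener and no pipeline watching it.
const orphan = new PassThrough();

// destroy(err) emits 'error'; with nobody listening, Node raises an
// unhandled 'error' event and the process exits.
orphan.destroy(new Error('boom'));
```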

A clear and concise description of WHY you expect this behavior, i.e., was it a recent change, is there documentation that points to this behavior, etc.

Other types of writable streams do not cause errors like this one.
The writable stream from this library is interfering with the pipeline() function's ability to handle errors.

product-auto-label bot added the api: storage label (Issues related to the googleapis/nodejs-storage API) on Dec 3, 2024
ddelgrosso1 added the type: bug label (Error or flaw in code with unintended results or allowing sub-optimal usage patterns) on Dec 3, 2024

seanlennaerts commented Feb 28, 2025

I've encountered this same issue when using pipeline() with a GCP storage writer stream. When an error occurs upstream in the pipeline, it results in an uncaught error because the GCP storage writer doesn't propagate the error to the pipeline correctly.

The error propagates correctly with fs.createWriteStream but not with GCP's writer. For comparison:

  • Works: pipeline(reader, transformerThatEmitsError, fs.createWriteStream())
  • Fails with unhandled error: pipeline(reader, transformerThatEmitsError, GCP storage writer) (sketched below)
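
A sketch of that comparison (names and the output path are placeholders, and the failing case assumes a GCS write stream created as in the original report):

```js
const fs = require('node:fs');
const { Readable, Transform } = require('node:stream');
const { pipeline } = require('node:stream/promises');

const reader = Readable.from(['some data']);
const transformerThatEmitsError = new Transform({
  transform(chunk, encoding, callback) {
    callback(new Error('upstream failure')); // fail mid-pipeline
  },
});

// Works: the error surfaces as a catchable rejection.
pipeline(reader, transformerThatEmitsError, fs.createWriteStream('/tmp/out'))
  .catch((err) => console.error('fs writer, caught:', err.message));

// Fails (per this report): the same shape, with the GCS writer as the
// destination, crashes with an unhandled 'error' event instead of rejecting:
// pipeline(reader, transformerThatEmitsError, gcsWriter).catch(...);
```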

Hope this additional context helps resolve the issue!

edit:

So I think @thebenperson is correct about the problem. If you write something to the writer (even an empty string) before calling pipeline(), then errors are handled properly:

```js
const { pipeline } = require('node:stream/promises');
const { Storage } = require('@google-cloud/storage');

const writer = new Storage().bucket('bucket').file('file').createWriteStream();

// Call write() once so the 'writing' event gets emitted and the internal pipeline is initialized.
writer.write('');

await pipeline(reader, transformerThatEmitsError, writer);
```
