Close stream after calling appendRows error #348
Comments
Hi @jyeros - to diagnose this particular issue, I will need more details on the error, particularly the debug output from the failed RPC call. With something like this, it's important to double-check the request headers that are being sent. You may also give our new client library package a try; you can see a sample here, which includes …
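A minimal sketch of how that debug output can be captured, assuming only the standard grpc-js tracing environment variables (nothing specific to this library or issue):

```js
// Sketch: GRPC_VERBOSITY and GRPC_TRACE are read when @grpc/grpc-js loads,
// so set them before requiring the library (or export them in the shell,
// e.g. `GRPC_VERBOSITY=DEBUG GRPC_TRACE=all node append_rows_pending.js`).
process.env.GRPC_VERBOSITY = 'DEBUG';
process.env.GRPC_TRACE = 'all';

const {managedwriter} = require('@google-cloud/bigquery-storage');

// ...run the failing appendRows / close sequence here; the channel and call
// traces (including the request headers) are written to stderr and can be
// attached to the issue.
```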
@loferris Here is a patch on the sample for the current main (1867062) that reproduces the problem.

```diff
diff --git a/samples/append_rows_pending.js b/samples/append_rows_pending.js
index 324c79d..f2cb7e6 100644
--- a/samples/append_rows_pending.js
+++ b/samples/append_rows_pending.js
@@ -20,7 +20,7 @@ function main(
tableId = 'my_table'
) {
// [START bigquerystorage_append_rows_pending]
- const {adapt, managedwriter} = require('@google-cloud/bigquery-storage');
+ const {adapt, managedwriter} = require('../build/src');
const {WriterClient, Writer} = managedwriter;
const customer_record_pb = require('./customer_record_pb.js');
@@ -73,49 +73,6 @@ function main(
let serializedRows = [];
const pendingWrites = [];
- // Row 1
- let row = {
- rowNum: 1,
- customerName: 'Octavia',
- };
- serializedRows.push(CustomerRecord.encode(row).finish());
-
- // Row 2
- row = {
- rowNum: 2,
- customerName: 'Turing',
- };
- serializedRows.push(CustomerRecord.encode(row).finish());
-
- // Set an offset to allow resuming this stream if the connection breaks.
- // Keep track of which requests the server has acknowledged and resume the
- // stream at the first non-acknowledged message. If the server has already
- // processed a message with that offset, it will return an ALREADY_EXISTS
- // error, which can be safely ignored.
-
- // The first request must always have an offset of 0.
- let offsetValue = 0;
-
- // Send batch.
- let pw = writer.appendRows({serializedRows}, offsetValue);
- pendingWrites.push(pw);
-
- serializedRows = [];
-
- // Row 3
- row = {
- rowNum: 3,
- customerName: 'Bell',
- };
- serializedRows.push(CustomerRecord.encode(row).finish());
-
- // Offset must equal the number of rows that were previously sent.
- offsetValue = 2;
-
- // Send batch.
- pw = writer.appendRows({serializedRows}, offsetValue);
- pendingWrites.push(pw);
-
const results = await Promise.all(
pendingWrites.map(pw => pw.getResult())
    );
```

The results with GRPC tracing are as follows:
It looks like it is the …
Getting really railed by this same issue. Using the async write client in Python.
@z3z1ma @jdleesmiller @jyeros sorry for the delay on this, I managed to reproduce the issue and created PR #439 to fix it. Once it is merged I'll publish a new version with the fix.
Ending an appendRows stream without writing any data causes:

Error: 3 INVALID_ARGUMENT: Cannot route on empty project id ''

Environment details

@google-cloud/bigquery-storage version: 3.4.0

Steps to reproduce
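For illustration only (not the reporter's original code), a minimal sketch of the behaviour described above, assuming the low-level v1 BigQueryWriteClient shipped in 3.4.0:

```js
// Hypothetical repro sketch: open the bidirectional appendRows stream and
// end it without ever writing an AppendRowsRequest.
const {v1} = require('@google-cloud/bigquery-storage');

function main() {
  const client = new v1.BigQueryWriteClient();

  // appendRows() opens the bidirectional AppendRows stream.
  const stream = client.appendRows();

  stream.on('error', err => {
    // With no AppendRowsRequest ever written, nothing identifies a
    // destination write stream, and the call fails with
    // "3 INVALID_ARGUMENT: Cannot route on empty project id ''".
    console.error(err);
  });

  // Ending the stream without writing any data triggers the error above.
  stream.end();
}

main();
```

The same error appears to reproduce with the managedwriter sample once every appendRows call is removed, as in the patch earlier in this thread.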