I have a table with a JSON column and 20 records. The JSON values are large, and the 20 records add up to a bit over 1 MB. This causes the Aurora Data API to throw an exception: "Database returned more than the allowed response size limit".
The AuroraDataAPICursor.execute() method catches that error and attempts to paginate using a SCROLL CURSOR. However, when it tries to execute that in _start_paginated_query(), it gets a 400 response back on line 169 of __init__.py (version 0.4.0): self._client.execute_statement(**execute_statement_args)
Error is:
botocore.errorfactory.BadRequestException: An error occurred (BadRequestException) when calling the ExecuteStatement operation: Transaction [transaction id] is not found
It isn't clear to me why the transaction is lost when it tries to execute the SCROLL CURSOR statement, but I can see in the RDS Data API logs that the error is being thrown by the Data API itself.
I can keep trying to dig into it, but I thought I'd post the issue since it might be useful here.
Thanks, and let me know if I can provide any further detail!
Hi, yes, this is a known issue. When this package was first written, Data API kept transactions open when this error was thrown, which allowed dynamic backoff to adjust the page size to enable paginated result fetching. Since then, Data API has started to abort transactions whenever this error is encountered, so this logic no longer works. I have spoken to the Data API team and they don't currently have a solution for this issue.
Because of this, the best workaround I currently have is to paginate manually with LIMIT..OFFSET, or to call _start_paginated_query() on the cursor directly.
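For reference, a minimal sketch of the LIMIT..OFFSET workaround. The `fetch_page` stub below stands in for a real `cursor.execute()` against an Aurora cluster (shown in the comments), and the page size of 5 rows is an assumption you would tune so each response stays under the Data API's size limit:

```python
PAGE_SIZE = 5  # rows per request; tune so each page stays under the ~1 MB limit


def paginate(fetch_page, page_size=PAGE_SIZE):
    """Yield all rows by issuing repeated LIMIT..OFFSET queries.

    fetch_page(limit, offset) must return a (possibly empty) list of rows.
    """
    offset = 0
    while True:
        rows = fetch_page(page_size, offset)
        if not rows:
            break  # an empty page means we have read everything
        yield from rows
        offset += page_size


# In real use, fetch_page would wrap a query on the cursor, e.g.:
#   cursor.execute(
#       "SELECT id, doc FROM my_table ORDER BY id LIMIT %s OFFSET %s",
#       (limit, offset),
#   )
#   return cursor.fetchall()
# Here a stand-in over 20 fake rows keeps the sketch self-contained:
def fake_fetch_page(limit, offset, _rows=list(range(20))):
    return _rows[offset:offset + limit]


all_rows = list(paginate(fake_fetch_page))
```

Note that an ORDER BY on a stable key is needed for LIMIT..OFFSET pagination to return consistent, non-overlapping pages.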