
Transaction lost on "Database returned more than the allowed response size limit" #38

Open
sammywachtel opened this issue Mar 10, 2022 · 2 comments

@sammywachtel

Hi there,

I have a table with a json column and 20 records. The json values are large, and the 20 records add up to a bit over 1 MB. This causes the Aurora Data API to throw an exception: "Database returned more than the allowed response size limit".
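
For context, the call that triggers it looks roughly like this (the table, column, ARNs, and database name below are placeholders, not my real names):

    import aurora_data_api

    # Placeholder cluster/secret ARNs and database name
    with aurora_data_api.connect(
        aurora_cluster_arn="arn:aws:rds:us-east-1:123456789012:cluster:my-cluster",
        secret_arn="arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret",
        database="my_db",
    ) as conn:
        with conn.cursor() as cur:
            # The 20 json payloads together exceed 1 MB, so the Data API raises
            # "Database returned more than the allowed response size limit"
            cur.execute("SELECT id, payload FROM documents")
            rows = cur.fetchall()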

The AuroraDataAPICursor.execute() method catches that error and attempts to paginate using a SCROLL CURSOR; however, when _start_paginated_query() tries to execute that statement, it gets a 400 back on line 169 of __init__.py (version 0.4.0):
    self._client.execute_statement(**execute_statement_args)

The error is:

botocore.errorfactory.BadRequestException: An error occurred (BadRequestException) when calling the ExecuteStatement operation: Transaction [transaction id] is not found

It isn't clear to me why the transaction is lost when it tries to execute the SCROLL CURSOR statement, but I can see in the RDS Data API logs that the error is being thrown by the Data API itself.

I can keep trying to dig into it, but I thought I'd post the issue since it might be useful here.

Let me know if I can provide any further detail. Thanks!

@kislyuk (Contributor) commented Mar 10, 2022

Hi, yes, this is a known issue. When this package was first written, Data API kept transactions open when this error was thrown, which allowed dynamic backoff to adjust the page size to enable paginated result fetching. Since then, Data API has started to abort transactions whenever this error is encountered, so this logic no longer works. I have spoken to the Data API team and they don't currently have a solution for this issue.

Because of this, the best workaround I currently have is to paginate manually with LIMIT..OFFSET. You can call _start_paginated_query() on the cursor:

https://github.com/cloud-utils/aurora-data-api/blob/main/aurora_data_api/__init__.py#L187

and then iterate over the cursor. I may or may not be able to update the API for this package to deal with this situation more gracefully.
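
For illustration, here is a rough sketch of the LIMIT..OFFSET approach (the table name, ARNs, and page size are placeholders; since _start_paginated_query() is a private method whose signature may change between versions, this sketch just paginates directly):

    import aurora_data_api

    PAGE_SIZE = 5  # tune so each page stays under the 1 MB response limit

    with aurora_data_api.connect(
        aurora_cluster_arn="arn:aws:rds:us-east-1:123456789012:cluster:my-cluster",
        secret_arn="arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret",
        database="my_db",
    ) as conn:
        with conn.cursor() as cur:
            offset = 0
            while True:
                # LIMIT/OFFSET are plain integers here, so direct interpolation
                # is safe for this sketch
                cur.execute(f"SELECT id, payload FROM documents ORDER BY id "
                            f"LIMIT {PAGE_SIZE} OFFSET {offset}")
                rows = cur.fetchall()
                if not rows:
                    break
                for row in rows:
                    ...  # process each row while the page is small
                offset += PAGE_SIZE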

@sammywachtel (Author)

That makes a lot of sense. Thanks for the response -- I'll either paginate or rethink the data structure.
