Redash crashes on query that returns a lot of results #6608
Comments
Why don't you aggregate the data?
@trantrinhquocviet because I need that data exported in order to provide it to somebody who can't access my database.
I tried this locally (RPi with 8 GB RAM, a 2.7M-row query, 4 columns). This sounds about right to me: Redash loads all the data into memory, so if there's too much data we'll OOM. Likely something we can only fix with optimizations, and not a simple one-liner fix.
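For the export use case mentioned above, here is a minimal workaround sketch that bypasses Redash entirely: stream the result set straight to CSV with a server-side cursor so peak memory stays bounded regardless of result size. This assumes a PostgreSQL source accessed via psycopg2; the DSN, table, and column names are placeholders, not taken from this issue.

```python
# Workaround sketch (assumptions: PostgreSQL source, psycopg2 installed;
# the DSN, table, and column names below are placeholders).
import csv
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me host=localhost")
with conn:
    # A named cursor is server-side: rows stream in batches of `itersize`
    # instead of being materialized in memory all at once.
    with conn.cursor(name="big_export") as cur:
        cur.itersize = 10_000
        cur.execute("SELECT col_a, col_b, col_c, col_d FROM big_table")
        with open("export.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col.name for col in cur.description])  # header
            for row in cur:
                writer.writerow(row)
conn.close()
```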
Can you create a local report to import your data, and then connect it with Redash? Redash is not a database; it is just a data visualization tool.
I think a dataviz tool should be able to handle large datasets too, though. Maybe the limit is less than 3M rows, but a scatter plot, for example, should be able to handle a very large number of datapoints and could still be useful.
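On the aggregation and visualization point, a small sketch of how one might cut a multi-million-row result down to something a chart renders comfortably, assuming the data has already been exported (e.g. to the CSV above) and pandas is available; the column names are placeholders.

```python
# Sketch (assumptions: pandas available; "export.csv" and the column
# names are placeholders carried over from the export example above).
import pandas as pd

df = pd.read_csv("export.csv")

# Option 1: a random sample keeps the overall shape of a scatter plot.
sample = df.sample(n=100_000, random_state=0)

# Option 2: aggregate first, as suggested above (one row per group).
agg = df.groupby("col_a", as_index=False)["col_b"].mean()
```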
Large query results are not yet supported.
Issue Summary
When I try running a query returning a lot of data (more than 3M rows), I get the following output on the worker:
After some research, it seems to be related to the worker not having enough memory or something similar.
Is it even possible to run such queries in Redash?
Steps to Reproduce
Technical details:
commit a19b17b844063f215266286ea8bd185086e3e27a (HEAD -> master, origin/master, origin/HEAD)