Memory exhausted at 1024M in queued export with FromQuery method #2209
I'm having the same issue; tried changing the chunk size, but it didn't help.
I tried too, and it didn't help. I'm planning to do the export via the Linux echo command to a CSV file, without the Laravel-Excel package. I'll attach my results here later.
There's currently not a whole lot we can do about the increasing memory size. Unfortunately, PhpSpreadsheet doesn't have a way to re-open a spreadsheet without loading it into memory. In Laravel-Excel 3.2 this will be slightly improved, as it will have the option to use cell caching (which decreases memory usage). We are also planning to release a package specifically designed to handle big data exports (using a queue).
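For context on what cell caching means here: PhpSpreadsheet can swap its in-memory cell store for any PSR-16 cache via `Settings::setCache()`. A minimal sketch, assuming some PSR-16 implementation is available (`$psr16Cache` below is a placeholder for whichever one you use):

```php
use PhpOffice\PhpSpreadsheet\Settings;

// $psr16Cache is any Psr\SimpleCache\CacheInterface implementation
// (APCu, Redis, etc.). Cells are then serialized into that cache
// instead of being held in PHP memory while the sheet is built.
Settings::setCache($psr16Cache);
```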
Unfortunately, the Laravel Storage facade doesn't support appending to a file without reading it first. This is my solution (see the sketch below):

In the ExportBulkContent job:

In the ExportBulkRows job:

And one …
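The snippets above weren't preserved, but the core idea can be sketched with plain PHP file functions: open the export file in append mode so its existing contents are never read back into memory. The path and row shape here are made up for illustration:

```php
// Append mode ('a') positions the pointer at the end of the file
// without loading what is already there.
$handle = fopen(storage_path('app/exports/bulk.csv'), 'a');

foreach ($rows as $row) {
    // $row is a plain array of column values for one CSV line.
    fputcsv($handle, $row, ';');
}

fclose($handle);
```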
On large file exports, PhpSpreadsheet opens the file every time, and this causes the server to run out of memory due to the size of the file, similar to the issue above. You mentioned there is another package to handle large data. Any idea when it will be released?
Probably in a few weeks.
Hi @patrickbrouwers, is there any status update on the package for large datasets?
@patrickbrouwers I think "a few weeks" have passed by now 😅 Is there anything we can do to get 3.2 shipped? We are also stumbling upon memory issues and would really love to see if this fixes our problems. I don't see any definition of 3.2 in the milestones, so I am wondering what would define it as "done".
Hey @patrickbrouwers! I've been searching for a way to export large volumes too, and even tried … They all result in a memory issue. Is there a way to implement some optimisations right now to fix this?
Also looking for this. Dealing with some exports with very large data sets.
To be honest, the maintainers don't seem to care about large data sets, as their own use cases don't seem to involve them. This has been an issue for over a year at this point and they haven't addressed it yet… good luck.
@patrickbrouwers Any update on this?
Lack of progress is unrelated to us not caring; it's simply a lack of time to work on this project. :( That's open source, you know :) In the meantime you can have a look at https://stefanzweifel.io/posts/lazy-collections-in-laravel-excel/. According to his benchmarks, he's able to export millions of rows without running into a memory limit. We are working on making this the default behaviour for exports in 3.2; given the lack of time, I really can't make any promises on timelines. Hope you all understand :)
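The linked approach boils down to returning a `LazyCollection` from a `FromCollection` export, so rows are hydrated one at a time from a database cursor instead of all at once. A minimal sketch, assuming an Eloquent model (`User` here is hypothetical):

```php
use App\Models\User;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromCollection;

class UsersExport implements FromCollection
{
    use Exportable;

    public function collection()
    {
        // cursor() returns a LazyCollection: each row is fetched and
        // hydrated only when the writer asks for it, keeping memory flat.
        return User::cursor();
    }
}
```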
That unfortunately doesn't work when dispatching the export to a queue (without a queue it works fine). When you implement …
I'm using Laravel 9 with the package version … Using a non-lazy collection isn't really an option here, as I'm loading a lot of data. EDIT: after a bit of research on this issue, it appears this functionality was removed in a past commit. I've decided to keep the lazy collection and just run it in a custom background task. Shame it's not possible at present using Laravel Excel.
Can you show me the commit that removed it? I think in 2020 LazyCollection still extended Collection, and now it doesn't anymore.
Appears to be d3560cc#diff-eb05512b565745cc1514fe33aa948e31f9ead7e03abb8cf18a3a2a00354d2ccf, which adds a `Collection` return type. Would a union return type fix this?

```php
private function exportCollection(
    FromCollection $export,
    TemporaryFile $temporaryFile,
    string $writerType,
    int $sheetIndex
): Collection|LazyCollection {
```
We cannot use union types at the moment, as we still support older PHP versions. I would be fine with removing the return type (and doing it via doc blocks again) for now, until a major release, if someone PRs it.
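Concretely, the suggested interim fix would drop the native return type and document both types in the doc block, which parses on every supported PHP version. Roughly:

```php
/**
 * @return \Illuminate\Support\Collection|\Illuminate\Support\LazyCollection
 */
private function exportCollection(
    FromCollection $export,
    TemporaryFile $temporaryFile,
    string $writerType,
    int $sheetIndex
) {
    // ... body unchanged
}
```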
@patrickbrouwers please see the PR #4044
Prerequisites
Versions
Description
Memory limit is 1024M. When I try to export big data (400,000 rows) with a queue, I get a memory-exhausted error. One `php artisan queue:work` process uses a large amount of memory while executing the export: about 40M at the start and 900M+ by the end. I think the FromQuery method uses so much memory because, when it tries to append rows, it first loads (initializes) the temp file, which grows large as it fills with rows.

I found some advice here to use the FromIterator method instead of FromQuery, but there is no documentation and no example of how to use it.

So how can I use the FromIterator method with a queue? Or why does the FromQuery method use so much memory? Can it append rows without loading the whole file? Maybe it's better to append to a text file without loading the file itself (like the Linux command `echo "'First column';'Second column';\r\n'Second row';\r\n" >> export.csv`), then convert it to XLSX?
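For illustration, a sketch of what a FromIterator export might look like, assuming the concern expects an `iterator()` method returning a PHP `Iterator` (the model and columns are hypothetical):

```php
use Iterator;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromIterator;

class BulkExport implements FromIterator
{
    use Exportable;

    public function iterator(): Iterator
    {
        // A generator satisfies Iterator and yields one row at a time,
        // so rows are produced lazily instead of being buffered.
        foreach (SomeModel::query()->cursor() as $model) {
            yield [$model->id, $model->name];
        }
    }
}
```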
Steps to Reproduce

Create a table with 400,000 rows and try to export it with the queue method:
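The original snippet wasn't preserved; the documented queued-export pattern that matches the job named below looks roughly like this:

```php
use App\Exports\BulkExport;
use App\Jobs\NotifyUserOfCompletedExport;

// queue() breaks the export into chained append jobs; chain() runs
// the notification job only after the last append job finishes.
(new BulkExport)->queue('bulk.xlsx')->chain([
    new NotifyUserOfCompletedExport($bulk),
]);
```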
Expected behavior:

The job `NotifyUserOfCompletedExport($bulk)` should be executed.

Actual behavior:

`php artisan queue:work` crashes with a 1024M memory-exhausted error.

Additional Information
Here is the code of my controller:

In the BulkExport file:
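Neither snippet survived; based on the description, the controller presumably matched the dispatch sketch under "Steps to Reproduce", and the export class followed the standard FromQuery pattern, roughly (model name hypothetical):

```php
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;

class BulkExport implements FromQuery
{
    use Exportable;

    public function query()
    {
        // FromQuery runs the query in chunks on the queue, but each
        // append job re-opens the growing temporary file first.
        return SomeModel::query();
    }
}
```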