Import/Export: OOM issues & crash on large files #11005
This is the error we saw creating a backup during a migration:
This is the code line in question. Writing the backup in this way will cause OOM errors with large datasets because we push the whole database into an in-memory JS object, then call JSON.stringify on it.

Note for further research: determine how overall object size and the presence of large individual strings (i.e. huge posts) affect memory usage and how quickly limits are reached. In this particular case the cause was narrowed down to a single very large post.

To find a slightly more long-term solution we had a quick look at a number of options for streaming the creation of the JSON string and piping it into a file so that memory isn't exhausted. The most realistic appeared to be json-stream-stringify:

```js
const JsonStreamStringify = require('json-stream-stringify');

// `path`, `fs`, `config` and `urlUtils` are assumed to already be required by the surrounding exporter module
writeExportFile = function writeExportFile(exportResult) {
    const filename = path.resolve(urlUtils.urlJoin(config.get('paths').contentPath, 'data', exportResult.filename));

    // Serialise the export data as a stream instead of building one huge JSON string in memory
    const jsonStream = new JsonStreamStringify(exportResult.data);
    const fileStream = fs.createWriteStream(filename);

    jsonStream.pipe(fileStream);

    return new Promise((resolve, reject) => {
        jsonStream.once('error', reject);
        jsonStream.on('end', () => resolve(filename));
        fileStream.once('error', reject);
    });
};
```

For our example failing db I recorded the following times to complete a backup:
This shows that …

Other things found during research:
Note: This issue is conflating different problems with the export and import, buuuuuut they're fundamentally the same in that Ghost's lifecycle tooling has been outgrown by the volumes of data we're now handling, and we need to revisit them.
If you try to import a very large file into Ghost, Ghost can run out of memory and crash (full error below).
The first part of the error includes a "MaxListenersExceededWarning", which we also see in the import/export tests.
The last part of the error is "Fatal error: Cannot read property 'hardStop' of null" which appears to be a bug at first glance.
We have also seen recently that backups fail when there is a lot of very large content due to using JSON.stringify to create the export/backup.
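For context, here is a minimal sketch of why the JSON.stringify approach breaks down. This is not Ghost's actual exporter code and the names are illustrative: the point is that the entire export has to exist as a single string in memory before the first byte is written, and V8 also caps the length of any one string, so very large exports either exhaust the heap or throw `RangeError: Invalid string length`.

```js
const fs = require('fs');

// Illustrative sketch only, not Ghost's actual exporter:
// the whole export is serialised into one string before anything hits disk.
function writeExportFileInMemory(exportResult, filename) {
    // For a large database this string can exceed V8's maximum string length
    // or exhaust the heap before the write ever starts.
    const json = JSON.stringify(exportResult.data);
    return fs.promises.writeFile(filename, json);
}
```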
To Reproduce
[1]: I can provide an example if needed.
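Until then, a rough sketch of one way to generate an oversized import file for testing. The JSON structure and field names below only approximate Ghost's export format; they are not the exact schema:

```js
const fs = require('fs');
const {once} = require('events');

// Rough sketch: generate an oversized import file without holding the whole
// JSON string in memory ourselves. The structure and field names are only an
// approximation of Ghost's export format, not the exact schema.
async function generateHugeImport(filename, postCount = 500) {
    const out = fs.createWriteStream(filename);
    // ~2MB of HTML per post, pre-encoded as a JSON string literal
    const hugeHtml = JSON.stringify('<p>' + 'x'.repeat(2 * 1024 * 1024) + '</p>');

    const write = async (chunk) => {
        // Respect backpressure so the generator itself stays small in memory
        if (!out.write(chunk)) {
            await once(out, 'drain');
        }
    };

    await write(`{"db":[{"meta":{"exported_on":${Date.now()},"version":"2.0.0"},"data":{"posts":[`);
    for (let i = 0; i < postCount; i++) {
        await write((i ? ',' : '') + `{"title":"Post ${i}","slug":"post-${i}","status":"published","html":${hugeHtml}}`);
    }
    await write(']}}]}');
    out.end();
    await once(out, 'finish');
}

generateHugeImport('huge-import.json');
```

With the numbers above the file ends up around 1GB, which should be enough to exercise the importer's memory usage.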
Technical details: