
feedme_logs table causes 500 error because of allowed memory limit with large amount of logs #1487

Closed
jamesmacwhite opened this issue Aug 1, 2024 · 9 comments

@jamesmacwhite
Contributor

Description

On a production environment, trying to view the logs caused a 500 error. It turned out the request exhausted the memory limit configured for PHP.

web_php_error 2024-08-01T21:24:40Z PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 20480 bytes) in /srv/app/nttmcoll-api/htdocs/vendor/yiisoft/yii2/db/Command.php on line 1194
apache_access xx.xx.xx.xx - - [01/Aug/2024:21:24:38 +0000] "GET /manage/feed-me/logs?p=manage/feed-me/logs HTTP/1.1" 500 304 "https://domain.com/manage/feed-me/feeds" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36 Edg/126.0.0.0"

I'm aware logs have now been moved back into the database.

I have updated the logging setting to error only in production, but this highlights the need for automatic cleaning/truncating of the log table. For a site with multiple feeds running regularly, the table can quickly fill up and bloat the database, as well as cause problems loading all the log data on the front end.
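For reference, restricting logging to errors can be done in the plugin's config file. A sketch, based on the `logging` setting mentioned above and Craft's standard multi-environment config format; adjust the environment keys to your setup:

```php
<?php
// config/feed-me.php
return [
    '*' => [],
    'production' => [
        // Only log errors in production instead of the verbose default.
        'logging' => 'error',
    ],
];
```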

@knufri

knufri commented Aug 2, 2024

I have the same issue. I run multiple feeds every night with a lot of entries. Clicking on the log tab causes a 500 error.

@jamesmacwhite
Contributor Author

jamesmacwhite commented Aug 2, 2024

@knufri A couple of things I looked at today to work around the issue:

  1. In production, set the `logging` value in the feed-me.php config to `error`; it defaults to very verbose logging.
  2. A quick and dirty fix is truncating the `feedme_logs` table. There's an option to do this on the logs page in the Control Panel, but if you can't load that page due to the 500 error, you can truncate the table via a content migration or run the SQL directly.
  3. You can write a console command that queries the `feedme_logs` table for rows older than a given period and deletes them; something like this will do the job:
```php
// Assumes these imports at the top of your console controller:
// use Craft;
// use craft\feedme\Plugin;
// use yii\base\ExitCode;
// use yii\helpers\BaseConsole;

// Delete log rows older than one week (log_time holds a Unix timestamp).
$rows = Craft::$app->getDb()->createCommand()
    ->delete(Plugin::$plugin->logs::LOG_TABLE, 'log_time <= ' . strtotime('-1 week'))
    ->execute();

$this->stdout(Craft::t('plugin-handle', "Deleted $rows {rowsLabel} from the feedme_logs table.", [
    'rowsLabel' => ngettext('row', 'rows', $rows),
]) . PHP_EOL, BaseConsole::FG_GREEN);

return ExitCode::OK;
```

The log table has a log_time column, which is a timestamp, so we can do a date comparison. 1 week is arbitrary, but you get the idea. Should help a bit.
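For the "run it direct as SQL" route in option 2, a sketch of the equivalent statements, assuming the default `feedme_logs` table name and MySQL; back up the table first if you might need the history:

```sql
-- Remove all Feed Me log rows:
TRUNCATE TABLE feedme_logs;

-- Or delete only rows older than one week
-- (log_time holds a Unix timestamp):
DELETE FROM feedme_logs
WHERE log_time <= UNIX_TIMESTAMP(DATE_SUB(NOW(), INTERVAL 1 WEEK));
```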

@knufri

knufri commented Aug 2, 2024

Hello @jamesmacwhite,

thanks for the info 👍 I set the logging value in the feed-me.php config to error and cleared the database table manually. Now my database is about 400 MB smaller 🙈. I hope this is enough for future logging.

@timkelty
Contributor

I'm thinking we should:

  • Add a maxAge setting for logs
  • Add a CLI command to purge logs, e.g. feed-me/logs/clear
  • Trigger log clearing on Craft's garbage collection
  • Make sure the logs table is properly paginated so it loads regardless of how large the table is

Does that sound like it would address your issues @jamesmacwhite @knufri?

/cc @jeffreyzant
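The garbage-collection idea could be wired up roughly like this. A hypothetical sketch, not the plugin's actual code: it assumes a `maxAge` setting in seconds and hooks Craft's `Gc::EVENT_RUN` event from the plugin's `init()`:

```php
use Craft;
use craft\services\Gc;
use yii\base\Event;

// Purge old log rows whenever Craft runs garbage collection.
Event::on(Gc::class, Gc::EVENT_RUN, function () {
    $maxAge = 604800; // hypothetical maxAge setting, e.g. one week in seconds

    Craft::$app->getDb()->createCommand()
        ->delete('{{%feedme_logs}}', ['<=', 'log_time', time() - $maxAge])
        ->execute();
});
```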

@jamesmacwhite
Contributor Author

@timkelty Definitely all great options for reducing the impact of a large log table on the front end! Max age is a good one, as that's the main issue currently. Pagination will make web requests better, and the others are welcome bonuses!

@jamesmacwhite
Contributor Author

One addition: I would consider changing the default log behaviour to errors only, as the current default is very verbose. I can see the reasons behind that, but it does log an awful lot out of the box.

@timkelty
Contributor

@jamesmacwhite totally agree. Once we figure out the remaining issues with logging, we'll probably publish a major version with that update since it would technically be a breaking change.

@timkelty
Contributor

After peeking at the code, it looks like the logs table has always been limited to 300 items. As a quick fix, I limited the query instead of the result, which won't change the behavior but should fix the issue with the page not loading.

5.7.0/6.3.0 have been released with this fix, along with a feed-me/logs/clear console command.
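Until automatic clearing lands, the new console command can be run on a schedule. A sketch, assuming the standard `php craft` entry point for Craft console commands; paths in the cron line are illustrative:

```shell
# Purge Feed Me logs manually:
php craft feed-me/logs/clear

# Or nightly at 03:00 via cron:
# 0 3 * * * cd /var/www/site && php craft feed-me/logs/clear
```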

@jamesmacwhite
Contributor Author

Thanks for the initial work here. It will certainly help in the short term, and the longer-term areas discussed should allow for more flexibility.
