The crawler ran out of memory due to PhantomJS processes hanging around. We need to improve the rendering behaviour, but also ensure the processes get killed either way. I'm not sure what the problem URIs are. The following process got killed, so it may be a useful test case:
So, due to some apparent bugs/oddities in PhantomJS, page.open can fail, after which the process hangs rather than exiting cleanly.
Some clean-up of the code (see 18f894d and preceding commits) and testing indicated that the simplest option is to wait and then force the exit, whether or not page.open reported success. This fix is included in v2.0.8.
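The forced-exit approach described above could be sketched roughly like this (the actual change is in 18f894d; the function name is illustrative, and `exitFn` stands in for `phantom.exit` so the pattern can be exercised outside PhantomJS):

```javascript
// Sketch, not the project's actual code: open a page but always exit
// after a fixed wait, whether or not page.open ever calls back.
function renderWithForcedExit(page, uri, waitMs, exitFn) {
    var ok = false;
    page.open(uri, function (status) {
        ok = (status === 'success');
        // ...render/capture the page here on success...
    });
    // Hard deadline: exit after waitMs regardless of what page.open
    // reported, so a hung PhantomJS process cannot linger and leak memory.
    setTimeout(function () { exitFn(ok ? 0 : 1); }, waitMs);
}
```

In a real PhantomJS script, `page` would be `require('webpage').create()` and `exitFn` would be `phantom.exit`; the key point is that the exit is driven by the timer, not by the `page.open` callback.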
IIRC I've also seen a lot of hanging processes from rendering http://www.thejc.com/
There are a lot of instances of http://www.theatrffynnon.co.uk/ and http://www.ysgolllanfynydd.co.uk/ at the moment, so those might be slow/difficult.