
Allow --max-requests and --max-requests jitter parameters for python wrapper #911

Closed · kparaju opened this issue Oct 4, 2019 · 3 comments · Fixed by #925
kparaju commented Oct 4, 2019

These are parameters we use in our internal model-serving platform that would be very beneficial for Seldon.

http://docs.gunicorn.org/en/stable/settings.html#preload-app
--preload allows you to initialize the model once and share it between all your gunicorn workers. This can save a lot of memory if your model is particularly large (or if you need to load lots of data as part of init).
If the model is loaded in __init__, it serves the same purpose as preload (see the sketch below).
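To make the point above concrete, here is a minimal sketch of a Seldon-style Python model that does its heavy loading in __init__; the class name and the "model.pkl" path are illustrative, not part of Seldon's API:

```python
# Illustrative sketch only: a Seldon-style Python model that loads its
# weights once in __init__. "model.pkl" is a placeholder path.
import pickle

class MyModel:
    def __init__(self):
        # Runs once per process. Under gunicorn --preload it runs once in the
        # master, and forked workers share the loaded model copy-on-write.
        with open("model.pkl", "rb") as f:
            self.model = pickle.load(f)

    def predict(self, X, features_names=None):
        # Standard Seldon-style predict signature: X is an array of features.
        return self.model.predict(X)
```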

http://docs.gunicorn.org/en/stable/settings.html#max-requests
--max-requests and --max-requests-jitter restart each worker after N + random(M) requests. gunicorn/flask appears to have an issue where memory grows as requests are processed. We could not figure out where the memory leak was happening, so we opted to recycle workers instead. A hypothetical gunicorn config combining these settings is sketched below.
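For illustration, the settings discussed in this issue can also be expressed in a gunicorn.conf.py; the numbers here are made up for the example, not recommendations:

```python
# Hypothetical gunicorn.conf.py combining the settings discussed above.
workers = 4
preload_app = True          # equivalent of the --preload flag
max_requests = 1000         # recycle a worker after roughly 1000 requests
max_requests_jitter = 100   # add random(0..100) so workers do not all restart at once
```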

kparaju changed the title from "Allow --preload, --max-requests and --max-requests jitter parameters for python wrapper" to "Allow --max-requests and --max-requests jitter parameters for python wrapper" on Oct 4, 2019
ukclivecox commented

I think the issue is that, if the model is a TensorFlow model, it is not so easy to share the graph between running Gunicorn workers, so the current code calls load in the worker's own thread.
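To illustrate the per-worker loading described here, a rough sketch using the TF 1.x-era API follows; the checkpoint path and tensor names ("input:0", "output:0") are placeholders, and this is not the actual seldon-core wrapper code:

```python
# Rough sketch of lazy, per-worker loading: keep __init__ light so nothing
# TensorFlow-specific is created before gunicorn forks its workers, and build
# the graph on first use inside each worker process.
class MyTFModel:
    def __init__(self):
        self.session = None  # nothing heavy before fork

    def _load(self):
        import tensorflow as tf
        graph = tf.Graph()
        with graph.as_default():
            saver = tf.train.import_meta_graph("model.ckpt.meta")
            self.session = tf.Session(graph=graph)
            saver.restore(self.session, "model.ckpt")

    def predict(self, X, features_names=None):
        # The first request in each worker triggers the load, so every worker
        # ends up with its own graph and session.
        if self.session is None:
            self._load()
        return self.session.run("output:0", feed_dict={"input:0": X})
```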

ukclivecox added this to the 0.5.x milestone Oct 8, 2019
kparaju commented Oct 8, 2019

FYI, I'll put a PR out for this today.

ukclivecox commented Oct 8, 2019 via email
