-
If I want to update my model while keeping my service alive (a hot update), does BentoML support it?
-
Hi @MrRace. Model hot update is not supported in BentoML and is currently not a priority for us. We do plan to build some base APIs that let advanced users implement their own dynamic model loading/unloading strategies, but that won't be the recommended way of doing things in BentoML. For most users, we recommend existing blue/green deployment tools to achieve zero-downtime rollout of a new model, so that DevOps can treat ML model-serving workloads the same way other micro-services get deployed.
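In practice the blue/green switch is handled by deployment tooling (e.g. a Kubernetes rolling update or a load balancer flipping traffic between two running services), not by BentoML itself. The core idea can be sketched in miniature: keep two model versions loaded side by side and atomically flip which one serves traffic, so requests never hit a half-updated model. The `BlueGreenRouter` class below is a hypothetical illustration, not a BentoML API.

```python
import threading


class BlueGreenRouter:
    """Minimal blue/green sketch (hypothetical, not part of BentoML).

    Both model versions stay loaded; promote() atomically flips the
    slot that serves traffic, giving a zero-downtime switchover.
    """

    def __init__(self, blue_model, green_model):
        self._models = {"blue": blue_model, "green": green_model}
        self._live = "blue"            # slot currently serving traffic
        self._lock = threading.Lock()  # guards reads/writes of _live

    def predict(self, x):
        # Read the live slot under the lock so a concurrent promote()
        # is observed atomically; the old model keeps serving until then.
        with self._lock:
            model = self._models[self._live]
        return model(x)

    def promote(self):
        # Flip traffic to the idle slot once its model is warmed up.
        with self._lock:
            self._live = "green" if self._live == "blue" else "blue"


# Usage: requests go to v1 until the new version is promoted.
router = BlueGreenRouter(blue_model=lambda x: f"v1:{x}",
                         green_model=lambda x: f"v2:{x}")
print(router.predict("req"))  # v1:req
router.promote()
print(router.predict("req"))  # v2:req
```

Real blue/green tooling applies the same pattern one level up: the "slots" are whole service deployments, and the flip happens at the load balancer or Kubernetes Service selector.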
-
@parano Could you recommend some blue/green deployment tools? Thanks!