Motivation
The latest release of Microsoft's Phi-3 Vision model (4.2B parameters, 128k context) looks promising in both performance and resource efficiency, since it has only 4.2B parameters. It would be a great feature if the lmdeploy inference server supported it.
Related resources
https://huggingface.co/microsoft/Phi-3-vision-128k-instruct/tree/main
https://azure.microsoft.com/en-us/blog/new-models-added-to-the-phi-3-family-available-on-microsoft-azure/
Additional context
I tried running the model via the lmdeploy Docker inference server: I installed the required additional packages and launched the model. The model loads and runs, but when trying to run inference through the APIs, we get either an empty response or an internal server error.
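
For reference, here is a minimal sketch of the kind of OpenAI-compatible request that produced the empty response / internal server error. The port, model name, and image URL below are placeholders assumed for illustration, not the exact reproduction setup:

```python
# Minimal reproduction sketch (assumptions: the lmdeploy api_server is running
# locally and was started with the Phi-3-vision checkpoint; the port, model
# name, and image URL are placeholders, not confirmed values).
import requests

url = "http://localhost:23333/v1/chat/completions"
payload = {
    "model": "microsoft/Phi-3-vision-128k-instruct",
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    # Placeholder image URL for illustration only.
                    "image_url": {"url": "https://example.com/sample.jpg"},
                },
            ],
        }
    ],
}

response = requests.post(url, json=payload, timeout=120)
# Observed behavior: either HTTP 500 (internal server error) or a 200
# response whose generated text is empty.
print(response.status_code)
print(response.json())
```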