
Fix descriptions of model-serving runtimes #311

Merged 3 commits into opendatahub-io:main on Jun 10, 2024

Conversation

eturner24 (Contributor)

Description

Tighten up intro paragraph with descriptions of model-serving runtimes and clarify inaccurate wording.

How Has This Been Tested?

Ran a local build in Gatsby and confirmed that the changed text appeared in the relevant sections.

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work.

@eturner24 eturner24 marked this pull request as draft June 7, 2024 18:17
@@ -3,11 +3,11 @@
[id="adding-a-custom-model-serving-runtime-for-the-single-model-serving-platform_{context}"]
= Adding a custom model-serving runtime for the single-model serving platform
ifdef::upstream[]
A model-serving runtime adds support for a specified set of model frameworks (that is, formats). You have the option of using the link:{odhdocshome}/serving-models/#about-the-single-model-serving-platform_serving-large-models[pre-installed runtimes] included with {productname-short} or adding your own, custom runtimes. This is useful in instances where the pre-installed runtimes don't meet your needs. For example, you might find that the TGIS runtime does not support a particular model format that is supported by link:https://huggingface.co/docs/text-generation-inference/supported_models[Hugging Face Text Generation Inference (TGI)^]. In this case, you can create a custom runtime to add support for the model.
A model-serving runtime adds support for a specified set of model frameworks and the model formats supported by those frameworks. You can use the link:{odhdocshome}/serving-models/#about-the-single-model-serving-platform_serving-large-models[pre-installed runtimes] included with {productname-short}. You can also add your own custom runtimes if the default runtimes do not meet your needs. For example, if the TGIS runtime does not support a model format that is supported by link:https://huggingface.co/docs/text-generation-inference/supported_models[Hugging Face Text Generation Inference (TGI)^], you can create a custom runtime to add support for the model.
Suggest the following minor tweak to improve readability:

"You can use the link:{odhdocshome}/serving-models/#about-the-single-model-serving-platform_serving-large-models[pre-installed runtimes] that are included with {productname-short}"

endif::[]

ifdef::self-managed,cloud-service[]
A model-serving runtime adds support for a specified set of model frameworks (that is, formats). You have the option of using the link:{rhoaidocshome}{default-format-url}/serving_models/serving-large-models_serving-large-models#about-the-single-model-serving-platform_serving-large-models[pre-installed runtimes] included with {productname-short} or adding your own, custom runtimes. This is useful in instances where the pre-installed runtimes don't meet your needs. For example, you might find that the TGIS runtime does not support a particular model format that is supported by link:https://huggingface.co/docs/text-generation-inference/supported_models[Hugging Face Text Generation Inference (TGI)^]. In this case, you can create a custom runtime to add support for the model.
A model-serving runtime adds support for a specified set of model frameworks and the model formats supported by those frameworks. You can use the link:{rhoaidocshome}{default-format-url}/serving_models/serving-large-models_serving-large-models#about-the-single-model-serving-platform_serving-large-models[pre-installed runtimes] included with {productname-short}. You can also add your own custom runtimes if the default runtimes do not meet your needs. For example, if the TGIS runtime does not support a model format that is supported by link:https://huggingface.co/docs/text-generation-inference/supported_models[Hugging Face Text Generation Inference (TGI)^], you can create a custom runtime to add support for the model.
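For context on what the documented procedure produces: on the single-model serving platform, a custom runtime is typically defined as a KServe `ServingRuntime` resource that declares the model formats it supports and the serving container to run. A minimal sketch is shown below; the runtime name, model format, and container image are hypothetical placeholders, not values from this PR:

```yaml
# Sketch of a custom ServingRuntime resource (illustrative values only)
apiVersion: serving.kserve.io/v1alpha1
kind: ServingRuntime
metadata:
  name: my-custom-runtime          # hypothetical runtime name
spec:
  supportedModelFormats:
    - name: my-model-format        # hypothetical format the runtime adds support for
      version: "1"
      autoSelect: true
  containers:
    - name: kserve-container
      image: quay.io/example/custom-runtime:latest   # hypothetical serving image
      ports:
        - containerPort: 8080
          protocol: TCP
```

The `supportedModelFormats` list is what ties the runtime to the frameworks and formats described in the paragraph above: models whose format matches an entry there can be deployed with this runtime.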
@jbyrne-redhat jbyrne-redhat left a comment


Updates look really good, @eturner24. The previous inaccuracy is fixed, and the text is tighter overall. You even found and fixed a stray contraction :)

One minor tweak to suggest, but PR approved nonetheless.

@eturner24 eturner24 marked this pull request as ready for review June 10, 2024 15:24
@jbyrne-redhat (Contributor)

Looks good to me, @eturner24. Ready to merge.

@eturner24 eturner24 merged commit 08bb592 into opendatahub-io:main Jun 10, 2024