Update docs/samples/pipelines/ (kubeflow#2122)
* Update docs/samples/pipelines/
Because KFServing was updated to KServe, the model-app-ui does not display models deployed with the current sample code.

Signed-off-by: georgetree <basssofu@gmail.com>

* fix Lint error

Signed-off-by: georgetree <basssofu@gmail.com>
georgetree authored Apr 1, 2022
1 parent ffa250a commit 13f4207
Showing 4 changed files with 51 additions and 45 deletions.
16 changes: 8 additions & 8 deletions docs/samples/pipelines/README.md
@@ -1,23 +1,23 @@
-# Deploy to KFServing from [Kubeflow Pipelines](https://www.kubeflow.org/docs/pipelines/overview/pipelines-overview/)
+# Deploy to KServe from [Kubeflow Pipelines](https://www.kubeflow.org/docs/pipelines/overview/pipelines-overview/)

-## Kubeflow Pipelines KFServing component
+## Kubeflow Pipelines KServe component

-The following examples illustrate how to use the Kubeflow Pipelines component for KFServing using the v1beta1 API.
-These assume your cluster has a KFServing version >= v0.5.0.
+The following examples illustrate how to use the Kubeflow Pipelines component for KServe using the v1beta1 API.
+These assume your cluster has a KServe version >= v0.7.0.

 * Deploy a [custom model](./sample-custom-model.py).
 * Deploy a [TensorFlow model](./sample-tf-pipeline.py). There is also [a notebook](./kfs-pipeline.ipynb) which illustrates this.

-Additional usage instructions can be found in the component [README](https://github.com/kubeflow/pipelines/blob/master/components/kubeflow/kfserving/README.md).
-To dive into the source behind the KFServing Kubeflow Pipelines Component, please look into the YAML for the [KFServing Component](https://github.com/kubeflow/pipelines/blob/master/components/kubeflow/kfserving/component.yaml) and the [source code](https://github.com/kubeflow/pipelines/blob/master/components/kubeflow/kfserving/src/kfservingdeployer.py).
+Additional usage instructions can be found in the component [README](https://github.com/kubeflow/pipelines/blob/master/components/kserve/README.md).
+To dive into the source behind the KServe Kubeflow Pipelines Component, please look into the YAML for the [KServe Component](https://github.com/kubeflow/pipelines/blob/master/components/kserve/component.yaml) and the [source code](https://github.com/kubeflow/pipelines/blob/master/components/kserve/src/kservedeployer.py).


 **Note**: For those still using an older version of KFServing less than v0.5.0, an older version of the KFServing Pipelines component must be used
 as demonstrated in [this notebook](./kfs-pipeline-v1alpha2.ipynb). The source code for this version of the component can be found [here](https://github.com/kubeflow/pipelines/tree/65bed9b6d1d676ef2d541a970d3edc0aee12400d/components/kubeflow/kfserving).


-## End to end pipeline example using KFServing
+## End to end pipeline example using KServe

-Deploy a sample [MNIST model end to end using Kubeflow Pipelines with Tekton](https://github.com/kubeflow/kfp-tekton/tree/master/samples/e2e-mnist). The [notebook](https://github.com/kubeflow/kfp-tekton/blob/master/samples/e2e-mnist/mnist.ipynb) demonstrates how to compile and execute an End to End Machine Learning workflow that uses Katib, TFJob, KFServing, and Tekton pipeline. This pipeline contains 5 steps, it finds the best hyperparameter using Katib, creates PVC for storing models, processes the hyperparameter results, distributedly trains the model on TFJob with the best hyperparameter using more iterations, and finally serves the model using KFServing. You can visit [this medium blog](https://medium.com/@liuhgxa/an-end-to-end-use-case-by-kubeflow-b2f72b0b587) for more details on this pipeline.
+Deploy a sample [MNIST model end to end using Kubeflow Pipelines with Tekton](https://github.com/kubeflow/kfp-tekton/tree/master/samples/e2e-mnist). The [notebook](https://github.com/kubeflow/kfp-tekton/blob/master/samples/e2e-mnist/mnist.ipynb) demonstrates how to compile and execute an End to End Machine Learning workflow that uses Katib, TFJob, KServe, and Tekton pipeline. This pipeline contains 5 steps, it finds the best hyperparameter using Katib, creates PVC for storing models, processes the hyperparameter results, distributedly trains the model on TFJob with the best hyperparameter using more iterations, and finally serves the model using KServe. You can visit [this medium blog](https://medium.com/@liuhgxa/an-end-to-end-use-case-by-kubeflow-b2f72b0b587) for more details on this pipeline.

 ![kfserving-mnist-pipeline](images/kfserving-mnist-pipeline.png)
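For quick reference, below is a minimal, self-contained sketch of the usage pattern the updated README points to: load the KServe component from its new `components/kserve` path and call it from a pipeline. It mirrors the samples changed in this commit; the output archive name is an illustrative assumption, not part of the samples.

```python
# Minimal sketch (KFP SDK v1) of deploying a TensorFlow model with the KServe
# component, mirroring sample-tf-pipeline.py below. The archive name
# 'kserve-tf-sample.tar.gz' is an illustrative assumption.
import kfp.dsl as dsl
from kfp import compiler, components

# Load the KServe component from its new location under components/kserve.
kserve_op = components.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/'
    'master/components/kserve/component.yaml')


@dsl.pipeline(name='KServe pipeline', description='A pipeline for KServe.')
def kserve_pipeline(action='apply',
                    model_name='tensorflow-sample',
                    model_uri='gs://kfserving-samples/models/tensorflow/flowers',
                    namespace='anonymous',
                    framework='tensorflow'):
    # Creates (or updates, with action='apply') the InferenceService.
    kserve_op(action=action,
              model_name=model_name,
              model_uri=model_uri,
              namespace=namespace,
              framework=framework)


if __name__ == '__main__':
    # Compile to an archive that can be uploaded to or run on Kubeflow Pipelines.
    compiler.Compiler().compile(kserve_pipeline, 'kserve-tf-sample.tar.gz')
```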
34 changes: 18 additions & 16 deletions docs/samples/pipelines/kfs-pipeline.ipynb
@@ -4,9 +4,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# KFServing Pipeline samples\n",
"# KServe Pipeline samples\n",
"\n",
"This notebook assumes your cluster has KFServing >= v0.5.0 installed which supports the v1beta1 API."
"This notebook assumes your cluster has KServe >= v0.7.0 installed which supports the v1beta1 API."
]
},
{
@@ -40,7 +40,7 @@
"# Note: Add the KubeFlow Pipeline endpoint below if the client is not running on the same cluster.\n",
"# Example: kfp.Client('http://192.168.1.27:31380/pipeline')\n",
"client = kfp.Client()\n",
"EXPERIMENT_NAME = 'KFServing Experiments'\n",
"EXPERIMENT_NAME = 'KServe Experiments'\n",
"experiment = client.create_experiment(name=EXPERIMENT_NAME, namespace='anonymous')"
]
},
@@ -57,27 +57,28 @@
"metadata": {},
"outputs": [],
"source": [
"kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/components/kubeflow/kfserving/component.yaml')\n",
"# kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/components/kubeflow/kfserving/component.yaml')\n",
"kserve_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/components/kserve/component.yaml')\n",
"\n",
"@dsl.pipeline(\n",
" name='KFServing pipeline',\n",
" description='A pipeline for KFServing.'\n",
" name='KServe pipeline',\n",
" description='A pipeline for KServe.'\n",
")\n",
"def kfservingPipeline(\n",
"def kservePipeline(\n",
" action='apply',\n",
" model_name='tensorflow-sample',\n",
" model_uri='gs://kfserving-samples/models/tensorflow/flowers',\n",
" namespace='anonymous',\n",
" framework='tensorflow'):\n",
"\n",
" kfserving = kfserving_op(action = action,\n",
" kserve = kserve_op(action = action,\n",
" model_name=model_name,\n",
" model_uri=model_uri,\n",
" namespace=namespace,\n",
" framework=framework).set_image_pull_policy('Always')\n",
"\n",
"# Compile pipeline\n",
"compiler.Compiler().compile(kfservingPipeline, 'tf-flower.tar.gz')\n",
"compiler.Compiler().compile(kservePipeline, 'tf-flower.tar.gz')\n",
"\n",
"# Execute pipeline\n",
"run = client.run_pipeline(experiment.id, 'tf-flower', 'tf-flower.tar.gz')"
@@ -96,26 +97,27 @@
"metadata": {},
"outputs": [],
"source": [
"kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/components/kubeflow/kfserving/component.yaml')\n",
"# kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/components/kubeflow/kfserving/component.yaml')\n",
"kserve_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/components/kserve/component.yaml')\n",
"\n",
"@dsl.pipeline(\n",
" name='KFServing pipeline',\n",
" description='A pipeline for KFServing.'\n",
" name='KServe pipeline',\n",
" description='A pipeline for KServe.'\n",
")\n",
"def kfservingPipeline(\n",
"def kservePipeline(\n",
" action='apply',\n",
" model_name='max-image-segmenter',\n",
" namespace='anonymous',\n",
" custom_model_spec='{\"name\": \"image-segmenter\", \"image\": \"codait/max-image-segmenter:latest\", \"port\": \"5000\"}'\n",
"):\n",
"\n",
" kfserving = kfserving_op(action=action,\n",
" kserve = kserve_op(action=action,\n",
" model_name=model_name,\n",
" namespace=namespace,\n",
" custom_model_spec=custom_model_spec).set_image_pull_policy('Always')\n",
"\n",
"# Compile pipeline\n",
"compiler.Compiler().compile(kfservingPipeline, 'custom.tar.gz')\n",
"compiler.Compiler().compile(kservePipeline, 'custom.tar.gz')\n",
"\n",
"# Execute pipeline\n",
"run = client.run_pipeline(experiment.id, 'custom-model', 'custom.tar.gz')"
@@ -143,4 +145,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 2
-}
+}
22 changes: 12 additions & 10 deletions docs/samples/pipelines/sample-custom-model.py
@@ -15,25 +15,27 @@
 import kfp.dsl as dsl
 from kfp import components

-kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/'
-'components/kubeflow/kfserving/component.yaml')
+# kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/master/'
+# 'components/kubeflow/kfserving/component.yaml')
+kserve_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/'
+'master/components/kserve/component.yaml')


 @dsl.pipeline(
-name='KFServing pipeline',
-description='A pipeline for KFServing.'
+name='KServe pipeline',
+description='A pipeline for KServe.'
 )
-def kfservingPipeline(
+def kservePipeline(
 action='apply',
 model_name='max-image-segmenter',
 namespace='anonymous',
 custom_model_spec='{"name": "image-segmenter", "image": "codait/max-image-segmenter:latest", "port": "5000"}'
 ):
-kfserving_op(action=action,
-model_name=model_name,
-namespace=namespace,
-custom_model_spec=custom_model_spec)
+kserve_op(action=action,
+model_name=model_name,
+namespace=namespace,
+custom_model_spec=custom_model_spec)


 if __name__ == '__main__':
-compiler.Compiler().compile(kfservingPipeline, __file__ + '.tar.gz')
+compiler.Compiler().compile(kservePipeline, __file__ + '.tar.gz')
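As a side note (not part of this commit), the inline `custom_model_spec` JSON string can also be built from a Python dict and serialized with `json.dumps`, which some may find easier to read and extend; a small sketch:

```python
# Sketch only: build the same custom_model_spec as a dict, then serialize it.
# The field values are taken from sample-custom-model.py above.
import json

custom_model_spec = json.dumps({
    'name': 'image-segmenter',
    'image': 'codait/max-image-segmenter:latest',
    'port': '5000',
})
# Pass the resulting string as the custom_model_spec pipeline parameter,
# e.g. kserve_op(..., custom_model_spec=custom_model_spec).
```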
24 changes: 13 additions & 11 deletions docs/samples/pipelines/sample-tf-pipeline.py
@@ -15,26 +15,28 @@
 import kfp.dsl as dsl
 from kfp import components

-kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/'
-'master/components/kubeflow/kfserving/component.yaml')
+# kfserving_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/'
+# 'master/components/kubeflow/kfserving/component.yaml')
+kserve_op = components.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/'
+'master/components/kserve/component.yaml')


 @dsl.pipeline(
-name='KFServing pipeline',
-description='A pipeline for KFServing.'
+name='KServe pipeline',
+description='A pipeline for KServe.'
 )
-def kfservingPipeline(
+def kservePipeline(
 action='apply',
 model_name='tensorflow-sample',
 model_uri='gs://kfserving-samples/models/tensorflow/flowers',
 namespace='anonymous',
 framework='tensorflow'):
-kfserving_op(action=action,
-model_name=model_name,
-model_uri=model_uri,
-namespace=namespace,
-framework=framework)
+kserve_op(action=action,
+model_name=model_name,
+model_uri=model_uri,
+namespace=namespace,
+framework=framework)


 if __name__ == '__main__':
-compiler.Compiler().compile(kfservingPipeline, __file__ + '.tar.gz')
+compiler.Compiler().compile(kservePipeline, __file__ + '.tar.gz')
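To run either sample after compiling it, the client calls used in the notebook above can be reused; a minimal sketch follows, where the Pipelines endpoint, experiment name, and namespace are illustrative assumptions.

```python
# Sketch only: submit the archive produced by sample-tf-pipeline.py.
# Endpoint, experiment name, and namespace mirror the notebook above and are
# assumptions, not requirements.
import kfp

# Add the Pipelines endpoint if the client is not running on the same cluster,
# e.g. kfp.Client('http://192.168.1.27:31380/pipeline').
client = kfp.Client()
experiment = client.create_experiment(name='KServe Experiments', namespace='anonymous')

# Upload and execute the compiled pipeline package.
run = client.run_pipeline(experiment.id, 'tf-flower', 'sample-tf-pipeline.py.tar.gz')
```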
