diff --git a/workshop/docs/modules/ROOT/pages/automating-workflows-with-pipelines.adoc b/workshop/docs/modules/ROOT/pages/automating-workflows-with-pipelines.adoc
index ff82f69..2dd010f 100644
--- a/workshop/docs/modules/ROOT/pages/automating-workflows-with-pipelines.adoc
+++ b/workshop/docs/modules/ROOT/pages/automating-workflows-with-pipelines.adoc
@@ -99,7 +99,7 @@ The secret is named `aws-connection-my-storage`.
 
 [NOTE]
 ====
-If you named your data connection something other than `My Storage`, you can obtain the secret name in the {prodname-short} dashboard by hovering over the resource information icon *?* in the *Data Connections* tab.
+If you named your data connection something other than `My Storage`, you can obtain the secret name in the {productname-short} dashboard by hovering over the resource information icon *?* in the *Data Connections* tab.
 
 image::pipelines/dsp-dc-secret-name.png[My Storage Secret Name, 400]
 ====
diff --git a/workshop/docs/modules/ROOT/pages/enabling-data-science-pipelines.adoc b/workshop/docs/modules/ROOT/pages/enabling-data-science-pipelines.adoc
index 34d8934..319d445 100644
--- a/workshop/docs/modules/ROOT/pages/enabling-data-science-pipelines.adoc
+++ b/workshop/docs/modules/ROOT/pages/enabling-data-science-pipelines.adoc
@@ -7,7 +7,9 @@ In this section, you prepare your {deliverable} environment so that you can use
 
 .Procedure
 
-. In the {prodname-short} dashboard, navigate to *Data Science Pipelines* -> *Pipelines*.
+. In the {productname-short} dashboard, click *Data Science Projects* and then select *Fraud Detection*.
+
+. Navigate to the *Pipelines* section.
 
 . Click *Configure pipeline server*.
 +
diff --git a/workshop/docs/modules/ROOT/pages/navigating-to-the-dashboard.adoc b/workshop/docs/modules/ROOT/pages/navigating-to-the-dashboard.adoc
index b4f14cf..10df49b 100644
--- a/workshop/docs/modules/ROOT/pages/navigating-to-the-dashboard.adoc
+++ b/workshop/docs/modules/ROOT/pages/navigating-to-the-dashboard.adoc
@@ -5,7 +5,7 @@
 
 . After you log in to the OpenShift console, access the {productname-short} dashboard by clicking the application launcher icon on the header.
 +
-image::projects/ocp-console-ds-tile.png[{prodname-short} dashboard link]
+image::projects/ocp-console-ds-tile.png[{productname-short} dashboard link]
 
 . When prompted, log in to the {productname-short} dashboard by using your OpenShift credentials. {productname-short} uses the same credentials as OpenShift for the dashboard, notebooks, and all other components.
 +
diff --git a/workshop/docs/modules/ROOT/pages/running-a-pipeline-generated-from-python-code.adoc b/workshop/docs/modules/ROOT/pages/running-a-pipeline-generated-from-python-code.adoc
index 362c4e6..293b2bb 100644
--- a/workshop/docs/modules/ROOT/pages/running-a-pipeline-generated-from-python-code.adoc
+++ b/workshop/docs/modules/ROOT/pages/running-a-pipeline-generated-from-python-code.adoc
@@ -17,7 +17,7 @@
 image::pipelines/wb-download.png[Download Pipeline YAML]
 
 . Upload the `7_get_data_train_upload.yaml` file to {productname-short}.
 
-.. In the {prodname-short} dashboard, navigate to your data science project page and then click *Import pipeline*.
+.. In the {productname-short} dashboard, navigate to your data science project page and then click *Import pipeline*.
 +
 image::pipelines/dsp-pipeline-import.png[]
diff --git a/workshop/docs/modules/ROOT/pages/testing-the-model-api.adoc b/workshop/docs/modules/ROOT/pages/testing-the-model-api.adoc
index 6bfc733..fee4ca8 100644
--- a/workshop/docs/modules/ROOT/pages/testing-the-model-api.adoc
+++ b/workshop/docs/modules/ROOT/pages/testing-the-model-api.adoc
@@ -9,7 +9,7 @@ You can communicate directly with this internal service in the same way that an
 
 .Procedure
 
-. In the {prodname-short} dashboard, navigate to the project details page and scroll down to the *Models and model servers* section.
+. In the {productname-short} dashboard, navigate to the project details page and scroll down to the *Models and model servers* section.
 
 . Take note of the model's resource name (API endpoint name) and the internal service's grpcURL and restURL. You need this information when you test the model API.
 +
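
The final hunk above asks the reader to note the model's resource name and the internal service's restURL before testing the model API. As background for this change: a REST call to such a model server typically follows the KServe v2 inference protocol (`/v2/models/<name>/infer`). The sketch below only constructs the request payload; the URL, model name, input tensor name (`dense_input`), and feature values are illustrative placeholders, not values taken from this workshop.

```python
import json


def build_rest_request(features):
    """Build a KServe v2-style REST inference payload.

    The field names ("inputs", "name", "shape", "datatype", "data")
    follow the v2 inference protocol; the input name and shape here
    are assumptions for illustration only.
    """
    return {
        "inputs": [
            {
                "name": "dense_input",
                "shape": [1, len(features)],
                "datatype": "FP32",
                "data": features,
            }
        ]
    }


# Hypothetical values -- substitute the restURL and resource name you
# noted in the *Models and model servers* section.
rest_url = "http://modelmesh-serving.my-project:8008"
model_name = "fraud"
endpoint = f"{rest_url}/v2/models/{model_name}/infer"

payload = build_rest_request([0.31, 1.29, -0.22, 1.0, 0.0])
print(json.dumps(payload))

# Sending the request would look roughly like:
#   import requests
#   response = requests.post(endpoint, json=payload)
#   print(response.json())
```

Keeping payload construction separate from the HTTP call makes the shape of the request easy to verify before the internal service is reachable.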