
UPSTREAM: <carry>: Add ilab kfp pipeline to the DSPO repo #120

Open

VaniHaripriya wants to merge 1 commit into master from RHOAIENG-16507
Conversation

VaniHaripriya

Description of your changes:
Resolves #RHOAIENG-16507

  • Added the iLab pipeline.yaml so it can be provisioned within a DSPA.
  • Modified the Dockerfile to copy the pipeline.yaml file into the apiserver container.

Checklist:


openshift-ci bot commented Dec 13, 2024

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign dsp-developers for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 2 commits between 7c3b0204f09934458ccd7ce65deba74fb0146286...2e8d4cf6d5e4c7089a261ab86560f1a313550325

UPSTREAM commit 2e8d4cf has invalid summary Updated Dockerfile.

UPSTREAM commits are validated against the following regular expression:
  ^UPSTREAM: (revert: )?(([\w.-]+/[\w-.-]+)?: )?(\d+:|<carry>:|<drop>:)

UPSTREAM commit summaries should look like:

  UPSTREAM: <PR number|carry|drop>: description

UPSTREAM commits which revert previous UPSTREAM commits should look like:

  UPSTREAM: revert: <normal upstream format>

Examples of valid summaries:

  UPSTREAM: 12345: A kube fix
  UPSTREAM: <carry>: A carried kube change
  UPSTREAM: <drop>: A dropped kube change
  UPSTREAM: revert: 12345: A kube revert



UPSTREAM commit 7f991b2 has invalid summary Add ilab kfp pipeline to the DSPO repo.

UPSTREAM commits are validated against the following regular expression:
  ^UPSTREAM: (revert: )?(([\w.-]+/[\w-.-]+)?: )?(\d+:|<carry>:|<drop>:)

UPSTREAM commit summaries should look like:

  UPSTREAM: <PR number|carry|drop>: description

UPSTREAM commits which revert previous UPSTREAM commits should look like:

  UPSTREAM: revert: <normal upstream format>

Examples of valid summaries:

  UPSTREAM: 12345: A kube fix
  UPSTREAM: <carry>: A carried kube change
  UPSTREAM: <drop>: A dropped kube change
  UPSTREAM: revert: 12345: A kube revert
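For reference, an invalid summary on the most recent commit can usually be fixed by rewording it to match the expected format and force-pushing the branch. A minimal sketch, assuming the work is on the RHOAIENG-16507 branch used by this PR:

# Reword the latest commit to the expected UPSTREAM format, then update the PR branch.
git commit --amend -m 'UPSTREAM: <carry>: Add ilab kfp pipeline to the DSPO repo'
git push --force-with-lease origin RHOAIENG-16507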


@VaniHaripriya force-pushed the RHOAIENG-16507 branch 2 times, most recently from f92a1dc to a699d91 on December 13, 2024 at 18:19
@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 1 commit between 7c3b0204f09934458ccd7ce65deba74fb0146286...a699d9194f25db88549c9ff105a6b077bcb45a74

UPSTREAM commit a699d91 has invalid summary Add ilab kfp pipeline to the DSPO repo.

UPSTREAM commits are validated against the following regular expression:
  ^UPSTREAM: (revert: )?(([\w.-]+/[\w-.-]+)?: )?(\d+:|<carry>:|<drop>:)

UPSTREAM commit summaries should look like:

  UPSTREAM: <PR number|carry|drop>: description

UPSTREAM commits which revert previous UPSTREAM commits should look like:

  UPSTREAM: revert: <normal upstream format>

Examples of valid summaries:

  UPSTREAM: 12345: A kube fix
  UPSTREAM: <carry>: A carried kube change
  UPSTREAM: <drop>: A dropped kube change
  UPSTREAM: revert: 12345: A kube revert


@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 1 commit between 7c3b0204f09934458ccd7ce65deba74fb0146286...e068d349fde740872d0ad7a77955e61cf3d80f4e

@dsp-developers

A set of new images have been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-120
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-120
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-120
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-120
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-120
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-120

@dsp-developers

An OCP cluster where you are logged in as cluster admin is required.

The Data Science Pipelines team recommends testing this using the Data Science Pipelines Operator. Check here for more information on using the DSPO.

To use and deploy a DSP stack with these images (assuming the DSPO is deployed), first save the following YAML to a file named dspa.pr-120.yaml:

apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: pr-120
spec:
  dspVersion: v2
  apiServer:
    image: "quay.io/opendatahub/ds-pipelines-api-server:pr-120"
    argoDriverImage: "quay.io/opendatahub/ds-pipelines-driver:pr-120"
    argoLauncherImage: "quay.io/opendatahub/ds-pipelines-launcher:pr-120"
  persistenceAgent:
    image: "quay.io/opendatahub/ds-pipelines-persistenceagent:pr-120"
  scheduledWorkflow:
    image: "quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-120"
  mlmd:  
    deploy: true  # Optional component
    grpc:
      image: "quay.io/opendatahub/mlmd-grpc-server:latest"
    envoy:
      image: "registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2"
  mlpipelineUI:
    deploy: true  # Optional component 
    image: "quay.io/opendatahub/ds-pipelines-frontend:pr-120"
  objectStorage:
    minio:
      deploy: true
      image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'

Then run the following:

cd $(mktemp -d)
git clone git@github.com:opendatahub-io/data-science-pipelines.git
cd data-science-pipelines/
git fetch origin pull/120/head
git checkout -b pullrequest 2e8d4cf6d5e4c7089a261ab86560f1a313550325
oc apply -f dspa.pr-120.yaml

More instructions here on how to deploy and test a Data Science Pipelines Application.
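Once the DSPA is applied, it can help to confirm that the custom resource and its component pods come up before testing the pipeline. A quick sanity check, as a sketch (assumes the current oc project is the namespace where dspa.pr-120.yaml was applied, and that component pod names include the DSPA name):

# Check the DataSciencePipelinesApplication resource and watch its pods start.
oc get datasciencepipelinesapplication pr-120
oc get pods -w | grep pr-120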

@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images have been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-120
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-120
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-120
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-120
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-120
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-120

1 similar comment

@@ -44,6 +44,7 @@ WORKDIR /bin

COPY --from=builder /opt/app-root/src/backend/src/apiserver/config/ /config
COPY --from=builder /bin/apiserver /bin/apiserver
COPY --from=builder /opt/app-root/src/backend/src/apiserver/ilab_pipeline/pipeline.yaml /pipelines/
Member

If we're going with this method of including the pipeline, we should also include the iris pipeline similarly. This may be out of scope for the ticket, but let's open a story to do so before we forget.

@gmfrasca (Member) commented Dec 13, 2024

Following https://github.com/opendatahub-io/data-science-pipelines/pull/120/files#r1884437176, we could just COPY the entire directory rather than pinning individual files, so something like:

COPY --from=builder /opt/app-root/src/backend/src/apiserver/sample_pipelines /pipelines

This would be a bit easier on upkeep, as we can just add new pipelines and they would automatically get picked up, rather than explicitly adding new layers every time.

@@ -0,0 +1,2326 @@
# PIPELINE DEFINITION
Member

Let's put this file in backend/src/apiserver/sample_pipelines, or something similar, so we can expand on this later.

Also, we should name this file something a little clearer than pipeline.yaml. Perhaps instructlab.yaml?

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 2 commits between 7c3b0204f09934458ccd7ce65deba74fb0146286...a30d8e063db7835ac4c3723c3ba89570c6cf24a0

UPSTREAM commit a30d8e0 has invalid summary Update name.

UPSTREAM commits are validated against the following regular expression:
  ^UPSTREAM: (revert: )?(([\w.-]+/[\w-.-]+)?: )?(\d+:|<carry>:|<drop>:)

UPSTREAM commit summaries should look like:

  UPSTREAM: <PR number|carry|drop>: description

UPSTREAM commits which revert previous UPSTREAM commits should look like:

  UPSTREAM: revert: <normal upstream format>

Examples of valid summaries:

  UPSTREAM: 12345: A kube fix
  UPSTREAM: <carry>: A carried kube change
  UPSTREAM: <drop>: A dropped kube change
  UPSTREAM: revert: 12345: A kube revert


@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 1 commit between 7c3b0204f09934458ccd7ce65deba74fb0146286...8f895caca535665d22debd61a9a8772ecfe20b70

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 2 commits between 7c3b0204f09934458ccd7ce65deba74fb0146286...cfd29b3bab92a3172211c0f4652e5b846870feac

UPSTREAM commit cfd29b3 has invalid summary Update pipelines.

UPSTREAM commits are validated against the following regular expression:
  ^UPSTREAM: (revert: )?(([\w.-]+/[\w-.-]+)?: )?(\d+:|<carry>:|<drop>:)

UPSTREAM commit summaries should look like:

  UPSTREAM: <PR number|carry|drop>: description

UPSTREAM commits which revert previous UPSTREAM commits should look like:

  UPSTREAM: revert: <normal upstream format>

Examples of valid summaries:

  UPSTREAM: 12345: A kube fix
  UPSTREAM: <carry>: A carried kube change
  UPSTREAM: <drop>: A dropped kube change
  UPSTREAM: revert: 12345: A kube revert


Signed-off-by: VaniHaripriya <vmudadla@redhat.com>
@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images have been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-120
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-120
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-120
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-120
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-120
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-120

@dsp-developers

Commit Checker results:

**NOTE**: These are the results of the commit checker scans.
If these commits are not from upstream kfp, please ensure
they adhere to the commit checker formatting.
commitchecker version unknown
Validating 1 commit between 7c3b0204f09934458ccd7ce65deba74fb0146286...90b978f752965a675774c6b4c030a935dc8dd32d

@dsp-developers

Change to PR detected. A new PR build was completed.
A set of new images have been built to help with testing out this PR:
API Server: quay.io/opendatahub/ds-pipelines-api-server:pr-120
DSP DRIVER: quay.io/opendatahub/ds-pipelines-driver:pr-120
DSP LAUNCHER: quay.io/opendatahub/ds-pipelines-launcher:pr-120
Persistence Agent: quay.io/opendatahub/ds-pipelines-persistenceagent:pr-120
Scheduled Workflow Manager: quay.io/opendatahub/ds-pipelines-scheduledworkflow:pr-120
MLMD Server: quay.io/opendatahub/mlmd-grpc-server:latest
MLMD Envoy Proxy: registry.redhat.io/openshift-service-mesh/proxyv2-rhel8:2.3.9-2
UI: quay.io/opendatahub/ds-pipelines-frontend:pr-120

2 similar comments

@hbelmiro left a comment

/lgtm
