Merge pull request opendatahub-io#210 from opendatahub-io/v1.0.x
Add V1.0.0 to stable
openshift-merge-robot authored Jul 19, 2023
2 parents b8901e0 + ad8715f commit 6bf32d5
Showing 68 changed files with 2,539 additions and 484 deletions.
76 changes: 73 additions & 3 deletions README.md
@@ -30,7 +30,9 @@ To get started you will first need to satisfy the following pre-requisites:
## Pre-requisites
1. An OpenShift cluster that is 4.9 or higher.
2. You will need to be logged into this cluster as [cluster admin] via [oc client].
- 3. The OpenShift Cluster must have OpenShift Pipelines 1.7.2 or higher installed. Instructions [here][OCP Pipelines Operator].
+ 3. The OpenShift Cluster must have OpenShift Pipelines 1.8 or higher installed. We recommend channel pipelines-1.8
+    on OCP 4.10, and pipelines-1.9 or pipelines-1.10 for OCP 4.11, 4.12, and 4.13.
+    Instructions [here][OCP Pipelines Operator].
4. Based on installation type you will need one of the following:
1. For Standalone method: You will need to have [Kustomize] version 4.5+ installed
2. For ODH method: The Open Data Hub operator needs to be installed. You can install it via [OperatorHub][installodh].
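These prerequisites can be spot-checked from the command line. The following is a hypothetical sketch, assuming `oc` and `kustomize` are on your `PATH` and that the Pipelines operator was installed into the default `openshift-operators` namespace:

```shell
# 1. Server version should report OpenShift 4.9 or higher.
oc version

# 2. Confirm the logged-in user has cluster-admin rights.
oc auth can-i '*' '*' --all-namespaces

# 3. Confirm the OpenShift Pipelines operator (1.8+) is installed.
#    The namespace below assumes a default operator installation.
oc get csv -n openshift-operators | grep -i pipelines

# 4. For the standalone method only: Kustomize 4.5+.
kustomize version
```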
@@ -195,15 +197,83 @@ When a `DataSciencePipelinesApplication` is deployed, the following components are deployed:
* APIServer
* Persistence Agent
* Scheduled Workflow controller
- * MLPipelines UI

If specified in the `DataSciencePipelinesApplication` resource, the following components may also be additionally deployed:
* MariaDB
* Minio
+ * MLPipelines UI
+ * MLMD (ML Metadata)

To understand how these components interact with each other please refer to the upstream
[Kubeflow Pipelines Architectural Overview] documentation.

## Deploying Optional Components

### MariaDB
To deploy a standalone MariaDB metadata database (rather than providing your own database connection details), simply add a `mariaDB` item under `spec.database` in your DSPA definition with a `deploy` key set to `true`. All other fields are defaultable/optional; see the [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details. Note that this component is mutually exclusive with externally-provided databases (defined by `spec.database.externalDB`).

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
name: sample
spec:
...
database:
mariaDB: # mutually exclusive with externalDB
deploy: true
```
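For contrast, the externally-provided database route replaces `mariaDB` with `externalDB`. The sketch below is hypothetical: the exact `externalDB` sub-fields shown are assumptions, so treat the [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) as the authoritative schema.

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
spec:
  ...
  database:
    externalDB:  # mutually exclusive with mariaDB
      host: mysql.example.com   # placeholder connection details
      port: "3306"
      username: mlpipeline
      pipelineDBName: mlpipeline
      passwordSecret:
        name: db-credentials    # hypothetical Secret name
        key: password
```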

### Minio
To deploy a Minio Object Storage component (rather than providing your own object storage connection details), simply add a `minio` item under `spec.objectStorage` in your DSPA definition with an `image` key set to a valid Minio container image. All other fields are defaultable/optional; see the [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details. Note that this component is mutually exclusive with externally-provided object stores (defined by `spec.objectStorage.externalStorage`).

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
name: sample
spec:
...
objectStorage:
minio: # mutually exclusive with externalStorage
deploy: true
# Image field is required
image: 'quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance'
```
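Similarly, the externally-provided object store route replaces `minio` with `externalStorage`. This is a hypothetical sketch: the `externalStorage` sub-field names are assumptions; consult the [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for the authoritative schema.

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
  name: sample
spec:
  ...
  objectStorage:
    externalStorage:  # mutually exclusive with minio
      host: s3.example.com      # placeholder connection details
      bucket: dsp-artifacts
      scheme: https
      s3CredentialsSecret:
        secretName: storage-credentials  # hypothetical Secret name
        accessKey: AWS_ACCESS_KEY_ID
        secretKey: AWS_SECRET_ACCESS_KEY
```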

### ML Pipelines UI
To deploy the standalone DS Pipelines UI component, simply add a `spec.mlpipelineUI` item to your DSPA with an `image` key set to a valid UI container image. All other fields are defaultable/optional; see the [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details.

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
name: sample
spec:
...
mlpipelineUI:
deploy: true
# Image field is required
image: 'quay.io/opendatahub/odh-ml-pipelines-frontend-container:beta-ui'
```


### ML Metadata
To deploy the ML Metadata artifact lineage/metadata component, simply add a `spec.mlmd` item to your DSPA with `deploy` set to `true`. All other fields are defaultable/optional; see the [All Fields DSPA Example](./config/samples/dspa_all_fields.yaml) for full details.

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
kind: DataSciencePipelinesApplication
metadata:
name: sample
spec:
...
mlmd:
deploy: true
```


# Using a DataSciencePipelinesApplication

When a `DataSciencePipelinesApplication` is deployed, use the MLPipelines UI endpoint to interact with DSP, either via a GUI or via API calls.
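The UI endpoint is exposed as an OpenShift route. As a hypothetical sketch of how to look it up (the route name below assumes a DSPA named `sample` in the current namespace; adjust both to your deployment):

```shell
# Print the hostname of the pipelines UI route; the route name
# "ds-pipeline-ui-sample" is an assumption based on a DSPA named "sample".
oc get route ds-pipeline-ui-sample -o jsonpath='{.spec.host}'
```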
@@ -260,7 +330,7 @@ see these logs after clicking this step and navigating to "Logs."

## Using the API

- > Note: By default we use kfp-tekton v1.4 for this section so you will need [kfp-tekton v1.4.x sdk installed][kfp-tekton]
+ > Note: By default we use kfp-tekton 1.5.x for this section so you will need [kfp-tekton v1.5.x sdk installed][kfp-tekton]
> in your environment
In the previous step we submitted a generated `Pipeline` yaml via the GUI. We can also submit the `Pipeline` code
32 changes: 32 additions & 0 deletions api/v1alpha1/dspipeline_types.go
@@ -35,6 +35,9 @@ type DSPASpec struct {
*MlPipelineUI `json:"mlpipelineUI"`
// +kubebuilder:validation:Required
*ObjectStorage `json:"objectStorage"`
// +kubebuilder:validation:Optional
// +kubebuilder:default:={deploy: false}
*MLMD `json:"mlmd"`
}

type APIServer struct {
@@ -164,6 +167,35 @@ type Minio struct {
Image string `json:"image"`
}

type MLMD struct {
// +kubebuilder:default:=false
// +kubebuilder:validation:Optional
Deploy bool `json:"deploy"`
*Envoy `json:"envoy,omitempty"`
*GRPC `json:"grpc,omitempty"`
*Writer `json:"writer,omitempty"`
}

type Envoy struct {
Resources *ResourceRequirements `json:"resources,omitempty"`
// +kubebuilder:validation:Required
Image string `json:"image"`
}

type GRPC struct {
Resources *ResourceRequirements `json:"resources,omitempty"`
// +kubebuilder:validation:Required
Image string `json:"image"`
// +kubebuilder:validation:Optional
Port string `json:"port"`
}

type Writer struct {
Resources *ResourceRequirements `json:"resources,omitempty"`
// +kubebuilder:validation:Required
Image string `json:"image"`
}

// ResourceRequirements structures compute resource requirements.
// Replaces ResourceRequirements from corev1 which also includes optional storage field.
// We handle storage field separately, and should not include it as a subfield for Resources.
95 changes: 95 additions & 0 deletions api/v1alpha1/zz_generated.deepcopy.go

Some generated files are not rendered by default.

21 changes: 21 additions & 0 deletions config/base/kustomization.yaml
@@ -71,6 +71,27 @@ vars:
apiVersion: v1
fieldref:
fieldpath: data.IMAGES_MARIADB
- name: IMAGES_MLMDENVOY
objref:
kind: ConfigMap
name: dspo-parameters
apiVersion: v1
fieldref:
fieldpath: data.IMAGES_MLMDENVOY
- name: IMAGES_MLMDGRPC
objref:
kind: ConfigMap
name: dspo-parameters
apiVersion: v1
fieldref:
fieldpath: data.IMAGES_MLMDGRPC
- name: IMAGES_MLMDWRITER
objref:
kind: ConfigMap
name: dspo-parameters
apiVersion: v1
fieldref:
fieldpath: data.IMAGES_MLMDWRITER
- name: IMAGES_DSPO
objref:
kind: ConfigMap
13 changes: 8 additions & 5 deletions config/base/params.env
@@ -1,9 +1,12 @@
- IMAGES_APISERVER=quay.io/opendatahub/ds-pipelines-api-server:main-0e8a011
- IMAGES_ARTIFACT=quay.io/opendatahub/ds-pipelines-artifact-manager:main-0e8a011
- IMAGES_PERSISTENTAGENT=quay.io/opendatahub/ds-pipelines-persistenceagent:main-0e8a011
- IMAGES_SCHEDULEDWORKFLOW=quay.io/opendatahub/ds-pipelines-scheduledworkflow:main-0e8a011
+ IMAGES_APISERVER=quay.io/opendatahub/ds-pipelines-api-server:v1.0.0
+ IMAGES_ARTIFACT=quay.io/opendatahub/ds-pipelines-artifact-manager:v1.0.0
+ IMAGES_PERSISTENTAGENT=quay.io/opendatahub/ds-pipelines-persistenceagent:v1.0.0
+ IMAGES_SCHEDULEDWORKFLOW=quay.io/opendatahub/ds-pipelines-scheduledworkflow:v1.0.0
  IMAGES_CACHE=registry.access.redhat.com/ubi8/ubi-minimal
  IMAGES_MOVERESULTSIMAGE=registry.access.redhat.com/ubi8/ubi-micro
  IMAGES_MARIADB=registry.redhat.io/rhel8/mariadb-103:1-188
- IMAGES_DSPO=quay.io/opendatahub/data-science-pipelines-operator:main
+ IMAGES_DSPO=quay.io/opendatahub/data-science-pipelines-operator:v1.0.0
  IMAGES_OAUTHPROXY=registry.redhat.io/openshift4/ose-oauth-proxy:v4.12.0
+ IMAGES_MLMDENVOY=quay.io/opendatahub/ds-pipelines-metadata-envoy:v1.0.0
+ IMAGES_MLMDGRPC=quay.io/opendatahub/ds-pipelines-metadata-grpc:v1.0.0
+ IMAGES_MLMDWRITER=quay.io/opendatahub/ds-pipelines-metadata-writer:v1.0.0
3 changes: 3 additions & 0 deletions config/configmaps/files/config.yaml
@@ -7,3 +7,6 @@ Images:
Cache: $(IMAGES_CACHE)
MoveResultsImage: $(IMAGES_MOVERESULTSIMAGE)
MariaDB: $(IMAGES_MARIADB)
MlmdEnvoy: $(IMAGES_MLMDENVOY)
MlmdGRPC: $(IMAGES_MLMDGRPC)
MlmdWriter: $(IMAGES_MLMDWRITER)