
DS-9797- clarified conditions for importing external artifacts to the… #405

Merged

Conversation

chtyler
Contributor

@chtyler chtyler commented Aug 12, 2024

… s3 bucket for pipelines

Description

I have updated the data science pipelines documentation to include information on importing artifacts that were generated outside of the context of the pipeline. When you import external artifacts to your S3-compatible storage bucket, you can only import them to the S3 bucket that is specified in the pipeline configuration.

How Has This Been Tested?

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that the changes work.

@chtyler chtyler force-pushed the DS-9797-artifact-import-condition branch from 886cc53 to b131089 Compare August 13, 2024 13:08
@@ -68,6 +68,8 @@ The *Configure pipeline server* dialog appears.
[IMPORTANT]
====
If you specify incorrect data connection settings, you cannot update these settings on the same pipeline server. Therefore, you must delete the pipeline server and configure another one.

If you plan to import artifacts that were not generated by tasks in your pipeline, or you plan to use external artifacts that were not generated by any pipeline, you can only import these artifacts to the S3-compatible object storage bucket that you define in the *Bucket* field.


If you wish to use an existing artifact that was not generated by a task in the current pipeline, you can use a dsl.importer component to load the artifact from its URI.

Contributor Author

I can update this to mention the dsl importer component. However, we also need to mention that you can only import these artifacts to the S3-compatible object storage bucket that you define in the Bucket field in the pipeline configuration, as that was the point of the documentation request.


I see

If you wish to use an existing artifact that was not generated by a task in the current pipeline, you can use a [dsl.importer](https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.importer) component to load the artifact from its URI. You can only import these artifacts to the S3-compatible object storage bucket that you define in the *Bucket* field.
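For reference, a minimal sketch of what this looks like in the KFP v2 SDK is shown below. The bucket name, object key, and pipeline name are placeholders for illustration only; the artifact URI must point into the same S3-compatible bucket that is configured in the pipeline server's *Bucket* field.

```python
from kfp import dsl
from kfp.dsl import Dataset


@dsl.pipeline(name="external-artifact-example")  # hypothetical pipeline name
def my_pipeline():
    # Load an artifact that was produced outside of this pipeline.
    # The URI below is a placeholder; it must reference the S3-compatible
    # bucket defined in the pipeline server configuration (*Bucket* field).
    imported_dataset = dsl.importer(
        artifact_uri="s3://my-pipeline-bucket/external/dataset.csv",
        artifact_class=Dataset,
        reimport=False,
    )
    # Downstream tasks can then consume imported_dataset.output
    # as a regular pipeline artifact.
```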

@chtyler chtyler force-pushed the DS-9797-artifact-import-condition branch from b131089 to 366def5 Compare August 19, 2024 17:29
@chtyler chtyler merged commit 8ccc740 into opendatahub-io:main Aug 20, 2024