DS-9797- clarified conditions for importing external artifacts to the… #405
Conversation
Force-pushed 886cc53 to b131089
@@ -68,6 +68,8 @@ The *Configure pipeline server* dialog appears.
[IMPORTANT]
====
If you specify incorrect data connection settings, you cannot update these settings on the same pipeline server. Therefore, you must delete the pipeline server and configure another one.

If you plan to import artifacts that were not generated by tasks in your pipeline, or you plan to use external artifacts that were not generated by any pipeline, you can only import these artifacts to the S3-compatible object storage bucket that you define in the *Bucket* field.
If you wish to use an existing artifact that was not generated by a task in the current pipeline, you can use a `dsl.importer` component to load the artifact from its URI.
I can update this to mention the `dsl.importer` component. However, we also need to mention that you can only import these artifacts to the S3-compatible object storage bucket that you define in the *Bucket* field in the pipeline configuration, because that was the point of the documentation request.
I see
If you wish to use an existing artifact that was not generated by a task in the current pipeline, you can use a [dsl.importer](https://kubeflow-pipelines.readthedocs.io/en/latest/source/dsl.html#kfp.dsl.importer) component to load the artifact from its URI. You can only import these artifacts to the S3-compatible object storage bucket that you define in the *Bucket* field.
Force-pushed b131089 to 366def5
… s3 bucket for pipelines
Description
I have updated the data science pipelines documentation to include information about importing artifacts that were generated outside of the pipeline. When you import external artifacts to your S3-compatible storage bucket, you can import them only to the bucket that is specified in the pipeline configuration.
How Has This Been Tested?
Merge criteria: