Components - De-hardcoded the UI metadata file path in GCP components #2697

Merged
components/gcp/bigquery/query/component.yaml (2 additions, 2 deletions)

@@ -68,10 +68,10 @@ implementation:
       --table_id, {inputValue: table_id},
       --dataset_location, {inputValue: dataset_location},
       --output_gcs_path, {inputValue: output_gcs_path},
-      --job_config, {inputValue: job_config}
+      --job_config, {inputValue: job_config},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       output_gcs_path: /tmp/kfp/output/bigquery/query-output-path.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
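In the spec above, {outputPath: MLPipeline UI metadata} makes the pipeline backend generate an output path at run time and pass it to the container via --ui-metadata-path, replacing the hardcoded /mlpipeline-ui-metadata.json entry under fileOutputs. On the consumer side, a hypothetical helper (not part of this PR) could resolve that path from the KFP_UI_METADATA_PATH environment variable that this PR's launcher change sets, falling back to the old hardcoded location:

```python
import json
import os

# Hypothetical helper (not part of this PR): write the UI metadata file to the
# path supplied via KFP_UI_METADATA_PATH (set by the launcher from
# --ui-metadata-path), falling back to the previously hardcoded location.
def write_ui_metadata(metadata, default_path='/mlpipeline-ui-metadata.json'):
    path = os.environ.get('KFP_UI_METADATA_PATH', default_path)
    dirname = os.path.dirname(path)
    if dirname:
        os.makedirs(dirname, exist_ok=True)
    with open(path, 'w') as f:
        json.dump(metadata, f)
    return path
```

With KFP_UI_METADATA_PATH unset this reproduces the old behavior; with it set, the file lands wherever the backend asked for it.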
@@ -15,6 +15,7 @@
 import argparse
 import fire
 import importlib
+import os
 import sys
 import logging
 from .launcher import launch
@@ -26,8 +27,14 @@ def main():
         description='Launch a python module or file.')
     parser.add_argument('file_or_module', type=str,
         help='Either a python file path or a module name.')
+    parser.add_argument('--ui-metadata-path', type=str,
+        help='Path for the file where the mlpipeline-ui-metadata.json data should be written.')
     parser.add_argument('args', nargs=argparse.REMAINDER)
     args = parser.parse_args()
+
+    if args.ui_metadata_path:
+        os.environ['KFP_UI_METADATA_PATH'] = args.ui_metadata_path
+
     launch(args.file_or_module, args.args)
 
 if __name__ == '__main__':
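The launcher change above can be exercised on its own. The sketch below reproduces the new argument handling with a stand-in for launch (the real .launcher module is not shown in this diff); the flag is passed before the positional argument so that argparse.REMAINDER does not swallow it:

```python
import argparse
import os

def launch(file_or_module, args):
    # Stand-in for the real launch(), which dispatches to a python file or
    # module. Here we just record what would be launched.
    return (file_or_module, args)

def main(argv=None):
    parser = argparse.ArgumentParser(
        description='Launch a python module or file.')
    parser.add_argument('file_or_module', type=str,
        help='Either a python file path or a module name.')
    parser.add_argument('--ui-metadata-path', type=str,
        help='Path for the file where the mlpipeline-ui-metadata.json data '
             'should be written.')
    parser.add_argument('args', nargs=argparse.REMAINDER)
    args = parser.parse_args(argv)

    # The de-hardcoding step: surface the backend-generated path to
    # downstream code through an environment variable.
    if args.ui_metadata_path:
        os.environ['KFP_UI_METADATA_PATH'] = args.ui_metadata_path

    return launch(args.file_or_module, args.args)

# Everything after the positional, including unknown flags, is forwarded
# verbatim to the launched module.
result = main(['--ui-metadata-path', '/tmp/ui-metadata.json',
               'my.module', '--foo', 'bar'])
```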
components/gcp/dataflow/launch_python/component.yaml (2 additions, 2 deletions)

@@ -61,10 +61,10 @@ implementation:
       --staging_dir, {inputValue: staging_dir},
       --requirements_file_path, {inputValue: requirements_file_path},
       --args, {inputValue: args},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataflow/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataflow/launch_template/component.yaml (1 addition, 1 deletion)

@@ -73,9 +73,9 @@ implementation:
       --validate_only, {inputValue: validate_only},
       --staging_dir, {inputValue: staging_dir},
       --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataflow/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/create_cluster/component.yaml (2 additions, 2 deletions)

@@ -81,10 +81,10 @@ implementation:
       --config_bucket, {inputValue: config_bucket},
       --image_version, {inputValue: image_version},
       --cluster, {inputValue: cluster},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       cluster_name: /tmp/kfp/output/dataproc/cluster_name.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/submit_hadoop_job/component.yaml (2 additions, 2 deletions)

@@ -91,10 +91,10 @@ implementation:
       --args, {inputValue: args},
       --hadoop_job, {inputValue: hadoop_job},
       --job, {inputValue: job},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataproc/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/submit_hive_job/component.yaml (2 additions, 2 deletions)

@@ -86,10 +86,10 @@ implementation:
       --script_variables, {inputValue: script_variables},
       --hive_job, {inputValue: hive_job},
       --job, {inputValue: job},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataproc/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/submit_pig_job/component.yaml (2 additions, 2 deletions)

@@ -86,10 +86,10 @@ implementation:
       --script_variables, {inputValue: script_variables},
       --pig_job, {inputValue: pig_job},
       --job, {inputValue: job},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataproc/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/submit_pyspark_job/component.yaml (2 additions, 2 deletions)

@@ -79,10 +79,10 @@ implementation:
       --args, {inputValue: args},
       --pyspark_job, {inputValue: pyspark_job},
       --job, {inputValue: job},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataproc/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/submit_spark_job/component.yaml (2 additions, 2 deletions)

@@ -87,10 +87,10 @@ implementation:
       --args, {inputValue: args},
       --spark_job, {inputValue: spark_job},
       --job, {inputValue: job},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataproc/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/dataproc/submit_sparksql_job/component.yaml (2 additions, 2 deletions)

@@ -86,10 +86,10 @@ implementation:
       --script_variables, {inputValue: script_variables},
       --sparksql_job, {inputValue: sparksql_job},
       --job, {inputValue: job},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/dataproc/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/ml_engine/batch_predict/component.yaml (2 additions, 2 deletions)

@@ -81,10 +81,10 @@ implementation:
       --output_data_format, {inputValue: output_data_format},
       --prediction_input, {inputValue: prediction_input},
       --job_id_prefix, {inputValue: job_id_prefix},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/ml_engine/job_id.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/ml_engine/deploy/component.yaml (1 addition, 1 deletion)

@@ -109,11 +109,11 @@ implementation:
       --replace_existing_version, {inputValue: replace_existing_version},
       --set_default, {inputValue: set_default},
       --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       model_uri: /tmp/kfp/output/ml_engine/model_uri.txt
       model_name: /tmp/kfp/output/ml_engine/model_name.txt
       version_name: /tmp/kfp/output/ml_engine/version_name.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json
components/gcp/ml_engine/train/component.yaml (2 additions, 2 deletions)

@@ -118,11 +118,11 @@ implementation:
       --worker_image_uri, {inputValue: worker_image_uri},
       --training_input, {inputValue: training_input},
       --job_id_prefix, {inputValue: job_id_prefix},
-      --wait_interval, {inputValue: wait_interval}
+      --wait_interval, {inputValue: wait_interval},
+      --ui-metadata-path, {outputPath: MLPipeline UI metadata},
     ]
     env:
       KFP_POD_NAME: "{{pod.name}}"
     fileOutputs:
       job_id: /tmp/kfp/output/ml_engine/job_id.txt
       job_dir: /tmp/kfp/output/ml_engine/job_dir.txt
-      MLPipeline UI metadata: /mlpipeline-ui-metadata.json