Component Release 727c48c690c081b505c1f0979d11930bf1ef07c0 #1280

Merged: 1 commit, May 3, 2019
2 changes: 1 addition & 1 deletion component_sdk/python/setup.py
@@ -15,7 +15,7 @@
from setuptools import setup

PACKAGE_NAME = "kfp-component"
VERSION = '0.1'
VERSION = '0.1.19'
REQUIRES = []
with open('requirements.txt') as f:
REQUIRES = f.readlines()
2 changes: 1 addition & 1 deletion components/dataflow/predict/component.yaml
@@ -15,7 +15,7 @@ outputs:
- {name: Predictions dir, type: GCSPath, description: 'GCS or local directory.'} #Will contain prediction_results-* and schema.json files; TODO: Split outputs and replace dir with single file # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, /ml/predict.py]
args: [
--data, {inputValue: Data file pattern},
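Every `component.yaml` in this release pins its container image to the same immutable commit SHA rather than a floating tag. A minimal sketch of a release-time check for that pin (the helper and sample text below are illustrative, not part of this commit):

```python
import re

# The commit SHA this release pins all component images to.
RELEASE_SHA = "727c48c690c081b505c1f0979d11930bf1ef07c0"

# A trimmed component.yaml fragment, as updated by this PR.
component_yaml = """\
implementation:
  container:
    image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:727c48c690c081b505c1f0979d11930bf1ef07c0
    command: [python2, /ml/predict.py]
"""

def image_tag(yaml_text: str) -> str:
    """Return the 40-hex-digit tag of the first `image:` line found."""
    match = re.search(r"image:\s*\S+:([0-9a-f]{40})", yaml_text)
    if match is None:
        raise ValueError("no image line pinned to a 40-hex-digit tag")
    return match.group(1)

print(image_tag(component_yaml) == RELEASE_SHA)  # True
```

Running such a check over all `components/**/component.yaml` files would catch a file missed by the release script.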
2 changes: 1 addition & 1 deletion components/dataflow/tfdv/component.yaml
@@ -18,7 +18,7 @@ outputs:
- {name: Validation result, type: String, description: Indicates whether anomalies were detected or not.}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, /ml/validate.py]
args: [
--csv-data-for-inference, {inputValue: Inference data},
2 changes: 1 addition & 1 deletion components/dataflow/tfma/component.yaml
@@ -17,7 +17,7 @@ outputs:
- {name: Analysis results dir, type: GCSPath, description: GCS or local directory where the analysis results should were written.} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, /ml/model_analysis.py]
args: [
--model, {inputValue: Model},
2 changes: 1 addition & 1 deletion components/dataflow/tft/component.yaml
@@ -12,7 +12,7 @@ outputs:
- {name: Transformed data dir, type: GCSPath} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, /ml/transform.py]
args: [
--train, {inputValue: Training data file pattern},
2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/component.yaml
@@ -54,7 +54,7 @@ outputs:
type: GCSPath
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.bigquery, query,
--query, {inputValue: query},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/component.yaml
@@ -48,7 +48,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataflow, launch_python,
--python_file_path, {inputValue: python_file_path},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/component.yaml
@@ -58,7 +58,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataflow, launch_template,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/component.yaml
@@ -65,7 +65,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, create_cluster,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/delete_cluster/component.yaml
@@ -33,7 +33,7 @@ inputs:
type: Integer
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, delete_cluster,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hadoop_job/component.yaml
@@ -75,7 +75,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, submit_hadoop_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hive_job/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, submit_hive_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pig_job/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, submit_pig_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pyspark_job/component.yaml
@@ -64,7 +64,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, submit_pyspark_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_spark_job/component.yaml
@@ -71,7 +71,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, submit_spark_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_sparksql_job/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.dataproc, submit_sparksql_job,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/ml_engine/batch_predict/component.yaml
@@ -64,7 +64,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.ml_engine, batch_predict,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/ml_engine/deploy/component.yaml
@@ -90,7 +90,7 @@ outputs:
type: String
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.ml_engine, deploy,
--model_uri, {inputValue: model_uri},
2 changes: 1 addition & 1 deletion components/gcp/ml_engine/train/component.yaml
@@ -98,7 +98,7 @@ outputs:
type: GCSPath
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-gcp:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-gcp:727c48c690c081b505c1f0979d11930bf1ef07c0
args: [
kfp_component.google.ml_engine, train,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/kubeflow/deployer/component.yaml
@@ -11,7 +11,7 @@ inputs:
# - {name: Endppoint URI, type: Serving URI, description: 'URI of the deployed prediction service..'}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [/bin/deploy.sh]
args: [
--model-export-path, {inputValue: Model dir},
2 changes: 1 addition & 1 deletion components/kubeflow/dnntrainer/component.yaml
@@ -15,7 +15,7 @@ outputs:
- {name: Training output dir, type: GCSPath, description: 'GCS or local directory.'} # type: {GCSPath: {path_type: Directory}}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, -m, trainer.task]
args: [
--transformed-data-dir, {inputValue: Transformed data dir},
2 changes: 1 addition & 1 deletion components/kubeflow/launcher/kubeflow_tfjob_launcher_op.py
@@ -17,7 +17,7 @@
def kubeflow_tfjob_launcher_op(container_image, command, number_of_workers: int, number_of_parameter_servers: int, tfjob_timeout_minutes: int, output_dir=None, step_name='TFJob-launcher'):
return dsl.ContainerOp(
name = step_name,
image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf:b0147bdbed9f25212408e0468a475289e80e0406',
image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments = [
'--workers', number_of_workers,
'--pss', number_of_parameter_servers,
6 changes: 3 additions & 3 deletions components/kubeflow/launcher/src/train.template.yaml
@@ -26,7 +26,7 @@ spec:
spec:
containers:
- name: tensorflow
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:727c48c690c081b505c1f0979d11930bf1ef07c0
command:
- python
- -m
@@ -49,7 +49,7 @@ spec:
spec:
containers:
- name: tensorflow
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:727c48c690c081b505c1f0979d11930bf1ef07c0
command:
- python
- -m
@@ -72,7 +72,7 @@ spec:
spec:
containers:
- name: tensorflow
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:727c48c690c081b505c1f0979d11930bf1ef07c0
command:
- python
- -m
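The `train.template.yaml` file above gets the same image retagged three times (chief, worker, and PS replicas), which is what a mechanical release script would do. A sketch of such a bulk retag, assuming the old and new SHAs from this PR (the helper itself is hypothetical, not part of the repo):

```python
import re

OLD_SHA = "b0147bdbed9f25212408e0468a475289e80e0406"
NEW_SHA = "727c48c690c081b505c1f0979d11930bf1ef07c0"

def retag(text: str) -> str:
    """Replace OLD_SHA with NEW_SHA wherever it tags a gcr.io/ml-pipeline image."""
    return re.sub(
        r"(gcr\.io/ml-pipeline/[\w./-]+:)" + OLD_SHA,
        r"\g<1>" + NEW_SHA,
        text,
    )

template = (
    "image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:" + OLD_SHA + "\n"
    "image: gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:" + OLD_SHA + "\n"
)
print(retag(template).count(NEW_SHA))  # 2
```

Anchoring the pattern on the `gcr.io/ml-pipeline/` prefix keeps the substitution from touching unrelated occurrences of the old SHA.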
2 changes: 1 addition & 1 deletion components/local/confusion_matrix/component.yaml
@@ -9,7 +9,7 @@ inputs:
# - {name: Metrics, type: Metrics}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, /ml/confusion_matrix.py]
args: [
--predictions, {inputValue: Predictions},
2 changes: 1 addition & 1 deletion components/local/roc/component.yaml
@@ -11,7 +11,7 @@ inputs:
# - {name: Metrics, type: Metrics}
implementation:
container:
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:b0147bdbed9f25212408e0468a475289e80e0406
image: gcr.io/ml-pipeline/ml-pipeline-local-confusion-matrix:727c48c690c081b505c1f0979d11930bf1ef07c0
command: [python2, /ml/roc.py]
args: [
--predictions, {inputValue: Predictions dir},
2 changes: 1 addition & 1 deletion samples/kubeflow-tf/kubeflow-training-classification.py
@@ -68,7 +68,7 @@ def kubeflow_training(output, project,
).apply(gcp.use_gcp_secret('user-gcp-sa'))

if use_gpu:
training.image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:b0147bdbed9f25212408e0468a475289e80e0406',
training.image = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:727c48c690c081b505c1f0979d11930bf1ef07c0',
training.set_gpu_limit(1)

prediction = dataflow_tf_predict_op(
@@ -44,13 +44,13 @@
"EVAL_DATA = 'gs://ml-pipeline-playground/tfx/taxi-cab-classification/eval.csv'\n",
"HIDDEN_LAYER_SIZE = '1500'\n",
"STEPS = 3000\n",
"DATAFLOW_TFDV_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"DATAFLOW_TFT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"DATAFLOW_TFMA_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"DATAFLOW_TF_PREDICT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"KUBEFLOW_TF_TRAINER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"KUBEFLOW_TF_TRAINER_GPU_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:b0147bdbed9f25212408e0468a475289e80e0406'\n",
"DATAFLOW_TFDV_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfdv:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"DATAFLOW_TFT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tft:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"DATAFLOW_TFMA_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tfma:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"DATAFLOW_TF_PREDICT_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-dataflow-tf-predict:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"KUBEFLOW_TF_TRAINER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"KUBEFLOW_TF_TRAINER_GPU_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-tf-trainer-gpu:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"KUBEFLOW_DEPLOYER_IMAGE = 'gcr.io/ml-pipeline/ml-pipeline-kubeflow-deployer:727c48c690c081b505c1f0979d11930bf1ef07c0'\n",
"DEPLOYER_MODEL = 'notebook_tfx_taxi'\n",
"DEPLOYER_VERSION_DEV = 'dev'\n",
"DEPLOYER_VERSION_PROD = 'prod'\n",
12 changes: 6 additions & 6 deletions samples/xgboost-spark/xgboost-training-cm.py
@@ -36,7 +36,7 @@ def dataproc_create_cluster_op(
):
return dsl.ContainerOp(
name='Dataproc - Create cluster',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-create-cluster:b0147bdbed9f25212408e0468a475289e80e0406',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-create-cluster:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments=[
'--project', project,
'--region', region,
@@ -56,7 +56,7 @@ def dataproc_delete_cluster_op(
):
return dsl.ContainerOp(
name='Dataproc - Delete cluster',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-delete-cluster:b0147bdbed9f25212408e0468a475289e80e0406',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-delete-cluster:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments=[
'--project', project,
'--region', region,
@@ -76,7 +76,7 @@ def dataproc_analyze_op(
):
return dsl.ContainerOp(
name='Dataproc - Analyze',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-analyze:b0147bdbed9f25212408e0468a475289e80e0406',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-analyze:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments=[
'--project', project,
'--region', region,
@@ -103,7 +103,7 @@ def dataproc_transform_op(
):
return dsl.ContainerOp(
name='Dataproc - Transform',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-transform:b0147bdbed9f25212408e0468a475289e80e0406',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-transform:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments=[
'--project', project,
'--region', region,
@@ -141,7 +141,7 @@ def dataproc_train_op(

return dsl.ContainerOp(
name='Dataproc - Train XGBoost model',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-train:b0147bdbed9f25212408e0468a475289e80e0406',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-train:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments=[
'--project', project,
'--region', region,
@@ -174,7 +174,7 @@ def dataproc_predict_op(
):
return dsl.ContainerOp(
name='Dataproc - Predict with XGBoost model',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-predict:b0147bdbed9f25212408e0468a475289e80e0406',
image='gcr.io/ml-pipeline/ml-pipeline-dataproc-predict:727c48c690c081b505c1f0979d11930bf1ef07c0',
arguments=[
'--project', project,
'--region', region,
2 changes: 1 addition & 1 deletion sdk/python/setup.py
@@ -15,7 +15,7 @@
from setuptools import setup

NAME = 'kfp'
VERSION = '0.1.18'
VERSION = '0.1.19'

REQUIRES = [
'urllib3>=1.15,<1.25', #Fixing the version conflict with the "requests" package
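Note that this release moves both `sdk/python/setup.py` (`kfp`, 0.1.18 → 0.1.19) and `component_sdk/python/setup.py` (`kfp-component`, 0.1 → 0.1.19) to the same version string. A sketch of a check that keeps the two in sync (the helper and inline sample texts are illustrative, not part of this PR):

```python
import re

def extract_version(setup_py: str) -> str:
    """Pull the VERSION = '...' string out of a setup.py source text."""
    match = re.search(r"VERSION\s*=\s*'([^']+)'", setup_py)
    if match is None:
        raise ValueError("VERSION assignment not found")
    return match.group(1)

# Trimmed stand-ins for the two setup.py files after this PR.
sdk_setup = "NAME = 'kfp'\nVERSION = '0.1.19'\n"
component_sdk_setup = "PACKAGE_NAME = 'kfp-component'\nVERSION = '0.1.19'\n"

print(extract_version(sdk_setup) == extract_version(component_sdk_setup))  # True
```

A release script could run this over the real files and fail the build on a mismatch instead of relying on a manual bump.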