Commit

Refactor to match new samples folder structure (#1741)
carolynwang authored and k8s-ci-robot committed Aug 6, 2019
1 parent dd59bc2 commit 351f456
Showing 7 changed files with 15 additions and 15 deletions.
2 changes: 1 addition & 1 deletion components/aws/sagemaker/ground_truth/README.md
@@ -53,7 +53,7 @@ active_learning_model_arn | ARN of the resulting active learning model

# Samples
## Used in a pipeline with workteam creation and training
-Mini image classification demo: [Demo](https://github.com/kubeflow/pipelines/blob/master/samples/aws-samples/ground_truth_pipeline_demo/)
+Mini image classification demo: [Demo](https://github.com/kubeflow/pipelines/blob/master/samples/contrib/aws-samples/ground_truth_pipeline_demo/)

# References
* [Ground Truth documentation](https://docs.aws.amazon.com/sagemaker/latest/dg/sms.html)
6 changes: 3 additions & 3 deletions components/aws/sagemaker/hyperparameter_tuning/README.md
@@ -64,9 +64,9 @@ training_image | The registry path of the Docker image that contains the trainin

# Samples
## On its own
-K-Means algorithm tuning on MNIST dataset: [pipeline](https://github.com/kubeflow/pipelines/blob/master/samples/aws-samples/mnist-kmeans-sagemaker/kmeans-hpo-pipeline.py)
+K-Means algorithm tuning on MNIST dataset: [pipeline](https://github.com/kubeflow/pipelines/blob/master/samples/contrib/aws-samples/mnist-kmeans-sagemaker/kmeans-hpo-pipeline.py)

-Follow the steps as in the [README](https://github.com/kubeflow/pipelines/blob/master/samples/aws-samples/mnist-kmeans-sagemaker/README.md) with some modification:
+Follow the steps as in the [README](https://github.com/kubeflow/pipelines/blob/master/samples/contrib/aws-samples/mnist-kmeans-sagemaker/README.md) with some modification:
1. Get and store data in S3 buckets
2. Prepare an IAM role with permissions to run SageMaker jobs
3. Add 'aws-secret' to your kubeflow namespace
@@ -78,7 +78,7 @@ dsl-compile --py kmeans-hpo-pipeline.py --output kmeans-hpo-pipeline.tar.gz
6. Once the pipeline completes, you can see the outputs under 'Output parameters' in the HPO component's Input/Output section.
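
A minimal sketch of how step 3's `aws-secret` is attached to a pipeline step and how the `dsl-compile` invocation above maps to the kfp SDK; the container image and command here are illustrative stand-ins, not taken from the sample:

```python
import kfp
from kfp import compiler, dsl
from kfp.aws import use_aws_secret


@dsl.pipeline(
    name='aws-secret-demo',
    description='Sketch: mount SageMaker credentials from the aws-secret Kubernetes secret.'
)
def demo_pipeline():
    # Illustrative step; any op that calls SageMaker needs the AWS credentials
    # stored in the 'aws-secret' secret created in step 3.
    dsl.ContainerOp(
        name='list-hpo-jobs',
        image='amazon/aws-cli',  # placeholder image, not part of the sample
        command=['aws', 'sagemaker', 'list-hyper-parameter-tuning-jobs'],
    ).apply(use_aws_secret('aws-secret', 'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'))


if __name__ == '__main__':
    # Rough Python equivalent of the `dsl-compile` command shown above.
    compiler.Compiler().compile(demo_pipeline, 'demo-pipeline.tar.gz')
```

Either route produces a `.tar.gz` archive that can then be uploaded through the Kubeflow Pipelines UI.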

## Integrated into a pipeline
-MNIST Classification using K-Means pipeline: [Pipeline](https://github.com/kubeflow/pipelines/blob/master/samples/aws-samples/mnist-kmeans-sagemaker/mnist-classification-pipeline.py) | [Steps](https://github.com/kubeflow/pipelines/blob/master/samples/aws-samples/mnist-kmeans-sagemaker/README.md)
+MNIST Classification using K-Means pipeline: [Pipeline](https://github.com/kubeflow/pipelines/blob/master/samples/contrib/aws-samples/mnist-kmeans-sagemaker/mnist-classification-pipeline.py) | [Steps](https://github.com/kubeflow/pipelines/blob/master/samples/contrib/aws-samples/mnist-kmeans-sagemaker/README.md)

# Resources
* [Using Amazon built-in algorithms](https://docs.aws.amazon.com/sagemaker/latest/dg/sagemaker-algo-docker-registry-paths.html)
2 changes: 1 addition & 1 deletion components/aws/sagemaker/workteam/README.md
@@ -38,7 +38,7 @@ workteam_arn | ARN of the workteam

# Samples
## In a pipeline with Ground Truth and training
-Mini image classification: [Demo](https://github.com/kubeflow/pipelines/blob/master/samples/aws-samples/ground_truth_pipeline_demo/)
+Mini image classification: [Demo](https://github.com/kubeflow/pipelines/tree/master/samples/contrib/aws-samples/ground_truth_pipeline_demo)

# References
* [Managing a private workforce](https://docs.aws.amazon.com/sagemaker/latest/dg/sms-workforce-management-private.html)
@@ -13,7 +13,7 @@ Run the following to download `openimgs-annotations.csv`:
```bash
wget https://storage.googleapis.com/openimages/2018_04/test/test-annotations-human-imagelabels-boxable.csv -O openimgs-annotations.csv
```
-Create an S3 bucket and run [this Python script](https://github.com/kubeflow/pipelines/tree/master/samples/aws-samples/ground_truth_pipeline_demo/prep_inputs.py) to get the images and generate `train.manifest`, `validation.manifest`, `class_labels.json`, and `instuctions.template`.
+Create an S3 bucket and run [this Python script](https://github.com/kubeflow/pipelines/tree/master/samples/contrib/aws-samples/ground_truth_pipeline_demo/prep_inputs.py) to get the images and generate `train.manifest`, `validation.manifest`, `class_labels.json`, and `instuctions.template`.
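
A hypothetical `boto3` sketch of the bucket step above; the bucket name and region are placeholders, and `prep_inputs.py` may already cover part of this itself:

```python
import boto3

BUCKET = 'my-ground-truth-demo-bucket'  # placeholder; bucket names must be globally unique
REGION = 'us-west-2'                    # placeholder region

s3 = boto3.client('s3', region_name=REGION)

# Buckets outside us-east-1 need an explicit LocationConstraint.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={'LocationConstraint': REGION},
)

# Upload the files generated by prep_inputs.py so the pipeline can reference them
# (file names spelled exactly as listed above).
for name in ('train.manifest', 'validation.manifest',
             'class_labels.json', 'instuctions.template'):
    s3.upload_file(name, BUCKET, name)
```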


## Amazon Cognito user groups
@@ -5,9 +5,9 @@
from kfp import dsl
from kfp.aws import use_aws_secret

-sagemaker_workteam_op = components.load_component_from_file('../../../components/aws/sagemaker/workteam/component.yaml')
-sagemaker_gt_op = components.load_component_from_file('../../../components/aws/sagemaker/ground_truth/component.yaml')
-sagemaker_train_op = components.load_component_from_file('../../../components/aws/sagemaker/train/component.yaml')
+sagemaker_workteam_op = components.load_component_from_file('../../../../components/aws/sagemaker/workteam/component.yaml')
+sagemaker_gt_op = components.load_component_from_file('../../../../components/aws/sagemaker/ground_truth/component.yaml')
+sagemaker_train_op = components.load_component_from_file('../../../../components/aws/sagemaker/train/component.yaml')

@dsl.pipeline(
name='Ground Truth image classification test pipeline',
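
The extra `../` in each `load_component_from_file` call above reflects the samples moving one directory deeper, from `samples/aws-samples/` to `samples/contrib/aws-samples/`. One way to make such loads independent of both the working directory and future moves is to resolve the path from the script's own location; a sketch only, with a hypothetical `_load_sagemaker_component` helper, not what the sample actually does:

```python
import os

from kfp import components

# Walk up from this script (samples/contrib/aws-samples/<demo>/) to the repo root,
# so a later move of the samples folder only changes this one expression.
_REPO_ROOT = os.path.abspath(
    os.path.join(os.path.dirname(__file__), '..', '..', '..', '..'))


def _load_sagemaker_component(name):
    return components.load_component_from_file(
        os.path.join(_REPO_ROOT, 'components', 'aws', 'sagemaker', name, 'component.yaml'))


sagemaker_workteam_op = _load_sagemaker_component('workteam')
sagemaker_gt_op = _load_sagemaker_component('ground_truth')
sagemaker_train_op = _load_sagemaker_component('train')
```

The checked-in samples keep the plain relative strings shown in the diff, which is why each folder move touches these lines.
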
@@ -5,7 +5,7 @@
from kfp import dsl
from kfp.aws import use_aws_secret

-sagemaker_hpo_op = components.load_component_from_file('../../../components/aws/sagemaker/hyperparameter_tuning/component.yaml')
+sagemaker_hpo_op = components.load_component_from_file('../../../../components/aws/sagemaker/hyperparameter_tuning/component.yaml')

@dsl.pipeline(
name='MNIST HPO test pipeline',
@@ -5,11 +5,11 @@
from kfp import dsl
from kfp.aws import use_aws_secret

-sagemaker_hpo_op = components.load_component_from_file('../../../components/aws/sagemaker/hyperparameter_tuning/component.yaml')
-sagemaker_train_op = components.load_component_from_file('../../../components/aws/sagemaker/train/component.yaml')
-sagemaker_model_op = components.load_component_from_file('../../../components/aws/sagemaker/model/component.yaml')
-sagemaker_deploy_op = components.load_component_from_file('../../../components/aws/sagemaker/deploy/component.yaml')
-sagemaker_batch_transform_op = components.load_component_from_file('../../../components/aws/sagemaker/batch_transform/component.yaml')
+sagemaker_hpo_op = components.load_component_from_file('../../../../components/aws/sagemaker/hyperparameter_tuning/component.yaml')
+sagemaker_train_op = components.load_component_from_file('../../../../components/aws/sagemaker/train/component.yaml')
+sagemaker_model_op = components.load_component_from_file('../../../../components/aws/sagemaker/model/component.yaml')
+sagemaker_deploy_op = components.load_component_from_file('../../../../components/aws/sagemaker/deploy/component.yaml')
+sagemaker_batch_transform_op = components.load_component_from_file('../../../../components/aws/sagemaker/batch_transform/component.yaml')

@dsl.pipeline(
name='MNIST Classification pipeline',
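
The file above loads five SageMaker components (HPO, training, model creation, deployment, batch transform) into one pipeline. A common wiring pattern for such loaded components — one op's output feeding the next op's input, with `use_aws_secret` applied to the steps — is sketched below with two inline stand-in components; the real components' input and output names are not reproduced here, so `model_artifact_url` is a placeholder rather than the actual interface:

```python
from kfp import compiler, components, dsl
from kfp.aws import use_aws_secret

# Stand-in components; the real pipeline loads component.yaml files instead.
train_op = components.load_component_from_text("""
name: Train stand-in
outputs:
- {name: model_artifact_url, type: String}
implementation:
  container:
    image: alpine
    command: [sh, -c, 'echo s3://placeholder/model.tar.gz > /tmp/out']
    fileOutputs:
      model_artifact_url: /tmp/out
""")

deploy_op = components.load_component_from_text("""
name: Deploy stand-in
inputs:
- {name: model_artifact_url, type: String}
implementation:
  container:
    image: alpine
    command: [echo, {inputValue: model_artifact_url}]
""")


@dsl.pipeline(
    name='sagemaker-chaining-sketch',
    description='How one op output feeds the next op input.'
)
def chaining_sketch():
    train = train_op().apply(
        use_aws_secret('aws-secret', 'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'))
    deploy_op(model_artifact_url=train.outputs['model_artifact_url']).apply(
        use_aws_secret('aws-secret', 'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY'))


if __name__ == '__main__':
    compiler.Compiler().compile(chaining_sketch, 'chaining-sketch.tar.gz')
```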
