This repo contains Kubeflow pipeline examples.
- I wrote explanations on my blog:
- iris : https://lsjsj92.tistory.com/581
- titanic : https://lsjsj92.tistory.com/586
- metrics_evaluation_and_check_condtion : https://lsjsj92.tistory.com/589
- iris
  - A simple Kubeflow pipeline example using the iris data
  - The pipeline runs: load data -> train model
  - To run this example:
    1. Build a Docker image for each step (preprocessing, training)
    2. Define the Kubeflow pipeline (pipeline.py)
    3. Start Kubeflow
    4. Upload the pipeline and start an experiment
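The load data -> train flow above can be sketched as two plain-Python functions, one per Docker image, that hand data off through a file (mirroring how the pipeline's containers pass data between steps). This is only an illustration: the function names, file layout, and the nearest-centroid "training" are stand-ins, not the repo's actual components.

```python
import csv
import json

def preprocess(raw_rows, out_path):
    # Step 1 (preprocessing container): load the raw iris rows
    # and write a clean CSV for the training step to consume.
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["sepal_len", "sepal_wid", "petal_len", "petal_wid", "label"])
        writer.writerows(raw_rows)

def train(in_path, model_path):
    # Step 2 (training container): read the preprocessed CSV and
    # fit a toy nearest-centroid "model" (per-class feature means).
    sums, counts = {}, {}
    with open(in_path) as f:
        for row in csv.DictReader(f):
            label = row["label"]
            feats = [float(row[k]) for k in
                     ("sepal_len", "sepal_wid", "petal_len", "petal_wid")]
            if label not in sums:
                sums[label] = [0.0] * 4
                counts[label] = 0
            sums[label] = [s + x for s, x in zip(sums[label], feats)]
            counts[label] += 1
    centroids = {label: [s / counts[label] for s in sums[label]]
                 for label in sums}
    # Persist the model artifact so a later step (or S3 upload) can pick it up.
    with open(model_path, "w") as f:
        json.dump(centroids, f)
```

In the real pipeline each function would be the entrypoint of its own image, and the CSV path would live on a volume shared between the two pipeline steps.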
- titanic
  - A simple Kubeflow pipeline example using AWS with the titanic data
  - The pipeline steps, in preprocessing:
    1. Get data from AWS S3
    2. Preprocess the data
    3. Upload the preprocessed data back to AWS S3
  - In train_model:
    4. Get the preprocessed data from S3
    5. Train the model
    6. Upload the model to S3
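The six steps above can be sketched as two functions mirroring the two pipeline stages. This is a hedged sketch, not the repo's code: the bucket/key names and the preprocessing and "training" rules are hypothetical, and the S3 client is passed in so any object with boto3's `download_file`/`upload_file` methods works.

```python
import csv
import json
import os

def preprocess_step(s3, bucket, raw_key, clean_key, workdir):
    # 1. Get the raw data from S3.
    raw_path = os.path.join(workdir, "titanic_raw.csv")
    s3.download_file(bucket, raw_key, raw_path)
    # 2. Preprocess (illustrative rules): drop rows with no Survived
    #    label and fill missing Age with 0.
    clean_path = os.path.join(workdir, "titanic_clean.csv")
    with open(raw_path) as f_in, open(clean_path, "w", newline="") as f_out:
        reader = csv.DictReader(f_in)
        writer = csv.DictWriter(f_out, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["Survived"] == "":
                continue
            row["Age"] = row["Age"] or "0"
            writer.writerow(row)
    # 3. Upload the preprocessed data back to S3.
    s3.upload_file(clean_path, bucket, clean_key)

def train_step(s3, bucket, clean_key, model_key, workdir):
    # 4. Get the preprocessed data from S3.
    clean_path = os.path.join(workdir, "titanic_clean.csv")
    s3.download_file(bucket, clean_key, clean_path)
    # 5. "Train": survival rate per Sex as a toy stand-in model.
    totals, survived = {}, {}
    with open(clean_path) as f:
        for row in csv.DictReader(f):
            sex = row["Sex"]
            totals[sex] = totals.get(sex, 0) + 1
            survived[sex] = survived.get(sex, 0) + int(row["Survived"])
    model = {sex: survived[sex] / totals[sex] for sex in totals}
    # 6. Upload the model artifact to S3.
    model_path = os.path.join(workdir, "model.json")
    with open(model_path, "w") as f:
        json.dump(model, f)
    s3.upload_file(model_path, bucket, model_key)
    return model
```

Injecting the client also makes the steps easy to test locally with a fake S3 object before wiring them into the pipeline containers.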
- metrics_evaluation_and_check_condtion