
Azure Machine Learning Services Bootcamp (Version 2.0)

  • 09/2019 - Added challenge 5 (production deployment to Azure Kubernetes Service)
  • 05/2019 - Lots of new capabilities have been announced
  • 12/2018 - Azure Machine Learning is now generally available


Here are the top two resources you'll need this afternoon:

  1. Azure Machine Learning Services documentation
  2. Azure Machine Learning SDK for Python documentation

Challenges

Challenge 1 - Basic model training on Azure

⚡ Here are some hints for this challenge!

In this first challenge, you'll be training a basic machine learning model on Azure. We'll be using the popular MNIST dataset, as it allows us to focus on getting familiar with the mechanics of Azure Machine Learning. MNIST is a data set containing:

  • 60,000 hand-written digits as training data
  • 10,000 hand-written digits as testing data

Here are some examples:

(Sample images of hand-written digits from the MNIST dataset)

The goal is to build a machine learning model that

  • takes an unseen image (28x28 pixels) as input and
  • outputs which digit (0, 1, 2, 3, 4, 5, 6, 7, 8 or 9) is shown in the image

Guidance:

  • From the Azure Portal, deploy a Machine Learning service workspace
  • Write your code in a Jupyter Notebook on a Compute VM and use the new Azure ML UI
  • Use Python 3.6 - AzureML as the notebook kernel in Jupyter (create a new folder before starting)
  • We'll be using scikit-learn to train a simple LogisticRegression classifier (a rough sketch follows after this list)
  • Target accuracy of our model on the test data set: >92%
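
To give you an idea of what the training code can look like, here is a minimal sketch that loads MNIST via scikit-learn's fetch_openml helper and logs the resulting accuracy to the current Azure ML run. The data loading and logging details are assumptions, not the official solution from the hints.

```python
# Minimal sketch of the Challenge 1 training code (not the official solution).
# Assumption: MNIST is pulled via scikit-learn's fetch_openml helper; the
# bootcamp hints may download the data another way.
from azureml.core import Run
from sklearn.datasets import fetch_openml
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# 70,000 images of 28x28 pixels, flattened to 784 features each
X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X / 255.0  # scale pixel values to [0, 1]

# Conventional 60,000/10,000 train/test split
X_train, X_test = X[:60000], X[60000:]
y_train, y_test = y[:60000], y[60000:]

# Train a simple multi-class logistic regression classifier
clf = LogisticRegression(solver="lbfgs", max_iter=200)
clf.fit(X_train, y_train)

accuracy = accuracy_score(y_test, clf.predict(X_test))
print("Test accuracy:", accuracy)  # should land above the 92% target

# Log the metric to the Azure ML run (also works outside Azure as an "offline run")
run = Run.get_context()
run.log("accuracy", accuracy)
```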

Challenge 2 - Advanced model training on Azure

⚡ Here are some hints for this challenge!

In this challenge, you'll be training a more advanced machine learning model on Azure (in fact, you'll be training a Deep Convolutional Neural Network). We'll be using the same dataset, but this time we'll use Azure Machine Learning Compute to speed up and scale our training.

Guidance:

  • Use Keras with a TensorFlow backend to train a Convolutional Neural Network on Azure Machine Learning Compute (a model sketch follows after this list)
  • Target accuracy of our model on the test data set: >99%
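
For reference, here is a minimal sketch of a Keras model that can reach the target accuracy. The architecture (layer sizes, dropout rates, epoch count) is illustrative and not necessarily what the hints for this challenge use.

```python
# Minimal sketch of a Keras CNN for MNIST (tf.keras, TensorFlow backend).
# The architecture is illustrative; trained for ~10 epochs it typically
# exceeds the 99% test-accuracy target.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model():
    model = models.Sequential([
        layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Dropout(0.25),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Load MNIST, reshape to (N, 28, 28, 1) and scale pixel values to [0, 1]
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 28, 28, 1) / 255.0
x_test = x_test.reshape(-1, 28, 28, 1) / 255.0

model = build_model()
model.fit(x_train, y_train, epochs=10, batch_size=128,
          validation_data=(x_test, y_test))
```

On Azure Machine Learning Compute you would put this into a training script (e.g. train.py) and submit it as an experiment run, for example via the SDK's TensorFlow estimator or a ScriptRunConfig.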

Challenge 3 - Model deployment on Azure

⚡ Here are some hints for this challenge!

In this third challenge, you'll be taking the model you trained in the second challenge and deploying it to Azure Container Instances (ACI). This is perfect for dev/test scenarios and for giving our model a quick test drive (we'll get to production deployments in a later challenge).

Guidance:

  • Take the model from challenge 2 and containerize it (Azure ML will do most of that for us)
  • Deploy it on ACI as a RESTful API (a deployment sketch follows after this list)
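
A hedged sketch of what the deployment can look like with the Azure ML Python SDK is shown below. The model name, score.py and conda_env.yml are placeholders, and the exact calls depend on your azureml-core version (this is the v1-style API).

```python
# Minimal sketch of deploying a registered model to Azure Container Instances (ACI)
# with the Azure ML Python SDK (azureml-core, v1-style API). The model name,
# score.py and conda_env.yml below are placeholders.
from azureml.core import Workspace, Model, Environment
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()  # reads the config.json downloaded from the portal

# The model registered at the end of Challenge 2
model = Model(ws, name="keras-mnist")

# score.py must define init() and run(raw_data); conda_env.yml pins keras/tensorflow
env = Environment.from_conda_specification(name="mnist-env", file_path="conda_env.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# A small, dev/test-sized container instance
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

service = Model.deploy(ws, "mnist-aci", [model], inference_config, aci_config)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)  # the RESTful scoring endpoint
```

The scoring script follows the usual Azure ML pattern: an init() function that loads the model once, and a run(raw_data) function that turns incoming JSON into predictions.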

Challenge 4 - Automated Machine Learning

⚡ Here are some hints for this challenge!

In this challenge, you'll be using Automated Machine Learning to let Azure figure out which machine learning algorithm performs best on our dataset. We'll fully leverage the Azure Portal for this, so no coding is needed!

Guidance:

  • Create an Automated Machine Learning experiment in your Azure Machine Learning Workspace (an SDK-equivalent sketch, for reference, follows after this list)
  • Take the pima-indians-diabetes.csv dataset as the input
  • Let it figure out the best performing model
  • Bonus points: Deploy the model to ACI as a scoring endpoint (takes just a few clicks)
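
If you're curious what the portal configures under the hood (and as a preview for Challenge 5), here is a rough SDK equivalent of such an experiment. The dataset registration, label column name and experiment name are assumptions.

```python
# For reference only - Challenge 4 itself needs no code. Rough SDK equivalent
# of an Automated ML classification experiment (azureml-train-automl, v1-style
# API). Dataset name, label column and experiment name are placeholders.
from azureml.core import Workspace, Experiment, Dataset
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()

# Assumes pima-indians-diabetes.csv was registered as a tabular dataset beforehand
dataset = Dataset.get_by_name(ws, "pima-indians-diabetes")

automl_config = AutoMLConfig(
    task="classification",
    primary_metric="accuracy",
    training_data=dataset,
    label_column_name="Outcome",  # placeholder - use your CSV's label column
    iterations=20,
)

run = Experiment(ws, "automl-diabetes").submit(automl_config, show_output=True)
best_run, fitted_model = run.get_output()  # best performing model found by AutoML
```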

Challenge 5 - Production deployment to Azure Kubernetes Service (AKS)

⚡ Here are some hints for this challenge!

In this last challenge, you'll be training and deploying a model to showcase how you can use AutoML for a simple classification problem: predicting whether a credit card transaction is a fraudulent charge. This time, however, we'll deploy the model to Azure Kubernetes Service (AKS). Since this mimics a production deployment, we want to make sure to enable authentication and telemetry monitoring (using Application Insights) for our model!

Guidance:

  • Create an Automated Machine Learning experiment, train, register and deploy the model via Python
  • Retrieve the scoring script and the environment file from the Automated Machine Learning experiment
  • Take the creditcard.csv dataset as the input
  • Create an AKS cluster
  • Deploy the model with authentication and monitoring enabled (a deployment sketch follows after this list)
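
A hedged sketch of the AKS deployment with the v1-style azureml-core API is shown below. The cluster name, VM size, model name and file names are placeholders; the scoring script and conda environment file are the ones you retrieved from the Automated ML run.

```python
# Minimal sketch of a production deployment to AKS with authentication and
# Application Insights enabled (azureml-core, v1-style API). Names such as
# "aks-cluster", "fraud-model", scoring.py and conda_env.yml are placeholders.
from azureml.core import Workspace, Model, Environment
from azureml.core.compute import AksCompute, ComputeTarget
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AksWebservice

ws = Workspace.from_config()

# 1. Create an AKS cluster as an inference compute target
prov_config = AksCompute.provisioning_configuration(vm_size="Standard_D3_v2", agent_count=3)
aks_target = ComputeTarget.create(ws, "aks-cluster", prov_config)
aks_target.wait_for_completion(show_output=True)

# 2. Wrap the registered AutoML model with the scoring script and environment
#    file retrieved from the Automated ML experiment
model = Model(ws, name="fraud-model")
env = Environment.from_conda_specification(name="fraud-env", file_path="conda_env.yml")
inference_config = InferenceConfig(entry_script="scoring.py", environment=env)

# 3. Deploy with key-based authentication and Application Insights telemetry
aks_config = AksWebservice.deploy_configuration(auth_enabled=True, enable_app_insights=True)
service = Model.deploy(ws, "fraud-aks", [model], inference_config, aks_config,
                       deployment_target=aks_target)
service.wait_for_deployment(show_output=True)

print(service.scoring_uri)   # endpoint URL
print(service.get_keys())    # authentication keys for calling the endpoint
```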

Recommended Certifications

Further Training

Further Material