Create an Experiment in Azure Machine Learning

Hyperparameter tuning is the process of experimenting with a model's training settings, such as batch size or learning rate, to find the configuration that performs best. Azure Machine Learning Studio provides Experiments as a way to track the results of hyperparameter tuning. In this lab, we set up and run a hyperparameter tuning job using Experiments, then view and evaluate the results in the Azure Portal.
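Conceptually, an Experiment groups one run per hyperparameter value and records each run's metrics so they can be compared later. Below is a minimal plain-Python sketch of that tracking pattern; the `train` function, its fake metrics, and the batch sizes are illustrative stand-ins, not the Azure ML SDK, which the lab notebook uses for the real thing.

```python
# Minimal sketch of the experiment-tracking idea: one "run" per
# hyperparameter value, each logging its metrics for later comparison.
# (Illustrative only -- the lab itself uses Azure ML's Experiment API.)

def train(batch_size):
    # Stand-in for real training; returns made-up (accuracy, loss)
    # values that simply trend with batch size for demonstration.
    accuracy = 0.99 - 0.0001 * batch_size
    loss = 0.01 + 0.0005 * batch_size
    return accuracy, loss

experiment = []  # each entry plays the role of one logged run
for batch_size in (32, 64, 128, 256):
    accuracy, loss = train(batch_size)
    experiment.append({"batch_size": batch_size,
                       "accuracy": accuracy,
                       "loss": loss})

# Pick the run with the best logged accuracy, as you would in the
# Studio UI by sorting the runs table on the accuracy column.
best = max(experiment, key=lambda run: run["accuracy"])
print(best["batch_size"])
```

In the Studio UI, the equivalent of the final step is sorting the Experiment's run table by the metric you care about.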


Path Info

Advanced
1h 30m
Apr 03, 2020


Table of Contents

  1. Challenge

    Azure Setup

    1. Open Machine Learning Studio. Use the Preview Environment to become familiar with the new look and feel. Note: Create the workspace in the same region as your lab-provided resource group.
    2. Create a Compute instance.
      • The Compute must be uniquely named. We can use the name of the Machine Learning Workspace as part of the Compute name to guarantee uniqueness.
      • This will take a few minutes to spin up. Grab a coffee or tea while waiting.
    3. Clone the Jupyter Notebook from GitHub.
      • Go to the Notebooks section of Machine Learning Studio.
      • Create a new Python notebook and edit it in Jupyter.
      • In the first cell, run the following:
      !git clone
      • After the repo is cloned, close this notebook and open the content-dp100/notebooks/MNIST_AzureExperiment.ipynb notebook, choosing to edit it in Jupyter.
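    The uniqueness requirement in step 2 can be met by deriving the Compute name from the workspace name. A quick sketch of that idea; the workspace name and `-ci` suffix below are hypothetical, and the truncation simply guards against name-length limits:

    ```python
    # Derive a Compute name from the workspace name plus a suffix, so
    # two labs with different workspaces get different Compute names.
    # Truncate to stay within Azure's name-length limits.
    workspace_name = "mlw-lab-01"  # hypothetical workspace name
    compute_name = f"{workspace_name}-ci"[:16]
    print(compute_name)
    ```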
  2. Challenge

    Create and Run the Experiment

    1. Follow the steps in the notebook.
      • Select Python 3.8 - AzureML in the top right corner.
      • Run each code cell in the MNIST_AzureExperiment notebook and view the results.
      • Read the explanations to get a better understanding of what each cell is doing.
  3. Challenge

    Evaluate the Results of the Experiment

    1. View the results in Machine Learning Studio.
      • The final cell of the notebook provides a link directly to the mnist experiment. Run the cell and click the link.
      • Change the graph showing batch_size to instead show accuracy.
      • Remove the Compute target, Job type, and Created by columns from the table.
      • Add the accuracy column to the table.
    2. Evaluate how changing batch_size affects the loss, accuracy, and training time. What would explain this phenomenon?
    3. Do further research.

    Larger batch sizes are prone to settling in poor local minima or skipping past minima entirely. They train much faster per epoch because fewer weight updates, and therefore fewer backpropagation passes, are needed, but the cost is often accuracy. Batch sizes between 32 and 256 are good starting points for most models, but batch size, like all hyperparameters, requires tuning.
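The training-speed effect follows from simple arithmetic: with a fixed dataset, the number of weight updates per epoch is roughly dataset size divided by batch size, so doubling the batch size halves the update count. Using MNIST's 60,000 training images for scale:

```python
# Weight updates per epoch shrink as batch size grows, which is why
# larger batches train faster per epoch (fewer backprop/update steps).
dataset_size = 60_000  # MNIST training set size

for batch_size in (32, 64, 128, 256):
    updates = dataset_size // batch_size
    print(batch_size, updates)
# batch_size 32 gives 1875 updates per epoch; 256 gives only 234.
```

This is why a batch size of 256 finishes an epoch far sooner than a batch size of 32, even though each individual forward/backward pass processes more samples.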
