AIML40 - Taking Models to the Next Level with Azure Machine Learning Best Practices

Session information

Artificial Intelligence and Machine Learning can be used in many ways to increase productivity of business processes and gather meaningful insights, by analyzing images, texts and trends within unstructured flows of data. While many tasks can be solved using existing models, in some cases it is also required to train your own model for more specific tasks, or for increased accuracy.

In this session, we will explore the complete path of integrating text analysis intelligent services into the business processes of Tailwind Traders: starting from pre-built models available as Cognitive Services, up to training a third-party custom neural model for Aspect-Based Sentiment Analysis, available as part of Intel NLP Architect, using Azure Machine Learning Service. We will discuss when a custom model is needed, demonstrate quick ways to create such a model from scratch using AutoML, and show how to fine-tune model hyperparameters using HyperDrive.

Table of Contents

Resources and links:

  • PowerPoint: Presentation
  • Videos: Dry Run Rehearsal; Microsoft Ignite Orlando Recording
  • Demos: Demo 1 - Cognitive Services Text Analytics; Demo 2 - Automated Machine Learning; Demo 3 - Azure Machine Learning SDK and Hyperdrive
  • Delivery Assets

Overview of Demonstrations

In this presentation, the following demonstrations are made:

  1. Using Cognitive Services Text Analytics to find out the sentiment of a clothing review
  2. Using Azure Automated ML to build a text classifier with almost no code
  3. Using Azure Machine Learning Service to train an Aspect-Based Sentiment Analysis model.

Starting Fast

If you want to start right away, you can deploy all required resources via Azure Template.

Below we provide more detailed instructions for the demo so you can perform the steps manually to fully understand the concepts being shown.

Initial Environment Setup

In order to perform steps 2 and 3 of the demo, you will need to:

  1. Create an Azure Machine Learning Workspace
  2. Upload the data used for AutoML training - clothing_automl.xlsx

Creating Azure Machine Learning Workspace

The Azure ML Workspace can be created either manually through the Azure Portal, or from the command line using the Azure CLI:

NOTE: we are using absa as the resource group name, absa_space as the workspace name, and the West US 2 region in this example, but feel free to change them.

az extension add -n azure-cli-ml
az group create -n absa -l westus2
az ml workspace create -w absa_space -g absa

You would also need to know your subscription id, which can be obtained by running az account list.
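
If you prefer to grab the subscription id programmatically, a minimal sketch (assuming the JSON output shape of `az account list -o json`; the ids below are placeholders) could look like this:

```python
import json

def default_subscription_id(az_account_list_json: str) -> str:
    """Return the id of the default subscription from `az account list -o json` output."""
    for account in json.loads(az_account_list_json):
        if account.get("isDefault"):
            return account["id"]
    raise ValueError("no default subscription found")

# The JSON shape that `az account list -o json` prints (ids here are placeholders):
sample = '[{"id": "1111-aaaa", "isDefault": false}, {"id": "2222-bbbb", "isDefault": true}]'
```

Feeding the real command output into `default_subscription_id` gives you the value to substitute for [subscription_id] in the scripts below.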

Uploading data to the workspace

In our demos, we use a few datasets; the one you need to upload yourself is clothing_automl.xlsx.

To follow the Automated ML demo, please upload that dataset to your workspace. You can do this manually through the Azure ML Portal, or use the provided script upload_dataset.py (the csv/xlsx file should be in the current directory, and you should substitute [subscription_id] with your subscription id):

python upload_dataset.py -s [subscription_id] -w absa_space -g absa -f clothing_automl.xlsx

The Automated ML clothing dataset will be uploaded to the AML service datastore by the demo code.

Using the Azure ML Demo Code

You can execute the demo code from any Jupyter Notebook environment, using one of the following options:

  • Install Python environment locally, as described below in Python Environment Installation
  • Use Jupyter Notebooks from within Azure ML Workspace. To do that:
    • Navigate to your Azure ML Portal
    • Select Notebooks from left-hand-side menu
    • Upload absa.ipynb file and select it
    • You will be prompted to create a Notebook VM, after which you can use the notebook directly from the portal.
  • Use Azure Notebooks. In this case you should upload the absa.ipynb file to a new Azure Notebooks project. Also, because of the limitations of free compute in Azure Notebooks (1 GB disk space), you will only be able to run this notebook on a virtual machine, as described here.

Python Environment Installation

If you decide not to use Azure Notebooks, and prefer to use your local Python environment, you need to install the Python Azure ML SDK, making sure to include the notebooks and contrib extras:

conda create -n azureml -y python=3.6
source activate azureml
pip install --upgrade azureml-sdk[notebooks,contrib] 
conda install ipywidgets
jupyter nbextension install --py --user azureml.widgets
jupyter nbextension enable azureml.widgets --user --py

You will need to restart Jupyter after this. Detailed instructions are here.

If you need a free trial account to get started, you can get one here.

Pre-creating Compute Cluster

For the last two demos, you need a compute cluster. For demo purposes, we will create a cluster that consists of one node only. This can be done in one of three ways:

  1. Through the Azure ML Portal: go to the Compute section and manually create an Azure ML Compute cluster of Standard_DS3_v2 VMs, specifying number of nodes = 1. Name the cluster absa-cluster.
  2. Run the provided create_cluster.py script, providing the same parameters as above:
python create_cluster.py -s [subscription_id] -w absa_space -g absa
  3. Run the first few cells from the absa.ipynb notebook, which will create the cluster for you.
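
A minimal SDK sketch of what the script and the notebook cells roughly do (assuming the azureml-sdk package and an authenticated Workspace object; the function names here are illustrative, not the script's actual contents):

```python
def demo_cluster_settings():
    """Provisioning parameters matching the manual setup above (a one-node cluster)."""
    return {"vm_size": "Standard_DS3_v2", "min_nodes": 0, "max_nodes": 1}

def create_cluster(workspace, name="absa-cluster"):
    """Provision the demo compute cluster. Requires azureml-sdk and Azure credentials."""
    from azureml.core.compute import AmlCompute, ComputeTarget
    config = AmlCompute.provisioning_configuration(**demo_cluster_settings())
    target = ComputeTarget.create(workspace, name, config)
    target.wait_for_completion(show_output=True)
    return target
```

Setting min_nodes to 0 lets the cluster scale down to nothing when idle, so you only pay while a run is active.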

Demos

Demo 1: Text Analytics Cognitive Service

💡 You must have completed the environment setup before attempting to do the demo.

In this demo, we show how Text Analytics can do sentiment analysis of a phrase in a web interface.

  1. Open Text Analytics Page
  2. Scroll down to the See it in action section and enter the phrase I loved the polka dot pants that I bought in the london store (you can also leave the default phrase to demonstrate the point).
  3. Press Analyze to get the following result:

Screenshot of Azure Text Analytics

Note that Text Analytics not only provides sentiment, but also extracts locations and key phrases from the text.
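
The same analysis can be done programmatically via the Text Analytics REST API. A sketch of the request payload the v3.0 sentiment endpoint expects (the endpoint URL and key placeholders must come from your own Cognitive Services resource):

```python
def sentiment_request(text: str, language: str = "en") -> dict:
    """Build the documents payload the Text Analytics sentiment endpoint expects."""
    return {"documents": [{"id": "1", "language": language, "text": text}]}

# To actually call the service you would POST this payload, e.g.:
#
#   import requests
#   url = "https://<your-resource>.cognitiveservices.azure.com/text/analytics/v3.0/sentiment"
#   headers = {"Ocp-Apim-Subscription-Key": "<your-key>"}
#   response = requests.post(url, headers=headers, json=sentiment_request("I loved the polka dot pants"))

payload = sentiment_request("I loved the polka dot pants that I bought in the london store")
```

The response scores each document's sentiment; entity and key-phrase extraction use sibling endpoints of the same resource.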

Demo 2: Azure AutoML

💡 You must have completed the environment setup before attempting to do the demo.

In this demo, we demonstrate how Automated ML can be used to build an ML model without coding.

  1. Navigate to your Azure ML Workspace (created above) at http://ml.azure.com
  2. Go to Datasets - you should see the previously uploaded dataset there (clothing_automl.xlsx). Note that you can also upload it here through the portal.
  3. Select the dataset.
  4. From the Overview tab, expand Sample usage and show the code that can be used to access the data programmatically, if needed.
  5. From the Explore tab, have a look at the data.
  6. Go to Automated ML tab and click New Experiment
  7. Select experiment name and compute to be used.
  8. Select the dataset.
  9. Choose the type of prediction task: Classification.
  10. Select the target column: Rating.
  11. Click Start.

The experiment will take quite a long time to run, because different algorithms will be investigated. If you are showing a demo, it makes sense to run the experiment in advance and just show the results.
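
The same experiment can also be launched from the SDK. A hedged sketch mirroring the portal choices above (assuming azureml-sdk with the automl extra, an authenticated workspace, a registered dataset, and the compute target created earlier; the experiment name is an assumption):

```python
def automl_settings():
    """The choices made in the portal above, as SDK arguments."""
    return {"task": "classification", "label_column_name": "Rating"}

def submit_automl(workspace, dataset, compute_target, experiment_name="absa-automl"):
    """Submit an Automated ML run. Requires azureml-sdk[automl] and Azure credentials."""
    from azureml.core import Experiment
    from azureml.train.automl import AutoMLConfig
    config = AutoMLConfig(training_data=dataset,
                          compute_target=compute_target,
                          **automl_settings())
    return Experiment(workspace, experiment_name).submit(config)
```

The returned run object can then be monitored from the portal or with run.wait_for_completion().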

Demo 3: Using Azure ML Workspace with Python SDK

💡 You must have completed the environment setup before attempting to do the demo.

In this demo, we will run custom Python code that uses the Python Azure ML SDK to train, optimize and use a custom Aspect-Based Sentiment Analysis (ABSA) model.

All of the instructions for this part of the demo are located in the Jupyter Notebook itself. Use one of the methods described above to run the Notebook (in Azure Notebooks, or locally), and follow instructions there.
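
To give a flavor of the HyperDrive step the notebook performs, here is a minimal sketch of defining a random search over hyperparameters (assuming azureml-sdk; the parameter names and values below are hypothetical examples, not the notebook's actual search space):

```python
def search_space():
    """A hypothetical hyperparameter grid; the real notebook defines its own."""
    return {"learning_rate": [0.001, 0.01, 0.1], "batch_size": [16, 32]}

def hyperdrive_config(script_run_config, metric_name="accuracy", max_runs=8):
    """Wrap a training run config in a HyperDrive sweep. Requires azureml-sdk.
    The metric name must match what the training script logs."""
    from azureml.train.hyperdrive import (HyperDriveConfig, RandomParameterSampling,
                                          PrimaryMetricGoal, choice)
    sampling = RandomParameterSampling(
        {name: choice(*values) for name, values in search_space().items()})
    return HyperDriveConfig(run_config=script_run_config,
                            hyperparameter_sampling=sampling,
                            primary_metric_name=metric_name,
                            primary_metric_goal=PrimaryMetricGoal.MAXIMIZE,
                            max_total_runs=max_runs)
```

Submitting the resulting config to an Experiment launches parallel child runs on the cluster, and the best run can be retrieved by the primary metric.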

Tear Down

To free up cloud resources used during the demo, you need to delete the Azure ML workspace and the resource group:

az ml workspace delete -w absa_space -g absa
az group delete -n absa

Presenter Resources

If you are going to present this content, please have a look at the additional presenter resources.

Resources and Continue Learning

Getting Started Series

Other Materials

Feedback loop

Do you have a comment, feedback, or a suggestion? Currently, the best feedback loop for content changes/suggestions/feedback is to create a new issue on this GitHub repository. For all the details about how to create an issue, please refer to the Contributing docs.