Intelligent Vending Machine - Microsoft Cloud Workshop

Microsoft Cloud Workshop, December 1, 2018

In this hands-on lab, you will implement an IoT solution for intelligent vending machines that leverages facial feature recognition and Azure machine learning, gaining a better understanding of how to build cloud-based machine learning apps and real-time analytics with SQL Database in-memory tables and columnstore indexes.

At the end of this hands-on lab, you will be better able to build IoT solutions leveraging cloud-based machine learning services and real-time analytics.

Before the Hands-on Lab

Intelligent vending machines
Before the hands-on lab setup guide
December 2018
Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein.

© 2018 Microsoft Corporation. All rights reserved.

Microsoft and the trademarks listed at https://www.microsoft.com/en-us/legal/intellectualproperty/Trademarks/Usage/General.aspx are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.


Requirements

  • Microsoft Azure subscription must be pay-as-you-go or MSDN

    • Trial subscriptions will not work.
  • A virtual machine configured with Visual Studio Community 2017, Power BI Desktop, Git Bash, R Tools for Visual Studio 2017 (RTVS), and the Google Chrome web browser (the ARM template run in the setup steps below provisions this Lab VM for you)

Before the hands-on lab

Duration: 45 minutes

In the Before the hands-on lab exercise, you will set up your environment for use in the rest of the hands-on lab. You should follow all the steps provided in the Before the hands-on lab section to prepare your environment before attending the hands-on lab. Failure to do so will significantly impact your ability to complete the lab within the time allowed.

Important: Most Azure resources require unique names. Throughout this lab you will see the word “SUFFIX” as part of resource names. You should replace this with your Microsoft alias, initials, or another value to ensure the resource is uniquely named.

Task 1: Execute ARM template to provision Azure resources

In this task, you will run an Azure Resource Manager (ARM) template to provision many of the Azure resources you will be using throughout this lab. The ARM template provisions the following resources, and installs software on the Lab VM using a PowerShell script:

  • Azure Resource group

  • R Server on HDInsight cluster

  • Windows Server 2016 (x64) virtual machine with the following software installed:

    • Visual Studio Community 2017
    • Power BI Desktop
    • Git Bash
    • R Tools for Visual Studio 2017 (RTVS)
    • Google Chrome web browser
  • Microsoft Machine Learning Server on Linux

  • Azure storage account for storing photos

  • Azure SQL Database

Note: If you want to review the steps for manually creating the resources provisioned by the ARM template, see Appendix A.

  1. Select the Deploy to Azure button below to launch the script in a Custom deployment blade in the Azure portal.

  2. On the Custom deployment blade, enter the following:

    • Subscription: Select the subscription you are using for this hands-on lab.

    • Resource group: Choose Create new and enter hands-on-lab-SUFFIX as the resource group name.

    • Location: Select the region you would like to use for resources in this hands-on lab. Remember this location so you can use it for the other resources you'll provision throughout this lab.

    • Resource Name Suffix: Enter a unique suffix, such as your initials or Microsoft alias, to use for uniquely naming resources created by the ARM template.

    • Leave the default values for the remaining parameters, but note the values below for later reference:

      • Cluster Login Username: admin

      • SSH Username: remoteuser

      • Virtual Machine Username: demouser

      • ML Virtual Machine Username: radmin

      • Database Username: demouser

      • Database Name: vending

      • Database Server Name: vendingmachines

      • All usernames use the password Password.1!!

      The Custom Deployment blade is displayed in the Azure portal, with the values specified above entered into the appropriate fields.

  3. Select Purchase.

Note: It typically takes 15 - 20 minutes for the ARM template deployment to finish.

Task 2: Connect to your Lab VM

In this task, you will create an RDP connection to your Lab virtual machine (VM), which is the Windows Server 2016 (x64) VM provisioned by the ARM template.

  1. When the ARM template deployment has completed, navigate to the Azure portal, select Resource groups in the Azure navigation pane, enter your resource group name (hands-on-lab-SUFFIX) into the filter box, and select it from the list.

    Resource groups is selected in the Azure navigation pane, and hands-on-lab-SUFFIX is entered into the filter box.

  2. In the list of resources for your resource group, select the LabVM Virtual Machine.

    The list of resources in the hands-on-lab-SUFFIX resource group are displayed, and LabVM is highlighted.

  3. On your Lab VM blade, select Connect from the top menu.

    The LabVM blade is displayed, with the Connect button highlighted in the top menu.

  4. Select Download RDP file, then open the downloaded RDP file.

    The Connect to virtual machine blade is displayed, and the Download RDP file button is highlighted.

  5. Select Connect on the Remote Desktop Connection dialog.

    In the Remote Desktop Connection Dialog Box, the Connect button is highlighted.

  6. Enter the following credentials when prompted:

    1. User name: demouser

    2. Password: Password.1!!

  7. Select Yes to connect, if prompted that the identity of the remote computer cannot be verified.

    In the Remote Desktop Connection dialog box, a warning states that the identity of the remote computer cannot be verified, and asks if you want to continue anyway. At the bottom, the Yes button is circled.

  8. Once logged in, launch the Server Manager. This should start automatically, but you can access it via the Start menu if it does not start.

    The Server Manager tile is circled in the Start Menu.

  9. Select Local Server, then select On next to IE Enhanced Security Configuration.

    Screenshot of the Server Manager. In the left pane, Local Server is selected. In the right, Properties (For LabVM) pane, the IE Enhanced Security Configuration, which is set to On, is highlighted.

  10. In the Internet Explorer Enhanced Security Configuration dialog, select Off under Administrators, then select OK.

    Screenshot of the Internet Explorer Enhanced Security Configuration dialog box, with Administrators set to Off.

  11. Close the Server Manager.

Task 3: Confirm installation of R Tools for Visual Studio 2017

In this task, you will confirm that R Tools for Visual Studio 2017 (RTVS) was successfully installed by the ARM template.

  1. On your Lab VM, launch the Visual Studio Installer by selecting Search on the Windows task bar, entering "visual studio installer" into the search box, and selecting Visual Studio Installer from the results.

    In the Windows Search bar, "visual studio installer" has been entered, and Visual Studio Installer is highlighted in the results.

  2. Update the Visual Studio Installer, if prompted.

  3. Once the Installer starts, select More, then select Modify.

    The More dropdown is expanded, and the Modify option is highlighted.

    Note: If the Visual Studio installation is up to date, Modify may appear where the Update button is in the above screenshot. If you wish to update Visual Studio first, select Update. This is not necessary for this hands-on lab, and the operation can take 30 minutes or more to complete.

  4. Verify that the Data science and analytical applications workload is selected, indicated by a checked box in the upper right corner of the workload. If it is not, select the workload and select Modify.

    The Data science and analytical applications workload is selected in the Visual Studio 2017 Installer modify screen.

  5. Close the Visual Studio Installer.

Task 4: Download the vending machines starter project

Trey Research has provided a starter solution for you. They have asked you to use this as the starting point for creating the Vending Machines solution in Azure.

  1. From your Lab VM, download the starter project by obtaining a .zip copy of the Intelligent vending machines GitHub repo, as described in the following steps.

  2. In a web browser, navigate to the Intelligent vending machines MCW repo.

  3. On the repo page, select Clone or download, then select Download ZIP.

    Download .zip containing the Intelligent vending machines repository

  4. Unzip the contents to the folder C:\VendingMachines\.

Task 5: Set up Photos Storage account containers

In these steps, you will add containers for photos and promos to the photostorageSUFFIX storage account in the Azure portal. This account will be used for storing photos sent from the vending machine simulator and for storing the promotional package resources.
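
If you prefer to script this step rather than use the portal, a minimal sketch using the WindowsAzure.Storage SDK (the same client library the Simulator uses later in this lab) is shown below. It assumes you have the connection string for the photostorageSUFFIX account at hand; the portal steps that follow achieve the same result.

```csharp
// Hedged sketch: create the two private blob containers used by this lab.
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ContainerSetupSketch
{
    public static void EnsureContainers(string storageConnectionString)
    {
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

        foreach (var containerName in new[] { "photos", "promo" })
        {
            // Containers are created with private (no public) access by default.
            CloudBlobContainer container = blobClient.GetContainerReference(containerName);
            container.CreateIfNotExists();
        }
    }
}
```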

  1. In the Azure portal, navigate to the photostorageSUFFIX storage account by selecting Resource groups from the Azure navigation pane, entering "hands-on-lab" in the filter box, selecting the hands-on-lab-SUFFIX resource group, and locating the photostorageSUFFIX storage account in the list of resources.

  2. From the storage account Overview blade, select the Blobs tile under Services.

    Screenshot of the Storage account blade, services section. Under Services, Blobs is selected.

  3. In the Blob service blade, select +Container from the command bar.

    Screenshot of the Blob service blade command bar, with Container highlighted.

  4. On the New container blade, set the name to "photos" and select Private as the Access type.

    Screenshot of the New container blade Name and Public access level fields.

  5. Select OK.

  6. Repeat steps 6-8 to create another container named "promo".

  7. You should now see both containers listed on the Blob service blade.

    Screenshot of the Blob service blade name rows, with photos and promo listed.

  8. Next, open Visual Studio and from the View Menu select Cloud Explorer.

    Screenshot of the Visual Studio View menu, with Cloud Explorer selected.

    Note: You may need to select and enter the credentials for your Azure subscription, by clicking the person icon and expanding the subscriptions.

  9. Expand Storage accounts and locate the photostorageSUFFIX account, and the Blob Containers you created underneath it.

    Screenshot of the Visual Studio Cloud Explorer tree view, which is expanded to: Storage Accounts\photostorageSUFFIX\Blob Containers.

  10. Right-click the promo container and select Open.

  11. Select the Upload Blob button.

    Screenshot of the Upload blob button.

  12. Select Browse, and in the File dialog, select the three images CoconutWater.png, Water.png, and Soda.png from the C:\VendingMachines\Hands-on lab\starter-project\Simulator\Images folder, then select Open.

  13. Select OK on the Upload New File Dialog to upload the images into the container.

    Screenshot of the Updated New File window.

Task 6: Configure Microsoft Machine Learning Server on Linux

In this task, you will perform some configuration on the Machine Learning Server that was provisioned by the ARM template.

  1. In the Azure portal, navigate to the Overview blade of the LabMLServer VM, select Connect, and copy the SSH command.

    Connect to virtual machine blade, with the SSH command highlighted.

  2. On your Lab VM, open a new Git Bash window (click Search, then type Git Bash).

    Open the Git Bash applications.

  3. Paste the SSH connection command you copied in the previous step. For example: ssh radmin@40.70.129.190.

  4. Execute the command to SSH into your Microsoft Machine Learning Server VM.

  5. When prompted if you want to continue connecting, enter yes.

  6. Enter your password, Password.1!!

  7. At the prompt, after successfully logging in, enter the following command:

    sudo apt-get update -y
    
  8. Type exit twice to disconnect from the SSH session.

You should follow all steps provided before performing the Hands-on lab.

Hands-on Lab Guide

Intelligent vending machines
Hands-on lab step-by-step
December 2018

Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Microsoft, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

The names of manufacturers, products, or URLs are provided for informational purposes only and Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not responsible for the contents of any linked site or any link contained in a linked site, or any changes or updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission received from any linked site. Microsoft is providing these links to you only as a convenience, and the inclusion of any link does not imply endorsement of Microsoft of the site or the products contained therein.

© 2018 Microsoft Corporation. All rights reserved.

Microsoft and the trademarks listed at https://www.microsoft.com/en-us/legal/intellectualproperty/Trademarks/Usage/General.aspx are trademarks of the Microsoft group of companies. All other trademarks are property of their respective owners.


Abstract and learning objectives

In this hands-on lab, you will implement an IoT solution for intelligent vending machines that leverages facial feature recognition and Azure machine learning, gaining a better understanding of how to build cloud-based machine learning apps and real-time analytics with SQL Database in-memory tables and columnstore indexes.

At the end of this hands-on lab, you will be better able to build IoT solutions leveraging cloud-based machine learning services and real-time analytics.

Overview

Trey Research Inc. looks at the old way of doing things in retail and introduces innovative experiences that delight customers and drive sales. Their latest initiative focuses on intelligent vending machines that support commerce, engagement analytics, and intelligent promotions.

Solution architecture

Below is a diagram of the solution architecture you will build in this lab. Please study this carefully, so you understand the solution as a whole as you work on the various components.

Diagram of the preferred solution. From a high-level, the commerce solution uses an API App to host the Payments web service with which the Vending Machine interacts to conduct purchase transactions. The Payment Web API invokes a 3rd party payment gateway as needed for authorizing and capturing credit card payments, and logs the purchase transaction to SQL DB. The data for these purchase transactions is stored using an In-Memory table with a Columnar Index, which will support the write-heavy workload while still allowing analytics to operate, such as queries coming from Power BI Desktop.
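
The lab itself does not have you build the Payments Web API shown in the diagram, but a purely illustrative sketch can make the flow above concrete: the vending machine posts a purchase, the API authorizes and captures the payment through a gateway, and the transaction is logged to the in-memory, columnstore-indexed table in SQL DB. Every name below (PaymentsController, PurchaseRequest, and the gateway and repository interfaces) is hypothetical and not part of the starter project.

```csharp
// Illustrative architecture sketch only; all types here are hypothetical.
using System.Threading.Tasks;
using System.Web.Http;

public class PurchaseRequest
{
    public string VendingMachineId { get; set; }
    public string Product { get; set; }
    public decimal Price { get; set; }
    public string CardToken { get; set; }
}

public interface IPaymentGateway { Task<bool> AuthorizeAndCaptureAsync(string cardToken, decimal amount); }
public interface ITransactionRepository { Task LogPurchaseAsync(PurchaseRequest purchase); }

public class PaymentsController : ApiController
{
    private readonly IPaymentGateway _gateway;
    private readonly ITransactionRepository _transactions;

    public PaymentsController(IPaymentGateway gateway, ITransactionRepository transactions)
    {
        _gateway = gateway;
        _transactions = transactions;
    }

    // POST api/payments: authorize and capture the payment via the gateway,
    // then log the purchase to the Transactions table in SQL DB.
    [HttpPost]
    public async Task<IHttpActionResult> Post(PurchaseRequest purchase)
    {
        bool paid = await _gateway.AuthorizeAndCaptureAsync(purchase.CardToken, purchase.Price);
        if (!paid)
        {
            return BadRequest("Payment was declined by the payment gateway.");
        }

        await _transactions.LogPurchaseAsync(purchase);
        return Ok();
    }
}
```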

Requirements

  • Microsoft Azure subscription must be pay-as-you-go or MSDN

    • Trial subscriptions will not work.
  • A virtual machine configured with Visual Studio Community 2017, Power BI Desktop, Git Bash, R Tools for Visual Studio 2017 (RTVS), and the Google Chrome web browser (provisioned for you by the ARM template in the Before the hands-on lab setup guide)

Before the hands-on lab

Refer to the Before the hands-on lab setup guide before continuing to the lab exercises. You should follow all the steps provided in the Before the hands-on lab document to prepare your environment before attending the hands-on lab. Failure to do so will significantly impact your ability to complete the lab within the time allowed.

Exercise 1: Create Dynamic Pricing Model

Duration: 45 minutes

In this exercise, you will create a machine learning model that predicts the purchase price for an item sold by the vending machine, given the demographics of the customer and the item selected. You will then operationalize this model by exposing it as a web service using R Server Operationalization, and test it out.

Task 1: Create a model locally

  1. On your Lab VM, locate the starter project solution file, VendingMachines.sln, in the C:\VendingMachines\Hands-on lab\starter-project folder and double-click it to open it with Visual Studio 2017.

  2. If prompted, select Visual Studio 2017 from the Microsoft Visual Studio Version Selector.

    Visual Studio 2017 is selected in the Microsoft Visual Studio Version Selector.

  3. Sign in to Visual Studio or create an account, if prompted.

  4. If the Security Warning for Simulator window appears, uncheck Ask me for every project in this solution, and select OK.

    Screenshot of the Security warning for Simulator window. The checkbox for Ask me for every project in this solution is cleared, and OK is highlighted.

    Note: If you attempt to build the solution at this point, you will see many build errors. This is intentional. You will correct these in the exercises that follow.

  5. Within Visual Studio Solution Explorer, expand the PricingModel project and open the file TrainModel.R by double-clicking on the file in the Solution Explorer.

    Screenshot of the Visual Studio Solution Explorer, expanded to: Solution VendingMachines\PricingModel\TrainModel.R.

  6. Read the script. The top portion, entitled Create Sample Data, has been provided for you. It generates the sample data you will use to train your model.

  7. Highlight all the text between the "Create Sample Data" and "END Create Sample Data" comments.

  8. Right-click the selected text and select Execute In Interactive.

    Screenshot of the Train Model tab, with Execute In Interactive highlighted on the sub-menu.

  9. You should see it execute in the R Interactive Window, ending with a summary of the created data.

    Screenshot of the R Interactive - Microsoft R Client window.

  10. From the R Tools menu, select Windows, then select Variable Explorer.

    Screenshot of the R tools menu, with Windows / Variable Explorer selected.

  11. Expand the variable sampleData, and explore the structure of the created data.

    Screenshot of the Variable Explorer, with sampleData selected.

  12. Now save this sampleData to a file by replacing TODO 1 in the TrainModel.R script with the following code:

    # TODO: 1. Export the sample data to a file
    save(sampleData, file = "sampleData.RData")
    
  13. Highlight the new line you just pasted, right-click and select Execute In Interactive.

  14. Open File Explorer and navigate to the location of the PricingModel (C:\VendingMachines\Hands-on-lab\starter-project\PricingModel) project on disk. You should see the file sampleData.RData on disk.

    Screenshot of File Explorer, with sampleData.RData selected, and the pop-up listing the Type (R Workspace), Size (2.85 KB), and Data/time modified.

  15. Back in the TrainModel.R file in Visual Studio, replace TODO 2 with the following code that builds the model using a Linear Regression.

    # TODO: 2. Build a linear regression model to predict purchase price given age, gender, and productSelected
    pricingModel <- rxLinMod(purchasePrice ~ age + gender + productSelected, data = sampleData)
    
  16. Save that trained model to disk by replacing TODO 3 with:

    # TODO: 3. Export the trained model to a file named pricingModel.RData
    save(pricingModel, file = "pricingModel.RData")
    
  17. Finally, save the first row of the sample data to a file so you can re-use the structure later when operationalizing the model. Replace TODO 4 with:

    # TODO: 4. Save one example of the sample data to serve as an input template, to a file
    # called inputExample.RData
    inputExample <- sampleData[1,]
    save(inputExample, file = "inputExample.RData")
    
  18. Save your changes to TrainModel.R.

  19. Highlight TODO items 2 through 4, right-click and select Execute in Interactive.

  20. In the same folder as your script, you should now have the files: sampleData.RData, pricingModel.RData, and inputExample.RData.

    Screenshot of File Explorer, with the previously mentioned files highlighted.

Task 2: Try a prediction locally

  1. Switch to Visual Studio, select the Solution Explorer tab, and open PredictUsingModel.r under the PricingModel project in Solution Explorer.

  2. Replace TODO 1 with the following:

    # TODO: 1. Prepare the input to use for prediction
    inputExample[1,]$age <- 30
    inputExample[1,]$gender <- "F"
    inputExample[1,]$productSelected <- "coconut water"
    
  3. Replace TODO 2 with the following:

    # TODO: 2. Execute the prediction
    prediction <- rxPredict(pricingModel, data = inputExample)
    
  4. Highlight all of the script in the file, right-click and select Execute In Interactive.

  5. On the Variable Explorer tab, expand the prediction variable and observe the price the model suggested to use for purchasing the coconut water for input of a 30-year-old female:

    Screenshot of Variable Explorer, prediction table. The purchasePrice_Pred value is 0.949.

Task 3: Create the model in R Server on HDInsight

  1. On your Lab VM, open a Git Bash shell from the start menu.

  2. At the command prompt, enter the following command to create the SSH connection:

    ssh remoteuser@<clustername>-ssh.azurehdinsight.net
    

    Note: Replace <clustername> with the name of your HDInsight cluster, which you can obtain from the SSH + Cluster login blade of your HDInsight cluster in the Azure portal.

    Screenshot of the HDInsight cluster blade for vendingmachineslab. Overview is selected in the left pane, and in the right pane, on the menu bar, Secure Shell (SSH) is highlighted.

    Screenshot of Connect to cluster. Ssh.azurehdinsight.net is highlighted in the Hostname field, with the endpoint highlighted below.

  3. If prompted to continue connecting, enter yes.

  4. Enter the password Password.1!!

    Screenshot of the Git Bash window, with three command lines highlighted. The first is the remoteuser@clustername information. The second line is a prompt asking whether you want to continue connecting. The third is the remoteuser cluster name request for password.

  5. At the command prompt, type R to load the R shell (be sure to use a capital letter "R").

  6. Run the following commands to create a Spark compute context for R (NOTE: these commands do not produce any immediate output):

    # Create a Spark compute context and make it the active compute context
    sparkCluster <- RxSpark()
    rxSetComputeContext(sparkCluster)
    
  7. In Visual Studio, open TrainModel.R, and copy the entire script.

  8. Paste the script in the R shell, and press ENTER (you may need to press ENTER a few times until you get to the last line of the script).

  9. When the script has finished executing, type the following:

    dir()
    
  10. You should see it list the three files created by the script, as follows:

    Screenshot of the results of running the TrainModel.R script in R shell, listing the following three files: inputExample.RData, pricingModel.RData, and sampleData.RData.

  11. Now, copy those files from local storage to Blob storage by using the Hadoop File System. First, create a folder in which to store your output.

    modelExportDir <- "/models"
    rxHadoopMakeDir(modelExportDir)
    
  12. List the contents of the root ("/") directory and confirm your "/models" folder has been created. Notice that the list you are looking at is folders directly underneath the container in Azure Storage that was created with your cluster.

    rxHadoopListFiles("/")
    

    Screenshot of the Git Bash R shell window, with the previously mentioned information displaying, and the /models folder information highlighted.

  13. Copy the pricingModel.RData from the local directory to the /models folder in HDFS by running the following command:

    rxHadoopCopyFromLocal("pricingModel.RData", modelExportDir)
    
  14. Repeat for inputExample.RData and sampleData.RData.

    rxHadoopCopyFromLocal("inputExample.RData", modelExportDir)
    rxHadoopCopyFromLocal("sampleData.RData", modelExportDir)
    
  15. Run the following command to verify the three files now exist in HDFS (and Blob storage), under /models:

    rxHadoopListFiles("/models")
    
  16. The output should look similar to the following:

    Screenshot of the Git Bash R Shell window with output verifying that the three files exist in HDFS.

  17. Using the Visual Studio Cloud Explorer, navigate to the labstorageSUFFIX storage account for your HDInsight cluster and expand it to Blob Containers\vendingmachinescluster:

    Screenshot of Visual Studio, Cloud Explorer expanded as: labstorageSUFFIX\Blob Containers\vendingmachinescluster.

  18. Right-click the storage container, and select Open.

  19. In the editor that appears, double-click the models folder, and verify you see your files.

    Cloud Explorer file explorer displaying files listed under models/; three files display: inputExample.RData, pricingModel.RData, and sampleData.RData.

  20. Right-click inputExample.RData and select Save As.... Choose the PricingModel directory in the starter project, select Save, overwriting files if prompted.

  21. Repeat the previous step for pricingModel.RData and sampleData.RData.

  22. You have now used R Server on HDInsight to train a model that you can then upload to R Server Operationalization to expose it as a web service.

Task 4: Create predictive service in R Server Operationalization

After training a model, you want to operationalize it so that it becomes available for integration by developers. One way to operationalize a trained model is to take the model you trained in HDInsight and expose it as a predictive web service. In this task, you take a version of the scripts you have been running locally and in HDInsight and migrate them to run on the Microsoft Machine Learning Server VM.

  1. In the Azure portal, navigate to the Microsoft Machine Learning Server Virtual Machine, named LabMLServer, you provisioned with the ARM template.

  2. At the top of the Overview blade, select Connect, then copy the SSH command.

    The Connect dialog for the Microsoft Machine Learning Server is displayed, and the SSH command has been copied.

  3. Using a new Git Bash window on your Lab VM, SSH into your Microsoft Machine Learning Server VM by pasting the SSH command you copied above at the command prompt. For example: ssh radmin@[your-server-ip].

  4. If prompted, enter yes.

  5. Enter your password, Password.1!!

  6. At the prompt, after successfully logging in, you will need to complete a few tasks to configure and operationalize the environment.

  7. Run the following command to act as root:

    sudo -i
    
  8. Now that you are acting as root, run the following command to configure the server as a one-box operationalization instance:

    az ml admin node setup --onebox --admin-password 'Password.1!!' --confirm-password 'Password.1!!'
    

    Note: If the above command gives you issues, you can also try running the following, and then enter the password when prompted:

    az ml admin node setup --onebox
    

    The command to set up ML Server to host a web service has completed. The port 12800 is highlighted.

  9. In Visual Studio, open the App.config for the Simulator project. This can be done by expanding the Simulator project in the Solution Explorer, and double-clicking on App.config under the Simulator project.

    The Visual Studio Solution Explorer has Simulator highlighted and expanded, and under it, App.config highlighted.

  10. Locate the appSetting rServiceBaseAddress and enter http://[your-server-public-ip]:12800 for the value. (For example: http://52.168.132.221:12800). Your server IP address is the same IP address you used for the SSH connection in Step 2 above.

  11. Locate rServicePassword in the appSettings section, and update its value with the password you defined when you configured your R server for Operationalization (step 8 above): Password.1!!

    The App.config file for the project is open with the AppSettings for three keys shown. These include the URL to the server, user name and password.

  12. Save App.config.

  13. In the PricingModel project, open the PredictPricingService.r file from Solution Explorer.

    The Visual Studio Solution Explorer has PricingModel expanded, and under it, PredictPricingService.r highlighted.

  14. Find TODO 1, and replace with the following code block:

    # TODO: 1. Load packages needed for operationalization
    usePackage <- function(p) {
        if (!is.element(p, installed.packages()[, 1]))
            install.packages(p, dep = TRUE)
        library(p, character.only = TRUE)
    }
    usePackage("curl")
    usePackage("ggplot2")
    usePackage("mrsdeploy")
    usePackage("RevoScaleR")
    
  15. Find TODO 2, and replace with the following code block to remotely connect to the R Server Operationalization service:

    # TODO: 2. Configure remote login
    remoteLogin(
        deployr_endpoint = "http://<your-server-ip>:12800",
        username = "admin",
        password = "<your-admin-password>"
    )
    pause()
    

    Note: Make sure to replace <your-server-ip> with your VM's IP address, and enter the password you specified when you configured your R server for Operationalization (Password.1!!) in place of <your-admin-password>. The TODO 2 section should look something like the following screenshot.

    Screenshot of the TODO 2 section, with the IP address and password highlighted.

  16. Highlight all the code in PredictPricingService.r, right-click and then Execute In Interactive. The last output status in the R Interactive window should be "Published service".

Note: It can take up to 5 minutes for the packages to download and verify, and for the session to become active.

  1. Find TODO 3 and replace with the following code block to consume the API as a test:

    # TODO: 3. Consume the API as a test
    services <- listServices("apiPredictPurchasePrice")
    serviceName <- services[[1]]
    api <- getService(serviceName$name, serviceName$version)
    result <- api$apiPredictPurchasePrice(30, "F", "coconut water")
    print("Result: ")
    print(result$output("answer"))
    result
    
  2. Find TODO 4 and replace with the following code block to generate and save the Swagger JSON file for the API:

    # TODO: 4. Generate the Swagger JSON file for the API
    swagger <- api$swagger()
    cat(swagger, file = "swagger.json", append = FALSE)
    
  3. Highlight just the TODO 3 and TODO 4 code blocks you added and execute in interactive.

  4. When you scroll up through the R Interactive window results, you should see an output with your prediction like the following:

    $success
    [1] TRUE
    
    $errorMessage
    [1] ""
    
    $outputParameters
    $outputParameters$purchasePrice
    [1] 0.9348741
    
  5. Also open the swagger.json file in your PricingModel project directory to view its contents. This file can be used within the swagger.io online editor to generate client code to connect to your service. We have already done this for you within the included IO.Swagger project (a hand-rolled equivalent of such a client call is sketched below).
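
For reference, the generated IO.Swagger client ultimately issues plain HTTP calls against the operationalization endpoints you configured earlier: a POST to /login to obtain a bearer token, then a POST to /api/{service-name}/{service-version} to score. The following hand-rolled sketch illustrates that flow; the service name, version string, and input parameter names are assumptions based on the lab scripts, so confirm them against what listServices() reports.

```csharp
// Hand-rolled sketch of what the Swagger-generated client does. The service name
// ("apiPredictPurchasePrice"), version ("v1.0.0"), and input parameter names are
// assumptions; confirm them against listServices() before relying on this code.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

public static class PricingServiceSketch
{
    public static async Task<double> GetSuggestedPriceAsync(int age, string gender, string product)
    {
        var baseUri = "http://<your-server-ip>:12800";   // same value as rServiceBaseAddress

        using (var client = new HttpClient { BaseAddress = new Uri(baseUri) })
        {
            // 1. Authenticate against the operationalization server and capture the bearer token.
            var loginBody = new JObject { ["username"] = "admin", ["password"] = "<your-admin-password>" };
            var loginResponse = await client.PostAsync("/login",
                new StringContent(loginBody.ToString(), Encoding.UTF8, "application/json"));
            loginResponse.EnsureSuccessStatusCode();
            var token = (string)JObject.Parse(await loginResponse.Content.ReadAsStringAsync())["access_token"];
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

            // 2. Call the published service with the model inputs.
            var inputs = new JObject { ["age"] = age, ["gender"] = gender, ["productSelected"] = product };
            var response = await client.PostAsync("/api/apiPredictPurchasePrice/v1.0.0",
                new StringContent(inputs.ToString(), Encoding.UTF8, "application/json"));
            response.EnsureSuccessStatusCode();

            // 3. The prediction comes back under outputParameters, as in the R test above.
            var result = JObject.Parse(await response.Content.ReadAsStringAsync());
            return (double)result["outputParameters"]["purchasePrice"];
        }
    }
}
```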

Exercise 2: Implement dynamic pricing

Duration: 45 minutes

In this exercise you will implement the code that performs dynamic pricing, capitalizing on the Face API to acquire demographics and your deployed pricing model to suggest the price based on those demographics. You will then run the vending machine simulator and see the dynamic pricing in action.

Task 1: Implement photo uploads to Azure Storage

  1. In Visual Studio Solution Explorer, expand the Simulator project, then expand MainWindow.xaml, and open MainWindow.xaml.cs.

    Visual Studio Solution Explorer is expanded as: Simulator\MainWindow.xaml\MainWindow.xaml.cs.

Note: Ignore any errors; you will update the NuGet packages in a later step.

  1. Scroll down to the UpdateDynamicPrice method.

    The UpdateDynamicPrice method is highlighted in Visual Studio.

  2. Replace TODO 1 with the following:

    // TODO 1. Retrieve storage account from connection string.
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(_storageConnectionString);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer container = blobClient.GetContainerReference("photos");
    
  3. Replace TODO 2 with the following:

    // TODO 2. Retrieve reference to a blob named with the value of fileName.
    string blobName = Guid.NewGuid().ToString() + System.IO.Path.GetExtension(filename);
    CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName);
    
  4. Replace TODO 3 with the following:

    // TODO 3. Create or overwrite the blob with contents from a local file.
    using (var fileStream = System.IO.File.OpenRead(filename))
    {
        blockBlob.UploadFromStream(fileStream);
    }
    
  5. Save MainWindow.xaml.cs.

Task 2: Provision Cognitive Services Face API

To provision access to the Face API (which provides demographic information about photos of human subjects), you will need to provision a Cognitive Services account.

  1. In the Azure portal, select +Create a resource, enter "face" into the Search the Marketplace box, select Face from the results, and select Create.

    +Create a resource is highlighted in the navigation pane of the Azure portal, and "face" is entered into the Search the Marketplace box.

  2. On the Create Face API blade:

    • Name: Enter a name, such as vendingfaceapi.

    • Subscription: Select the subscription you are using for this hands-on lab.

    • Location: Select the location you are using for this hands-on lab.

    • Pricing tier: Select F0.

    • Resource group: Choose Use existing and select the hands-on-lab-SUFFIX resource group.

      The Face API Create blade is displayed, and the values specified above are entered into the appropriate fields.

    • Select Create.

  3. When the Face API finishes provisioning, browse to the Cognitive Services Face API by selecting Go to resource in the Deployment succeeded notification.

    Screenshot of the Deployment succeeded message, with the Go to resource button highlighted.

  4. At the top of the Cognitive Services overview blade, select the Copy button to the right of the Endpoint. Paste this value into a text editor, such as Notepad, for later use.

    Screenshot of the Cognitive Services overview blade, Essentials section. The Endpoint URL is highlighted, and a callout points to the Copy button.

  5. In the Cognitive Services blade, select Keys under the Resource Management heading.

    Screenshot of the Keys button.

  6. Click the Copy button next to the value for Key 1. Paste this value into a text editor, such as Notepad, for later use.

    The Azure portal is shown with Key copied for the API.

Task 3: Invoke Face API

  1. Switch back to Visual Studio. Continuing with MainWindow.xaml.cs, scroll down to GetBlobSasUri. This method will create a Shared Access Signature URI that the Face API can use to securely access the image in blob storage.

    The GetBlobSasUri method is highlighted in Visual Studio.

  2. Replace TODO 4 with the following:

    //TODO: 4. Create a Read blob and Write blob Shared Access Policy that is effective 5 minutes ago and for 2 hours into the future
    SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
    sasConstraints.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5);
    sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddHours(2);
    sasConstraints.Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write;
    
  3. Replace TODO 5 with the following:

    //TODO: 5. construct the full URI with SAS
    string sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);
    return blob.Uri + sasBlobToken;
    
  4. Scroll to the GetPhotoDemographics method. Implement the call to the Face API.

    The GetPhotoDemographics method is highlighted in Visual Studio.

  5. Replace TODO 6 with the following:

    //TODO 6. Invoke Face API with URI to photo
    IFaceServiceClient faceServiceClient = new FaceServiceClient(_faceApiKey, _faceEndpoint);
    
  6. Replace TODO 7 with the following:

    //TODO 7. Configure the desired attributes Age and Gender
    IEnumerable<FaceAttributeType> desiredAttributes = new FaceAttributeType[] { FaceAttributeType.Age, FaceAttributeType.Gender };
    
  7. Replace TODO 8 with the following:

    //TODO 8. Invoke the Face API Detect operation
    Face[] faces = await faceServiceClient.DetectAsync(sasUri, false, true, desiredAttributes);
    
  8. Replace TODO 9 with the following:

    //TODO 9. Extract the age and gender from the Face API response
    double computedAge = faces[0].FaceAttributes.Age;
    string computedGender = faces[0].FaceAttributes.Gender;
    
  9. Save the file.

Task 4: Invoke pricing model

  1. Within MainWindow.xaml.cs, scroll to the end of the UpdateDynamicPrice method and replace TODO 10 with the following:

    //TODO 10. Invoke the actual ML Model
    PricingModelService pricingModel = new PricingModelService();
    string gender = d.gender == "Female" ? "F" : "M";
    suggestedPrice = await pricingModel.GetSuggestedPrice((int)d.age, gender, _itemName);
    
  2. Save the file.

Task 5: Configure the Simulator

  1. In the Simulator project, open App.config.

  2. Within the appSettings section, set the following settings that you recorded in previous task steps:

    • faceAPIKey: Set this to the KEY 1 value for your Face API as acquired from the Azure Portal.

    • faceEndpoint: Set this to the ENDPOINT value for your Face API as acquired from the Azure Portal (for example: https://eastus2.api.cognitive.microsoft.com/face/v1.0).

    • storageConnectionString: Set this to the connection string of the photostorageSUFFIX Storage Account in which you created the photos container.

      • On your Storage account blade, select Access keys from the left-hand menu.

        Screenshot of the Settings section on the Storage blade. Access keys is highlighted.

      • Use the copy button to the right of the Connection string for key1 to copy your storage connection string. Save the copied value to a text editor, such as Notepad, as it will be used again later, but be sure to paste it into the App.config now.

        The keys for the storage account are shown in the Azure portal. The key1 Connection string is highlighted.

  3. Save the App.config. The updated App.config file settings should look similar to the following:

    Screenshot of the App.config file setting.

Task 6: Test dynamic pricing in Simulator

  1. Before building the project in Visual Studio, we need to ensure all the NuGet packages are properly restored. In Visual Studio, go to Tools->NuGet Package Manager->Package Manager Console and enter the following command:

    Update-Package -Reinstall
    

Note: The starter project may be using an older Face API package. In this case, you will need to run the following commands, as the original NuGet package may have been unpublished:

```powershell
Install-Package Microsoft.ProjectOxford.Face -Version 1.4.0 
Install-Package Newtonsoft.Json -Version 10.0.1
Install-Package Microsoft.ProjectOxford.Common -Version 1.0.324.0
Update-Package Microsoft.ProjectOxford.Common -reinstall
```
  1. This will force the packages to be downloaded again. If prompted, click Yes to update the app.config file.

  2. Now, in Solution Explorer, right-click the Simulator project, and select Build.

  3. Ensure that your build generates no errors (view the Output and Error List windows, available under the View menu in Visual Studio).

Note: Only build the Simulator project; the other projects are not ready to be built just yet.

  1. Again, in Solution Explorer, right-click the Simulator project, and select Set as StartUp Project.

    Screenshot of Solution Explorer, with Simulator selected, and in the sub-menu, Set as StartUp Project selected.

  2. From the Debug menu, select Start Without Debugging.

    Screenshot of the Debug menu, with Start Without Debugging selected.

  3. When the vending machine simulator appears, select take picture at the bottom.

    The Vending Machine Simulator displays the ad for coconut water for $1.25.

  4. In the dialog that appears, navigate to the photos folder, C:\VendingMachines\Hands-on-lab\starter-project\Simulator\images\photos, pick the photo of either the man or the woman to upload, and select Open.

    Screenshot of Open window, expanded to the photos folder where two images display: female.jpeg, and male.jpeg.

  5. In a few moments, you should see the price change from $1.25 to whatever value the predictive model suggested.

    The Vending Machine Simulator now displays the coconut water price as $0.91.

  6. Try using the other photo or your own photo to see what prices are suggested.

  7. Click the X at the top right of the application to stop it.

Exercise 3: Implement purchasing

Duration: 15 minutes

In this exercise, you will create an in-memory table with a columnstore index in SQL DB that will be used to support purchase transactions while still allowing real-time analytics, and then implement the purchasing process in the vending machine simulator. Finally, you will run the simulator and purchase items.
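
The Simulator in the starter project already contains the purchasing code (you only wire it up with a connection string in Task 2); conceptually, each purchase becomes a row in the Transactions table you are about to create. The sketch below illustrates that write with plain ADO.NET, showing only the TransactionDate and PurchasePrice columns referenced later in this lab; the real table has additional columns, and the starter project uses an Entity Framework model (TransactionsModel) rather than this hand-rolled code.

```csharp
// Illustrative only (not the starter project's code): record a purchase in the
// Transactions table using plain ADO.NET. Assumes an ADO.NET connection string;
// only the TransactionDate and PurchasePrice columns referenced in this lab are shown.
using System;
using System.Data.SqlClient;

public static class PurchaseLoggerSketch
{
    public static void LogPurchase(string connectionString, decimal purchasePrice)
    {
        const string insertSql =
            "INSERT INTO Transactions (TransactionDate, PurchasePrice) " +
            "VALUES (@transactionDate, @purchasePrice);";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(insertSql, connection))
        {
            command.Parameters.AddWithValue("@transactionDate", DateTime.UtcNow);
            command.Parameters.AddWithValue("@purchasePrice", purchasePrice);

            connection.Open();
            // TransactionId is an IDENTITY column, so SQL DB generates it for us.
            command.ExecuteNonQuery();
        }
    }
}
```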

Task 1: Create the transactions table

  1. Switch to Visual Studio. In Solution Explorer, expand the SQL Scripts folder and open the file Create Table.sql.

    Screenshot of the Visual Studio Solution Explorer expanded to Create Table.sql.

  2. Replace TODO 1 with the following:

    --TODO: 1. TransactionId should be a Primary Key field with a nonclustered (b-tree) index
    TransactionId int IDENTITY NOT NULL PRIMARY KEY NONCLUSTERED,
    
  3. Replace TODO 2 with the following:

    --TODO: 2. This table should have a columnar index
    INDEX Transactions_CCI CLUSTERED COLUMNSTORE
    
  4. Replace TODO 3 with the following:

    --TODO: 3. This should be an in-memory table
    MEMORY_OPTIMIZED = ON
    
  5. Replace TODO 4 with the following:

    --TODO: 4. In-memory tables should auto-elevate their transaction level to Snapshot
    ALTER DATABASE CURRENT SET MEMORY_OPTIMIZED_ELEVATE_TO_SNAPSHOT=ON;
    
  6. Save the script.

  7. Execute the script by pressing the play icon.

    Visual Studio execute SQL script button

  8. In the Connect window, expand Azure, and if prompted, sign in with your Azure credentials.

  9. From the Azure node, select the database you created for the vending database.

    Screenshot of the Connect window. Azure is expanded, and vending (vendingmachineslab) is highlighted.

  10. In the fields at the bottom, enter your user name (demouser) and password (Password.1!!) for the SQL Server, and select Connect. The script should run successfully.

    Screenshot of the Query executed successfully message.

Task 2: Configure the Simulator

  1. In the Azure portal, navigate to the vending SQL database and select Show database connection strings near the top of the Overview blade.

    Screenshot of the Azure portal, Database section. Under Connection strings, the link to Show database connection strings is highlighted.

  2. Copy the connection string on the ADO.NET tab of the Database connection string blade, and paste the value into a text editor, such as Notepad, for later reference.

    Screenshot of the ADO.NET tab, with the connection string selected, and the copy button highlighted.

  3. In the Simulator project, open App.config.

  4. Within the connectionString section, set the following:

    • TransactionsModel: Set the value of the connectionString attribute to the ADO.NET connection string for your SQL DB instance. Do not forget to replace the username and password placeholders with your actual credentials:

      • User name: demouser

      • Password: Password.1!!

  5. Save the App.config.

    The App.config file is shown in Visual Studio with the updated connection string to the database.

Task 3: Test purchasing

  1. In Solution Explorer, right-click the Simulator project, and select Build.

Note: You may need to ensure that the previous instance you started has been closed before rebuilding. Also, be sure you are only building the Simulator project and that it compiles with no errors.

  1. From the Debug menu, select Start Without Debugging.

  2. In the running Simulator application, select buy.

    The Vending Machine Simulator displays the ad for coconut water for $1.25, with a buy button.

  3. You should see a confirmation dialog similar to the following:

    Screenshot of the Purchase Complete dialog box.

  4. Close the Vending Machine simulator application window.

Exercise 4: Implement device command and control

Duration: 45 minutes

In this exercise, you will implement the ability to push new promotions to the vending machine simulator using the command and control features of IoT Hub. You will update the simulator to listen for these messages. You will also update the console application DeviceControlConsole to send selected promotions.

Task 1: Provision IoT Hub

In these steps, you will provision an instance of IoT Hub.

  1. In your browser, navigate to the Azure portal. Select +Create a resource in the navigation pane, enter "iot" into the Search the Marketplace box, select IoT Hub from the results, and then select Create.

    +Create a resource is highlighted in the navigation pane of the Azure portal, and "iot" is entered into the Search the Marketplace box.

  2. On the IoT Hub blade Basics tab, enter the following:

    • Subscription: Select the subscription you are using for this hands-on lab.

    • Resource group: Choose Use existing and select the hands-on-lab-SUFFIX resource group.

    • Region: Select the location you are using for this hands-on lab.

    • IoT Hub Name: Enter a unique name, such as vendingmachineshubSUFFIX.

      The Basics blade for IoT Hub is displayed, with the values specified above entered into the appropriate fields.

    • Select Next: Size and Scale.

    • On the Size and scale blade, accept the default Pricing and scale tier of S1: Standard tier, and select Review + create.

    • Select Create on the Review + create blade.

  3. When the IoT Hub deployment is completed, you will receive a notification in the Azure portal. Select Go to resource in the notification.

    Screenshot of the Deployment succeeded message, with the Go to resource button highlighted.

  4. From the IoT Hub's Overview blade, select Shared access policies under Settings on the left-hand menu.

    Screenshot of the Overview blade, settings section. Under Settings, Shared access policies is highlighted.

  5. Select the iothubowner policy.

Note: If you did not wait for the resource to finish provisioning, you will not see any policies yet.

The Azure portal is shown with the iothubowner selected.

  1. In the iothubowner blade, select the Copy button to the right of the Connection string - primary key field. Paste the connection string value into a text editor, such as Notepad, as this will be needed later in this lab.

    Screenshot of the iothubowner blade. A callout points to the copy button to the right of the connection string - primary key field.

Task 2: Listen for control messages

  1. Within Visual Studio Solution Explorer, expand the Simulator project, and open the file MainWindow.xaml.cs.

  2. Scroll down to the ListenForControlMessages method.

    In the Visual Studio Solution Explorer window, ListenForControlMessages is highlighted.

  3. Uncomment the body of the while(true) loop. You can uncomment a block of code by selecting the code, then selecting the Uncomment button on the toolbar.

    Screenshot of the Solution Explorer toolbar, with the Uncomment button highlighted.

  4. Replace TODO 1 with the following:

    //TODO: 1. Receive messages intended for the device via the instance of _deviceClient.
    Microsoft.Azure.Devices.Client.Message receivedMessage = await _deviceClient.ReceiveAsync();
    
  5. Replace TODO 2 with the following:

    //TODO: 2. A null message may be received if the wait period expired, so ignore and call the receive operation again
    if (receivedMessage == null) continue;
    
  6. Replace TODO 3 with the following:

    //TODO: 3. Deserialize the received binary encoded JSON message into an instance of PromoPackage.
    string receivedJSON = Encoding.ASCII.GetString(receivedMessage.GetBytes());
    System.Diagnostics.Trace.TraceInformation("Received message: {0}", receivedJSON);
    PromoPackage promo = Newtonsoft.Json.JsonConvert.DeserializeObject<PromoPackage>(receivedJSON);
    
  7. Replace TODO 4 with the following:

    //TODO: 4. Acknowledge receipt of the message with IoT Hub
    await _deviceClient.CompleteAsync(receivedMessage);
    
  8. Save the file.

Task 3: Send control messages

  1. Within Visual Studio Solution Explorer, expand the DeviceControlConsole project, and open the file Program.cs.

    Screenshot of Visual Studio Solution Explorer, with DeviceControlConsole expanded, and Program.cs highlighted.

  2. Scroll down to the PushPromo method.

    PushPromo method

  3. Replace TODO 1 with the following:

    //TODO: 1. Create a Service Client instance provided the _IoTHubConnectionString
    _serviceClient = ServiceClient.CreateFromConnectionString(_IoTHubConnectionString);
    
  4. Replace TODO 2 with the following:

    //TODO: 2. Send the command
    await _serviceClient.SendAsync(deviceId, commandMessage);
    
  5. Save Program.cs.

Task 4: Configure the DeviceControlConsole and Simulator

  1. In DeviceControlConsole, open App.config.

    Screenshot of Visual Studio Solution Explorer, with DeviceControlConsole expanded, and App.config highlighted.

  2. Set the IoTHubServiceConnectionString appSetting to have a value of the connection string for the service policy to your IoT Hub (recall you can get this from the Azure Portal IoT Hub blade, Shared access policies, and then select the policy).

  3. Set the storageConnectionString appSetting to have the same connection string for your storage account that the App.config file in the Simulator project has.

  4. Save the file.

  5. Now, open the App.config file in the Simulator project.

  6. Set the IoTHubSenderConnectionString appSetting to have a value of the connection string for the device policy to your IoT Hub.

  7. Set the IoTHubManagerConnectionString appSetting to have a value of the connection string for the iothubowner policy to your IoT Hub.

  8. Save the file.

  9. Build the Simulator and DeviceControlConsole projects by pressing Ctrl+Shift+B.

  10. In Solution Explorer, right-click the top solution node VendingMachines and select Set StartUp Projects.

    Screenshot of the Solution Explorer sub-menu for Solution VendingMachines (3 projects). Set StartUp Projects is selected.

  11. In the dialog, select the Multiple startup projects option, and ensure that Action is set to Start for both DeviceControlConsole and Simulator projects.

    Screenshot of the VendingMachines solution Property Pages Dialog Box.

  12. Select OK.

  13. From the Debug menu, choose Start without Debugging.

  14. Wait for both the Simulator and the DeviceControlConsole to appear.

    Screenshot of the Vending Machine Simulator and DeviceControlConsole. The Vending Machine Simulator displays the coconut water ad.

  15. In the DeviceControlConsole, press 1 to push the promotion for Soda.

    Screenshot of the Vending Machine Simulator and DeviceControlConsole. This time, the Vending Machine Simulator displays the soda ad.

Note: If you get a DeviceNotFoundException, ensure that you entered the IoT connection strings properly.
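
A DeviceNotFoundException can also mean the device identity does not yet exist in your IoT Hub; the Simulator is expected to create it using the IoTHubManagerConnectionString, but for reference, the sketch below shows how a device identity can be registered with the Microsoft.Azure.Devices service SDK (the device id used here is a hypothetical example).

```csharp
// Illustrative only: create (or fetch) a device identity in IoT Hub using the
// service-side SDK. The device id "vendingmachine001" is a hypothetical example.
using System.Threading.Tasks;
using Microsoft.Azure.Devices;
using Microsoft.Azure.Devices.Common.Exceptions;

public static class DeviceRegistrationSketch
{
    public static async Task<Device> EnsureDeviceAsync(string iotHubOwnerConnectionString, string deviceId = "vendingmachine001")
    {
        var registryManager = RegistryManager.CreateFromConnectionString(iotHubOwnerConnectionString);

        try
        {
            // Create the device identity; IoT Hub generates its symmetric keys.
            return await registryManager.AddDeviceAsync(new Device(deviceId));
        }
        catch (DeviceAlreadyExistsException)
        {
            // The identity already exists, so just return it.
            return await registryManager.GetDeviceAsync(deviceId);
        }
    }
}
```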

  1. Observe that the entire promotion surface of the vending machine changes (product name, price, and image).

    Note: If the photo does not change, and after a few minutes you receive a DotNetty.Transport... error, you will need to delete and recreate your IoT Hub in the Azure portal. The error is caused by a communication error between the application and your IoT Hub. Be sure to update your App.config file with the new IoT hub connection strings.

  2. Experiment with sending the other promotion or toggling between promotions.

  3. Experiment with making purchases and sending photos to verify the other functions still work with the new promoted products.

Exercise 5: Analytics with Power BI Desktop

Duration: 15 minutes

In this exercise, you will use Power BI Desktop to query purchase data from the in-memory table of SQL DB and visualize the result.

Task 1: Build the query and create the visualization

  1. From the Start menu on your Lab VM, open Power BI Desktop. Log in using your Azure credentials.

    The Power BI Desktop link is shown.

Note: If Power BI Desktop is not installed, you can install it from https://powerbi.microsoft.com/en-us/desktop/.

  1. In the opening dialog, select Get Data.

    Screenshot of the Power BI Desktop opening dialog box, with the Get Data link.

  2. In the Get Data dialog, select Azure in the categories list and then Azure SQL Database.

    Screenshot of the Get Data dialog box.

  3. Select Connect.

  4. In the dialog, enter the name of your SQL Server (e.g., myserver.database.windows.net), the name of your vending database, and select the DirectQuery option. Select OK.

    Screenshot of the SQL Server Database dialog box.

  5. On the next screen, select the Database tab on the left, provide your SQL username (demouser) and password (Password.1!!), and then select Connect.

    Screenshot of the SQL Server Database dialog box.

  6. In the Navigator dialog, check the box next to Transactions.

    Screenshot of the Navigator dialog box. In the left pane, under Display options, the checkbox is selected for Transactions.

  7. Select Load.

  8. In the ribbon, select Edit Queries.

    The Power BI Desktop Edit Queries button is shown.

  9. In the Query Editor, select the TransactionDate column header to select the column.

    The TransactionDate column is shown.

  10. In the Ribbon, select the Add Column tab and select Time, Hour, Hour.

    Screenshot of the Query Editor. On the ribbon, the Add Column tab is selected. On the Add Column ribbon, Time is selected. From its sub-menu, Hour is selected, and from Hour's sub-menu, Hour is selected again.

  11. Select the TransactionDate column again.

  12. In the Ribbon, select Time, Minute.

    On the Add Column ribbon, Time is selected, and from its sub-menu, Minute is selected.

  13. Select the TransactionDate column one more time.

  14. In the ribbon, select Time, Second.

  15. In the ribbon, on the Home tab, select Close & Apply.

    Close and Apply is shown.

  16. In the message that appears, select Apply Changes.

    The Apply Changes message displays, informing you that there are pending changes in your queries that haven't been applied.

  17. In the Visualizations, select Stacked column chart.

    Screenshot of the Visualizations menu, with the stacked column chart icon selected.

  18. From the Fields list, drag the Minute field over to the axis property.

    Minute has been added under Axis.

  19. From the Fields list, drag the PurchasePrice over to the value property.

    The PurchasePrice has been added under Value.

  20. Your completed visualization summarizing the most profitable minutes in each hour should appear as follows:

    Screenshot of a stacked column chart detailing the purchase price by minute.

After the hands-on lab

Duration: 10 minutes

In this exercise, you will delete any Azure resources that were created in support of the lab. You should follow all steps provided after attending the Hands-on lab to ensure your account does not continue to be charged for lab resources.

Task 1: Delete the resource group

  1. Using the Azure portal, navigate to the Resource group you used throughout this hands-on lab by selecting Resource groups in the left menu.

  2. Search for the name of your resource group, and select it from the list.

  3. Select Delete in the command bar. Confirm the deletion by re-typing the Resource group name and selecting Delete.

You should follow all steps provided after attending the Hands-on lab.

Attribution

This content was originally posted here:
https://github.com/Microsoft/MCW-Intelligent-Vending-Machines

License

This content is licensed under the MIT License.

MIT License

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.