{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# ResNet50 Image Classification using ONNX and AzureML\n",
"\n",
"This example shows how to deploy the ResNet50 ONNX model as a web service using Azure Machine Learning services and the ONNX Runtime.\n",
"\n",
"## What is ONNX\n",
"ONNX is an open format for representing machine learning and deep learning models. ONNX enables open and interoperable AI by letting data scientists and developers use the tools of their choice, without lock-in, and with the flexibility to deploy to a variety of platforms. ONNX is developed and supported by a community of partners including Microsoft, Facebook, and Amazon. For more information, explore the [ONNX website](http://onnx.ai).\n",
"\n",
"## ResNet50 Details\n",
"ResNet50 classifies the major object in an input image into one of 1000 pre-defined classes. More information about the ResNet50 model and how it was created can be found on the [ONNX Model Zoo GitHub](https://github.com/onnx/models/tree/master/vision/classification/resnet)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"\n",
"To make the best use of your time, make sure you have done the following:\n",
"\n",
"* Understand the [architecture and terms](https://docs.microsoft.com/azure/machine-learning/service/concept-azure-machine-learning-architecture) introduced by Azure Machine Learning\n",
"* If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) to:\n",
"    * install the AML SDK\n",
"    * create a workspace and its configuration file (config.json)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Check core SDK version number\n",
"import azureml.core\n",
"\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Download the pre-trained ONNX model from the ONNX Model Zoo\n",
"\n",
"Download the [ResNet50v2 model and test data](https://s3.amazonaws.com/onnx-model-zoo/resnet/resnet50v2/resnet50v2.tar.gz) and extract it in the same folder as this tutorial notebook.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import urllib.request\n",
"\n",
"onnx_model_url = \"https://s3.amazonaws.com/onnx-model-zoo/resnet/resnet50v2/resnet50v2.tar.gz\"\n",
"urllib.request.urlretrieve(onnx_model_url, filename=\"resnet50v2.tar.gz\")\n",
"\n",
"!tar xvzf resnet50v2.tar.gz"
]
},
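{
"cell_type": "markdown",
"metadata": {},
"source": [
"Optionally, you can run a quick local sanity check on the downloaded model with the ONNX Runtime before deploying it. The minimal sketch below assumes the archive extracted to a `resnet50v2` folder containing `resnet50v2.onnx` and that the `onnxruntime` package is installed locally; it only confirms the model loads and returns a vector of class scores for a random NCHW input."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional local sanity check (assumes ./resnet50v2/resnet50v2.onnx exists and onnxruntime is installed locally)\n",
"import numpy as np\n",
"import onnxruntime\n",
"\n",
"local_session = onnxruntime.InferenceSession(\"resnet50v2/resnet50v2.onnx\", None)\n",
"local_input_name = local_session.get_inputs()[0].name\n",
"\n",
"# run the model on a random NCHW tensor just to confirm it loads and produces class scores\n",
"dummy_input = np.random.rand(1, 3, 224, 224).astype(\"float32\")\n",
"dummy_output = local_session.run(None, {local_input_name: dummy_input})\n",
"print(\"Output shape:\", np.array(dummy_output[0]).shape)"
]
},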
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploying as a web service with Azure ML"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Load your Azure ML workspace\n",
"\n",
"We begin by instantiating a workspace object from the existing workspace created earlier in the configuration notebook."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.location, ws.resource_group, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Register your model with Azure ML\n",
"\n",
"Now we upload the model and register it in the workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.model import Model\n",
"\n",
"model = Model.register(model_path = \"resnet50v2/resnet50v2.onnx\",\n",
"                       model_name = \"resnet50v2\",\n",
"                       tags = {\"onnx\": \"demo\"},\n",
"                       description = \"ResNet50v2 from ONNX Model Zoo\",\n",
"                       workspace = ws)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Displaying your registered models\n",
"\n",
"You can optionally list out all the models that you have registered in this workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"models = ws.models\n",
"for name, m in models.items():\n",
"    print(\"Name:\", name, \"\\tVersion:\", m.version, \"\\tDescription:\", m.description, m.tags)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Write scoring file\n",
"\n",
"We are now going to deploy our ONNX model on Azure ML using the ONNX Runtime. We begin by writing a score.py file that will be invoked by the web service call. The `init()` function is called once when the container is started, so that is where we load the model into a global ONNX Runtime session object."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import json\n",
"import time\n",
"import sys\n",
"import os\n",
"import numpy as np # we're going to use numpy to process input and output data\n",
"import onnxruntime # to run inference on ONNX models, we use the ONNX Runtime\n",
"\n",
"def softmax(x):\n",
"    x = x.reshape(-1)\n",
"    e_x = np.exp(x - np.max(x))\n",
"    return e_x / e_x.sum(axis=0)\n",
"\n",
"def init():\n",
"    global session\n",
"    # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
"    # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
"    # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
"    model = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'resnet50v2.onnx')\n",
"    session = onnxruntime.InferenceSession(model, None)\n",
"\n",
"def preprocess(input_data_json):\n",
"    # convert the JSON data into the tensor input\n",
"    img_data = np.array(json.loads(input_data_json)['data']).astype('float32')\n",
"\n",
"    # normalize with the ImageNet per-channel mean and stddev\n",
"    mean_vec = np.array([0.485, 0.456, 0.406])\n",
"    stddev_vec = np.array([0.229, 0.224, 0.225])\n",
"    norm_img_data = np.zeros(img_data.shape).astype('float32')\n",
"    for i in range(img_data.shape[0]):\n",
"        norm_img_data[i,:,:] = (img_data[i,:,:]/255 - mean_vec[i]) / stddev_vec[i]\n",
"\n",
"    return norm_img_data\n",
"\n",
"def postprocess(result):\n",
"    return softmax(np.array(result)).tolist()\n",
"\n",
"def run(input_data_json):\n",
"    try:\n",
"        start = time.time()\n",
"        # load in our data, which is expected as an NCHW 224x224 image\n",
"        input_data = preprocess(input_data_json)\n",
"        input_name = session.get_inputs()[0].name # get the id of the first input of the model\n",
"        result = session.run([], {input_name: input_data})\n",
"        end = time.time() # stop timer\n",
"        return {\"result\": postprocess(result),\n",
"                \"time\": end - start}\n",
"    except Exception as e:\n",
"        result = str(e)\n",
"        return {\"error\": result}"
]
},
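{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before creating the deployment, you can optionally exercise `score.py` locally. This is only a sketch under assumptions: it points the `AZUREML_MODEL_DIR` environment variable at the extracted `resnet50v2` folder (deployment sets this variable for you) and feeds a random NCHW tensor, so the returned scores are meaningless; it simply verifies that `init()` and `run()` execute end to end on this machine."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Optional: exercise score.py locally before deploying (assumes ./resnet50v2/resnet50v2.onnx and a local onnxruntime install)\n",
"import os\n",
"import json\n",
"import numpy as np\n",
"\n",
"# deployment sets AZUREML_MODEL_DIR for us; locally we point it at the extracted model folder\n",
"os.environ[\"AZUREML_MODEL_DIR\"] = \"resnet50v2\"\n",
"\n",
"import score\n",
"score.init()\n",
"\n",
"# random NCHW input just to confirm init() and run() execute end to end\n",
"local_response = score.run(json.dumps({\"data\": np.random.rand(1, 3, 224, 224).tolist()}))\n",
"if \"error\" in local_response:\n",
"    print(\"Local scoring error:\", local_response[\"error\"])\n",
"else:\n",
"    print(\"Local scoring time (s):\", local_response[\"time\"])"
]
},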
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create inference configuration"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"First, we create a YAML file that specifies the dependencies we need in our container."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"\n",
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime==1.15.1\", \"azureml-core\", \"azureml-defaults\"])\n",
"\n",
"with open(\"myenv.yml\", \"w\") as f:\n",
"    f.write(myenv.serialize_to_string())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Create the inference configuration object. Please note that you must include azureml-defaults with version >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.environment import Environment\n",
"\n",
"\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploy the model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1,\n",
"                                               memory_gb = 1,\n",
"                                               tags = {'demo': 'onnx'},\n",
"                                               description = 'web service for ResNet50 ONNX model')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The following cell will likely take a few minutes to run."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from random import randint\n",
"\n",
"aci_service_name = 'onnx-demo-resnet50'+str(randint(0,100))\n",
"print(\"Service\", aci_service_name)\n",
"aci_service = Model.deploy(ws, aci_service_name, [model], inference_config, aciconfig)\n",
"aci_service.wait_for_deployment(True)\n",
"print(aci_service.state)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If the deployment fails, you can check the logs. Make sure to delete your `aci_service` before trying again."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"if aci_service.state != 'Healthy':\n",
"    # run this command for debugging.\n",
"    print(aci_service.get_logs())\n",
"    aci_service.delete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Success!\n",
"\n",
"If you've made it this far, you've deployed a working web service that does image classification using an ONNX model. You can get the URL for the web service with the code below."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(aci_service.scoring_uri)"
]
},
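{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can also send the service a test request. The sketch below is a minimal example, not part of the original walkthrough: it posts a random NCHW tensor in the JSON format that `score.py` expects (a `data` field), so the predicted class is meaningless; substitute a preprocessed 224x224 ImageNet-style image for a real prediction. It assumes the `aci_service` object from the deployment cell above is still available."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Call the deployed service with a random image-shaped tensor (replace with a real preprocessed image)\n",
"import json\n",
"import numpy as np\n",
"\n",
"sample_input = json.dumps({\"data\": np.random.rand(1, 3, 224, 224).tolist()})\n",
"response = aci_service.run(input_data=sample_input)\n",
"\n",
"if \"error\" in response:\n",
"    print(\"Scoring error:\", response[\"error\"])\n",
"else:\n",
"    print(\"Inference time (s):\", response[\"time\"])\n",
"    print(\"Top class index:\", int(np.argmax(response[\"result\"])))"
]
},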
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When you are eventually done using the web service, remember to delete it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aci_service.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "viswamy"
}
],
"category": "deployment",
"compute": [
"Local"
],
"datasets": [
"ImageNet"
],
"deployment": [
"Azure Container Instance"
],
"exclude_from_index": false,
"framework": [
"ONNX"
],
"friendly_name": "Deploy ResNet50 with ONNX Runtime",
"index_order": 4,
"kernelspec": {
"display_name": "Python 3.8 - AzureML",
"language": "python",
"name": "python38-azureml"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.5"
},
"star_tag": [],
"tags": [
"ONNX Model Zoo"
],
"task": "Image Classification"
},
"nbformat": 4,
"nbformat_minor": 2
}