Mirror of https://github.com/Azure/MachineLearningNotebooks.git, synced 2025-12-21 18:15:13 -05:00

Compare commits: azureml-sd...azureml-sd (12 commits)
Commits:

- 9662505517
- 8e103c02ff
- ecb5157add
- d7d23d5e7c
- 83a21ba53a
- 3c9cb89c1a
- cca7c2e26f
- e895d7c2bf
- 3588eb9665
- a09e726f31
- 4fb1d9ee5b
- b05ff80e9d
@@ -2,7 +2,7 @@

This repository contains example notebooks demonstrating the [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning-service/) Python SDK, which allows you to build, train, deploy, and manage machine learning solutions using Azure. The AML SDK gives you the choice of using local or cloud compute resources while managing and maintaining the complete data science workflow from the cloud.




## Quick installation
@@ -17,11 +17,11 @@ This [index](.index.md) should assist in navigating the Azure Machine Learning n

If you want to...

* ...try out and explore Azure ML, start with image classification tutorials: [Part 1 (Training)](./tutorials/img-classification-part1-training.ipynb) and [Part 2 (Deployment)](./tutorials/img-classification-part2-deploy.ipynb).
* ...try out and explore Azure ML, start with image classification tutorials: [Part 1 (Training)](./tutorials/image-classification-mnist-data/img-classification-part1-training.ipynb) and [Part 2 (Deployment)](./tutorials/image-classification-mnist-data/img-classification-part2-deploy.ipynb).
* ...learn about experimentation and tracking run history, first [train within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then try [training on remote VM](./how-to-use-azureml/training/train-on-remote-vm/train-on-remote-vm.ipynb) and [using logging APIs](./how-to-use-azureml/training/logging-api/logging-api.ipynb).
* ...train deep learning models at scale, first learn about [Machine Learning Compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and then try [distributed hyperparameter tuning](./how-to-use-azureml/training-with-deep-learning/train-hyperparameter-tune-deploy-with-pytorch/train-hyperparameter-tune-deploy-with-pytorch.ipynb) and [distributed training](./how-to-use-azureml/training-with-deep-learning/distributed-pytorch-with-horovod/distributed-pytorch-with-horovod.ipynb).
* ...deploy models as a realtime scoring service, first learn the basics by [training within Notebook and deploying to Azure Container Instance](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then learn how to [register and manage models, and create Docker images](./how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb), and [production deploy models on Azure Kubernetes Cluster](./how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb).
* ...deploy models as a batch scoring service, first [train a model within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), learn how to [register and manage models](./how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb), then [create Machine Learning Compute for scoring compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and [use Machine Learning Pipelines to deploy your model](https://aka.ms/pl-batch-scoring).
* ...deploy models as a realtime scoring service, first learn the basics by [training within Notebook and deploying to Azure Container Instance](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then learn how to [production deploy models on Azure Kubernetes Cluster](./how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb).
* ...deploy models as a batch scoring service, first [train a model within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then [create Machine Learning Compute for scoring compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and [use Machine Learning Pipelines to deploy your model](https://aka.ms/pl-batch-scoring).
* ...monitor your deployed models, learn about using [App Insights](./how-to-use-azureml/deployment/enable-app-insights-in-production-service/enable-app-insights-in-production-service.ipynb).

## Tutorials
@@ -103,7 +103,7 @@

"source": [
"import azureml.core\n",
"\n",
"print(\"This notebook was created using version 1.0.83 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.1.0rc0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -9,7 +9,6 @@ As a pre-requisite, run the [configuration Notebook](../configuration.ipynb) not

* [train-on-amlcompute](./training/train-on-amlcompute): Use a 1-n node Azure ML managed compute cluster for remote runs on Azure CPU or GPU infrastructure.
* [train-on-remote-vm](./training/train-on-remote-vm): Use a Data Science Virtual Machine as a target for remote runs.
* [logging-api](./track-and-monitor-experiments/logging-api): Learn about the details of logging metrics to run history.
* [register-model-create-image-deploy-service](./deployment/register-model-create-image-deploy-service): Learn about the details of model management.
* [production-deploy-to-aks](./deployment/production-deploy-to-aks): Deploy a model to production at scale on Azure Kubernetes Service.
* [enable-app-insights-in-production-service](./deployment/enable-app-insights-in-production-service): Learn how to use App Insights with a production web service.
@@ -197,6 +197,17 @@ If automl_setup_linux.sh fails on Ubuntu Linux with the error: `unable to execut

4) Check that the region is one of the supported regions: `eastus2`, `eastus`, `westcentralus`, `southeastasia`, `westeurope`, `australiaeast`, `westus2`, `southcentralus`
5) Check that you have access to the region using the Azure Portal.

## import AutoMLConfig fails after upgrade from before 1.0.76 to 1.0.76 or later

There were package changes in automated machine learning version 1.0.76 which require the previous version to be uninstalled before upgrading to the new version.
If you manually upgraded from a version of automated machine learning older than 1.0.76 to 1.0.76 or later, you may get the error:
`ImportError: cannot import name 'AutoMLConfig'`

This can be resolved by running:
`pip uninstall azureml-train-automl` and then
`pip install azureml-train-automl`

The automl_setup.cmd script does this automatically.
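
After reinstalling, a quick sanity check from Python confirms that the import works and which SDK version is active (`azureml.core.VERSION` is the same attribute the sample notebooks print); a minimal sketch:

```python
# Sanity check after the reinstall: this import raises the ImportError
# above when the upgrade left stale packages behind.
import azureml.core
from azureml.train.automl import AutoMLConfig

print(azureml.core.VERSION)
```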
## workspace.from_config fails

If the call `ws = Workspace.from_config()` fails:
1) Make sure that you have run the `configuration.ipynb` notebook successfully.
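
If the notebook ran but the call still fails, pointing `from_config` at the config file explicitly can help isolate the problem. A minimal sketch, assuming the default `.azureml/config.json` location that `ws.write_config()` uses (adjust the path to wherever your `config.json` actually lives):

```python
from azureml.core import Workspace

# Load the workspace from an explicit config.json path instead of the
# directory search that from_config() performs by default.
ws = Workspace.from_config(path='.azureml/config.json')
print(ws.name, ws.resource_group, ws.location, sep='\n')
```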
@@ -2,7 +2,7 @@ name: azure_automl

dependencies:
# The python interpreter version.
# Currently Azure ML only supports 3.5.2 and later.
- pip
- pip<=19.3.1
- python>=3.5.2,<3.6.8
- nb_conda
- matplotlib==2.1.0
@@ -2,7 +2,7 @@ name: azure_automl

dependencies:
# The python interpreter version.
# Currently Azure ML only supports 3.5.2 and later.
- pip
- pip<=19.3.1
- nomkl
- python>=3.5.2,<3.6.8
- nb_conda
@@ -92,6 +92,32 @@

"from azureml.explain.model._internal.explanation_client import ExplanationClient"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Accessing the Azure ML workspace requires authentication with Azure.\n",
"\n",
"The default authentication is interactive authentication using the default tenant. Executing the `ws = Workspace.from_config()` line in the cell below will prompt for authentication the first time that it is run.\n",
"\n",
"If you have multiple Azure tenants, you can specify the tenant by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
"\n",
"```\n",
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"auth = InteractiveLoginAuthentication(tenant_id = 'mytenantid')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"\n",
"If you need to run in an environment where interactive login is not possible, you can use Service Principal authentication by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
"\n",
"```\n",
"from azureml.core.authentication import ServicePrincipalAuthentication\n",
"auth = ServicePrincipalAuthentication('mytenantid', 'myappid', 'mypassword')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"For more details, see [aka.ms/aml-notebook-auth](http://aka.ms/aml-notebook-auth)"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -2,3 +2,10 @@ name: auto-ml-classification-bank-marketing-all-features

dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - interpret
  - onnxruntime==1.0.0
  - azureml-explain-model
  - azureml-contrib-interpret
@@ -210,7 +210,6 @@

"automl_settings = {\n",
"    \"n_cross_validations\": 3,\n",
"    \"primary_metric\": 'average_precision_score_weighted',\n",
"    \"preprocess\": True,\n",
"    \"enable_early_stopping\": True,\n",
"    \"max_concurrent_iterations\": 2, # This is a limit for testing purposes; please increase it as per cluster size\n",
"    \"experiment_timeout_hours\": 0.2, # This is a time limit for testing purposes; remove it for real use cases, as it will drastically limit the ability to find the best model possible\n",
@@ -283,7 +282,11 @@

{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"widget-rundetails-sample"
]
},
"outputs": [],
"source": [
"from azureml.widgets import RunDetails\n",
@@ -2,3 +2,8 @@ name: auto-ml-classification-credit-card-fraud

dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - interpret
  - azureml-explain-model
@@ -275,7 +275,6 @@

"automl_settings = {\n",
"    \"experiment_timeout_minutes\": 20,\n",
"    \"primary_metric\": 'accuracy',\n",
"    \"preprocess\": True,\n",
"    \"max_concurrent_iterations\": 4,\n",
"    \"max_cores_per_iteration\": -1,\n",
"    \"enable_dnn\": True,\n",
@@ -2,3 +2,7 @@ name: auto-ml-classification-text-dnn

dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - azureml-train
@@ -350,7 +350,6 @@

"    \"experiment_timeout_hours\": 0.2,\n",
"    \"n_cross_validations\": 3,\n",
"    \"primary_metric\": 'r2_score',\n",
"    \"preprocess\": True,\n",
"    \"max_concurrent_iterations\": 3,\n",
"    \"max_cores_per_iteration\": -1,\n",
"    \"verbosity\": logging.INFO,\n",
@@ -2,3 +2,7 @@ name: auto-ml-continuous-retraining

dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - azureml-pipeline
@@ -1,4 +1,10 @@

name: auto-ml-forecasting-beer-remote
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - azureml-train
@@ -76,9 +76,12 @@ def get_result_df(remote_run):

def run_inference(test_experiment, compute_target, script_folder, train_run,
                  test_dataset, lookback_dataset, max_horizon,
                  target_column_name, time_column_name, freq):
    train_run.download_file('outputs/model.pkl', 'inference/model.pkl')
    train_run.download_file('outputs/conda_env_v_1_0_0.yml',
                            'inference/condafile.yml')
    model_base_name = 'model.pkl'
    if 'model_data_location' in train_run.properties:
        model_location = train_run.properties['model_data_location']
        _, model_base_name = model_location.rsplit('/', 1)
    train_run.download_file('outputs/{}'.format(model_base_name), 'inference/{}'.format(model_base_name))
    train_run.download_file('outputs/conda_env_v_1_0_0.yml', 'inference/condafile.yml')

    inference_env = Environment("myenv")
    inference_env.docker.enabled = True
@@ -91,7 +94,8 @@ def run_inference(test_experiment, compute_target, script_folder, train_run,

            '--max_horizon': max_horizon,
            '--target_column_name': target_column_name,
            '--time_column_name': time_column_name,
            '--frequency': freq
            '--frequency': freq,
            '--model_path': model_base_name
        },
        inputs=[test_dataset.as_named_input('test_data'),
                lookback_dataset.as_named_input('lookback_data')],
@@ -232,6 +232,9 @@ parser.add_argument(

parser.add_argument(
    '--frequency', type=str, dest='freq',
    help='Frequency of prediction')
parser.add_argument(
    '--model_path', type=str, dest='model_path',
    default='model.pkl', help='Filename of model to be loaded')

args = parser.parse_args()
@@ -239,6 +242,7 @@ max_horizon = args.max_horizon

target_column_name = args.target_column_name
time_column_name = args.time_column_name
freq = args.freq
model_path = args.model_path

print('args passed are: ')

@@ -246,6 +250,7 @@ print(max_horizon)

print(target_column_name)
print(time_column_name)
print(freq)
print(model_path)

run = Run.get_context()
# get input dataset by name
@@ -267,7 +272,8 @@ X_lookback_df = lookback_dataset.drop_columns(columns=[target_column_name])

y_lookback_df = lookback_dataset.with_timestamp_columns(
    None).keep_columns(columns=[target_column_name])

fitted_model = joblib.load('model.pkl')
fitted_model = joblib.load(model_path)

if hasattr(fitted_model, 'get_lookback'):
    lookback = fitted_model.get_lookback()
@@ -1,4 +1,9 @@

name: auto-ml-forecasting-bike-share
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
@@ -253,7 +253,7 @@

"source": [
"# split into train based on time\n",
"train = dataset.time_before(datetime(2017, 8, 8, 5), include_boundary=True)\n",
"train.to_pandas_dataframe().sort_values(time_column_name).tail(5).reset_index(drop=True)"
"train.to_pandas_dataframe().reset_index(drop=True).sort_values(time_column_name).tail(5)"
]
},
{

@@ -264,7 +264,7 @@

"source": [
"# split into test based on time\n",
"test = dataset.time_between(datetime(2017, 8, 8, 6), datetime(2017, 8, 10, 5))\n",
"test.to_pandas_dataframe().head(5).reset_index(drop=True)"
"test.to_pandas_dataframe().reset_index(drop=True).head(5)"
]
},
{
@@ -2,3 +2,9 @@ name: auto-ml-forecasting-energy-demand

dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - interpret
  - azureml-explain-model
  - azureml-contrib-interpret
@@ -1,551 +0,0 @@

{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Automated Machine Learning\n",
"\n",
"_**Forecasting with grouping using Pipelines**_\n",
"\n",
"## Contents\n",
"\n",
"1. [Introduction](#Introduction)\n",
"2. [Setup](#Setup)\n",
"3. [Data](#Data)\n",
"4. [Compute](#Compute)\n",
"5. [AutoMLConfig](#AutoMLConfig)\n",
"6. [Pipeline](#Pipeline)\n",
"7. [Train](#Train)\n",
"8. [Test](#Test)\n",
"\n",
"\n",
"## Introduction\n",
"In this example we use Automated ML and Pipelines to train, select, and operationalize forecasting models for multiple time-series.\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) first, if you haven't already, to establish your connection to the AzureML Workspace.\n",
"\n",
"In this notebook you will learn how to:\n",
"\n",
"* Create an Experiment in an existing Workspace.\n",
"* Configure AutoML using AutoMLConfig.\n",
"* Use our helper script to generate pipeline steps to split, train, and deploy the models.\n",
"* Explore the results.\n",
"* Test the models.\n",
"\n",
"It is advised that you ensure your cluster has at least one node per group.\n",
"\n",
"An Enterprise workspace is required for this notebook. To learn more about creating an Enterprise workspace or upgrading to one from the Azure portal, please visit our [Workspace page](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade).\n",
"\n",
"## Setup\n",
"As part of the setup you have already created an Azure ML `Workspace` object. For Automated ML you will need to create an `Experiment` object, which is a named object in a `Workspace` used to run experiments."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"import logging\n",
"import warnings\n",
"\n",
"import numpy as np\n",
"import pandas as pd\n",
"\n",
"import azureml.core\n",
"\n",
"from azureml.core.workspace import Workspace\n",
"from azureml.core.experiment import Experiment\n",
"from azureml.train.automl import AutoMLConfig"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Accessing the Azure ML workspace requires authentication with Azure.\n",
"\n",
"The default authentication is interactive authentication using the default tenant. Executing the `ws = Workspace.from_config()` line in the cell below will prompt for authentication the first time that it is run.\n",
"\n",
"If you have multiple Azure tenants, you can specify the tenant by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
"```\n",
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"auth = InteractiveLoginAuthentication(tenant_id = 'mytenantid')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"If you need to run in an environment where interactive login is not possible, you can use Service Principal authentication by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
"```\n",
"from azureml.core.authentication import ServicePrincipalAuthentication\n",
"auth = ServicePrincipalAuthentication('mytenantid', 'myappid', 'mypassword')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"For more details, see [aka.ms/aml-notebook-auth](http://aka.ms/aml-notebook-auth)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ws = Workspace.from_config()\n",
"ds = ws.get_default_datastore()\n",
"\n",
"# choose a name for the run history container in the workspace\n",
"experiment_name = 'automl-grouping-oj'\n",
"# project folder\n",
"project_folder = './sample_projects/{}'.format(experiment_name)\n",
"\n",
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
"output['Location'] = ws.location\n",
"output['Project Directory'] = project_folder\n",
"output['Run History Name'] = experiment_name\n",
"pd.set_option('display.max_colwidth', -1)\n",
"outputDf = pd.DataFrame(data = output, index = [''])\n",
"outputDf.T"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Data\n",
"Upload data to your default datastore and then load it as a `TabularDataset`"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.dataset import Dataset"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# upload training and test data to your default datastore\n",
"ds = ws.get_default_datastore()\n",
"ds.upload(src_dir='./data', target_path='groupdata', overwrite=True, show_progress=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# load data from your datastore\n",
"data = Dataset.Tabular.from_delimited_files(path=ds.path('groupdata/dominicks_OJ_2_5_8_train.csv'))\n",
"data_test = Dataset.Tabular.from_delimited_files(path=ds.path('groupdata/dominicks_OJ_2_5_8_test.csv'))\n",
"\n",
"data.take(5).to_pandas_dataframe()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Compute\n",
"\n",
"#### Create or Attach existing AmlCompute\n",
"\n",
"You will need to create a compute target for your automated ML run. In this tutorial, you create AmlCompute as your training compute resource.\n",
"#### Creation of AmlCompute takes approximately 5 minutes.\n",
"If an AmlCompute with that name is already in your workspace, this code will skip the creation process.\n",
"As with other Azure services, there are limits on certain resources (e.g. AmlCompute) associated with the Azure Machine Learning service. Please read this article on the default limits and how to request more quota."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-11\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
"    found = True\n",
"    print('Found existing compute target.')\n",
"    compute_target = cts[amlcompute_cluster_name]\n",
"\n",
"if not found:\n",
"    print('Creating a new compute target...')\n",
"    provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
"                                                                #vm_priority = 'lowpriority', # optional\n",
"                                                                max_nodes = 6)\n",
"\n",
"    # Create the cluster.\n",
"    compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
"\n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
"\n",
"# For a more detailed view of current AmlCompute status, use get_status()."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## AutoMLConfig\n",
"#### Create a base AutoMLConfig\n",
"This configuration will be used for all the groups in the pipeline."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"target_column = 'Quantity'\n",
"time_column_name = 'WeekStarting'\n",
"grain_column_names = ['Brand']\n",
"group_column_names = ['Store']\n",
"max_horizon = 20"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"automl_settings = {\n",
"    \"iteration_timeout_minutes\": 5,\n",
"    \"experiment_timeout_hours\": 0.25,\n",
"    \"primary_metric\": 'normalized_mean_absolute_error',\n",
"    \"time_column_name\": time_column_name,\n",
"    \"grain_column_names\": grain_column_names,\n",
"    \"max_horizon\": max_horizon,\n",
"    \"drop_column_names\": ['logQuantity'],\n",
"    \"max_concurrent_iterations\": 2,\n",
"    \"max_cores_per_iteration\": -1\n",
"}\n",
"base_configuration = AutoMLConfig(task = 'forecasting',\n",
"                                  path = project_folder,\n",
"                                  n_cross_validations=3,\n",
"                                  **automl_settings\n",
"                                  )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Pipeline\n",
"We've written a script to generate the individual pipeline steps used to create each automl step. Calling this script will return a list of PipelineSteps that will train multiple groups concurrently and then deploy these models.\n",
"\n",
"This step requires an Enterprise workspace. To learn more about creating an Enterprise workspace or upgrading to one from the Azure portal, please visit our [Workspace page](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade).\n",
"\n",
"### Call the method to build pipeline steps\n",
"\n",
"`build_pipeline_steps()` takes as input:\n",
"* **automlconfig**: The configuration used for every automl step\n",
"* **df**: The dataset to be used for training\n",
"* **target_column**: The target column of the dataset\n",
"* **compute_target**: The compute to be used for training\n",
"* **deploy**: Whether to deploy the models after training; if set to `True`, an extra step is added to deploy a webservice with all the models (default is `True`)\n",
"* **service_name**: The service name for the model query endpoint\n",
"* **time_column_name**: The time column of the data"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.webservice import Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"service_name = 'grouped-model'\n",
"try:\n",
"    # ACI service names must be unique within a subscription, so delete\n",
"    # any existing service with this name before deploying a new one.\n",
"    service = Webservice(ws, name=service_name)\n",
"    if service:\n",
"        service.delete()\n",
"except WebserviceException as e:\n",
"    pass"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from build import build_pipeline_steps\n",
"\n",
"steps = build_pipeline_steps(\n",
"    base_configuration,\n",
"    data,\n",
"    target_column,\n",
"    compute_target,\n",
"    group_column_names=group_column_names,\n",
"    deploy=True,\n",
"    service_name=service_name,\n",
"    time_column_name=time_column_name\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Train\n",
"Use the list of steps generated above to build the pipeline and submit it to your compute for remote training."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core import Pipeline\n",
"pipeline = Pipeline(\n",
"    description=\"A pipeline with one model per data group using Automated ML.\",\n",
"    workspace=ws,\n",
"    steps=steps)\n",
"\n",
"pipeline_run = experiment.submit(pipeline)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.widgets import RunDetails\n",
"RunDetails(pipeline_run).show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run.wait_for_completion(show_output=False)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Test\n",
"\n",
"Now we can use the holdout set to test our models and ensure our web service is running as expected."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice\n",
"service = AciWebservice(ws, service_name)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_test = data_test.to_pandas_dataframe()\n",
"# Drop the column we are trying to predict (target column)\n",
"x_pred = X_test.drop(target_column, inplace=False, axis=1)\n",
"x_pred.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Get Predictions\n",
"test_sample = X_test.drop(target_column, inplace=False, axis=1).to_json()\n",
"predictions = service.run(input_data=test_sample)\n",
"print(predictions)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Convert predictions from JSON to DataFrame\n",
"pred_dict = json.loads(predictions)\n",
"X_pred = pd.read_json(pred_dict['predictions'])\n",
"X_pred.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Fix the index\n",
"PRED = 'pred_target'\n",
"X_pred[time_column_name] = pd.to_datetime(X_pred[time_column_name], unit='ms')\n",
"\n",
"X_pred.set_index([time_column_name] + grain_column_names, inplace=True, drop=True)\n",
"X_pred.rename({'_automl_target_col': PRED}, inplace=True, axis=1)\n",
"# Drop all but the target column and index\n",
"X_pred.drop(list(set(X_pred.columns.values).difference({PRED})), axis=1, inplace=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_test[time_column_name] = pd.to_datetime(X_test[time_column_name])\n",
"X_test.set_index([time_column_name] + grain_column_names, inplace=True, drop=True)\n",
"# Merge predictions with raw features\n",
"pred_test = X_test.merge(X_pred, left_index=True, right_index=True)\n",
"pred_test.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.metrics import mean_absolute_error, mean_squared_error\n",
"def MAPE(actual, pred):\n",
"    \"\"\"\n",
"    Calculate mean absolute percentage error.\n",
"    Remove NA and values where actual is close to zero\n",
"    \"\"\"\n",
"    not_na = ~(np.isnan(actual) | np.isnan(pred))\n",
"    not_zero = ~np.isclose(actual, 0.0)\n",
"    actual_safe = actual[not_na & not_zero]\n",
"    pred_safe = pred[not_na & not_zero]\n",
"    APE = 100*np.abs((actual_safe - pred_safe)/actual_safe)\n",
"    return np.mean(APE)\n",
"\n",
"def get_metrics(actuals, preds):\n",
"    return pd.Series(\n",
"        {\n",
"            \"RMSE\": np.sqrt(mean_squared_error(actuals, preds)),\n",
"            \"NormRMSE\": np.sqrt(mean_squared_error(actuals, preds))/np.abs(actuals.max()-actuals.min()),\n",
"            \"MAE\": mean_absolute_error(actuals, preds),\n",
"            \"MAPE\": MAPE(actuals, preds)},\n",
"        )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"get_metrics(pred_test[target_column].values, pred_test[PRED].values)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"authors": [
{
"name": "alyerman"
}
],
"category": "other",
"compute": [
"AML Compute"
],
"datasets": [
"Orange Juice Sales"
],
"deployment": [
"Azure Container Instance"
],
"exclude_from_index": false,
"framework": [
"Scikit-learn",
"Pytorch"
],
"friendly_name": "Automated ML Grouping with Pipeline.",
"index_order": 10,
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.6"
},
"tags": [
"AutomatedML"
],
"task": "Use AzureML Pipeline to trigger multiple Automated ML runs."
},
"nbformat": 4,
"nbformat_minor": 2
}
@@ -1,4 +0,0 @@

name: auto-ml-forecasting-grouping
dependencies:
- pip:
  - azureml-sdk
@@ -1,144 +0,0 @@

from typing import List, Dict
import copy
import json
import pandas as pd
import re

from azureml.core import RunConfiguration
from azureml.core.compute import ComputeTarget
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.dataset import Dataset
from azureml.data import TabularDataset
from azureml.pipeline.core import PipelineData, PipelineParameter, TrainingOutput, StepSequence
from azureml.pipeline.steps import PythonScriptStep
from azureml.train.automl import AutoMLConfig
from azureml.train.automl.runtime import AutoMLStep


def _get_groups(data: Dataset, group_column_names: List[str]) -> pd.DataFrame:
    return data._dataflow.distinct(columns=group_column_names)\
        .keep_columns(columns=group_column_names).to_pandas_dataframe()


def _get_configs(automlconfig: AutoMLConfig,
                 data: Dataset,
                 target_column: str,
                 compute_target: ComputeTarget,
                 group_column_names: List[str]) -> Dict[str, AutoMLConfig]:
    # remove invalid characters regex
    valid_chars = re.compile('[^a-zA-Z0-9-]')
    groups = _get_groups(data, group_column_names)
    configs = {}
    for i, group in groups.iterrows():
        single = data._dataflow
        group_name = "#####".join(str(x) for x in group.values)
        group_name = valid_chars.sub('', group_name)
        for key in group.index:
            single = single.filter(data._dataflow[key] == group[key])
        t_dataset = TabularDataset._create(single)
        group_conf = copy.deepcopy(automlconfig)
        group_conf.user_settings['training_data'] = t_dataset
        group_conf.user_settings['label_column_name'] = target_column
        group_conf.user_settings['compute_target'] = compute_target
        configs[group_name] = group_conf
    return configs


def build_pipeline_steps(automlconfig: AutoMLConfig,
                         data: Dataset,
                         target_column: str,
                         compute_target: ComputeTarget,
                         group_column_names: list,
                         time_column_name: str,
                         deploy: bool,
                         service_name: str = 'grouping-demo') -> StepSequence:
    steps = []

    metrics_output_name = 'metrics_{}'
    best_model_output_name = 'best_model_{}'
    count = 0
    model_names = []

    # get all automl configs by group
    configs = _get_configs(automlconfig, data, target_column, compute_target, group_column_names)

    # build a runconfig for register model
    register_config = RunConfiguration()
    cd = CondaDependencies()
    cd.add_pip_package('azureml-pipeline')
    register_config.environment.python.conda_dependencies = cd

    # create each automl step end-to-end (train, register)
    for group_name, conf in configs.items():
        # create automl metrics output
        metrics_data = PipelineData(
            name='metrics_data_{}'.format(group_name),
            pipeline_output_name=metrics_output_name.format(group_name),
            training_output=TrainingOutput(type='Metrics'))
        # create automl model output
        model_data = PipelineData(
            name='model_data_{}'.format(group_name),
            pipeline_output_name=best_model_output_name.format(group_name),
            training_output=TrainingOutput(type='Model', metric=conf.user_settings['primary_metric']))

        automl_step = AutoMLStep(
            name='automl_{}'.format(group_name),
            automl_config=conf,
            outputs=[metrics_data, model_data],
            allow_reuse=True)
        steps.append(automl_step)

        # pass the group name as a parameter to the register step ->
        # this will become the name of the model for this group.
        group_name_param = PipelineParameter("group_name_{}".format(count), default_value=group_name)
        count += 1

        reg_model_step = PythonScriptStep(
            'register.py',
            name='register_{}'.format(group_name),
            arguments=["--model_name", group_name_param, "--model_path", model_data],
            inputs=[model_data],
            compute_target=compute_target,
            runconfig=register_config,
            source_directory="register",
            allow_reuse=True
        )
        steps.append(reg_model_step)
        model_names.append(group_name)

    final_steps = steps
    if deploy:
        # modify the conda dependencies to ensure we pick up correct
        # versions of azureml-defaults and azureml-train-automl
        cd = CondaDependencies.create(pip_packages=['azureml-defaults', 'azureml-train-automl'])
        automl_deps = CondaDependencies(conda_dependencies_file_path='deploy/myenv.yml')
        cd._merge_dependencies(automl_deps)
        cd.save('deploy/myenv.yml')

        # add deployment step
        pp_group_column_names = PipelineParameter(
            "group_column_names",
            default_value="#####".join(list(reversed(group_column_names))))

        pp_model_names = PipelineParameter(
            "model_names",
            default_value=json.dumps(model_names))

        pp_service_name = PipelineParameter(
            "service_name",
            default_value=service_name)

        deployment_step = PythonScriptStep(
            'deploy.py',
            name='service_deploy',
            arguments=["--group_column_names", pp_group_column_names,
                       "--model_names", pp_model_names,
                       "--service_name", pp_service_name,
                       "--time_column_name", time_column_name],
            compute_target=compute_target,
            runconfig=RunConfiguration(),
            source_directory="deploy"
        )
        final_steps = StepSequence(steps=[steps, deployment_step])

    return final_steps
@@ -1,61 +0,0 @@

WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
1992-08-20,2,minute.maid,23488,10.06424493,1,1.94,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,2,tropicana,13376,9.501217335,1,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,tropicana,8128,9.00307017,0,2.75,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,minute.maid,19008,9.852615222,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,dominicks,9024,9.107642974,0,1.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,tropicana,19456,9.875910785,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,minute.maid,11584,9.357380115,0,1.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,dominicks,2048,7.624618986000001,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,tropicana,10048,9.215128888999999,0,2.64,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,minute.maid,26752,10.19436452,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,dominicks,1984,7.592870287999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,tropicana,6336,8.754002933999999,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,minute.maid,3904,8.269756948,0,2.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,dominicks,4160,8.333270353,0,1.77,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,tropicana,16192,9.692272572,1,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,minute.maid,3712,8.219326094,0,2.67,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,dominicks,35264,10.47061789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,dominicks,8640,9.064157862,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,minute.maid,41216,10.62658181,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,tropicana,5824,8.66974259,0,2.97,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,5,tropicana,17728,9.78290059,1,2.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,5,minute.maid,27072,10.20625526,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,tropicana,9600,9.169518378,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,minute.maid,3840,8.253227646000001,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,dominicks,1856,7.526178913,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,tropicana,25664,10.15284451,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,minute.maid,6144,8.723231275,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,dominicks,3712,8.219326094,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,tropicana,9984,9.208739091,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,dominicks,2688,7.896552702,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,minute.maid,36416,10.50276352,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,tropicana,8576,9.056722882999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,minute.maid,5440,8.60153434,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,dominicks,6464,8.774003599999999,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,tropicana,13184,9.486759252,1,2.78,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,dominicks,40896,10.61878754,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,minute.maid,7680,8.946374826,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,dominicks,6144,8.723231275,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,minute.maid,50304,10.82583988,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,tropicana,7488,8.921057017999999,0,2.78,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,8,minute.maid,55552,10.9250748,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-20,8,tropicana,8576,9.056722882999999,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,tropicana,8000,8.987196821,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,minute.maid,18688,9.835636886,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,dominicks,19200,9.862665558,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,tropicana,21760,9.987828701,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,minute.maid,14656,9.592605087,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,dominicks,12800,9.45720045,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,tropicana,12800,9.45720045,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,minute.maid,30144,10.31374118,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,dominicks,15296,9.635346635,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,tropicana,10112,9.221478116,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,minute.maid,6208,8.733594062,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,dominicks,20992,9.951896692,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,tropicana,10304,9.240287448,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,minute.maid,7104,8.868413285,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,dominicks,73856,11.20987253,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,minute.maid,65856,11.09522582,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,dominicks,16192,9.692272572,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,tropicana,6400,8.764053269,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
@@ -1,973 +0,0 @@
|
||||
WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
|
||||
1990-06-14,2,dominicks,10560,9.264828557000001,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-06-14,2,minute.maid,4480,8.407378325,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-06-14,2,tropicana,8256,9.018695487999999,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-07-26,2,dominicks,8000,8.987196821,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-07-26,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-07-26,2,tropicana,6144,8.723231275,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-02,2,tropicana,3840,8.253227646000001,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-02,2,minute.maid,20160,9.911455722000001,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-02,2,dominicks,6848,8.831711918,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-09,2,dominicks,2880,7.965545572999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-09,2,minute.maid,2688,7.896552702,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-09,2,tropicana,8000,8.987196821,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-23,2,dominicks,1600,7.377758908,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-23,2,minute.maid,3008,8.009030685,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-23,2,tropicana,8896,9.093357017,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-30,2,tropicana,7168,8.877381955,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
|
||||
1990-08-30,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,dominicks,25344,10.140297300000002,1,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,dominicks,10752,9.282847063,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,minute.maid,2752,7.920083199,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,tropicana,10880,9.29468152,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,minute.maid,26176,10.17259824,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,dominicks,6656,8.803273982999999,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,tropicana,7744,8.954673629,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,dominicks,6592,8.793612072,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,minute.maid,3712,8.219326094,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,tropicana,8512,9.049232212,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,tropicana,5504,8.61323038,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,minute.maid,30656,10.33058368,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,dominicks,1728,7.454719948999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,tropicana,5888,8.68067166,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,minute.maid,3840,8.253227646000001,0,2.98,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,dominicks,33792,10.42797937,1,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,tropicana,8384,9.034080407000001,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,minute.maid,2816,7.943072717000001,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,dominicks,1920,7.560080465,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,tropicana,5952,8.691482577,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,minute.maid,23104,10.04776104,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,dominicks,8960,9.100525506,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,dominicks,11392,9.340666634,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,tropicana,6848,8.831711918,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,minute.maid,3392,8.129174997,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,tropicana,9216,9.128696383,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,minute.maid,26304,10.1774763,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,dominicks,28416,10.25470765,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,dominicks,17152,9.749870064,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,tropicana,12160,9.405907156,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,minute.maid,6336,8.754002933999999,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,tropicana,12672,9.447150114,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,minute.maid,9920,9.2023082,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,dominicks,26560,10.1871616,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,dominicks,6336,8.754002933999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,minute.maid,25280,10.13776885,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,tropicana,6528,8.783855897,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,dominicks,26368,10.17990643,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,minute.maid,14848,9.605620455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,tropicana,21120,9.957975738,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,minute.maid,12288,9.416378455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,dominicks,896,6.797940412999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,tropicana,12416,9.426741242,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,minute.maid,6272,8.743850562,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,dominicks,1472,7.294377299,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,tropicana,9472,9.156095357,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,minute.maid,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,dominicks,1344,7.2034055210000005,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,tropicana,17920,9.793672686,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,minute.maid,4160,8.333270353,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,dominicks,111680,11.62339292,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,tropicana,9408,9.14931567,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,minute.maid,10176,9.227787286,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,dominicks,1856,7.526178913,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,minute.maid,29056,10.27698028,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,dominicks,5568,8.624791202,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,tropicana,6912,8.841014311,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,minute.maid,7104,8.868413285,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,dominicks,32064,10.37548918,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,tropicana,16768,9.727227587,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,dominicks,4352,8.378390789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,minute.maid,7488,8.921057017999999,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,minute.maid,4224,8.348537825,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,minute.maid,8960,9.100525506,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,dominicks,13760,9.529521112000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,minute.maid,22464,10.01966931,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,dominicks,43328,10.67655436,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,minute.maid,3840,8.253227646000001,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,dominicks,57600,10.96127785,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,tropicana,7808,8.962904128,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,minute.maid,12992,9.472089062,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,minute.maid,70144,11.15830555,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,dominicks,6016,8.702177866,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,tropicana,42176,10.64960662,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,dominicks,10368,9.246479419,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,minute.maid,21248,9.964018052,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,dominicks,12608,9.442086812000001,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,minute.maid,5696,8.647519453,1,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,tropicana,4928,8.502688505,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,tropicana,29504,10.29228113,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,minute.maid,7680,8.946374826,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,dominicks,6336,8.754002933999999,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,tropicana,9984,9.208739091,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,minute.maid,6336,8.754002933999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,dominicks,140736,11.85464107,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,tropicana,35200,10.46880136,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,dominicks,960,6.866933285,1,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,minute.maid,8576,9.056722882999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,dominicks,1216,7.103322062999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,minute.maid,15104,9.622714887999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,tropicana,23936,10.08313888,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,tropicana,7104,8.868413285,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,minute.maid,76480,11.24478455,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,dominicks,1664,7.416979621,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,dominicks,4992,8.51559191,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,tropicana,24512,10.10691807,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,tropicana,6336,8.754002933999999,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,dominicks,27968,10.23881628,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,dominicks,12160,9.405907156,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,tropicana,33536,10.42037477,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,minute.maid,4032,8.30201781,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,dominicks,2240,7.714231145,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,dominicks,5504,8.61323038,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,minute.maid,14784,9.601300794,1,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,tropicana,13248,9.491601877,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,tropicana,6208,8.733594062,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,dominicks,8832,9.086136769,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,minute.maid,12096,9.400630097999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,dominicks,2624,7.87245515,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,minute.maid,41792,10.64046021,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,tropicana,10624,9.270870872,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,tropicana,44672,10.70710219,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,minute.maid,10560,9.264828557000001,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,dominicks,10432,9.252633284,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,tropicana,20096,9.908276069,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,dominicks,8320,9.026417534,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,minute.maid,4224,8.348537825,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,dominicks,6784,8.822322178,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,minute.maid,2880,7.965545572999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,tropicana,9152,9.121727714,1,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,tropicana,21952,9.996613531,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,minute.maid,3968,8.286017467999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,dominicks,60544,11.01112565,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,dominicks,20608,9.933434629,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,minute.maid,3712,8.219326094,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,tropicana,13568,9.515469357999999,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,tropicana,4160,8.333270353,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,minute.maid,2816,7.943072717000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,dominicks,16064,9.684336023,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,tropicana,39424,10.58213005,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,minute.maid,4288,8.363575702999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,dominicks,12480,9.431882642,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,tropicana,5632,8.636219898,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,minute.maid,18240,9.811372264,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,dominicks,17024,9.742379392,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,dominicks,13440,9.505990614,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,minute.maid,7360,8.903815212,0,1.95,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,tropicana,9024,9.107642974,1,2.68,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,tropicana,6016,8.702177866,0,3.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,minute.maid,7808,8.962904128,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,dominicks,10112,9.221478116,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,dominicks,9088,9.114710141,0,1.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,minute.maid,13504,9.510741217,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,tropicana,7744,8.954673629,0,3.14,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,tropicana,6784,8.822322178,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,dominicks,22848,10.03661887,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,minute.maid,10048,9.215128888999999,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,dominicks,6976,8.850230966,0,1.65,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,minute.maid,135936,11.81993947,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,tropicana,6784,8.822322178,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,tropicana,6272,8.743850562,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,dominicks,4160,8.333270353,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,tropicana,5312,8.577723691000001,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,minute.maid,27968,10.23881628,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,dominicks,3328,8.110126802,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,tropicana,9216,9.128696383,0,3.11,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,dominicks,12096,9.400630097999999,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,tropicana,7296,8.895081532,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,minute.maid,7808,8.962904128,0,2.14,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,dominicks,6208,8.733594062,0,1.76,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,tropicana,34240,10.44114983,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,minute.maid,12480,9.431882642,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,dominicks,3008,8.009030685,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,dominicks,19456,9.875910785,1,1.5,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,minute.maid,9664,9.17616292,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,tropicana,7168,8.877381955,0,2.64,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,minute.maid,7168,8.877381955,0,2.06,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,dominicks,16768,9.727227587,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,tropicana,6080,8.712759975,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,dominicks,13568,9.515469357999999,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,tropicana,5120,8.540909718,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,tropicana,8320,9.026417534,0,2.74,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,minute.maid,5952,8.691482577,0,2.22,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,dominicks,6080,8.712759975,0,1.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,dominicks,10432,9.252633284,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,minute.maid,21696,9.984883191,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,tropicana,17728,9.78290059,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,minute.maid,12032,9.395325046,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,dominicks,11712,9.368369236,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,tropicana,13120,9.481893063,0,2.35,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,dominicks,4032,8.30201781,0,1.76,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,minute.maid,7040,8.859363449,0,2.12,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,tropicana,13120,9.481893063,0,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,dominicks,6336,8.754002933999999,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,tropicana,9792,9.189321005,0,2.43,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,minute.maid,10240,9.234056899,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,tropicana,3520,8.166216269,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,minute.maid,6848,8.831711918,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,dominicks,13632,9.520175249,0,1.47,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,tropicana,5504,8.61323038,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,minute.maid,3968,8.286017467999999,0,2.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,dominicks,45120,10.71708089,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,tropicana,6720,8.812843434,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,minute.maid,5888,8.68067166,0,2.26,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,dominicks,9984,9.208739091,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,tropicana,20224,9.914625297,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,dominicks,4800,8.476371197,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,minute.maid,6208,8.733594062,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,dominicks,11776,9.373818841,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,minute.maid,72256,11.18797065,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,tropicana,5056,8.528330936,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,tropicana,43584,10.68244539,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,minute.maid,11520,9.351839934,0,2.11,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,dominicks,11584,9.357380115,0,1.54,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,tropicana,25728,10.15533517,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,minute.maid,5824,8.66974259,0,2.35,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,dominicks,51264,10.84474403,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,tropicana,31808,10.36747311,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,minute.maid,19392,9.872615889,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,dominicks,14976,9.614204199,0,1.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,tropicana,20736,9.939626599,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,minute.maid,9536,9.162829389,0,2.1,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,dominicks,30784,10.33475035,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,tropicana,15168,9.626943225,0,2.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,minute.maid,5312,8.577723691000001,0,2.28,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,dominicks,12480,9.431882642,0,1.6,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,tropicana,28096,10.2433825,1,2.5,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,dominicks,3264,8.090708716,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,minute.maid,14528,9.583833101,1,1.9,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,dominicks,8768,9.078864009,0,1.48,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,minute.maid,12416,9.426741242,0,2.12,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,tropicana,12416,9.426741242,0,2.58,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,tropicana,5376,8.589699882,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,minute.maid,5376,8.589699882,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,dominicks,70848,11.16829202,1,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,tropicana,9792,9.189321005,0,2.67,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,minute.maid,19008,9.852615222,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,dominicks,18560,9.828764006,0,1.42,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,tropicana,16960,9.738612909,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,minute.maid,3904,8.269756948,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,dominicks,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,tropicana,8320,9.026417534,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,minute.maid,6336,8.754002933999999,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,dominicks,9600,9.169518378,0,2.0,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,tropicana,6912,8.841014311,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,minute.maid,5440,8.60153434,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,dominicks,4800,8.476371197,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,tropicana,6976,8.850230966,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,minute.maid,22400,10.01681624,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,dominicks,9664,9.17616292,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,minute.maid,3968,8.286017467999999,0,2.84,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,tropicana,7232,8.886270902,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,dominicks,45568,10.726961,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,tropicana,51520,10.84972536,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,minute.maid,3264,8.090708716,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,dominicks,20992,9.951896692,0,1.74,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,minute.maid,4352,8.378390789,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,tropicana,22272,10.01108556,0,2.21,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,dominicks,6592,8.793612072,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,dominicks,4992,8.51559191,0,2.05,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,minute.maid,4480,8.407378325,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,tropicana,46144,10.73952222,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,tropicana,4352,8.378390789,1,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,minute.maid,3840,8.253227646000001,0,2.52,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,dominicks,8064,8.99516499,0,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,tropicana,17280,9.757305042,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,minute.maid,13312,9.496421162999999,1,2.0,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,dominicks,7360,8.903815212,0,1.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,tropicana,5696,8.647519453,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,minute.maid,3776,8.236420527,1,2.33,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,dominicks,10048,9.215128888999999,0,1.4,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,tropicana,6848,8.831711918,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,dominicks,10112,9.221478116,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,minute.maid,4800,8.476371197,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,dominicks,9152,9.121727714,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,minute.maid,24960,10.12502982,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,tropicana,4416,8.392989587999999,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,tropicana,4672,8.449342525,0,3.16,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,minute.maid,4544,8.42156296,0,2.86,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,dominicks,36288,10.49924239,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,tropicana,7168,8.877381955,1,3.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,minute.maid,3968,8.286017467999999,1,2.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,dominicks,3776,8.236420527,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,tropicana,5056,8.528330936,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,dominicks,3328,8.110126802,0,1.97,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,minute.maid,49600,10.81174611,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,2,dominicks,13824,9.534161491,0,1.36,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,5,dominicks,1792,7.491087594,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,5,minute.maid,4224,8.348537825,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,5,tropicana,5888,8.68067166,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,minute.maid,4352,8.378390789,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,dominicks,2496,7.82244473,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,tropicana,6976,8.850230966,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,dominicks,2944,7.98752448,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,minute.maid,4928,8.502688505,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,tropicana,6528,8.783855897,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,dominicks,1024,6.931471806,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,minute.maid,31168,10.34714721,1,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,tropicana,4928,8.502688505,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,dominicks,4224,8.348537825,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,minute.maid,10048,9.215128888999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,tropicana,5312,8.577723691000001,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,minute.maid,21760,9.987828701,1,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,tropicana,5120,8.540909718,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,dominicks,4544,8.42156296,1,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,dominicks,1728,7.454719948999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,minute.maid,4544,8.42156296,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,tropicana,7936,8.979164649,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,tropicana,6080,8.712759975,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,minute.maid,52224,10.86329744,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,dominicks,1216,7.103322062999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,dominicks,1152,7.049254841000001,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,minute.maid,3584,8.184234774,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,tropicana,4160,8.333270353,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,minute.maid,5120,8.540909718,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,tropicana,5888,8.68067166,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,dominicks,30144,10.31374118,1,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,dominicks,8960,9.100525506,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,minute.maid,4416,8.392989587999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,tropicana,9536,9.162829389,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,tropicana,8320,9.026417534,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,dominicks,8192,9.010913347,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,minute.maid,30208,10.31586207,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,dominicks,6528,8.783855897,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,minute.maid,4160,8.333270353,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,tropicana,8000,8.987196821,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,dominicks,34688,10.45414909,1,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,minute.maid,4992,8.51559191,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,tropicana,5824,8.66974259,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,dominicks,4672,8.449342525,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,minute.maid,13952,9.543378146,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,tropicana,10624,9.270870872,1,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,tropicana,6656,8.803273982999999,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,dominicks,1088,6.992096427000001,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,minute.maid,47680,10.772267300000001,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,tropicana,5184,8.553332238,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,minute.maid,7616,8.938006577000001,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,dominicks,69440,11.14821835,1,1.24,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,tropicana,4928,8.502688505,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,minute.maid,8896,9.093357017,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,dominicks,1280,7.154615357000001,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,tropicana,5888,8.68067166,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,minute.maid,28544,10.25920204,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,dominicks,35456,10.47604777,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,tropicana,5312,8.577723691000001,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,dominicks,13824,9.534161491,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,minute.maid,5440,8.60153434,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,tropicana,9984,9.208739091,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,minute.maid,52416,10.86696717,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,dominicks,14208,9.561560465,0,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,tropicana,8448,9.041685006,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,dominicks,29312,10.28575227,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,minute.maid,11712,9.368369236,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,tropicana,10880,9.29468152,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,minute.maid,13952,9.543378146,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,dominicks,52992,10.87789624,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,dominicks,15680,9.660141293999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,minute.maid,36160,10.49570882,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,tropicana,5696,8.647519453,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,tropicana,5696,8.647519453,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,minute.maid,12864,9.462187991,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,dominicks,43520,10.68097588,1,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,tropicana,32384,10.38541975,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,minute.maid,22208,10.00820786,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,dominicks,3904,8.269756948,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,tropicana,10752,9.282847063,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,minute.maid,9984,9.208739091,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,dominicks,896,6.797940412999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,tropicana,6912,8.841014311,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,minute.maid,14016,9.547954812999999,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,dominicks,2240,7.714231145,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,tropicana,13440,9.505990614,0,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,minute.maid,6080,8.712759975,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,dominicks,125760,11.74213061,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,tropicana,7808,8.962904128,0,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,minute.maid,7808,8.962904128,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,dominicks,1408,7.249925537,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,tropicana,5248,8.565602331000001,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,minute.maid,40896,10.61878754,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,dominicks,7232,8.886270902,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,tropicana,6208,8.733594062,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,minute.maid,6272,8.743850562,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,dominicks,41216,10.62658181,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,tropicana,21440,9.973013615,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,minute.maid,7872,8.971067439,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,dominicks,9024,9.107642974,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,dominicks,1600,7.377758908,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,tropicana,7360,8.903815212,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,minute.maid,6144,8.723231275,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,tropicana,6720,8.812843434,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,minute.maid,8448,9.041685006,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,dominicks,2496,7.82244473,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,tropicana,6656,8.803273982999999,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,minute.maid,18688,9.835636886,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,dominicks,6336,8.754002933999999,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,tropicana,6016,8.702177866,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,minute.maid,6272,8.743850562,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,dominicks,56384,10.93994071,1,1.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,tropicana,6144,8.723231275,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,minute.maid,12096,9.400630097999999,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,dominicks,1600,7.377758908,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,tropicana,4928,8.502688505,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,minute.maid,73216,11.20116926,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,dominicks,2944,7.98752448,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,tropicana,67712,11.1230187,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,minute.maid,18944,9.849242538,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,dominicks,13504,9.510741217,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,dominicks,5376,8.589699882,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,tropicana,8640,9.064157862,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,minute.maid,6400,8.764053269,1,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,tropicana,35520,10.477851199999998,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,minute.maid,8640,9.064157862,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,dominicks,6656,8.803273982999999,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,tropicana,9664,9.17616292,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,minute.maid,7296,8.895081532,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,dominicks,95680,11.46876457,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,tropicana,49088,10.80136989,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,minute.maid,12480,9.431882642,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,dominicks,896,6.797940412999999,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,dominicks,1728,7.454719948999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,minute.maid,14144,9.557045785,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,tropicana,14912,9.609921537,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,minute.maid,88256,11.38799696,1,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,tropicana,6464,8.774003599999999,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,dominicks,1280,7.154615357000001,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,dominicks,5696,8.647519453,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,minute.maid,6848,8.831711918,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,tropicana,25024,10.12759064,1,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,minute.maid,7808,8.962904128,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,tropicana,6272,8.743850562,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,dominicks,28288,10.25019297,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,dominicks,4864,8.489616424,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,minute.maid,6272,8.743850562,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,tropicana,5056,8.528330936,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,minute.maid,6144,8.723231275,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,tropicana,47616,10.77092412,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,dominicks,2880,7.965545572999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,dominicks,5760,8.658692754,1,1.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,minute.maid,27776,10.23192762,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,tropicana,13888,9.538780437,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,tropicana,6144,8.723231275,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,minute.maid,20800,9.942708266,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,dominicks,15040,9.618468598,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,dominicks,5120,8.540909718,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,minute.maid,45696,10.72976605,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,tropicana,9344,9.142489705,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,minute.maid,14336,9.570529135,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,tropicana,32896,10.40110635,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,dominicks,3264,8.090708716,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,dominicks,9536,9.162829389,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,minute.maid,4928,8.502688505,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,tropicana,21056,9.954940834,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,tropicana,15360,9.639522007,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,minute.maid,4608,8.435549202,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,dominicks,6208,8.733594062,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,dominicks,6592,8.793612072,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,tropicana,8000,8.987196821,1,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,minute.maid,5248,8.565602331000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,tropicana,21120,9.957975738,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,dominicks,63552,11.05961375,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,minute.maid,4224,8.348537825,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,dominicks,27968,10.23881628,0,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,minute.maid,4288,8.363575702999999,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,tropicana,11904,9.384629757,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,minute.maid,16896,9.734832187,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,tropicana,5056,8.528330936,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,dominicks,21760,9.987828701,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,dominicks,2688,7.896552702,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,minute.maid,77184,11.25394746,1,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,tropicana,4608,8.435549202,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,tropicana,6016,8.702177866,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,minute.maid,5184,8.553332238,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,dominicks,10432,9.252633284,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,tropicana,50752,10.83470631,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,minute.maid,5248,8.565602331000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,dominicks,9792,9.189321005,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,minute.maid,20672,9.936535407000001,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,tropicana,5632,8.636219898,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,dominicks,8448,9.041685006,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,tropicana,6400,8.764053269,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,dominicks,6912,8.841014311,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,minute.maid,12352,9.421573272,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,dominicks,8256,9.018695487999999,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,minute.maid,12032,9.395325046,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,tropicana,5440,8.60153434,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,minute.maid,13440,9.505990614,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,dominicks,28672,10.26367632,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,tropicana,8128,9.00307017,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,tropicana,7232,8.886270902,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,minute.maid,5824,8.66974259,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,dominicks,4416,8.392989587999999,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,tropicana,7168,8.877381955,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,minute.maid,50112,10.82201578,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,dominicks,1856,7.526178913,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,minute.maid,5184,8.553332238,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,tropicana,7872,8.971067439,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,dominicks,6528,8.783855897,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,tropicana,7552,8.929567707999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,minute.maid,8384,9.034080407000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,dominicks,6080,8.712759975,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,tropicana,69504,11.14913958,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,dominicks,3456,8.14786713,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,minute.maid,10112,9.221478116,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,dominicks,25856,10.16029796,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,minute.maid,8384,9.034080407000001,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,tropicana,8960,9.100525506,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,tropicana,6912,8.841014311,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,dominicks,25728,10.15533517,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,minute.maid,11456,9.346268889,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,dominicks,23552,10.06696602,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,minute.maid,5952,8.691482577,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,tropicana,6656,8.803273982999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,tropicana,8192,9.010913347,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,dominicks,2944,7.98752448,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,minute.maid,8512,9.049232212,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,dominicks,5888,8.68067166,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,minute.maid,27968,10.23881628,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,tropicana,13440,9.505990614,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,tropicana,12160,9.405907156,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,dominicks,6848,8.831711918,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,minute.maid,24000,10.08580911,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,dominicks,1792,7.491087594,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,minute.maid,6848,8.831711918,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,tropicana,11840,9.379238908,0,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,tropicana,8640,9.064157862,0,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,dominicks,5248,8.565602331000001,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,minute.maid,15104,9.622714887999999,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,tropicana,5888,8.68067166,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,minute.maid,11392,9.340666634,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,dominicks,16768,9.727227587,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,tropicana,7424,8.912473275,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,minute.maid,5824,8.66974259,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,dominicks,52160,10.8620712,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,tropicana,5632,8.636219898,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,minute.maid,7488,8.921057017999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,dominicks,16640,9.719564714,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,tropicana,33600,10.42228135,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,minute.maid,8320,9.026417534,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,dominicks,1344,7.2034055210000005,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,dominicks,4608,8.435549202,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,tropicana,5376,8.589699882,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,minute.maid,99904,11.511965,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,tropicana,54272,10.90176372,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,minute.maid,6976,8.850230966,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,dominicks,12672,9.447150114,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,tropicana,33600,10.42228135,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,minute.maid,9984,9.208739091,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,dominicks,48640,10.79220152,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,tropicana,24448,10.10430369,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,minute.maid,32832,10.39915893,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,dominicks,13248,9.491601877,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,tropicana,22784,10.03381381,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,minute.maid,8128,9.00307017,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,dominicks,29248,10.28356647,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,tropicana,19008,9.852615222,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,minute.maid,6464,8.774003599999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,dominicks,4608,8.435549202,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,tropicana,15808,9.66827142,1,2.5,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,minute.maid,36800,10.51325312,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,dominicks,3136,8.050703382,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,dominicks,13184,9.486759252,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,tropicana,14144,9.557045785,0,2.5,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,minute.maid,12928,9.467150781,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,tropicana,9600,9.169518378,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,minute.maid,7424,8.912473275,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,dominicks,67712,11.1230187,1,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,tropicana,10112,9.221478116,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,minute.maid,34176,10.43927892,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,dominicks,18880,9.84585844,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,minute.maid,4160,8.333270353,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,tropicana,31872,10.36948316,1,2.24,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,dominicks,6208,8.733594062,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,tropicana,9280,9.135616826,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,minute.maid,5952,8.691482577,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,dominicks,5952,8.691482577,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,tropicana,7680,8.946374826,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,minute.maid,6528,8.783855897,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,dominicks,4160,8.333270353,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,tropicana,8704,9.071537969,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,minute.maid,30656,10.33058368,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,dominicks,23488,10.06424493,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,tropicana,9920,9.2023082,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,dominicks,60480,11.01006801,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,minute.maid,6656,8.803273982999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,tropicana,91968,11.42919597,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,minute.maid,4416,8.392989587999999,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,dominicks,20416,9.924074186,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,tropicana,44096,10.69412435,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,dominicks,6336,8.754002933999999,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,minute.maid,5696,8.647519453,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,minute.maid,5696,8.647519453,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,tropicana,7296,8.895081532,1,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,dominicks,1408,7.249925537,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,tropicana,12928,9.467150781,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,minute.maid,39680,10.58860256,1,2.01,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,dominicks,4672,8.449342525,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,tropicana,6848,8.831711918,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,minute.maid,6208,8.733594062,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,dominicks,19520,9.87919486,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,tropicana,8064,8.99516499,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,minute.maid,7872,8.971067439,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,dominicks,7872,8.971067439,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,dominicks,5184,8.553332238,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,tropicana,4992,8.51559191,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,minute.maid,54528,10.90646961,1,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,tropicana,7360,8.903815212,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,minute.maid,6400,8.764053269,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,dominicks,42240,10.65112292,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,tropicana,8384,9.034080407000001,1,2.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,minute.maid,5888,8.68067166,1,2.65,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,dominicks,6592,8.793612072,1,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,tropicana,8832,9.086136769,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,minute.maid,56384,10.93994071,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,dominicks,2112,7.655390645,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,5,dominicks,21248,9.964018052,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,8,dominicks,14336,9.570529135,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-14,8,minute.maid,6080,8.712759975,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-14,8,tropicana,8896,9.093357017,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,dominicks,6400,8.764053269,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,minute.maid,51968,10.85838342,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,tropicana,7296,8.895081532,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,tropicana,10368,9.246479419,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,minute.maid,4928,8.502688505,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,dominicks,3968,8.286017467999999,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,dominicks,4352,8.378390789,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,minute.maid,5312,8.577723691000001,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,tropicana,6976,8.850230966,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,tropicana,6464,8.774003599999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,dominicks,3520,8.166216269,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,minute.maid,39424,10.58213005,1,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,tropicana,8192,9.010913347,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,dominicks,6464,8.774003599999999,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,minute.maid,5568,8.624791202,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,dominicks,5952,8.691482577,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,minute.maid,14592,9.588228712000001,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,tropicana,7936,8.979164649,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,tropicana,6656,8.803273982999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,minute.maid,22208,10.00820786,1,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,dominicks,8832,9.086136769,1,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,dominicks,7232,8.886270902,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,minute.maid,5760,8.658692754,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,tropicana,8256,9.018695487999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,tropicana,5568,8.624791202,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,minute.maid,54016,10.89703558,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,dominicks,5504,8.61323038,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,dominicks,4800,8.476371197,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,minute.maid,5824,8.66974259,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,tropicana,7488,8.921057017999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,tropicana,6144,8.723231275,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,minute.maid,6528,8.783855897,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,dominicks,52672,10.87183928,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,dominicks,16448,9.707959168,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,minute.maid,5440,8.60153434,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,tropicana,11008,9.30637756,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,minute.maid,36544,10.50627229,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,dominicks,19072,9.85597657,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,tropicana,5760,8.658692754,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,dominicks,13376,9.501217335,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,minute.maid,3776,8.236420527,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,tropicana,10112,9.221478116,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,tropicana,8448,9.041685006,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,minute.maid,5504,8.61323038,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,dominicks,61440,11.02581637,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,tropicana,8448,9.041685006,1,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,dominicks,13760,9.529521112000001,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,minute.maid,12416,9.426741242,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,minute.maid,53696,10.89109379,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,dominicks,3136,8.050703382,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,tropicana,7424,8.912473275,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,tropicana,5824,8.66974259,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,minute.maid,5696,8.647519453,0,2.51,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,dominicks,186176,12.13444774,1,1.14,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,tropicana,6656,8.803273982999999,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,minute.maid,4864,8.489616424,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,dominicks,3712,8.219326094,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,tropicana,6272,8.743850562,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,minute.maid,37184,10.52363384,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,dominicks,35776,10.48503256,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,tropicana,6912,8.841014311,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,minute.maid,5504,8.61323038,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,dominicks,26880,10.1991378,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,tropicana,10496,9.258749511,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,minute.maid,51008,10.83973776,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,dominicks,71680,11.17996705,0,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,tropicana,11840,9.379238908,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,minute.maid,11072,9.312174678,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,dominicks,25088,10.13014492,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,tropicana,9664,9.17616292,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,minute.maid,12160,9.405907156,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,dominicks,91456,11.42361326,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,minute.maid,30528,10.32639957,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,dominicks,23808,10.07777694,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,tropicana,6272,8.743850562,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,dominicks,89856,11.40596367,1,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,minute.maid,12096,9.400630097999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,tropicana,7168,8.877381955,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,minute.maid,16448,9.707959168,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,dominicks,12224,9.411156511,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,tropicana,29504,10.29228113,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,minute.maid,9344,9.142489705,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,dominicks,3776,8.236420527,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,tropicana,8704,9.071537969,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,tropicana,9280,9.135616826,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,minute.maid,16128,9.688312171,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,dominicks,13824,9.534161491,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,minute.maid,5376,8.589699882,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,dominicks,251072,12.43349503,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,tropicana,12224,9.411156511,0,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,minute.maid,6656,8.803273982999999,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,tropicana,10368,9.246479419,0,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,dominicks,4864,8.489616424,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,minute.maid,59712,10.99728828,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,dominicks,10176,9.227787286,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,tropicana,8128,9.00307017,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,tropicana,5952,8.691482577,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,minute.maid,9856,9.195835686,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,dominicks,105344,11.56498647,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,minute.maid,6720,8.812843434,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,dominicks,33600,10.42228135,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,tropicana,21696,9.984883191,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,dominicks,4736,8.462948177000001,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,minute.maid,4224,8.348537825,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,tropicana,7808,8.962904128,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,tropicana,8128,9.00307017,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,minute.maid,9728,9.182763604,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,dominicks,10304,9.240287448,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,tropicana,7424,8.912473275,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,minute.maid,40320,10.604602900000001,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,dominicks,5056,8.528330936,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,dominicks,179968,12.10053434,1,0.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,tropicana,5952,8.691482577,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,minute.maid,5120,8.540909718,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,minute.maid,19264,9.865993348,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,dominicks,4992,8.51559191,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,tropicana,7616,8.938006577000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,tropicana,5312,8.577723691000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,minute.maid,170432,12.04609167,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,dominicks,6400,8.764053269,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,minute.maid,39680,10.58860256,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,dominicks,14912,9.609921537,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,tropicana,161792,11.99406684,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,dominicks,34624,10.45230236,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,minute.maid,8128,9.00307017,1,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,tropicana,17280,9.757305042,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,tropicana,47040,10.75875358,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,minute.maid,9088,9.114710141,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,dominicks,10368,9.246479419,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,tropicana,14464,9.579418083,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,minute.maid,6720,8.812843434,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,dominicks,194880,12.18013926,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,tropicana,52928,10.87668778,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,dominicks,5696,8.647519453,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,minute.maid,7552,8.929567707999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,dominicks,7168,8.877381955,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,minute.maid,24768,10.11730778,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,tropicana,21184,9.961001459,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,tropicana,7360,8.903815212,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,minute.maid,183296,12.11885761,1,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,dominicks,2880,7.965545572999999,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,dominicks,12288,9.416378455,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,minute.maid,8896,9.093357017,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,tropicana,15744,9.664214619,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,dominicks,9280,9.135616826,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,tropicana,46912,10.75602879,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,minute.maid,6656,8.803273982999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,tropicana,18240,9.811372264,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,dominicks,25856,10.16029796,1,1.26,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,minute.maid,35456,10.47604777,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,dominicks,19264,9.865993348,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,minute.maid,17408,9.76468515,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,tropicana,6464,8.774003599999999,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,dominicks,6848,8.831711918,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,minute.maid,75520,11.2321528,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,tropicana,8512,9.049232212,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,tropicana,28416,10.25470765,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,minute.maid,21632,9.981928979,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,dominicks,12928,9.467150781,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,dominicks,44032,10.69267192,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,minute.maid,8384,9.034080407000001,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,tropicana,16960,9.738612909,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,minute.maid,9920,9.2023082,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,dominicks,25408,10.14281936,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,tropicana,8320,9.026417534,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-25,8,dominicks,38336,10.55414468,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-25,8,minute.maid,6592,8.793612072,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-25,8,tropicana,11136,9.317938383,1,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-01,8,tropicana,27712,10.22962081,0,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-01,8,minute.maid,7168,8.877381955,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-01,8,dominicks,152384,11.93415893,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-08,8,dominicks,54464,10.90529521,0,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-08,8,minute.maid,6208,8.733594062,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-08,8,tropicana,7744,8.954673629,0,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-15,8,minute.maid,30528,10.32639957,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-15,8,dominicks,47680,10.772267300000001,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-15,8,tropicana,5184,8.553332238,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-22,8,dominicks,14720,9.596962392,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-22,8,minute.maid,155840,11.95658512,1,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-22,8,tropicana,6272,8.743850562,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-29,8,tropicana,7744,8.954673629,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-29,8,dominicks,53248,10.88271552,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-29,8,minute.maid,10752,9.282847063,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-05,8,tropicana,53184,10.88151288,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-05,8,minute.maid,6976,8.850230966,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-05,8,dominicks,40576,10.61093204,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-12,8,dominicks,25856,10.16029796,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-12,8,tropicana,6784,8.822322178,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-12,8,minute.maid,31872,10.36948316,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-19,8,dominicks,24064,10.08847223,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-19,8,minute.maid,5312,8.577723691000001,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-19,8,tropicana,8000,8.987196821,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-26,8,tropicana,6592,8.793612072,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-26,8,minute.maid,33344,10.41463313,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-26,8,dominicks,15680,9.660141293999999,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-03,8,minute.maid,13504,9.510741217,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-03,8,dominicks,16576,9.715711145,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-03,8,tropicana,5248,8.565602331000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-10,8,dominicks,49664,10.8130356,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-10,8,tropicana,6592,8.793612072,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-10,8,minute.maid,13504,9.510741217,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-17,8,dominicks,10752,9.282847063,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-17,8,minute.maid,335808,12.72429485,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-17,8,tropicana,5888,8.68067166,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-24,8,tropicana,6336,8.754002933999999,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-24,8,dominicks,9792,9.189321005,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-24,8,minute.maid,13120,9.481893063,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-31,8,tropicana,5888,8.68067166,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-31,8,minute.maid,49664,10.8130356,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-31,8,dominicks,7104,8.868413285,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-07,8,dominicks,9216,9.128696383,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-07,8,tropicana,6080,8.712759975,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-07,8,minute.maid,10880,9.29468152,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-14,8,tropicana,6848,8.831711918,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-14,8,minute.maid,9984,9.208739091,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-14,8,dominicks,12608,9.442086812000001,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-21,8,tropicana,54016,10.89703558,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-21,8,minute.maid,9216,9.128696383,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-21,8,dominicks,16448,9.707959168,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-28,8,tropicana,10368,9.246479419,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-28,8,dominicks,27968,10.23881628,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-28,8,minute.maid,7680,8.946374826,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-05,8,minute.maid,7296,8.895081532,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-05,8,dominicks,37824,10.5406991,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-05,8,tropicana,5568,8.624791202,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-12,8,dominicks,33664,10.4241843,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-12,8,minute.maid,8192,9.010913347,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-12,8,tropicana,4864,8.489616424,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-19,8,tropicana,7232,8.886270902,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-19,8,minute.maid,6080,8.712759975,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-19,8,dominicks,17728,9.78290059,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-26,8,tropicana,15232,9.631153757,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-26,8,dominicks,25088,10.13014492,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-26,8,minute.maid,15040,9.618468598,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-02,8,minute.maid,9472,9.156095357,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-02,8,dominicks,13184,9.486759252,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-02,8,tropicana,47040,10.75875358,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-09,8,dominicks,3136,8.050703382,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-09,8,minute.maid,5888,8.68067166,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-09,8,tropicana,9280,9.135616826,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-16,8,tropicana,6720,8.812843434,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-16,8,minute.maid,14336,9.570529135,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-16,8,dominicks,5696,8.647519453,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-23,8,minute.maid,11712,9.368369236,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-23,8,dominicks,19008,9.852615222,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-23,8,tropicana,5056,8.528330936,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-30,8,minute.maid,7936,8.979164649,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-30,8,dominicks,121664,11.70901843,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-30,8,tropicana,6080,8.712759975,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-06,8,tropicana,10496,9.258749511,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-06,8,minute.maid,5184,8.553332238,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-06,8,dominicks,38848,10.56741187,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-13,8,minute.maid,7168,8.877381955,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-13,8,dominicks,6144,8.723231275,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-13,8,tropicana,39040,10.57234204,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-20,8,dominicks,13632,9.520175249,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-20,8,minute.maid,216064,12.28332994,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-20,8,tropicana,4480,8.407378325,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-27,8,tropicana,61760,11.03101119,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-27,8,minute.maid,15040,9.618468598,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-27,8,dominicks,9792,9.189321005,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-05,8,tropicana,15360,9.639522007,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-05,8,minute.maid,11840,9.379238908,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-05,8,dominicks,86912,11.37265139,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-12,8,minute.maid,25472,10.14533509,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-12,8,dominicks,24512,10.10691807,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-12,8,tropicana,54976,10.91465201,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-19,8,minute.maid,16384,9.704060528,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-19,8,dominicks,58048,10.96902553,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-19,8,tropicana,34368,10.44488118,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-26,8,tropicana,10752,9.282847063,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-26,8,minute.maid,20480,9.927204079,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-26,8,dominicks,13952,9.543378146,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-02,8,minute.maid,34688,10.45414909,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-02,8,dominicks,15168,9.626943225,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-02,8,tropicana,20096,9.908276069,1,2.5,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-09,8,dominicks,14592,9.588228712000001,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-09,8,minute.maid,22400,10.01681624,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-09,8,tropicana,16192,9.692272572,0,2.5,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-16,8,tropicana,6528,8.783855897,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-16,8,minute.maid,7808,8.962904128,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-16,8,dominicks,145088,11.88509573,1,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-23,8,tropicana,8320,9.026417534,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-23,8,minute.maid,48064,10.78028874,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-23,8,dominicks,43712,10.68537794,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-30,8,tropicana,30784,10.33475035,1,2.16,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-30,8,minute.maid,7360,8.903815212,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-04-30,8,dominicks,20608,9.933434629,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-07,8,tropicana,18048,9.800790154,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-07,8,minute.maid,6272,8.743850562,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-07,8,dominicks,18752,9.839055692,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-14,8,tropicana,12864,9.462187991,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-14,8,minute.maid,6400,8.764053269,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-14,8,dominicks,20160,9.911455722000001,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-21,8,tropicana,7168,8.877381955,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-21,8,minute.maid,54592,10.90764263,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-21,8,dominicks,18688,9.835636886,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-28,8,minute.maid,8128,9.00307017,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-28,8,tropicana,9024,9.107642974,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-05-28,8,dominicks,133824,11.80428078,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-04,8,tropicana,84992,11.35031241,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-04,8,minute.maid,4928,8.502688505,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-04,8,dominicks,63488,11.05860619,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-11,8,minute.maid,5440,8.60153434,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-11,8,tropicana,14144,9.557045785,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-11,8,dominicks,71040,11.17099838,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-25,8,tropicana,7488,8.921057017999999,1,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-25,8,minute.maid,5888,8.68067166,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-06-25,8,dominicks,15360,9.639522007,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-02,8,minute.maid,23872,10.0804615,1,2.02,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-02,8,dominicks,17728,9.78290059,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-02,8,tropicana,12352,9.421573272,0,2.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-09,8,tropicana,5696,8.647519453,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-09,8,minute.maid,6848,8.831711918,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-09,8,dominicks,24256,10.09641929,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-16,8,minute.maid,8192,9.010913347,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-16,8,dominicks,19968,9.901886271,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-16,8,tropicana,7680,8.946374826,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-23,8,dominicks,15936,9.67633598,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-23,8,minute.maid,55040,10.91581547,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-23,8,tropicana,5440,8.60153434,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-30,8,tropicana,5632,8.636219898,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-30,8,minute.maid,6528,8.783855897,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-07-30,8,dominicks,76352,11.24310951,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-06,8,tropicana,8960,9.100525506,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-06,8,minute.maid,6208,8.733594062,1,2.45,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-06,8,dominicks,17408,9.76468515,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-13,8,minute.maid,94720,11.45868045,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-13,8,tropicana,6080,8.712759975,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-13,8,dominicks,17536,9.77201119,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
1992-08-20,8,dominicks,31232,10.34919849,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
||||
|
@@ -1,66 +0,0 @@
import argparse
import json

from azureml.core import Run, Model, Workspace
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice


script_file_name = 'score.py'
conda_env_file_name = 'myenv.yml'

print("In deploy.py")
parser = argparse.ArgumentParser()
parser.add_argument("--time_column_name", type=str, help="time column name")
parser.add_argument("--group_column_names", type=str, help="group column names")
parser.add_argument("--model_names", type=str, help="model names")
parser.add_argument("--service_name", type=str, help="service name")

args = parser.parse_args()

# Replace the group column names in the scoring script with the ones set by the user.
print("Update group_column_names")
print(args.group_column_names)

with open(script_file_name, 'r') as cefr:
    content = cefr.read()
with open(script_file_name, 'w') as cefw:
    content = content.replace('<<groups>>', args.group_column_names.rstrip())
    cefw.write(content.replace('<<time_colname>>', args.time_column_name.rstrip()))

with open(script_file_name, 'r') as cefr1:
    content1 = cefr1.read()
print(content1)

model_list = json.loads(args.model_names)
print(model_list)

run = Run.get_context()
ws = run.experiment.workspace

deployment_config = AciWebservice.deploy_configuration(
    cpu_cores=1,
    memory_gb=2,
    tags={"method": "grouping"},
    description='grouping demo aci deployment'
)

inference_config = InferenceConfig(
    entry_script=script_file_name,
    runtime='python',
    conda_file=conda_env_file_name
)

models = []
for model_name in model_list:
    models.append(Model(ws, name=model_name))

service = Model.deploy(
    ws,
    name=args.service_name,
    models=models,
    inference_config=inference_config,
    deployment_config=deployment_config
)
service.wait_for_deployment(True)
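For context, the deleted `deploy.py` above was driven entirely by its four command-line arguments. A minimal sketch of how a driver might have invoked it; all argument values here are illustrative assumptions, not taken from the source:

```python
# Hypothetical driver for the deleted deploy.py; every value is illustrative.
# The "#####" separator matches the one the grouping scoring script splits on.
import json
import subprocess

group_column_names = "#####".join(["Store", "Brand"])
subprocess.run([
    "python", "deploy.py",
    "--time_column_name", "WeekStarting",
    "--group_column_names", group_column_names,
    "--model_names", json.dumps(["grouping-model-a", "grouping-model-b"]),  # parsed via json.loads
    "--service_name", "grouping-aci-service",
], check=True)
```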
@@ -1,11 +0,0 @@
name: automl_grouping_env
dependencies:
# The python interpreter version.
# Currently Azure ML only supports 3.5.2 and later.
- python=3.6.2
- numpy>=1.16.0,<=1.16.2
- scikit-learn>=0.19.0,<=0.20.3
- conda-forge::fbprophet==0.5
@@ -1,55 +0,0 @@
import json
import pickle
import re

import numpy as np
import pandas as pd
from sklearn.externals import joblib
from sklearn.linear_model import Ridge

from azureml.core.model import Model
import azureml.train.automl


def init():
    global models
    models = {}
    global group_columns_str
    group_columns_str = "<<groups>>"
    global time_column_name
    time_column_name = "<<time_colname>>"

    global group_columns
    group_columns = group_columns_str.split("#####")
    global valid_chars
    valid_chars = re.compile('[^a-zA-Z0-9-]')


def run(raw_data):
    try:
        data = pd.read_json(raw_data)
        # Make sure we have correct time points.
        data[time_column_name] = pd.to_datetime(data[time_column_name], unit='ms')
        dfs = []
        for grain, df_one in data.groupby(group_columns):
            if isinstance(grain, int):
                cur_group = str(grain)
            elif isinstance(grain, str):
                cur_group = grain
            else:
                cur_group = "#####".join(list(grain))
            cur_group = valid_chars.sub('', cur_group)
            print("Query model for group {}".format(cur_group))
            if cur_group not in models:
                model_path = Model.get_model_path(cur_group)
                model = joblib.load(model_path)
                models[cur_group] = model
            _, xtrans = models[cur_group].forecast(df_one)
            dfs.append(xtrans)
        df_ret = pd.concat(dfs)
        df_ret.reset_index(drop=False, inplace=True)
        return json.dumps({'predictions': df_ret.to_json()})

    except Exception as e:
        error = str(e)
        return error
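As a usage note for the deleted scoring script above: `run()` parses the payload with `pd.read_json` and converts the time column from epoch milliseconds, which is exactly how `DataFrame.to_json` serializes datetimes by default. A minimal client-side sketch, with illustrative column names:

```python
# Minimal sketch of building a request body for the run() above; the column
# names are illustrative. The time column round-trips as epoch milliseconds.
import pandas as pd

query = pd.DataFrame({
    "WeekStarting": pd.to_datetime(["1992-08-13", "1992-08-20"]),  # time column
    "Store": [8, 8],                                               # group column
    "Brand": ["dominicks", "tropicana"],                           # group column
})
raw_data = query.to_json()  # dates serialize as epoch ms, matching unit='ms' in run()
```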
@@ -1,22 +0,0 @@
import argparse

from azureml.core import Run, Model

parser = argparse.ArgumentParser()
parser.add_argument("--model_name")
parser.add_argument("--model_path")

args = parser.parse_args()

run = Run.get_context()
ws = run.experiment.workspace
print('retrieved ws: {}'.format(ws))

print('begin register model')
model = Model.register(
    workspace=ws,
    model_path=args.model_path,
    model_name=args.model_name
)
print('model registered: {}'.format(model))
print('complete')
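The deleted `register.py` above only works inside a submitted run, since it resolves the workspace through `Run.get_context()`. A minimal sketch of wiring it into a pipeline step; the step name, compute target, and argument values are illustrative assumptions:

```python
# Minimal sketch, assuming register.py runs as a pipeline step so that
# Run.get_context() can resolve the workspace; all values are illustrative.
from azureml.pipeline.steps import PythonScriptStep

register_step = PythonScriptStep(
    name="register model",
    script_name="register.py",
    arguments=["--model_name", "oj-grouping-model",   # illustrative name
               "--model_path", "outputs/model.pkl"],  # illustrative path
    compute_target=compute_target,  # assumed to already exist
    source_directory=".",
)
```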
@@ -1,4 +1,9 @@
name: automl-forecasting-function
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
@@ -631,9 +631,7 @@
"outputs": [],
"source": [
"import json\n",
"# The request data frame needs to have a y_query column, which corresponds to the query.\n",
"X_query = X_test.copy()\n",
"X_query['y_query'] = np.NaN\n",
"# We have to convert datetime to string, because Timestamps cannot be serialized to JSON.\n",
"X_query[time_column_name] = X_query[time_column_name].astype(str)\n",
"# The Service object accepts a dictionary, which is internally converted to a JSON string.\n",
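The cell above only prepares `X_query`; a minimal sketch of sending it to the deployed service, assuming `service` is the web service object returned by the deployment and that the scoring script accepts a records-style dictionary:

```python
import json

# Minimal sketch, not the notebook's exact cell: wrap the prepared frame in a
# {'data': ...} dictionary shape (an assumption) and call the service.
payload = json.dumps({'data': X_query.to_dict(orient='records')})
response = service.run(input_data=payload)
print(response)
```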
@@ -1,4 +1,9 @@
name: auto-ml-forecasting-orange-juice-sales
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
@@ -155,7 +155,6 @@
"automl_settings = {\n",
" \"n_cross_validations\": 3,\n",
" \"primary_metric\": 'average_precision_score_weighted',\n",
" \"preprocess\": True,\n",
" \"experiment_timeout_hours\": 0.2, # This is a time limit for testing purposes; remove it for real use cases, as it will drastically limit the ability to find the best model possible\n",
" \"verbosity\": logging.INFO,\n",
" \"enable_stack_ensemble\": False\n",
@@ -2,3 +2,8 @@ name: auto-ml-classification-credit-card-fraud-local
dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - interpret
  - azureml-explain-model
@@ -208,7 +208,7 @@
"|**primary_metric**|This is the metric that you want to optimize. Regression supports the following primary metrics: <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>|\n",
"|**experiment_timeout_hours**| Maximum amount of time in hours that all iterations combined can take before the experiment terminates.|\n",
"|**enable_early_stopping**| Flag to enable early termination if the score is not improving in the short term.|\n",
"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether featurization step should be done automatically or not, or whether customized featurization should be used. Note: If the input data is sparse, featurization cannot be turned on.|\n",
"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether featurization step should be done automatically or not, or whether customized featurization should be used. Setting this enables AutoML to perform featurization on the input to handle *missing data*, and to perform some common *feature extraction*. Note: If the input data is sparse, featurization cannot be turned on.|\n",
"|**n_cross_validations**|Number of cross validation splits.|\n",
"|**training_data**|(sparse) array-like, shape = [n_samples, n_features]|\n",
"|**label_column_name**|(sparse) array-like, shape = [n_samples, ], target values.|"
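A minimal sketch tying the parameters documented in the table above together; the `train_data` variable and the `'ERP'` label column are illustrative assumptions:

```python
from azureml.train.automl import AutoMLConfig

# Minimal sketch of an AutoMLConfig built from the parameters in the table;
# `train_data` and 'ERP' are illustrative, not taken from the source.
automl_config = AutoMLConfig(task='regression',
                             primary_metric='r2_score',
                             experiment_timeout_hours=0.3,
                             enable_early_stopping=True,
                             featurization='auto',
                             n_cross_validations=5,
                             training_data=train_data,
                             label_column_name='ERP')
```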
@@ -244,7 +244,7 @@
"source": [
"featurization_config = FeaturizationConfig()\n",
"featurization_config.blocked_transformers = ['LabelEncoder']\n",
"#featurization_config.drop_columns = ['ERP', 'MMIN']\n",
"#featurization_config.drop_columns = ['MMIN']\n",
"featurization_config.add_column_purpose('MYCT', 'Numeric')\n",
"featurization_config.add_column_purpose('VendorName', 'CategoricalHash')\n",
"# default strategy is mean; add transformer params for 3 columns\n",
@@ -717,10 +717,10 @@
"metadata": {},
"outputs": [],
"source": [
"myenv = automl_run.get_environment().python.conda_dependencies\n",
"conda_dep = automl_run.get_environment().python.conda_dependencies\n",
"\n",
"with open(\"myenv.yml\",\"w\") as f:\n",
" f.write(myenv.serialize_to_string())\n",
" f.write(conda_dep.serialize_to_string())\n",
"\n",
"with open(\"myenv.yml\",\"r\") as f:\n",
" print(f.read())"
@@ -761,6 +761,7 @@
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.webservice import AciWebservice\n",
"from azureml.core.model import Model\n",
"from azureml.core.environment import Environment\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, \n",
" memory_gb=1, \n",
@@ -768,9 +769,8 @@
" \"method\" : \"local_explanation\"}, \n",
" description='Get local explanations for Machine test data')\n",
"\n",
"inference_config = InferenceConfig(runtime= \"python\", \n",
" entry_script=\"score_explain.py\",\n",
" conda_file=\"myenv.yml\")\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
"inference_config = InferenceConfig(entry_script=\"score_explain.py\", environment=myenv)\n",
"\n",
"# Use configs and models generated above\n",
"service = Model.deploy(ws, 'model-scoring', [scoring_explainer_model, original_model], inference_config, aciconfig)\n",
@@ -2,3 +2,10 @@ name: auto-ml-regression-hardware-performance-explanation-and-featurization
dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
  - interpret
  - azureml-explain-model
  - azureml-contrib-interpret
@@ -198,7 +198,6 @@
"automl_settings = {\n",
" \"n_cross_validations\": 3,\n",
" \"primary_metric\": 'r2_score',\n",
" \"preprocess\": True,\n",
" \"enable_early_stopping\": True, \n",
" \"experiment_timeout_hours\": 0.3, # for real scenarios we recommend a timeout of at least one hour \n",
" \"max_concurrent_iterations\": 4,\n",
@@ -2,3 +2,6 @@ name: auto-ml-regression
dependencies:
- pip:
  - azureml-sdk
  - azureml-train-automl
  - azureml-widgets
  - matplotlib
@@ -11,6 +11,13 @@
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Register Azure Databricks trained model and deploy it to ACI\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -265,6 +272,15 @@
"myservice.delete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploying to other types of computes\n",
"\n",
"In order to learn how to deploy to other types of compute targets, such as AKS, please take a look at the set of notebooks in the [deployment](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/deployment) folder."
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -1,312 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Azure ML & Azure Databricks notebooks by Parashar Shah.\n",
"\n",
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This notebook uses the image from the ACI notebook for deploying to AKS."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"\n",
"# Check core SDK version number\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Set auth to be used by workspace related APIs.\n",
"# For automation or CI/CD ServicePrincipalAuthentication can be used.\n",
"# https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.authentication.serviceprincipalauthentication?view=azure-ml-py\n",
"auth = None"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config(auth = auth)\n",
"print('Workspace name: ' + ws.name, \n",
" 'Azure region: ' + ws.location, \n",
" 'Subscription id: ' + ws.subscription_id, \n",
" 'Resource group: ' + ws.resource_group, sep = '\\n')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Register the model\n",
"import os\n",
"from azureml.core.model import Model\n",
"\n",
"model_name = \"AdultCensus_runHistory_aks.mml\" # \n",
"model_name_dbfs = os.path.join(\"/dbfs\", model_name)\n",
"\n",
"print(\"copy model from dbfs to local\")\n",
"model_local = \"file:\" + os.getcwd() + \"/\" + model_name\n",
"dbutils.fs.cp(model_name, model_local, True)\n",
"\n",
"mymodel = Model.register(model_path = model_name, # this points to a local file\n",
" model_name = model_name, # this is the name the model is registered as; using the same name for both path and name. \n",
" description = \"ADB trained model by Parashar\",\n",
" workspace = ws)\n",
"\n",
"print(mymodel.name, mymodel.description, mymodel.version)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#%%writefile score_sparkml.py\n",
"score_sparkml = \"\"\"\n",
" \n",
"import json\n",
" \n",
"def init():\n",
" # One-time initialization of PySpark and predictive model\n",
" import pyspark\n",
" from azureml.core.model import Model\n",
" from pyspark.ml import PipelineModel\n",
" \n",
" global trainedModel\n",
" global spark\n",
" \n",
" spark = pyspark.sql.SparkSession.builder.appName(\"ADB and AML notebook by Parashar\").getOrCreate()\n",
" model_name = \"{model_name}\" #interpolated\n",
" model_path = Model.get_model_path(model_name)\n",
" trainedModel = PipelineModel.load(model_path)\n",
" \n",
"def run(input_json):\n",
" if isinstance(trainedModel, Exception):\n",
" return json.dumps({{\"trainedModel\":str(trainedModel)}})\n",
" \n",
" try:\n",
" sc = spark.sparkContext\n",
" input_list = json.loads(input_json)\n",
" input_rdd = sc.parallelize(input_list)\n",
" input_df = spark.read.json(input_rdd)\n",
" \n",
" # Compute prediction\n",
" prediction = trainedModel.transform(input_df)\n",
" #result = prediction.first().prediction\n",
" predictions = prediction.collect()\n",
" \n",
" # Get each scored result\n",
" preds = [str(x['prediction']) for x in predictions]\n",
" result = \",\".join(preds)\n",
" # you can return any data type as long as it is JSON-serializable\n",
" return result  # result is already a comma-separated string, which is JSON-serializable\n",
" except Exception as e:\n",
" result = str(e)\n",
" return result\n",
" \n",
"\"\"\".format(model_name=model_name)\n",
" \n",
"exec(score_sparkml)\n",
" \n",
"with open(\"score_sparkml.py\", \"w\") as file:\n",
" file.write(score_sparkml)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) # showing how to add libraries as an example - not needed for this model.\n",
"\n",
"with open(\"mydeployenv.yml\",\"w\") as f:\n",
" f.write(myacienv.serialize_to_string())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# create AKS compute\n",
"# it may take 20-25 minutes to create a new cluster\n",
"\n",
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"aks_name = 'ps-aks-demo2' \n",
"\n",
"try:\n",
" aks_target = ComputeTarget(workspace=ws, name=aks_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" # Use the default configuration (can also provide parameters to customize)\n",
" prov_config = AksCompute.provisioning_configuration()\n",
" \n",
" # Create the cluster\n",
" aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = prov_config)\n",
"\n",
"aks_target.wait_for_completion(show_output = True)\n",
"\n",
"print(aks_target.provisioning_state)\n",
"print(aks_target.provisioning_errors)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# deploy to AKS\n",
"from azureml.core.webservice import AksWebservice, Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"from azureml.core.model import InferenceConfig\n",
"\n",
"aks_config = AksWebservice.deploy_configuration(enable_app_insights=True)\n",
"\n",
"service_name = 'ps-aks-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"inference_config = InferenceConfig(runtime = 'spark-py', \n",
" entry_script ='score_sparkml.py',\n",
" conda_file ='mydeployenv.yml')\n",
"\n",
"aks_service = Model.deploy(ws, service_name, [mymodel], inference_config, aks_config, aks_target)\n",
"aks_service.wait_for_deployment(show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.deployment_status"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# for using the Web HTTP API \n",
"print(aks_service.scoring_uri)\n",
"print(aks_service.get_keys())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"\n",
"# get some sample data\n",
"test_data_path = \"AdultCensusIncomeTest\"\n",
"test = spark.read.parquet(test_data_path).limit(5)\n",
"\n",
"test_json = json.dumps(test.toJSON().collect())\n",
"\n",
"print(test_json)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# using the data defined above, predict if income is >50K (1) or <=50K (0)\n",
"aks_service.run(input_data=test_json)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# comment out to not delete the web service\n",
"aks_service.delete()\n",
"#model.delete()\n",
"aks_target.delete() "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
}
],
"metadata": {
"authors": [
{
"name": "pasha"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
},
"name": "deploy-to-aks-existingimage-05",
"notebookId": 1030695628045968
},
"nbformat": 4,
"nbformat_minor": 1
}
@@ -20,7 +20,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"# Register model and deploy as webservice\n",
"# Register model and deploy as webservice in ACI\n",
"\n",
"Following this notebook, you will:\n",
"\n",
@@ -45,6 +45,7 @@
"source": [
"import azureml.core\n",
"\n",
"\n",
"# Check core SDK version number.\n",
"print('SDK version:', azureml.core.VERSION)"
]
@@ -70,6 +71,7 @@
"source": [
"from azureml.core import Workspace\n",
"\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
]
@@ -91,6 +93,7 @@
"source": [
"from azureml.core import Dataset\n",
"\n",
"\n",
"datastore = ws.get_default_datastore()\n",
"datastore.upload_files(files=['./features.csv', './labels.csv'],\n",
" target_path='sklearn_regression/',\n",
@@ -125,6 +128,7 @@
"from azureml.core import Model\n",
"from azureml.core.resource_configuration import ResourceConfiguration\n",
"\n",
"\n",
"model = Model.register(workspace=ws,\n",
" model_name='my-sklearn-model', # Name of the registered model in your workspace.\n",
" model_path='./sklearn_regression_model.pkl', # Local file to upload and register as a model.\n",
@@ -159,6 +163,8 @@
"\n",
"The Azure Machine Learning service provides a default environment for supported model frameworks, including scikit-learn, based on the metadata you provided when registering your model. This is the easiest way to deploy your model.\n",
"\n",
"Even when you deploy your model to ACI with a default environment you can still customize the deploy configuration (i.e. the number of cores and amount of memory made available for the deployment) using the [AciWebservice.deploy_configuration()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aci.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none--). Look at the \"Use a custom environment\" section of this notebook for more information on deploy configuration.\n",
"\n",
"**Note**: This step can take several minutes."
]
},
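A minimal sketch of the default-environment deployment this section describes, assuming `model` is the registered model from the cell above (registered with framework metadata) and `ws` is the workspace; no InferenceConfig is needed on this path:

```python
from azureml.core import Model

# Minimal sketch: the registered model's framework metadata lets the service
# supply a default environment, so only the model list is required.
service = Model.deploy(ws, 'my-default-env-service', [model])
service.wait_for_deployment(show_output=True)
```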
@@ -171,6 +177,7 @@
"from azureml.core import Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"\n",
"service_name = 'my-sklearn-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
@@ -198,6 +205,7 @@
"source": [
"import json\n",
"\n",
"\n",
"input_payload = json.dumps({\n",
" 'data': [\n",
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
@@ -231,9 +239,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Use a custom environment (for all models)\n",
"### Use a custom environment\n",
"\n",
"If you want more control over how your model is run, if it uses another framework, or if it has special runtime requirements, you can instead specify your own environment and scoring method.\n",
"If you want more control over how your model is run, if it uses another framework, or if it has special runtime requirements, you can instead specify your own environment and scoring method. Custom environments can be used for any model you want to deploy.\n",
"\n",
"Specify the model's runtime environment by creating an [Environment](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.environment%28class%29?view=azure-ml-py) object and providing the [CondaDependencies](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.conda_dependencies.condadependencies?view=azure-ml-py) needed by your model."
]
@@ -247,6 +255,7 @@
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"\n",
"environment = Environment('my-sklearn-environment')\n",
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
" 'azureml-defaults',\n",
@@ -278,7 +287,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Deploy your model in the custom environment by providing an [InferenceConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.inferenceconfig?view=azure-ml-py) object to [Model.deploy()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#deploy-workspace--name--models--inference-config--deployment-config-none--deployment-target-none-).\n",
"Deploy your model in the custom environment by providing an [InferenceConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.inferenceconfig?view=azure-ml-py) object to [Model.deploy()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#deploy-workspace--name--models--inference-config--deployment-config-none--deployment-target-none-). In this case we are also using the [AciWebservice.deploy_configuration()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aci.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none--) method to generate a custom deploy configuration.\n",
"\n",
"**Note**: This step can take several minutes."
]
@@ -288,15 +297,18 @@
"execution_count": null,
"metadata": {
"tags": [
"azuremlexception-remarks-sample"
"azuremlexception-remarks-sample",
"sample-aciwebservice-deploy-config"
]
},
"outputs": [],
"source": [
"from azureml.core import Webservice\n",
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.webservice import AciWebservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"\n",
"service_name = 'my-custom-env-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
@@ -305,11 +317,14 @@
"except WebserviceException:\n",
" pass\n",
"\n",
"inference_config = InferenceConfig(entry_script='score.py',\n",
" source_directory='.',\n",
" environment=environment)\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
"\n",
"service = Model.deploy(ws, service_name, [model], inference_config)\n",
"service = Model.deploy(workspace=ws,\n",
" name=service_name,\n",
" models=[model],\n",
" inference_config=inference_config,\n",
" deployment_config=aci_config)\n",
"service.wait_for_deployment(show_output=True)"
]
},
@@ -328,6 +343,7 @@
"source": [
"import json\n",
"\n",
"\n",
"input_payload = json.dumps({\n",
" 'data': [\n",
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
@@ -318,7 +318,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"sample-deploy-to-aks"
]
},
"outputs": [],
"source": [
"# Set the web service configuration (using default here)\n",
@@ -331,7 +335,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"sample-deploy-to-aks"
]
},
"outputs": [],
"source": [
"%%time\n",
@@ -1,457 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register Model, Create Image and Deploy Service\n",
"\n",
"This example shows how to deploy a web service in step-by-step fashion:\n",
"\n",
" 1. Register model\n",
" 2. Query versions of models and select one to deploy\n",
" 3. Create Docker image\n",
" 4. Query versions of images\n",
" 5. Deploy the image as web service\n",
" \n",
"**IMPORTANT**:\n",
" * This notebook requires you to first complete the [train-within-notebook](../../training/train-within-notebook/train-within-notebook.ipynb) example\n",
" \n",
"The train-within-notebook example taught you how to deploy a web service directly from a model in one step. This notebook shows a more advanced approach that gives you more control over model versions and Docker image versions. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the [configuration](../../../configuration.ipynb) Notebook first if you haven't."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Check core SDK version number\n",
"import azureml.core\n",
"\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object from persisted configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create workspace"
]
},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Register Model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can add tags and descriptions to your models. Note you need to have a `sklearn_regression_model.pkl` file in the current directory. This file is generated by the train-within-notebook example. The call below registers that file as a model with the same name `sklearn_regression_model.pkl` in the workspace.\n",
"\n",
"Using tags, you can track useful information such as the name and version of the machine learning library used to train the model. Note that tags must be alphanumeric."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"from azureml.core.model import Model\n",
"import sklearn\n",
"\n",
"library_version = \"sklearn\"+sklearn.__version__.replace(\".\",\"x\")\n",
"\n",
"model = Model.register(model_path = \"sklearn_regression_model.pkl\",\n",
" model_name = \"sklearn_regression_model.pkl\",\n",
" tags = {'area': \"diabetes\", 'type': \"regression\", 'version': library_version},\n",
" description = \"Ridge regression model to predict diabetes\",\n",
" workspace = ws)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can explore the registered models within your workspace and query by tag. Models are versioned. If you call the register_model command many times with the same model name, you will get multiple versions of the model with increasing version numbers."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"regression_models = Model.list(workspace=ws, tags=['area'])\n",
"for m in regression_models:\n",
" print(\"Name:\", m.name,\"\\tVersion:\", m.version, \"\\tDescription:\", m.description, m.tags)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can pick a specific model to deploy."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(model.name, model.description, model.version, sep = '\\t')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create Docker Image"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Show `score.py`. Note that the `sklearn_regression_model.pkl` in the `get_model_path` call is referring to the model named `sklearn_regression_model.pkl` registered under the workspace. It is NOT referencing the local file."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import os\n",
"import pickle\n",
"import json\n",
"import numpy\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"def init():\n",
" global model\n",
" # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n",
" model = joblib.load(model_path)\n",
"\n",
"# note you can pass in multiple rows for scoring\n",
"def run(raw_data):\n",
" try:\n",
" data = json.loads(raw_data)['data']\n",
" data = numpy.array(data)\n",
" result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n",
" return result.tolist()\n",
" except Exception as e:\n",
" error = str(e)\n",
" return error"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'])\n",
"\n",
"with open(\"myenv.yml\",\"w\") as f:\n",
" f.write(myenv.serialize_to_string())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that the following command can take a few minutes. \n",
"\n",
"You can add tags and descriptions to images. Also, an image can contain multiple models."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image",
"sample-image-create"
]
},
"outputs": [],
"source": [
"from azureml.core.image import Image, ContainerImage\n",
"\n",
"image_config = ContainerImage.image_configuration(runtime= \"python\",\n",
" execution_script=\"score.py\",\n",
" conda_file=\"myenv.yml\",\n",
" tags = {'area': \"diabetes\", 'type': \"regression\"},\n",
" description = \"Image with ridge regression model\")\n",
"\n",
"image = Image.create(name = \"myimage1\",\n",
" # this is the model object. note you can pass in 0-n models via this list-type parameter\n",
" # in case you need to reference multiple models, or none at all, in your scoring script.\n",
" models = [model],\n",
" image_config = image_config, \n",
" workspace = ws)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image"
]
},
"outputs": [],
"source": [
"image.wait_for_creation(show_output = True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Use a custom Docker image\n",
"\n",
"You can also specify a custom Docker image to be used as base image if you don't want to use the default base image provided by Azure ML. Please make sure the custom Docker image has Ubuntu >= 16.04, Conda >= 4.5.\\* and Python (3.5.\\* or 3.6.\\*).\n",
"\n",
"Only supported for `ContainerImage` (from azureml.core.image) with the `python` runtime.\n",
"```python\n",
"# use an image available in public Container Registry without authentication\n",
"image_config.base_image = \"mcr.microsoft.com/azureml/o16n-sample-user-base/ubuntu-miniconda\"\n",
"\n",
"# or, use an image available in a private Container Registry\n",
"image_config.base_image = \"myregistry.azurecr.io/mycustomimage:1.0\"\n",
"image_config.base_image_registry.address = \"myregistry.azurecr.io\"\n",
"image_config.base_image_registry.username = \"username\"\n",
"image_config.base_image_registry.password = \"password\"\n",
"\n",
"# or, use an image built during training.\n",
"image_config.base_image = run.properties[\"AzureML.DerivedImageName\"]\n",
"```\n",
"You can get the address of the training image from the properties of a Run object. Only new runs submitted with azureml-sdk>=1.0.22 to AMLCompute targets will have the 'AzureML.DerivedImageName' property. Instructions on how to get a Run can be found in [manage-runs](../../training/manage-runs/manage-runs.ipynb). \n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"List images by tag and find out the detailed build log for debugging."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image"
]
},
"outputs": [],
"source": [
"for i in Image.list(workspace = ws,tags = [\"area\"]):\n",
" print('{}(v.{} [{}]) stored at {} with build log {}'.format(i.name, i.version, i.creation_state, i.image_location, i.image_build_log_uri))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploy image as web service on Azure Container Instance\n",
"\n",
"Note that the service creation can take a few minutes."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci",
"sample-aciwebservice-deploy-config"
]
},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
" memory_gb = 1, \n",
" tags = {'area': \"diabetes\", 'type': \"regression\"}, \n",
" description = 'Predict diabetes using regression model')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci",
"sample-aciwebservice-deploy-from-image"
]
},
"outputs": [],
"source": [
"from azureml.core.webservice import Webservice\n",
"\n",
"aci_service_name = 'my-aci-service-2'\n",
"print(aci_service_name)\n",
"aci_service = Webservice.deploy_from_image(deployment_config = aciconfig,\n",
" image = image,\n",
" name = aci_service_name,\n",
" workspace = ws)\n",
"aci_service.wait_for_deployment(True)\n",
"print(aci_service.state)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Test web service"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Call the web service with some dummy input data to get a prediction."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci"
]
},
"outputs": [],
"source": [
"import json\n",
"\n",
"test_sample = json.dumps({'data': [\n",
" [1,2,3,4,5,6,7,8,9,10], \n",
" [10,9,8,7,6,5,4,3,2,1]\n",
"]})\n",
"test_sample = bytes(test_sample,encoding = 'utf8')\n",
"\n",
"prediction = aci_service.run(input_data=test_sample)\n",
"print(prediction)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Delete ACI to clean up"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci"
]
},
"outputs": [],
"source": [
"aci_service.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "aashishb"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.6"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
@@ -1,8 +0,0 @@
name: register-model-create-image-deploy-service
dependencies:
- pip:
  - azureml-sdk
  - matplotlib
  - tqdm
  - scipy
  - sklearn
Binary file not shown.
Binary file not shown.
@@ -0,0 +1 @@
{"class":"org.apache.spark.ml.classification.LogisticRegressionModel","timestamp":1570147252329,"sparkVersion":"2.4.0","uid":"LogisticRegression_5df3978caaf3","paramMap":{"regParam":0.01},"defaultParamMap":{"aggregationDepth":2,"threshold":0.5,"rawPredictionCol":"rawPrediction","featuresCol":"features","labelCol":"label","predictionCol":"prediction","family":"auto","regParam":0.0,"tol":1.0E-6,"probabilityCol":"probability","standardization":true,"elasticNetParam":0.0,"maxIter":100,"fitIntercept":true}}
@@ -0,0 +1,343 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Register Spark Model and deploy as Webservice\n",
"\n",
"This example shows how to deploy a Webservice in step-by-step fashion:\n",
"\n",
" 1. Register Spark Model\n",
" 2. Deploy Spark Model as Webservice"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the [configuration](../../../configuration.ipynb) Notebook first if you haven't."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Check core SDK version number\n",
"import azureml.core\n",
"\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object from persisted configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create workspace"
]
},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Register Model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can add tags and descriptions to your Models. Note you need to have an `iris.model` file in the current directory. This model file is generated by the [train in spark](../training/train-in-spark/train-in-spark.ipynb) notebook. The call below registers that file as a Model with the same name `iris.model` in the workspace.\n",
"\n",
"Using tags, you can track useful information such as the name and version of the machine learning library used to train the model. Note that tags must be alphanumeric."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"from azureml.core.model import Model\n",
"\n",
"model = Model.register(model_path=\"iris.model\",\n",
" model_name=\"iris.model\",\n",
" tags={'type': \"regression\"},\n",
" description=\"Logistic regression model to predict iris species\",\n",
" workspace=ws)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Fetch Environment"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can now create and/or use an Environment object when deploying a Webservice. The Environment may have been previously registered with your Workspace, or it will be registered as part of the Webservice deployment.\n",
"\n",
"In this notebook, we will be using 'AzureML-PySpark-MmlSpark-0.15', a curated environment.\n",
"\n",
"More information can be found in our [using environments notebook](../training/using-environments/using-environments.ipynb)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Environment\n",
"\n",
"env = Environment.get(ws, name='AzureML-PySpark-MmlSpark-0.15')\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create Inference Configuration\n",
"\n",
"There is now support for a source directory: you can upload an entire folder from your local machine as dependencies for the Webservice.\n",
"Note: in that case, your entry_script is a relative path within the source_directory.\n",
"\n",
"Sample code for using a source directory:\n",
"\n",
"```python\n",
"inference_config = InferenceConfig(source_directory=\"C:/abc\",\n",
" entry_script=\"x/y/score.py\",\n",
" environment=environment)\n",
"```\n",
"\n",
" - source_directory = the source path, as a string; this entire folder gets added to the image, so it's easy to access any files within this folder or its subfolders\n",
" - entry_script = contains logic specific to initializing your model and running predictions\n",
" - environment = an Environment object to use for the deployment; it doesn't have to be registered"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image"
]
},
"outputs": [],
"source": [
"from azureml.core.model import InferenceConfig\n",
"\n",
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=env)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploy Model as Webservice on Azure Container Instance\n",
"\n",
"Note that the service creation can take a few minutes."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"azuremlexception-remarks-sample"
]
},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice, Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
"aci_service_name = 'aciservice1'\n",
"\n",
|
||||
"try:\n",
|
||||
" # if you want to get existing service below is the command\n",
|
||||
" # since aci name needs to be unique in subscription deleting existing aci if any\n",
|
||||
" # we use aci_service_name to create azure aci\n",
|
||||
" service = Webservice(ws, name=aci_service_name)\n",
|
||||
" if service:\n",
|
||||
" service.delete()\n",
|
||||
"except WebserviceException as e:\n",
|
||||
" print()\n",
|
||||
"\n",
|
||||
"service = Model.deploy(ws, aci_service_name, [model], inference_config, deployment_config)\n",
|
||||
"\n",
|
||||
"service.wait_for_deployment(True)\n",
|
||||
"print(service.state)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"#### Test web service"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import json\n",
|
||||
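"# the payload matches score.py's input schema: a dense Spark vector ('type': 1) of the four iris features plus a dummy label\n",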
"test_sample = json.dumps({'features':{'type':1,'values':[4.3,3.0,1.1,0.1]},'label':2.0})\n",
|
||||
"\n",
|
||||
"test_sample_encoded = bytes(test_sample, encoding='utf8')\n",
|
||||
"prediction = service.run(input_data=test_sample_encoded)\n",
|
||||
"print(prediction)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"#### Delete ACI to clean up"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"tags": [
|
||||
"deploy service",
|
||||
"aci"
|
||||
]
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"service.delete()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Model Profiling\n",
|
||||
"\n",
|
||||
"You can also take advantage of the profiling feature to estimate CPU and memory requirements for models.\n",
|
||||
"\n",
|
||||
"```python\n",
|
||||
"profile = Model.profile(ws, \"profilename\", [model], inference_config, test_sample)\n",
|
||||
"profile.wait_for_profiling(True)\n",
|
||||
"profiling_results = profile.get_results()\n",
|
||||
"print(profiling_results)\n",
|
||||
"```"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Model Packaging\n",
|
||||
"\n",
|
||||
"If you want to build a Docker image that encapsulates your model and its dependencies, you can use the model packaging option. The output image will be pushed to your workspace's ACR.\n",
|
||||
"\n",
|
||||
"You must include an Environment object in your inference configuration to use `Model.package()`.\n",
|
||||
"\n",
|
||||
"```python\n",
|
||||
"package = Model.package(ws, [model], inference_config)\n",
|
||||
"package.wait_for_creation(show_output=True) # Or show_output=False to hide the Docker build logs.\n",
|
||||
"package.pull()\n",
|
||||
"```\n",
|
||||
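"\n",
"After `pull()` completes, the image is available to your local Docker client. As a minimal sketch (assuming Docker is installed locally), `package.location` holds the image's full path in the workspace ACR:\n",
"\n",
"```python\n",
"print(package.location)\n",
"```\n",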
"\n",
|
||||
"Instead of a fully-built image, you can also generate a Dockerfile and download all the assets needed to build an image on top of your Environment.\n",
|
||||
"\n",
|
||||
"```python\n",
|
||||
"package = Model.package(ws, [model], inference_config, generate_dockerfile=True)\n",
|
||||
"package.wait_for_creation(show_output=True)\n",
|
||||
"package.save(\"./local_context_dir\")\n",
|
||||
"```"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "aashishb"
|
||||
}
|
||||
],
|
||||
"category": "deployment",
|
||||
"compute": [
|
||||
"None"
|
||||
],
|
||||
"datasets": [
|
||||
"Iris"
|
||||
],
|
||||
"deployment": [
|
||||
"Azure Container Instance"
|
||||
],
|
||||
"exclude_from_index": false,
|
||||
"framework": [
|
||||
"PySpark"
|
||||
],
|
||||
"friendly_name": "Register Spark model and deploy as webservice",
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3.6",
|
||||
"language": "python",
|
||||
"name": "python36"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.2"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
||||
@@ -0,0 +1,4 @@
|
||||
name: model-register-and-deploy-spark
|
||||
dependencies:
|
||||
- pip:
|
||||
- azureml-sdk
|
||||
37
how-to-use-azureml/deployment/spark/score.py
Normal file
@@ -0,0 +1,37 @@
|
||||
import traceback
|
||||
from pyspark.ml.linalg import VectorUDT
|
||||
from azureml.core.model import Model
|
||||
from pyspark.ml.classification import LogisticRegressionModel
|
||||
from pyspark.sql.types import StructType, StructField
|
||||
from pyspark.sql.types import DoubleType
|
||||
from pyspark.sql import SQLContext
|
||||
from pyspark import SparkContext
|
||||
|
||||
sc = SparkContext.getOrCreate()
|
||||
sqlContext = SQLContext(sc)
|
||||
spark = sqlContext.sparkSession
|
||||
|
||||
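# incoming JSON rows must match this schema: a 'features' vector column and a numeric 'label' column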
input_schema = StructType([StructField("features", VectorUDT()), StructField("label", DoubleType())])
|
||||
reader = spark.read
|
||||
reader.schema(input_schema)
|
||||
|
||||
|
||||
def init():
|
||||
global model
|
||||
# note here "iris.model" is the name of the model registered under the workspace
|
||||
# this call returns the path to the saved Spark model directory on the local disk.
|
||||
model_path = Model.get_model_path('iris.model')
|
||||
# Load the model file back into a LogisticRegression model
|
||||
model = LogisticRegressionModel.load(model_path)
|
||||
|
||||
|
||||
def run(data):
|
||||
try:
|
||||
input_df = reader.json(sc.parallelize([data]))
|
||||
result = model.transform(input_df)
|
||||
# you can return any datatype as long as it is JSON-serializable
|
||||
return result.collect()[0]['prediction']
|
||||
except Exception as e:
|
||||
traceback.print_exc()
|
||||
error = str(e)
|
||||
return error
|
||||
@@ -409,16 +409,6 @@
|
||||
" print(f.read())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%%writefile dockerfile\n",
|
||||
"RUN apt-get update && apt-get install -y g++ "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
@@ -439,6 +429,8 @@
|
||||
"from azureml.core.model import InferenceConfig\n",
|
||||
"from azureml.core.webservice import AciWebservice\n",
|
||||
"from azureml.core.model import Model\n",
|
||||
"from azureml.core.environment import Environment\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, \n",
|
||||
" memory_gb=1, \n",
|
||||
@@ -446,10 +438,8 @@
|
||||
" \"method\" : \"local_explanation\"}, \n",
|
||||
" description='Get local explanations for IBM Employee Attrition data')\n",
|
||||
"\n",
|
||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
||||
" entry_script=\"score_remote_explain.py\",\n",
|
||||
" conda_file=\"myenv.yml\",\n",
|
||||
" extra_docker_file_steps=\"dockerfile\")\n",
|
||||
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||
"inference_config = InferenceConfig(entry_script=\"score_remote_explain.py\", environment=myenv)\n",
|
||||
"\n",
|
||||
"# Use configs and models generated above\n",
|
||||
"service = Model.deploy(ws, 'model-scoring-service', [scoring_explainer_model, original_model], inference_config, aciconfig)\n",
|
||||
|
||||
@@ -43,5 +43,7 @@ In this directory, there are two types of notebooks:
|
||||
1. [pipeline-batch-scoring.ipynb](https://aka.ms/pl-batch-score): This notebook demonstrates how to run a batch scoring job using Azure Machine Learning pipelines.
|
||||
2. [pipeline-style-transfer.ipynb](https://aka.ms/pl-style-trans): This notebook demonstrates a multi-step pipeline that uses GPU compute. This sample also showcases how to use conda dependencies using runconfig when using Pipelines.
|
||||
3. [nyc-taxi-data-regression-model-building.ipynb](https://aka.ms/pl-nyctaxi-tutorial): This notebook is an AzureML Pipelines version of the previously published two-part sample.
|
||||
4. [file-dataset-image-inference-mnist.ipynb](https://aka.ms/pl-pr-filedata): This notebook demonstrates how to use ParallelRunStep to process unstructured data (file dataset).
|
||||
5. [tabular-dataset-inference-iris.ipynb](https://aka.ms/pl-pr-tabulardata): This notebook demonstrates how to use ParallelRunStep to process structured data (tabular dataset).
|
||||
|
||||

|
||||
|
||||
@@ -18,5 +18,6 @@ These notebooks below are designed to go in sequence.
|
||||
13. [aml-pipelines-showcasing-datapath-and-pipelineparameter.ipynb](https://aka.ms/pl-datapath): This notebook showcases how to use DataPath and PipelineParameter in AML Pipeline.
|
||||
14. [aml-pipelines-how-to-use-pipeline-drafts.ipynb](http://aka.ms/pl-pl-draft): This notebook shows how to use Pipeline Drafts. Pipeline Drafts are mutable pipelines which can be used to submit runs and create Published Pipelines.
|
||||
15. [aml-pipelines-hot-to-use-modulestep.ipynb](https://aka.ms/pl-modulestep): This notebook shows how to define Module, ModuleVersion and how to use them in an AML Pipeline using ModuleStep.
|
||||
16. [aml-pipelines-with-notebook-runner-step.ipynb](https://aka.ms/pl-nbrstep): This notebook shows how you can run another notebook as a step in an Azure Machine Learning Pipeline.
|
||||
|
||||

|
||||
|
||||
@@ -620,14 +620,13 @@
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%%time\n",
|
||||
"from azureml.core.model import InferenceConfig\n",
|
||||
"from azureml.core.environment import Environment\n",
|
||||
"from azureml.core.model import Model, InferenceConfig\n",
|
||||
"from azureml.core.webservice import AciWebservice\n",
|
||||
"from azureml.core.webservice import Webservice\n",
|
||||
"from azureml.core.model import Model\n",
|
||||
"\n",
|
||||
"inference_config = InferenceConfig(runtime = \"python\", \n",
|
||||
" entry_script = \"score.py\",\n",
|
||||
" conda_file = \"myenv.yml\")\n",
|
||||
"\n",
|
||||
"myenv = Environment.from_conda_specification(name=\"env\", file_path=\"myenv.yml\")\n",
|
||||
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)\n",
|
||||
"\n",
|
||||
"aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, \n",
|
||||
" memory_gb=1, \n",
|
||||
|
||||
@@ -70,8 +70,6 @@
|
||||
"from azureml.pipeline.core import PipelineData\n",
|
||||
"from azureml.core.datastore import Datastore\n",
|
||||
"\n",
|
||||
"from azureml.widgets import RunDetails\n",
|
||||
"\n",
|
||||
"from azureml.core import Workspace, Experiment\n",
|
||||
"from azureml.contrib.notebook import NotebookRunConfig, AzureMLNotebookHandler\n",
|
||||
"\n",
|
||||
@@ -149,7 +147,7 @@
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create or Attach an AmlCompute cluster\n",
|
||||
"You will need to create a [compute target](https://docs.microsoft.com/azure/machine-learning/service/concept-azure-machine-learning-architecture#compute-target) for your AutoML run. In this tutorial, you get the default `AmlCompute` as your training compute resource."
|
||||
"You will need to create a [compute target](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.computetarget?view=azure-ml-py) for your remote run. In this tutorial, you get the default `AmlCompute` as your training compute resource."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -205,7 +203,7 @@
|
||||
"conda_run_config.environment.docker.enabled = True\n",
|
||||
"conda_run_config.environment.docker.base_image = azureml.core.runconfig.DEFAULT_CPU_IMAGE\n",
|
||||
"\n",
|
||||
"cd = CondaDependencies.create(pip_packages=['azureml-sdk'], pin_sdk_version=False)\n",
|
||||
"cd = CondaDependencies.create(pip_packages=['azureml-sdk'])\n",
|
||||
"conda_run_config.environment.python.conda_dependencies = cd\n",
|
||||
"\n",
|
||||
"print('run config is ready')"
|
||||
@@ -298,7 +296,7 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.pipeline.core import PipelineParameter, TrainingOutput\n",
|
||||
"from azureml.pipeline.core import PipelineParameter\n",
|
||||
"\n",
|
||||
"output_from_notebook = PipelineData(name=\"notebook_processed_data\",\n",
|
||||
" datastore=Datastore.get(ws, \"workspaceblobstore\"))\n",
|
||||
@@ -326,7 +324,7 @@
|
||||
"\n",
|
||||
"Once we have the steps (or steps collection), we can build the [pipeline](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline.pipeline?view=azure-ml-py). By deafult, all these steps will run in **parallel** once we submit the pipeline for run.\n",
|
||||
"\n",
|
||||
"A pipeline is created with a list of steps and a workspace. Submit a pipeline using [submit](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.experiment(class)?view=azure-ml-py#submit-config--tags-none----kwargs-). When submit is called, a [PipelineRun](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelinerun?view=azure-ml-py) is created which in turn creates [StepRun](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.steprun?view=azure-ml-py) objects for each step in the workflow."
|
||||
"A pipeline is created with a list of steps and a workspace. Submit a pipeline using [submit](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline.pipeline?view=azure-ml-py#submit-experiment-name--pipeline-parameters-none--continue-on-step-failure-false--regenerate-outputs-false--parent-run-id-none----kwargs-). When submit is called, a [PipelineRun](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelinerun?view=azure-ml-py) is created which in turn creates [StepRun](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.steprun?view=azure-ml-py) objects for each step in the workflow."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -336,9 +334,7 @@
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"pipeline1 = Pipeline(workspace=ws, steps=[notebook_runner_step])\n",
|
||||
"\n",
|
||||
"pipeline1.validate()\n",
|
||||
"print(\"Pipeline validation complete\")"
|
||||
"print(\"Pipeline creation complete\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -356,6 +352,7 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.widgets import RunDetails\n",
|
||||
"RunDetails(pipeline_run1).show()"
|
||||
]
|
||||
},
|
||||
@@ -375,8 +372,7 @@
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"pipeline_run1.wait_for_completion()\n",
|
||||
" Retrieve the step runs by name `train.py`\n",
|
||||
"train_step = pipeline_run1.find_step_run('training_notebook_step')\n",
|
||||
"train_step = pipeline_run1.find_step_run('training_notebook_step') # Retrieve the step runs by name `train.py`\n",
|
||||
"\n",
|
||||
"if train_step:\n",
|
||||
" train_step_obj = train_step[0] # since we have only one step by name `training_notebook_step`\n",
|
||||
@@ -420,7 +416,7 @@
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.7.3"
|
||||
"version": "3.6.7"
|
||||
},
|
||||
"order_index": 12,
|
||||
"star_tag": [
|
||||
|
||||
@@ -24,9 +24,9 @@
|
||||
"In this notebook, we will demonstrate how to make predictions on large quantities of data asynchronously using the ML pipelines with Azure Machine Learning. Batch inference (or batch scoring) provides cost-effective inference, with unparalleled throughput for asynchronous applications. Batch prediction pipelines can scale to perform inference on terabytes of production data. Batch prediction is optimized for high throughput, fire-and-forget predictions for a large collection of data.\n",
|
||||
"\n",
|
||||
"> **Note**\n",
|
||||
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook.\n",
|
||||
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook. Pandas is used to display job results.\n",
|
||||
"```\n",
|
||||
"pip install azureml-contrib-pipeline-steps\n",
|
||||
"pip install azureml-contrib-pipeline-steps pandas\n",
|
||||
"```\n",
|
||||
"> **Tip**\n",
|
||||
"If your system requires low-latency processing (to process a single document or small set of documents quickly), use [real-time scoring](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-consume-web-service) instead of batch prediction.\n",
|
||||
|
||||
@@ -24,9 +24,9 @@
|
||||
"In this notebook, we will demonstrate how to make predictions on large quantities of data asynchronously using the ML pipelines with Azure Machine Learning. Batch inference (or batch scoring) provides cost-effective inference, with unparalleled throughput for asynchronous applications. Batch prediction pipelines can scale to perform inference on terabytes of production data. Batch prediction is optimized for high throughput, fire-and-forget predictions for a large collection of data.\n",
|
||||
"\n",
|
||||
"> **Note**\n",
|
||||
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook.\n",
|
||||
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook. Pandas is used to display job results.\n",
|
||||
"```\n",
|
||||
"pip install azureml-contrib-pipeline-steps\n",
|
||||
"pip install azureml-contrib-pipeline-steps pandas\n",
|
||||
"```\n",
|
||||
"> **Tip**\n",
|
||||
"If your system requires low-latency processing (to process a single document or small set of documents quickly), use [real-time scoring](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-consume-web-service) instead of batch prediction.\n",
|
||||
|
||||
@@ -1,119 +0,0 @@
|
||||
# Copyright (c) Microsoft. All rights reserved.
|
||||
# Licensed under the MIT license.
|
||||
|
||||
import os
|
||||
import argparse
|
||||
import datetime
|
||||
import time
|
||||
import tensorflow as tf
|
||||
from math import ceil
|
||||
import numpy as np
|
||||
import shutil
|
||||
from tensorflow.contrib.slim.python.slim.nets import inception_v3
|
||||
from azureml.core.model import Model
|
||||
|
||||
slim = tf.contrib.slim
|
||||
|
||||
parser = argparse.ArgumentParser(description="Start a tensorflow model serving")
|
||||
parser.add_argument('--model_name', dest="model_name", required=True)
|
||||
parser.add_argument('--label_dir', dest="label_dir", required=True)
|
||||
parser.add_argument('--dataset_path', dest="dataset_path", required=True)
|
||||
parser.add_argument('--output_dir', dest="output_dir", required=True)
|
||||
parser.add_argument('--batch_size', dest="batch_size", type=int, required=True)
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
image_size = 299
|
||||
num_channel = 3
|
||||
|
||||
# create output directory if it does not exist
|
||||
os.makedirs(args.output_dir, exist_ok=True)
|
||||
|
||||
|
||||
def get_class_label_dict(label_file):
|
||||
label = []
|
||||
proto_as_ascii_lines = tf.gfile.GFile(label_file).readlines()
|
||||
for l in proto_as_ascii_lines:
|
||||
label.append(l.rstrip())
|
||||
return label
|
||||
|
||||
|
||||
class DataIterator:
|
||||
def __init__(self, data_dir):
|
||||
self.file_paths = []
|
||||
image_list = os.listdir(data_dir)
|
||||
# total_size = len(image_list)
|
||||
self.file_paths = [data_dir + '/' + file_name.rstrip() for file_name in image_list]
|
||||
|
||||
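# placeholder labels: the evaluation images are unlabeled, and labels are not used during scoring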
self.labels = [1 for file_name in self.file_paths]
|
||||
|
||||
@property
|
||||
def size(self):
|
||||
return len(self.labels)
|
||||
|
||||
def input_pipeline(self, batch_size):
|
||||
images_tensor = tf.convert_to_tensor(self.file_paths, dtype=tf.string)
|
||||
labels_tensor = tf.convert_to_tensor(self.labels, dtype=tf.int64)
|
||||
input_queue = tf.train.slice_input_producer([images_tensor, labels_tensor], shuffle=False)
|
||||
labels = input_queue[1]
|
||||
images_content = tf.read_file(input_queue[0])
|
||||
|
||||
image_reader = tf.image.decode_jpeg(images_content, channels=num_channel, name="jpeg_reader")
|
||||
float_caster = tf.cast(image_reader, tf.float32)
|
||||
new_size = tf.constant([image_size, image_size], dtype=tf.int32)
|
||||
images = tf.image.resize_images(float_caster, new_size)
|
||||
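# normalize pixel values from [0, 255] to [0, 1]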
images = tf.divide(tf.subtract(images, [0]), [255])
|
||||
|
||||
image_batch, label_batch = tf.train.batch([images, labels], batch_size=batch_size, capacity=5 * batch_size)
|
||||
return image_batch
|
||||
|
||||
|
||||
def main(_):
|
||||
# start_time = datetime.datetime.now()
|
||||
label_file_name = os.path.join(args.label_dir, "labels.txt")
|
||||
label_dict = get_class_label_dict(label_file_name)
|
||||
classes_num = len(label_dict)
|
||||
test_feeder = DataIterator(data_dir=args.dataset_path)
|
||||
total_size = len(test_feeder.labels)
|
||||
count = 0
|
||||
# get model from model registry
|
||||
model_path = Model.get_model_path(args.model_name)
|
||||
with tf.Session() as sess:
|
||||
test_images = test_feeder.input_pipeline(batch_size=args.batch_size)
|
||||
with slim.arg_scope(inception_v3.inception_v3_arg_scope()):
|
||||
input_images = tf.placeholder(tf.float32, [args.batch_size, image_size, image_size, num_channel])
|
||||
logits, _ = inception_v3.inception_v3(input_images,
|
||||
num_classes=classes_num,
|
||||
is_training=False)
|
||||
probabilities = tf.argmax(logits, 1)
|
||||
|
||||
sess.run(tf.global_variables_initializer())
|
||||
sess.run(tf.local_variables_initializer())
|
||||
coord = tf.train.Coordinator()
|
||||
threads = tf.train.start_queue_runners(sess=sess, coord=coord)
|
||||
saver = tf.train.Saver()
|
||||
saver.restore(sess, model_path)
|
||||
out_filename = os.path.join(args.output_dir, "result-labels.txt")
|
||||
with open(out_filename, "w") as result_file:
|
||||
i = 0
|
||||
while count < total_size and not coord.should_stop():
|
||||
test_images_batch = sess.run(test_images)
|
||||
file_names_batch = test_feeder.file_paths[i * args.batch_size:
|
||||
min(test_feeder.size, (i + 1) * args.batch_size)]
|
||||
results = sess.run(probabilities, feed_dict={input_images: test_images_batch})
|
||||
new_add = min(args.batch_size, total_size - count)
|
||||
count += new_add
|
||||
i += 1
|
||||
for j in range(new_add):
|
||||
result_file.write(os.path.basename(file_names_batch[j]) + ": " + label_dict[results[j]] + "\n")
|
||||
result_file.flush()
|
||||
coord.request_stop()
|
||||
coord.join(threads)
|
||||
|
||||
# copy the file to artifacts
|
||||
shutil.copy(out_filename, "./outputs/")
|
||||
# Move the processed data out of the blob so that the next run can process the data.
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
tf.app.run()
|
||||
@@ -1,630 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
|
||||
"Licensed under the MIT License."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"**Note**: Azure Machine Learning recently released ParallelRunStep for public preview, this will allow for parallelization of your workload across many compute nodes without the difficulty of orchestrating worker pools and queues. See the [batch inference notebooks](../../../contrib/batch_inferencing/) for examples on how to get started."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Using Azure Machine Learning Pipelines for batch prediction\n",
|
||||
"\n",
|
||||
"In this notebook we will demonstrate how to run a batch scoring job using Azure Machine Learning pipelines. Our example job will be to take an already-trained image classification model, and run that model on some unlabeled images. The image classification model that we'll use is the __[Inception-V3 model](https://arxiv.org/abs/1512.00567)__ and we'll run this model on unlabeled images from the __[ImageNet](http://image-net.org/)__ dataset. \n",
|
||||
"\n",
|
||||
"The outline of this notebook is as follows:\n",
|
||||
"\n",
|
||||
"- Register the pretrained inception model into the model registry. \n",
|
||||
"- Store the dataset images in a blob container.\n",
|
||||
"- Use the registered model to do batch scoring on the images in the data blob container."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Prerequisites\n",
|
||||
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the configuration Notebook located at https://github.com/Azure/MachineLearningNotebooks first if you haven't. This sets you up with a working config file that has information on your workspace, subscription id, etc. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Experiment\n",
|
||||
"from azureml.core.compute import AmlCompute, ComputeTarget\n",
|
||||
"from azureml.core.datastore import Datastore\n",
|
||||
"from azureml.core.runconfig import CondaDependencies, RunConfiguration\n",
|
||||
"from azureml.data.data_reference import DataReference\n",
|
||||
"from azureml.pipeline.core import Pipeline, PipelineData\n",
|
||||
"from azureml.pipeline.steps import PythonScriptStep"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import os\n",
|
||||
"from azureml.core import Workspace\n",
|
||||
"\n",
|
||||
"ws = Workspace.from_config()\n",
|
||||
"print('Workspace name: ' + ws.name, \n",
|
||||
" 'Azure region: ' + ws.location, \n",
|
||||
" 'Subscription id: ' + ws.subscription_id, \n",
|
||||
" 'Resource group: ' + ws.resource_group, sep = '\\n')\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Set up machine learning resources"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Set up datastores\n",
|
||||
"First, let\u00e2\u20ac\u2122s access the datastore that has the model, labels, and images. \n",
|
||||
"\n",
|
||||
"### Create a datastore that points to a blob container containing sample images\n",
|
||||
"\n",
|
||||
"We have created a public blob container `sampledata` on an account named `pipelinedata`, containing images from the ImageNet evaluation set. In the next step, we create a datastore with the name `images_datastore`, which points to this container. In the call to `register_azure_blob_container` below, setting the `overwrite` flag to `True` overwrites any datastore that was created previously with that name. \n",
|
||||
"\n",
|
||||
"This step can be changed to point to your blob container by providing your own `datastore_name`, `container_name`, and `account_name`."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"account_name = \"pipelinedata\"\n",
|
||||
"datastore_name=\"images_datastore\"\n",
|
||||
"container_name=\"sampledata\"\n",
|
||||
"\n",
|
||||
"batchscore_blob = Datastore.register_azure_blob_container(ws, \n",
|
||||
" datastore_name=datastore_name, \n",
|
||||
" container_name= container_name, \n",
|
||||
" account_name=account_name, \n",
|
||||
" overwrite=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Next, let\u00e2\u20ac\u2122s specify the default datastore for the outputs."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"def_data_store = ws.get_default_datastore()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Configure data references\n",
|
||||
"Now you need to add references to the data, as inputs to the appropriate pipeline steps in your pipeline. A data source in a pipeline is represented by a DataReference object. The DataReference object points to data that lives in, or is accessible from, a datastore. We need DataReference objects corresponding to the following: the directory containing the input images, the directory in which the pretrained model is stored, the directory containing the labels, and the output directory."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"input_images = DataReference(datastore=batchscore_blob, \n",
|
||||
" data_reference_name=\"input_images\",\n",
|
||||
" path_on_datastore=\"batchscoring/images\",\n",
|
||||
" mode=\"download\"\n",
|
||||
" )\n",
|
||||
"model_dir = DataReference(datastore=batchscore_blob, \n",
|
||||
" data_reference_name=\"input_model\",\n",
|
||||
" path_on_datastore=\"batchscoring/models\",\n",
|
||||
" mode=\"download\" \n",
|
||||
" )\n",
|
||||
"label_dir = DataReference(datastore=batchscore_blob, \n",
|
||||
" data_reference_name=\"input_labels\",\n",
|
||||
" path_on_datastore=\"batchscoring/labels\",\n",
|
||||
" mode=\"download\" \n",
|
||||
" )\n",
|
||||
"output_dir = PipelineData(name=\"scores\", \n",
|
||||
" datastore=def_data_store, \n",
|
||||
" output_path_on_compute=\"batchscoring/results\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create and attach Compute targets\n",
|
||||
"Use the below code to create and attach Compute targets. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# choose a name for your cluster\n",
|
||||
"aml_compute_name = os.environ.get(\"AML_COMPUTE_NAME\", \"gpu-cluster\")\n",
|
||||
"cluster_min_nodes = os.environ.get(\"AML_COMPUTE_MIN_NODES\", 0)\n",
|
||||
"cluster_max_nodes = os.environ.get(\"AML_COMPUTE_MAX_NODES\", 1)\n",
|
||||
"vm_size = os.environ.get(\"AML_COMPUTE_SKU\", \"STANDARD_NC6\")\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"if aml_compute_name in ws.compute_targets:\n",
|
||||
" compute_target = ws.compute_targets[aml_compute_name]\n",
|
||||
" if compute_target and type(compute_target) is AmlCompute:\n",
|
||||
" print('Found existing compute target, using it: ' + aml_compute_name)\n",
|
||||
"else:\n",
|
||||
" print('creating a new compute target...')\n",
|
||||
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = vm_size, # NC6 is GPU-enabled\n",
|
||||
" vm_priority = 'lowpriority', # optional\n",
|
||||
" min_nodes = cluster_min_nodes, \n",
|
||||
" max_nodes = cluster_max_nodes)\n",
|
||||
"\n",
|
||||
" # create the cluster\n",
|
||||
" compute_target = ComputeTarget.create(ws, aml_compute_name, provisioning_config)\n",
|
||||
" \n",
|
||||
" # can poll for a minimum number of nodes and for a specific timeout. \n",
|
||||
" # if no min node count is provided it will use the scale settings for the cluster\n",
|
||||
" compute_target.wait_for_completion(show_output=True, min_node_count=None, timeout_in_minutes=20)\n",
|
||||
" \n",
|
||||
" # For a more detailed view of current Azure Machine Learning Compute status, use get_status()\n",
|
||||
" print(compute_target.get_status().serialize())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Prepare the Model"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Download the Model\n",
|
||||
"\n",
|
||||
"Download and extract the model from http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz to `\"models\"`"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# create directory for model\n",
|
||||
"model_dir = 'models'\n",
|
||||
"if not os.path.isdir(model_dir):\n",
|
||||
" os.mkdir(model_dir)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import tarfile\n",
|
||||
"import urllib.request\n",
|
||||
"\n",
|
||||
"url=\"http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz\"\n",
|
||||
"response = urllib.request.urlretrieve(url, \"model.tar.gz\")\n",
|
||||
"tar = tarfile.open(\"model.tar.gz\", \"r:gz\")\n",
|
||||
"tar.extractall(model_dir)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Register the model with Workspace"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import shutil\n",
|
||||
"from azureml.core.model import Model\n",
|
||||
"\n",
|
||||
"# register downloaded model \n",
|
||||
"model = Model.register(model_path = \"models/inception_v3.ckpt\",\n",
|
||||
" model_name = \"inception\", # this is the name the model is registered as\n",
|
||||
" tags = {'pretrained': \"inception\"},\n",
|
||||
" description = \"Imagenet trained tensorflow inception\",\n",
|
||||
" workspace = ws)\n",
|
||||
"# remove the downloaded dir after registration if you wish\n",
|
||||
"shutil.rmtree(\"models\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Write your scoring script"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"To do the scoring, we use a batch scoring script `batch_scoring.py`, which is located in the same directory that this notebook is in. You can take a look at this script to see how you might modify it for your custom batch scoring task.\n",
|
||||
"\n",
|
||||
"The python script `batch_scoring.py` takes input images, applies the image classification model to these images, and outputs a classification result to a results file.\n",
|
||||
"\n",
|
||||
"The script `batch_scoring.py` takes the following parameters:\n",
|
||||
"\n",
|
||||
"- `--model_name`: the name of the model being used, which is expected to be in the `model_dir` directory\n",
|
||||
"- `--label_dir` : the directory holding the `labels.txt` file \n",
|
||||
"- `--dataset_path`: the directory containing the input images\n",
|
||||
"- `--output_dir` : the script will run the model on the data and output a `results-label.txt` to this directory\n",
|
||||
"- `--batch_size` : the batch size used in running the model.\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Build and run the batch scoring pipeline\n",
|
||||
"You have everything you need to build the pipeline. Let\u00e2\u20ac\u2122s put all these together."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Specify the environment to run the script\n",
|
||||
"Specify the conda dependencies for your script. You will need this object when you create the pipeline step later on."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.runconfig import DEFAULT_GPU_IMAGE\n",
|
||||
"\n",
|
||||
"cd = CondaDependencies.create(pip_packages=[\"tensorflow-gpu==1.13.1\", \"azureml-defaults\"])\n",
|
||||
"\n",
|
||||
"# Runconfig\n",
|
||||
"amlcompute_run_config = RunConfiguration(conda_dependencies=cd)\n",
|
||||
"amlcompute_run_config.environment.docker.enabled = True\n",
|
||||
"amlcompute_run_config.environment.docker.base_image = DEFAULT_GPU_IMAGE\n",
|
||||
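"# Spark package precaching is disabled since this step does not use Spark\n",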
"amlcompute_run_config.environment.spark.precache_packages = False"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Specify the parameters for your pipeline\n",
|
||||
"A subset of the parameters to the python script can be given as input when we re-run a `PublishedPipeline`. In the current example, we define `batch_size` taken by the script as such parameter."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.pipeline.core.graph import PipelineParameter\n",
|
||||
"batch_size_param = PipelineParameter(name=\"param_batch_size\", default_value=20)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create the pipeline step\n",
|
||||
"Create the pipeline step using the script, environment configuration, and parameters. Specify the compute target you already attached to your workspace as the target of execution of the script. We will use PythonScriptStep to create the pipeline step."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"inception_model_name = \"inception_v3.ckpt\"\n",
|
||||
"\n",
|
||||
"batch_score_step = PythonScriptStep(\n",
|
||||
" name=\"batch_scoring\",\n",
|
||||
" script_name=\"batch_scoring.py\",\n",
|
||||
" arguments=[\"--dataset_path\", input_images, \n",
|
||||
" \"--model_name\", \"inception\",\n",
|
||||
" \"--label_dir\", label_dir, \n",
|
||||
" \"--output_dir\", output_dir, \n",
|
||||
" \"--batch_size\", batch_size_param],\n",
|
||||
" compute_target=compute_target,\n",
|
||||
" inputs=[input_images, label_dir],\n",
|
||||
" outputs=[output_dir],\n",
|
||||
" runconfig=amlcompute_run_config\n",
|
||||
")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Run the pipeline\n",
|
||||
"At this point you can run the pipeline and examine the output it produced. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"tags": [
|
||||
"pipelineparameterssample"
|
||||
]
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"pipeline = Pipeline(workspace=ws, steps=[batch_score_step])\n",
|
||||
"pipeline_run = Experiment(ws, 'batch_scoring').submit(pipeline, pipeline_parameters={\"param_batch_size\": 20})"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Monitor the run"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.widgets import RunDetails\n",
|
||||
"RunDetails(pipeline_run).show()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"pipeline_run.wait_for_completion(show_output=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Download and review output"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
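"# the pipeline has a single step, so take the first (and only) child run\n",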
"step_run = list(pipeline_run.get_children())[0]\n",
|
||||
"step_run.download_file(\"./outputs/result-labels.txt\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import pandas as pd\n",
|
||||
"df = pd.read_csv(\"result-labels.txt\", delimiter=\":\", header=None)\n",
|
||||
"df.columns = [\"Filename\", \"Prediction\"]\n",
|
||||
"df.head()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Publish a pipeline and rerun using a REST call"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create a published pipeline\n",
|
||||
"Once you are satisfied with the outcome of the run, you can publish the pipeline to run it with different input values later. When you publish a pipeline, you will get a REST endpoint that accepts invoking of the pipeline with the set of parameters you have already incorporated above using PipelineParameter."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"published_pipeline = pipeline_run.publish_pipeline(\n",
|
||||
" name=\"Inception_v3_scoring\", description=\"Batch scoring using Inception v3 model\", version=\"1.0\")\n",
|
||||
"\n",
|
||||
"published_pipeline"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Get published pipeline\n",
|
||||
"\n",
|
||||
"You can get the published pipeline using **pipeline id**.\n",
|
||||
"\n",
|
||||
"To get all the published pipelines for a given workspace(ws): \n",
|
||||
"```css\n",
|
||||
"all_pub_pipelines = PublishedPipeline.get_all(ws)\n",
|
||||
"```"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.pipeline.core import PublishedPipeline\n",
|
||||
"\n",
|
||||
"pipeline_id = published_pipeline.id # use your published pipeline id\n",
|
||||
"published_pipeline = PublishedPipeline.get(ws, pipeline_id)\n",
|
||||
"\n",
|
||||
"published_pipeline"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Rerun the pipeline using the REST endpoint"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Get AAD token\n",
|
||||
"[This notebook](https://aka.ms/pl-restep-auth) shows how to authenticate to AML workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
|
||||
"import requests\n",
|
||||
"\n",
|
||||
"auth = InteractiveLoginAuthentication()\n",
|
||||
"aad_token = auth.get_authentication_header()\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Run published pipeline"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"rest_endpoint = published_pipeline.endpoint\n",
|
||||
"# specify batch size when running the pipeline\n",
|
||||
"response = requests.post(rest_endpoint, \n",
|
||||
" headers=aad_token, \n",
|
||||
" json={\"ExperimentName\": \"batch_scoring\",\n",
|
||||
" \"ParameterAssignments\": {\"param_batch_size\": 50}})"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"try:\n",
|
||||
" response.raise_for_status()\n",
|
||||
"except Exception: \n",
|
||||
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
|
||||
" 'Response Code: {}\\n'\n",
|
||||
" 'Headers: {}\\n'\n",
|
||||
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
|
||||
"\n",
|
||||
"run_id = response.json().get('Id')\n",
|
||||
"print('Submitted pipeline run: ', run_id)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Monitor the new run"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.pipeline.core.run import PipelineRun\n",
|
||||
"published_pipeline_run = PipelineRun(ws.experiments[\"batch_scoring\"], run_id)\n",
|
||||
"\n",
|
||||
"RunDetails(published_pipeline_run).show()"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "sanpil"
|
||||
}
|
||||
],
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3.6",
|
||||
"language": "python",
|
||||
"name": "python36"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.7"
|
||||
}
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
||||
@@ -1,7 +0,0 @@
|
||||
name: pipeline-batch-scoring
|
||||
dependencies:
|
||||
- pip:
|
||||
- azureml-sdk
|
||||
- azureml-widgets
|
||||
- pandas
|
||||
- requests
|
||||
@@ -100,7 +100,7 @@
|
||||
"\n",
|
||||
"# Check core SDK version number\n",
|
||||
"\n",
|
||||
"print(\"This notebook was created using SDK version 1.0.83, you are currently running version\", azureml.core.VERSION)"
|
||||
"print(\"This notebook was created using SDK version 1.1.0rc0, you are currently running version\", azureml.core.VERSION)"
|
||||
]
|
||||
},
|
||||
{
|
||||
|
||||
@@ -1,342 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||
"\n",
|
||||
"Licensed under the MIT License."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Deploy Model as Azure Machine Learning Web Service using MLflow\n",
|
||||
"\n",
|
||||
"This example shows you how to use mlflow together with Azure Machine Learning services for deploying a model as a web service. You'll learn how to:\n",
|
||||
"\n",
|
||||
" 1. Retrieve a previously trained scikit-learn model\n",
|
||||
" 2. Create a Docker image from the model\n",
|
||||
" 3. Deploy the model as a web service on Azure Container Instance\n",
|
||||
" 4. Make a scoring request against the web service.\n",
|
||||
"\n",
|
||||
"## Prerequisites and Set-up\n",
|
||||
"\n",
|
||||
"This notebook requires you to first complete the [Use MLflow with Azure Machine Learning for Local Training Run](../train-local/train-local.ipnyb) or [Use MLflow with Azure Machine Learning for Remote Training Run](../train-remote/train-remote.ipnyb) notebook, so as to have an experiment run with uploaded model in your Azure Machine Learning Workspace.\n",
|
||||
"\n",
|
||||
"Also install following packages if you haven't already\n",
|
||||
"\n",
|
||||
"```\n",
|
||||
"pip install azureml-mlflow pandas\n",
|
||||
"```\n",
|
||||
"\n",
|
||||
"Then, import necessary packages:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import mlflow\n",
|
||||
"import azureml.mlflow\n",
|
||||
"import azureml.core\n",
|
||||
"from azureml.core import Workspace\n",
|
||||
"\n",
|
||||
"# Check core SDK version number\n",
|
||||
"print(\"SDK version:\", azureml.core.VERSION)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Connect to workspace and set MLflow tracking URI\n",
|
||||
"\n",
|
||||
"Setting the tracking URI is required for retrieving the model and creating an image using the MLflow APIs."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"ws = Workspace.from_config()\n",
|
||||
"\n",
|
||||
"mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Retrieve model from previous run\n",
|
||||
"\n",
|
||||
"Let's retrieve the experiment from training notebook, and list the runs within that experiment."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"experiment_name = \"experiment-with-mlflow\"\n",
|
||||
"exp = ws.experiments[experiment_name]\n",
|
||||
"\n",
|
||||
"runs = list(exp.get_runs())\n",
|
||||
"runs"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Then, let's select the most recent training run and find its ID. You also need to specify the path in run history where the model was saved. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"runid = runs[0].id\n",
|
||||
"model_save_path = \"model\""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create Docker image\n",
|
||||
"\n",
|
||||
"To create a Docker image with Azure Machine Learning for Model Management, use ```mlflow.azureml.build_image``` method. Specify the model path, your workspace, run ID and other parameters.\n",
|
||||
"\n",
|
||||
"MLflow automatically recognizes the model framework as scikit-learn, and creates the scoring logic and includes library dependencies for you.\n",
|
||||
"\n",
|
||||
"Note that the image creation can take several minutes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import mlflow.azureml\n",
|
||||
"\n",
|
||||
"azure_image, azure_model = mlflow.azureml.build_image(model_uri=\"runs:/{}/{}\".format(runid, model_save_path),\n",
|
||||
" workspace=ws,\n",
|
||||
" model_name='diabetes-sklearn-model',\n",
|
||||
" image_name='diabetes-sklearn-image',\n",
|
||||
" synchronous=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Deploy web service\n",
|
||||
"\n",
|
||||
"Let's use Azure Machine Learning SDK to deploy the image as a web service. \n",
|
||||
"\n",
|
||||
"First, specify the deployment configuration. Azure Container Instance is a suitable choice for a quick dev-test deployment, while Azure Kubernetes Service is suitable for scalable production deployments.\n",
|
||||
"\n",
|
||||
"Then, deploy the image using Azure Machine Learning SDK's ```deploy_from_image``` method.\n",
|
||||
"\n",
|
||||
"Note that the deployment can take several minutes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.webservice import AciWebservice, Webservice\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"aci_config = AciWebservice.deploy_configuration(cpu_cores=1, \n",
|
||||
" memory_gb=1, \n",
|
||||
" tags={\"method\" : \"sklearn\"}, \n",
|
||||
" description='Diabetes model',\n",
|
||||
" location='eastus2')\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"# Deploy the image to Azure Container Instances (ACI) for real-time serving\n",
|
||||
"webservice = Webservice.deploy_from_image(\n",
|
||||
" image=azure_image, workspace=ws, name=\"diabetes-model-1\", deployment_config=aci_config)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"webservice.wait_for_deployment(show_output=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Make a scoring request\n",
|
||||
"\n",
|
||||
"Let's take the first few rows of test data and score them using the web service"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"test_rows = [\n",
|
||||
" [0.01991321, 0.05068012, 0.10480869, 0.07007254, -0.03596778,\n",
|
||||
" -0.0266789 , -0.02499266, -0.00259226, 0.00371174, 0.04034337],\n",
|
||||
" [-0.01277963, -0.04464164, 0.06061839, 0.05285819, 0.04796534,\n",
|
||||
" 0.02937467, -0.01762938, 0.03430886, 0.0702113 , 0.00720652],\n",
|
||||
" [ 0.03807591, 0.05068012, 0.00888341, 0.04252958, -0.04284755,\n",
|
||||
" -0.02104223, -0.03971921, -0.00259226, -0.01811827, 0.00720652]]"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"MLflow-based web service for scikit-learn model requires the data to be converted to Pandas DataFrame, and then serialized as JSON. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import json\n",
|
||||
"import pandas as pd\n",
|
||||
"\n",
|
||||
"test_rows_as_json = pd.DataFrame(test_rows).to_json(orient=\"split\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Let's pass the conveted and serialized data to web service to get the predictions."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"predictions = webservice.run(test_rows_as_json)\n",
|
||||
"\n",
|
||||
"print(predictions)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can use the web service's scoring URI to make a raw HTTP request"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"webservice.scoring_uri"
|
||||
]
|
||||
},
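{
"cell_type": "markdown",
"metadata": {},
"source": [
"For example, here is a minimal sketch of a raw scoring request using the ```requests``` library. It reuses the ```test_rows_as_json``` payload prepared above and assumes the ACI endpoint has authentication disabled (the default for this configuration)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"# Assumes an unauthenticated ACI endpoint; add an Authorization header if auth is enabled\n",
"response = requests.post(url=webservice.scoring_uri,\n",
"                         data=test_rows_as_json,\n",
"                         headers={\"Content-type\": \"application/json\"})\n",
"print(response.text)"
]
},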
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can diagnose the web service using ```get_logs``` method."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"webservice.get_logs()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Next Steps\n",
|
||||
"\n",
|
||||
"Learn about [model management and inference in Azure Machine Learning service](https://docs.microsoft.com/en-us/azure/machine-learning/service/concept-model-management-and-deployment)."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "shipatel"
|
||||
}
|
||||
],
|
||||
"category": "deployment",
|
||||
"compute": [
|
||||
"None"
|
||||
],
|
||||
"datasets": [
|
||||
"Diabetes"
|
||||
],
|
||||
"deployment": [
|
||||
"Azure Container Instance"
|
||||
],
|
||||
"exclude_from_index": false,
|
||||
"framework": [
|
||||
"Scikit-learn"
|
||||
],
|
||||
"friendly_name": "Deploy a model as a web service using MLflow",
|
||||
"index_order": 4,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3.6",
|
||||
"language": "python",
|
||||
"name": "python36"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.4"
|
||||
},
|
||||
"tags": [
|
||||
"None"
|
||||
],
|
||||
"task": "Use MLflow with AML"
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
||||
@@ -1,8 +0,0 @@
|
||||
name: deploy-model
|
||||
dependencies:
|
||||
- scikit-learn
|
||||
- matplotlib
|
||||
- pip:
|
||||
- azureml-sdk
|
||||
- azureml-mlflow
|
||||
- pandas
|
||||
@@ -1,150 +0,0 @@
|
||||
# Copyright (c) 2017, PyTorch Team
|
||||
# All rights reserved
|
||||
# Licensed under BSD 3-Clause License.
|
||||
|
||||
# This example is based on PyTorch MNIST example:
|
||||
# https://github.com/pytorch/examples/blob/master/mnist/main.py
|
||||
|
||||
import mlflow
|
||||
import mlflow.pytorch
|
||||
from mlflow.utils.environment import _mlflow_conda_env
|
||||
import warnings
|
||||
import cloudpickle
|
||||
import torch
|
||||
import torch.nn as nn
|
||||
import torch.nn.functional as F
|
||||
import torch.optim as optim
|
||||
import torchvision
|
||||
from torchvision import datasets, transforms
|
||||
|
||||
|
||||
class Net(nn.Module):
|
||||
def __init__(self):
|
||||
super(Net, self).__init__()
|
||||
self.conv1 = nn.Conv2d(1, 20, 5, 1)
|
||||
self.conv2 = nn.Conv2d(20, 50, 5, 1)
|
||||
self.fc1 = nn.Linear(4 * 4 * 50, 500)
|
||||
self.fc2 = nn.Linear(500, 10)
|
||||
|
||||
def forward(self, x):
|
||||
# Added the view for reshaping score requests
|
||||
x = x.view(-1, 1, 28, 28)
|
||||
x = F.relu(self.conv1(x))
|
||||
x = F.max_pool2d(x, 2, 2)
|
||||
x = F.relu(self.conv2(x))
|
||||
x = F.max_pool2d(x, 2, 2)
|
||||
x = x.view(-1, 4 * 4 * 50)
|
||||
x = F.relu(self.fc1(x))
|
||||
x = self.fc2(x)
|
||||
return F.log_softmax(x, dim=1)
|
||||
|
||||
|
||||
def train(args, model, device, train_loader, optimizer, epoch):
|
||||
model.train()
|
||||
for batch_idx, (data, target) in enumerate(train_loader):
|
||||
data, target = data.to(device), target.to(device)
|
||||
optimizer.zero_grad()
|
||||
output = model(data)
|
||||
loss = F.nll_loss(output, target)
|
||||
loss.backward()
|
||||
optimizer.step()
|
||||
if batch_idx % args.log_interval == 0:
|
||||
print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
|
||||
epoch, batch_idx * len(data), len(train_loader.dataset),
|
||||
100. * batch_idx / len(train_loader), loss.item()))
|
||||
# Use MLflow logging
|
||||
mlflow.log_metric("epoch_loss", loss.item())
|
||||
|
||||
|
||||
def test(args, model, device, test_loader):
|
||||
model.eval()
|
||||
test_loss = 0
|
||||
correct = 0
|
||||
with torch.no_grad():
|
||||
for data, target in test_loader:
|
||||
data, target = data.to(device), target.to(device)
|
||||
output = model(data)
|
||||
# sum up batch loss
|
||||
test_loss += F.nll_loss(output, target, reduction="sum").item()
|
||||
# get the index of the max log-probability
|
||||
pred = output.argmax(dim=1, keepdim=True)
|
||||
correct += pred.eq(target.view_as(pred)).sum().item()
|
||||
|
||||
test_loss /= len(test_loader.dataset)
|
||||
print("\n")
|
||||
print("Test set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n".format(
|
||||
test_loss, correct, len(test_loader.dataset),
|
||||
100. * correct / len(test_loader.dataset)))
|
||||
# Use MLflow logging
|
||||
mlflow.log_metric("average_loss", test_loss)
|
||||
|
||||
|
||||
class Args(object):
|
||||
pass
|
||||
|
||||
|
||||
# Training settings
|
||||
args = Args()
|
||||
setattr(args, 'batch_size', 64)
|
||||
setattr(args, 'test_batch_size', 1000)
|
||||
setattr(args, 'epochs', 3) # Higher number for better convergence
|
||||
setattr(args, 'lr', 0.01)
|
||||
setattr(args, 'momentum', 0.5)
|
||||
setattr(args, 'no_cuda', True)
|
||||
setattr(args, 'seed', 1)
|
||||
setattr(args, 'log_interval', 10)
|
||||
setattr(args, 'save_model', True)
|
||||
|
||||
use_cuda = not args.no_cuda and torch.cuda.is_available()
|
||||
|
||||
torch.manual_seed(args.seed)
|
||||
|
||||
device = torch.device("cuda" if use_cuda else "cpu")
|
||||
|
||||
kwargs = {'num_workers': 1, 'pin_memory': True} if use_cuda else {}
|
||||
train_loader = torch.utils.data.DataLoader(
|
||||
datasets.MNIST('../data', train=True, download=True,
|
||||
transform=transforms.Compose([
|
||||
transforms.ToTensor(),
|
||||
transforms.Normalize((0.1307,), (0.3081,))
|
||||
])),
|
||||
batch_size=args.batch_size, shuffle=True, **kwargs)
|
||||
test_loader = torch.utils.data.DataLoader(
|
||||
datasets.MNIST(
|
||||
'../data',
|
||||
train=False,
|
||||
transform=transforms.Compose([
|
||||
transforms.ToTensor(),
|
||||
transforms.Normalize((0.1307,), (0.3081,))])),
|
||||
batch_size=args.test_batch_size, shuffle=True, **kwargs)
|
||||
|
||||
|
||||
def driver():
|
||||
warnings.filterwarnings("ignore")
|
||||
# Dependencies for deploying the model
|
||||
pytorch_index = "https://download.pytorch.org/whl/"
|
||||
pytorch_version = "cpu/torch-1.1.0-cp36-cp36m-linux_x86_64.whl"
|
||||
deps = [
|
||||
"cloudpickle=={}".format(cloudpickle.__version__),
|
||||
pytorch_index + pytorch_version,
|
||||
"torchvision=={}".format(torchvision.__version__),
|
||||
"Pillow=={}".format("6.0.0")
|
||||
]
|
||||
with mlflow.start_run() as run:
|
||||
model = Net().to(device)
|
||||
optimizer = optim.SGD(
|
||||
model.parameters(),
|
||||
lr=args.lr,
|
||||
momentum=args.momentum)
|
||||
for epoch in range(1, args.epochs + 1):
|
||||
train(args, model, device, train_loader, optimizer, epoch)
|
||||
test(args, model, device, test_loader)
|
||||
# Log model to run history using MLflow
|
||||
if args.save_model:
|
||||
model_env = _mlflow_conda_env(additional_pip_deps=deps)
|
||||
mlflow.pytorch.log_model(model, "model", conda_env=model_env)
|
||||
return run
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
driver()
|
||||
@@ -1,501 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||
"\n",
|
||||
"Licensed under the MIT License."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Use MLflow with Azure Machine Learning to Train and Deploy PyTorch Image Classifier\n",
|
||||
"\n",
|
||||
"This example shows you how to use MLflow together with Azure Machine Learning services for tracking the metrics and artifacts while training a PyTorch model to classify MNIST digit images, and then deploy the model as a web service. You'll learn how to:\n",
|
||||
"\n",
|
||||
" 1. Set up MLflow tracking URI so as to use Azure ML\n",
|
||||
" 2. Create experiment\n",
|
||||
" 3. Instrument your model with MLflow tracking\n",
|
||||
" 4. Train a PyTorch model locally\n",
|
||||
" 5. Train a model on GPU compute on Azure\n",
|
||||
" 6. View your experiment within your Azure ML Workspace in Azure Portal\n",
|
||||
" 7. Create a Docker image from the trained model\n",
|
||||
" 8. Deploy the model as a web service on Azure Container Instance\n",
|
||||
" 9. Call the model to make predictions\n",
|
||||
" \n",
|
||||
"### Pre-requisites\n",
|
||||
" \n",
|
||||
"Make sure you have completed the [Configuration](../../../configuration.ipnyb) notebook to set up your Azure Machine Learning workspace and ensure other common prerequisites are met.\n",
|
||||
"\n",
|
||||
"Also, install mlflow-azureml package using ```pip install mlflow-azureml```. Note that mlflow-azureml installs mlflow package itself as a dependency, if you haven't done so previously.\n",
|
||||
"\n",
|
||||
"### Set-up\n",
|
||||
"\n",
|
||||
"Import packages and check versions of Azure ML SDK and MLflow installed on your computer. Then connect to your Workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import sys, os\n",
|
||||
"import mlflow\n",
|
||||
"import mlflow.azureml\n",
|
||||
"import mlflow.sklearn\n",
|
||||
"\n",
|
||||
"import azureml.core\n",
|
||||
"from azureml.core import Workspace\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"print(\"SDK version:\", azureml.core.VERSION)\n",
|
||||
"print(\"MLflow version:\", mlflow.version.VERSION)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"ws = Workspace.from_config()\n",
|
||||
"ws.get_details()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Set tracking URI\n",
|
||||
"\n",
|
||||
"Set the MLFlow tracking URI to point to your Azure ML Workspace. The subsequent logging calls from MLFlow APIs will go to Azure ML services and will be tracked under your Workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"mlflow.set_tracking_uri(ws.get_mlflow_tracking_uri())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create Experiment\n",
|
||||
"\n",
|
||||
"In both MLflow and Azure ML, training runs are grouped into experiments. Let's create one for our experimentation."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"experiment_name = \"pytorch-with-mlflow\"\n",
|
||||
"mlflow.set_experiment(experiment_name)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Train model locally while logging metrics and artifacts\n",
|
||||
"\n",
|
||||
"The ```scripts/train.py``` program contains the code to load the image dataset, and train and test the model. Within this program, the train.driver function wraps the end-to-end workflow.\n",
|
||||
"\n",
|
||||
"Within the driver, the ```mlflow.start_run``` starts MLflow tracking. Then, ```mlflow.log_metric``` functions are used to track the convergence of the neural network training iterations. Finally ```mlflow.pytorch.save_model``` is used to save the trained model in framework-aware manner.\n",
|
||||
"\n",
|
||||
"Let's add the program to search path, import it as a module, and then invoke the driver function. Note that the training can take few minutes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"lib_path = os.path.abspath(\"scripts\")\n",
|
||||
"sys.path.append(lib_path)\n",
|
||||
"\n",
|
||||
"import train"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run = train.driver()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can view the metrics of the run at Azure Portal"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"print(azureml.mlflow.get_portal_url(run))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Train model on GPU compute on Azure\n",
|
||||
"\n",
|
||||
"Next, let's run the same script on GPU-enabled compute for faster training. If you've completed the the [Configuration](../../../configuration.ipnyb) notebook, you should have a GPU cluster named \"gpu-cluster\" available in your workspace. Otherwise, follow the instructions in the notebook to create one. For simplicity, this example uses single process on single VM to train the model.\n",
|
||||
"\n",
|
||||
"Create a PyTorch estimator to specify the training configuration: script, compute as well as additional packages needed. To enable MLflow tracking, include ```azureml-mlflow``` as pip package. The low-level specifications for the training run are encapsulated in the estimator instance."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.train.dnn import PyTorch\n",
|
||||
"\n",
|
||||
"pt = PyTorch(source_directory=\"./scripts\", \n",
|
||||
" entry_script = \"train.py\", \n",
|
||||
" compute_target = \"gpu-cluster\", \n",
|
||||
" node_count = 1, \n",
|
||||
" process_count_per_node = 1, \n",
|
||||
" use_gpu=True,\n",
|
||||
" pip_packages = [\"azureml-mlflow\", \"Pillow==6.0.0\"])\n",
|
||||
"\n"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Get a reference to the experiment you created previously, but this time, as Azure Machine Learning experiment object.\n",
|
||||
"\n",
|
||||
"Then, use ```Experiment.submit``` method to start the remote training run. Note that the first training run often takes longer as Azure Machine Learning service builds the Docker image for executing the script. Subsequent runs will be faster as cached image is used."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Experiment\n",
|
||||
"\n",
|
||||
"exp = Experiment(ws, experiment_name)\n",
|
||||
"run = exp.submit(pt)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can monitor the run and its metrics on Azure Portal."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Also, you can wait for run to complete."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run.wait_for_completion(show_output=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Deploy model as web service\n",
|
||||
"\n",
|
||||
"To deploy a web service, first create a Docker image, and then deploy that Docker image on inferencing compute.\n",
|
||||
"\n",
|
||||
"The ```mlflow.azureml.build_image``` function builds a Docker image from saved PyTorch model in a framework-aware manner. It automatically creates the PyTorch-specific inferencing wrapper code and specififies package dependencies for you."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run.get_file_names()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Then build a docker image using *runs:/<run.id>/model* as the model_uri path.\n",
|
||||
"\n",
|
||||
"Note that the image building can take several minutes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"model_path = \"model\"\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"azure_image, azure_model = mlflow.azureml.build_image(model_uri='runs:/{}/{}'.format(run.id, model_path),\n",
|
||||
" workspace=ws,\n",
|
||||
" model_name='pytorch_mnist',\n",
|
||||
" image_name='pytorch-mnist-img',\n",
|
||||
" synchronous=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Then, deploy the Docker image to Azure Container Instance: a serverless compute capable of running a single container. You can tag and add descriptions to help keep track of your web service. \n",
|
||||
"\n",
|
||||
"[Other inferencing compute choices](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-deploy-and-where) include Azure Kubernetes Service which provides scalable endpoint suitable for production use.\n",
|
||||
"\n",
|
||||
"Note that the service deployment can take several minutes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.webservice import AciWebservice, Webservice\n",
|
||||
"\n",
|
||||
"aci_config = AciWebservice.deploy_configuration(cpu_cores=2, \n",
|
||||
" memory_gb=5, \n",
|
||||
" tags={\"data\": \"MNIST\", \"method\" : \"pytorch\"}, \n",
|
||||
" description=\"Predict using webservice\")\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"# Deploy the image to Azure Container Instances (ACI) for real-time serving\n",
|
||||
"webservice = Webservice.deploy_from_image(\n",
|
||||
" image=azure_image, workspace=ws, name=\"pytorch-mnist-1\", deployment_config=aci_config)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"webservice.wait_for_deployment()"
|
||||
]
|
||||
},
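{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a hedged sketch of the production path mentioned above (not run here): if an AKS cluster is already attached to your workspace, the same image can be deployed with an ```AksWebservice``` configuration. The cluster name ```aks-cluster``` below is an assumption."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AksCompute\n",
"from azureml.core.webservice import AksWebservice\n",
"\n",
"# Assumes an AKS cluster named 'aks-cluster' is already attached to the workspace\n",
"aks_target = AksCompute(ws, 'aks-cluster')\n",
"aks_config = AksWebservice.deploy_configuration(cpu_cores=2, memory_gb=5)\n",
"\n",
"aks_service = Webservice.deploy_from_image(workspace=ws,\n",
"                                           name='pytorch-mnist-aks',\n",
"                                           image=azure_image,\n",
"                                           deployment_config=aks_config,\n",
"                                           deployment_target=aks_target)\n",
"aks_service.wait_for_deployment(show_output=True)"
]
},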
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Once the deployment has completed you can check the scoring URI of the web service."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"print(\"Scoring URI is: {}\".format(webservice.scoring_uri))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"In case of a service creation issue, you can use ```webservice.get_logs()``` to get logs to debug."
|
||||
]
|
||||
},
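{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Print the container logs collected so far; useful when diagnosing a failed deployment\n",
"print(webservice.get_logs())"
]
},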
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Make predictions using web service\n",
|
||||
"\n",
|
||||
"To make the web service, create a test data set as normalized PyTorch tensors. \n",
|
||||
"\n",
|
||||
"Then, let's define a utility function that takes a random image and converts it into format and shape suitable for as input to PyTorch inferencing end-point. The conversion is done by: \n",
|
||||
"\n",
|
||||
" 1. Select a random (image, label) tuple\n",
|
||||
" 2. Take the image and converting the tensor to NumPy array \n",
|
||||
" 3. Reshape array into 1 x 1 x N array\n",
|
||||
" * 1 image in batch, 1 color channel, N = 784 pixels for MNIST images\n",
|
||||
" * Note also ```x = x.view(-1, 1, 28, 28)``` in net definition in ```train.py``` program to shape incoming scoring requests.\n",
|
||||
" 4. Convert the NumPy array to list to make it into a built-in type.\n",
|
||||
" 5. Create a dictionary {\"data\", <list>} that can be converted to JSON string for web service requests."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from torchvision import datasets, transforms\n",
|
||||
"import random\n",
|
||||
"import numpy as np\n",
|
||||
"\n",
|
||||
"test_data = datasets.MNIST('../data', train=False, transform=transforms.Compose([\n",
|
||||
" transforms.ToTensor(),\n",
|
||||
" transforms.Normalize((0.1307,), (0.3081,))]))\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"def get_random_image():\n",
|
||||
" image_idx = random.randint(0,len(test_data))\n",
|
||||
" image_as_tensor = test_data[image_idx][0]\n",
|
||||
" return {\"data\": elem for elem in image_as_tensor.numpy().reshape(1,1,-1).tolist()}"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Then, invoke the web service using a random test image. Convert the dictionary containing the image to JSON string before passing it to web service.\n",
|
||||
"\n",
|
||||
"The response contains the raw scores for each label, with greater value indicating higher probability. Sort the labels and select the one with greatest score to get the prediction. Let's also plot the image sent to web service for comparison purposes."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%matplotlib inline\n",
|
||||
"\n",
|
||||
"import json\n",
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"\n",
|
||||
"test_image = get_random_image()\n",
|
||||
"\n",
|
||||
"response = webservice.run(json.dumps(test_image))\n",
|
||||
"\n",
|
||||
"response = sorted(response[0].items(), key = lambda x: x[1], reverse = True)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"print(\"Predicted label:\", response[0][0])\n",
|
||||
"plt.imshow(np.array(test_image[\"data\"]).reshape(28,28), cmap = \"gray\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can also call the web service using a raw POST method against the web service"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import requests\n",
|
||||
"\n",
|
||||
"response = requests.post(url=webservice.scoring_uri, data=json.dumps(test_image),headers={\"Content-type\": \"application/json\"})\n",
|
||||
"print(response.text)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "shipatel"
|
||||
}
|
||||
],
|
||||
"category": "tutorial",
|
||||
"celltoolbar": "Edit Metadata",
|
||||
"compute": [
|
||||
"AML Compute"
|
||||
],
|
||||
"datasets": [
|
||||
"MNIST"
|
||||
],
|
||||
"deployment": [
|
||||
"Azure Container Instance"
|
||||
],
|
||||
"exclude_from_index": false,
|
||||
"framework": [
|
||||
"PyTorch"
|
||||
],
|
||||
"friendly_name": "Use MLflow with Azure Machine Learning for training and deployment",
|
||||
"index_order": 6,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3.6",
|
||||
"language": "python",
|
||||
"name": "python36"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.7.3"
|
||||
},
|
||||
"name": "mlflow-sparksummit-pytorch",
|
||||
"notebookId": 2495374963457641,
|
||||
"tags": [
|
||||
"None"
|
||||
],
|
||||
"task": "Use MLflow with Azure Machine Learning to train and deploy Pa yTorch image classifier model"
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 1
|
||||
}
|
||||
@@ -1,8 +0,0 @@
|
||||
name: train-and-deploy-pytorch
|
||||
dependencies:
|
||||
- matplotlib
|
||||
- pip:
|
||||
- azureml-sdk
|
||||
- azureml-mlflow
|
||||
- https://download.pytorch.org/whl/cpu/torch-1.1.0-cp35-cp35m-win_amd64.whl
|
||||
- https://download.pytorch.org/whl/cpu/torchvision-0.3.0-cp35-cp35m-win_amd64.whl
|
||||
@@ -2,11 +2,9 @@ name: train-on-remote-vm
|
||||
dependencies:
|
||||
- matplotlib
|
||||
- tqdm
|
||||
- scikit-learn
|
||||
- scikit-learn==0.22.1
|
||||
- numpy==1.18.1
|
||||
- pip:
|
||||
- azureml-sdk
|
||||
- azureml-widgets
|
||||
- azureml-dataprep
|
||||
- pandas
|
||||
- fuse
|
||||
- scikit-learn
|
||||
- azureml-dataprep[fuse,pandas]
|
||||
|
||||
@@ -18,7 +18,7 @@ Methods to be deprecated|Replacement in the new version|
|
||||
[Dataset.from_parquet_files()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#from-parquet-files-path--include-path-false--partition-format-none-)|[Dataset.Tabular.from_parquet_files()](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-parquet-files-path--validate-true--include-path-false--set-column-types-none-)
|
||||
[Dataset.from_sql_query()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#from-sql-query-data-source--query-)|[Dataset.Tabular.from_sql_query()](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-sql-query-query--validate-true--set-column-types-none-)
|
||||
[Dataset.from_excel_files()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#from-excel-files-path--sheet-name-none--use-column-headers-false--skip-rows-0--include-path-false--infer-column-types-true--partition-format-none-)|We will support creating a TabularDataset from Excel files in a future release.
|
||||
[Dataset.from_json_files()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#from-json-files-path--encoding--fileencoding-utf8--0---flatten-nested-arrays-false--include-path-false--partition-format-none-)| We will support creating a TabularDataset from json files in a future release.
|
||||
[Dataset.from_json_files()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#from-json-files-path--encoding--fileencoding-utf8--0---flatten-nested-arrays-false--include-path-false--partition-format-none-)| [Dataset.Tabular.from_json_lines_files](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-json-lines-files-path--validate-true--include-path-false--set-column-types-none--partition-format-none-)
|
||||
[Dataset.to_pandas_dataframe()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#to-pandas-dataframe--)|[TabularDataset.to_pandas_dataframe()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py#to-pandas-dataframe--)
|
||||
[Dataset.to_spark_dataframe()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#to-spark-dataframe--)|[TabularDataset.to_spark_dataframe()](https://docs.microsoft.com/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py#to-spark-dataframe--)
|
||||
[Dataset.head(3)](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#head-count-)|[TabularDataset.take(3).to_pandas_dataframe()](https://docs.microsoft.com/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py#take-count-)
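For example, a minimal sketch of the new typed pattern (the datastore path `data/*.parquet` is a placeholder; the other factory methods above follow the same shape):

```Python
from azureml.core import Workspace, Dataset

workspace = Workspace.from_config()
datastore = workspace.get_default_datastore()

# Create a TabularDataset from parquet files on the default datastore
tabular_ds = Dataset.Tabular.from_parquet_files(path=(datastore, 'data/*.parquet'))
df = tabular_ds.to_pandas_dataframe()
```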
|
||||
@@ -29,27 +29,13 @@ Methods to be deprecated|Replacement in the new version|
|
||||
## Why should I use the new Dataset API if I'm only dealing with tabular data?
|
||||
The current Dataset will be kept around for backward compatibility, but we strongly encourage you to move to TabularDataset for the new capabilities listed below:
|
||||
|
||||
- You are able to version and track the new typed Datasets. [Learn How](https://aka.ms/azureml/howto/versiondata)
|
||||
- You are able to use TabularDatasets as automated ML input. [Learn How](https://aka.ms/automl-dataset)
|
||||
- You are able to version the new typed Datasets. [Learn How](https://aka.ms/azureml/howto/createdatasets)
|
||||
- You will be able to use the new typed Datasets as ScriptRun, Estimator, HyperDrive input.
|
||||
- You will be able to use the new typed Datasets in Azure Machine Learning Pipelines.
|
||||
- You will be able to track the lineage of new typed Datasets for model reproducibility.
|
||||
|
||||
- You are able to use the new typed Datasets as ScriptRun, Estimator, HyperDrive input. [Learn How](https://aka.ms/train-with-datasets)
|
||||
- You are able to use the new typed Datasets in Azure Machine Learning Pipelines. [Learn How](https://aka.ms/pl-datasets)
|
||||
|
||||
## How to migrate registered Datasets to new typed Datasets?
|
||||
If you have registered Datasets created using the old API, you can easily migrate these old Datasets to the new typed Datasets using the following code.
|
||||
```Python
|
||||
from azureml.core.workspace import Workspace
|
||||
from azureml.core.dataset import Dataset
|
||||
|
||||
# get existing workspace
|
||||
workspace = Workspace.from_config()
|
||||
# This method will convert old Dataset without type to either a TabularDataset or a FileDataset object automatically.
|
||||
new_ds = Dataset.get_by_name(workspace, 'old_ds_name')
|
||||
|
||||
# register the new typed Dataset with the workspace
|
||||
new_ds.register(workspace, 'new_ds_name')
|
||||
```
|
||||
We handled the migration for you. All legacy datasets are migrated to new typed Datasets automatically. To use registered datasets, simply call [Dataset.get_by_name](https://docs.microsoft.com/python/api/azureml-core/azureml.core.dataset.dataset?view=azure-ml-py#get-by-name-workspace--name--version--latest--).
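For example, a minimal sketch (the name 'my_dataset' is a placeholder for a dataset already registered in your workspace):

```Python
from azureml.core.workspace import Workspace
from azureml.core.dataset import Dataset

workspace = Workspace.from_config()

# Returns a TabularDataset or FileDataset depending on how the data was registered
ds = Dataset.get_by_name(workspace, 'my_dataset')
```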
|
||||
|
||||
## How to provide feedback?
|
||||
If you have any feedback about our product, or if there is any missing capability that is essential for you to use new Dataset API, please email us at [AskAzureMLData@microsoft.com](mailto:AskAzureMLData@microsoft.com).
|
||||
|
||||
@@ -1,403 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||
"\n",
|
||||
"Licensed under the MIT License."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Introduction to labeled datasets\n",
|
||||
"\n",
|
||||
"Labeled datasets are output from Azure Machine Learning [labeling projects](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-create-labeling-projects). It captures the reference to the data (e.g. image files) and its labels. \n",
|
||||
"\n",
|
||||
"This tutorial introduces the capabilities of labeled datasets and how to use it in training.\n",
|
||||
"\n",
|
||||
"Learn how-to:\n",
|
||||
"\n",
|
||||
"> * Set up your development environment\n",
|
||||
"> * Explore labeled datasets\n",
|
||||
"> * Train a simple deep learning neural network on a remote cluster\n",
|
||||
"\n",
|
||||
"## Prerequisite:\n",
|
||||
"* Understand the [architecture and terms](https://docs.microsoft.com/azure/machine-learning/service/concept-azure-machine-learning-architecture) introduced by Azure Machine Learning\n",
|
||||
"* Go through Azure Machine Learning [labeling projects](https://docs.microsoft.com/azure/machine-learning/service/how-to-create-labeling-projects) and export the labels as an Azure Machine Learning dataset\n",
|
||||
"* Go through the [configuration notebook](../../../configuration.ipynb) to:\n",
|
||||
" * install the latest version of azureml-sdk\n",
|
||||
" * install the latest version of azureml-contrib-dataset\n",
|
||||
" * install [PyTorch](https://pytorch.org/)\n",
|
||||
" * create a workspace and its configuration file (`config.json`)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Set up your development environment\n",
|
||||
"\n",
|
||||
"All the setup for your development work can be accomplished in a Python notebook. Setup includes:\n",
|
||||
"\n",
|
||||
"* Importing Python packages\n",
|
||||
"* Connecting to a workspace to enable communication between your local computer and remote resources\n",
|
||||
"* Creating an experiment to track all your runs\n",
|
||||
"* Creating a remote compute target to use for training\n",
|
||||
"\n",
|
||||
"### Import packages\n",
|
||||
"\n",
|
||||
"Import Python packages you need in this session. Also display the Azure Machine Learning SDK version."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import os\n",
|
||||
"import azureml.core\n",
|
||||
"import azureml.contrib.dataset\n",
|
||||
"from azureml.core import Dataset, Workspace, Experiment\n",
|
||||
"from azureml.contrib.dataset import FileHandlingOption\n",
|
||||
"\n",
|
||||
"# check core SDK version number\n",
|
||||
"print(\"Azure ML SDK Version: \", azureml.core.VERSION)\n",
|
||||
"print(\"Azure ML Contrib Version\", azureml.contrib.dataset.VERSION)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Connect to workspace\n",
|
||||
"\n",
|
||||
"Create a workspace object from the existing workspace. `Workspace.from_config()` reads the file **config.json** and loads the details into an object named `workspace`."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# load workspace\n",
|
||||
"workspace = Workspace.from_config()\n",
|
||||
"print('Workspace name: ' + workspace.name, \n",
|
||||
" 'Azure region: ' + workspace.location, \n",
|
||||
" 'Subscription id: ' + workspace.subscription_id, \n",
|
||||
" 'Resource group: ' + workspace.resource_group, sep='\\n')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create experiment and a directory\n",
|
||||
"\n",
|
||||
"Create an experiment to track the runs in your workspace and a directory to deliver the necessary code from your computer to the remote resource."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# create an ML experiment\n",
|
||||
"exp = Experiment(workspace=workspace, name='labeled-datasets')\n",
|
||||
"\n",
|
||||
"# create a directory\n",
|
||||
"script_folder = './labeled-datasets'\n",
|
||||
"os.makedirs(script_folder, exist_ok=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create or Attach existing compute resource\n",
|
||||
"By using Azure Machine Learning Compute, a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. Examples include VMs with GPU support. In this tutorial, you will create Azure Machine Learning Compute as your training environment. The code below creates the compute clusters for you if they don't already exist in your workspace.\n",
|
||||
"\n",
|
||||
"**Creation of compute takes approximately 5 minutes.** If the AmlCompute with that name is already in your workspace the code will skip the creation process."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
|
||||
"from azureml.core.compute_target import ComputeTargetException\n",
|
||||
"\n",
|
||||
"# choose a name for your cluster\n",
|
||||
"cluster_name = \"openhack\"\n",
|
||||
"\n",
|
||||
"try:\n",
|
||||
" compute_target = ComputeTarget(workspace=workspace, name=cluster_name)\n",
|
||||
" print('Found existing compute target')\n",
|
||||
"except ComputeTargetException:\n",
|
||||
" print('Creating a new compute target...')\n",
|
||||
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_NC6', \n",
|
||||
" max_nodes=4)\n",
|
||||
"\n",
|
||||
" # create the cluster\n",
|
||||
" compute_target = ComputeTarget.create(workspace, cluster_name, compute_config)\n",
|
||||
"\n",
|
||||
" # can poll for a minimum number of nodes and for a specific timeout. \n",
|
||||
" # if no min node count is provided it uses the scale settings for the cluster\n",
|
||||
" compute_target.wait_for_completion(show_output=True, min_node_count=None, timeout_in_minutes=20)\n",
|
||||
"\n",
|
||||
"# use get_status() to get a detailed status for the current cluster. \n",
|
||||
"print(compute_target.get_status().serialize())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Explore labeled datasets\n",
|
||||
"\n",
|
||||
"**Note**: How to create labeled datasets is not covered in this tutorial. To create labeled datasets, you can go through [labeling projects](https://docs.microsoft.com/azure/machine-learning/service/how-to-create-labeling-projects) and export the output labels as Azure Machine Lerning datasets. \n",
|
||||
"\n",
|
||||
"`animal_labels` used in this tutorial section is the output from a labeling project, with the task type of \"Object Identification\"."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# get animal_labels dataset from the workspace\n",
|
||||
"animal_labels = Dataset.get_by_name(workspace, 'animal_labels')\n",
|
||||
"animal_labels"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can load labeled datasets into pandas DataFrame. There are 3 file handling option that you can choose to load the data files referenced by the labeled datasets:\n",
|
||||
"* Streaming: The default option to load data files.\n",
|
||||
"* Download: Download your data files to a local path.\n",
|
||||
"* Mount: Mount your data files to a mount point. Mount only works for Linux-based compute, including Azure Machine Learning notebook VM and Azure Machine Learning Compute."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"animal_pd = animal_labels.to_pandas_dataframe(file_handling_option=FileHandlingOption.DOWNLOAD, target_path='./download/', overwrite_download=True)\n",
|
||||
"animal_pd"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import matplotlib.pyplot as plt\n",
|
||||
"import matplotlib.image as mpimg\n",
|
||||
"\n",
|
||||
"# read images from downloaded path\n",
|
||||
"img = mpimg.imread(animal_pd.loc[0,'image_url'])\n",
|
||||
"imgplot = plt.imshow(img)"
|
||||
]
|
||||
},
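{
"cell_type": "markdown",
"metadata": {},
"source": [
"For comparison, a minimal sketch of the default streaming option, which loads the same labels without persisting the image files to local disk:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Streaming is the default file handling option; no files are downloaded\n",
"animal_pd_streaming = animal_labels.to_pandas_dataframe()\n",
"animal_pd_streaming.head()"
]
},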
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can also load labeled datasets into [torchvision datasets](https://pytorch.org/docs/stable/torchvision/datasets.html), so that you can leverage on the open source libraries provided by PyTorch for image transformation and training."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from torchvision.transforms import functional as F\n",
|
||||
"\n",
|
||||
"# load animal_labels dataset into torchvision dataset\n",
|
||||
"pytorch_dataset = animal_labels.to_torchvision()\n",
|
||||
"img = pytorch_dataset[0][0]\n",
|
||||
"print(type(img))\n",
|
||||
"\n",
|
||||
"# use methods from torchvision to transform the img into grayscale\n",
|
||||
"pil_image = F.to_pil_image(img)\n",
|
||||
"gray_image = F.to_grayscale(pil_image, num_output_channels=3)\n",
|
||||
"\n",
|
||||
"imgplot = plt.imshow(gray_image)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Train an image classification model\n",
|
||||
"\n",
|
||||
" `crack_labels` dataset used in this tutorial section is the output from a labeling project, with the task type of \"Image Classification Multi-class\". We will use this dataset to train an image classification model that classify whether an image has cracks or not."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# get crack_labels dataset from the workspace\n",
|
||||
"crack_labels = Dataset.get_by_name(workspace, 'crack_labels')\n",
|
||||
"crack_labels"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Configure Estimator for training\n",
|
||||
"\n",
|
||||
"You can ask the system to build a conda environment based on your dependency specification. Once the environment is built, and if you don't change your dependencies, it will be reused in subsequent runs."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Environment\n",
|
||||
"from azureml.core.conda_dependencies import CondaDependencies\n",
|
||||
"\n",
|
||||
"conda_env = Environment('conda-env')\n",
|
||||
"conda_env.python.conda_dependencies = CondaDependencies.create(pip_packages=['azureml-sdk',\n",
|
||||
" 'azureml-contrib-dataset',\n",
|
||||
" 'torch','torchvision',\n",
|
||||
" 'azureml-dataprep[pandas]'])"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"An estimator object is used to submit the run. Azure Machine Learning has pre-configured estimators for common machine learning frameworks, as well as generic Estimator. Create a generic estimator for by specifying\n",
|
||||
"\n",
|
||||
"* The name of the estimator object, `est`\n",
|
||||
"* The directory that contains your scripts. All the files in this directory are uploaded into the cluster nodes for execution. \n",
|
||||
"* The training script name, train.py\n",
|
||||
"* The input dataset for training\n",
|
||||
"* The compute target. In this case you will use the AmlCompute you created\n",
|
||||
"* The environment definition for the experiment"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.train.estimator import Estimator\n",
|
||||
"\n",
|
||||
"est = Estimator(source_directory=script_folder, \n",
|
||||
" entry_script='train.py',\n",
|
||||
" inputs=[crack_labels.as_named_input('crack_labels')],\n",
|
||||
" compute_target=compute_target,\n",
|
||||
" environment_definition= conda_env)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Submit job to run\n",
|
||||
"\n",
|
||||
"Submit the estimator to the Azure ML experiment to kick off the execution."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run = exp.submit(est)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run.wait_for_completion(show_output=True)"
|
||||
]
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "sihhu"
|
||||
}
|
||||
],
|
||||
"category": "tutorial",
|
||||
"compute": [
|
||||
"Remote"
|
||||
],
|
||||
"deployment": [
|
||||
"None"
|
||||
],
|
||||
"exclude_from_index": false,
|
||||
"framework": [
|
||||
"Azure ML"
|
||||
],
|
||||
"friendly_name": "Introduction to labeled datasets",
|
||||
"index_order": 1,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3.6",
|
||||
"language": "python",
|
||||
"name": "python36"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.9"
|
||||
},
|
||||
"nteract": {
|
||||
"version": "nteract-front-end@1.0.0"
|
||||
},
|
||||
"star_tag": [
|
||||
"featured"
|
||||
],
|
||||
"tags": [
|
||||
"Dataset",
|
||||
"label",
|
||||
"Estimator"
|
||||
],
|
||||
"task": "Train"
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
||||
@@ -1,106 +0,0 @@
|
||||
import os
|
||||
import torchvision
|
||||
import torchvision.transforms as transforms
|
||||
import torch
|
||||
import torch.nn as nn
|
||||
import torch.nn.functional as F
|
||||
import torch.optim as optim
|
||||
|
||||
from azureml.core import Dataset, Run
|
||||
import azureml.contrib.dataset
|
||||
from azureml.contrib.dataset import FileHandlingOption, LabeledDatasetTask
|
||||
|
||||
run = Run.get_context()
|
||||
|
||||
# get input dataset by name
|
||||
labeled_dataset = run.input_datasets['crack_labels']
|
||||
pytorch_dataset = labeled_dataset.to_torchvision()
|
||||
|
||||
|
||||
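# Shuffle the labeled images and split them into a 40-image training set and a 10-image test set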
indices = torch.randperm(len(pytorch_dataset)).tolist()
|
||||
dataset_train = torch.utils.data.Subset(pytorch_dataset, indices[:40])
|
||||
dataset_test = torch.utils.data.Subset(pytorch_dataset, indices[-10:])
|
||||
|
||||
trainloader = torch.utils.data.DataLoader(dataset_train, batch_size=4,
|
||||
shuffle=True, num_workers=0)
|
||||
|
||||
testloader = torch.utils.data.DataLoader(dataset_test, batch_size=4,
|
||||
shuffle=True, num_workers=0)
|
||||
|
||||
|
||||
class Net(nn.Module):
|
||||
def __init__(self):
|
||||
super(Net, self).__init__()
|
||||
self.conv1 = nn.Conv2d(3, 6, 5)
|
||||
self.pool = nn.MaxPool2d(2, 2)
|
||||
self.conv2 = nn.Conv2d(6, 16, 5)
|
||||
self.fc1 = nn.Linear(16 * 71 * 71, 120)
|
||||
self.fc2 = nn.Linear(120, 84)
|
||||
self.fc3 = nn.Linear(84, 10)
|
||||
|
||||
def forward(self, x):
|
||||
x = self.pool(F.relu(self.conv1(x)))
|
||||
x = self.pool(F.relu(self.conv2(x)))
|
||||
x = x.view(x.size(0), 16 * 71 * 71)
|
||||
x = F.relu(self.fc1(x))
|
||||
x = F.relu(self.fc2(x))
|
||||
x = self.fc3(x)
|
||||
return x
|
||||
|
||||
|
||||
net = Net()
|
||||
|
||||
criterion = nn.CrossEntropyLoss()
|
||||
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
|
||||
|
||||
|
||||
for epoch in range(2): # loop over the dataset multiple times
|
||||
|
||||
running_loss = 0.0
|
||||
for i, data in enumerate(trainloader, 0):
|
||||
# get the inputs; data is a list of [inputs, labels]
|
||||
inputs, labels = data
|
||||
|
||||
# zero the parameter gradients
|
||||
optimizer.zero_grad()
|
||||
|
||||
# forward + backward + optimize
|
||||
outputs = net(inputs)
|
||||
loss = criterion(outputs, labels)
|
||||
loss.backward()
|
||||
optimizer.step()
|
||||
|
||||
# print statistics
|
||||
running_loss += loss.item()
|
||||
if i % 5 == 4: # print every 5 mini-batches
|
||||
print('[%d, %5d] loss: %.3f' %
|
||||
(epoch + 1, i + 1, running_loss / 5))
|
||||
running_loss = 0.0
|
||||
|
||||
print('Finished Training')
|
||||
classes = trainloader.dataset.dataset.labels
|
||||
PATH = './cifar_net.pth'
|
||||
torch.save(net.state_dict(), PATH)
|
||||
|
||||
dataiter = iter(testloader)
|
||||
images, labels = next(dataiter)
|
||||
|
||||
net = Net()
|
||||
net.load_state_dict(torch.load(PATH))
|
||||
|
||||
outputs = net(images)
|
||||
|
||||
_, predicted = torch.max(outputs, 1)
|
||||
|
||||
correct = 0
|
||||
total = 0
|
||||
with torch.no_grad():
|
||||
for data in testloader:
|
||||
images, labels = data
|
||||
outputs = net(images)
|
||||
_, predicted = torch.max(outputs.data, 1)
|
||||
total += labels.size(0)
|
||||
correct += (predicted == labels).sum().item()
|
||||
|
||||
print('Accuracy of the network on the 10 test images: %d %%' % (100 * correct / total))
|
||||
|
||||
@@ -1,35 +0,0 @@
|
||||
import os
|
||||
|
||||
|
||||
def convert(imgf, labelf, outf, n):
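# Read n MNIST idx-format images/labels and write a CSV where each row is [label, 784 pixel values]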
|
||||
f = open(imgf, "rb")
|
||||
l = open(labelf, "rb")
|
||||
o = open(outf, "w")
|
||||
|
||||
f.read(16)
|
||||
l.read(8)
|
||||
images = []
|
||||
|
||||
for i in range(n):
|
||||
image = [ord(l.read(1))]
|
||||
for j in range(28 * 28):
|
||||
image.append(ord(f.read(1)))
|
||||
images.append(image)
|
||||
|
||||
for image in images:
|
||||
o.write(",".join(str(pix) for pix in image) + "\n")
|
||||
f.close()
|
||||
o.close()
|
||||
l.close()
|
||||
|
||||
|
||||
mounted_input_path = os.environ['fashion_ds']
|
||||
mounted_output_path = os.environ['AZUREML_DATAREFERENCE_prepared_fashion_ds']
|
||||
os.makedirs(mounted_output_path, exist_ok=True)
|
||||
|
||||
convert(os.path.join(mounted_input_path, 'train-images-idx3-ubyte'),
|
||||
os.path.join(mounted_input_path, 'train-labels-idx1-ubyte'),
|
||||
os.path.join(mounted_output_path, 'mnist_train.csv'), 60000)
|
||||
convert(os.path.join(mounted_input_path, 't10k-images-idx3-ubyte'),
|
||||
os.path.join(mounted_input_path, 't10k-labels-idx1-ubyte'),
|
||||
os.path.join(mounted_output_path, 'mnist_test.csv'), 10000)
|
||||
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
@@ -1,120 +0,0 @@
|
||||
import keras
|
||||
from keras.models import Sequential
|
||||
from keras.layers import Dense, Dropout, Flatten
|
||||
from keras.layers import Conv2D, MaxPooling2D
|
||||
from keras.layers.normalization import BatchNormalization
|
||||
from keras.utils import to_categorical
|
||||
from keras.callbacks import Callback
|
||||
|
||||
import numpy as np
|
||||
import pandas as pd
|
||||
import os
|
||||
import matplotlib.pyplot as plt
|
||||
from sklearn.model_selection import train_test_split
|
||||
from azureml.core import Run
|
||||
|
||||
# dataset object from the run
|
||||
run = Run.get_context()
|
||||
dataset = run.input_datasets['prepared_fashion_ds']
|
||||
|
||||
# split dataset into train and test set
|
||||
(train_dataset, test_dataset) = dataset.random_split(percentage=0.8, seed=111)
|
||||
|
||||
# load dataset into pandas dataframe
|
||||
data_train = train_dataset.to_pandas_dataframe()
|
||||
data_test = test_dataset.to_pandas_dataframe()
|
||||
|
||||
img_rows, img_cols = 28, 28
|
||||
input_shape = (img_rows, img_cols, 1)
|
||||
|
||||
X = np.array(data_train.iloc[:, 1:])
|
||||
y = to_categorical(np.array(data_train.iloc[:, 0]))
|
||||
|
||||
# here we split out validation data to optimize the classifier during training
|
||||
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=13)
|
||||
|
||||
# test data
|
||||
X_test = np.array(data_test.iloc[:, 1:])
|
||||
y_test = to_categorical(np.array(data_test.iloc[:, 0]))
|
||||
|
||||
|
||||
X_train = X_train.reshape(X_train.shape[0], img_rows, img_cols, 1).astype('float32') / 255
|
||||
X_test = X_test.reshape(X_test.shape[0], img_rows, img_cols, 1).astype('float32') / 255
|
||||
X_val = X_val.reshape(X_val.shape[0], img_rows, img_cols, 1).astype('float32') / 255
|
||||
|
||||
batch_size = 256
|
||||
num_classes = 10
|
||||
epochs = 10
|
||||
|
||||
# construct the neural network
|
||||
model = Sequential()
|
||||
model.add(Conv2D(32, kernel_size=(3, 3),
|
||||
activation='relu',
|
||||
kernel_initializer='he_normal',
|
||||
input_shape=input_shape))
|
||||
model.add(MaxPooling2D((2, 2)))
|
||||
model.add(Dropout(0.25))
|
||||
model.add(Conv2D(64, (3, 3), activation='relu'))
|
||||
model.add(MaxPooling2D(pool_size=(2, 2)))
|
||||
model.add(Dropout(0.25))
|
||||
model.add(Conv2D(128, (3, 3), activation='relu'))
|
||||
model.add(Dropout(0.4))
|
||||
model.add(Flatten())
|
||||
model.add(Dense(128, activation='relu'))
|
||||
model.add(Dropout(0.3))
|
||||
model.add(Dense(num_classes, activation='softmax'))
|
||||
|
||||
model.compile(loss=keras.losses.categorical_crossentropy,
|
||||
optimizer=keras.optimizers.Adam(),
|
||||
metrics=['accuracy'])
|
||||
|
||||
# start an Azure ML run
|
||||
run = Run.get_context()
|
||||
|
||||
|
||||
class LogRunMetrics(Callback):
|
||||
# callback at the end of every epoch
|
||||
def on_epoch_end(self, epoch, log):
|
||||
# log a value repeatedly, which creates a list
|
||||
run.log('Loss', log['loss'])
|
||||
run.log('Accuracy', log['accuracy'])
|
||||
|
||||
|
||||
history = model.fit(X_train, y_train,
|
||||
batch_size=batch_size,
|
||||
epochs=epochs,
|
||||
verbose=1,
|
||||
validation_data=(X_val, y_val),
|
||||
callbacks=[LogRunMetrics()])
|
||||
|
||||
score = model.evaluate(X_test, y_test, verbose=0)
|
||||
|
||||
# log a single value
|
||||
run.log("Final test loss", score[0])
|
||||
print('Test loss:', score[0])
|
||||
|
||||
run.log('Final test accuracy', score[1])
|
||||
print('Test accuracy:', score[1])
|
||||
|
||||
plt.figure(figsize=(6, 3))
|
||||
plt.title('Fashion MNIST with Keras ({} epochs)'.format(epochs), fontsize=14)
|
||||
plt.plot(history.history['accuracy'], 'b-', label='Accuracy', lw=4, alpha=0.5)
|
||||
plt.plot(history.history['loss'], 'r--', label='Loss', lw=4, alpha=0.5)
|
||||
plt.legend(fontsize=12)
|
||||
plt.grid(True)
|
||||
|
||||
# log an image
|
||||
run.log_image('Loss v.s. Accuracy', plot=plt)
|
||||
|
||||
# create a ./outputs/model folder in the compute target
|
||||
# files saved in the "./outputs" folder are automatically uploaded into run history
|
||||
os.makedirs('./outputs/model', exist_ok=True)
|
||||
|
||||
# serialize NN architecture to JSON
|
||||
model_json = model.to_json()
|
||||
# save model JSON
|
||||
with open('./outputs/model/model.json', 'w') as f:
|
||||
f.write(model_json)
|
||||
# save model weights
|
||||
model.save_weights('./outputs/model/model.h5')
|
||||
print("model saved in ./outputs/model folder")
|
||||
@@ -1,488 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||
"\n",
|
||||
"Licensed under the MIT License [2017] Zalando SE, https://tech.zalando.com"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Build a simple ML pipeline for image classification\n",
|
||||
"\n",
|
||||
"## Introduction\n",
|
||||
"This tutorial shows how to train a simple deep neural network using the [Fashion MNIST](https://github.com/zalandoresearch/fashion-mnist) dataset and Keras on Azure Machine Learning. Fashion-MNIST is a dataset of Zalando's article images\u00e2\u20ac\u201dconsisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes.\n",
|
||||
"\n",
|
||||
"Learn how to:\n",
|
||||
"\n",
|
||||
"> * Set up your development environment\n",
|
||||
"> * Create the Fashion MNIST dataset\n",
|
||||
"> * Create a machine learning pipeline to train a simple deep learning neural network on a remote cluster\n",
|
||||
"> * Retrieve input datasets from the experiment and register the output model with datasets\n",
|
||||
"\n",
|
||||
"## Prerequisite:\n",
|
||||
"* Understand the [architecture and terms](https://docs.microsoft.com/azure/machine-learning/service/concept-azure-machine-learning-architecture) introduced by Azure Machine Learning\n",
|
||||
"* If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) to:\n",
|
||||
" * install the latest version of AzureML SDK\n",
|
||||
" * create a workspace and its configuration file (`config.json`)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Set up your development environment\n",
|
||||
"\n",
|
||||
"All the setup for your development work can be accomplished in a Python notebook. Setup includes:\n",
|
||||
"\n",
|
||||
"* Importing Python packages\n",
|
||||
"* Connecting to a workspace to enable communication between your local computer and remote resources\n",
|
||||
"* Creating an experiment to track all your runs\n",
|
||||
"* Creating a remote compute target to use for training\n",
|
||||
"\n",
|
||||
"### Import packages\n",
|
||||
"\n",
|
||||
"Import Python packages you need in this session. Also display the Azure Machine Learning SDK version."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import os\n",
|
||||
"import azureml.core\n",
|
||||
"from azureml.core import Workspace, Dataset, Datastore, ComputeTarget, RunConfiguration, Experiment\n",
|
||||
"from azureml.core.runconfig import CondaDependencies\n",
|
||||
"from azureml.pipeline.steps import PythonScriptStep, EstimatorStep\n",
|
||||
"from azureml.pipeline.core import Pipeline, PipelineData\n",
|
||||
"from azureml.train.dnn import TensorFlow\n",
|
||||
"\n",
|
||||
"# check core SDK version number\n",
|
||||
"print(\"Azure ML SDK Version: \", azureml.core.VERSION)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Connect to workspace\n",
|
||||
"\n",
|
||||
"Create a workspace object from the existing workspace. `Workspace.from_config()` reads the file **config.json** and loads the details into an object named `workspace`."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# load workspace\n",
|
||||
"workspace = Workspace.from_config()\n",
|
||||
"print('Workspace name: ' + workspace.name, \n",
|
||||
" 'Azure region: ' + workspace.location, \n",
|
||||
" 'Subscription id: ' + workspace.subscription_id, \n",
|
||||
" 'Resource group: ' + workspace.resource_group, sep='\\n')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create experiment and a directory\n",
|
||||
"\n",
|
||||
"Create an experiment to track the runs in your workspace and a directory to deliver the necessary code from your computer to the remote resource."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# create an ML experiment\n",
|
||||
"exp = Experiment(workspace=workspace, name='keras-mnist-fashion')\n",
|
||||
"\n",
|
||||
"# create a directory\n",
|
||||
"script_folder = './keras-mnist-fashion'\n",
|
||||
"os.makedirs(script_folder, exist_ok=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create or Attach existing compute resource\n",
|
||||
"By using Azure Machine Learning Compute, a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. Examples include VMs with GPU support. In this tutorial, you create Azure Machine Learning Compute as your training environment. The code below creates the compute clusters for you if they don't already exist in your workspace.\n",
|
||||
"\n",
|
||||
"**Creation of compute takes approximately 5 minutes.** If the AmlCompute with that name is already in your workspace the code will skip the creation process."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
|
||||
"from azureml.core.compute_target import ComputeTargetException\n",
|
||||
"\n",
|
||||
"# choose a name for your cluster\n",
|
||||
"cluster_name = \"your-cluster-name\"\n",
|
||||
"\n",
|
||||
"try:\n",
|
||||
" compute_target = ComputeTarget(workspace=workspace, name=cluster_name)\n",
|
||||
" print('Found existing compute target')\n",
|
||||
"except ComputeTargetException:\n",
|
||||
" print('Creating a new compute target...')\n",
|
||||
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_NC6', \n",
|
||||
" max_nodes=4)\n",
|
||||
"\n",
|
||||
" # create the cluster\n",
|
||||
" compute_target = ComputeTarget.create(workspace, cluster_name, compute_config)\n",
|
||||
"\n",
|
||||
" # can poll for a minimum number of nodes and for a specific timeout. \n",
|
||||
" # if no min node count is provided it uses the scale settings for the cluster\n",
|
||||
" compute_target.wait_for_completion(show_output=True, min_node_count=None, timeout_in_minutes=20)\n",
|
||||
"\n",
|
||||
"# use get_status() to get a detailed status for the current cluster. \n",
|
||||
"print(compute_target.get_status().serialize())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Create the Fashion MNIST dataset\n",
|
||||
"\n",
|
||||
"By creating a dataset, you create a reference to the data source location. If you applied any subsetting transformations to the dataset, they will be stored in the dataset as well. The data remains in its existing location, so no extra storage cost is incurred. \n",
|
||||
"\n",
|
||||
"Every workspace comes with a default [datastore](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-access-data) (and you can register more) which is backed by the Azure blob storage account associated with the workspace. We can use it to transfer data from local to the cloud, and create a dataset from it. We will now upload the [Fashion MNIST](./keras-mnist-fashion) to the default datastore (blob) within your workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"datastore = workspace.get_default_datastore()\n",
|
||||
"datastore.upload_files(files = ['keras-mnist-fashion/t10k-images-idx3-ubyte', 'keras-mnist-fashion/t10k-labels-idx1-ubyte',\n",
|
||||
" 'keras-mnist-fashion/train-images-idx3-ubyte','keras-mnist-fashion/train-labels-idx1-ubyte'],\n",
|
||||
" target_path = 'mnist-fashion',\n",
|
||||
" overwrite = True,\n",
|
||||
" show_progress = True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Then we will create an unregistered FileDataset pointing to the path in the datastore. You can also create a dataset from multiple paths. [Learn More](https://aka.ms/azureml/howto/createdatasets) "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"fashion_ds = Dataset.File.from_files([(datastore, 'mnist-fashion')])\n",
|
||||
"\n",
|
||||
"# list the files referenced by fashion_ds\n",
|
||||
"fashion_ds.to_path()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Build 2-step ML pipeline\n",
|
||||
"\n",
|
||||
"The [Azure Machine Learning Pipeline](https://docs.microsoft.com/en-us/azure/machine-learning/service/concept-ml-pipelines) enables data scientists to create and manage multiple simple and complex workflows concurrently. A typical pipeline would have multiple tasks to prepare data, train, deploy and evaluate models. Individual steps in the pipeline can make use of diverse compute options (for example: CPU for data preparation and GPU for training) and languages. [Learn More](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/machine-learning-pipelines)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"### Step 1: data preparation\n",
|
||||
"\n",
|
||||
"In step one, we will load the image and labels from Fashion MNIST dataset into mnist_train.csv and mnist_test.csv\n",
|
||||
"\n",
|
||||
"Each image is 28 pixels in height and 28 pixels in width, for a total of 784 pixels in total. Each pixel has a single pixel-value associated with it, indicating the lightness or darkness of that pixel, with higher numbers meaning darker. This pixel-value is an integer between 0 and 255. Both mnist_train.csv and mnist_test.csv contain 785 columns. The first column consists of the class labels, which represent the article of clothing. The rest of the columns contain the pixel-values of the associated image."
|
||||
]
|
||||
},
|
||||
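{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a rough, illustrative sketch of that conversion (the real logic lives in `prepare.py` in the source directory; the `idx_to_csv` helper below is hypothetical, not the shipped code), the idx-formatted files are parsed and written out as the 785-column CSVs described above:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# illustrative sketch only -- not executed as part of this notebook\n",
"import struct\n",
"import numpy as np\n",
"\n",
"def idx_to_csv(image_file, label_file, csv_file):\n",
"    # idx image files start with a big-endian header: magic, count, rows, cols\n",
"    with open(image_file, 'rb') as f:\n",
"        _, n, rows, cols = struct.unpack('>IIII', f.read(16))\n",
"        images = np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows * cols)\n",
"    # idx label files start with a big-endian header: magic, count\n",
"    with open(label_file, 'rb') as f:\n",
"        struct.unpack('>II', f.read(8))\n",
"        labels = np.frombuffer(f.read(), dtype=np.uint8)\n",
"    # first column is the label, remaining 784 columns are the pixel values\n",
"    np.savetxt(csv_file, np.column_stack([labels, images]), fmt='%d', delimiter=',')\n",
"\n",
"# e.g. idx_to_csv('train-images-idx3-ubyte', 'train-labels-idx1-ubyte', 'mnist_train.csv')"
]
},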
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# set up the compute environment to install required packages\n",
"conda = CondaDependencies.create(\n",
"    pip_packages=['azureml-sdk', 'azureml-dataprep[fuse,pandas]'],\n",
"    pin_sdk_version=False)\n",
"\n",
"conda.set_pip_option('--pre')\n",
"\n",
"run_config = RunConfiguration()\n",
"run_config.environment.python.conda_dependencies = conda"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Intermediate data (or the output of a step) is represented by a `PipelineData` object. prepared_fashion_ds is produced as the output of step 1 and used as the input of step 2. PipelineData introduces a data dependency between steps and creates an implicit execution order in the pipeline. You can register a `PipelineData` as a dataset and version the output data automatically. [Learn More](https://docs.microsoft.com/azure/machine-learning/service/how-to-version-track-datasets#version-a-pipeline-output-dataset)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# define output data\n",
"prepared_fashion_ds = PipelineData('prepared_fashion_ds', datastore=datastore).as_dataset()\n",
"\n",
"# register output data as dataset\n",
"prepared_fashion_ds = prepared_fashion_ds.register(name='prepared_fashion_ds', create_new_version=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A **PythonScriptStep** is a basic, built-in step to run a Python script on a compute target. It takes a script name and optionally other parameters like arguments for the script, compute target, inputs and outputs. If no compute target is specified, the default compute target for the workspace is used. You can also use a [**RunConfiguration**](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.runconfiguration?view=azure-ml-py) to specify requirements for the PythonScriptStep, such as conda dependencies and the Docker image."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"prep_step = PythonScriptStep(name='prepare step',\n",
"                             script_name=\"prepare.py\",\n",
"                             # mount fashion_ds dataset to the compute_target\n",
"                             inputs=[fashion_ds.as_named_input('fashion_ds').as_mount()],\n",
"                             outputs=[prepared_fashion_ds],\n",
"                             source_directory=script_folder,\n",
"                             compute_target=compute_target,\n",
"                             runconfig=run_config)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Step 2: train CNN with Keras\n",
"\n",
"Next, we construct an `azureml.train.dnn.TensorFlow` estimator object. The TensorFlow estimator provides a simple way of launching a TensorFlow training job on a compute target. It will automatically provide a Docker image that has TensorFlow installed.\n",
"\n",
"[EstimatorStep](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-steps/azureml.pipeline.steps.estimator_step.estimatorstep?view=azure-ml-py) adds a step to run the TensorFlow estimator in a pipeline. It takes a dataset as the input."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# set up training step with a TensorFlow estimator\n",
"est = TensorFlow(entry_script='train.py',\n",
"                 source_directory=script_folder,\n",
"                 pip_packages=['azureml-sdk', 'keras', 'numpy', 'scikit-learn', 'matplotlib'],\n",
"                 compute_target=compute_target)\n",
"\n",
"est_step = EstimatorStep(name='train step',\n",
"                         estimator=est,\n",
"                         estimator_entry_script_arguments=[],\n",
"                         # parse prepared_fashion_ds into a TabularDataset and use it as the input\n",
"                         inputs=[prepared_fashion_ds.parse_delimited_files()],\n",
"                         compute_target=compute_target)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Build the pipeline\n",
"Once we have the steps (or steps collection), we can build the [pipeline](https://docs.microsoft.com/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline.pipeline?view=azure-ml-py).\n",
"\n",
"A pipeline is created with a list of steps and a workspace. Submit a pipeline using [submit](https://docs.microsoft.com/python/api/azureml-core/azureml.core.experiment(class)?view=azure-ml-py#submit-config--tags-none----kwargs-). When submit is called, a [PipelineRun](https://docs.microsoft.com/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelinerun?view=azure-ml-py) is created which in turn creates [StepRun](https://docs.microsoft.com/python/api/azureml-pipeline-core/azureml.pipeline.core.steprun?view=azure-ml-py) objects for each step in the workflow."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# build pipeline & run experiment\n",
"pipeline = Pipeline(workspace, steps=[prep_step, est_step])\n",
"run = exp.submit(pipeline)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Monitor the PipelineRun"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"inputHidden": false,
"outputHidden": false
},
"outputs": [],
"source": [
"run.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"run.find_step_run('train step')[0].get_metrics()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register the input dataset and the output model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Azure Machine Learning datasets make it easy to trace how your data is used in ML. [Learn More](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-version-track-datasets#track-datasets-in-experiments)<br>\n",
"For each Machine Learning experiment, you can easily trace the datasets used as the input through the `Run` object."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# get input datasets\n",
"prep_step = run.find_step_run('prepare step')[0]\n",
"inputs = prep_step.get_details()['inputDatasets']\n",
"input_dataset = inputs[0]['dataset']\n",
"\n",
"# list the files referenced by input_dataset\n",
"input_dataset.to_path()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Register the input Fashion MNIST dataset with the workspace so that you can reuse it in other experiments or share it with your colleagues who have access to your workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fashion_ds = input_dataset.register(workspace = workspace,\n",
"                                    name = 'fashion_ds',\n",
"                                    description = 'image and label files from fashion mnist',\n",
"                                    create_new_version = True)\n",
"fashion_ds"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Register the output model with the dataset."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"run.find_step_run('train step')[0].register_model(model_name = 'keras-model', model_path = 'outputs/model/',\n",
"                                                  datasets =[('train test data', fashion_ds)])"
]
}
],
"metadata": {
"authors": [
{
"name": "sihhu"
}
],
"category": "tutorial",
"compute": [
"Remote"
],
"datasets": [
"Fashion MNIST"
],
"deployment": [
"None"
],
"exclude_from_index": false,
"framework": [
"Azure ML"
],
"friendly_name": "Datasets with ML Pipeline",
"index_order": 1,
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.9"
},
"nteract": {
"version": "nteract-front-end@1.0.0"
},
"star_tag": [
"featured"
],
"tags": [
"Dataset",
"Pipeline",
"Estimator",
"ScriptRun"
],
"task": "Train"
},
"nbformat": 4,
"nbformat_minor": 2
}
@@ -1,568 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Tabular Time Series Related API Demo with NOAA Weather Data"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. <br>\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this notebook, you will learn how to use the Tabular Time Series related API to filter the data by time windows for sample data uploaded to Azure blob storage.\n",
"\n",
"The detailed APIs to be demonstrated in this notebook are:\n",
"- Create a Tabular Dataset instance\n",
"- Assign a fine timestamp column and a coarse timestamp column for the Tabular Dataset to activate the Time Series related APIs\n",
"- Clear the fine timestamp column and coarse timestamp column\n",
"- Filter in data before a specific time\n",
"- Filter in data after a specific time\n",
"- Filter in data in a specific time range\n",
"- Filter in data for a recent time range\n",
"\n",
"Besides the above APIs, you'll also see how to:\n",
"- Create and load a Workspace\n",
"- Load National Oceanic and Atmospheric Administration (NOAA) weather data into Azure blob storage\n",
"- Create and register the NOAA weather data as a Tabular dataset\n",
"- Re-load the Tabular Dataset from your Workspace"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Import Dependencies\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, run the cells below to install the Azure Machine Learning Python SDK and create an Azure ML Workspace that's required for this demo."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prepare Environment"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Print out your version of the Azure ML Python SDK. Version 1.0.60 or above is required for TabularDataset with the timeseries attribute."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"print('Azure ML SDK version:', azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Import Packages"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# import packages\n",
"import os\n",
"\n",
"import pandas as pd\n",
"\n",
"from calendar import monthrange\n",
"from datetime import datetime, timedelta\n",
"\n",
"from azureml.core import Dataset, Datastore, Workspace, Run\n",
"from azureml.opendatasets import NoaaIsdWeather"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set up Configuration and Create Azure ML Workspace\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/configuration.ipynb) first if you haven't already to establish your connection to the Azure ML Workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ws = Workspace.from_config()\n",
"dstore = ws.get_default_datastore()\n",
"\n",
"dset_name = 'weather-data-florida'\n",
"\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, dstore.name, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Load Data to Blob Storage\n",
"\n",
"This demo uses public NOAA weather data. You can replace this data with your own. The cell below creates a Pandas DataFrame with 2019 NOAA weather data, one month at a time, and saves each month as a Parquet file under a local `data` folder. A later cell uploads these files to Azure blob storage under the location specified by the dataset name. Currently, the Dataset class only reads uploaded files from blob storage.\n",
"\n",
"**NOTE:** to reduce the size of the data, we will only keep rows whose stationName matches a given pattern."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"target_years = [2019]\n",
"\n",
"for year in target_years:\n",
"    for month in range(1, 12 + 1):\n",
"        path = 'data/{}/{:02d}/'.format(year, month)\n",
"\n",
"        try:\n",
"            start = datetime(year, month, 1)\n",
"            end = datetime(year, month, monthrange(year, month)[1]) + timedelta(days=1)\n",
"            isd = NoaaIsdWeather(start, end).to_pandas_dataframe()\n",
"            isd = isd[isd['stationName'].str.contains('FLORIDA', regex=True, na=False)]\n",
"\n",
"            os.makedirs(path, exist_ok=True)\n",
"            isd.to_parquet(path + 'data.parquet')\n",
"        except Exception as e:\n",
"            print('Month {} in year {} likely has no data.\\n'.format(month, year))\n",
"            print('Exception: {}'.format(e))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Upload the data to blob storage so it can be used as a Dataset."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"dstore.upload('data', dset_name, overwrite=True, show_progress=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create & Register Tabular Dataset with time-series trait from Blob\n",
"\n",
"The API on Tabular datasets with the time-series trait is specially designed to handle tabular time-series data and time-related operations more efficiently. By registering your time-series dataset, you publish it to your workspace so that it is accessible to anyone with access to the workspace."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Create a Tabular Dataset instance from the blob storage datapath.\n",
"\n",
"**TIP:** you can set virtual columns in the partition_format. For example, if you partition the weather data by state and city, the path can be '/{STATE}/{CITY}/{coarse_time:yyyy/MM}/data.parquet'. STATE and CITY would then appear as virtual columns in the dataset, allowing for efficient filtering by these grains."
]
},
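{
"cell_type": "markdown",
"metadata": {},
"source": [
"For instance, a hypothetical layout partitioned by state and city (not the layout used in this demo; the 'weather-partitioned' path below is purely illustrative) could be read like this, making STATE and CITY available as virtual columns:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# illustrative sketch only -- assumes a 'weather-partitioned/{STATE}/{CITY}/yyyy/MM/data.parquet' folder layout\n",
"# ds = Dataset.Tabular.from_parquet_files(\n",
"#     path=[(dstore, 'weather-partitioned/*/*/*/*/data.parquet')],\n",
"#     partition_format='weather-partitioned/{STATE}/{CITY}/{coarse_time:yyyy/MM}/data.parquet')\n",
"# ds.to_pandas_dataframe()[['STATE', 'CITY']].drop_duplicates()"
]
},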
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"datastore_path = [(dstore, dset_name + '/*/*/data.parquet')]\n",
"dataset = Dataset.Tabular.from_parquet_files(path=datastore_path, partition_format = dset_name + '/{coarse_time:yyyy/MM}/data.parquet')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Assign the fine-grain timestamp column for the Tabular Dataset to activate the Time Series related APIs. The assigned column must be of a date type, otherwise the assignment will fail."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# for this demo, leave out coarse_time so fine_grain_timestamp is used\n",
"tsd = dataset.with_timestamp_columns(fine_grain_timestamp='datetime') # , coarse_grain_timestamp='coarse_time')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Register the dataset for easy access from anywhere in Azure ML and to keep track of versions and lineage."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# register dataset to Workspace\n",
"registered_ds = tsd.register(ws, dset_name, create_new_version=True, description='Data for Tabular Dataset with time-series trait demo.', tags={ 'type': 'TabularDataset' })"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Reload the Dataset from Workspace"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# get dataset by dataset name\n",
"tsd = Dataset.get_by_name(ws, name=dset_name)\n",
"tsd.to_pandas_dataframe().head(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Filter Data by Time Windows\n",
"\n",
"Once your data has been loaded into the notebook, you can query by time using the time_before(), time_after(), time_between(), and time_recent() functions. You can also choose to drop or keep certain columns."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Before Time Input"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# select data that occurs before a specified date\n",
"tsd2 = tsd.time_before(datetime(2019, 6, 12))\n",
"tsd2.to_pandas_dataframe().tail(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### After Time Input"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# select data that occurs after a specified date\n",
"tsd2 = tsd.time_after(datetime(2019, 5, 30))\n",
"tsd2.to_pandas_dataframe().head(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Before & After Time Inputs\n",
"\n",
"You can chain time functions together."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**NOTE:** You must set the coarse_grain_timestamp to None to filter on the fine_grain_timestamp. The cell below will fail unless the second line is uncommented."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# select data that occurs within a given time range\n",
"#tsd = tsd.with_timestamp_columns(fine_grain_timestamp='datetime', coarse_grain_timestamp=None)\n",
"tsd2 = tsd.time_after(datetime(2019, 1, 2)).time_before(datetime(2019, 1, 10))\n",
"tsd2.to_pandas_dataframe().head(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Time Range Input"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# another way to select data that occurs within a given time range\n",
"tsd2 = tsd.time_between(start_time=datetime(2019, 1, 31, 23, 59, 59), end_time=datetime(2019, 2, 7))\n",
"tsd2.to_pandas_dataframe().head(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Time Recent Input"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This function takes a datetime.timedelta and returns a dataset containing the data from datetime.now() - timedelta to datetime.now()."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"tsd2 = tsd.time_recent(timedelta(weeks=5, days=0))\n",
"tsd2.to_pandas_dataframe().head(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**NOTE:** This will return an empty dataframe if there is no data within the last 2 days."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"tsd2 = tsd.time_recent(timedelta(days=2))\n",
"tsd2.to_pandas_dataframe().tail(5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Drop Columns"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<font color=red>The columns to be dropped should NOT include the timestamp columns.</font><br>The operation below will raise an exception."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
"    tsd2 = tsd.drop_columns(columns=['snowDepth', 'version', 'datetime'])\n",
"except Exception as e:\n",
"    print('Expected exception : {}'.format(str(e)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Drop will succeed if we modify the column list to exclude the timestamp columns."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"tsd2 = tsd.drop_columns(columns=['snowDepth', 'version', 'upload_date'])\n",
"tsd2.take(5).to_pandas_dataframe().sort_values(by='datetime')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Keep Columns"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<font color=red>The columns to be kept should ALWAYS include the timestamp columns.</font><br>The operation below will raise an exception."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
"    tsd2 = tsd.keep_columns(columns=['snowDepth'], validate=False)\n",
"except Exception as e:\n",
"    print('Expected exception : {}'.format(str(e)))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Keep will succeed if we modify the column list to include the timestamp columns."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"tsd2 = tsd.keep_columns(columns=['snowDepth', 'datetime', 'coarse_time'], validate=False)\n",
"tsd2.to_pandas_dataframe().tail()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Resetting Timestamp Columns"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Rules for resetting are:\n",
"- You cannot assign 'None' to fine_grain_timestamp while assigning a valid column name to coarse_grain_timestamp, because coarse_grain_timestamp is optional while fine_grain_timestamp is mandatory for Tabular time series data.\n",
"- If you assign 'None' to fine_grain_timestamp, then both fine_grain_timestamp and coarse_grain_timestamp will be cleared.\n",
"- If you assign 'None' only to coarse_grain_timestamp, then only coarse_grain_timestamp will be cleared."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illegal clearing, exception is expected.\n",
"try:\n",
"    tsd2 = tsd.with_timestamp_columns(fine_grain_timestamp=None, coarse_grain_timestamp='coarse_time')\n",
"except Exception as e:\n",
"    print('Clearing not allowed because {}'.format(str(e)))\n",
"\n",
"# clear both\n",
"tsd2 = tsd.with_timestamp_columns(fine_grain_timestamp=None, coarse_grain_timestamp=None)\n",
"print('after clearing both with None/None, timestamp columns are: {}'.format(tsd2.timestamp_columns))\n",
"\n",
"# clear coarse_grain_timestamp only and assign 'datetime' as the fine timestamp column\n",
"tsd2 = tsd2.with_timestamp_columns(fine_grain_timestamp='datetime', coarse_grain_timestamp=None)\n",
"print('after clearing the coarse timestamp column, timestamp columns are: {}'.format(tsd2.timestamp_columns))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
}
],
"metadata": {
"authors": [
{
"name": "ylxiong"
}
],
"category": "tutorial",
"compute": [
"Local"
],
"datasets": [
"NOAA"
],
"deployment": [
"None"
],
"exclude_from_index": false,
"framework": [
"Azure ML"
],
"friendly_name": "Filtering data using Tabular Timeseries Dataset related API",
"index_order": 1,
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
},
"notice": "Copyright (c) Microsoft Corporation. All rights reserved. Licensed under the MIT License.",
"star_tag": [
"featured"
],
"tags": [
"Dataset",
"Tabular Timeseries"
],
"task": "Filtering"
},
"nbformat": 4,
"nbformat_minor": 2
}
@@ -1,151 +0,0 @@
sepal_length,sepal_width,petal_length,petal_width,species
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3,1.4,0.1,Iris-setosa
4.3,3,1.1,0.1,Iris-setosa
5.8,4,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5,3,1.6,0.2,Iris-setosa
5,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5,3.3,1.4,0.2,Iris-setosa
7,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5,2,3.5,1,Iris-versicolor
5.9,3,4.2,1.5,Iris-versicolor
6,2.2,4,1,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3,5,1.7,Iris-versicolor
6,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6,2.7,5.1,1.6,Iris-versicolor
5.4,3,4.5,1.5,Iris-versicolor
6,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3,4.1,1.3,Iris-versicolor
5.5,2.5,4,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3,4.6,1.4,Iris-versicolor
5.8,2.6,4,1.2,Iris-versicolor
5,2.3,3.3,1,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3,5.8,2.2,Iris-virginica
7.6,3,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3,5.5,2.1,Iris-virginica
5.7,2.5,5,2,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6,2.2,5,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2,Iris-virginica
7.7,2.8,6.7,2,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2,Iris-virginica
6.4,2.8,5.6,2.2,Iris-virginica
6.3,2.8,5.1,1.5,Iris-virginica
6.1,2.6,5.6,1.4,Iris-virginica
7.7,3,6.1,2.3,Iris-virginica
6.3,3.4,5.6,2.4,Iris-virginica
6.4,3.1,5.5,1.8,Iris-virginica
6,3,4.8,1.8,Iris-virginica
6.9,3.1,5.4,2.1,Iris-virginica
6.7,3.1,5.6,2.4,Iris-virginica
6.9,3.1,5.1,2.3,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
6.8,3.2,5.9,2.3,Iris-virginica
6.7,3.3,5.7,2.5,Iris-virginica
6.7,3,5.2,2.3,Iris-virginica
6.3,2.5,5,1.9,Iris-virginica
6.5,3,5.2,2,Iris-virginica
6.2,3.4,5.4,2.3,Iris-virginica
5.9,3,5.1,1.8,Iris-virginica
@@ -1,656 +0,0 @@
|
||||
{
|
||||
"cells": [
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||
"\n",
|
||||
"Licensed under the MIT License."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Train with Azure Machine Learning datasets\n",
|
||||
"Datasets are categorized into TabularDataset and FileDataset based on how users consume them in training. \n",
|
||||
"* A TabularDataset represents data in a tabular format by parsing the provided file or list of files. TabularDataset can be created from csv, tsv, parquet files, SQL query results etc. For the complete list, please visit our [documentation](https://aka.ms/tabulardataset-api-reference). It provides you with the ability to materialize the data into a pandas DataFrame.\n",
|
||||
"* A FileDataset references single or multiple files in your datastores or public urls. This provides you with the ability to download or mount the files to your compute. The files can be of any format, which enables a wider range of machine learning scenarios including deep learning.\n",
|
||||
"\n",
|
||||
"In this tutorial, you will learn how to train with Azure Machine Learning datasets:\n",
|
||||
"\n",
|
||||
"☑ Use datasets directly in your training script\n",
|
||||
"\n",
|
||||
"☑ Use datasets to mount files to a remote compute"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Prerequisites\n",
|
||||
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) first if you haven't already established your connection to the AzureML Workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# Check core SDK version number\n",
|
||||
"import azureml.core\n",
|
||||
"\n",
|
||||
"print('SDK version:', azureml.core.VERSION)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Initialize Workspace\n",
|
||||
"\n",
|
||||
"Initialize a workspace object from persisted configuration."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Workspace\n",
|
||||
"\n",
|
||||
"ws = Workspace.from_config()\n",
|
||||
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Create Experiment\n",
|
||||
"\n",
|
||||
"**Experiment** is a logical container in an Azure ML Workspace. It hosts run records which can include run metrics and output artifacts from your experiments."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"experiment_name = 'train-with-datasets'\n",
|
||||
"\n",
|
||||
"from azureml.core import Experiment\n",
|
||||
"exp = Experiment(workspace=ws, name=experiment_name)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Create or Attach existing compute resource\n",
|
||||
"By using Azure Machine Learning Compute, a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. Examples include VMs with GPU support. In this tutorial, you create Azure Machine Learning Compute as your training environment. The code below creates the compute clusters for you if they don't already exist in your workspace.\n",
|
||||
"\n",
|
||||
"**Creation of compute takes approximately 5 minutes.** If the AmlCompute with that name is already in your workspace the code will skip the creation process."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core.compute import AmlCompute\n",
|
||||
"from azureml.core.compute import ComputeTarget\n",
|
||||
"import os\n",
|
||||
"\n",
|
||||
"# choose a name for your cluster\n",
|
||||
"compute_name = os.environ.get('AML_COMPUTE_CLUSTER_NAME', 'cpu-cluster')\n",
|
||||
"compute_min_nodes = os.environ.get('AML_COMPUTE_CLUSTER_MIN_NODES', 0)\n",
|
||||
"compute_max_nodes = os.environ.get('AML_COMPUTE_CLUSTER_MAX_NODES', 4)\n",
|
||||
"\n",
|
||||
"# This example uses CPU VM. For using GPU VM, set SKU to STANDARD_NC6\n",
|
||||
"vm_size = os.environ.get('AML_COMPUTE_CLUSTER_SKU', 'STANDARD_D2_V2')\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"if compute_name in ws.compute_targets:\n",
|
||||
" compute_target = ws.compute_targets[compute_name]\n",
|
||||
" if compute_target and type(compute_target) is AmlCompute:\n",
|
||||
" print('found compute target. just use it. ' + compute_name)\n",
|
||||
"else:\n",
|
||||
" print('creating a new compute target...')\n",
|
||||
" provisioning_config = AmlCompute.provisioning_configuration(vm_size=vm_size,\n",
|
||||
" min_nodes=compute_min_nodes, \n",
|
||||
" max_nodes=compute_max_nodes)\n",
|
||||
"\n",
|
||||
" # create the cluster\n",
|
||||
" compute_target = ComputeTarget.create(ws, compute_name, provisioning_config)\n",
|
||||
" \n",
|
||||
" # can poll for a minimum number of nodes and for a specific timeout. \n",
|
||||
" # if no min node count is provided it will use the scale settings for the cluster\n",
|
||||
" compute_target.wait_for_completion(show_output=True, min_node_count=None, timeout_in_minutes=20)\n",
|
||||
" \n",
|
||||
" # For a more detailed view of current AmlCompute status, use get_status()\n",
|
||||
" print(compute_target.get_status().serialize())"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You now have the necessary packages and compute resources to train a model in the cloud.\n",
|
||||
"## Use datasets directly in training\n",
|
||||
"\n",
|
||||
"### Create a TabularDataset\n",
|
||||
"By creating a dataset, you create a reference to the data source location. If you applied any subsetting transformations to the dataset, they will be stored in the dataset as well. The data remains in its existing location, so no extra storage cost is incurred. \n",
|
||||
"\n",
|
||||
"Every workspace comes with a default [datastore](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-access-data) (and you can register more) which is backed by the Azure blob storage account associated with the workspace. We can use it to transfer data from local to the cloud, and create dataset from it. We will now upload the [Iris data](./train-dataset/Iris.csv) to the default datastore (blob) within your workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"datastore = ws.get_default_datastore()\n",
|
||||
"datastore.upload_files(files = ['./train-dataset/iris.csv'],\n",
|
||||
" target_path = 'train-dataset/tabular/',\n",
|
||||
" overwrite = True,\n",
|
||||
" show_progress = True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Then we will create an unregistered TabularDataset pointing to the path in the datastore. You can also create a dataset from multiple paths. [learn more](https://aka.ms/azureml/howto/createdatasets) \n",
|
||||
"\n",
|
||||
"[TabularDataset](https://docs.microsoft.com/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py) represents data in a tabular format by parsing the provided file or list of files. This provides you with the ability to materialize the data into a Pandas or Spark DataFrame. You can create a TabularDataset object from .csv, .tsv, and parquet files, and from SQL query results. For a complete list, see [TabularDatasetFactory](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py) class."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {
|
||||
"tags": [
|
||||
"dataset-remarks-tabular-sample"
|
||||
]
|
||||
},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Dataset\n",
|
||||
"dataset = Dataset.Tabular.from_delimited_files(path = [(datastore, 'train-dataset/tabular/iris.csv')])\n",
|
||||
"\n",
|
||||
"# preview the first 3 rows of the dataset\n",
|
||||
"dataset.take(3).to_pandas_dataframe()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create a training script\n",
|
||||
"\n",
|
||||
"To submit the job to the cluster, first create a training script. Run the following code to create the training script called `train_titanic.py` in the script_folder. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"script_folder = os.path.join(os.getcwd(), 'train-dataset')"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%%writefile $script_folder/train_iris.py\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"\n",
|
||||
"from azureml.core import Dataset, Run\n",
|
||||
"from sklearn.model_selection import train_test_split\n",
|
||||
"from sklearn.tree import DecisionTreeClassifier\n",
|
||||
"from sklearn.externals import joblib\n",
|
||||
"\n",
|
||||
"run = Run.get_context()\n",
|
||||
"# get input dataset by name\n",
|
||||
"dataset = run.input_datasets['iris']\n",
|
||||
"\n",
|
||||
"df = dataset.to_pandas_dataframe()\n",
|
||||
"\n",
|
||||
"x_col = ['sepal_length', 'sepal_width', 'petal_length', 'petal_width']\n",
|
||||
"y_col = ['species']\n",
|
||||
"x_df = df.loc[:, x_col]\n",
|
||||
"y_df = df.loc[:, y_col]\n",
|
||||
"\n",
|
||||
"#dividing X,y into train and test data\n",
|
||||
"x_train, x_test, y_train, y_test = train_test_split(x_df, y_df, test_size=0.2, random_state=223)\n",
|
||||
"\n",
|
||||
"data = {'train': {'X': x_train, 'y': y_train},\n",
|
||||
"\n",
|
||||
" 'test': {'X': x_test, 'y': y_test}}\n",
|
||||
"\n",
|
||||
"clf = DecisionTreeClassifier().fit(data['train']['X'], data['train']['y'])\n",
|
||||
"model_file_name = 'decision_tree.pkl'\n",
|
||||
"\n",
|
||||
"print('Accuracy of Decision Tree classifier on training set: {:.2f}'.format(clf.score(x_train, y_train)))\n",
|
||||
"print('Accuracy of Decision Tree classifier on test set: {:.2f}'.format(clf.score(x_test, y_test)))\n",
|
||||
"\n",
|
||||
"os.makedirs('./outputs', exist_ok=True)\n",
|
||||
"with open(model_file_name, 'wb') as file:\n",
|
||||
" joblib.dump(value=clf, filename='outputs/' + model_file_name)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Configure and use datasets as the input to Estimator"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"You can ask the system to build a conda environment based on your dependency specification. Once the environment is built, and if you don't change your dependencies, it will be reused in subsequent runs."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Environment\n",
|
||||
"from azureml.core.conda_dependencies import CondaDependencies\n",
|
||||
"\n",
|
||||
"conda_env = Environment('conda-env')\n",
|
||||
"conda_env.python.conda_dependencies = CondaDependencies.create(pip_packages=['azureml-sdk',\n",
|
||||
" 'azureml-dataprep[pandas,fuse]',\n",
|
||||
" 'scikit-learn'])"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"An estimator object is used to submit the run. Azure Machine Learning has pre-configured estimators for common machine learning frameworks, as well as generic Estimator. Create a generic estimator for by specifying\n",
|
||||
"\n",
|
||||
"* The name of the estimator object, `est`\n",
|
||||
"* The directory that contains your scripts. All the files in this directory are uploaded into the cluster nodes for execution. \n",
|
||||
"* The training script name, train_titanic.py\n",
|
||||
"* The input dataset for training\n",
|
||||
"* The compute target. In this case you will use the AmlCompute you created\n",
|
||||
"* The environment definition for the experiment"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.train.estimator import Estimator\n",
|
||||
"\n",
|
||||
"est = Estimator(source_directory=script_folder, \n",
|
||||
" entry_script='train_iris.py', \n",
|
||||
" # pass dataset object as an input with name 'titanic'\n",
|
||||
" inputs=[dataset.as_named_input('iris')],\n",
|
||||
" compute_target=compute_target,\n",
|
||||
" environment_definition= conda_env) "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Submit job to run\n",
|
||||
"Submit the estimator to the Azure ML experiment to kick off the execution."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run = exp.submit(est)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.widgets import RunDetails\n",
|
||||
"\n",
|
||||
"# monitor the run\n",
|
||||
"RunDetails(run).show()"
|
||||
]
|
||||
},
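If you prefer a blocking call to the widget, you can also wait on the run directly; a sketch using the standard `Run` API:

```python
# block until the run finishes, streaming driver logs to stdout
run.wait_for_completion(show_output=True)
```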
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Use datasets to mount files to a remote compute\n",
|
||||
"\n",
|
||||
"You can use the `Dataset` object to mount or download files referred by it. When you mount a file system, you attach that file system to a directory (mount point) and make it available to the system. Because mounting load files at the time of processing, it is usually faster than download.<br> \n",
|
||||
"Note: mounting is only available for Linux-based compute (DSVM/VM, AMLCompute, HDInsights)."
|
||||
]
|
||||
},
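For reference, a `FileDataset` (created later in this section) can be consumed either way; a minimal sketch, assuming a file dataset object named `dataset`:

```python
# download copies the files to local disk before processing
local_paths = dataset.download(target_path='.', overwrite=True)

# mount streams files on demand; the context manager handles mount/unmount
with dataset.mount() as mount_context:
    print(mount_context.mount_point)  # files are visible under this path
```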
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Upload data files into datastore\n",
|
||||
"We will first load diabetes data from `scikit-learn` to the train-dataset folder."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from sklearn.datasets import load_diabetes\n",
|
||||
"import numpy as np\n",
|
||||
"\n",
|
||||
"training_data = load_diabetes()\n",
|
||||
"np.save(file='train-dataset/features.npy', arr=training_data['data'])\n",
|
||||
"np.save(file='train-dataset/labels.npy', arr=training_data['target'])"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Now let's upload the 2 files into the default datastore under a path named `diabetes`:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"datastore.upload_files(['train-dataset/features.npy', 'train-dataset/labels.npy'], target_path='diabetes', overwrite=True)"
|
||||
]
|
||||
},
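Here `datastore` is assumed to be the workspace's default datastore, retrieved earlier in the notebook; for reference:

```python
# every Azure ML workspace comes with a default blob datastore
datastore = ws.get_default_datastore()
```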
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create a FileDataset\n",
|
||||
"\n",
|
||||
"[FileDataset](https://docs.microsoft.com/python/api/azureml-core/azureml.data.file_dataset.filedataset?view=azure-ml-py) references single or multiple files in your datastores or public URLs. Using this method, you can download or mount the files to your compute as a FileDataset object. The files can be in any format, which enables a wider range of machine learning scenarios, including deep learning."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Dataset\n",
|
||||
"\n",
|
||||
"dataset = Dataset.File.from_files(path = [(datastore, 'diabetes/')])\n",
|
||||
"\n",
|
||||
"# see a list of files referenced by dataset\n",
|
||||
"dataset.to_path()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Create a training script\n",
|
||||
"\n",
|
||||
"To submit the job to the cluster, first create a training script. Run the following code to create the training script called `train_diabetes.py` in the script_folder. "
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%%writefile $script_folder/train_diabetes.py\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"import glob\n",
|
||||
"\n",
|
||||
"from sklearn.linear_model import Ridge\n",
|
||||
"from sklearn.metrics import mean_squared_error\n",
|
||||
"from sklearn.model_selection import train_test_split\n",
|
||||
"from azureml.core.run import Run\n",
|
||||
"from sklearn.externals import joblib\n",
|
||||
"\n",
|
||||
"import numpy as np\n",
|
||||
"\n",
|
||||
"os.makedirs('./outputs', exist_ok=True)\n",
|
||||
"\n",
|
||||
"run = Run.get_context()\n",
|
||||
"base_path = run.input_datasets['diabetes']\n",
|
||||
"\n",
|
||||
"X = np.load(glob.glob(os.path.join(base_path, '**/features.npy'), recursive=True)[0])\n",
|
||||
"y = np.load(glob.glob(os.path.join(base_path, '**/labels.npy'), recursive=True)[0])\n",
|
||||
"\n",
|
||||
"X_train, X_test, y_train, y_test = train_test_split(\n",
|
||||
" X, y, test_size=0.2, random_state=0)\n",
|
||||
"data = {'train': {'X': X_train, 'y': y_train},\n",
|
||||
" 'test': {'X': X_test, 'y': y_test}}\n",
|
||||
"\n",
|
||||
"# list of numbers from 0.0 to 1.0 with a 0.05 interval\n",
|
||||
"alphas = np.arange(0.0, 1.0, 0.05)\n",
|
||||
"\n",
|
||||
"for alpha in alphas:\n",
|
||||
" # use Ridge algorithm to create a regression model\n",
|
||||
" reg = Ridge(alpha=alpha)\n",
|
||||
" reg.fit(data['train']['X'], data['train']['y'])\n",
|
||||
"\n",
|
||||
" preds = reg.predict(data['test']['X'])\n",
|
||||
" mse = mean_squared_error(preds, data['test']['y'])\n",
|
||||
" run.log('alpha', alpha)\n",
|
||||
" run.log('mse', mse)\n",
|
||||
"\n",
|
||||
" model_file_name = 'ridge_{0:.2f}.pkl'.format(alpha)\n",
|
||||
" with open(model_file_name, 'wb') as file:\n",
|
||||
" joblib.dump(value=reg, filename='outputs/' + model_file_name)\n",
|
||||
"\n",
|
||||
" print('alpha is {0:.2f}, and mse is {1:0.2f}'.format(alpha, mse))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Configure & Run"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import ScriptRunConfig\n",
|
||||
"\n",
|
||||
"src = ScriptRunConfig(source_directory=script_folder, \n",
|
||||
" script='train_diabetes.py', \n",
|
||||
" # to mount the dataset on the remote compute and pass the mounted path as an argument to the training script\n",
|
||||
" arguments =[dataset.as_named_input('diabetes').as_mount()])\n",
|
||||
"\n",
|
||||
"src.run_config.framework = 'python'\n",
|
||||
"src.run_config.environment = conda_env\n",
|
||||
"src.run_config.target = compute_target.name"
|
||||
]
|
||||
},
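If mounting is unavailable on your compute target, the same input can be downloaded instead; a sketch of the alternative, under the same assumptions as above:

```python
# alternative: copy the files onto the compute rather than mounting them
src_download = ScriptRunConfig(source_directory=script_folder,
                               script='train_diabetes.py',
                               arguments=[dataset.as_named_input('diabetes').as_download()])
```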
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"run = exp.submit(config=src)\n",
|
||||
"\n",
|
||||
"# monitor the run\n",
|
||||
"RunDetails(run).show()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Display run results\n",
|
||||
"You now have a model trained on a remote cluster. Retrieve all the metrics logged during the run, including the accuracy of the model:"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"print(run.get_metrics())\n",
|
||||
"metrics = run.get_metrics()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Register datasets\n",
|
||||
"Use the register() method to register datasets to your workspace so they can be shared with others, reused across various experiments, and referred to by name in your training script."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"dataset = dataset.register(workspace = ws,\n",
|
||||
" name = 'diabetes dataset',\n",
|
||||
" description='training dataset',\n",
|
||||
" create_new_version=True)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"## Register models with datasets\n",
|
||||
"The last step in the training script wrote the model files in a directory named `outputs` in the VM of the cluster where the job is executed. `outputs` is a special directory in that all content in this directory is automatically uploaded to your workspace. This content appears in the run record in the experiment under your workspace. Hence, the model file is now also available in your workspace.\n",
|
||||
"\n",
|
||||
"You can register models with datasets for reproducibility and auditing purpose."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# find the index where MSE is the smallest\n",
|
||||
"indices = list(range(0, len(metrics['mse'])))\n",
|
||||
"min_mse_index = min(indices, key=lambda x: metrics['mse'][x])\n",
|
||||
"\n",
|
||||
"print('When alpha is {1:0.2f}, we have min MSE {0:0.2f}.'.format(\n",
|
||||
" metrics['mse'][min_mse_index], \n",
|
||||
" metrics['alpha'][min_mse_index]\n",
|
||||
"))"
|
||||
]
|
||||
},
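The same lookup can be written more compactly with NumPy; an equivalent sketch:

```python
import numpy as np

# argmin over the logged MSE values gives the best iteration directly
min_mse_index = int(np.argmin(metrics['mse']))
print('When alpha is {0:0.2f}, we have min MSE {1:0.2f}.'.format(
    metrics['alpha'][min_mse_index], metrics['mse'][min_mse_index]))
```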
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# find the best model\n",
|
||||
"best_alpha = metrics['alpha'][min_mse_index]\n",
|
||||
"model_file_name = 'ridge_{0:.2f}.pkl'.format(best_alpha)"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# register the best model with the input dataset\n",
|
||||
"model = run.register_model(model_name='sklearn_diabetes', model_path=os.path.join('outputs', model_file_name),\n",
|
||||
" datasets =[('training data',dataset)])"
|
||||
]
|
||||
}
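Once registered, the model can be fetched by name from any environment with access to the workspace; a sketch, assuming the workspace object `ws`:

```python
from azureml.core.model import Model

# retrieve the registered model and download the pickled file locally
model = Model(workspace=ws, name='sklearn_diabetes')
model.download(target_dir='.', exist_ok=True)
```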
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "sihhu"
|
||||
}
|
||||
],
|
||||
"category": "tutorial",
|
||||
"compute": [
|
||||
"Remote"
|
||||
],
|
||||
"datasets": [
|
||||
"Iris",
|
||||
"Diabetes"
|
||||
],
|
||||
"deployment": [
|
||||
"None"
|
||||
],
|
||||
"exclude_from_index": false,
|
||||
"framework": [
|
||||
"Azure ML"
|
||||
],
|
||||
"friendly_name": "Train with Datasets (Tabular and File)",
|
||||
"index_order": 1,
|
||||
"kernelspec": {
|
||||
"display_name": "Python 3.6",
|
||||
"language": "python",
|
||||
"name": "python36"
|
||||
},
|
||||
"language_info": {
|
||||
"codemirror_mode": {
|
||||
"name": "ipython",
|
||||
"version": 3
|
||||
},
|
||||
"file_extension": ".py",
|
||||
"mimetype": "text/x-python",
|
||||
"name": "python",
|
||||
"nbconvert_exporter": "python",
|
||||
"pygments_lexer": "ipython3",
|
||||
"version": "3.6.9"
|
||||
},
|
||||
"star_tag": [
|
||||
"featured"
|
||||
],
|
||||
"tags": [
|
||||
"Dataset",
|
||||
"Estimator",
|
||||
"ScriptRun"
|
||||
],
|
||||
"task": "Train"
|
||||
},
|
||||
"nbformat": 4,
|
||||
"nbformat_minor": 2
|
||||
}
|
||||
index.md
@@ -12,7 +12,6 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| [Using Azure ML environments](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/using-environments/using-environments.ipynb) | Creating and registering environments | None | Local | None | None | None |
|
||||
| [Estimators in AML with hyperparameter tuning](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training-with-deep-learning/how-to-use-estimator/how-to-use-estimator.ipynb) | Use the Estimator pattern in Azure Machine Learning SDK | None | AML Compute | None | None | None |
|
||||
|
||||
|
||||
## Tutorials
|
||||
|
||||
|Title| Task | Dataset | Training Compute | Deployment Target | ML Framework | Tags |
|
||||
@@ -25,15 +24,10 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| :star:[Data drift on aks](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/monitor-models/data-drift/drift-on-aks.ipynb) | Filtering | NOAA | Remote | AKS | Azure ML | Dataset, Timeseries, Drift |
|
||||
| [Train and deploy a model using Python SDK](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb) | Training and deploying a model from a notebook | Diabetes | Local | Azure Container Instance | None | None |
|
||||
| :star:[Data drift quickdemo](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/work-with-data/datadrift-tutorial/datadrift-tutorial.ipynb) | Filtering | NOAA | Remote | None | Azure ML | Dataset, Timeseries, Drift |
|
||||
| :star:[Filtering data using Tabular Timeseries Dataset related API](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/work-with-data/datasets-tutorial/tabular-timeseries-dataset-filtering.ipynb) | Filtering | NOAA | Local | None | Azure ML | Dataset, Tabular Timeseries |
|
||||
| :star:[Introduction to labeled datasets](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/work-with-data/datasets-tutorial/labeled-datasets/labeled-datasets.ipynb) | Train | | Remote | None | Azure ML | Dataset, label, Estimator |
|
||||
| :star:[Datasets with ML Pipeline](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/work-with-data/datasets-tutorial/pipeline-with-datasets/pipeline-for-image-classification.ipynb) | Train | Fashion MNIST | Remote | None | Azure ML | Dataset, Pipeline, Estimator, ScriptRun |
|
||||
| :star:[Train with Datasets (Tabular and File)](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/work-with-data/datasets-tutorial/train-with-datasets/train-with-datasets.ipynb) | Train | Iris, Diabetes | Remote | None | Azure ML | Dataset, Estimator, ScriptRun |
|
||||
| [Forecasting away from training data](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/forecasting-high-frequency/automl-forecasting-function.ipynb) | Forecasting | None | Remote | None | Azure ML AutoML | Forecasting, Confidence Intervals |
|
||||
| [Automated ML run with basic edition features.](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/classification-bank-marketing-all-features/auto-ml-classification-bank-marketing-all-features.ipynb) | Classification | Bankmarketing | AML | ACI | None | featurization, explainability, remote_run, AutomatedML |
|
||||
| [Classification of credit card fraudulent transactions using Automated ML](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/classification-credit-card-fraud/auto-ml-classification-credit-card-fraud.ipynb) | Classification | Creditcard | AML Compute | None | None | remote_run, AutomatedML |
|
||||
| [Automated ML run with featurization and model explainability.](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/regression-hardware-performance-explanation-and-featurization/auto-ml-regression-hardware-performance-explanation-and-featurization.ipynb) | Regression | MachineData | AML | ACI | None | featurization, explainability, remote_run, AutomatedML |
|
||||
| [Use MLflow with Azure Machine Learning for training and deployment](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/track-and-monitor-experiments/using-mlflow/train-deploy-pytorch/train-and-deploy-pytorch.ipynb) | Use MLflow with Azure Machine Learning to train and deploy a PyTorch image classifier model | MNIST | AML Compute | Azure Container Instance | PyTorch | None |
|
||||
| :star:[Azure Machine Learning Pipeline with DataTransferStep](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-data-transfer.ipynb) | Demonstrates the use of DataTransferStep | Custom | ADF | None | Azure ML | None |
|
||||
| [Getting Started with Azure Machine Learning Pipelines](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-getting-started.ipynb) | Getting Started notebook for AML Pipelines | Custom | AML Compute | None | Azure ML | None |
|
||||
| [Azure Machine Learning Pipeline with AzureBatchStep](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-how-to-use-azurebatch-to-run-a-windows-executable.ipynb) | Demonstrates the use of AzureBatchStep | Custom | Azure Batch | None | Azure ML | None |
|
||||
@@ -51,7 +45,6 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| :star:[Azure Machine Learning Pipelines with Data Dependency](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-with-data-dependency-steps.ipynb) | Demonstrates how to construct a Pipeline with data dependency between steps | Custom | AML Compute | None | Azure ML | None |
|
||||
| [How to run a notebook as a step in AML Pipelines](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-with-notebook-runner-step.ipynb) | Demonstrates the use of NotebookRunnerStep | Custom | AML Compute | None | Azure ML | None |
|
||||
|
||||
|
||||
## Training
|
||||
|
||||
|Title| Task | Dataset | Training Compute | Deployment Target | ML Framework | Tags |
|
||||
@@ -79,7 +72,6 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| [Use MLflow with AML for a remote training run](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/track-and-monitor-experiments/using-mlflow/train-remote/train-remote.ipynb) | Use MLflow tracking APIs together with AML for storing your metrics and artifacts | Diabetes | AML Compute | None | None | None |
|
||||
|
||||
|
||||
|
||||
## Deployment
|
||||
|
||||
|
||||
@@ -91,16 +83,14 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| :star:[Deploy models to AKS using controlled roll out](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/deploy-with-controlled-rollout/deploy-aks-with-controlled-rollout.ipynb) | Deploy a model with Azure Machine Learning | Diabetes | None | Azure Kubernetes Service | Scikit-learn | None |
|
||||
| [Train MNIST in PyTorch, convert, and deploy with ONNX Runtime](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/onnx/onnx-train-pytorch-aml-deploy-mnist.ipynb) | Image Classification | MNIST | AML Compute | Azure Container Instance | ONNX | ONNX Converter |
|
||||
| [Deploy ResNet50 with ONNX Runtime](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/onnx/onnx-modelzoo-aml-deploy-resnet50.ipynb) | Image Classification | ImageNet | Local | Azure Container Instance | ONNX | ONNX Model Zoo |
|
||||
| [Deploy a model as a web service using MLflow](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/track-and-monitor-experiments/using-mlflow/deploy-model/deploy-model.ipynb) | Use MLflow with AML | Diabetes | None | Azure Container Instance | Scikit-learn | None |
|
||||
| :star:[Convert and deploy TinyYolo with ONNX Runtime](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/onnx/onnx-convert-aml-deploy-tinyyolo.ipynb) | Object Detection | PASCAL VOC | local | Azure Container Instance | ONNX | ONNX Converter |
|
||||
|
||||
| [Register Spark model and deploy as webservice](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/spark/model-register-and-deploy-spark.ipynb) | | Iris | None | Azure Container Instance | PySpark | |
|
||||
|
||||
|
||||
## Other Notebooks
|
||||
|Title| Task | Dataset | Training Compute | Deployment Target | ML Framework | Tags |
|
||||
|:----|:-----|:-------:|:----------------:|:-----------------:|:------------:|:------------:|
|
||||
| [DNN Text Featurization](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/classification-text-dnn/auto-ml-classification-text-dnn.ipynb) | Text featurization using DNNs for classification | None | AML Compute | None | None | None |
|
||||
| [Automated ML Grouping with Pipeline.](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/forecasting-grouping/auto-ml-forecasting-grouping.ipynb) | Use AzureML Pipeline to trigger multiple Automated ML runs. | Orange Juice Sales | AML Compute | Azure Container Instance | Scikit-learn, Pytorch | AutomatedML |
|
||||
| [configuration](https://github.com/Azure/MachineLearningNotebooks/blob/master/configuration.ipynb) | | | | | | |
|
||||
| [lightgbm-example](https://github.com/Azure/MachineLearningNotebooks/blob/master//contrib/gbdt/lightgbm/lightgbm-example.ipynb) | | | | | | |
|
||||
| [azure-ml-with-nvidia-rapids](https://github.com/Azure/MachineLearningNotebooks/blob/master//contrib/RAPIDS/azure-ml-with-nvidia-rapids.ipynb) | | | | | | |
|
||||
@@ -110,7 +100,6 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| [auto-ml-regression](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/regression/auto-ml-regression.ipynb) | | | | | | |
|
||||
| [build-model-run-history-03](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/azure-databricks/amlsdk/build-model-run-history-03.ipynb) | | | | | | |
|
||||
| [deploy-to-aci-04](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/azure-databricks/amlsdk/deploy-to-aci-04.ipynb) | | | | | | |
|
||||
| [deploy-to-aks-05](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/azure-databricks/amlsdk/deploy-to-aks-05.ipynb) | | | | | | |
|
||||
| [ingest-data-02](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/azure-databricks/amlsdk/ingest-data-02.ipynb) | | | | | | |
|
||||
| [installation-and-configuration-01](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/azure-databricks/amlsdk/installation-and-configuration-01.ipynb) | | | | | | |
|
||||
| [automl-databricks-local-01](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/azure-databricks/automl/automl-databricks-local-01.ipynb) | | | | | | |
|
||||
@@ -124,7 +113,6 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| [enable-app-insights-in-production-service](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/enable-app-insights-in-production-service/enable-app-insights-in-production-service.ipynb) | | | | | | |
|
||||
| [onnx-model-register-and-deploy](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/onnx/onnx-model-register-and-deploy.ipynb) | | | | | | |
|
||||
| [production-deploy-to-aks](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb) | | | | | | |
|
||||
| [register-model-create-image-deploy-service](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb) | | | | | | |
|
||||
| [tensorflow-model-register-and-deploy](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/deployment/tensorflow/tensorflow-model-register-and-deploy.ipynb) | | | | | | |
|
||||
| [explain-model-on-amlcompute](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/explain-model/azure-integration/remote-explanation/explain-model-on-amlcompute.ipynb) | | | | | | |
|
||||
| [save-retrieve-explanations-run-history](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/explain-model/azure-integration/run-history/save-retrieve-explanations-run-history.ipynb) | | | | | | |
|
||||
@@ -132,15 +120,13 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
|
||||
| [train-explain-model-on-amlcompute-and-deploy](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/explain-model/azure-integration/scoring-time/train-explain-model-on-amlcompute-and-deploy.ipynb) | | | | | | |
|
||||
| [training_notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/notebook_runner/training_notebook.ipynb) | | | | | | |
|
||||
| [nyc-taxi-data-regression-model-building](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/nyc-taxi-data-regression-model-building/nyc-taxi-data-regression-model-building.ipynb) | | | | | | |
|
||||
| [pipeline-batch-scoring](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/pipeline-batch-scoring/pipeline-batch-scoring.ipynb) | | | | | | |
|
||||
| [authentication-in-azureml](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/manage-azureml-service/authentication-in-azureml/authentication-in-azureml.ipynb) | | | | | | |
|
||||
| [Logging APIs](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/track-and-monitor-experiments/logging-api/logging-api.ipynb) | Logging APIs and analyzing results | None | None | None | None | None |
|
||||
| [distributed-cntk-with-custom-docker](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training-with-deep-learning/distributed-cntk-with-custom-docker/distributed-cntk-with-custom-docker.ipynb) | | | | | | |
|
||||
| [notebook_example](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training-with-deep-learning/how-to-use-estimator/notebook_example.ipynb) | | | | | | |
|
||||
| [configuration](https://github.com/Azure/MachineLearningNotebooks/blob/master//setup-environment/configuration.ipynb) | | | | | | |
|
||||
| [img-classification-part1-training](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/img-classification-part1-training.ipynb) | | | | | | |
|
||||
| [img-classification-part2-deploy](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/img-classification-part2-deploy.ipynb) | | | | | | |
|
||||
| [regression-automated-ml](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/regression-automated-ml.ipynb) | | | | | | |
|
||||
| [tutorial-1st-experiment-sdk-train](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/tutorial-1st-experiment-sdk-train.ipynb) | | | | | | |
|
||||
| [tutorial-pipeline-batch-scoring-classification](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/tutorial-pipeline-batch-scoring-classification.ipynb) | | | | | | |
|
||||
|
||||
| [tutorial-1st-experiment-sdk-train](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/create-first-ml-experiment/tutorial-1st-experiment-sdk-train.ipynb) | | | | | | |
|
||||
| [img-classification-part1-training](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/image-classification-mnist-data/img-classification-part1-training.ipynb) | | | | | | |
|
||||
| [img-classification-part2-deploy](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/image-classification-mnist-data/img-classification-part2-deploy.ipynb) | | | | | | |
|
||||
| [tutorial-pipeline-batch-scoring-classification](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/machine-learning-pipelines-advanced/tutorial-pipeline-batch-scoring-classification.ipynb) | | | | | | |
|
||||
| [regression-automated-ml](https://github.com/Azure/MachineLearningNotebooks/blob/master//tutorials/regression-automl-nyc-taxi-data/regression-automated-ml.ipynb) | | | | | | |
|
||||
|
||||
@@ -102,7 +102,7 @@
|
||||
"source": [
|
||||
"import azureml.core\n",
|
||||
"\n",
|
||||
"print(\"This notebook was created using version 1.0.83 of the Azure ML SDK\")\n",
|
||||
"print(\"This notebook was created using version 1.1.0rc0 of the Azure ML SDK\")\n",
|
||||
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
|
||||
]
|
||||
},
|
||||
|
||||
@@ -1,27 +1,35 @@
|
||||
## Azure Machine Learning service Tutorial
|
||||
# Azure Machine Learning Tutorials
|
||||
|
||||
Complete these tutorials to learn how to train and deploy models using Azure Machine Learning services and Python SDK. These Notebooks accompany the
|
||||
two sets of tutorial articles for:
|
||||
Azure Machine Learning, a cloud-based environment you can use to train, deploy, automate, manage, and track ML models.
|
||||
|
||||
* [Image classification using MNIST dataset](https://docs.microsoft.com/en-us/azure/machine-learning/service/tutorial-train-models-with-aml)
|
||||
* [Regression using NYC Taxi dataset](https://docs.microsoft.com/en-us/azure/machine-learning/service/tutorial-data-prep)
|
||||
Azure Machine Learning can be used for any kind of machine learning, from classical ML to supervised, unsupervised, and deep learning.
|
||||
|
||||
If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, run the [configuration Notebook](../configuration.ipynb) notebook first to set up your Azure ML Workspace. Then, run the notebooks in following recommended order.
|
||||
This folder contains a collection of Jupyter Notebooks with the code used in accompanying step-by-step tutorials.
|
||||
|
||||
### Create first ML experiment
|
||||
## Set up your environment
|
||||
|
||||
* [Part 1](https://docs.microsoft.com/azure/machine-learning/service/tutorial-quickstart-setup): Set up workspace & dev environment
|
||||
* [Part 2](tutorial-quickstart-train-model.ipynb): Learn the foundational design patterns in Azure Machine Learning service, and train a simple scikit-learn model based on the diabetes data set
|
||||
If you are using an Azure Machine Learning Notebook VM, everything is already set up for you. Otherwise, see the [get started creating your first ML experiment with the Python SDK tutorial](https://docs.microsoft.com/en-us/azure/machine-learning/tutorial-1st-experiment-sdk-setup).
|
||||
|
||||
### Image classification
|
||||
## Introductory Samples
|
||||
|
||||
* [Part 1](img-classification-part1-training.ipynb): Train an image classification model with Azure Machine Learning.
|
||||
* [Part 2](img-classification-part2-deploy.ipynb): Deploy an image classification model from first tutorial in Azure Container Instance (ACI).
|
||||
The following tutorials are intended to provide an introductory overview of Azure Machine Learning.
|
||||
|
||||
### Regression
|
||||
* [Part 1](regression-part1-data-prep.ipynb): Prepare the data using Azure Machine Learning Data Prep SDK.
|
||||
* [Part 2](regression-part2-automated-ml.ipynb): Train a model using Automated Machine Learning.
|
||||
| Tutorial | Description | Notebook | Task | Framework |
|
||||
| --- | --- | --- | --- | --- |
|
||||
| [Train your first ML Model](https://docs.microsoft.com/azure/machine-learning/tutorial-1st-experiment-sdk-train) | Learn the foundational design patterns in Azure Machine Learning and train a scikit-learn model based on a diabetes data set. | [tutorial-quickstart-train-model.ipynb](create-first-ml-experiment/tutorial-1st-experiment-sdk-train.ipynb) | Regression | Scikit-Learn
|
||||
| [Train an image classification model](https://docs.microsoft.com/azure/machine-learning/tutorial-train-models-with-aml) | Train a scikit-learn image classification model. | [img-classification-part1-training.ipynb](image-classification-mnist-data/img-classification-part1-training.ipynb) | Image Classification | Scikit-Learn
|
||||
| [Deploy an image classification model](https://docs.microsoft.com/azure/machine-learning/tutorial-deploy-models-with-aml) | Deploy a scikit-learn image classification model to Azure Container Instances. | [img-classification-part2-deploy.ipynb](image-classification-mnist-data/img-classification-part2-deploy.ipynb) | Image Classification | Scikit-Learn
|
||||
| [Use automated machine learning to predict taxi fares](https://docs.microsoft.com/azure/machine-learning/tutorial-auto-train-models) | Train a regression model to predict taxi fares using Automated Machine Learning. | [regression-part2-automated-ml.ipynb](regression-automl-nyc-taxi-data/regression-automated-ml.ipynb) | Regression | Automated ML
|
||||
|
||||
Also find quickstarts and how-tos on the [official documentation site for Azure Machine Learning service](https://docs.microsoft.com/en-us/azure/machine-learning/service/).
|
||||
## Advanced Samples
|
||||
|
||||
The following tutorials are intended to provide examples of more advanced features in Azure Machine Learning.
|
||||
|
||||
| Tutorial | Description | Notebook | Task | Framework |
|
||||
| --- | --- | --- | --- | --- |
|
||||
| [Build an Azure Machine Learning pipeline for batch scoring](https://docs.microsoft.com/azure/machine-learning/tutorial-pipeline-batch-scoring-classification) | Create an Azure Machine Learning pipeline to run batch scoring image classification jobs | [tutorial-pipeline-batch-scoring-classification.ipynb](machine-learning-pipelines-advanced/tutorial-pipeline-batch-scoring-classification.ipynb) | Image Classification | TensorFlow
|
||||
Complete these tutorials to learn how to train and deploy models using Azure Machine Learning services and Python SDK. These Notebooks accompany the tutorial articles for:
|
||||
|
||||
For additional documentation and resources, see the [official documentation site for Azure Machine Learning](https://docs.microsoft.com/azure/machine-learning/).
|
||||
|
||||

|
||||
|
@@ -317,7 +317,9 @@
|
||||
"\n",
|
||||
"If you used a cloud notebook server, stop the VM when you are not using it to reduce cost.\n",
|
||||
"\n",
|
||||
"1. In your workspace, select **Notebook VMs**.\n",
|
||||
"1. In your workspace, select **Compute**.\n",
|
||||
"\n",
|
||||
"1. Select the **Notebook VMs** tab in the compute page.\n",
|
||||
"\n",
|
||||
"1. From the list, select the VM.\n",
|
||||
"\n",
|
||||
@@ -62,18 +62,7 @@
|
||||
" model_name=model_name,\n",
|
||||
" tags={\"data\": \"mnist\", \"model\": \"classification\"},\n",
|
||||
" description=\"Mnist handwriting recognition\",\n",
|
||||
" workspace=ws)\n",
|
||||
"\n",
|
||||
"# download test data\n",
|
||||
"import os\n",
|
||||
"import urllib.request\n",
|
||||
"\n",
|
||||
"data_folder = os.path.join(os.getcwd(), 'data')\n",
|
||||
"os.makedirs(data_folder, exist_ok = True)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"urllib.request.urlretrieve('http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz', filename=os.path.join(data_folder, 'test-images.gz'))\n",
|
||||
"urllib.request.urlretrieve('http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz', filename=os.path.join(data_folder, 'test-labels.gz'))"
|
||||
" workspace=ws)"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -150,10 +139,42 @@
|
||||
"## Test model locally\n",
|
||||
"\n",
|
||||
"Before deploying, make sure your model is working locally by:\n",
|
||||
"* Downloading the test data if you haven't already\n",
|
||||
"* Loading test data\n",
|
||||
"* Predicting test data\n",
|
||||
"* Examining the confusion matrix\n",
|
||||
"* Examining the confusion matrix"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Download test data\n",
|
||||
"If you haven't already, download the test data to the **./data/** directory"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"# download test data\n",
|
||||
"import os\n",
|
||||
"import urllib.request\n",
|
||||
"\n",
|
||||
"data_folder = os.path.join(os.getcwd(), 'data')\n",
|
||||
"os.makedirs(data_folder, exist_ok = True)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"urllib.request.urlretrieve('http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz', filename=os.path.join(data_folder, 'test-images.gz'))\n",
|
||||
"urllib.request.urlretrieve('http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz', filename=os.path.join(data_folder, 'test-labels.gz'))"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Load test data\n",
|
||||
"\n",
|
||||
"Load the test data from the **./data/** directory created during the training tutorial."
|
||||
@@ -190,10 +211,11 @@
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import pickle\n",
|
||||
"from sklearn.externals import joblib\n",
|
||||
"import joblib\n",
|
||||
"\n",
|
||||
"clf = joblib.load( os.path.join(os.getcwd(), 'sklearn_mnist_model.pkl'))\n",
|
||||
"y_hat = clf.predict(X_test)"
|
||||
"y_hat = clf.predict(X_test)\n",
|
||||
"print(y_hat)"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -286,8 +308,7 @@
|
||||
"import numpy as np\n",
|
||||
"import os\n",
|
||||
"import pickle\n",
|
||||
"from sklearn.externals import joblib\n",
|
||||
"from sklearn.linear_model import LogisticRegression\n",
|
||||
"import joblib\n",
|
||||
"\n",
|
||||
"def init():\n",
|
||||
" global model\n",
|
||||
@@ -327,7 +348,7 @@
|
||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||
"\n",
|
||||
"myenv = CondaDependencies()\n",
|
||||
"myenv.add_conda_package(\"scikit-learn\")\n",
|
||||
"myenv.add_pip_package(\"scikit-learn\")\n",
|
||||
"myenv.add_pip_package(\"azureml-defaults\")\n",
|
||||
"\n",
|
||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||
Binary file not shown.
@@ -0,0 +1,83 @@
|
||||
# Copyright (c) Microsoft. All rights reserved.
|
||||
# Licensed under the MIT license.
|
||||
|
||||
import os
|
||||
import argparse
|
||||
import datetime
|
||||
import time
|
||||
import tensorflow as tf
|
||||
from math import ceil
|
||||
import numpy as np
|
||||
import shutil
|
||||
from tensorflow.contrib.slim.python.slim.nets import inception_v3
|
||||
|
||||
from azureml.core import Run
|
||||
from azureml.core.model import Model
|
||||
from azureml.core.dataset import Dataset
|
||||
|
||||
slim = tf.contrib.slim
|
||||
|
||||
image_size = 299
|
||||
num_channel = 3
|
||||
|
||||
|
||||
def get_class_label_dict():
|
||||
label = []
|
||||
proto_as_ascii_lines = tf.gfile.GFile("labels.txt").readlines()
|
||||
for l in proto_as_ascii_lines:
|
||||
label.append(l.rstrip())
|
||||
return label
|
||||
|
||||
|
||||
def init():
|
||||
global g_tf_sess, probabilities, label_dict, input_images
|
||||
|
||||
parser = argparse.ArgumentParser(description="Start a tensorflow model serving")
|
||||
parser.add_argument('--model_name', dest="model_name", required=True)
|
||||
parser.add_argument('--labels_name', dest="labels_name", required=True)
|
||||
args, _ = parser.parse_known_args()
|
||||
|
||||
workspace = Run.get_context(allow_offline=False).experiment.workspace
|
||||
label_ds = Dataset.get_by_name(workspace=workspace, name=args.labels_name)
|
||||
label_ds.download(target_path='.', overwrite=True)
|
||||
|
||||
label_dict = get_class_label_dict()
|
||||
classes_num = len(label_dict)
|
||||
|
||||
with slim.arg_scope(inception_v3.inception_v3_arg_scope()):
|
||||
input_images = tf.placeholder(tf.float32, [1, image_size, image_size, num_channel])
|
||||
logits, _ = inception_v3.inception_v3(input_images,
|
||||
num_classes=classes_num,
|
||||
is_training=False)
|
||||
probabilities = tf.argmax(logits, 1)
|
||||
|
||||
config = tf.ConfigProto()
|
||||
config.gpu_options.allow_growth = True
|
||||
g_tf_sess = tf.Session(config=config)
|
||||
g_tf_sess.run(tf.global_variables_initializer())
|
||||
g_tf_sess.run(tf.local_variables_initializer())
|
||||
|
||||
model_path = Model.get_model_path(args.model_name)
|
||||
saver = tf.train.Saver()
|
||||
saver.restore(g_tf_sess, model_path)
|
||||
|
||||
|
||||
def file_to_tensor(file_path):
|
||||
image_string = tf.read_file(file_path)
|
||||
image = tf.image.decode_image(image_string, channels=3)
|
||||
|
||||
image.set_shape([None, None, None])
|
||||
image = tf.image.resize_images(image, [image_size, image_size])
|
||||
image = tf.divide(tf.subtract(image, [0]), [255])
|
||||
image.set_shape([image_size, image_size, num_channel])
|
||||
return image
|
||||
|
||||
|
||||
def run(mini_batch):
|
||||
result_list = []
|
||||
for file_path in mini_batch:
|
||||
test_image = file_to_tensor(file_path)
|
||||
out = g_tf_sess.run(test_image)
|
||||
result = g_tf_sess.run(probabilities, feed_dict={input_images: [out]})
|
||||
result_list.append(os.path.basename(file_path) + ": " + label_dict[result[0]])
|
||||
return result_list
|
||||
@@ -15,19 +15,16 @@
|
||||
""
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"**Note**: Azure Machine Learning recently released ParallelRunStep for public preview, this will allow for parallelization of your workload across many compute nodes without the difficulty of orchestrating worker pools and queues. See the [batch inference notebooks](../contrib/batch_inferencing/) for examples on how to get started."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"# Use Azure Machine Learning Pipelines for batch prediction\n",
|
||||
"\n",
|
||||
"## Note\n",
|
||||
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook.\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"In this tutorial, you use Azure Machine Learning service pipelines to run a batch scoring image classification job. The example job uses the pre-trained [Inception-V3](https://arxiv.org/abs/1512.00567) CNN (convolutional neural network) Tensorflow model to classify unlabeled images. Machine learning pipelines optimize your workflow with speed, portability, and reuse so you can focus on your expertise, machine learning, rather than on infrastructure and automation. After building and publishing a pipeline, you can configure a REST endpoint to enable triggering the pipeline from any HTTP library on any platform.\n",
|
||||
"\n",
|
||||
"\n",
|
||||
@@ -37,6 +34,7 @@
|
||||
"> * Create data objects to fetch and output data\n",
|
||||
"> * Download, prepare, and register the model to your workspace\n",
|
||||
"> * Provision compute targets and create a scoring script\n",
|
||||
"> * Use ParallelRunStep to do batch scoring\n",
|
||||
"> * Build, run, and publish a pipeline\n",
|
||||
"> * Enable a REST endpoint for the pipeline\n",
|
||||
"\n",
|
||||
@@ -111,14 +109,14 @@
|
||||
"source": [
|
||||
"## Create data objects\n",
|
||||
"\n",
|
||||
"When building pipelines, `DataReference` objects are used for reading data from workspace datastores, and `PipelineData` objects are used for transferring intermediate data between pipeline steps.\n",
|
||||
"When building pipelines, `Dataset` objects are used for reading data from workspace datastores, and `PipelineData` objects are used for transferring intermediate data between pipeline steps.\n",
|
||||
"\n",
|
||||
"This batch scoring example only uses one pipeline step, but in use-cases with multiple steps, the typical flow will include:\n",
|
||||
"\n",
|
||||
"1. Using `DataReference` objects as **inputs** to fetch raw data, performing some transformations, then **outputting** a `PipelineData` object.\n",
|
||||
"1. Using `Dataset` objects as **inputs** to fetch raw data, performing some transformations, then **outputting** a `PipelineData` object.\n",
|
||||
"1. Use the previous step's `PipelineData` **output object** as an *input object*, repeated for subsequent steps.\n",
|
||||
"\n",
|
||||
"For this scenario you create `DataReference` objects corresponding to the datastore directories for both the input images and the classification labels (y-test values). You also create a `PipelineData` object for the batch scoring output data."
|
||||
"For this scenario you create `Dataset` objects corresponding to the datastore directories for both the input images and the classification labels (y-test values). You also create a `PipelineData` object for the batch scoring output data."
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -127,21 +125,11 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.data.data_reference import DataReference\n",
|
||||
"from azureml.core.dataset import Dataset\n",
|
||||
"from azureml.pipeline.core import PipelineData\n",
|
||||
"\n",
|
||||
"input_images = DataReference(datastore=batchscore_blob, \n",
|
||||
" data_reference_name=\"input_images\",\n",
|
||||
" path_on_datastore=\"batchscoring/images\",\n",
|
||||
" mode=\"download\"\n",
|
||||
" )\n",
|
||||
"\n",
|
||||
"label_dir = DataReference(datastore=batchscore_blob, \n",
|
||||
" data_reference_name=\"input_labels\",\n",
|
||||
" path_on_datastore=\"batchscoring/labels\",\n",
|
||||
" mode=\"download\" \n",
|
||||
" )\n",
|
||||
"\n",
|
||||
"input_images = Dataset.File.from_files((batchscore_blob, \"batchscoring/images/\"))\n",
|
||||
"label_ds = Dataset.File.from_files((batchscore_blob, \"batchscoring/labels/*.txt\"))\n",
|
||||
"output_dir = PipelineData(name=\"scores\", \n",
|
||||
" datastore=def_data_store, \n",
|
||||
" output_path_on_compute=\"batchscoring/results\")"
|
||||
@@ -150,6 +138,25 @@
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"Next, we need to register the datasets with the workspace."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"input_images = input_images.register(workspace = ws, name = \"input_images\")\n",
|
||||
"label_ds = label_ds.register(workspace = ws, name = \"label_ds\")"
|
||||
]
|
||||
},
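Registering makes these datasets retrievable by name, which is exactly how the scoring script's `init()` fetches the labels; for reference, a sketch:

```python
from azureml.core.dataset import Dataset

# fetch a registered dataset by name from the workspace
label_ds = Dataset.get_by_name(workspace=ws, name="label_ds")
```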
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"## Download and register the model"
|
||||
]
|
||||
@@ -192,13 +199,17 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"import shutil\n",
|
||||
"from azureml.core.model import Model\n",
|
||||
" \n",
|
||||
"\n",
|
||||
"# register downloaded model \n",
|
||||
"model = Model.register(model_path=\"models/inception_v3.ckpt\",\n",
|
||||
" model_name=\"inception\",\n",
|
||||
" tags={\"pretrained\": \"inception\"},\n",
|
||||
" description=\"Imagenet trained tensorflow inception\",\n",
|
||||
" workspace=ws)"
|
||||
" workspace=ws)\n",
|
||||
"# remove the downloaded dir after registration if you wish\n",
|
||||
"shutil.rmtree(\"models\")"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -244,142 +255,16 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"To do the scoring, you create a batch scoring script `batch_scoring.py`, and write it to the current directory. The script takes input images, applies the classification model, and outputs the predictions to a results file.\n",
|
||||
"To do the scoring, you create a batch scoring script `batch_scoring.py`, and write it to the current directory. The script takes a minibatch of input images, applies the classification model, and outputs the predictions to a results file.\n",
|
||||
"\n",
|
||||
"The script `batch_scoring.py` takes the following parameters, which get passed from the `PythonScriptStep` that you create later:\n",
|
||||
"The script `batch_scoring.py` takes the following parameters, which get passed from the `ParallelRunStep` that you create later:\n",
|
||||
"\n",
|
||||
"- `--model_name`: the name of the model being used\n",
|
||||
"- `--label_dir` : the directory holding the `labels.txt` file \n",
|
||||
"- `--dataset_path`: the directory containing the input images\n",
|
||||
"- `--output_dir` : the script will run the model on the data and output a `results-label.txt` to this directory\n",
|
||||
"- `--batch_size` : the batch size used in running the model\n",
|
||||
"- `--labels_name` : the name of the `Dataset` holding the `labels.txt` file \n",
|
||||
"\n",
|
||||
"The pipelines infrastructure uses the `ArgumentParser` class to pass parameters into pipeline steps. For example, in the code below the first argument `--model_name` is given the property identifier `model_name`. In the `main()` function, this property is accessed using `Model.get_model_path(args.model_name)`."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"%%writefile batch_scoring.py\n",
|
||||
"\n",
|
||||
"import os\n",
|
||||
"import argparse\n",
|
||||
"import datetime\n",
|
||||
"import time\n",
|
||||
"import tensorflow as tf\n",
|
||||
"from math import ceil\n",
|
||||
"import numpy as np\n",
|
||||
"import shutil\n",
|
||||
"from tensorflow.contrib.slim.python.slim.nets import inception_v3\n",
|
||||
"from azureml.core.model import Model\n",
|
||||
"\n",
|
||||
"slim = tf.contrib.slim\n",
|
||||
"\n",
|
||||
"parser = argparse.ArgumentParser(description=\"Start a tensorflow model serving\")\n",
|
||||
"parser.add_argument('--model_name', dest=\"model_name\", required=True)\n",
|
||||
"parser.add_argument('--label_dir', dest=\"label_dir\", required=True)\n",
|
||||
"parser.add_argument('--dataset_path', dest=\"dataset_path\", required=True)\n",
|
||||
"parser.add_argument('--output_dir', dest=\"output_dir\", required=True)\n",
|
||||
"parser.add_argument('--batch_size', dest=\"batch_size\", type=int, required=True)\n",
|
||||
"\n",
|
||||
"args = parser.parse_args()\n",
|
||||
"\n",
|
||||
"image_size = 299\n",
|
||||
"num_channel = 3\n",
|
||||
"\n",
|
||||
"# create output directory if it does not exist\n",
|
||||
"os.makedirs(args.output_dir, exist_ok=True)\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"def get_class_label_dict(label_file):\n",
|
||||
" label = []\n",
|
||||
" proto_as_ascii_lines = tf.gfile.GFile(label_file).readlines()\n",
|
||||
" for l in proto_as_ascii_lines:\n",
|
||||
" label.append(l.rstrip())\n",
|
||||
" return label\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"class DataIterator:\n",
|
||||
" def __init__(self, data_dir):\n",
|
||||
" self.file_paths = []\n",
|
||||
" image_list = os.listdir(data_dir)\n",
|
||||
" self.file_paths = [data_dir + '/' + file_name.rstrip() for file_name in image_list]\n",
|
||||
"\n",
|
||||
" self.labels = [1 for file_name in self.file_paths]\n",
|
||||
"\n",
|
||||
" @property\n",
|
||||
" def size(self):\n",
|
||||
" return len(self.labels)\n",
|
||||
"\n",
|
||||
" def input_pipeline(self, batch_size):\n",
|
||||
" images_tensor = tf.convert_to_tensor(self.file_paths, dtype=tf.string)\n",
|
||||
" labels_tensor = tf.convert_to_tensor(self.labels, dtype=tf.int64)\n",
|
||||
" input_queue = tf.train.slice_input_producer([images_tensor, labels_tensor], shuffle=False)\n",
|
||||
" labels = input_queue[1]\n",
|
||||
" images_content = tf.read_file(input_queue[0])\n",
|
||||
"\n",
|
||||
" image_reader = tf.image.decode_jpeg(images_content, channels=num_channel, name=\"jpeg_reader\")\n",
|
||||
" float_caster = tf.cast(image_reader, tf.float32)\n",
|
||||
" new_size = tf.constant([image_size, image_size], dtype=tf.int32)\n",
|
||||
" images = tf.image.resize_images(float_caster, new_size)\n",
|
||||
" images = tf.divide(tf.subtract(images, [0]), [255])\n",
|
||||
"\n",
|
||||
" image_batch, label_batch = tf.train.batch([images, labels], batch_size=batch_size, capacity=5 * batch_size)\n",
|
||||
" return image_batch\n",
|
||||
"\n",
|
||||
"\n",
|
||||
"def main(_):\n",
|
||||
" label_file_name = os.path.join(args.label_dir, \"labels.txt\")\n",
|
||||
" label_dict = get_class_label_dict(label_file_name)\n",
|
||||
" classes_num = len(label_dict)\n",
|
||||
" test_feeder = DataIterator(data_dir=args.dataset_path)\n",
|
||||
" total_size = len(test_feeder.labels)\n",
|
||||
" count = 0\n",
|
||||
" \n",
|
||||
" # get model from model registry\n",
|
||||
" model_path = Model.get_model_path(args.model_name)\n",
|
||||
" \n",
|
||||
" with tf.Session() as sess:\n",
|
||||
" test_images = test_feeder.input_pipeline(batch_size=args.batch_size)\n",
|
||||
" with slim.arg_scope(inception_v3.inception_v3_arg_scope()):\n",
|
||||
" input_images = tf.placeholder(tf.float32, [args.batch_size, image_size, image_size, num_channel])\n",
|
||||
" logits, _ = inception_v3.inception_v3(input_images,\n",
|
||||
" num_classes=classes_num,\n",
|
||||
" is_training=False)\n",
|
||||
" probabilities = tf.argmax(logits, 1)\n",
|
||||
"\n",
|
||||
" sess.run(tf.global_variables_initializer())\n",
|
||||
" sess.run(tf.local_variables_initializer())\n",
|
||||
" coord = tf.train.Coordinator()\n",
|
||||
" threads = tf.train.start_queue_runners(sess=sess, coord=coord)\n",
|
||||
" saver = tf.train.Saver()\n",
|
||||
" saver.restore(sess, model_path)\n",
|
||||
" out_filename = os.path.join(args.output_dir, \"result-labels.txt\")\n",
|
||||
" with open(out_filename, \"w\") as result_file:\n",
|
||||
" i = 0\n",
|
||||
" while count < total_size and not coord.should_stop():\n",
|
||||
" test_images_batch = sess.run(test_images)\n",
|
||||
" file_names_batch = test_feeder.file_paths[i * args.batch_size:\n",
|
||||
" min(test_feeder.size, (i + 1) * args.batch_size)]\n",
|
||||
" results = sess.run(probabilities, feed_dict={input_images: test_images_batch})\n",
|
||||
" new_add = min(args.batch_size, total_size - count)\n",
|
||||
" count += new_add\n",
|
||||
" i += 1\n",
|
||||
" for j in range(new_add):\n",
|
||||
" result_file.write(os.path.basename(file_names_batch[j]) + \": \" + label_dict[results[j]] + \"\\n\")\n",
|
||||
" result_file.flush()\n",
|
||||
" coord.request_stop()\n",
|
||||
" coord.join(threads)\n",
|
||||
"\n",
|
||||
" shutil.copy(out_filename, \"./outputs/\")\n",
|
||||
"\n",
|
||||
"if __name__ == \"__main__\":\n",
|
||||
" tf.app.run()"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
@@ -407,26 +292,23 @@
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": [
|
||||
"from azureml.core import Environment\n",
|
||||
"from azureml.core.conda_dependencies import CondaDependencies\n",
|
||||
"from azureml.core.runconfig import DEFAULT_GPU_IMAGE\n",
|
||||
"from azureml.core.runconfig import CondaDependencies, RunConfiguration\n",
|
||||
"\n",
|
||||
"cd = CondaDependencies.create(pip_packages=[\"tensorflow-gpu==1.13.1\", \"azureml-defaults\"])\n",
|
||||
"\n",
|
||||
"amlcompute_run_config = RunConfiguration(conda_dependencies=cd)\n",
|
||||
"amlcompute_run_config.environment.docker.enabled = True\n",
|
||||
"amlcompute_run_config.environment.docker.base_image = DEFAULT_GPU_IMAGE\n",
|
||||
"amlcompute_run_config.environment.spark.precache_packages = False"
|
||||
"env = Environment(name=\"parallelenv\")\n",
|
||||
"env.python.conda_dependencies=cd\n",
|
||||
"env.docker.base_image = DEFAULT_GPU_IMAGE"
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"### Parameterize the pipeline\n",
|
||||
"\n",
|
||||
"Define a custom parameter for the pipeline to control the batch size. After the pipeline has been published and exposed via a REST endpoint, any configured parameters are also exposed and can be specified in the JSON payload when rerunning the pipeline with an HTTP request.\n",
|
||||
"\n",
|
||||
"Create a `PipelineParameter` object to enable this behavior, and define a name and default value."
|
||||
"### Create the configuration to wrap the inference script\n",
|
||||
"Create the pipeline step using the script, environment configuration, and parameters. Specify the compute target you already attached to your workspace as the target of execution of the script. We will use PythonScriptStep to create the pipeline step."
|
||||
]
},
{
@@ -435,8 +317,19 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core.graph import PipelineParameter\n",
"batch_size_param = PipelineParameter(name=\"param_batch_size\", default_value=20)"
"from azureml.contrib.pipeline.steps import ParallelRunConfig\n",
"\n",
"parallel_run_config = ParallelRunConfig(\n",
" environment=env,\n",
" entry_script=\"batch_scoring.py\",\n",
" source_directory=\"scripts\",\n",
" output_action=\"append_row\",\n",
" mini_batch_size=\"20\",\n",
" error_threshold=1,\n",
" compute_target=compute_target,\n",
" process_count_per_node=2,\n",
" node_count=1\n",
")"
]
},
{
@@ -452,7 +345,7 @@
"* input and output data, and any custom parameters\n",
"* reference to a script or SDK-logic to run during the step\n",
"\n",
"There are multiple classes that inherit from the parent class [`PipelineStep`](https://docs.microsoft.com/python/api/azureml-pipeline-core/azureml.pipeline.core.builder.pipelinestep?view=azure-ml-py) to assist with building a step using certain frameworks and stacks. In this example, you use the [`PythonScriptStep`](https://docs.microsoft.com/python/api/azureml-pipeline-steps/azureml.pipeline.steps.python_script_step.pythonscriptstep?view=azure-ml-py) class to define your step logic using a custom python script. Note that if an argument to your script is either an input to the step or output of the step, it must be defined **both** in the `arguments` array, **as well as** in either the `input` or `output` parameter, respectively. \n",
"There are multiple classes that inherit from the parent class [`PipelineStep`](https://docs.microsoft.com/python/api/azureml-pipeline-core/azureml.pipeline.core.builder.pipelinestep?view=azure-ml-py) to assist with building a step using certain frameworks and stacks. In this example, you use the [`ParallelRunStep`](https://docs.microsoft.com/en-us/python/api/azureml-contrib-pipeline-steps/azureml.contrib.pipeline.steps.parallelrunstep?view=azure-ml-py) class to define your step logic using a scoring script. \n",
"\n",
"An object reference in the `outputs` array becomes available as an **input** for a subsequent pipeline step, for scenarios where there is more than one step."
]
@@ -463,20 +356,20 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.steps import PythonScriptStep\n",
"from azureml.contrib.pipeline.steps import ParallelRunStep\n",
"from datetime import datetime\n",
"\n",
"batch_score_step = PythonScriptStep(\n",
" name=\"batch_scoring\",\n",
" script_name=\"batch_scoring.py\",\n",
" arguments=[\"--dataset_path\", input_images, \n",
" \"--model_name\", \"inception\",\n",
" \"--label_dir\", label_dir, \n",
" \"--output_dir\", output_dir, \n",
" \"--batch_size\", batch_size_param],\n",
" compute_target=compute_target,\n",
" inputs=[input_images, label_dir],\n",
" outputs=[output_dir],\n",
" runconfig=amlcompute_run_config\n",
"parallel_step_name = \"batchscoring-\" + datetime.now().strftime(\"%Y%m%d%H%M\")\n",
"\n",
"batch_score_step = ParallelRunStep(\n",
" name=parallel_step_name,\n",
" inputs=[input_images.as_named_input(\"input_images\")],\n",
" output=output_dir,\n",
" models=[model],\n",
" arguments=[\"--model_name\", \"inception\",\n",
" \"--labels_name\", \"label_ds\"],\n",
" parallel_run_config=parallel_run_config,\n",
" allow_reuse=False\n",
")"
]
},
@@ -510,7 +403,7 @@
"from azureml.pipeline.core import Pipeline\n",
"\n",
"pipeline = Pipeline(workspace=ws, steps=[batch_score_step])\n",
"pipeline_run = Experiment(ws, 'batch_scoring').submit(pipeline, pipeline_parameters={\"param_batch_size\": 20})\n",
"pipeline_run = Experiment(ws, \"batch_scoring\").submit(pipeline)\n",
"pipeline_run.wait_for_completion(show_output=True)"
]
},
@@ -534,14 +427,20 @@
"metadata": {},
"outputs": [],
"source": [
"batch_run = next(pipeline_run.get_children())\n",
"batch_output = batch_run.get_output_data(\"scores\")\n",
"batch_output.download(local_path=\"inception_results\")\n",
"\n",
"import pandas as pd\n",
"for root, dirs, files in os.walk(\"inception_results\"):\n",
" for file in files:\n",
" if file.endswith(\"parallel_run_step.txt\"):\n",
" result_file = os.path.join(root,file)\n",
"\n",
"step_run = list(pipeline_run.get_children())[0]\n",
"step_run.download_file(\"./outputs/result-labels.txt\")\n",
"\n",
"df = pd.read_csv(\"result-labels.txt\", delimiter=\":\", header=None)\n",
"df = pd.read_csv(result_file, delimiter=\":\", header=None)\n",
"df.columns = [\"Filename\", \"Prediction\"]\n",
"df.head(10)"
"print(\"Prediction has \", df.shape[0], \" rows\")\n",
"df.head(10) "
]
},
{
@@ -599,7 +498,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Get the REST url from the `endpoint` property of the published pipeline object. You can also find the REST url in your workspace in the portal. Build an HTTP POST request to the endpoint, specifying your authentication header. Additionally, add a JSON payload object with the experiment name and the batch size parameter. As a reminder, the `param_batch_size` is passed through to your `batch_scoring.py` script because you defined it as a `PipelineParameter` object in the step configuration.\n",
"Get the REST url from the `endpoint` property of the published pipeline object. You can also find the REST url in your workspace in the portal. Build an HTTP POST request to the endpoint, specifying your authentication header. Additionally, add a JSON payload object with the experiment name and the batch size parameter. As a reminder, the `process_count_per_node` is passed through to `ParallelRunStep` because you defined it is defined as a `PipelineParameter` object in the step configuration.\n",
|
||||
"\n",
|
||||
"Make the request to trigger the run. Access the `Id` key from the response dict to get the value of the run id."
|
||||
]
|
||||
@@ -616,8 +515,25 @@
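"# POST to the published REST endpoint; ParameterAssignments overrides the pipeline's default parameter values\n",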
"response = requests.post(rest_endpoint, \n",
" headers=auth_header, \n",
" json={\"ExperimentName\": \"batch_scoring\",\n",
" \"ParameterAssignments\": {\"param_batch_size\": 50}})\n",
"run_id = response.json()[\"Id\"]"
" \"ParameterAssignments\": {\"process_count_per_node\": 6}})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception(\"Received bad response from the endpoint: {}\\n\"\n",
" \"Response Code: {}\\n\"\n",
" \"Headers: {}\\n\"\n",
" \"Content: {}\".format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
|
||||
"run_id = response.json().get('Id')\n",
|
||||
"print('Submitted pipeline run: ', run_id)"
|
||||
]
|
||||
},
|
||||
{
|
||||
@@ -652,7 +568,8 @@
|
||||
"\n",
|
||||
"If you used a cloud notebook server, stop the VM when you are not using it to reduce cost.\n",
|
||||
"\n",
|
||||
"1. In your workspace, select **Notebook VMs**.\n",
|
||||
"1. In your workspace, select **Compute**.\n",
|
||||
"1. Select the **Notebook VMs** tab in the compute page.\n",
|
||||
"1. From the list, select the VM.\n",
|
||||
"1. Select **Stop**.\n",
|
||||
"1. When you're ready to use the server again, select **Start**.\n",
|
||||
@@ -683,19 +600,16 @@
|
||||
"\n",
|
||||
"See the [how-to](https://docs.microsoft.com/azure/machine-learning/service/how-to-create-your-first-pipeline?view=azure-devops) for additional detail on building pipelines with the machine learning SDK."
|
||||
]
|
||||
},
|
||||
{
|
||||
"cell_type": "code",
|
||||
"execution_count": null,
|
||||
"metadata": {},
|
||||
"outputs": [],
|
||||
"source": []
|
||||
}
|
||||
],
|
||||
"metadata": {
|
||||
"authors": [
|
||||
{
|
||||
"name": "sanpil"
|
||||
"name": [
|
||||
"sanpil",
|
||||
"trmccorm",
|
||||
"pansav"
|
||||
]
|
||||
}
|
||||
],
|
||||
"kernelspec": {
|
||||
@@ -3,7 +3,7 @@ dependencies:
|
||||
- pip:
|
||||
- azureml-sdk
|
||||
- azureml-pipeline-core
|
||||
- azureml-pipeline-steps
|
||||
- azureml-contrib-pipeline-steps
|
||||
- pandas
|
||||
- requests
|
||||
- azureml-widgets
|
||||
@@ -564,7 +564,8 @@
|
||||
"cell_type": "markdown",
|
||||
"metadata": {},
|
||||
"source": [
|
||||
"1. In your workspace, select **Notebook VMs**.\n",
|
||||
"1. In your workspace, select **Compute**.\n",
|
||||
"1. Select the **Notebook VMs** tab in the compute page.\n",
|
||||
"1. From the list, select the VM.\n",
|
||||
"1. Select **Stop**.\n",
|
||||
"1. When you're ready to use the server again, select **Start**."
|
||||
Some files were not shown because too many files have changed in this diff Show More
Reference in New Issue
Block a user