Mirror of https://github.com/Azure/MachineLearningNotebooks.git
Synced 2025-12-20 09:37:04 -05:00

Compare commits: azureml-sd...azureml-sd (45 commits)
Commits in this range (abbreviated SHAs):
d3f1212440, b95a65eef4, 2218af619f, 0401128638, 59fcb54998, e0ea99a6bb, b06f5ce269, ed0ce9e895, 71053d705b, 77f98bf75f, e443fd1342, 2165cf308e, 3d6caa10a3, 4df079db1c, 67d0b02ef9, 4e7b3784d5, ed91e39d7e, a09a1a16a7, 9662505517, 8e103c02ff, ecb5157add, d7d23d5e7c, 83a21ba53a, 3c9cb89c1a, cca7c2e26f, e895d7c2bf, 3588eb9665, a09e726f31, 4fb1d9ee5b, b05ff80e9d, 512630472b, ae1337fe70, c95f970dc8, 9b9d112719, fe8fcd4b48, 296ae01587, 8f4efe15eb, d179080467, 0040644e7a, 8aa04307fb, a525da4488, e149565a8a, 75610ec31c, 0c2c450b6b, 0d548eabff
README.md (10 changed lines)
@@ -2,7 +2,7 @@
 
 This repository contains example notebooks demonstrating the [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning-service/) Python SDK which allows you to build, train, deploy and manage machine learning solutions using Azure. The AML SDK allows you the choice of using local or cloud compute resources, while managing and maintaining the complete data science workflow from the cloud.
 
 
 
 
 ## Quick installation
@@ -13,15 +13,15 @@ Read more detailed instructions on [how to set up your environment](./NBSETUP.md)
 
 ## How to navigate and use the example notebooks?
 If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, you should always run the [Configuration](./configuration.ipynb) notebook first when setting up a notebook library on a new machine or in a new environment. It configures your notebook library to connect to an Azure Machine Learning workspace, and sets up your workspace and compute to be used by many of the other examples.
-This [index](.index.md) should assist in navigating the Azure Machine Learning notebook samples and encourage efficient retrieval of topics and content.
+This [index](./index.md) should assist in navigating the Azure Machine Learning notebook samples and encourage efficient retrieval of topics and content.
 
 If you want to...
 
-* ...try out and explore Azure ML, start with image classification tutorials: [Part 1 (Training)](./tutorials/img-classification-part1-training.ipynb) and [Part 2 (Deployment)](./tutorials/img-classification-part2-deploy.ipynb).
+* ...try out and explore Azure ML, start with image classification tutorials: [Part 1 (Training)](./tutorials/image-classification-mnist-data/img-classification-part1-training.ipynb) and [Part 2 (Deployment)](./tutorials/image-classification-mnist-data/img-classification-part2-deploy.ipynb).
 * ...learn about experimentation and tracking run history, first [train within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then try [training on remote VM](./how-to-use-azureml/training/train-on-remote-vm/train-on-remote-vm.ipynb) and [using logging APIs](./how-to-use-azureml/training/logging-api/logging-api.ipynb).
 * ...train deep learning models at scale, first learn about [Machine Learning Compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and then try [distributed hyperparameter tuning](./how-to-use-azureml/training-with-deep-learning/train-hyperparameter-tune-deploy-with-pytorch/train-hyperparameter-tune-deploy-with-pytorch.ipynb) and [distributed training](./how-to-use-azureml/training-with-deep-learning/distributed-pytorch-with-horovod/distributed-pytorch-with-horovod.ipynb).
-* ...deploy models as a realtime scoring service, first learn the basics by [training within Notebook and deploying to Azure Container Instance](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then learn how to [register and manage models, and create Docker images](./how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb), and [production deploy models on Azure Kubernetes Cluster](./how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb).
+* ...deploy models as a realtime scoring service, first learn the basics by [training within Notebook and deploying to Azure Container Instance](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then learn how to [production deploy models on Azure Kubernetes Cluster](./how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb).
-* ...deploy models as a batch scoring service, first [train a model within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), learn how to [register and manage models](./how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb), then [create Machine Learning Compute for scoring compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and [use Machine Learning Pipelines to deploy your model](https://aka.ms/pl-batch-scoring).
+* ...deploy models as a batch scoring service, first [train a model within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then [create Machine Learning Compute for scoring compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and [use Machine Learning Pipelines to deploy your model](https://aka.ms/pl-batch-scoring).
 * ...monitor your deployed models, learn about using [App Insights](./how-to-use-azureml/deployment/enable-app-insights-in-production-service/enable-app-insights-in-production-service.ipynb).
 
 ## Tutorials
@@ -103,7 +103,7 @@
    "source": [
     "import azureml.core\n",
     "\n",
-    "print(\"This notebook was created using version 1.0.76.1 of the Azure ML SDK\")\n",
+    "print(\"This notebook was created using version 1.2.0 of the Azure ML SDK\")\n",
     "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
    ]
   },
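The hunk above bumps the version the notebook was authored against from 1.0.76.1 to 1.2.0. A minimal sketch (not part of the upstream diff) for flagging a stale local install; it assumes only that the `packaging` distribution is available alongside the SDK:

```python
# Compare the installed Azure ML SDK against the version the notebooks now expect.
import azureml.core
from packaging.version import Version

if Version(azureml.core.VERSION) < Version("1.2.0"):
    # The upgrade path used throughout this repo is plain pip.
    print("SDK is older than 1.2.0; consider: pip install --upgrade azureml-sdk")
```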
@@ -9,7 +9,6 @@ As a pre-requisite, run the [configuration Notebook](../configuration.ipynb) not
 * [train-on-amlcompute](./training/train-on-amlcompute): Use a 1-n node Azure ML managed compute cluster for remote runs on Azure CPU or GPU infrastructure.
 * [train-on-remote-vm](./training/train-on-remote-vm): Use Data Science Virtual Machine as a target for remote runs.
 * [logging-api](./track-and-monitor-experiments/logging-api): Learn about the details of logging metrics to run history.
-* [register-model-create-image-deploy-service](./deployment/register-model-create-image-deploy-service): Learn about the details of model management.
 * [production-deploy-to-aks](./deployment/production-deploy-to-aks) Deploy a model to production at scale on Azure Kubernetes Service.
 * [enable-app-insights-in-production-service](./deployment/enable-app-insights-in-production-service) Learn how to use App Insights with production web service.
 
@@ -144,7 +144,7 @@ jupyter notebook
 - Dataset: forecasting for a bike-sharing
 - Example of training an automated ML forecasting model on multiple time-series
 
-- [automl-forecasting-function.ipynb](forecasting-high-frequency/automl-forecasting-function.ipynb)
+- [auto-ml-forecasting-function.ipynb](forecasting-high-frequency/auto-ml-forecasting-function.ipynb)
 - Example of training an automated ML forecasting model on multiple time-series
 
 - [auto-ml-forecasting-beer-remote.ipynb](forecasting-beer-remote/auto-ml-forecasting-beer-remote.ipynb)
@@ -197,6 +197,17 @@ If automl_setup_linux.sh fails on Ubuntu Linux with the error: `unable to execut
 4) Check that the region is one of the supported regions: `eastus2`, `eastus`, `westcentralus`, `southeastasia`, `westeurope`, `australiaeast`, `westus2`, `southcentralus`
 5) Check that you have access to the region using the Azure Portal.
 
+## import AutoMLConfig fails after upgrade from before 1.0.76 to 1.0.76 or later
+There were package changes in automated machine learning version 1.0.76, which require the previous version to be uninstalled before upgrading to the new version.
+If you have manually upgraded from a version of automated machine learning before 1.0.76 to 1.0.76 or later, you may get the error:
+`ImportError: cannot import name 'AutoMLConfig'`
+
+This can be resolved by running:
+`pip uninstall azureml-train-automl` and then
+`pip install azureml-train-automl`
+
+The automl_setup.cmd script does this automatically.
+
 ## workspace.from_config fails
 If the call `ws = Workspace.from_config()` fails:
 1) Make sure that you have run the `configuration.ipynb` notebook successfully.
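A quick sanity check once the two pip commands above have run; this is a minimal sketch assuming only that the reinstall completed:

```python
# Post-reinstall check: the import that previously failed should now resolve,
# and the SDK can report its installed version.
import azureml.core
from azureml.train.automl import AutoMLConfig  # raised ImportError before the reinstall

print("Azure ML SDK version:", azureml.core.VERSION)
```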
@@ -2,8 +2,9 @@ name: azure_automl
 dependencies:
 # The python interpreter version.
 # Currently Azure ML only supports 3.5.2 and later.
-- pip
+- pip<=19.3.1
 - python>=3.5.2,<3.6.8
+- wheel==0.30.0
 - nb_conda
 - matplotlib==2.1.0
 - numpy>=1.16.0,<=1.16.2
@@ -12,8 +13,7 @@ dependencies:
 - scipy>=1.0.0,<=1.1.0
 - scikit-learn>=0.19.0,<=0.20.3
 - pandas>=0.22.0,<=0.23.4
-- py-xgboost<=0.80
-- pyarrow>=0.11.0
+- py-xgboost<=0.90
 - fbprophet==0.5
 - pytorch=1.1.0
 - cudatoolkit=9.0
@@ -21,18 +21,18 @@ dependencies:
 - pip:
   # Required packages for AzureML execution, history, and data preparation.
   - azureml-defaults
+  - azureml-dataprep[pandas]
   - azureml-train-automl
   - azureml-train
   - azureml-widgets
-  - azureml-explain-model
   - azureml-pipeline
   - azureml-contrib-interpret
   - pytorch-transformers==1.0.0
   - spacy==2.1.8
-  - joblib
-  - onnxruntime==0.4.0
+  - onnxruntime==1.0.0
   - https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz
 
 channels:
+- anaconda
 - conda-forge
 - pytorch
@@ -2,9 +2,10 @@ name: azure_automl
 dependencies:
 # The python interpreter version.
 # Currently Azure ML only supports 3.5.2 and later.
-- pip
+- pip<=19.3.1
 - nomkl
 - python>=3.5.2,<3.6.8
+- wheel==0.30.0
 - nb_conda
 - matplotlib==2.1.0
 - numpy>=1.16.0,<=1.16.2
@@ -14,7 +15,6 @@ dependencies:
 - scikit-learn>=0.19.0,<=0.20.3
 - pandas>=0.22.0,<0.23.0
 - py-xgboost<=0.80
-- pyarrow>=0.11.0
 - fbprophet==0.5
 - pytorch=1.1.0
 - cudatoolkit=9.0
@@ -22,18 +22,18 @@ dependencies:
 - pip:
   # Required packages for AzureML execution, history, and data preparation.
   - azureml-defaults
+  - azureml-dataprep[pandas]
   - azureml-train-automl
   - azureml-train
   - azureml-widgets
-  - azureml-explain-model
   - azureml-pipeline
   - azureml-contrib-interpret
   - pytorch-transformers==1.0.0
   - spacy==2.1.8
-  - joblib
-  - onnxruntime==0.4.0
+  - onnxruntime==1.0.0
   - https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz
 
 channels:
+- anaconda
 - conda-forge
 - pytorch
@@ -14,7 +14,7 @@ IF "%CONDA_EXE%"=="" GOTO CondaMissing
 call conda activate %conda_env_name% 2>nul:
 
 if not errorlevel 1 (
-  echo Upgrading azureml-sdk[automl,notebooks,explain] in existing conda environment %conda_env_name%
+  echo Upgrading existing conda environment %conda_env_name%
   call pip uninstall azureml-train-automl -y -q
   call conda env update --name %conda_env_name% --file %automl_env_file%
   if errorlevel 1 goto ErrorExit
@@ -22,7 +22,7 @@ fi
 
 if source activate $CONDA_ENV_NAME 2> /dev/null
 then
-   echo "Upgrading azureml-sdk[automl,notebooks,explain] in existing conda environment" $CONDA_ENV_NAME
+   echo "Upgrading existing conda environment" $CONDA_ENV_NAME
    pip uninstall azureml-train-automl -y -q
    conda env update --name $CONDA_ENV_NAME --file $AUTOML_ENV_FILE &&
    jupyter nbextension uninstall --user --py azureml.widgets
@@ -22,7 +22,7 @@ fi
 
 if source activate $CONDA_ENV_NAME 2> /dev/null
 then
-   echo "Upgrading azureml-sdk[automl,notebooks,explain] in existing conda environment" $CONDA_ENV_NAME
+   echo "Upgrading existing conda environment" $CONDA_ENV_NAME
    pip uninstall azureml-train-automl -y -q
    conda env update --name $CONDA_ENV_NAME --file $AUTOML_ENV_FILE &&
    jupyter nbextension uninstall --user --py azureml.widgets
@@ -92,6 +92,32 @@
     "from azureml.explain.model._internal.explanation_client import ExplanationClient"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Accessing the Azure ML workspace requires authentication with Azure.\n",
+    "\n",
+    "The default authentication is interactive authentication using the default tenant. Executing the `ws = Workspace.from_config()` line in the cell below will prompt for authentication the first time that it is run.\n",
+    "\n",
+    "If you have multiple Azure tenants, you can specify the tenant by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
+    "\n",
+    "```\n",
+    "from azureml.core.authentication import InteractiveLoginAuthentication\n",
+    "auth = InteractiveLoginAuthentication(tenant_id = 'mytenantid')\n",
+    "ws = Workspace.from_config(auth = auth)\n",
+    "```\n",
+    "\n",
+    "If you need to run in an environment where interactive login is not possible, you can use Service Principal authentication by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
+    "\n",
+    "```\n",
+    "from azureml.core.authentication import ServicePrincipalAuthentication\n",
+    "auth = ServicePrincipalAuthentication('mytenantid', 'myappid', 'mypassword')\n",
+    "ws = Workspace.from_config(auth = auth)\n",
+    "```\n",
+    "\n",
+    "For more details, see [aka.ms/aml-notebook-auth](http://aka.ms/aml-notebook-auth)"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
@@ -285,9 +311,10 @@
     "|**task**|classification or regression or forecasting|\n",
     "|**primary_metric**|This is the metric that you want to optimize. Classification supports the following primary metrics: <br><i>accuracy</i><br><i>AUC_weighted</i><br><i>average_precision_score_weighted</i><br><i>norm_macro_recall</i><br><i>precision_score_weighted</i>|\n",
     "|**iteration_timeout_minutes**|Time limit in minutes for each iteration.|\n",
-    "|**blacklist_models** or **whitelist_models** |*List* of *strings* indicating machine learning algorithms for AutoML to avoid in this run.<br><br> Allowed values for **Classification**<br><i>LogisticRegression</i><br><i>SGD</i><br><i>MultinomialNaiveBayes</i><br><i>BernoulliNaiveBayes</i><br><i>SVM</i><br><i>LinearSVM</i><br><i>KNN</i><br><i>DecisionTree</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>GradientBoosting</i><br><i>TensorFlowDNN</i><br><i>TensorFlowLinearClassifier</i><br><br>Allowed values for **Regression**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><br>Allowed values for **Forecasting**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><i>Arima</i><br><i>Prophet</i>|\n",
+    "|**blacklist_models** | *List* of *strings* indicating machine learning algorithms for AutoML to avoid in this run. <br><br> Allowed values for **Classification**<br><i>LogisticRegression</i><br><i>SGD</i><br><i>MultinomialNaiveBayes</i><br><i>BernoulliNaiveBayes</i><br><i>SVM</i><br><i>LinearSVM</i><br><i>KNN</i><br><i>DecisionTree</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>GradientBoosting</i><br><i>TensorFlowDNN</i><br><i>TensorFlowLinearClassifier</i><br><br>Allowed values for **Regression**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><br>Allowed values for **Forecasting**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><i>Arima</i><br><i>Prophet</i>|\n",
+    "| **whitelist_models** | *List* of *strings* indicating machine learning algorithms for AutoML to use in this run. Same values listed above for **blacklist_models** allowed for **whitelist_models**.|\n",
     "|**experiment_exit_score**| Value indicating the target for *primary_metric*. <br>Once the target is surpassed the run terminates.|\n",
-    "|**experiment_timeout_minutes**| Maximum amount of time in minutes that all iterations combined can take before the experiment terminates.|\n",
+    "|**experiment_timeout_hours**| Maximum amount of time in hours that all iterations combined can take before the experiment terminates.|\n",
     "|**enable_early_stopping**| Flag to enable early termination if the score is not improving in the short term.|\n",
     "|**featurization**| 'auto' / 'off' Indicator for whether featurization step should be done automatically or not. Note: If the input data is sparse, featurization cannot be turned on.|\n",
     "|**n_cross_validations**|Number of cross validation splits.|\n",
@@ -304,7 +331,7 @@
    "outputs": [],
    "source": [
     "automl_settings = {\n",
-    "    \"experiment_timeout_minutes\" : 20,\n",
+    "    \"experiment_timeout_hours\" : 0.3,\n",
     "    \"enable_early_stopping\" : True,\n",
     "    \"iteration_timeout_minutes\": 5,\n",
     "    \"max_concurrent_iterations\": 4,\n",
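The table rows above correspond directly to `AutoMLConfig` keyword arguments. A minimal sketch of the wiring (not part of the diff), assuming the `compute_target` and training `TabularDataset` (`train_ds`) defined by the surrounding notebook; the label column name `"y"` is illustrative:

```python
# Feed the settings dict above into AutoMLConfig as keyword arguments.
import logging
from azureml.train.automl import AutoMLConfig

automl_settings = {
    "experiment_timeout_hours": 0.3,
    "enable_early_stopping": True,
    "iteration_timeout_minutes": 5,
    "max_concurrent_iterations": 4,
    "verbosity": logging.INFO,
}

automl_config = AutoMLConfig(task="classification",
                             primary_metric="accuracy",
                             compute_target=compute_target,   # assumed from earlier cells
                             training_data=train_ds,          # assumed from earlier cells
                             label_column_name="y",           # illustrative label column
                             **automl_settings)
```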
@@ -456,6 +483,72 @@
     "RunDetails(remote_run).show() "
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Retrieve the Best Model's explanation\n",
+    "Retrieve the explanation from the best_run which includes explanations for engineered features and raw features. Make sure that the run for generating explanations for the best model is completed."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Wait for the best model explanation run to complete\n",
+    "from azureml.core.run import Run\n",
+    "model_explainability_run_id = remote_run.get_properties().get('ModelExplainRunId')\n",
+    "print(model_explainability_run_id)\n",
+    "if model_explainability_run_id is not None:\n",
+    "    model_explainability_run = Run(experiment=experiment, run_id=model_explainability_run_id)\n",
+    "    model_explainability_run.wait_for_completion()\n",
+    "\n",
+    "# Get the best run object\n",
+    "best_run, fitted_model = remote_run.get_output()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Download engineered feature importance from artifact store\n",
+    "You can use ExplanationClient to download the engineered feature explanations from the artifact store of the best_run."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "client = ExplanationClient.from_run(best_run)\n",
+    "engineered_explanations = client.download_model_explanation(raw=False)\n",
+    "exp_data = engineered_explanations.get_feature_importance_dict()\n",
+    "exp_data"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "#### Download raw feature importance from artifact store\n",
+    "You can use ExplanationClient to download the raw feature explanations from the artifact store of the best_run."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "client = ExplanationClient.from_run(best_run)\n",
+    "engineered_explanations = client.download_model_explanation(raw=True)\n",
+    "exp_data = engineered_explanations.get_feature_importance_dict()\n",
+    "exp_data"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -572,20 +665,6 @@
     "best_run, fitted_model = remote_run.get_output()"
    ]
   },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "import os\n",
-    "import shutil\n",
-    "\n",
-    "sript_folder = os.path.join(os.getcwd(), 'inference')\n",
-    "project_folder = '/inference'\n",
-    "os.makedirs(project_folder, exist_ok=True)"
-   ]
-  },
   {
    "cell_type": "code",
    "execution_count": null,
@@ -639,10 +718,10 @@
     "from azureml.core.webservice import AciWebservice\n",
     "from azureml.core.webservice import Webservice\n",
     "from azureml.core.model import Model\n",
+    "from azureml.core.environment import Environment\n",
     "\n",
-    "inference_config = InferenceConfig(runtime = \"python\", \n",
-    "                                   entry_script = script_file_name,\n",
-    "                                   conda_file = conda_env_file_name)\n",
+    "myenv = Environment.from_conda_specification(name=\"myenv\", file_path=conda_env_file_name)\n",
+    "inference_config = InferenceConfig(entry_script=script_file_name, environment=myenv)\n",
     "\n",
     "aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
     "                                               memory_gb = 1, \n",
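Pulled out of notebook-JSON form, the new Environment-based deployment pattern reads end to end as below. This is a sketch, assuming `ws`, a registered `model`, and the `script_file_name`/`conda_env_file_name` variables from earlier cells; the service name is illustrative:

```python
# The Environment object replaces the old runtime/conda_file arguments.
from azureml.core.environment import Environment
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

myenv = Environment.from_conda_specification(name="myenv", file_path=conda_env_file_name)
inference_config = InferenceConfig(entry_script=script_file_name, environment=myenv)
aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

# Deploy and block until the ACI endpoint is up ("automl-aci-service" is illustrative).
service = Model.deploy(ws, "automl-aci-service", [model], inference_config, aciconfig)
service.wait_for_deployment(show_output=True)
```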
@@ -2,12 +2,9 @@ name: auto-ml-classification-bank-marketing-all-features
 dependencies:
 - pip:
   - azureml-sdk
-  - interpret
-  - azureml-defaults
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
-  - onnxruntime==0.4.0
+  - onnxruntime==1.0.0
   - azureml-explain-model
   - azureml-contrib-interpret
@@ -122,35 +122,22 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "from azureml.core.compute import AmlCompute\n",
-    "from azureml.core.compute import ComputeTarget\n",
+    "from azureml.core.compute import ComputeTarget, AmlCompute\n",
+    "from azureml.core.compute_target import ComputeTargetException\n",
     "\n",
-    "# Choose a name for your AmlCompute cluster.\n",
-    "amlcompute_cluster_name = \"cpu-cluster-1\"\n",
+    "# Choose a name for your CPU cluster\n",
+    "cpu_cluster_name = \"cpu-cluster-1\"\n",
     "\n",
-    "found = False\n",
-    "# Check if this compute target already exists in the workspace.\n",
-    "cts = ws.compute_targets\n",
-    "if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'cpu-cluster-1':\n",
-    "    found = True\n",
-    "    print('Found existing compute target.')\n",
-    "    compute_target = cts[amlcompute_cluster_name]\n",
-    "    \n",
-    "if not found:\n",
-    "    print('Creating a new compute target...')\n",
-    "    provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_DS12_V2\", # for GPU, use \"STANDARD_NC6\"\n",
-    "                                                                #vm_priority = 'lowpriority', # optional\n",
-    "                                                                max_nodes = 6)\n",
+    "# Verify that cluster does not exist already\n",
+    "try:\n",
+    "    compute_target = ComputeTarget(workspace=ws, name=cpu_cluster_name)\n",
+    "    print('Found existing cluster, use it.')\n",
+    "except ComputeTargetException:\n",
+    "    compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_DS12_V2',\n",
+    "                                                           max_nodes=6)\n",
+    "    compute_target = ComputeTarget.create(ws, cpu_cluster_name, compute_config)\n",
     "\n",
-    "    # Create the cluster.\n",
-    "    compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
-    "    \n",
-    "print('Checking cluster status...')\n",
-    "# Can poll for a minimum number of nodes and for a specific timeout.\n",
-    "# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
-    "compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
-    "\n",
-    "# For a more detailed view of current AmlCompute status, use get_status()."
+    "compute_target.wait_for_completion(show_output=True)"
    ]
   },
   {
@@ -210,10 +197,9 @@
     "automl_settings = {\n",
     "    \"n_cross_validations\": 3,\n",
     "    \"primary_metric\": 'average_precision_score_weighted',\n",
-    "    \"preprocess\": True,\n",
     "    \"enable_early_stopping\": True,\n",
     "    \"max_concurrent_iterations\": 2, # This is a limit for testing purpose, please increase it as per cluster size\n",
-    "    \"experiment_timeout_minutes\": 10, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ability to find the best model possible\n",
+    "    \"experiment_timeout_hours\": 0.25, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ability to find the best model possible\n",
     "    \"verbosity\": logging.INFO,\n",
     "}\n",
     "\n",
@@ -283,7 +269,11 @@
   {
    "cell_type": "code",
    "execution_count": null,
-   "metadata": {},
+   "metadata": {
+    "tags": [
+     "widget-rundetails-sample"
+    ]
+   },
    "outputs": [],
    "source": [
     "from azureml.widgets import RunDetails\n",
@@ -305,7 +295,7 @@
    "source": [
     "#### Explain model\n",
     "\n",
-    "Automated ML models can be explained and visualized using the SDK Explainability library. [Learn how to use the explainer](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/model-explanation-remote-amlcompute/auto-ml-model-explanations-remote-compute.ipynb)."
+    "Automated ML models can be explained and visualized using the SDK Explainability library. "
    ]
   },
   {
@@ -334,17 +324,7 @@
    "metadata": {},
    "source": [
     "#### Print the properties of the model\n",
-    "The fitted_model is a python object and you can read the different properties of the object.\n",
-    "See *Print the properties of the model* section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification/auto-ml-classification.ipynb)."
-   ]
-  },
-  {
-   "cell_type": "markdown",
-   "metadata": {},
-   "source": [
-    "### Deploy\n",
-    "\n",
-    "To deploy the model into a web service endpoint, see _Deploy_ section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-with-deployment/auto-ml-classification-with-deployment.ipynb)"
+    "The fitted_model is a python object and you can read the different properties of the object.\n"
    ]
   },
   {
@@ -2,10 +2,7 @@ name: auto-ml-classification-credit-card-fraud
 dependencies:
 - pip:
   - azureml-sdk
-  - interpret
-  - azureml-defaults
-  - azureml-explain-model
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
+  - azureml-explain-model
@@ -121,9 +121,9 @@
    "metadata": {},
    "source": [
     "## Set up a compute cluster\n",
-    "This section uses a user-provided compute cluster (named \"cpu-cluster\" in this example). If a cluster with this name does not exist in the user's workspace, the below code will create a new cluster. You can choose the parameters of the cluster as mentioned in the comments.\n",
+    "This section uses a user-provided compute cluster (named \"dnntext-cluster\" in this example). If a cluster with this name does not exist in the user's workspace, the below code will create a new cluster. You can choose the parameters of the cluster as mentioned in the comments.\n",
     "\n",
-    "Whether you provide/select a CPU or GPU cluster, AutoML will choose the appropriate DNN for that setup - BiLSTM or BERT text featurizer will be included in the candidate featurizers on CPU and GPU respectively."
+    "Whether you provide/select a CPU or GPU cluster, AutoML will choose the appropriate DNN for that setup - BiLSTM or BERT text featurizer will be included in the candidate featurizers on CPU and GPU respectively. If your goal is to obtain the most accurate model, we recommend you use GPU clusters since BERT featurizers usually outperform BiLSTM featurizers."
    ]
   },
   {
@@ -133,7 +133,7 @@
    "outputs": [],
    "source": [
     "# Choose a name for your cluster.\n",
-    "amlcompute_cluster_name = \"cpu-dnntext\"\n",
+    "amlcompute_cluster_name = \"dnntext-cluster\"\n",
     "\n",
     "found = False\n",
     "# Check if this compute target already exists in the workspace.\n",
@@ -145,11 +145,11 @@
     "\n",
     "if not found:\n",
     "    print('Creating a new compute target...')\n",
-    "    provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # CPU for BiLSTM\n",
-    "                                                                # To use BERT, select a GPU such as \"STANDARD_NC6\" \n",
+    "    provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_NC6\", # CPU for BiLSTM, such as \"STANDARD_D2_V2\" \n",
+    "                                                                # To use BERT (this is recommended for best performance), select a GPU such as \"STANDARD_NC6\" \n",
     "                                                                # or similar GPU option\n",
     "                                                                # available in your workspace\n",
-    "                                                                max_nodes = 6)\n",
+    "                                                                max_nodes = 1)\n",
     "\n",
     "    # Create the cluster\n",
     "    compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
@@ -218,7 +218,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Featch data and upload to datastore for use in training"
+    "#### Fetch data and upload to datastore for use in training"
    ]
   },
   {
@@ -275,7 +275,6 @@
     "automl_settings = {\n",
     "    \"experiment_timeout_minutes\": 20,\n",
     "    \"primary_metric\": 'accuracy',\n",
-    "    \"preprocess\": True,\n",
     "    \"max_concurrent_iterations\": 4, \n",
     "    \"max_cores_per_iteration\": -1,\n",
     "    \"enable_dnn\": True,\n",
@@ -348,7 +347,26 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "#best_run, fitted_model = automl_run.get_output()"
+    "best_run, fitted_model = automl_run.get_output()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "You can now see what text transformations are used to convert text data to features for this dataset, including deep learning transformations based on BiLSTM or Transformer (BERT is one implementation of a Transformer) models."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "text_transformations_used = []\n",
+    "for column_group in fitted_model.named_steps['datatransformer'].get_featurization_summary():\n",
+    "    text_transformations_used.extend(column_group['Transformations'])\n",
+    "text_transformations_used"
    ]
   },
   {
@@ -519,12 +537,12 @@
     "name": "anshirga"
    }
   ],
-  "datasets": [
-   "None"
-  ],
   "compute": [
    "AML Compute"
   ],
+  "datasets": [
+   "None"
+  ],
   "deployment": [
    "None"
   ],
@@ -3,8 +3,11 @@ dependencies:
 - pip:
   - azureml-sdk
   - azureml-train-automl
-  - azureml-train
   - azureml-widgets
   - matplotlib
-  - pandas_ml
-  - statsmodels
+  - azureml-train
+  - https://download.pytorch.org/whl/cpu/torch-1.1.0-cp35-cp35m-win_amd64.whl
+  - sentencepiece==0.1.82
+  - pytorch-transformers==1.0
+  - spacy==2.1.8
+  - https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz
@@ -197,7 +197,7 @@
     "conda_run_config.environment.docker.base_image = azureml.core.runconfig.DEFAULT_CPU_IMAGE\n",
     "\n",
     "cd = CondaDependencies.create(pip_packages=['azureml-sdk[automl]', 'applicationinsights', 'azureml-opendatasets'], \n",
-    "                              conda_packages=['numpy', 'py-xgboost'], \n",
+    "                              conda_packages=['numpy==1.16.2'], \n",
     "                              pin_sdk_version=False)\n",
     "#cd.add_pip_package('azureml-explain-model')\n",
     "conda_run_config.environment.python.conda_dependencies = cd\n",
@@ -210,7 +210,24 @@
    "metadata": {},
    "source": [
     "## Data Ingestion Pipeline \n",
-    "For this demo, we will use NOAA weather data from [Azure Open Datasets](https://azure.microsoft.com/services/open-datasets/). You can replace this with your own dataset, or you can skip this pipeline if you already have a time-series based `TabularDataset`.\n",
+    "For this demo, we will use NOAA weather data from [Azure Open Datasets](https://azure.microsoft.com/services/open-datasets/). You can replace this with your own dataset, or you can skip this pipeline if you already have a time-series based `TabularDataset`.\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# The name and target column of the Dataset to create \n",
+    "dataset = \"NOAA-Weather-DS4\"\n",
+    "target_column_name = \"temperature\""
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
     "\n",
     "### Upload Data Step\n",
     "The data ingestion pipeline has a single step with a script to query the latest weather data and upload it to the blob store. During the first run, the script will create and register a time-series based `TabularDataset` with the past one week of weather data. For each subsequent run, the script will create a partition in the blob store by querying NOAA for new weather data since the last modified time of the dataset (`dataset.data_changed_time`) and creating a data.csv file."
@@ -225,8 +242,6 @@
     "from azureml.pipeline.core import Pipeline, PipelineParameter\n",
     "from azureml.pipeline.steps import PythonScriptStep\n",
     "\n",
-    "# The name of the Dataset to create \n",
-    "dataset = \"NOAA-Weather-DS4\"\n",
     "ds_name = PipelineParameter(name=\"ds_name\", default_value=dataset)\n",
     "upload_data_step = PythonScriptStep(script_name=\"upload_weather_data.py\", \n",
     "                                    allow_reuse=False,\n",
@@ -262,7 +277,7 @@
    "metadata": {},
    "outputs": [],
    "source": [
-    "data_pipeline_run.wait_for_completion()"
+    "data_pipeline_run.wait_for_completion(show_output=False)"
    ]
   },
   {
@@ -272,7 +287,7 @@
     "## Training Pipeline\n",
     "### Prepare Training Data Step\n",
     "\n",
-    "Script to bring data into common X,y format. We need to set allow_reuse flag to False to allow the pipeline to run even when inputs don't change. We also need the name of the model to check the time the model was last trained."
+    "Script to check if new data is available since the model was last trained. If no new data is available, we cancel the remaining pipeline steps. We need to set the allow_reuse flag to False to allow the pipeline to run even when inputs don't change. We also need the name of the model to check the time the model was last trained."
    ]
   },
   {
@@ -283,11 +298,8 @@
    "source": [
     "from azureml.pipeline.core import PipelineData\n",
     "\n",
-    "target_column = PipelineParameter(\"target_column\", default_value=\"y\")\n",
     "# The model name with which to register the trained model in the workspace.\n",
-    "model_name = PipelineParameter(\"model_name\", default_value=\"y\")\n",
-    "output_x = PipelineData(\"output_x\", datastore=dstor)\n",
-    "output_y = PipelineData(\"output_y\", datastore=dstor)"
+    "model_name = PipelineParameter(\"model_name\", default_value=\"noaaweatherds\")"
    ]
   },
   {
@@ -299,16 +311,23 @@
     "data_prep_step = PythonScriptStep(script_name=\"check_data.py\", \n",
     "                                  allow_reuse=False,\n",
     "                                  name=\"check_data\",\n",
-    "                                  arguments=[\"--target_column\", target_column,\n",
-    "                                             \"--output_x\", output_x,\n",
-    "                                             \"--output_y\", output_y,\n",
-    "                                             \"--ds_name\", ds_name,\n",
-    "                                             \"--model_name\", model_name],\n",
-    "                                  outputs=[output_x, output_y], \n",
+    "                                  arguments=[\"--ds_name\", ds_name,\n",
+    "                                             \"--model_name\", model_name],\n",
     "                                  compute_target=compute_target, \n",
     "                                  runconfig=conda_run_config)"
    ]
   },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from azureml.core import Dataset\n",
+    "train_ds = Dataset.get_by_name(ws, dataset)\n",
+    "train_ds = train_ds.drop_columns([\"partition_date\"])"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},
@@ -324,14 +343,13 @@
    "outputs": [],
    "source": [
     "from azureml.train.automl import AutoMLConfig\n",
-    "from azureml.train.automl.runtime import AutoMLStep\n",
+    "from azureml.pipeline.steps import AutoMLStep\n",
     "\n",
     "automl_settings = {\n",
-    "    \"iteration_timeout_minutes\": 20,\n",
-    "    \"experiment_timeout_minutes\": 30,\n",
+    "    \"iteration_timeout_minutes\": 10,\n",
+    "    \"experiment_timeout_hours\": 0.25,\n",
     "    \"n_cross_validations\": 3,\n",
     "    \"primary_metric\": 'r2_score',\n",
-    "    \"preprocess\": True,\n",
     "    \"max_concurrent_iterations\": 3,\n",
     "    \"max_cores_per_iteration\": -1,\n",
     "    \"verbosity\": logging.INFO,\n",
@@ -342,8 +360,8 @@
     "                             debug_log = 'automl_errors.log',\n",
     "                             path = \".\",\n",
     "                             compute_target=compute_target,\n",
-    "                             run_configuration=conda_run_config,\n",
-    "                             data_script = \"get_data.py\",\n",
+    "                             training_data = train_ds,\n",
+    "                             label_column_name = target_column_name,\n",
     "                             **automl_settings\n",
     "                             )"
    ]
@@ -359,7 +377,7 @@
     "metrics_output_name = 'metrics_output'\n",
     "best_model_output_name = 'best_model_output'\n",
     "\n",
-    "metirics_data = PipelineData(name='metrics_data',\n",
+    "metrics_data = PipelineData(name='metrics_data',\n",
     "                             datastore=dstor,\n",
     "                             pipeline_output_name=metrics_output_name,\n",
     "                             training_output=TrainingOutput(type='Metrics'))\n",
@@ -378,8 +396,7 @@
     "automl_step = AutoMLStep(\n",
     "    name='automl_module',\n",
     "    automl_config=automl_config,\n",
-    "    inputs=[output_x, output_y],\n",
-    "    outputs=[metirics_data, model_data],\n",
+    "    outputs=[metrics_data, model_data],\n",
     "    allow_reuse=False)"
    ]
   },
@@ -432,7 +449,7 @@
    "outputs": [],
    "source": [
     "training_pipeline_run = experiment.submit(training_pipeline, pipeline_parameters={\n",
-    "    \"target_column\": \"temperature\", \"ds_name\": dataset, \"model_name\": \"noaaweatherds\"})"
+    "    \"ds_name\": dataset, \"model_name\": \"noaaweatherds\"})"
    ]
   },
   {
@@ -475,7 +492,7 @@
    "source": [
     "from azureml.pipeline.core import Schedule\n",
     "schedule = Schedule.create(workspace=ws, name=\"RetrainingSchedule\",\n",
-    "                           pipeline_parameters={\"target_column\": \"temperature\",\"ds_name\": dataset, \"model_name\": \"noaaweatherds\"},\n",
+    "                           pipeline_parameters={\"ds_name\": dataset, \"model_name\": \"noaaweatherds\"},\n",
     "                           pipeline_id=published_pipeline.id, \n",
     "                           experiment_name=experiment_name, \n",
     "                           datastore=dstor,\n",
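The schedule above triggers on datastore changes (note the `datastore=dstor` argument). A time-based recurrence is the other common pattern; a minimal sketch (not part of the diff) assuming the same `ws`, `published_pipeline`, `experiment_name`, and `dataset` objects, with an illustrative schedule name:

```python
# Hypothetical time-based alternative: run the retraining pipeline daily
# instead of polling the datastore for new files.
from azureml.pipeline.core.schedule import Schedule, ScheduleRecurrence

recurrence = ScheduleRecurrence(frequency="Day", interval=1)
schedule = Schedule.create(workspace=ws, name="DailyRetrainingSchedule",
                           pipeline_id=published_pipeline.id,
                           experiment_name=experiment_name,
                           pipeline_parameters={"ds_name": dataset, "model_name": "noaaweatherds"},
                           recurrence=recurrence)
```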
@@ -3,7 +3,6 @@ dependencies:
 - pip:
   - azureml-sdk
   - azureml-train-automl
-  - azureml-pipeline
   - azureml-widgets
   - matplotlib
-  - pandas_ml
+  - azureml-pipeline
@@ -15,32 +15,16 @@ if type(run) == _OfflineRun:
 else:
     ws = run.experiment.workspace
 
-
-def write_output(df, path):
-    os.makedirs(path, exist_ok=True)
-    print("%s created" % path)
-    df.to_csv(path + "/part-00000", index=False)
-
-
-print("Check for new data and prepare the data")
+print("Check for new data.")
 
 parser = argparse.ArgumentParser("split")
-parser.add_argument("--target_column", type=str, help="input split features")
 parser.add_argument("--ds_name", help="input dataset name")
 parser.add_argument("--model_name", help="name of the deployed model")
-parser.add_argument("--output_x", type=str,
-                    help="output features")
-parser.add_argument("--output_y", type=str,
-                    help="output labels")
 
 args = parser.parse_args()
 
 print("Argument 1(ds_name): %s" % args.ds_name)
-print("Argument 2(target_column): %s" % args.target_column)
-print("Argument 3(model_name): %s" % args.model_name)
+print("Argument 2(model_name): %s" % args.model_name)
 
 # Get the latest registered model
 try:
@@ -54,22 +38,9 @@ except Exception as e:
|
|||||||
train_ds = Dataset.get_by_name(ws, args.ds_name)
|
train_ds = Dataset.get_by_name(ws, args.ds_name)
|
||||||
dataset_changed_time = train_ds.data_changed_time
|
dataset_changed_time = train_ds.data_changed_time
|
||||||
|
|
||||||
if dataset_changed_time > last_train_time:
|
if not dataset_changed_time > last_train_time:
|
||||||
# New data is available since the model was last trained
|
|
||||||
print("Dataset was last updated on {0}. Retraining...".format(dataset_changed_time))
|
|
||||||
train_ds = train_ds.drop_columns(["partition_date"])
|
|
||||||
X_train = train_ds.drop_columns(
|
|
||||||
columns=[args.target_column]).to_pandas_dataframe()
|
|
||||||
y_train = train_ds.keep_columns(
|
|
||||||
columns=[args.target_column]).to_pandas_dataframe()
|
|
||||||
|
|
||||||
non_null = y_train[args.target_column].notnull()
|
|
||||||
y = y_train[non_null]
|
|
||||||
X = X_train[non_null]
|
|
||||||
|
|
||||||
if not (args.output_x is None and args.output_y is None):
|
|
||||||
write_output(X, args.output_x)
|
|
||||||
write_output(y, args.output_y)
|
|
||||||
else:
|
|
||||||
print("Cancelling run since there is no new data.")
|
print("Cancelling run since there is no new data.")
|
||||||
run.parent.cancel()
|
run.parent.cancel()
|
||||||
|
else:
|
||||||
|
# New data is available since the model was last trained
|
||||||
|
print("Dataset was last updated on {0}. Retraining...".format(dataset_changed_time))
|
||||||
|
|||||||
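The rewritten check inverts the guard: the pipeline run is cancelled when the dataset has not changed since the last registered model, and retraining proceeds otherwise. A minimal sketch of the resulting control flow, assuming `run`, `train_ds`, and `last_train_time` as defined in the script:

```python
# Sketch of the inverted guard; `run`, `train_ds`, and `last_train_time`
# are assumed from the surrounding script.
dataset_changed_time = train_ds.data_changed_time

if not dataset_changed_time > last_train_time:
    # Nothing new since the last registered model: stop the whole pipeline.
    print("Cancelling run since there is no new data.")
    run.parent.cancel()
else:
    # New data is available since the model was last trained.
    print("Dataset was last updated on {0}. Retraining...".format(dataset_changed_time))
```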
@@ -1,15 +0,0 @@
-import os
-import pandas as pd
-
-
-def get_data():
-    print("In get_data")
-    print(os.environ['AZUREML_DATAREFERENCE_output_x'])
-    X_train = pd.read_csv(
-        os.environ['AZUREML_DATAREFERENCE_output_x'] + "/part-00000")
-    y_train = pd.read_csv(
-        os.environ['AZUREML_DATAREFERENCE_output_y'] + "/part-00000")
-
-    print(X_train.head(3))
-
-    return {"X": X_train.values, "y": y_train.values.flatten()}
@@ -58,7 +58,7 @@ except Exception as e:
     print(traceback.format_exc())
     print("Dataset with name {0} not found, registering new dataset.".format(args.ds_name))
     register_dataset = True
-    end_time_last_slice = datetime.today() - relativedelta(weeks=1)
+    end_time_last_slice = datetime.today() - relativedelta(weeks=2)
 
 end_time = datetime.utcnow()
 train_df = get_noaa_data(end_time_last_slice, end_time)
@@ -80,10 +80,10 @@ if train_df.size > 0:
               target_path=folder_name,
               overwrite=True,
               show_progress=True)
-
-    if register_dataset:
-        ds = Dataset.Tabular.from_delimited_files(dstor.path("{}/**/*.csv".format(
-            args.ds_name)), partition_format='/{partition_date:yyyy/MM/dd/hh/mm/ss}/data.csv')
-        ds.register(ws, name=args.ds_name)
 else:
     print("No new data since {0}.".format(end_time_last_slice))
+
+if register_dataset:
+    ds = Dataset.Tabular.from_delimited_files(dstor.path("{}/**/*.csv".format(
+        args.ds_name)), partition_format='/{partition_date:yyyy/MM/dd/HH/mm/ss}/data.csv')
+    ds.register(ws, name=args.ds_name)
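Two behavioral fixes land in this hunk: dataset registration moves out of the `if train_df.size > 0` branch so it also runs when no new rows were uploaded, and the partition format's hour field becomes `HH` (24-hour). A minimal sketch, assuming `dstor`, `ws`, `register_dataset`, and `args.ds_name` as in the script:

```python
# Sketch only: `dstor`, `ws`, `register_dataset`, and `args.ds_name`
# are assumed from the surrounding script.
from azureml.core.dataset import Dataset

if register_dataset:  # now evaluated regardless of whether new rows arrived
    ds = Dataset.Tabular.from_delimited_files(
        dstor.path("{}/**/*.csv".format(args.ds_name)),
        # 'HH' is the 24-hour hour component; the old 'hh' (12-hour) pattern
        # can mis-parse partition paths for hours past noon.
        partition_format='/{partition_date:yyyy/MM/dd/HH/mm/ss}/data.csv')
    ds.register(ws, name=args.ds_name)
```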
@@ -358,7 +358,7 @@
 "\n",
 "automl_config = AutoMLConfig(task='forecasting', \n",
 " primary_metric='normalized_root_mean_squared_error',\n",
-" experiment_timeout_minutes = 60,\n",
+" experiment_timeout_hours = 1,\n",
 " training_data=train_dataset,\n",
 " label_column_name=target_column_name,\n",
 " validation_data=valid_dataset, \n",
@@ -1,12 +1,10 @@
 name: auto-ml-forecasting-beer-remote
 dependencies:
-- fbprophet==0.5
-- py-xgboost<=0.80
+- py-xgboost<=0.90
 - pip:
   - azureml-sdk
+  - numpy==1.16.2
   - azureml-train-automl
-  - azureml-train
   - azureml-widgets
   - matplotlib
-  - pandas_ml
-  - statsmodels
+  - azureml-train
@@ -76,9 +76,12 @@ def get_result_df(remote_run):
 def run_inference(test_experiment, compute_target, script_folder, train_run,
                   test_dataset, lookback_dataset, max_horizon,
                   target_column_name, time_column_name, freq):
-    train_run.download_file('outputs/model.pkl', 'inference/model.pkl')
-    train_run.download_file('outputs/conda_env_v_1_0_0.yml',
-                            'inference/condafile.yml')
+    model_base_name = 'model.pkl'
+    if 'model_data_location' in train_run.properties:
+        model_location = train_run.properties['model_data_location']
+        _, model_base_name = model_location.rsplit('/', 1)
+    train_run.download_file('outputs/{}'.format(model_base_name), 'inference/{}'.format(model_base_name))
+    train_run.download_file('outputs/conda_env_v_1_0_0.yml', 'inference/condafile.yml')
 
     inference_env = Environment("myenv")
     inference_env.docker.enabled = True
@@ -91,7 +94,8 @@ def run_inference(test_experiment, compute_target, script_folder, train_run,
                       '--max_horizon': max_horizon,
                       '--target_column_name': target_column_name,
                       '--time_column_name': time_column_name,
-                      '--frequency': freq
+                      '--frequency': freq,
+                      '--model_path': model_base_name
                   },
                   inputs=[test_dataset.as_named_input('test_data'),
                           lookback_dataset.as_named_input('lookback_data')],
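The inference helper no longer hard-codes `model.pkl`; it derives the artifact name from the run's `model_data_location` property when present. A minimal sketch of that derivation, assuming `train_run` is the training run object used by the helper:

```python
# Sketch only: `train_run` is assumed to be the training Run whose
# properties may carry the artifact path of the best model.
model_base_name = 'model.pkl'  # fallback for runs without the property
if 'model_data_location' in train_run.properties:
    model_location = train_run.properties['model_data_location']
    # keep only the filename, e.g. '.../outputs/model_123.pkl' -> 'model_123.pkl'
    _, model_base_name = model_location.rsplit('/', 1)
train_run.download_file('outputs/{}'.format(model_base_name),
                        'inference/{}'.format(model_base_name))
```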
@@ -232,6 +232,9 @@ parser.add_argument(
 parser.add_argument(
     '--frequency', type=str, dest='freq',
     help='Frequency of prediction')
+parser.add_argument(
+    '--model_path', type=str, dest='model_path',
+    default='model.pkl', help='Filename of model to be loaded')
 
 
 args = parser.parse_args()
@@ -239,6 +242,7 @@ max_horizon = args.max_horizon
 target_column_name = args.target_column_name
 time_column_name = args.time_column_name
 freq = args.freq
+model_path = args.model_path
 
 
 print('args passed are: ')
@@ -246,6 +250,7 @@ print(max_horizon)
 print(target_column_name)
 print(time_column_name)
 print(freq)
+print(model_path)
 
 run = Run.get_context()
 # get input dataset by name
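`--model_path` defaults to `model.pkl`, so existing callers keep working while newer runs can pass the derived filename. A small, hypothetical parser check illustrating the default:

```python
# Hypothetical, self-contained check of the new argument's behavior.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--model_path', type=str, dest='model_path',
                    default='model.pkl', help='Filename of model to be loaded')

print(parser.parse_args([]).model_path)                                # -> model.pkl
print(parser.parse_args(['--model_path', 'my_model.pkl']).model_path)  # -> my_model.pkl
```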
@@ -267,7 +272,8 @@ X_lookback_df = lookback_dataset.drop_columns(columns=[target_column_name])
 y_lookback_df = lookback_dataset.with_timestamp_columns(
     None).keep_columns(columns=[target_column_name])
 
-fitted_model = joblib.load('model.pkl')
+fitted_model = joblib.load(model_path)
+
 
 if hasattr(fitted_model, 'get_lookback'):
     lookback = fitted_model.get_lookback()
@@ -42,7 +42,7 @@
 "\n",
 "AutoML highlights here include built-in holiday featurization, accessing engineered feature names, and working with the `forecast` function. Please also look at the additional forecasting notebooks, which document lagging, rolling windows, forecast quantiles, other ways to use the forecast function, and forecaster deployment.\n",
 "\n",
-"Make sure you have executed the [configuration](../configuration.ipynb) before running this notebook.\n",
+"Make sure you have executed the [configuration notebook](../../../configuration.ipynb) before running this notebook.\n",
 "\n",
 "Notebook synopsis:\n",
 "1. Creating an Experiment in an existing Workspace\n",
@@ -202,7 +202,7 @@
 "outputs": [],
 "source": [
 "dataset = Dataset.Tabular.from_delimited_files(path = [(datastore, 'dataset/bike-no.csv')]).with_timestamp_columns(fine_grain_timestamp=time_column_name) \n",
-"dataset.take(5).to_pandas_dataframe()"
+"dataset.take(5).to_pandas_dataframe().reset_index(drop=True)"
 ]
 },
 {
@@ -221,8 +221,8 @@
 "outputs": [],
 "source": [
 "# select data that occurs before a specified date\n",
-"train = dataset.time_before(datetime(2012, 9, 1))\n",
-"train.to_pandas_dataframe().tail(5)"
+"train = dataset.time_before(datetime(2012, 8, 31), include_boundary=True)\n",
+"train.to_pandas_dataframe().tail(5).reset_index(drop=True)"
 ]
 },
 {
@@ -231,8 +231,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"test = dataset.time_after(datetime(2012, 8, 31))\n",
-"test.to_pandas_dataframe().head(5)"
+"test = dataset.time_after(datetime(2012, 9, 1), include_boundary=True)\n",
+"test.to_pandas_dataframe().head(5).reset_index(drop=True)"
 ]
 },
 {
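The split boundaries change so the two partitions no longer share August 31: train now ends at the boundary inclusively and test starts at September 1 inclusively. A minimal sketch of the non-overlapping split, assuming `dataset` is a time-stamped `TabularDataset` as in the notebook:

```python
# Sketch only: `dataset` is assumed to be a TabularDataset with a
# fine-grain timestamp column already set.
from datetime import datetime

# Everything up to and including 2012-08-31 goes to training...
train = dataset.time_before(datetime(2012, 8, 31), include_boundary=True)
# ...and everything from 2012-09-01 onward goes to test, so the two
# partitions no longer overlap on the boundary day.
test = dataset.time_after(datetime(2012, 9, 1), include_boundary=True)
```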
@@ -247,8 +247,8 @@
 "|-|-|\n",
 "|**task**|forecasting|\n",
 "|**primary_metric**|This is the metric that you want to optimize.<br> Forecasting supports the following primary metrics <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>\n",
-"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl/azureml.train.automl.constants.supportedmodels.regression?view=azure-ml-py).|\n",
-"|**experiment_timeout_minutes**|Experimentation timeout in minutes.|\n",
+"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl-client/azureml.train.automl.constants.supportedmodels.forecasting?view=azure-ml-py).|\n",
+"|**experiment_timeout_hours**|Experimentation timeout in hours.|\n",
 "|**training_data**|Input dataset, containing both features and label column.|\n",
 "|**label_column_name**|The name of the label column.|\n",
 "|**compute_target**|The remote compute for training.|\n",
@@ -260,7 +260,7 @@
 "|**target_lags**|The target_lags specifies how far back we will construct the lags of the target variable.|\n",
 "|**drop_column_names**|Name(s) of columns to drop prior to modeling|\n",
 "\n",
-"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_minutes parameter value to get results."
+"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_hours parameter value to get results."
 ]
 },
 {
@@ -305,7 +305,7 @@
 "automl_config = AutoMLConfig(task='forecasting', \n",
 " primary_metric='normalized_root_mean_squared_error',\n",
 " blacklist_models = ['ExtremeRandomTrees'], \n",
-" experiment_timeout_minutes=20,\n",
+" experiment_timeout_hours=0.3,\n",
 " training_data=train,\n",
 " label_column_name=target_column_name,\n",
 " compute_target=compute_target,\n",
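`experiment_timeout_minutes` is replaced by `experiment_timeout_hours` throughout these notebooks; 20 minutes becomes roughly 0.3 hours. A minimal sketch of the updated configuration, assuming `train`, `target_column_name`, and `compute_target` from the notebook:

```python
# Sketch only: `train`, `target_column_name`, and `compute_target`
# are assumed from the surrounding notebook.
from azureml.train.automl import AutoMLConfig

automl_config = AutoMLConfig(task='forecasting',
                             primary_metric='normalized_root_mean_squared_error',
                             blacklist_models=['ExtremeRandomTrees'],
                             experiment_timeout_hours=0.3,  # was experiment_timeout_minutes=20
                             training_data=train,
                             label_column_name=target_column_name,
                             compute_target=compute_target)
```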
@@ -1,11 +1,9 @@
 name: auto-ml-forecasting-bike-share
 dependencies:
-- fbprophet==0.5
-- py-xgboost<=0.80
+- py-xgboost<=0.90
 - pip:
   - azureml-sdk
+  - numpy==1.16.2
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
-  - statsmodels
@@ -32,18 +32,17 @@ test_dataset = run.input_datasets['test_data']
 
 grain_column_names = []
 
-df = test_dataset.to_pandas_dataframe()
+df = test_dataset.to_pandas_dataframe().reset_index(drop=True)
 
-X_test_df = test_dataset.drop_columns(columns=[target_column_name])
-y_test_df = test_dataset.with_timestamp_columns(
-    None).keep_columns(columns=[target_column_name])
+X_test_df = test_dataset.drop_columns(columns=[target_column_name]).to_pandas_dataframe().reset_index(drop=True)
+y_test_df = test_dataset.with_timestamp_columns(None).keep_columns(columns=[target_column_name]).to_pandas_dataframe()
 
 fitted_model = joblib.load('model.pkl')
 
 df_all = forecasting_helper.do_rolling_forecast(
     fitted_model,
-    X_test_df.to_pandas_dataframe(),
-    y_test_df.to_pandas_dataframe().values.T[0],
+    X_test_df,
+    y_test_df.values.T[0],
     target_column_name,
     time_column_name,
     max_horizon,
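The helper script now materializes the datasets as DataFrames once, up front, with `reset_index(drop=True)` so later positional operations line up. A short sketch of the pattern, assuming `test_dataset` and `target_column_name` as above:

```python
# Sketch only: `test_dataset` (a TabularDataset) and `target_column_name`
# are assumed from the surrounding script.
X_test_df = (test_dataset.drop_columns(columns=[target_column_name])
             .to_pandas_dataframe()
             .reset_index(drop=True))  # fresh 0..n-1 index for positional use
y_test_df = (test_dataset.with_timestamp_columns(None)
             .keep_columns(columns=[target_column_name])
             .to_pandas_dataframe())
y_actual = y_test_df.values.T[0]  # 1-D array of actual target values
```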
@@ -31,8 +31,8 @@
 "1. [Results](#Results)\n",
 "\n",
 "Advanced Forecasting\n",
-"1. [Advanced Training](#Advanced Training)\n",
-"1. [Advanced Results](#Advanced Results)"
+"1. [Advanced Training](#advanced_training)\n",
+"1. [Advanced Results](#advanced_results)"
 ]
 },
 {
@@ -211,7 +211,7 @@
 "outputs": [],
 "source": [
 "dataset = Dataset.Tabular.from_delimited_files(path = \"https://automlsamplenotebookdata.blob.core.windows.net/automl-sample-notebook-data/nyc_energy.csv\").with_timestamp_columns(fine_grain_timestamp=time_column_name) \n",
-"dataset.take(5).to_pandas_dataframe()"
+"dataset.take(5).to_pandas_dataframe().reset_index(drop=True)"
 ]
 },
 {
@@ -253,7 +253,7 @@
 "source": [
 "# split into train based on time\n",
 "train = dataset.time_before(datetime(2017, 8, 8, 5), include_boundary=True)\n",
-"train.to_pandas_dataframe().sort_values(time_column_name).tail(5)"
+"train.to_pandas_dataframe().reset_index(drop=True).sort_values(time_column_name).tail(5)"
 ]
 },
 {
@@ -263,8 +263,8 @@
 "outputs": [],
 "source": [
 "# split into test based on time\n",
-"test = dataset.time_between(datetime(2017, 8, 8, 5), datetime(2017, 8, 10, 5))\n",
-"test.to_pandas_dataframe().head(5)"
+"test = dataset.time_between(datetime(2017, 8, 8, 6), datetime(2017, 8, 10, 5))\n",
+"test.to_pandas_dataframe().reset_index(drop=True).head(5)"
 ]
 },
 {
@@ -301,8 +301,8 @@
 "|-|-|\n",
 "|**task**|forecasting|\n",
 "|**primary_metric**|This is the metric that you want to optimize.<br> Forecasting supports the following primary metrics <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>|\n",
-"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl/azureml.train.automl.constants.supportedmodels.regression?view=azure-ml-py).|\n",
-"|**experiment_timeout_minutes**|Maximum amount of time in minutes that the experiment take before it terminates.|\n",
+"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl-client/azureml.train.automl.constants.supportedmodels.forecasting?view=azure-ml-py).|\n",
+"|**experiment_timeout_hours**|Maximum amount of time in hours that the experiment take before it terminates.|\n",
 "|**training_data**|The training data to be used within the experiment.|\n",
 "|**label_column_name**|The name of the label column.|\n",
 "|**compute_target**|The remote compute for training.|\n",
@@ -316,7 +316,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_minutes parameter value to get results."
+"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_hours parameter value to get results."
 ]
 },
 {
@@ -333,7 +333,7 @@
 "automl_config = AutoMLConfig(task='forecasting', \n",
 " primary_metric='normalized_root_mean_squared_error',\n",
 " blacklist_models = ['ExtremeRandomTrees', 'AutoArima', 'Prophet'], \n",
-" experiment_timeout_minutes=20,\n",
+" experiment_timeout_hours=0.3,\n",
 " training_data=train,\n",
 " label_column_name=target_column_name,\n",
 " compute_target=compute_target,\n",
@@ -454,7 +454,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"X_test = test.to_pandas_dataframe()\n",
+"X_test = test.to_pandas_dataframe().reset_index(drop=True)\n",
 "y_test = X_test.pop(target_column_name).values"
 ]
 },
@@ -463,11 +463,7 @@
 "metadata": {},
 "source": [
 "### Forecast Function\n",
-"For forecasting, we will use the forecast function instead of the predict function. There are two reasons for this.\n",
-"\n",
-"We need to pass the recent values of the target variable y, whereas the scikit-compatible predict function only takes the non-target variables 'test'. In our case, the test data immediately follows the training data, and we fill the target variable with NaN. The NaN serves as a question mark for the forecaster to fill with the actuals. Using the forecast function will produce forecasts using the shortest possible forecast horizon. The last time at which a definite (non-NaN) value is seen is the forecast origin - the last time when the value of the target is known.\n",
-"\n",
-"Using the predict method would result in getting predictions for EVERY horizon the forecaster can predict at. This is useful when training and evaluating the performance of the forecaster at various horizons, but the level of detail is excessive for normal use."
+"For forecasting, we will use the forecast function instead of the predict function. Using the predict method would result in getting predictions for EVERY horizon the forecaster can predict at. This is useful when training and evaluating the performance of the forecaster at various horizons, but the level of detail is excessive for normal use. Forecast function also can handle more complicated scenarios, see notebook on [high frequency forecasting](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/forecasting-high-frequency/automl-forecasting-function.ipynb)."
 ]
 },
 {
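The rewritten markdown folds three paragraphs into one and points to the high-frequency forecasting notebook; the matching code change in the next hunk drops the NaN-filled `y_query`. A minimal before/after sketch, assuming `fitted_model`, `X_test`, and `y_test` from the notebook:

```python
# Sketch only: `fitted_model`, `X_test`, and `y_test` are assumed
# from the surrounding notebook.

# Old pattern: hand the forecaster a NaN-filled target to mark the query range.
# y_query = y_test.copy().astype(np.float)
# y_query.fill(np.nan)
# y_predictions, X_trans = fitted_model.forecast(X_test, y_query)

# New pattern: the forecaster builds the query from X_test alone.
y_predictions, X_trans = fitted_model.forecast(X_test)
```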
@@ -476,15 +472,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# Replace ALL values in y by NaN.\n",
-"# The forecast origin will be at the beginning of the first forecast period.\n",
-"# (Which is the same time as the end of the last training period.)\n",
-"y_query = y_test.copy().astype(np.float)\n",
-"y_query.fill(np.nan)\n",
 "# The featurized data, aligned to y, will also be returned.\n",
 "# This contains the assumptions that were made in the forecast\n",
 "# and helps align the forecast to the original data\n",
-"y_predictions, X_trans = fitted_model.forecast(X_test, y_query)"
+"y_predictions, X_trans = fitted_model.forecast(X_test)"
 ]
 },
 {
@@ -557,7 +548,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Advanced Training\n",
+"## Advanced Training <a id=\"advanced_training\"></a>\n",
 "We did not use lags in the previous model specification. In effect, the prediction was the result of a simple regression on date, grain and any additional features. This is often a very good prediction as common time series patterns like seasonality and trends can be captured in this manner. Such simple regression is horizon-less: it doesn't matter how far into the future we are predicting, because we are not using past data. In the previous example, the horizon was only used to split the data for cross-validation."
 ]
 },
@@ -587,7 +578,7 @@
 "automl_config = AutoMLConfig(task='forecasting', \n",
 " primary_metric='normalized_root_mean_squared_error',\n",
 " blacklist_models = ['ElasticNet','ExtremeRandomTrees','GradientBoosting','XGBoostRegressor','ExtremeRandomTrees', 'AutoArima', 'Prophet'], #These models are blacklisted for tutorial purposes, remove this for real use cases. \n",
-" experiment_timeout_minutes=20,\n",
+" experiment_timeout_hours=0.3,\n",
 " training_data=train,\n",
 " label_column_name=target_column_name,\n",
 " compute_target=compute_target,\n",
@@ -642,7 +633,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Advanced Results\n",
+"## Advanced Results<a id=\"advanced_results\"></a>\n",
 "We did not use lags in the previous model specification. In effect, the prediction was the result of a simple regression on date, grain and any additional features. This is often a very good prediction as common time series patterns like seasonality and trends can be captured in this manner. Such simple regression is horizon-less: it doesn't matter how far into the future we are predicting, because we are not using past data. In the previous example, the horizon was only used to split the data for cross-validation."
 ]
 },
@@ -652,15 +643,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# Replace ALL values in y by NaN.\n",
-"# The forecast origin will be at the beginning of the first forecast period.\n",
-"# (Which is the same time as the end of the last training period.)\n",
-"y_query = y_test.copy().astype(np.float)\n",
-"y_query.fill(np.nan)\n",
 "# The featurized data, aligned to y, will also be returned.\n",
 "# This contains the assumptions that were made in the forecast\n",
 "# and helps align the forecast to the original data\n",
-"y_predictions, X_trans = fitted_model_lags.forecast(X_test, y_query)"
+"y_predictions, X_trans = fitted_model_lags.forecast(X_test)"
 ]
 },
 {
@@ -2,11 +2,9 @@ name: auto-ml-forecasting-energy-demand
 dependencies:
 - pip:
   - azureml-sdk
-  - interpret
+  - numpy==1.16.2
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
-  - statsmodels
   - azureml-explain-model
   - azureml-contrib-interpret
@@ -1,551 +0,0 @@
-{
-"cells": [
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"Copyright (c) Microsoft Corporation. All rights reserved.\n",
-"\n",
-"Licensed under the MIT License."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-""
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"# Automated Machine Learning\n",
-"\n",
-"_**Forecasting with grouping using Pipelines**_\n",
-"\n",
-"## Contents\n",
-"\n",
-"1. [Introduction](#Introduction)\n",
-"2. [Setup](#Setup)\n",
-"3. [Data](#Data)\n",
-"4. [Compute](#Compute)\n",
-"4. [AutoMLConfig](#AutoMLConfig)\n",
-"5. [Pipeline](#Pipeline)\n",
-"5. [Train](#Train)\n",
-"6. [Test](#Test)\n",
-"\n",
-"\n",
-"## Introduction\n",
-"In this example we use Automated ML and Pipelines to train, select, and operationalize forecasting models for multiple time-series.\n",
-"\n",
-"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) first if you haven't already to establish your connection to the AzureML Workspace.\n",
-"\n",
-"In this notebook you will learn how to:\n",
-"\n",
-"* Create an Experiment in an existing Workspace.\n",
-"* Configure AutoML using AutoMLConfig.\n",
-"* Use our helper script to generate pipeline steps to split, train, and deploy the models.\n",
-"* Explore the results.\n",
-"* Test the models.\n",
-"\n",
-"It is advised you ensure your cluster has at least one node per group.\n",
-"\n",
-"An Enterprise workspace is required for this notebook. To learn more about creating an Enterprise workspace or upgrading to an Enterprise workspace from the Azure portal, please visit our [Workspace page.](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade)\n",
-"\n",
-"## Setup\n",
-"As part of the setup you have already created an Azure ML `Workspace` object. For Automated ML you will need to create an `Experiment` object, which is a named object in a `Workspace` used to run experiments. "
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"import json\n",
-"import logging\n",
-"import warnings\n",
-"\n",
-"import numpy as np\n",
-"import pandas as pd\n",
-"\n",
-"import azureml.core\n",
-"\n",
-"from azureml.core.workspace import Workspace\n",
-"from azureml.core.experiment import Experiment\n",
-"from azureml.train.automl import AutoMLConfig"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"Accessing the Azure ML workspace requires authentication with Azure.\n",
-"\n",
-"The default authentication is interactive authentication using the default tenant. Executing the ws = Workspace.from_config() line in the cell below will prompt for authentication the first time that it is run.\n",
-"\n",
-"If you have multiple Azure tenants, you can specify the tenant by replacing the ws = Workspace.from_config() line in the cell below with the following:\n",
-"```\n",
-"from azureml.core.authentication import InteractiveLoginAuthentication\n",
-"auth = InteractiveLoginAuthentication(tenant_id = 'mytenantid')\n",
-"ws = Workspace.from_config(auth = auth)\n",
-"```\n",
-"If you need to run in an environment where interactive login is not possible, you can use Service Principal authentication by replacing the ws = Workspace.from_config() line in the cell below with the following:\n",
-"```\n",
-"from azureml.core.authentication import ServicePrincipalAuthentication\n",
-"auth = auth = ServicePrincipalAuthentication('mytenantid', 'myappid', 'mypassword')\n",
-"ws = Workspace.from_config(auth = auth)\n",
-"```\n",
-"For more details, see aka.ms/aml-notebook-auth"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"ws = Workspace.from_config()\n",
-"ds = ws.get_default_datastore()\n",
-"\n",
-"# choose a name for the run history container in the workspace\n",
-"experiment_name = 'automl-grouping-oj'\n",
-"# project folder\n",
-"project_folder = './sample_projects/{}'.format(experiment_name)\n",
-"\n",
-"experiment = Experiment(ws, experiment_name)\n",
-"\n",
-"output = {}\n",
-"output['SDK version'] = azureml.core.VERSION\n",
-"output['Subscription ID'] = ws.subscription_id\n",
-"output['Workspace'] = ws.name\n",
-"output['Resource Group'] = ws.resource_group\n",
-"output['Location'] = ws.location\n",
-"output['Project Directory'] = project_folder\n",
-"output['Run History Name'] = experiment_name\n",
-"pd.set_option('display.max_colwidth', -1)\n",
-"outputDf = pd.DataFrame(data = output, index = [''])\n",
-"outputDf.T"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"## Data\n",
-"Upload data to your default datastore and then load it as a `TabularDataset`"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.core.dataset import Dataset"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"# upload training and test data to your default datastore\n",
-"ds = ws.get_default_datastore()\n",
-"ds.upload(src_dir='./data', target_path='groupdata', overwrite=True, show_progress=True)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"# load data from your datastore\n",
-"data = Dataset.Tabular.from_delimited_files(path=ds.path('groupdata/dominicks_OJ_2_5_8_train.csv'))\n",
-"data_test = Dataset.Tabular.from_delimited_files(path=ds.path('groupdata/dominicks_OJ_2_5_8_test.csv'))\n",
-"\n",
-"data.take(5).to_pandas_dataframe()"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"## Compute \n",
-"\n",
-"#### Create or Attach existing AmlCompute\n",
-"\n",
-"You will need to create a compute target for your automated ML run. In this tutorial, you create AmlCompute as your training compute resource.\n",
-"#### Creation of AmlCompute takes approximately 5 minutes. \n",
-"If the AmlCompute with that name is already in your workspace this code will skip the creation process.\n",
-"As with other Azure services, there are limits on certain resources (e.g. AmlCompute) associated with the Azure Machine Learning service. Please read this article on the default limits and how to request more quota."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.core.compute import AmlCompute\n",
-"from azureml.core.compute import ComputeTarget\n",
-"\n",
-"# Choose a name for your cluster.\n",
-"amlcompute_cluster_name = \"cpu-cluster-11\"\n",
-"\n",
-"found = False\n",
-"# Check if this compute target already exists in the workspace.\n",
-"cts = ws.compute_targets\n",
-"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
-" found = True\n",
-" print('Found existing compute target.')\n",
-" compute_target = cts[amlcompute_cluster_name]\n",
-" \n",
-"if not found:\n",
-" print('Creating a new compute target...')\n",
-" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
-" #vm_priority = 'lowpriority', # optional\n",
-" max_nodes = 6)\n",
-"\n",
-" # Create the cluster.\n",
-" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
-" \n",
-"print('Checking cluster status...')\n",
-"# Can poll for a minimum number of nodes and for a specific timeout.\n",
-"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
-"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
-" \n",
-"# For a more detailed view of current AmlCompute status, use get_status()."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"## AutoMLConfig\n",
-"#### Create a base AutoMLConfig\n",
-"This configuration will be used for all the groups in the pipeline."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"target_column = 'Quantity'\n",
-"time_column_name = 'WeekStarting'\n",
-"grain_column_names = ['Brand']\n",
-"group_column_names = ['Store']\n",
-"max_horizon = 20"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"automl_settings = {\n",
-" \"iteration_timeout_minutes\" : 5,\n",
-" \"experiment_timeout_minutes\" : 15,\n",
-" \"primary_metric\" : 'normalized_mean_absolute_error',\n",
-" \"time_column_name\": time_column_name,\n",
-" \"grain_column_names\": grain_column_names,\n",
-" \"max_horizon\": max_horizon,\n",
-" \"drop_column_names\": ['logQuantity'],\n",
-" \"max_concurrent_iterations\": 2,\n",
-" \"max_cores_per_iteration\": -1\n",
-"}\n",
-"base_configuration = AutoMLConfig(task = 'forecasting',\n",
-" path = project_folder,\n",
-" n_cross_validations=3,\n",
-" **automl_settings\n",
-" )"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"## Pipeline\n",
-"We've written a script to generate the individual pipeline steps used to create each automl step. Calling this script will return a list of PipelineSteps that will train multiple groups concurrently and then deploy these models.\n",
-"\n",
-"This step requires an Enterprise workspace to gain access to this feature. To learn more about creating an Enterprise workspace or upgrading to an Enterprise workspace from the Azure portal, please visit our [Workspace page.](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade).\n",
-"\n",
-"### Call the method to build pipeline steps\n",
-"\n",
-"`build_pipeline_steps()` takes as input:\n",
-"* **automlconfig**: This is the configuration used for every automl step\n",
-"* **df**: This is the dataset to be used for training\n",
-"* **target_column**: This is the target column of the dataset\n",
-"* **compute_target**: The compute to be used for training\n",
-"* **deploy**: The option on to deploy the models after training, if set to true an extra step will be added to deploy a webservice with all the models (default is `True`)\n",
-"* **service_name**: The service name for the model query endpoint\n",
-"* **time_column_name**: The time column of the data"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.core.webservice import Webservice\n",
-"from azureml.exceptions import WebserviceException\n",
-"\n",
-"service_name = 'grouped-model'\n",
-"try:\n",
-" # if you want to get existing service below is the command\n",
-" # since aci name needs to be unique in subscription deleting existing aci if any\n",
-" # we use aci_service_name to create azure aci\n",
-" service = Webservice(ws, name=service_name)\n",
-" if service:\n",
-" service.delete()\n",
-"except WebserviceException as e:\n",
-" pass"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from build import build_pipeline_steps\n",
-"\n",
-"steps = build_pipeline_steps(\n",
-" base_configuration, \n",
-" data, \n",
-" target_column,\n",
-" compute_target, \n",
-" group_column_names=group_column_names, \n",
-" deploy=True, \n",
-" service_name=service_name, \n",
-" time_column_name=time_column_name\n",
-")"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"## Train\n",
-"Use the list of steps generated from above to build the pipeline and submit it to your compute for remote training."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.pipeline.core import Pipeline\n",
-"pipeline = Pipeline(\n",
-" description=\"A pipeline with one model per data group using Automated ML.\",\n",
-" workspace=ws, \n",
-" steps=steps)\n",
-"\n",
-"pipeline_run = experiment.submit(pipeline)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.widgets import RunDetails\n",
-"RunDetails(pipeline_run).show()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"pipeline_run.wait_for_completion(show_output=False)"
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"## Test\n",
-"\n",
-"Now we can use the holdout set to test our models and ensure our web-service is running as expected."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.core.webservice import AciWebservice\n",
-"service = AciWebservice(ws, service_name)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"X_test = data_test.to_pandas_dataframe()\n",
-"# Drop the column we are trying to predict (target column)\n",
-"x_pred = X_test.drop(target_column, inplace=False, axis=1)\n",
-"x_pred.head()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"# Get Predictions\n",
-"test_sample = X_test.drop(target_column, inplace=False, axis=1).to_json()\n",
-"predictions = service.run(input_data=test_sample)\n",
-"print(predictions)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"# Convert predictions from JSON to DataFrame\n",
-"pred_dict =json.loads(predictions)\n",
-"X_pred = pd.read_json(pred_dict['predictions'])\n",
-"X_pred.head()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"# Fix the index\n",
-"PRED = 'pred_target'\n",
-"X_pred[time_column_name] = pd.to_datetime(X_pred[time_column_name], unit='ms')\n",
-"\n",
-"X_pred.set_index([time_column_name] + grain_column_names, inplace=True, drop=True)\n",
-"X_pred.rename({'_automl_target_col': PRED}, inplace=True, axis=1)\n",
-"# Drop all but the target column and index\n",
-"X_pred.drop(list(set(X_pred.columns.values).difference({PRED})), axis=1, inplace=True)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"X_test[time_column_name] = pd.to_datetime(X_test[time_column_name])\n",
-"X_test.set_index([time_column_name] + grain_column_names, inplace=True, drop=True)\n",
-"# Merge predictions with raw features\n",
-"pred_test = X_test.merge(X_pred, left_index=True, right_index=True)\n",
-"pred_test.head()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from sklearn.metrics import mean_absolute_error, mean_squared_error\n",
-"def MAPE(actual, pred):\n",
-" \"\"\"\n",
-" Calculate mean absolute percentage error.\n",
-" Remove NA and values where actual is close to zero\n",
-" \"\"\"\n",
-" not_na = ~(np.isnan(actual) | np.isnan(pred))\n",
-" not_zero = ~np.isclose(actual, 0.0)\n",
-" actual_safe = actual[not_na & not_zero]\n",
-" pred_safe = pred[not_na & not_zero]\n",
-" APE = 100*np.abs((actual_safe - pred_safe)/actual_safe)\n",
-" return np.mean(APE)\n",
-"\n",
-"def get_metrics(actuals, preds):\n",
-" return pd.Series(\n",
-" {\n",
-" \"RMSE\": np.sqrt(mean_squared_error(actuals, preds)),\n",
-" \"NormRMSE\": np.sqrt(mean_squared_error(actuals, preds))/np.abs(actuals.max()-actuals.min()),\n",
-" \"MAE\": mean_absolute_error(actuals, preds),\n",
-" \"MAPE\": MAPE(actuals, preds)},\n",
-" )"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"get_metrics(pred_test[PRED].values, pred_test[target_column].values)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": []
-}
-],
-"metadata": {
-"authors": [
-{
-"name": "alyerman"
-}
-],
-"category": "other",
-"compute": [
-"AML Compute"
-],
-"datasets": [
-"Orange Juice Sales"
-],
-"deployment": [
-"Azure Container Instance"
-],
-"exclude_from_index": false,
-"framework": [
-"Scikit-learn",
-"Pytorch"
-],
-"friendly_name": "Automated ML Grouping with Pipeline.",
-"index_order": 10,
-"kernelspec": {
-"display_name": "Python 3.6",
-"language": "python",
-"name": "python36"
-},
-"language_info": {
-"codemirror_mode": {
-"name": "ipython",
-"version": 3
-},
-"file_extension": ".py",
-"mimetype": "text/x-python",
-"name": "python",
-"nbconvert_exporter": "python",
-"pygments_lexer": "ipython3",
-"version": "3.6.6"
-},
-"tags": [
-"AutomatedML"
-],
-"task": "Use AzureML Pipeline to trigger multiple Automated ML runs."
-},
-"nbformat": 4,
-"nbformat_minor": 2
-}
@@ -1,142 +0,0 @@
from typing import List, Dict
import copy
import json
import pandas as pd
import re

from azureml.core import RunConfiguration
from azureml.core.compute import ComputeTarget
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.dataset import Dataset
from azureml.pipeline.core import PipelineData, PipelineParameter, TrainingOutput, StepSequence
from azureml.pipeline.steps import PythonScriptStep
from azureml.train.automl import AutoMLConfig
from azureml.train.automl.runtime import AutoMLStep


def _get_groups(data: Dataset, group_column_names: List[str]) -> pd.DataFrame:
    return data._dataflow.distinct(columns=group_column_names)\
        .keep_columns(columns=group_column_names).to_pandas_dataframe()


def _get_configs(automlconfig: AutoMLConfig,
                 data: Dataset,
                 target_column: str,
                 compute_target: ComputeTarget,
                 group_column_names: List[str]) -> Dict[str, AutoMLConfig]:
    # regex matching characters that are not valid in a model name
    valid_chars = re.compile('[^a-zA-Z0-9-]')
    groups = _get_groups(data, group_column_names)
    configs = {}
    for _, group in groups.iterrows():
        group_name = "#####".join(str(x) for x in group.values)
        group_name = valid_chars.sub('', group_name)
        # filter the dataflow down to the rows that belong to this group
        single = data._dataflow
        for key in group.index:
            single = single.filter(data._dataflow[key] == group[key])
        group_conf = copy.deepcopy(automlconfig)
        group_conf.user_settings['training_data'] = single
        group_conf.user_settings['label_column_name'] = target_column
        group_conf.user_settings['compute_target'] = compute_target
        configs[group_name] = group_conf
    return configs


def build_pipeline_steps(automlconfig: AutoMLConfig,
                         data: Dataset,
                         target_column: str,
                         compute_target: ComputeTarget,
                         group_column_names: list,
                         time_column_name: str,
                         deploy: bool,
                         service_name: str = 'grouping-demo') -> StepSequence:
    steps = []

    metrics_output_name = 'metrics_{}'
    best_model_output_name = 'best_model_{}'
    count = 0
    model_names = []

    # get all automl configs by group
    configs = _get_configs(automlconfig, data, target_column, compute_target, group_column_names)

    # build a run configuration for the register step
    register_config = RunConfiguration()
    cd = CondaDependencies()
    cd.add_pip_package('azureml-pipeline')
    register_config.environment.python.conda_dependencies = cd

    # create each automl step end-to-end (train, register)
    for group_name, conf in configs.items():
        # create automl metrics output
        metrics_data = PipelineData(
            name='metrics_data_{}'.format(group_name),
            pipeline_output_name=metrics_output_name.format(group_name),
            training_output=TrainingOutput(type='Metrics'))
        # create automl model output
        model_data = PipelineData(
            name='model_data_{}'.format(group_name),
            pipeline_output_name=best_model_output_name.format(group_name),
            training_output=TrainingOutput(type='Model', metric=conf.user_settings['primary_metric']))

        automl_step = AutoMLStep(
            name='automl_{}'.format(group_name),
            automl_config=conf,
            outputs=[metrics_data, model_data],
            allow_reuse=True)
        steps.append(automl_step)

        # pass the group name as a parameter to the register step ->
        # this will become the name of the model for this group.
        group_name_param = PipelineParameter("group_name_{}".format(count), default_value=group_name)
        count += 1

        reg_model_step = PythonScriptStep(
            'register.py',
            name='register_{}'.format(group_name),
            arguments=["--model_name", group_name_param, "--model_path", model_data],
            inputs=[model_data],
            compute_target=compute_target,
            runconfig=register_config,
            source_directory="register",
            allow_reuse=True
        )
        steps.append(reg_model_step)
        model_names.append(group_name)

    final_steps = steps
    if deploy:
        # modify the conda dependencies to ensure we pick up correct
        # versions of azureml-defaults and azureml-train-automl
        cd = CondaDependencies.create(pip_packages=['azureml-defaults', 'azureml-train-automl'])
        automl_deps = CondaDependencies(conda_dependencies_file_path='deploy/myenv.yml')
        cd._merge_dependencies(automl_deps)
        cd.save('deploy/myenv.yml')

        # add deployment step
        pp_group_column_names = PipelineParameter(
            "group_column_names",
            default_value="#####".join(list(reversed(group_column_names))))

        pp_model_names = PipelineParameter(
            "model_names",
            default_value=json.dumps(model_names))

        pp_service_name = PipelineParameter(
            "service_name",
            default_value=service_name)

        deployment_step = PythonScriptStep(
            'deploy.py',
            name='service_deploy',
            arguments=["--group_column_names", pp_group_column_names,
                       "--model_names", pp_model_names,
                       "--service_name", pp_service_name,
                       "--time_column_name", time_column_name],
            compute_target=compute_target,
            runconfig=RunConfiguration(),
            source_directory="deploy"
        )
        final_steps = StepSequence(steps=[steps, deployment_step])

    return final_steps
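For reference, a minimal sketch of how the helper above might be driven from the accompanying notebook. The workspace, compute name, dataset name, and AutoML settings here are illustrative assumptions, not values taken from this diff:

```python
# Illustrative sketch only -- 'cpu-cluster', 'oj-sales' and the AutoML
# settings below are assumed names, not values from this repository.
from azureml.core import Workspace, Experiment, Dataset
from azureml.pipeline.core import Pipeline
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()
compute = ws.compute_targets['cpu-cluster']   # assumed AML Compute target
data = Dataset.get_by_name(ws, 'oj-sales')    # assumed registered dataset

automl_config = AutoMLConfig(task='forecasting',
                             primary_metric='normalized_root_mean_squared_error',
                             iterations=10,
                             time_column_name='WeekStarting',
                             n_cross_validations=3)

steps = build_pipeline_steps(automl_config, data,
                             target_column='Quantity',
                             compute_target=compute,
                             group_column_names=['Store', 'Brand'],
                             time_column_name='WeekStarting',
                             deploy=True)

pipeline = Pipeline(workspace=ws, steps=steps)
run = Experiment(ws, 'automl-grouping').submit(pipeline)
run.wait_for_completion(show_output=True)
```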
@@ -1,61 +0,0 @@
WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
1992-08-20,2,minute.maid,23488,10.06424493,1,1.94,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,2,tropicana,13376,9.501217335,1,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,tropicana,8128,9.00307017,0,2.75,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,minute.maid,19008,9.852615222,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,dominicks,9024,9.107642974,0,1.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,tropicana,19456,9.875910785,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,minute.maid,11584,9.357380115,0,1.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,dominicks,2048,7.624618986000001,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,tropicana,10048,9.215128888999999,0,2.64,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,minute.maid,26752,10.19436452,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,dominicks,1984,7.592870287999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,tropicana,6336,8.754002933999999,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,minute.maid,3904,8.269756948,0,2.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,dominicks,4160,8.333270353,0,1.77,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,tropicana,16192,9.692272572,1,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,minute.maid,3712,8.219326094,0,2.67,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,dominicks,35264,10.47061789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,dominicks,8640,9.064157862,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,minute.maid,41216,10.62658181,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,tropicana,5824,8.66974259,0,2.97,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,5,tropicana,17728,9.78290059,1,2.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,5,minute.maid,27072,10.20625526,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,tropicana,9600,9.169518378,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,minute.maid,3840,8.253227646000001,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,dominicks,1856,7.526178913,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,tropicana,25664,10.15284451,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,minute.maid,6144,8.723231275,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,dominicks,3712,8.219326094,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,tropicana,9984,9.208739091,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,dominicks,2688,7.896552702,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,minute.maid,36416,10.50276352,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,tropicana,8576,9.056722882999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,minute.maid,5440,8.60153434,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,dominicks,6464,8.774003599999999,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,tropicana,13184,9.486759252,1,2.78,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,dominicks,40896,10.61878754,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,minute.maid,7680,8.946374826,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,dominicks,6144,8.723231275,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,minute.maid,50304,10.82583988,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,tropicana,7488,8.921057017999999,0,2.78,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,8,minute.maid,55552,10.9250748,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-20,8,tropicana,8576,9.056722882999999,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,tropicana,8000,8.987196821,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,minute.maid,18688,9.835636886,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,dominicks,19200,9.862665558,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,tropicana,21760,9.987828701,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,minute.maid,14656,9.592605087,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,dominicks,12800,9.45720045,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,tropicana,12800,9.45720045,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,minute.maid,30144,10.31374118,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,dominicks,15296,9.635346635,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,tropicana,10112,9.221478116,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,minute.maid,6208,8.733594062,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,dominicks,20992,9.951896692,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,tropicana,10304,9.240287448,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,minute.maid,7104,8.868413285,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,dominicks,73856,11.20987253,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,minute.maid,65856,11.09522582,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,dominicks,16192,9.692272572,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,tropicana,6400,8.764053269,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
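Both deleted data files (the 61-row sample above and the 973-row file below) share the same Dominick's orange-juice schema: WeekStarting, Store, Brand, Quantity, logQuantity, Advert, Price, plus per-store demographics. As a quick illustration of the grouping the pipeline keys on, assuming a local copy saved as oj-sales.csv (a hypothetical path):

```python
# Illustrative sketch only -- 'oj-sales.csv' is a hypothetical local path.
import pandas as pd

df = pd.read_csv('oj-sales.csv', parse_dates=['WeekStarting'])

# one AutoML model is trained per distinct (Store, Brand) pair
groups = df[['Store', 'Brand']].drop_duplicates()
print(groups)

# weekly sales per group: the series each grouped model forecasts
print(df.groupby(['Store', 'Brand'])['Quantity'].describe())
```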
@@ -1,973 +0,0 @@
WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
1990-06-14,2,dominicks,10560,9.264828557000001,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,2,minute.maid,4480,8.407378325,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,2,tropicana,8256,9.018695487999999,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,dominicks,8000,8.987196821,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,tropicana,6144,8.723231275,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,tropicana,3840,8.253227646000001,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,minute.maid,20160,9.911455722000001,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,dominicks,6848,8.831711918,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,dominicks,2880,7.965545572999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,minute.maid,2688,7.896552702,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,tropicana,8000,8.987196821,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,dominicks,1600,7.377758908,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,minute.maid,3008,8.009030685,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,tropicana,8896,9.093357017,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,tropicana,7168,8.877381955,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,dominicks,25344,10.140297300000002,1,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,dominicks,10752,9.282847063,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,minute.maid,2752,7.920083199,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,tropicana,10880,9.29468152,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,minute.maid,26176,10.17259824,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,dominicks,6656,8.803273982999999,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,tropicana,7744,8.954673629,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,dominicks,6592,8.793612072,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,minute.maid,3712,8.219326094,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,tropicana,8512,9.049232212,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,tropicana,5504,8.61323038,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,minute.maid,30656,10.33058368,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,dominicks,1728,7.454719948999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,tropicana,5888,8.68067166,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,minute.maid,3840,8.253227646000001,0,2.98,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,dominicks,33792,10.42797937,1,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,tropicana,8384,9.034080407000001,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,minute.maid,2816,7.943072717000001,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,dominicks,1920,7.560080465,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,tropicana,5952,8.691482577,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,minute.maid,23104,10.04776104,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,dominicks,8960,9.100525506,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,dominicks,11392,9.340666634,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,tropicana,6848,8.831711918,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,minute.maid,3392,8.129174997,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,tropicana,9216,9.128696383,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,minute.maid,26304,10.1774763,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,dominicks,28416,10.25470765,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,dominicks,17152,9.749870064,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,tropicana,12160,9.405907156,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,minute.maid,6336,8.754002933999999,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,tropicana,12672,9.447150114,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,minute.maid,9920,9.2023082,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,dominicks,26560,10.1871616,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,dominicks,6336,8.754002933999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,minute.maid,25280,10.13776885,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,tropicana,6528,8.783855897,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,dominicks,26368,10.17990643,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,minute.maid,14848,9.605620455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,tropicana,21120,9.957975738,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,minute.maid,12288,9.416378455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,dominicks,896,6.797940412999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,tropicana,12416,9.426741242,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,minute.maid,6272,8.743850562,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,dominicks,1472,7.294377299,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,tropicana,9472,9.156095357,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,minute.maid,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,dominicks,1344,7.2034055210000005,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,tropicana,17920,9.793672686,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,minute.maid,4160,8.333270353,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,dominicks,111680,11.62339292,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,tropicana,9408,9.14931567,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,minute.maid,10176,9.227787286,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,dominicks,1856,7.526178913,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,minute.maid,29056,10.27698028,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,dominicks,5568,8.624791202,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,tropicana,6912,8.841014311,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,minute.maid,7104,8.868413285,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,dominicks,32064,10.37548918,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,tropicana,16768,9.727227587,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,dominicks,4352,8.378390789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,minute.maid,7488,8.921057017999999,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,minute.maid,4224,8.348537825,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,minute.maid,8960,9.100525506,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,dominicks,13760,9.529521112000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,minute.maid,22464,10.01966931,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,dominicks,43328,10.67655436,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,minute.maid,3840,8.253227646000001,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,dominicks,57600,10.96127785,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,tropicana,7808,8.962904128,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,minute.maid,12992,9.472089062,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,minute.maid,70144,11.15830555,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,dominicks,6016,8.702177866,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,tropicana,42176,10.64960662,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,dominicks,10368,9.246479419,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,minute.maid,21248,9.964018052,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,dominicks,12608,9.442086812000001,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,minute.maid,5696,8.647519453,1,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,tropicana,4928,8.502688505,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,tropicana,29504,10.29228113,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,minute.maid,7680,8.946374826,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,dominicks,6336,8.754002933999999,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,tropicana,9984,9.208739091,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,minute.maid,6336,8.754002933999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,dominicks,140736,11.85464107,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,tropicana,35200,10.46880136,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,dominicks,960,6.866933285,1,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,minute.maid,8576,9.056722882999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,dominicks,1216,7.103322062999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,minute.maid,15104,9.622714887999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,tropicana,23936,10.08313888,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,tropicana,7104,8.868413285,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,minute.maid,76480,11.24478455,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,dominicks,1664,7.416979621,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,dominicks,4992,8.51559191,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,tropicana,24512,10.10691807,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,tropicana,6336,8.754002933999999,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,dominicks,27968,10.23881628,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,dominicks,12160,9.405907156,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,tropicana,33536,10.42037477,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,minute.maid,4032,8.30201781,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,dominicks,2240,7.714231145,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,dominicks,5504,8.61323038,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,minute.maid,14784,9.601300794,1,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,tropicana,13248,9.491601877,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,tropicana,6208,8.733594062,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,dominicks,8832,9.086136769,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,minute.maid,12096,9.400630097999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,dominicks,2624,7.87245515,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,minute.maid,41792,10.64046021,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,tropicana,10624,9.270870872,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,tropicana,44672,10.70710219,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,minute.maid,10560,9.264828557000001,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,dominicks,10432,9.252633284,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,tropicana,20096,9.908276069,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,dominicks,8320,9.026417534,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,minute.maid,4224,8.348537825,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,dominicks,6784,8.822322178,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,minute.maid,2880,7.965545572999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,tropicana,9152,9.121727714,1,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,tropicana,21952,9.996613531,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,minute.maid,3968,8.286017467999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,dominicks,60544,11.01112565,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,dominicks,20608,9.933434629,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,minute.maid,3712,8.219326094,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,tropicana,13568,9.515469357999999,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,tropicana,4160,8.333270353,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,minute.maid,2816,7.943072717000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,dominicks,16064,9.684336023,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,tropicana,39424,10.58213005,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,minute.maid,4288,8.363575702999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,dominicks,12480,9.431882642,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,tropicana,5632,8.636219898,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,minute.maid,18240,9.811372264,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,dominicks,17024,9.742379392,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,dominicks,13440,9.505990614,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,minute.maid,7360,8.903815212,0,1.95,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,tropicana,9024,9.107642974,1,2.68,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,tropicana,6016,8.702177866,0,3.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,minute.maid,7808,8.962904128,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,dominicks,10112,9.221478116,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,dominicks,9088,9.114710141,0,1.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,minute.maid,13504,9.510741217,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,tropicana,7744,8.954673629,0,3.14,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,tropicana,6784,8.822322178,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,dominicks,22848,10.03661887,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,minute.maid,10048,9.215128888999999,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,dominicks,6976,8.850230966,0,1.65,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,minute.maid,135936,11.81993947,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,tropicana,6784,8.822322178,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,tropicana,6272,8.743850562,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,dominicks,4160,8.333270353,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,tropicana,5312,8.577723691000001,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,minute.maid,27968,10.23881628,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,dominicks,3328,8.110126802,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,tropicana,9216,9.128696383,0,3.11,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,dominicks,12096,9.400630097999999,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,tropicana,7296,8.895081532,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,minute.maid,7808,8.962904128,0,2.14,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,dominicks,6208,8.733594062,0,1.76,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,tropicana,34240,10.44114983,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,minute.maid,12480,9.431882642,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,dominicks,3008,8.009030685,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,dominicks,19456,9.875910785,1,1.5,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,minute.maid,9664,9.17616292,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,tropicana,7168,8.877381955,0,2.64,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,minute.maid,7168,8.877381955,0,2.06,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,dominicks,16768,9.727227587,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,tropicana,6080,8.712759975,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,dominicks,13568,9.515469357999999,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,tropicana,5120,8.540909718,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,tropicana,8320,9.026417534,0,2.74,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,minute.maid,5952,8.691482577,0,2.22,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,dominicks,6080,8.712759975,0,1.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,dominicks,10432,9.252633284,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,minute.maid,21696,9.984883191,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,tropicana,17728,9.78290059,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,minute.maid,12032,9.395325046,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,dominicks,11712,9.368369236,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,tropicana,13120,9.481893063,0,2.35,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,dominicks,4032,8.30201781,0,1.76,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,minute.maid,7040,8.859363449,0,2.12,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,tropicana,13120,9.481893063,0,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,dominicks,6336,8.754002933999999,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,tropicana,9792,9.189321005,0,2.43,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,minute.maid,10240,9.234056899,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,tropicana,3520,8.166216269,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,minute.maid,6848,8.831711918,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,dominicks,13632,9.520175249,0,1.47,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,tropicana,5504,8.61323038,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,minute.maid,3968,8.286017467999999,0,2.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,dominicks,45120,10.71708089,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,tropicana,6720,8.812843434,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,minute.maid,5888,8.68067166,0,2.26,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,dominicks,9984,9.208739091,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,tropicana,20224,9.914625297,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,dominicks,4800,8.476371197,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,minute.maid,6208,8.733594062,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,dominicks,11776,9.373818841,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,minute.maid,72256,11.18797065,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,tropicana,5056,8.528330936,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,tropicana,43584,10.68244539,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,minute.maid,11520,9.351839934,0,2.11,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,dominicks,11584,9.357380115,0,1.54,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,tropicana,25728,10.15533517,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,minute.maid,5824,8.66974259,0,2.35,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,dominicks,51264,10.84474403,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,tropicana,31808,10.36747311,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,minute.maid,19392,9.872615889,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,dominicks,14976,9.614204199,0,1.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,tropicana,20736,9.939626599,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,minute.maid,9536,9.162829389,0,2.1,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,dominicks,30784,10.33475035,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,tropicana,15168,9.626943225,0,2.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,minute.maid,5312,8.577723691000001,0,2.28,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,dominicks,12480,9.431882642,0,1.6,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,tropicana,28096,10.2433825,1,2.5,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,dominicks,3264,8.090708716,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,minute.maid,14528,9.583833101,1,1.9,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,dominicks,8768,9.078864009,0,1.48,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,minute.maid,12416,9.426741242,0,2.12,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,tropicana,12416,9.426741242,0,2.58,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,tropicana,5376,8.589699882,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,minute.maid,5376,8.589699882,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,dominicks,70848,11.16829202,1,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,tropicana,9792,9.189321005,0,2.67,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,minute.maid,19008,9.852615222,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,dominicks,18560,9.828764006,0,1.42,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,tropicana,16960,9.738612909,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,minute.maid,3904,8.269756948,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,dominicks,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,tropicana,8320,9.026417534,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,minute.maid,6336,8.754002933999999,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,dominicks,9600,9.169518378,0,2.0,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,tropicana,6912,8.841014311,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,minute.maid,5440,8.60153434,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,dominicks,4800,8.476371197,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,tropicana,6976,8.850230966,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,minute.maid,22400,10.01681624,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,dominicks,9664,9.17616292,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,minute.maid,3968,8.286017467999999,0,2.84,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,tropicana,7232,8.886270902,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,dominicks,45568,10.726961,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,tropicana,51520,10.84972536,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,minute.maid,3264,8.090708716,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,dominicks,20992,9.951896692,0,1.74,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,minute.maid,4352,8.378390789,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,tropicana,22272,10.01108556,0,2.21,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,dominicks,6592,8.793612072,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,dominicks,4992,8.51559191,0,2.05,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,minute.maid,4480,8.407378325,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,tropicana,46144,10.73952222,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,tropicana,4352,8.378390789,1,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,minute.maid,3840,8.253227646000001,0,2.52,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,dominicks,8064,8.99516499,0,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,tropicana,17280,9.757305042,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,minute.maid,13312,9.496421162999999,1,2.0,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,dominicks,7360,8.903815212,0,1.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,tropicana,5696,8.647519453,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,minute.maid,3776,8.236420527,1,2.33,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,dominicks,10048,9.215128888999999,0,1.4,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,tropicana,6848,8.831711918,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,dominicks,10112,9.221478116,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,minute.maid,4800,8.476371197,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,dominicks,9152,9.121727714,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,minute.maid,24960,10.12502982,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,tropicana,4416,8.392989587999999,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,tropicana,4672,8.449342525,0,3.16,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,minute.maid,4544,8.42156296,0,2.86,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,dominicks,36288,10.49924239,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,tropicana,7168,8.877381955,1,3.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,minute.maid,3968,8.286017467999999,1,2.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,dominicks,3776,8.236420527,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,tropicana,5056,8.528330936,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,dominicks,3328,8.110126802,0,1.97,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,minute.maid,49600,10.81174611,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,2,dominicks,13824,9.534161491,0,1.36,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,5,dominicks,1792,7.491087594,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,5,minute.maid,4224,8.348537825,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,5,tropicana,5888,8.68067166,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,minute.maid,4352,8.378390789,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,dominicks,2496,7.82244473,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,tropicana,6976,8.850230966,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,dominicks,2944,7.98752448,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,minute.maid,4928,8.502688505,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,tropicana,6528,8.783855897,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,dominicks,1024,6.931471806,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,minute.maid,31168,10.34714721,1,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,tropicana,4928,8.502688505,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,dominicks,4224,8.348537825,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,minute.maid,10048,9.215128888999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,tropicana,5312,8.577723691000001,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,minute.maid,21760,9.987828701,1,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,tropicana,5120,8.540909718,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,dominicks,4544,8.42156296,1,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,dominicks,1728,7.454719948999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,minute.maid,4544,8.42156296,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,tropicana,7936,8.979164649,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,tropicana,6080,8.712759975,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,minute.maid,52224,10.86329744,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,dominicks,1216,7.103322062999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,dominicks,1152,7.049254841000001,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,minute.maid,3584,8.184234774,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,tropicana,4160,8.333270353,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,minute.maid,5120,8.540909718,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,tropicana,5888,8.68067166,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,dominicks,30144,10.31374118,1,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,dominicks,8960,9.100525506,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,minute.maid,4416,8.392989587999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,tropicana,9536,9.162829389,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,tropicana,8320,9.026417534,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,dominicks,8192,9.010913347,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,minute.maid,30208,10.31586207,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,dominicks,6528,8.783855897,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,minute.maid,4160,8.333270353,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,tropicana,8000,8.987196821,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,dominicks,34688,10.45414909,1,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,minute.maid,4992,8.51559191,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,tropicana,5824,8.66974259,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,dominicks,4672,8.449342525,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,minute.maid,13952,9.543378146,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,tropicana,10624,9.270870872,1,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,tropicana,6656,8.803273982999999,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,dominicks,1088,6.992096427000001,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,minute.maid,47680,10.772267300000001,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,tropicana,5184,8.553332238,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,minute.maid,7616,8.938006577000001,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,dominicks,69440,11.14821835,1,1.24,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,tropicana,4928,8.502688505,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,minute.maid,8896,9.093357017,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,dominicks,1280,7.154615357000001,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,tropicana,5888,8.68067166,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,minute.maid,28544,10.25920204,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,dominicks,35456,10.47604777,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,tropicana,5312,8.577723691000001,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,dominicks,13824,9.534161491,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,minute.maid,5440,8.60153434,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,tropicana,9984,9.208739091,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,minute.maid,52416,10.86696717,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,dominicks,14208,9.561560465,0,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,tropicana,8448,9.041685006,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,dominicks,29312,10.28575227,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,minute.maid,11712,9.368369236,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,tropicana,10880,9.29468152,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,minute.maid,13952,9.543378146,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,dominicks,52992,10.87789624,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,dominicks,15680,9.660141293999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,minute.maid,36160,10.49570882,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,tropicana,5696,8.647519453,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,tropicana,5696,8.647519453,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,minute.maid,12864,9.462187991,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,dominicks,43520,10.68097588,1,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,tropicana,32384,10.38541975,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,minute.maid,22208,10.00820786,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,dominicks,3904,8.269756948,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,tropicana,10752,9.282847063,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,minute.maid,9984,9.208739091,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,dominicks,896,6.797940412999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,tropicana,6912,8.841014311,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,minute.maid,14016,9.547954812999999,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,dominicks,2240,7.714231145,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,tropicana,13440,9.505990614,0,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,minute.maid,6080,8.712759975,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,dominicks,125760,11.74213061,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,tropicana,7808,8.962904128,0,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,minute.maid,7808,8.962904128,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,dominicks,1408,7.249925537,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,tropicana,5248,8.565602331000001,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,minute.maid,40896,10.61878754,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,dominicks,7232,8.886270902,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,tropicana,6208,8.733594062,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,minute.maid,6272,8.743850562,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,dominicks,41216,10.62658181,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,tropicana,21440,9.973013615,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,minute.maid,7872,8.971067439,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,dominicks,9024,9.107642974,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,dominicks,1600,7.377758908,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,tropicana,7360,8.903815212,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,minute.maid,6144,8.723231275,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,tropicana,6720,8.812843434,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,minute.maid,8448,9.041685006,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,dominicks,2496,7.82244473,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,tropicana,6656,8.803273982999999,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,minute.maid,18688,9.835636886,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,dominicks,6336,8.754002933999999,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,tropicana,6016,8.702177866,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,minute.maid,6272,8.743850562,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,dominicks,56384,10.93994071,1,1.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,tropicana,6144,8.723231275,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,minute.maid,12096,9.400630097999999,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,dominicks,1600,7.377758908,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,tropicana,4928,8.502688505,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,minute.maid,73216,11.20116926,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,dominicks,2944,7.98752448,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,tropicana,67712,11.1230187,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,minute.maid,18944,9.849242538,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,dominicks,13504,9.510741217,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,dominicks,5376,8.589699882,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,tropicana,8640,9.064157862,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,minute.maid,6400,8.764053269,1,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,tropicana,35520,10.477851199999998,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,minute.maid,8640,9.064157862,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,dominicks,6656,8.803273982999999,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,tropicana,9664,9.17616292,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,minute.maid,7296,8.895081532,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,dominicks,95680,11.46876457,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,tropicana,49088,10.80136989,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,minute.maid,12480,9.431882642,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,dominicks,896,6.797940412999999,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,dominicks,1728,7.454719948999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,minute.maid,14144,9.557045785,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,tropicana,14912,9.609921537,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,minute.maid,88256,11.38799696,1,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,tropicana,6464,8.774003599999999,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,dominicks,1280,7.154615357000001,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,dominicks,5696,8.647519453,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,minute.maid,6848,8.831711918,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,tropicana,25024,10.12759064,1,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,minute.maid,7808,8.962904128,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,tropicana,6272,8.743850562,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,dominicks,28288,10.25019297,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,dominicks,4864,8.489616424,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,minute.maid,6272,8.743850562,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,tropicana,5056,8.528330936,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,minute.maid,6144,8.723231275,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,tropicana,47616,10.77092412,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,dominicks,2880,7.965545572999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,dominicks,5760,8.658692754,1,1.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,minute.maid,27776,10.23192762,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,tropicana,13888,9.538780437,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,tropicana,6144,8.723231275,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,minute.maid,20800,9.942708266,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,dominicks,15040,9.618468598,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,dominicks,5120,8.540909718,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,minute.maid,45696,10.72976605,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,tropicana,9344,9.142489705,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,minute.maid,14336,9.570529135,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,tropicana,32896,10.40110635,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,dominicks,3264,8.090708716,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,dominicks,9536,9.162829389,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,minute.maid,4928,8.502688505,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,tropicana,21056,9.954940834,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,tropicana,15360,9.639522007,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,minute.maid,4608,8.435549202,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,dominicks,6208,8.733594062,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,dominicks,6592,8.793612072,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,tropicana,8000,8.987196821,1,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,minute.maid,5248,8.565602331000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,tropicana,21120,9.957975738,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,dominicks,63552,11.05961375,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,minute.maid,4224,8.348537825,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,dominicks,27968,10.23881628,0,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,minute.maid,4288,8.363575702999999,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,tropicana,11904,9.384629757,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,minute.maid,16896,9.734832187,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,tropicana,5056,8.528330936,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,dominicks,21760,9.987828701,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,dominicks,2688,7.896552702,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,minute.maid,77184,11.25394746,1,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,tropicana,4608,8.435549202,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,tropicana,6016,8.702177866,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,minute.maid,5184,8.553332238,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,dominicks,10432,9.252633284,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,tropicana,50752,10.83470631,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,minute.maid,5248,8.565602331000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,dominicks,9792,9.189321005,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,minute.maid,20672,9.936535407000001,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,tropicana,5632,8.636219898,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,dominicks,8448,9.041685006,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,tropicana,6400,8.764053269,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,dominicks,6912,8.841014311,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,minute.maid,12352,9.421573272,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,dominicks,8256,9.018695487999999,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,minute.maid,12032,9.395325046,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,tropicana,5440,8.60153434,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,minute.maid,13440,9.505990614,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,dominicks,28672,10.26367632,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,tropicana,8128,9.00307017,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,tropicana,7232,8.886270902,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,minute.maid,5824,8.66974259,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,dominicks,4416,8.392989587999999,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,tropicana,7168,8.877381955,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,minute.maid,50112,10.82201578,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,dominicks,1856,7.526178913,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,minute.maid,5184,8.553332238,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,tropicana,7872,8.971067439,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,dominicks,6528,8.783855897,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,tropicana,7552,8.929567707999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,minute.maid,8384,9.034080407000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,dominicks,6080,8.712759975,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,tropicana,69504,11.14913958,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,dominicks,3456,8.14786713,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,minute.maid,10112,9.221478116,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,dominicks,25856,10.16029796,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,minute.maid,8384,9.034080407000001,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,tropicana,8960,9.100525506,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,tropicana,6912,8.841014311,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,dominicks,25728,10.15533517,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,minute.maid,11456,9.346268889,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,dominicks,23552,10.06696602,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,minute.maid,5952,8.691482577,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,tropicana,6656,8.803273982999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,tropicana,8192,9.010913347,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,dominicks,2944,7.98752448,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,minute.maid,8512,9.049232212,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,dominicks,5888,8.68067166,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,minute.maid,27968,10.23881628,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,tropicana,13440,9.505990614,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,tropicana,12160,9.405907156,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,dominicks,6848,8.831711918,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,minute.maid,24000,10.08580911,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,dominicks,1792,7.491087594,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,minute.maid,6848,8.831711918,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,tropicana,11840,9.379238908,0,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,tropicana,8640,9.064157862,0,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,dominicks,5248,8.565602331000001,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,minute.maid,15104,9.622714887999999,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,tropicana,5888,8.68067166,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,minute.maid,11392,9.340666634,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,dominicks,16768,9.727227587,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,tropicana,7424,8.912473275,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,minute.maid,5824,8.66974259,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,dominicks,52160,10.8620712,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,tropicana,5632,8.636219898,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,minute.maid,7488,8.921057017999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,dominicks,16640,9.719564714,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,tropicana,33600,10.42228135,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,minute.maid,8320,9.026417534,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,dominicks,1344,7.2034055210000005,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,dominicks,4608,8.435549202,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,tropicana,5376,8.589699882,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,minute.maid,99904,11.511965,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,tropicana,54272,10.90176372,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,minute.maid,6976,8.850230966,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,dominicks,12672,9.447150114,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,tropicana,33600,10.42228135,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,minute.maid,9984,9.208739091,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,dominicks,48640,10.79220152,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,tropicana,24448,10.10430369,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,minute.maid,32832,10.39915893,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,dominicks,13248,9.491601877,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,tropicana,22784,10.03381381,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,minute.maid,8128,9.00307017,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,dominicks,29248,10.28356647,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,tropicana,19008,9.852615222,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,minute.maid,6464,8.774003599999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,dominicks,4608,8.435549202,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,tropicana,15808,9.66827142,1,2.5,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,minute.maid,36800,10.51325312,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,dominicks,3136,8.050703382,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,dominicks,13184,9.486759252,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,tropicana,14144,9.557045785,0,2.5,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,minute.maid,12928,9.467150781,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,tropicana,9600,9.169518378,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,minute.maid,7424,8.912473275,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,dominicks,67712,11.1230187,1,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,tropicana,10112,9.221478116,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,minute.maid,34176,10.43927892,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,dominicks,18880,9.84585844,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,minute.maid,4160,8.333270353,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,tropicana,31872,10.36948316,1,2.24,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,dominicks,6208,8.733594062,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,tropicana,9280,9.135616826,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,minute.maid,5952,8.691482577,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,dominicks,5952,8.691482577,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,tropicana,7680,8.946374826,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,minute.maid,6528,8.783855897,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,dominicks,4160,8.333270353,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,tropicana,8704,9.071537969,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,minute.maid,30656,10.33058368,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,dominicks,23488,10.06424493,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,tropicana,9920,9.2023082,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,dominicks,60480,11.01006801,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,minute.maid,6656,8.803273982999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,tropicana,91968,11.42919597,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,minute.maid,4416,8.392989587999999,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,dominicks,20416,9.924074186,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,tropicana,44096,10.69412435,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,dominicks,6336,8.754002933999999,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,minute.maid,5696,8.647519453,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,minute.maid,5696,8.647519453,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,tropicana,7296,8.895081532,1,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,dominicks,1408,7.249925537,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,tropicana,12928,9.467150781,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,minute.maid,39680,10.58860256,1,2.01,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,dominicks,4672,8.449342525,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,tropicana,6848,8.831711918,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,minute.maid,6208,8.733594062,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,dominicks,19520,9.87919486,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,tropicana,8064,8.99516499,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,minute.maid,7872,8.971067439,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,dominicks,7872,8.971067439,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,dominicks,5184,8.553332238,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,tropicana,4992,8.51559191,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,minute.maid,54528,10.90646961,1,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,tropicana,7360,8.903815212,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,minute.maid,6400,8.764053269,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,dominicks,42240,10.65112292,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,tropicana,8384,9.034080407000001,1,2.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,minute.maid,5888,8.68067166,1,2.65,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,dominicks,6592,8.793612072,1,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,tropicana,8832,9.086136769,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,minute.maid,56384,10.93994071,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,dominicks,2112,7.655390645,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,5,dominicks,21248,9.964018052,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,8,dominicks,14336,9.570529135,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-14,8,minute.maid,6080,8.712759975,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-14,8,tropicana,8896,9.093357017,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,dominicks,6400,8.764053269,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,minute.maid,51968,10.85838342,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,tropicana,7296,8.895081532,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,tropicana,10368,9.246479419,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,minute.maid,4928,8.502688505,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,dominicks,3968,8.286017467999999,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,dominicks,4352,8.378390789,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,minute.maid,5312,8.577723691000001,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,tropicana,6976,8.850230966,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,tropicana,6464,8.774003599999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,dominicks,3520,8.166216269,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,minute.maid,39424,10.58213005,1,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,tropicana,8192,9.010913347,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,dominicks,6464,8.774003599999999,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,minute.maid,5568,8.624791202,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,dominicks,5952,8.691482577,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,minute.maid,14592,9.588228712000001,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,tropicana,7936,8.979164649,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,tropicana,6656,8.803273982999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,minute.maid,22208,10.00820786,1,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,dominicks,8832,9.086136769,1,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,dominicks,7232,8.886270902,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,minute.maid,5760,8.658692754,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,tropicana,8256,9.018695487999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,tropicana,5568,8.624791202,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,minute.maid,54016,10.89703558,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,dominicks,5504,8.61323038,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,dominicks,4800,8.476371197,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,minute.maid,5824,8.66974259,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,tropicana,7488,8.921057017999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,tropicana,6144,8.723231275,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,minute.maid,6528,8.783855897,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,dominicks,52672,10.87183928,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,dominicks,16448,9.707959168,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,minute.maid,5440,8.60153434,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,tropicana,11008,9.30637756,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,minute.maid,36544,10.50627229,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,dominicks,19072,9.85597657,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,tropicana,5760,8.658692754,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,dominicks,13376,9.501217335,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,minute.maid,3776,8.236420527,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,tropicana,10112,9.221478116,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,tropicana,8448,9.041685006,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,minute.maid,5504,8.61323038,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,dominicks,61440,11.02581637,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,tropicana,8448,9.041685006,1,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,dominicks,13760,9.529521112000001,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,minute.maid,12416,9.426741242,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,minute.maid,53696,10.89109379,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,dominicks,3136,8.050703382,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,tropicana,7424,8.912473275,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,tropicana,5824,8.66974259,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,minute.maid,5696,8.647519453,0,2.51,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,dominicks,186176,12.13444774,1,1.14,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,tropicana,6656,8.803273982999999,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,minute.maid,4864,8.489616424,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,dominicks,3712,8.219326094,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,tropicana,6272,8.743850562,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,minute.maid,37184,10.52363384,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,dominicks,35776,10.48503256,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,tropicana,6912,8.841014311,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,minute.maid,5504,8.61323038,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,dominicks,26880,10.1991378,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,tropicana,10496,9.258749511,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,minute.maid,51008,10.83973776,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,dominicks,71680,11.17996705,0,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,tropicana,11840,9.379238908,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,minute.maid,11072,9.312174678,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,dominicks,25088,10.13014492,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,tropicana,9664,9.17616292,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,minute.maid,12160,9.405907156,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,dominicks,91456,11.42361326,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,minute.maid,30528,10.32639957,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,dominicks,23808,10.07777694,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,tropicana,6272,8.743850562,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,dominicks,89856,11.40596367,1,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,minute.maid,12096,9.400630097999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,tropicana,7168,8.877381955,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,minute.maid,16448,9.707959168,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,dominicks,12224,9.411156511,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,tropicana,29504,10.29228113,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,minute.maid,9344,9.142489705,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,dominicks,3776,8.236420527,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,tropicana,8704,9.071537969,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,tropicana,9280,9.135616826,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,minute.maid,16128,9.688312171,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,dominicks,13824,9.534161491,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,minute.maid,5376,8.589699882,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,dominicks,251072,12.43349503,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,tropicana,12224,9.411156511,0,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,minute.maid,6656,8.803273982999999,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,tropicana,10368,9.246479419,0,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,dominicks,4864,8.489616424,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,minute.maid,59712,10.99728828,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,dominicks,10176,9.227787286,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,tropicana,8128,9.00307017,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,tropicana,5952,8.691482577,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,minute.maid,9856,9.195835686,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,dominicks,105344,11.56498647,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,minute.maid,6720,8.812843434,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,dominicks,33600,10.42228135,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,tropicana,21696,9.984883191,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,dominicks,4736,8.462948177000001,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,minute.maid,4224,8.348537825,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,tropicana,7808,8.962904128,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,tropicana,8128,9.00307017,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,minute.maid,9728,9.182763604,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,dominicks,10304,9.240287448,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,tropicana,7424,8.912473275,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,minute.maid,40320,10.604602900000001,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,dominicks,5056,8.528330936,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,dominicks,179968,12.10053434,1,0.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,tropicana,5952,8.691482577,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,minute.maid,5120,8.540909718,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,minute.maid,19264,9.865993348,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,dominicks,4992,8.51559191,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,tropicana,7616,8.938006577000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,tropicana,5312,8.577723691000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,minute.maid,170432,12.04609167,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,dominicks,6400,8.764053269,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,minute.maid,39680,10.58860256,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,dominicks,14912,9.609921537,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,tropicana,161792,11.99406684,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,dominicks,34624,10.45230236,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,minute.maid,8128,9.00307017,1,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,tropicana,17280,9.757305042,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,tropicana,47040,10.75875358,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,minute.maid,9088,9.114710141,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,dominicks,10368,9.246479419,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,tropicana,14464,9.579418083,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,minute.maid,6720,8.812843434,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,dominicks,194880,12.18013926,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,tropicana,52928,10.87668778,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,dominicks,5696,8.647519453,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,minute.maid,7552,8.929567707999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,dominicks,7168,8.877381955,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,minute.maid,24768,10.11730778,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,tropicana,21184,9.961001459,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,tropicana,7360,8.903815212,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,minute.maid,183296,12.11885761,1,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,dominicks,2880,7.965545572999999,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,dominicks,12288,9.416378455,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,minute.maid,8896,9.093357017,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,tropicana,15744,9.664214619,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,dominicks,9280,9.135616826,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,tropicana,46912,10.75602879,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,minute.maid,6656,8.803273982999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,tropicana,18240,9.811372264,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,dominicks,25856,10.16029796,1,1.26,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,minute.maid,35456,10.47604777,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,dominicks,19264,9.865993348,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,minute.maid,17408,9.76468515,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,tropicana,6464,8.774003599999999,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,dominicks,6848,8.831711918,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,minute.maid,75520,11.2321528,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,tropicana,8512,9.049232212,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,tropicana,28416,10.25470765,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,minute.maid,21632,9.981928979,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,dominicks,12928,9.467150781,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,dominicks,44032,10.69267192,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,minute.maid,8384,9.034080407000001,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,tropicana,16960,9.738612909,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,minute.maid,9920,9.2023082,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,dominicks,25408,10.14281936,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-07-18,8,tropicana,8320,9.026417534,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-07-25,8,dominicks,38336,10.55414468,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-07-25,8,minute.maid,6592,8.793612072,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-07-25,8,tropicana,11136,9.317938383,1,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-01,8,tropicana,27712,10.22962081,0,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-01,8,minute.maid,7168,8.877381955,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-01,8,dominicks,152384,11.93415893,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-08,8,dominicks,54464,10.90529521,0,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-08,8,minute.maid,6208,8.733594062,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-08,8,tropicana,7744,8.954673629,0,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-15,8,minute.maid,30528,10.32639957,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-15,8,dominicks,47680,10.772267300000001,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-15,8,tropicana,5184,8.553332238,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-22,8,dominicks,14720,9.596962392,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-22,8,minute.maid,155840,11.95658512,1,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-22,8,tropicana,6272,8.743850562,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-29,8,tropicana,7744,8.954673629,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-29,8,dominicks,53248,10.88271552,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-08-29,8,minute.maid,10752,9.282847063,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-05,8,tropicana,53184,10.88151288,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-05,8,minute.maid,6976,8.850230966,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-05,8,dominicks,40576,10.61093204,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-12,8,dominicks,25856,10.16029796,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-12,8,tropicana,6784,8.822322178,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-12,8,minute.maid,31872,10.36948316,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-19,8,dominicks,24064,10.08847223,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-19,8,minute.maid,5312,8.577723691000001,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-19,8,tropicana,8000,8.987196821,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-26,8,tropicana,6592,8.793612072,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-26,8,minute.maid,33344,10.41463313,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-09-26,8,dominicks,15680,9.660141293999999,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-03,8,minute.maid,13504,9.510741217,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-03,8,dominicks,16576,9.715711145,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-03,8,tropicana,5248,8.565602331000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-10,8,dominicks,49664,10.8130356,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-10,8,tropicana,6592,8.793612072,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-10,8,minute.maid,13504,9.510741217,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-17,8,dominicks,10752,9.282847063,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-17,8,minute.maid,335808,12.72429485,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-17,8,tropicana,5888,8.68067166,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-24,8,tropicana,6336,8.754002933999999,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-24,8,dominicks,9792,9.189321005,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-24,8,minute.maid,13120,9.481893063,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-31,8,tropicana,5888,8.68067166,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-31,8,minute.maid,49664,10.8130356,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-10-31,8,dominicks,7104,8.868413285,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-07,8,dominicks,9216,9.128696383,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-07,8,tropicana,6080,8.712759975,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-07,8,minute.maid,10880,9.29468152,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-14,8,tropicana,6848,8.831711918,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-14,8,minute.maid,9984,9.208739091,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-14,8,dominicks,12608,9.442086812000001,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-21,8,tropicana,54016,10.89703558,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-21,8,minute.maid,9216,9.128696383,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-21,8,dominicks,16448,9.707959168,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-28,8,tropicana,10368,9.246479419,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-28,8,dominicks,27968,10.23881628,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-11-28,8,minute.maid,7680,8.946374826,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-05,8,minute.maid,7296,8.895081532,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-05,8,dominicks,37824,10.5406991,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-05,8,tropicana,5568,8.624791202,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-12,8,dominicks,33664,10.4241843,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-12,8,minute.maid,8192,9.010913347,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-12,8,tropicana,4864,8.489616424,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-19,8,tropicana,7232,8.886270902,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-19,8,minute.maid,6080,8.712759975,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-19,8,dominicks,17728,9.78290059,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-26,8,tropicana,15232,9.631153757,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-26,8,dominicks,25088,10.13014492,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1991-12-26,8,minute.maid,15040,9.618468598,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-02,8,minute.maid,9472,9.156095357,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-02,8,dominicks,13184,9.486759252,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-02,8,tropicana,47040,10.75875358,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-09,8,dominicks,3136,8.050703382,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-09,8,minute.maid,5888,8.68067166,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-09,8,tropicana,9280,9.135616826,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-16,8,tropicana,6720,8.812843434,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-16,8,minute.maid,14336,9.570529135,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-16,8,dominicks,5696,8.647519453,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-23,8,minute.maid,11712,9.368369236,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-23,8,dominicks,19008,9.852615222,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-23,8,tropicana,5056,8.528330936,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-30,8,minute.maid,7936,8.979164649,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-30,8,dominicks,121664,11.70901843,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-01-30,8,tropicana,6080,8.712759975,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-06,8,tropicana,10496,9.258749511,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-06,8,minute.maid,5184,8.553332238,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-06,8,dominicks,38848,10.56741187,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-13,8,minute.maid,7168,8.877381955,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-13,8,dominicks,6144,8.723231275,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-13,8,tropicana,39040,10.57234204,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-20,8,dominicks,13632,9.520175249,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-20,8,minute.maid,216064,12.28332994,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-20,8,tropicana,4480,8.407378325,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-27,8,tropicana,61760,11.03101119,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-27,8,minute.maid,15040,9.618468598,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-02-27,8,dominicks,9792,9.189321005,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-05,8,tropicana,15360,9.639522007,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-05,8,minute.maid,11840,9.379238908,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-05,8,dominicks,86912,11.37265139,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-12,8,minute.maid,25472,10.14533509,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-12,8,dominicks,24512,10.10691807,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-12,8,tropicana,54976,10.91465201,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-19,8,minute.maid,16384,9.704060528,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-19,8,dominicks,58048,10.96902553,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-19,8,tropicana,34368,10.44488118,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-26,8,tropicana,10752,9.282847063,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-26,8,minute.maid,20480,9.927204079,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-03-26,8,dominicks,13952,9.543378146,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-02,8,minute.maid,34688,10.45414909,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-02,8,dominicks,15168,9.626943225,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-02,8,tropicana,20096,9.908276069,1,2.5,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-09,8,dominicks,14592,9.588228712000001,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-09,8,minute.maid,22400,10.01681624,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-09,8,tropicana,16192,9.692272572,0,2.5,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-16,8,tropicana,6528,8.783855897,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-16,8,minute.maid,7808,8.962904128,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-16,8,dominicks,145088,11.88509573,1,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-23,8,tropicana,8320,9.026417534,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-23,8,minute.maid,48064,10.78028874,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-23,8,dominicks,43712,10.68537794,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-30,8,tropicana,30784,10.33475035,1,2.16,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-30,8,minute.maid,7360,8.903815212,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-04-30,8,dominicks,20608,9.933434629,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-07,8,tropicana,18048,9.800790154,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-07,8,minute.maid,6272,8.743850562,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-07,8,dominicks,18752,9.839055692,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-14,8,tropicana,12864,9.462187991,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-14,8,minute.maid,6400,8.764053269,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-14,8,dominicks,20160,9.911455722000001,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-21,8,tropicana,7168,8.877381955,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-21,8,minute.maid,54592,10.90764263,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-21,8,dominicks,18688,9.835636886,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-28,8,minute.maid,8128,9.00307017,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-28,8,tropicana,9024,9.107642974,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-05-28,8,dominicks,133824,11.80428078,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-04,8,tropicana,84992,11.35031241,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-04,8,minute.maid,4928,8.502688505,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-04,8,dominicks,63488,11.05860619,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-11,8,minute.maid,5440,8.60153434,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-11,8,tropicana,14144,9.557045785,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-11,8,dominicks,71040,11.17099838,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-25,8,tropicana,7488,8.921057017999999,1,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-25,8,minute.maid,5888,8.68067166,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-06-25,8,dominicks,15360,9.639522007,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-02,8,minute.maid,23872,10.0804615,1,2.02,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-02,8,dominicks,17728,9.78290059,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-02,8,tropicana,12352,9.421573272,0,2.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-09,8,tropicana,5696,8.647519453,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-09,8,minute.maid,6848,8.831711918,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-09,8,dominicks,24256,10.09641929,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-16,8,minute.maid,8192,9.010913347,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-16,8,dominicks,19968,9.901886271,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-16,8,tropicana,7680,8.946374826,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-23,8,dominicks,15936,9.67633598,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-23,8,minute.maid,55040,10.91581547,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-23,8,tropicana,5440,8.60153434,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-30,8,tropicana,5632,8.636219898,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-30,8,minute.maid,6528,8.783855897,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-07-30,8,dominicks,76352,11.24310951,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-06,8,tropicana,8960,9.100525506,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-06,8,minute.maid,6208,8.733594062,1,2.45,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-06,8,dominicks,17408,9.76468515,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-13,8,minute.maid,94720,11.45868045,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-13,8,tropicana,6080,8.712759975,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-13,8,dominicks,17536,9.77201119,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
1992-08-20,8,dominicks,31232,10.34919849,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
|
|
||||||
|
@@ -1,66 +0,0 @@
-import argparse
-import json
-
-from azureml.core import Run, Model, Workspace
-from azureml.core.conda_dependencies import CondaDependencies
-from azureml.core.model import InferenceConfig
-from azureml.core.webservice import AciWebservice
-
-
-script_file_name = 'score.py'
-conda_env_file_name = 'myenv.yml'
-
-print("In deploy.py")
-parser = argparse.ArgumentParser()
-parser.add_argument("--time_column_name", type=str, help="time column name")
-parser.add_argument("--group_column_names", type=str, help="group column names")
-parser.add_argument("--model_names", type=str, help="model names")
-parser.add_argument("--service_name", type=str, help="service name")
-
-args = parser.parse_args()
-
-# replace the group column names in scoring script to the ones set by user
-print("Update group_column_names")
-print(args.group_column_names)
-
-with open(script_file_name, 'r') as cefr:
-    content = cefr.read()
-with open(script_file_name, 'w') as cefw:
-    content = content.replace('<<groups>>', args.group_column_names.rstrip())
-    cefw.write(content.replace('<<time_colname>>', args.time_column_name.rstrip()))
-
-with open(script_file_name, 'r') as cefr1:
-    content1 = cefr1.read()
-print(content1)
-
-model_list = json.loads(args.model_names)
-print(model_list)
-
-run = Run.get_context()
-ws = run.experiment.workspace
-
-deployment_config = AciWebservice.deploy_configuration(
-    cpu_cores=1,
-    memory_gb=2,
-    tags={"method": "grouping"},
-    description='grouping demo aci deployment'
-)
-
-inference_config = InferenceConfig(
-    entry_script=script_file_name,
-    runtime='python',
-    conda_file=conda_env_file_name
-)
-
-models = []
-for model_name in model_list:
-    models.append(Model(ws, name=model_name))
-
-service = Model.deploy(
-    ws,
-    name=args.service_name,
-    models=models,
-    inference_config=inference_config,
-    deployment_config=deployment_config
-)
-service.wait_for_deployment(True)
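For context on the deleted `deploy.py` above: it was written to run as a remote script step, receiving the time column name, the `#####`-joined group column names, and a JSON list of registered model names on the command line. Below is a hedged sketch of how such a step might have been wired up, assuming `PythonScriptStep` from `azureml-pipeline`; the compute target, source directory, and argument values are illustrative assumptions and are not part of this diff.

```python
# Hypothetical wiring only: the compute target, source directory, and
# argument values below are assumptions for illustration.
import json

from azureml.core import Experiment, Workspace
from azureml.pipeline.core import Pipeline
from azureml.pipeline.steps import PythonScriptStep

ws = Workspace.from_config()

deploy_step = PythonScriptStep(
    name="deploy grouped models",
    script_name="deploy.py",
    source_directory=".",  # folder holding deploy.py, score.py, and myenv.yml
    arguments=["--time_column_name", "WeekStarting",       # assumed column name
               "--group_column_names", "Store#####Brand",  # '#####' is the separator score.py splits on
               "--model_names", json.dumps(["dominicks8", "tropicana8"]),  # hypothetical registered names
               "--service_name", "grouping-demo-aci"],
    compute_target="cpu-cluster",  # assumed existing compute
    allow_reuse=False,
)

pipeline = Pipeline(workspace=ws, steps=[deploy_step])
Experiment(ws, "grouping-deploy").submit(pipeline)
```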
@@ -1,11 +0,0 @@
-name: automl_grouping_env
-dependencies:
-  # The python interpreter version.
-
-  # Currently Azure ML only supports 3.5.2 and later.
-
-  - python=3.6.2
-  - numpy>=1.16.0,<=1.16.2
-  - scikit-learn>=0.19.0,<=0.20.3
-  - conda-forge::fbprophet==0.5
-
@@ -1,55 +0,0 @@
-import json
-import pickle
-import re
-
-import numpy as np
-import pandas as pd
-from sklearn.externals import joblib
-from sklearn.linear_model import Ridge
-
-from azureml.core.model import Model
-import azureml.train.automl
-
-
-def init():
-    global models
-    models = {}
-    global group_columns_str
-    group_columns_str = "<<groups>>"
-    global time_column_name
-    time_column_name = "<<time_colname>>"
-
-    global group_columns
-    group_columns = group_columns_str.split("#####")
-    global valid_chars
-    valid_chars = re.compile('[^a-zA-Z0-9-]')
-
-
-def run(raw_data):
-    try:
-        data = pd.read_json(raw_data)
-        # Make sure we have correct time points.
-        data[time_column_name] = pd.to_datetime(data[time_column_name], unit='ms')
-        dfs = []
-        for grain, df_one in data.groupby(group_columns):
-            if isinstance(grain, int):
-                cur_group = str(grain)
-            elif isinstance(grain, str):
-                cur_group = grain
-            else:
-                cur_group = "#####".join(list(grain))
-            cur_group = valid_chars.sub('', cur_group)
-            print("Query model for group {}".format(cur_group))
-            if cur_group not in models:
-                model_path = Model.get_model_path(cur_group)
-                model = joblib.load(model_path)
-                models[cur_group] = model
-            _, xtrans = models[cur_group].forecast(df_one, np.repeat(np.nan, len(df_one)))
-            dfs.append(xtrans)
-        df_ret = pd.concat(dfs)
-        df_ret.reset_index(drop=False, inplace=True)
-        return json.dumps({'predictions': df_ret.to_json()})
-
-    except Exception as e:
-        error = str(e)
-        return error
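The `run()` function above parses the request with `pd.read_json` and expects the time column in epoch milliseconds, which is what pandas' default `to_json` serialization produces. A hedged sketch of a client call against the ACI service this scoring script would back follows; the service name and feature columns are illustrative assumptions.

```python
# Hypothetical client call; the service name and input columns are assumptions.
import pandas as pd
from azureml.core import Workspace
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()
service = AciWebservice(ws, "grouping-demo-aci")

X_query = pd.DataFrame({
    "WeekStarting": pd.to_datetime(["1992-08-27", "1992-08-27"]),
    "Store": [8, 8],
    "Brand": ["dominicks", "tropicana"],
})
# DataFrame.to_json() writes datetimes as epoch milliseconds by default,
# matching the unit='ms' conversion in score.py's run().
print(service.run(X_query.to_json()))
```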
@@ -1,22 +0,0 @@
-import argparse
-
-from azureml.core import Run, Model
-
-parser = argparse.ArgumentParser()
-parser.add_argument("--model_name")
-parser.add_argument("--model_path")
-
-args = parser.parse_args()
-
-run = Run.get_context()
-ws = run.experiment.workspace
-print('retrieved ws: {}'.format(ws))
-
-print('begin register model')
-model = Model.register(
-    workspace=ws,
-    model_path=args.model_path,
-    model_name=args.model_name
-)
-print('model registered: {}'.format(model))
-print('complete')
@@ -335,7 +335,7 @@
 "automl_config = AutoMLConfig(task='forecasting',\n",
 " debug_log='automl_forecasting_function.log',\n",
 " primary_metric='normalized_root_mean_squared_error',\n",
-" experiment_timeout_minutes=15,\n",
+" experiment_timeout_hours=0.25,\n",
 " enable_early_stopping=True,\n",
 " training_data=train_data,\n",
 " compute_target=compute_target,\n",
@@ -377,9 +377,7 @@
 "\n",
 "\n",
 "\n",
-"The `X_test` and `y_query` below, taken together, form the **forecast request**. The two are interpreted as aligned - `y_query` could actally be a column in `X_test`. `NaN`s in `y_query` are the question marks. These will be filled with the forecasts.\n",
-"\n",
-"When the forecast period immediately follows the training period, the models retain the last few points of data. You can simply fill `y_query` filled with question marks - the model has the data for the lookback already.\n"
+"We use `X_test` as a **forecast request** to generate the predictions."
 ]
 },
 {
@@ -408,8 +406,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"y_query = np.repeat(np.NaN, X_test.shape[0])\n",
-"y_pred_no_gap, xy_nogap = fitted_model.forecast(X_test, y_query)\n",
+"y_pred_no_gap, xy_nogap = fitted_model.forecast(X_test)\n",
 "\n",
 "# xy_nogap contains the predictions in the _automl_target_col column.\n",
 "# Those same numbers are output in y_pred_no_gap\n",
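The recurring change in these notebook hunks is an API simplification: when the forecast period immediately follows the training period, `forecast()` no longer needs an explicit `y_query` of `NaN`s. A minimal sketch of both calling conventions, assuming `fitted_model` and `X_test` as defined in the notebook:

```python
import numpy as np

# Old convention (removed in this diff): pass an all-NaN y_query explicitly.
y_query = np.repeat(np.nan, X_test.shape[0])
y_pred_old, xy_old = fitted_model.forecast(X_test, y_query)

# New convention: the NaN query is implicit when forecasting
# directly after the training period.
y_pred_new, xy_new = fitted_model.forecast(X_test)
```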
@@ -437,7 +434,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"quantiles = fitted_model.forecast_quantiles(X_test, y_query)\n",
+"quantiles = fitted_model.forecast_quantiles(X_test)\n",
 "quantiles"
 ]
 },
@@ -460,10 +457,10 @@
 "# specify which quantiles you would like \n",
 "fitted_model.quantiles = [0.01, 0.5, 0.95]\n",
 "# use forecast_quantiles function, not the forecast() one\n",
-"y_pred_quantiles = fitted_model.forecast_quantiles(X_test, y_query)\n",
+"y_pred_quantiles = fitted_model.forecast_quantiles(X_test)\n",
 "\n",
-"# it all nicely aligns column-wise\n",
-"pd.concat([X_test.reset_index(), pd.DataFrame({'query' : y_query}), y_pred_quantiles], axis=1)"
+"# quantile forecasts returned in a Dataframe along with the time and grain columns \n",
+"y_pred_quantiles"
 ]
 },
 {
@@ -539,9 +536,7 @@
 "outputs": [],
 "source": [
 "try: \n",
-" y_query = y_away.copy()\n",
-" y_query.fill(np.NaN)\n",
-" y_pred_away, xy_away = fitted_model.forecast(X_away, y_query)\n",
+" y_pred_away, xy_away = fitted_model.forecast(X_away)\n",
 " xy_away\n",
 "except Exception as e:\n",
 " print(e)"
@@ -551,7 +546,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"How should we read that eror message? The forecast origin is at the last time the model saw an actual value of `y` (the target). That was at the end of the training data! Because the model received all `NaN` (and not an actual target value), it is attempting to forecast from the end of training data. But the requested forecast periods are past the maximum horizon. We need to provide a define `y` value to establish the forecast origin.\n",
+"How should we read that error message? The forecast origin is at the last time the model saw an actual value of `y` (the target). That was at the end of the training data! The model is attempting to forecast from the end of the training data, but the requested forecast periods are past the maximum horizon. We need to provide a definite `y` value to establish the forecast origin.\n",
 "\n",
 "We will use this helper function to take the required amount of context from the data preceding the testing data. Its definition is intentionally simplified to keep the idea clear."
 ]
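The helper function referred to in that markdown cell is defined later in the notebook and is not part of this hunk. Here is a hedged sketch of the idea it implements, with assumed function and variable names: carry the last few known target values forward so the model has a forecast origin within the maximum horizon.

```python
# Sketch only; the names here are assumptions, not the notebook's
# actual helper definition.
import numpy as np
import pandas as pd

def with_lookback_context(X_train, y_train, X_test, lookback):
    """Prepend the last `lookback` training rows, whose y is known,
    to the test rows, whose y is NaN (the values to forecast)."""
    X_context = X_train.iloc[-lookback:]
    y_context = np.asarray(y_train)[-lookback:].astype(float)
    X_request = pd.concat([X_context, X_test], ignore_index=True)
    y_request = np.concatenate([y_context, np.full(len(X_test), np.nan)])
    return X_request, y_request

# e.g. X_req, y_req = with_lookback_context(X_train, y_train, X_away, lookback=4)
#      y_pred_away, xy_away = fitted_model.forecast(X_req, y_req)
```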
@@ -706,7 +701,7 @@
 "metadata": {
 "authors": [
 {
-"name": "erwright, nirovins"
+"name": "erwright"
 }
 ],
 "category": "tutorial",
@@ -740,7 +735,7 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.6.7"
+"version": "3.6.8"
 },
 "tags": [
 "Forecasting",
@@ -1,10 +1,9 @@
-name: auto-ml-forecasting-grouping
+name: auto-ml-forecasting-function
 dependencies:
+  - py-xgboost<=0.90
   - pip:
     - azureml-sdk
+    - numpy==1.16.2
     - azureml-train-automl
-    - azureml-pipeline
     - azureml-widgets
-    - pandas_ml
-    - statsmodels
     - matplotlib
@@ -1,11 +0,0 @@
-name: automl-forecasting-function
-dependencies:
-  - fbprophet==0.5
-  - py-xgboost<=0.80
-  - pip:
-    - azureml-sdk
-    - azureml-train-automl
-    - azureml-widgets
-    - pandas_ml
-    - statsmodels
-    - matplotlib
@@ -40,7 +40,7 @@
 "## Introduction\n",
 "In this example, we use AutoML to train, select, and operationalize a time-series forecasting model for multiple time-series.\n",
 "\n",
-"Make sure you have executed the [configuration notebook](../configuration.ipynb) before running this notebook.\n",
+"Make sure you have executed the [configuration notebook](../../../configuration.ipynb) before running this notebook.\n",
 "\n",
 "The examples in the following code samples use the University of Chicago's Dominick's Finer Foods dataset to forecast orange juice sales. Dominick's was a grocery chain in the Chicago metropolitan area."
 ]
@@ -335,7 +335,7 @@
 "|-|-|\n",
 "|**task**|forecasting|\n",
 "|**primary_metric**|This is the metric that you want to optimize.<br> Forecasting supports the following primary metrics <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>\n",
-"|**experiment_timeout_minutes**|Experimentation timeout in minutes.|\n",
+"|**experiment_timeout_hours**|Experimentation timeout in hours.|\n",
 "|**enable_early_stopping**|If early stopping is on, training will stop when the primary metric is no longer improving.|\n",
 "|**training_data**|Input dataset, containing both features and label column.|\n",
 "|**label_column_name**|The name of the label column.|\n",
@@ -366,7 +366,7 @@
 "automl_config = AutoMLConfig(task='forecasting',\n",
 " debug_log='automl_oj_sales_errors.log',\n",
 " primary_metric='normalized_mean_absolute_error',\n",
-" experiment_timeout_minutes=15,\n",
+" experiment_timeout_hours=0.25,\n",
 " training_data=train_dataset,\n",
 " label_column_name=target_column_name,\n",
 " compute_target=compute_target,\n",
@@ -454,9 +454,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"To produce predictions on the test set, we need to know the feature values at all dates in the test set. This requirement is somewhat reasonable for the OJ sales data since the features mainly consist of price, which is usually set in advance, and customer demographics which are approximately constant for each store over the 20 week forecast horizon in the testing data. \n",
-"\n",
-"We will first create a query `y_query`, which is aligned index-for-index to `X_test`. This is a vector of target values where each `NaN` serves the function of the question mark to be replaced by forecast. Passing definite values in the `y` argument allows the `forecast` function to make predictions on data that does not immediately follow the train data which contains `y`. In each grain, the last time point where the model sees a definite value of `y` is that grain's _forecast origin_."
+"To produce predictions on the test set, we need to know the feature values at all dates in the test set. This requirement is somewhat reasonable for the OJ sales data since the features mainly consist of price, which is usually set in advance, and customer demographics which are approximately constant for each store over the 20 week forecast horizon in the testing data."
 ]
 },
 {
@@ -465,15 +463,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"# Replace ALL values in y by NaN.\n",
-"# The forecast origin will be at the beginning of the first forecast period.\n",
-"# (Which is the same time as the end of the last training period.)\n",
-"y_query = y_test.copy().astype(np.float)\n",
-"y_query.fill(np.nan)\n",
 "# The featurized data, aligned to y, will also be returned.\n",
 "# This contains the assumptions that were made in the forecast\n",
 "# and helps align the forecast to the original data\n",
-"y_predictions, X_trans = fitted_model.forecast(X_test, y_query)"
+"y_predictions, X_trans = fitted_model.forecast(X_test)"
 ]
 },
 {
@@ -638,9 +631,7 @@
 "outputs": [],
 "source": [
 "import json\n",
-"# The request data frame needs to have y_query column which corresponds to query.\n",
 "X_query = X_test.copy()\n",
-"X_query['y_query'] = y_query\n",
 "# We have to convert datetime to string, because Timestamps cannot be serialized to JSON.\n",
 "X_query[time_column_name] = X_query[time_column_name].astype(str)\n",
 "# The Service object accepts a complex dictionary, which is internally converted to a JSON string.\n",
@@ -705,9 +696,6 @@
 "framework": [
 "Azure ML AutoML"
 ],
-"tags": [
-"None"
-],
 "friendly_name": "Forecasting orange juice sales with deployment",
 "index_order": 1,
 "kernelspec": {
@@ -725,8 +713,11 @@
 "name": "python",
 "nbconvert_exporter": "python",
 "pygments_lexer": "ipython3",
-"version": "3.6.7"
+"version": "3.6.8"
 },
+"tags": [
+"None"
+],
 "task": "Forecasting"
 },
 "nbformat": 4,
@@ -1,11 +1,10 @@
 name: auto-ml-forecasting-orange-juice-sales
 dependencies:
-  - fbprophet==0.5
-  - py-xgboost<=0.80
+  - py-xgboost<=0.90
   - pip:
     - azureml-sdk
+    - numpy==1.16.2
+    - pandas==0.23.4
     - azureml-train-automl
     - azureml-widgets
     - matplotlib
-    - pandas_ml
-    - statsmodels
@@ -49,7 +49,9 @@
 "2. Configure AutoML using `AutoMLConfig`.\n",
 "3. Train the model.\n",
 "4. Explore the results.\n",
-"5. Test the fitted model."
+"5. Visualize the model's feature importance in the Azure portal\n",
+"6. Explore any model's explanation and feature importance in the Azure portal\n",
+"7. Test the fitted model."
 ]
 },
 {
@@ -71,13 +73,13 @@
 "\n",
 "from matplotlib import pyplot as plt\n",
 "import pandas as pd\n",
-"import os\n",
 "\n",
 "import azureml.core\n",
 "from azureml.core.experiment import Experiment\n",
 "from azureml.core.workspace import Workspace\n",
 "from azureml.core.dataset import Dataset\n",
-"from azureml.train.automl import AutoMLConfig"
+"from azureml.train.automl import AutoMLConfig\n",
+"from azureml.explain.model._internal.explanation_client import ExplanationClient"
 ]
 },
 {
@@ -155,8 +157,7 @@
 "automl_settings = {\n",
 " \"n_cross_validations\": 3,\n",
 " \"primary_metric\": 'average_precision_score_weighted',\n",
-" \"preprocess\": True,\n",
-" \"experiment_timeout_minutes\": 10, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ablity to find the best model possible\n",
+" \"experiment_timeout_hours\": 0.25, # This is a time limit for testing purposes; remove it for real use cases, as it will drastically limit the ability to find the best model possible\n",
 " \"verbosity\": logging.INFO,\n",
 " \"enable_stack_ensemble\": False\n",
 "}\n",
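For orientation, `automl_settings` is typically unpacked into the `AutoMLConfig` constructor in a later cell that this hunk does not show; a minimal sketch with assumed data variable names:

```python
# Sketch only: training_data and label_column_name are assumed to be
# defined earlier in the notebook; they are not part of this hunk.
from azureml.train.automl import AutoMLConfig

automl_config = AutoMLConfig(task='classification',
                             training_data=training_data,
                             label_column_name=label_column_name,
                             **automl_settings)
```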
@@ -260,17 +261,134 @@
     "metadata": {},
     "source": [
     "#### Print the properties of the model\n",
-    "The fitted_model is a python object and you can read the different properties of the object.\n",
-    "See *Print the properties of the model* section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification/auto-ml-classification.ipynb)."
+    "The fitted_model is a Python object and you can read the different properties of the object.\n"
     ]
    },
    {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "### Deploy\n",
+    "## Best Model's explanation\n",
+    "Retrieve the explanation from the best_run, which includes explanations for both engineered features and raw features.\n",
     "\n",
-    "To deploy the model into a web service endpoint, see the _Deploy_ section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-with-deployment/auto-ml-classification-with-deployment.ipynb)"
+    "#### Download engineered feature importance from artifact store\n",
+    "You can use ExplanationClient to download the engineered feature explanations from the artifact store of the best_run. You can also use the Azure portal URL to view the dashboard visualization of the feature importance values of the engineered features."
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": null,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+    "client = ExplanationClient.from_run(best_run)\n",
+    "engineered_explanations = client.download_model_explanation(raw=False)\n",
+    "print(engineered_explanations.get_feature_importance_dict())\n",
+    "print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + best_run.get_portal_url())"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+    "## Explanations\n",
+    "In this section, we will show how to compute model explanations and visualize the explanations using the azureml-explain-model package. Besides retrieving an existing model explanation for an AutoML model, you can also explain your AutoML model with different test data. The following steps will allow you to compute and visualize engineered feature importance based on your test data."
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+    "#### Retrieve any other AutoML model from training"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": null,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+    "automl_run, fitted_model = local_run.get_output(metric='accuracy')"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+    "#### Setup the model explanations for AutoML models\n",
+    "The fitted_model can generate the following, which will be used for getting the engineered explanations via automl_setup_model_explanations:\n",
+    "\n",
+    "1. Featurized data from train samples/test samples\n",
+    "2. Gather engineered feature name lists\n",
+    "3. Find the classes in your labeled column in classification scenarios\n",
+    "\n",
+    "The automl_explainer_setup_obj contains all the structures from the above list."
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": null,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+    "X_train = training_data.drop_columns(columns=[label_column_name])\n",
+    "y_train = training_data.keep_columns(columns=[label_column_name], validate=True)\n",
+    "X_test = validation_data.drop_columns(columns=[label_column_name])"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": null,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+    "from azureml.train.automl.runtime.automl_explain_utilities import automl_setup_model_explanations\n",
+    "\n",
+    "automl_explainer_setup_obj = automl_setup_model_explanations(fitted_model, X=X_train, \n",
+    "                                                             X_test=X_test, y=y_train, \n",
+    "                                                             task='classification')"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+    "#### Initialize the Mimic Explainer for feature importance\n",
+    "To explain AutoML models, use the MimicWrapper from the azureml.explain.model package. The MimicWrapper is initialized with fields from automl_explainer_setup_obj, your workspace, and a LightGBM model, which acts as a surrogate model to explain the AutoML model (fitted_model here). The MimicWrapper also takes the automl_run object, where the engineered explanations will be uploaded."
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": null,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+    "from azureml.explain.model.mimic.models.lightgbm_model import LGBMExplainableModel\n",
+    "from azureml.explain.model.mimic_wrapper import MimicWrapper\n",
+    "explainer = MimicWrapper(ws, automl_explainer_setup_obj.automl_estimator, LGBMExplainableModel, \n",
+    "                         init_dataset=automl_explainer_setup_obj.X_transform, run=automl_run,\n",
+    "                         features=automl_explainer_setup_obj.engineered_feature_names, \n",
+    "                         feature_maps=[automl_explainer_setup_obj.feature_map],\n",
+    "                         classes=automl_explainer_setup_obj.classes)"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+    "#### Use Mimic Explainer for computing and visualizing engineered feature importance\n",
+    "The explain() method in MimicWrapper can be called with the transformed test samples to get the feature importance for the generated engineered features. You can also use the Azure portal URL to view the dashboard visualization of the feature importance values of the engineered features."
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": null,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+    "engineered_explanations = explainer.explain(['local', 'global'], eval_dataset=automl_explainer_setup_obj.X_test_transform)\n",
+    "print(engineered_explanations.get_feature_importance_dict())\n",
+    "print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
     ]
    },
    {
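Pieced together from the cells added above, the end-to-end explanation flow reads as follows. This is a minimal sketch, not a cell from the notebook itself: `training_data`, `validation_data`, `label_column_name`, `ws` and `local_run` are assumed to be defined earlier in the notebook, exactly as the added cells assume.

```python
# Minimal sketch of the explanation workflow added in this change; all names
# (training_data, validation_data, label_column_name, ws, local_run) are
# assumed to be defined earlier in the notebook.
from azureml.train.automl.runtime.automl_explain_utilities import automl_setup_model_explanations
from azureml.explain.model.mimic.models.lightgbm_model import LGBMExplainableModel
from azureml.explain.model.mimic_wrapper import MimicWrapper

automl_run, fitted_model = local_run.get_output(metric='accuracy')

X_train = training_data.drop_columns(columns=[label_column_name])
y_train = training_data.keep_columns(columns=[label_column_name], validate=True)
X_test = validation_data.drop_columns(columns=[label_column_name])

# Featurize the samples and gather engineered feature names and classes
setup_obj = automl_setup_model_explanations(fitted_model, X=X_train,
                                            X_test=X_test, y=y_train,
                                            task='classification')

# LightGBM acts as the surrogate model that explains the AutoML model
explainer = MimicWrapper(ws, setup_obj.automl_estimator, LGBMExplainableModel,
                         init_dataset=setup_obj.X_transform, run=automl_run,
                         features=setup_obj.engineered_feature_names,
                         feature_maps=[setup_obj.feature_map],
                         classes=setup_obj.classes)

# Compute local and global importance for the engineered features
engineered = explainer.explain(['local', 'global'],
                               eval_dataset=setup_obj.X_test_transform)
print(engineered.get_feature_importance_dict())
```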
@@ -369,7 +487,7 @@
  "metadata": {
   "authors": [
    {
-    "name": "tzvikei"
+    "name": "anumamah"
    }
   ],
   "category": "tutorial",

@@ -2,10 +2,7 @@ name: auto-ml-classification-credit-card-fraud-local
 dependencies:
 - pip:
   - azureml-sdk
-  - interpret
-  - azureml-defaults
-  - azureml-explain-model
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
+  - azureml-explain-model
@@ -51,8 +51,8 @@
 "4. Explore the results and featurization transparency options\n",
 "5. Setup remote compute for computing the model explanations for a given AutoML model.\n",
 "6. Start an AzureML experiment on your remote compute to compute explanations for an AutoML model.\n",
-"7. Download the feature importance for engineered features and visualize the explanations for engineered features. \n",
-"8. Download the feature importance for raw features and visualize the explanations for raw features. \n"
+"7. Download the feature importance for engineered features and visualize the explanations for engineered features on the Azure portal. \n",
+"8. Download the feature importance for raw features and visualize the explanations for raw features on the Azure portal. \n"
 ]
 },
 {

@@ -206,9 +206,9 @@
 "|-|-|\n",
 "|**task**|classification, regression or forecasting|\n",
 "|**primary_metric**|This is the metric that you want to optimize. Regression supports the following primary metrics: <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>|\n",
-"|**experiment_timeout_minutes**| Maximum amount of time in minutes that all iterations combined can take before the experiment terminates.|\n",
+"|**experiment_timeout_hours**| Maximum amount of time in hours that all iterations combined can take before the experiment terminates.|\n",
 "|**enable_early_stopping**| Flag to enable early termination if the score is not improving in the short term.|\n",
-"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether the featurization step should be done automatically or not, or whether customized featurization should be used. Note: If the input data is sparse, featurization cannot be turned on.|\n",
+"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether the featurization step should be done automatically or not, or whether customized featurization should be used. Setting this enables AutoML to perform featurization on the input to handle *missing data*, and to perform some common *feature extraction*. Note: If the input data is sparse, featurization cannot be turned on.|\n",
 "|**n_cross_validations**|Number of cross validation splits.|\n",
 "|**training_data**|(sparse) array-like, shape = [n_samples, n_features]|\n",
 "|**label_column_name**|(sparse) array-like, shape = [n_samples, ], target values.|"
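For orientation, here is a minimal sketch of how a settings dict using the renamed `experiment_timeout_hours` key is typically passed to `AutoMLConfig`. The `training_data`, `label_column_name` and `compute_target` names are placeholders, not values taken from this diff.

```python
# A minimal sketch, not the notebook's exact cell: training_data,
# label_column_name and compute_target are placeholder names.
import logging
from azureml.train.automl import AutoMLConfig

automl_settings = {
    "experiment_timeout_hours": 0.25,  # replaces experiment_timeout_minutes
    "enable_early_stopping": True,
    "n_cross_validations": 5,
    "verbosity": logging.INFO,
}

automl_config = AutoMLConfig(task='classification',
                             training_data=training_data,
                             label_column_name=label_column_name,
                             compute_target=compute_target,
                             **automl_settings)
```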
@@ -244,7 +244,7 @@
 "source": [
 "featurization_config = FeaturizationConfig()\n",
 "featurization_config.blocked_transformers = ['LabelEncoder']\n",
-"#featurization_config.drop_columns = ['ERP', 'MMIN']\n",
+"#featurization_config.drop_columns = ['MMIN']\n",
 "featurization_config.add_column_purpose('MYCT', 'Numeric')\n",
 "featurization_config.add_column_purpose('VendorName', 'CategoricalHash')\n",
 "#default strategy mean, add transformer param for 3 columns\n",

@@ -262,7 +262,7 @@
 "source": [
 "automl_settings = {\n",
 " \"enable_early_stopping\": True, \n",
-" \"experiment_timeout_minutes\" : 10,\n",
+" \"experiment_timeout_hours\" : 0.25,\n",
 " \"max_concurrent_iterations\": 4,\n",
 " \"max_cores_per_iteration\": -1,\n",
 " \"n_cross_validations\": 5,\n",
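The customized `FeaturizationConfig` from the cell above is then wired into the run through the `featurization` argument described in the settings table. A hedged sketch follows; everything besides `featurization_config` and the 'ERP' target (both taken from this notebook's cells) is a placeholder.

```python
# Sketch only: featurization_config and automl_settings come from the cells
# above; training_data is a placeholder dataset name.
from azureml.train.automl import AutoMLConfig

automl_config = AutoMLConfig(task='regression',
                             featurization=featurization_config,
                             training_data=training_data,
                             label_column_name='ERP',
                             **automl_settings)
```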
@@ -514,7 +514,7 @@
 " content = cefr.read()\n",
 "\n",
 "# Replace the values in train_explainer.py file with the appropriate values\n",
-"content = content.replace('<<experimnet_name>>', automl_run.experiment.name) # your experiment name.\n",
+"content = content.replace('<<experiment_name>>', automl_run.experiment.name) # your experiment name.\n",
 "content = content.replace('<<run_id>>', automl_run.id) # Run-id of the AutoML run for which you want to explain the model.\n",
 "content = content.replace('<<target_column_name>>', 'ERP') # Your target column name\n",
 "content = content.replace('<<task>>', 'regression') # Training task type\n",

@@ -532,8 +532,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"#### Create conda configuration for model explanations experiment\n",
-"We need `azureml-explain-model`, `azureml-train-automl` and `azureml-core` packages for computing model explanations for your AutoML model on remote compute."
+"#### Create conda configuration for model explanations experiment from automl_run object"
 ]
 },
 {

@@ -552,14 +551,9 @@
 "# Set compute target to AmlCompute\n",
 "conda_run_config.target = compute_target\n",
 "conda_run_config.environment.docker.enabled = True\n",
-"azureml_pip_packages = [\n",
-"    'azureml-train-automl', 'azureml-core', 'azureml-explain-model'\n",
-"]\n",
 "\n",
 "# specify CondaDependencies obj\n",
-"conda_run_config.environment.python.conda_dependencies = CondaDependencies.create(\n",
-"    conda_packages=['scikit-learn', 'numpy','py-xgboost<=0.80'],\n",
-"    pip_packages=azureml_pip_packages)"
+"conda_run_config.environment.python.conda_dependencies = automl_run.get_environment().python.conda_dependencies"
 ]
 },
 {
@@ -604,38 +598,8 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Feature importance and explanation dashboard\n",
-"In this section we describe how you can download the explanation results from the explanations experiment and visualize the feature importance for your AutoML model. "
+"### Feature importance and visualizing the explanation dashboard\n",
+"In this section we describe how you can download the explanation results from the explanations experiment and visualize the feature importance for your AutoML model on the Azure portal."
 ]
 },
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"#### Setup for visualizing the model explanation results\n",
-"For visualizing the explanation results for the *fitted_model* we need to perform the following steps:\n",
-"1. Featurize test data samples.\n",
-"\n",
-"The *automl_explainer_setup_obj* contains all the structures from the above list. "
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"X_test = test_data.drop_columns([label]).to_pandas_dataframe()"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, automl_setup_model_explanations\n",
-"explainer_setup_class = automl_setup_model_explanations(fitted_model, 'regression', X_test=X_test)"
-]
 },
 {

@@ -643,7 +607,7 @@
 "metadata": {},
 "source": [
 "#### Download engineered feature importance from artifact store\n",
-"You can use *ExplanationClient* to download the engineered feature explanations from the artifact store of the *automl_run*. You can also use ExplanationDashboard to view the dashboard visualization of the feature importance values of the engineered features."
+"You can use *ExplanationClient* to download the engineered feature explanations from the artifact store of the *automl_run*. You can also use the Azure portal URL to view the dashboard visualization of the feature importance values of the engineered features."
 ]
 },
 {

@@ -653,11 +617,10 @@
 "outputs": [],
 "source": [
 "from azureml.explain.model._internal.explanation_client import ExplanationClient\n",
-"from interpret_community.widget import ExplanationDashboard\n",
 "client = ExplanationClient.from_run(automl_run)\n",
 "engineered_explanations = client.download_model_explanation(raw=False)\n",
 "print(engineered_explanations.get_feature_importance_dict())\n",
-"ExplanationDashboard(engineered_explanations, explainer_setup_class.automl_estimator, datasetX=explainer_setup_class.X_test_transform)"
+"print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
 ]
 },
 {

@@ -665,7 +628,7 @@
 "metadata": {},
 "source": [
 "#### Download raw feature importance from artifact store\n",
-"You can use *ExplanationClient* to download the raw feature explanations from the artifact store of the *automl_run*. You can also use ExplanationDashboard to view the dashboard visualization of the feature importance values of the raw features."
+"You can use *ExplanationClient* to download the raw feature explanations from the artifact store of the *automl_run*. You can also use the Azure portal URL to view the dashboard visualization of the feature importance values of the raw features."
 ]
 },
 {

@@ -676,7 +639,7 @@
 "source": [
 "raw_explanations = client.download_model_explanation(raw=True)\n",
 "print(raw_explanations.get_feature_importance_dict())\n",
-"ExplanationDashboard(raw_explanations, explainer_setup_class.automl_pipeline, datasetX=explainer_setup_class.X_test_raw)"
+"print(\"You can visualize the raw explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
 ]
 },
 {
@@ -718,20 +681,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"from azureml.core.conda_dependencies import CondaDependencies \n",
-"\n",
-"azureml_pip_packages = [\n",
-"    'azureml-explain-model', 'azureml-train-automl', 'azureml-defaults'\n",
-"]\n",
-" \n",
-"\n",
-"# specify CondaDependencies obj\n",
-"myenv = CondaDependencies.create(conda_packages=['scikit-learn', 'pandas', 'numpy', 'py-xgboost<=0.80'],\n",
-"    pip_packages=azureml_pip_packages,\n",
-"    pin_sdk_version=True)\n",
+"conda_dep = automl_run.get_environment().python.conda_dependencies\n",
 "\n",
 "with open(\"myenv.yml\",\"w\") as f:\n",
-"    f.write(myenv.serialize_to_string())\n",
+"    f.write(conda_dep.serialize_to_string())\n",
 "\n",
 "with open(\"myenv.yml\",\"r\") as f:\n",
 "    print(f.read())"

@@ -772,6 +725,7 @@
 "from azureml.core.model import InferenceConfig\n",
 "from azureml.core.webservice import AciWebservice\n",
 "from azureml.core.model import Model\n",
+"from azureml.core.environment import Environment\n",
 "\n",
 "aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, \n",
 "                                               memory_gb=1, \n",

@@ -779,9 +733,8 @@
 "                                               \"method\" : \"local_explanation\"}, \n",
 "                                               description='Get local explanations for Machine test data')\n",
 "\n",
-"inference_config = InferenceConfig(runtime= \"python\", \n",
-"                                   entry_script=\"score_explain.py\",\n",
-"                                   conda_file=\"myenv.yml\")\n",
+"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
+"inference_config = InferenceConfig(entry_script=\"score_explain.py\", environment=myenv)\n",
 "\n",
 "# Use configs and models generated above\n",
 "service = Model.deploy(ws, 'model-scoring', [scoring_explainer_model, original_model], inference_config, aciconfig)\n",
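Read as a whole, the + lines above replace the old runtime/conda_file-style InferenceConfig with an Environment built from myenv.yml. A consolidated sketch of the new pattern follows; myenv.yml, score_explain.py and the two registered models are created earlier in the notebook, and nothing here goes beyond the calls shown in the diff.

```python
# Minimal consolidated sketch of the Environment-based deployment pattern,
# assuming myenv.yml and score_explain.py were written by the cells above
# and that ws, scoring_explainer_model and original_model already exist.
from azureml.core.environment import Environment
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

myenv = Environment.from_conda_specification(name="myenv", file_path="myenv.yml")
inference_config = InferenceConfig(entry_script="score_explain.py", environment=myenv)
aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

# Service name and model list mirror the notebook cell above.
service = Model.deploy(ws, 'model-scoring',
                       [scoring_explainer_model, original_model],
                       inference_config, aciconfig)
service.wait_for_deployment(show_output=True)
```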
@@ -819,6 +772,7 @@
 "outputs": [],
 "source": [
 "if service.state == 'Healthy':\n",
+"    X_test = test_data.drop_columns([label]).to_pandas_dataframe()\n",
 "    # Serialize the first row of the test data into json\n",
 "    X_test_json = X_test[:1].to_json(orient='records')\n",
 "    print(X_test_json)\n",
@@ -2,12 +2,9 @@ name: auto-ml-regression-hardware-performance-explanation-and-featurization
 dependencies:
 - pip:
   - azureml-sdk
-  - interpret
-  - azureml-defaults
-  - azureml-explain-model
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
+  - azureml-explain-model
   - azureml-explain-model
   - azureml-contrib-interpret

@@ -22,7 +22,7 @@ run = Run.get_context()
 ws = run.experiment.workspace

 # Get the AutoML run object from the experiment name and the workspace
-experiment = Experiment(ws, '<<experimnet_name>>')
+experiment = Experiment(ws, '<<experiment_name>>')
 automl_run = Run(experiment=experiment, run_id='<<run_id>>')

 # Check if this AutoML model is explainable
@@ -188,15 +188,18 @@
 {
 "cell_type": "code",
 "execution_count": null,
-"metadata": {},
+"metadata": {
+ "tags": [
+  "automlconfig-remarks-sample"
+ ]
+},
 "outputs": [],
 "source": [
 "automl_settings = {\n",
 " \"n_cross_validations\": 3,\n",
 " \"primary_metric\": 'r2_score',\n",
-" \"preprocess\": True,\n",
 " \"enable_early_stopping\": True, \n",
-" \"experiment_timeout_minutes\": 20, #for real scenarios we reccommend a timeout of at least one hour \n",
+" \"experiment_timeout_hours\": 0.3, # for real scenarios we recommend a timeout of at least one hour \n",
 " \"max_concurrent_iterations\": 4,\n",
 " \"max_cores_per_iteration\": -1,\n",
 " \"verbosity\": logging.INFO,\n",
@@ -2,8 +2,7 @@ name: auto-ml-regression
 dependencies:
 - pip:
   - azureml-sdk
+  - pandas==0.23.4
   - azureml-train-automl
   - azureml-widgets
   - matplotlib
-  - pandas_ml
-  - paramiko<2.5.0
@@ -56,7 +56,7 @@ CREATE OR ALTER PROCEDURE [dbo].[AutoMLTrain]
 @task NVARCHAR(40)='classification', -- The type of task. Can be classification, regression or forecasting.
 @experiment_name NVARCHAR(32)='automl-sql-test', -- This can be used to find the experiment in the Azure Portal.
 @iteration_timeout_minutes INT = 15, -- The maximum time in minutes for training a single pipeline.
-@experiment_timeout_minutes INT = 60, -- The maximum time in minutes for training all pipelines.
+@experiment_timeout_hours FLOAT = 1, -- The maximum time in hours for training all pipelines.
 @n_cross_validations INT = 3, -- The number of cross validations.
 @blacklist_models NVARCHAR(MAX) = '', -- A comma separated list of algos that will not be used.
 -- The list of possible models can be found at:

@@ -131,8 +131,8 @@ if __name__.startswith("sqlindb"):

     X_train = data_train

-    if experiment_timeout_minutes == 0:
-        experiment_timeout_minutes = None
+    if experiment_timeout_hours == 0:
+        experiment_timeout_hours = None

     if experiment_exit_score == 0:
         experiment_exit_score = None

@@ -163,7 +163,7 @@ if __name__.startswith("sqlindb"):
     debug_log = log_file_name,
     primary_metric = primary_metric,
     iteration_timeout_minutes = iteration_timeout_minutes,
-    experiment_timeout_minutes = experiment_timeout_minutes,
+    experiment_timeout_hours = experiment_timeout_hours,
     iterations = iterations,
     n_cross_validations = n_cross_validations,
     preprocess = preprocess,

@@ -204,7 +204,7 @@ if __name__.startswith("sqlindb"):
     @iterations INT, @task NVARCHAR(40),
     @experiment_name NVARCHAR(32),
     @iteration_timeout_minutes INT,
-    @experiment_timeout_minutes INT,
+    @experiment_timeout_hours FLOAT,
     @n_cross_validations INT,
     @blacklist_models NVARCHAR(MAX),
     @whitelist_models NVARCHAR(MAX),

@@ -223,7 +223,7 @@ if __name__.startswith("sqlindb"):
     , @task = @task
     , @experiment_name = @experiment_name
     , @iteration_timeout_minutes = @iteration_timeout_minutes
-    , @experiment_timeout_minutes = @experiment_timeout_minutes
+    , @experiment_timeout_hours = @experiment_timeout_hours
     , @n_cross_validations = @n_cross_validations
     , @blacklist_models = @blacklist_models
     , @whitelist_models = @whitelist_models

@@ -235,7 +235,7 @@
 " @task NVARCHAR(40)='classification', -- The type of task. Can be classification, regression or forecasting.\r\n",
 " @experiment_name NVARCHAR(32)='automl-sql-test', -- This can be used to find the experiment in the Azure Portal.\r\n",
 " @iteration_timeout_minutes INT = 15, -- The maximum time in minutes for training a single pipeline. \r\n",
-" @experiment_timeout_minutes INT = 60, -- The maximum time in minutes for training all pipelines.\r\n",
+" @experiment_timeout_hours FLOAT = 1, -- The maximum time in hours for training all pipelines.\r\n",
 " @n_cross_validations INT = 3, -- The number of cross validations.\r\n",
 " @blacklist_models NVARCHAR(MAX) = '', -- A comma separated list of algos that will not be used.\r\n",
 " -- The list of possible models can be found at:\r\n",

@@ -307,8 +307,8 @@
 "\r\n",
 " X_train = data_train\r\n",
 "\r\n",
-" if experiment_timeout_minutes == 0:\r\n",
-"     experiment_timeout_minutes = None\r\n",
+" if experiment_timeout_hours == 0:\r\n",
+"     experiment_timeout_hours = None\r\n",
 "\r\n",
 " if experiment_exit_score == 0:\r\n",
 "     experiment_exit_score = None\r\n",

@@ -337,7 +337,7 @@
 " debug_log = log_file_name, \r\n",
 " primary_metric = primary_metric, \r\n",
 " iteration_timeout_minutes = iteration_timeout_minutes, \r\n",
-" experiment_timeout_minutes = experiment_timeout_minutes,\r\n",
+" experiment_timeout_hours = experiment_timeout_hours,\r\n",
 " iterations = iterations, \r\n",
 " n_cross_validations = n_cross_validations, \r\n",
 " preprocess = preprocess,\r\n",

@@ -378,7 +378,7 @@
 "\t\t\t\t @iterations INT, @task NVARCHAR(40),\r\n",
 "\t\t\t\t @experiment_name NVARCHAR(32),\r\n",
 "\t\t\t\t @iteration_timeout_minutes INT,\r\n",
-"\t\t\t\t @experiment_timeout_minutes INT,\r\n",
+"\t\t\t\t @experiment_timeout_hours FLOAT,\r\n",
 "\t\t\t\t @n_cross_validations INT,\r\n",
 "\t\t\t\t @blacklist_models NVARCHAR(MAX),\r\n",
 "\t\t\t\t @whitelist_models NVARCHAR(MAX),\r\n",

@@ -396,7 +396,7 @@
 "\t, @task = @task\r\n",
 "\t, @experiment_name = @experiment_name\r\n",
 "\t, @iteration_timeout_minutes = @iteration_timeout_minutes\r\n",
-"\t, @experiment_timeout_minutes = @experiment_timeout_minutes\r\n",
+"\t, @experiment_timeout_hours = @experiment_timeout_hours\r\n",
 "\t, @n_cross_validations = @n_cross_validations\r\n",
 "\t, @blacklist_models = @blacklist_models\r\n",
 "\t, @whitelist_models = @whitelist_models\r\n",
@@ -560,9 +560,6 @@
 "framework": [
  "Azure ML AutoML"
 ],
-"tags": [
- ""
-],
 "friendly_name": "Setup automated ML SQL integration",
 "index_order": 1,
 "kernelspec": {

@@ -574,6 +571,9 @@
 "name": "sql",
 "version": ""
 },
+"tags": [
+ ""
+],
 "task": "None"
 },
 "nbformat": 4,
@@ -11,6 +11,13 @@
 "Licensed under the MIT License."
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"# Register Azure Databricks trained model and deploy it to ACI\n"
+]
+},
 {
 "cell_type": "markdown",
 "metadata": {},

@@ -161,9 +168,9 @@
 "source": [
 "from azureml.core.conda_dependencies import CondaDependencies \n",
 "\n",
-"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) #showing how to add libs as an eg. - not needed for this model.\n",
+"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) # showing how to add libs as an eg. - not needed for this model.\n",
 "\n",
-"with open(\"mydeployenv.yml\",\"w\") as f:\n",
+"with open(\"myenv.yml\",\"w\") as f:\n",
 "    f.write(myacienv.serialize_to_string())"
 ]
 },
@@ -177,6 +184,9 @@
 "from azureml.core.webservice import AciWebservice, Webservice\n",
 "from azureml.exceptions import WebserviceException\n",
 "from azureml.core.model import InferenceConfig\n",
+"from azureml.core.environment import Environment\n",
+"from azureml.core.conda_dependencies import CondaDependencies\n",
+"\n",
 "\n",
 "myaci_config = AciWebservice.deploy_configuration(cpu_cores = 2, \n",
 "                                                  memory_gb = 2, \n",

@@ -191,9 +201,16 @@
 "except WebserviceException:\n",
 "    pass\n",
 "\n",
-"inference_config = InferenceConfig(runtime= 'spark-py', \n",
-"                                   entry_script='score_sparkml.py',\n",
-"                                   conda_file='mydeployenv.yml')\n",
+"myenv = Environment.get(ws, name='AzureML-PySpark-MmlSpark-0.15')\n",
+"# we need to add extra packages to the curated environment\n",
+"# in order to deploy the amended environment we need to rename it\n",
+"myenv.name = 'myenv'\n",
+"model_dependencies = CondaDependencies('myenv.yml')\n",
+"for pip_dep in model_dependencies.pip_packages:\n",
+"    myenv.python.conda_dependencies.add_pip_package(pip_dep)\n",
+"for conda_dep in model_dependencies.conda_packages:\n",
+"    myenv.python.conda_dependencies.add_conda_package(conda_dep)\n",
+"inference_config = InferenceConfig(entry_script='score_sparkml.py', environment=myenv)\n",
 "\n",
 "myservice = Model.deploy(ws, service_name, [mymodel], inference_config, myaci_config)\n",
 "myservice.wait_for_deployment(show_output=True)"
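Pieced together from the + lines above, the amended-environment pattern reads as follows. This is a sketch of the cells in this change, not new API: it assumes `ws` and a `myenv.yml` written earlier in the notebook, and reuses only the calls the diff itself introduces.

```python
# Consolidated sketch of the pattern above: take a curated PySpark
# environment, rename the copy, and merge the model's pip/conda
# dependencies from myenv.yml before building the InferenceConfig.
from azureml.core.environment import Environment
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.model import InferenceConfig

myenv = Environment.get(ws, name='AzureML-PySpark-MmlSpark-0.15')
myenv.name = 'myenv'  # an amended environment must be renamed before deployment

model_dependencies = CondaDependencies('myenv.yml')
for pip_dep in model_dependencies.pip_packages:
    myenv.python.conda_dependencies.add_pip_package(pip_dep)
for conda_dep in model_dependencies.conda_packages:
    myenv.python.conda_dependencies.add_conda_package(conda_dep)

inference_config = InferenceConfig(entry_script='score_sparkml.py', environment=myenv)
```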
@@ -255,6 +272,15 @@
 "myservice.delete()"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"## Deploying to other types of computes\n",
+"\n",
+"In order to learn how to deploy to other types of compute targets, such as AKS, please take a look at the set of notebooks in the [deployment](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/deployment) folder."
+]
+},
 {
 "cell_type": "markdown",
 "metadata": {},
@@ -1,312 +0,0 @@
-{
-"cells": [
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"Azure ML & Azure Databricks notebooks by Parashar Shah.\n",
-"\n",
-"Copyright (c) Microsoft Corporation. All rights reserved.\n",
-"\n",
-"Licensed under the MIT License."
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-"This notebook uses image from ACI notebook for deploying to AKS."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"import azureml.core\n",
-"\n",
-"# Check core SDK version number\n",
-"print(\"SDK version:\", azureml.core.VERSION)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"# Set auth to be used by workspace related APIs.\n",
-"# For automation or CI/CD ServicePrincipalAuthentication can be used.\n",
-"# https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.authentication.serviceprincipalauthentication?view=azure-ml-py\n",
-"auth = None"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.core import Workspace\n",
-"\n",
-"ws = Workspace.from_config(auth = auth)\n",
-"print('Workspace name: ' + ws.name, \n",
-"      'Azure region: ' + ws.location, \n",
-"      'Subscription id: ' + ws.subscription_id, \n",
-"      'Resource group: ' + ws.resource_group, sep = '\\n')"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#Register the model\n",
-"import os\n",
-"from azureml.core.model import Model\n",
-"\n",
-"model_name = \"AdultCensus_runHistory_aks.mml\" # \n",
-"model_name_dbfs = os.path.join(\"/dbfs\", model_name)\n",
-"\n",
-"print(\"copy model from dbfs to local\")\n",
-"model_local = \"file:\" + os.getcwd() + \"/\" + model_name\n",
-"dbutils.fs.cp(model_name, model_local, True)\n",
-"\n",
-"mymodel = Model.register(model_path = model_name, # this points to a local file\n",
-"                         model_name = model_name, # this is the name the model is registered as, am using same name for both path and name. \n",
-"                         description = \"ADB trained model by Parashar\",\n",
-"                         workspace = ws)\n",
-"\n",
-"print(mymodel.name, mymodel.description, mymodel.version)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#%%writefile score_sparkml.py\n",
-"score_sparkml = \"\"\"\n",
-" \n",
-"import json\n",
-" \n",
-"def init():\n",
-"    # One-time initialization of PySpark and predictive model\n",
-"    import pyspark\n",
-"    from azureml.core.model import Model\n",
-"    from pyspark.ml import PipelineModel\n",
-" \n",
-"    global trainedModel\n",
-"    global spark\n",
-" \n",
-"    spark = pyspark.sql.SparkSession.builder.appName(\"ADB and AML notebook by Parashar\").getOrCreate()\n",
-"    model_name = \"{model_name}\" #interpolated\n",
-"    model_path = Model.get_model_path(model_name)\n",
-"    trainedModel = PipelineModel.load(model_path)\n",
-" \n",
-"def run(input_json):\n",
-"    if isinstance(trainedModel, Exception):\n",
-"        return json.dumps({{\"trainedModel\":str(trainedModel)}})\n",
-" \n",
-"    try:\n",
-"        sc = spark.sparkContext\n",
-"        input_list = json.loads(input_json)\n",
-"        input_rdd = sc.parallelize(input_list)\n",
-"        input_df = spark.read.json(input_rdd)\n",
-" \n",
-"        # Compute prediction\n",
-"        prediction = trainedModel.transform(input_df)\n",
-"        #result = prediction.first().prediction\n",
-"        predictions = prediction.collect()\n",
-" \n",
-"        #Get each scored result\n",
-"        preds = [str(x['prediction']) for x in predictions]\n",
-"        result = \",\".join(preds)\n",
-"        # you can return any data type as long as it is JSON-serializable\n",
-"        return result.tolist()\n",
-"    except Exception as e:\n",
-"        result = str(e)\n",
-"        return result\n",
-" \n",
-"\"\"\".format(model_name=model_name)\n",
-" \n",
-"exec(score_sparkml)\n",
-" \n",
-"with open(\"score_sparkml.py\", \"w\") as file:\n",
-"    file.write(score_sparkml)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"from azureml.core.conda_dependencies import CondaDependencies \n",
-"\n",
-"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) #showing how to add libs as an eg. - not needed for this model.\n",
-"\n",
-"with open(\"mydeployenv.yml\",\"w\") as f:\n",
-"    f.write(myacienv.serialize_to_string())"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#create AKS compute\n",
-"#it may take 20-25 minutes to create a new cluster\n",
-"\n",
-"from azureml.core.compute import AksCompute, ComputeTarget\n",
-"from azureml.core.compute_target import ComputeTargetException\n",
-"\n",
-"aks_name = 'ps-aks-demo2' \n",
-"\n",
-"try:\n",
-"    aks_target = ComputeTarget(workspace=ws, name=aks_name)\n",
-"    print('Found existing cluster, use it.')\n",
-"except ComputeTargetException:\n",
-"    # Use the default configuration (can also provide parameters to customize)\n",
-"    prov_config = AksCompute.provisioning_configuration()\n",
-" \n",
-"    # Create the cluster\n",
-"    aks_target = ComputeTarget.create(workspace = ws, \n",
-"                                      name = aks_name, \n",
-"                                      provisioning_configuration = prov_config)\n",
-"\n",
-"aks_target.wait_for_completion(show_output = True)\n",
-"\n",
-"print(aks_target.provisioning_state)\n",
-"print(aks_target.provisioning_errors)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#deploy to AKS\n",
-"from azureml.core.webservice import AksWebservice, Webservice\n",
-"from azureml.exceptions import WebserviceException\n",
-"from azureml.core.model import InferenceConfig\n",
-"\n",
-"aks_config = AksWebservice.deploy_configuration(enable_app_insights=True)\n",
-"\n",
-"service_name = 'ps-aks-service'\n",
-"\n",
-"# Remove any existing service under the same name.\n",
-"try:\n",
-"    Webservice(ws, service_name).delete()\n",
-"except WebserviceException:\n",
-"    pass\n",
-"\n",
-"inference_config = InferenceConfig(runtime = 'spark-py', \n",
-"                                   entry_script ='score_sparkml.py',\n",
-"                                   conda_file ='mydeployenv.yml')\n",
-"\n",
-"aks_service = Model.deploy(ws, service_name, [mymodel], inference_config, aks_config, aks_target)\n",
-"aks_service.wait_for_deployment(show_output=True)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"aks_service.deployment_status"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#for using the Web HTTP API \n",
-"print(aks_service.scoring_uri)\n",
-"print(aks_service.get_keys())"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"import json\n",
-"\n",
-"#get the some sample data\n",
-"test_data_path = \"AdultCensusIncomeTest\"\n",
-"test = spark.read.parquet(test_data_path).limit(5)\n",
-"\n",
-"test_json = json.dumps(test.toJSON().collect())\n",
-"\n",
-"print(test_json)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#using data defined above predict if income is >50K (1) or <=50K (0)\n",
-"aks_service.run(input_data=test_json)"
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"metadata": {},
-"outputs": [],
-"source": [
-"#comment to not delete the web service\n",
-"aks_service.delete()\n",
-"#model.delete()\n",
-"aks_target.delete() "
-]
-},
-{
-"cell_type": "markdown",
-"metadata": {},
-"source": [
-""
-]
-}
-],
-"metadata": {
-"authors": [
-{
-"name": "pasha"
-}
-],
-"kernelspec": {
-"display_name": "Python 3.6",
-"language": "python",
-"name": "python36"
-},
-"language_info": {
-"codemirror_mode": {
-"name": "ipython",
-"version": 3
-},
-"file_extension": ".py",
-"mimetype": "text/x-python",
-"name": "python",
-"nbconvert_exporter": "python",
-"pygments_lexer": "ipython3",
-"version": "3.6.8"
-},
-"name": "deploy-to-aks-existingimage-05",
-"notebookId": 1030695628045968
-},
-"nbformat": 4,
-"nbformat_minor": 1
-}
@@ -640,7 +640,7 @@
 "\n",
 "myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'], pip_packages=['azureml-defaults', 'azureml-sdk[automl]'])\n",
 "\n",
-"conda_env_file_name = 'mydeployenv.yml'\n",
+"conda_env_file_name = 'myenv.yml'\n",
 "myenv.save_to_file('.', conda_env_file_name)"
 ]
 },

@@ -664,17 +664,27 @@
 "from azureml.exceptions import WebserviceException\n",
 "from azureml.core.model import InferenceConfig\n",
 "from azureml.core.model import Model\n",
+"from azureml.core.environment import Environment\n",
+"from azureml.core.conda_dependencies import CondaDependencies\n",
 "import uuid\n",
 "\n",
+"\n",
 "myaci_config = AciWebservice.deploy_configuration(\n",
 "    cpu_cores = 2, \n",
 "    memory_gb = 2, \n",
 "    tags = {'name':'Databricks Azure ML ACI'}, \n",
 "    description = 'This is for ADB and AutoML example.')\n",
 "\n",
-"inference_config = InferenceConfig(runtime= 'spark-py', \n",
-"                                   entry_script='score.py',\n",
-"                                   conda_file='mydeployenv.yml')\n",
+"myenv = Environment.get(ws, name='AzureML-PySpark-MmlSpark-0.15')\n",
+"# we need to add extra packages to the curated environment\n",
+"# in order to deploy the amended environment we need to rename it\n",
+"myenv.name = 'myenv'\n",
+"model_dependencies = CondaDependencies('myenv.yml')\n",
+"for pip_dep in model_dependencies.pip_packages:\n",
+"    myenv.python.conda_dependencies.add_pip_package(pip_dep)\n",
+"for conda_dep in model_dependencies.conda_packages:\n",
+"    myenv.python.conda_dependencies.add_conda_package(conda_dep)\n",
+"inference_config = InferenceConfig(entry_script='score_sparkml.py', environment=myenv)\n",
 "\n",
 "guid = str(uuid.uuid4()).split(\"-\")[0]\n",
 "service_name = \"myservice-{}\".format(guid)\n",
how-to-use-azureml/azureml-sdk-for-r/README.md (new file, 36 lines)
@@ -0,0 +1,36 @@
+## Examples to get started with Azure Machine Learning SDK for R
+
+Learn how to use Azure Machine Learning SDK for R for experimentation and model management.
+
+As a pre-requisite, go through the [Installation](vignettes/installation.Rmd) and [Configuration](vignettes/configuration.Rmd) vignettes to first install the package and set up your Azure Machine Learning Workspace unless you are running these examples on an Azure Machine Learning compute instance. Azure Machine Learning compute instances have the Azure Machine Learning SDK pre-installed and your workspace details pre-configured.
+
+
+Samples
+* Deployment
+  * [deploy-to-aci](./samples/deployment/deploy-to-aci): Deploy a model as a web service to Azure Container Instances (ACI).
+  * [deploy-to-local](./samples/deployment/deploy-to-local): Deploy a model as a web service locally.
+* Training
+  * [train-on-amlcompute](./samples/training/train-on-amlcompute): Train a model on a remote AmlCompute cluster.
+  * [train-on-local](./samples/training/train-on-local): Train a model locally with Docker.
+
+Vignettes
+* [deploy-to-aks](./vignettes/deploy-to-aks): Production deploy a model as a web service to Azure Kubernetes Service (AKS).
+* [hyperparameter-tune-with-keras](./vignettes/hyperparameter-tune-with-keras): Hyperparameter tune a Keras model using HyperDrive, Azure ML's hyperparameter tuning functionality.
+* [train-and-deploy-to-aci](./vignettes/train-and-deploy-to-aci): Train a caret model and deploy as a web service to Azure Container Instances (ACI).
+* [train-with-tensorflow](./vignettes/train-with-tensorflow): Train a deep learning TensorFlow model with Azure ML.
+
+Find more information on the [official documentation site for Azure Machine Learning SDK for R](https://azure.github.io/azureml-sdk-for-r/).
+
+
+### Troubleshooting
+
+- If the following error occurs when submitting an experiment using RStudio:
+  ```R
+  Error in py_call_impl(callable, dots$args, dots$keywords) :
+   PermissionError: [Errno 13] Permission denied
+  ```
+  Move the files for your project into a subdirectory and reset the working directory to that directory before re-submitting.
+
+  In order to submit an experiment, the Azure ML SDK must create a .zip file of the project directory to send to the service. However, the SDK does not have permission to write into the .Rproj.user subdirectory that is automatically created during an RStudio session. For this reason, the recommended best practice is to isolate project files into their own directory.
how-to-use-azureml/azureml-sdk-for-r/samples/README.md (new file, 11 lines)
@@ -0,0 +1,11 @@
+## Azure Machine Learning samples
+These samples are short code examples for using Azure Machine Learning SDK for R. If you are new to the R SDK, we recommend that you first take a look at the more detailed end-to-end [vignettes](../vignettes).
+
+Before running a sample in RStudio, set the working directory to the folder that contains the sample script in RStudio using `setwd(dirname)` or Session -> Set Working Directory -> To Source File Location. Each vignette assumes that the data and scripts are in the current working directory.
+
+1. [train-on-amlcompute](training/train-on-amlcompute): Train a model on a remote AmlCompute cluster.
+2. [train-on-local](training/train-on-local): Train a model locally with Docker.
+3. [deploy-to-aci](deployment/deploy-to-aci): Deploy a model as a web service to Azure Container Instances (ACI).
+4. [deploy-to-local](deployment/deploy-to-local): Deploy a model as a web service locally.
+
+> Before you run these samples, make sure you have an Azure Machine Learning workspace. You can follow the [configuration vignette](../vignettes/configuration.Rmd) to set up a workspace. (You do not need to do this if you are running these examples on an Azure Machine Learning compute instance.)
@@ -0,0 +1,59 @@
+# Copyright(c) Microsoft Corporation.
+# Licensed under the MIT license.
+
+library(azuremlsdk)
+library(jsonlite)
+
+ws <- load_workspace_from_config()
+
+# Register the model
+model <- register_model(ws, model_path = "project_files/model.rds",
+                        model_name = "model.rds")
+
+# Create environment
+r_env <- r_environment(name = "r_env")
+
+# Create inference config
+inference_config <- inference_config(
+  entry_script = "score.R",
+  source_directory = "project_files",
+  environment = r_env)
+
+# Create ACI deployment config
+deployment_config <- aci_webservice_deployment_config(cpu_cores = 1,
+                                                      memory_gb = 1)
+
+# Deploy the web service
+service <- deploy_model(ws,
+                        'rservice',
+                        list(model),
+                        inference_config,
+                        deployment_config)
+wait_for_deployment(service, show_output = TRUE)
+
+# If you encounter any issue in deploying the webservice, please visit
+# https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment
+
+# Inferencing
+# versicolor
+plant <- data.frame(Sepal.Length = 6.4,
+                    Sepal.Width = 2.8,
+                    Petal.Length = 4.6,
+                    Petal.Width = 1.8)
+# setosa
+plant <- data.frame(Sepal.Length = 5.1,
+                    Sepal.Width = 3.5,
+                    Petal.Length = 1.4,
+                    Petal.Width = 0.2)
+# virginica
+plant <- data.frame(Sepal.Length = 6.7,
+                    Sepal.Width = 3.3,
+                    Petal.Length = 5.2,
+                    Petal.Width = 2.3)
+
+# Test the web service
+predicted_val <- invoke_webservice(service, toJSON(plant))
+predicted_val
+
+# Delete the web service
+delete_webservice(service)
Binary file not shown.
@@ -0,0 +1,17 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.

library(jsonlite)

init <- function() {
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))
  message("model is loaded")

  function(data) {
    plant <- as.data.frame(fromJSON(data))
    prediction <- predict(model, plant)
    result <- as.character(prediction)
    toJSON(result)
  }
}
@@ -0,0 +1,112 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.

# Register model and deploy locally
# This example shows how to deploy a web service in step-by-step fashion:
#
# 1) Register model
# 2) Deploy the model as a web service in a local Docker container.
# 3) Invoke the web service with the SDK or call it with a raw HTTP request.
# 4) Quickly test changes to your entry script by reloading the local service.
# 5) Optionally, you can also make changes to the model and update the local service.

library(azuremlsdk)
library(jsonlite)

ws <- load_workspace_from_config()

# Register the model
model <- register_model(ws, model_path = "project_files/model.rds",
                        model_name = "model.rds")

# Create environment
r_env <- r_environment(name = "r_env")

# Create inference config
inference_config <- inference_config(
  entry_script = "score.R",
  source_directory = "project_files",
  environment = r_env)

# Create local deployment config
local_deployment_config <- local_webservice_deployment_config()

# Deploy the web service
# NOTE:
# The Docker image runs as a Linux container. If you are running Docker for Windows, you need to ensure the Linux Engine is running:
# # PowerShell command to switch to Linux engine
# & 'C:\Program Files\Docker\Docker\DockerCli.exe' -SwitchLinuxEngine
service <- deploy_model(ws,
                        'rservice-local',
                        list(model),
                        inference_config,
                        local_deployment_config)
# Wait for deployment
wait_for_deployment(service, show_output = TRUE)

# Show the port of the local service
message(service$port)

# If you encounter any issues in deploying the web service, please visit
# https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment

# Inferencing
# versicolor
# plant <- data.frame(Sepal.Length = 6.4,
#                     Sepal.Width = 2.8,
#                     Petal.Length = 4.6,
#                     Petal.Width = 1.8)
# setosa
plant <- data.frame(Sepal.Length = 5.1,
                    Sepal.Width = 3.5,
                    Petal.Length = 1.4,
                    Petal.Width = 0.2)
# virginica
# plant <- data.frame(Sepal.Length = 6.7,
#                     Sepal.Width = 3.3,
#                     Petal.Length = 5.2,
#                     Petal.Width = 2.3)

# Test the web service
invoke_webservice(service, toJSON(plant))

## The last few lines of the logs should contain the correct prediction and should display -> R[write to console]: "setosa"
cat(gsub(pattern = "\n", replacement = " \n", x = get_webservice_logs(service)))

## Test the web service with a raw HTTP request
#
# NOTE:
# To test the service locally, use the http://localhost:<local_service$port> URL

# Import the request library
library(httr)
# Get the scoring URL from the service object; this URL is for testing locally
local_service_url <- service$scoring_uri # Same as http://localhost:<local_service$port>

# POST request to the web service
resp <- POST(local_service_url, body = plant, encode = "json", verbose())

## The last few lines of the logs should contain the correct prediction and should display -> R[write to console]: "setosa"
cat(gsub(pattern = "\n", replacement = " \n", x = get_webservice_logs(service)))

# Optional: use a new scoring script
inference_config <- inference_config(
  entry_script = "score_new.R",
  source_directory = "project_files",
  environment = r_env)

## Then reload the service to pick up the changes
reload_local_webservice_assets(service)

## Check the reloaded service; the last log line will say "this is a new scoring script! I was reloaded"
invoke_webservice(service, toJSON(plant))
cat(gsub(pattern = "\n", replacement = " \n", x = get_webservice_logs(service)))

# Update service
# If you want to change your model(s), environment, or deployment configuration, call update_local_webservice() to rebuild the Docker image.
# Note that the models parameter takes an R list:
# update_local_webservice(service, models = list(new_model_object), deployment_config = deployment_config, wait = FALSE, inference_config = inference_config)

# Delete service
delete_local_webservice(service)
Binary file not shown.
@@ -0,0 +1,18 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.

library(jsonlite)

init <- function() {
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))
  message("model is loaded")

  function(data) {
    plant <- as.data.frame(fromJSON(data))
    prediction <- predict(model, plant)
    result <- as.character(prediction)
    message(result)
    toJSON(result)
  }
}
@@ -0,0 +1,19 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.

library(jsonlite)

init <- function() {
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))
  message("model is loaded")

  function(data) {
    plant <- as.data.frame(fromJSON(data))
    prediction <- predict(model, plant)
    result <- as.character(prediction)
    message(result)
    message("this is a new scoring script! I was reloaded")
    toJSON(result)
  }
}
@@ -0,0 +1,34 @@
# This script loads the iris dataset, whose last column is the
# class label, trains a classifier, and logs the accuracy

library(azuremlsdk)
library(caret)
library(optparse)
library(datasets)

# data() loads the dataset into the session; assign it to a variable to use it
data(iris)
iris_data <- iris
summary(iris_data)

in_train <- createDataPartition(y = iris_data$Species, p = .8, list = FALSE)
train_data <- iris_data[in_train, ]
test_data <- iris_data[-in_train, ]

# Run algorithms using 10-fold cross validation
control <- trainControl(method = "cv", number = 10)
metric <- "Accuracy"

set.seed(7)
model <- train(Species ~ .,
               data = train_data,
               method = "lda",
               metric = metric,
               trControl = control)
predictions <- predict(model, test_data)
conf_matrix <- confusionMatrix(predictions, test_data$Species)
message(conf_matrix)

log_metric_to_run(metric, conf_matrix$overall["Accuracy"])

saveRDS(model, file = "./outputs/model.rds")
message("Model saved")
@@ -0,0 +1,41 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.

# Reminder: set working directory to current file location prior to running this script

library(azuremlsdk)

ws <- load_workspace_from_config()

# Create AmlCompute cluster
cluster_name <- "r-cluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
  vm_size <- "STANDARD_D2_V2"
  compute_target <- create_aml_compute(workspace = ws,
                                       cluster_name = cluster_name,
                                       vm_size = vm_size,
                                       max_nodes = 1)

  wait_for_provisioning_completion(compute_target, show_output = TRUE)
}

# Define estimator
est <- estimator(source_directory = "scripts",
                 entry_script = "train.R",
                 compute_target = compute_target)

experiment_name <- "train-r-script-on-amlcompute"
exp <- experiment(ws, experiment_name)

# Submit job and display the run details
run <- submit_experiment(exp, est)
view_run_details(run)
wait_for_run_completion(run, show_output = TRUE)

# Get the run metrics
metrics <- get_run_metrics(run)
metrics

# Delete cluster
delete_compute(compute_target)
@@ -0,0 +1,28 @@
# This script loads the iris dataset, whose last column is the
# class label, trains a classifier, and logs the accuracy

library(azuremlsdk)
library(caret)
library(datasets)

# data() loads the dataset into the session; assign it to a variable to use it
data(iris)
iris_data <- iris
summary(iris_data)

in_train <- createDataPartition(y = iris_data$Species, p = .8, list = FALSE)
train_data <- iris_data[in_train, ]
test_data <- iris_data[-in_train, ]

# Run algorithms using 10-fold cross validation
control <- trainControl(method = "cv", number = 10)
metric <- "Accuracy"

set.seed(7)
model <- train(Species ~ .,
               data = train_data,
               method = "lda",
               metric = metric,
               trControl = control)
predictions <- predict(model, test_data)
conf_matrix <- confusionMatrix(predictions, test_data$Species)
message(conf_matrix)

log_metric_to_run(metric, conf_matrix$overall["Accuracy"])
@@ -0,0 +1,26 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.

# Reminder: set working directory to current file location prior to running this script

library(azuremlsdk)

ws <- load_workspace_from_config()

# Define estimator
est <- estimator(source_directory = "scripts",
                 entry_script = "train.R",
                 compute_target = "local")

# Initialize experiment
experiment_name <- "train-r-script-on-local"
exp <- experiment(ws, experiment_name)

# Submit job and display the run details
run <- submit_experiment(exp, est)
view_run_details(run)
wait_for_run_completion(run, show_output = TRUE)

# Get the run metrics
metrics <- get_run_metrics(run)
metrics
how-to-use-azureml/azureml-sdk-for-r/vignettes/README.md
@@ -0,0 +1,17 @@
## Azure Machine Learning vignettes

These vignettes are end-to-end tutorials for using the Azure Machine Learning SDK for R.

Before running a vignette in RStudio, set the working directory to the folder that contains the vignette file (.Rmd file) using `setwd(dirname)` or Session -> Set Working Directory -> To Source File Location. Each vignette assumes that the data and scripts are in the current working directory.

The following vignettes are included:
1. [installation](installation.Rmd): Install the Azure ML SDK for R.
2. [configuration](configuration.Rmd): Set up an Azure ML workspace.
3. [train-and-deploy-to-aci](train-and-deploy-to-aci): Train a caret model and deploy it as a web service to Azure Container Instances (ACI).
4. [train-with-tensorflow](train-with-tensorflow/): Train a deep learning TensorFlow model with Azure ML.
5. [hyperparameter-tune-with-keras](hyperparameter-tune-with-keras/): Hyperparameter tune a Keras model using HyperDrive, Azure ML's hyperparameter tuning functionality.
6. [deploy-to-aks](deploy-to-aks/): Deploy a model as a production web service to Azure Kubernetes Service (AKS).

> Before you run these vignettes, make sure you have an Azure Machine Learning workspace. You can follow the [configuration vignette](../vignettes/configuration.Rmd) to set up a workspace. (You do not need to do this if you are running these examples on an Azure Machine Learning compute instance.)

For additional examples on using the R SDK, see the [samples](../samples) folder.
how-to-use-azureml/azureml-sdk-for-r/vignettes/configuration.Rmd
@@ -0,0 +1,108 @@
---
title: "Set up an Azure ML workspace"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Set up an Azure ML workspace}
  %\VignetteEngine{knitr::rmarkdown}
  \use_package{UTF-8}
---

This tutorial gets you started with the Azure Machine Learning service by walking through the requirements and instructions for setting up a workspace, the top-level resource for Azure ML.

You do not need to run this if you are working on an Azure Machine Learning compute instance, as the compute instance is already associated with an existing workspace.

## What is an Azure ML workspace?
The workspace is the top-level resource for Azure ML, providing a centralized place to work with all the artifacts you create when you use Azure ML. The workspace keeps a history of all training runs, including logs, metrics, output, and a snapshot of your scripts.

When you create a new workspace, it automatically creates several Azure resources that are used by the workspace:

* Azure Container Registry: Registers Docker containers that you use during training and when you deploy a model. To minimize costs, ACR is lazy-loaded until deployment images are created.
* Azure Storage account: Used as the default datastore for the workspace.
* Azure Application Insights: Stores monitoring information about your models.
* Azure Key Vault: Stores secrets that are used by compute targets and other sensitive information that's needed by the workspace.

## Setup
This section describes the steps required before you can access any Azure ML service functionality.

### Azure subscription
In order to create an Azure ML workspace, you first need access to an Azure subscription. An Azure subscription allows you to manage storage, compute, and other assets in the Azure cloud. You can [create a new subscription](https://azure.microsoft.com/en-us/free/) or access existing subscription information from the [Azure portal](https://portal.azure.com/). Later in this tutorial you will need information such as your subscription ID in order to create and access workspaces.

### Azure ML SDK installation
Follow the [installation guide](https://azure.github.io/azureml-sdk-for-r/articles/installation.html) to install **azuremlsdk** on your machine.

## Configure your workspace
### Workspace parameters
To use an Azure ML workspace, you will need to supply the following information:

* Your subscription ID
* A resource group name
* (Optional) The region that will host your workspace
* A name for your workspace

You can get your subscription ID from the [Azure portal](https://portal.azure.com/).

You will also need access to a [resource group](https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-overview#resource-groups), which organizes Azure resources and provides a default region for the resources in a group. You can see which resource groups you have access to, or create a new one, in the Azure portal. If you don't have a resource group, the `create_workspace()` method will create one for you using the name you provide.

The region to host your workspace will be used if you are creating a new workspace. You do not need to specify this if you are using an existing workspace. You can find the list of supported regions [here](https://azure.microsoft.com/en-us/global-infrastructure/services/?products=machine-learning-service). You should pick a region that is close to your location or that contains your data.

The name for your workspace is unique within the subscription and should be descriptive enough to discern among other workspaces. The subscription may be used only by you, or it may be used by your department or your entire enterprise, so choose a name that makes sense for your situation.

The following code chunk allows you to specify your workspace parameters. It uses `Sys.getenv` to read values from environment variables, which is useful for automation. If no environment variable exists, the parameters will be set to the specified default values. Replace the default values in the code below with your own parameter values.

``` {r configure_parameters, eval=FALSE}
subscription_id <- Sys.getenv("SUBSCRIPTION_ID", unset = "<my-subscription-id>")
resource_group <- Sys.getenv("RESOURCE_GROUP", unset = "<my-resource-group>")
workspace_name <- Sys.getenv("WORKSPACE_NAME", unset = "<my-workspace-name>")
workspace_region <- Sys.getenv("WORKSPACE_REGION", unset = "eastus2")
```

### Create a new workspace
If you don't have an existing workspace and are the owner of the subscription or resource group, you can create a new workspace. If you don't have a resource group, `create_workspace()` will create one for you using the name you provide. If you don't want it to do so, set the `create_resource_group = FALSE` parameter.

Note: As with other Azure services, there are limits on certain resources (e.g. AmlCompute quota) associated with the Azure ML service. Please read this [article](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-manage-quotas) on the default limits and how to request more quota.

This cell will create an Azure ML workspace for you in a subscription, provided you have the correct permissions.

This will fail if:

* You do not have permission to create a workspace in the resource group.
* You do not have permission to create a resource group if it does not exist.
* You are not a subscription owner or contributor and no Azure ML workspaces have ever been created in this subscription.

If workspace creation fails, please work with your IT admin to provide you with the appropriate permissions or to provision the required resources.

There are additional parameters not shown below that can be configured when creating a workspace. Please see [`create_workspace()`](https://azure.github.io/azureml-sdk-for-r/reference/create_workspace.html) for more details.

``` {r create_workspace, eval=FALSE}
library(azuremlsdk)

ws <- create_workspace(name = workspace_name,
                       subscription_id = subscription_id,
                       resource_group = resource_group,
                       location = workspace_region,
                       exist_ok = TRUE)
```

You can write out the workspace ARM properties to a config file with [`write_workspace_config()`](https://azure.github.io/azureml-sdk-for-r/reference/write_workspace_config.html). The method provides a simple way of reusing the same workspace across multiple files or projects. Users can save the workspace details with `write_workspace_config()`, and use [`load_workspace_from_config()`](https://azure.github.io/azureml-sdk-for-r/reference/load_workspace_from_config.html) to load the same workspace in different files or projects without retyping the workspace ARM properties. The method defaults to writing out the config file to the current working directory with "config.json" as the file name. To specify a different path or file name, set the `path` and `file_name` parameters.

``` {r write_config, eval=FALSE}
write_workspace_config(ws)
```

### Access an existing workspace
You can access an existing workspace in a couple of ways. If your workspace properties were previously saved to a config file, you can load the workspace as follows:

``` {r load_config, eval=FALSE}
ws <- load_workspace_from_config()
```

If Azure ML cannot find the config file, specify the path to the config file with the `path` parameter. The method defaults to starting the search in the current directory.
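For example, a minimal sketch (the directory path is illustrative):

``` {r load_config_path, eval=FALSE}
# Look for config.json under a specific project directory instead of the current one
ws <- load_workspace_from_config(path = "<path-to-directory-containing-config>")
```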
You can also initialize a workspace using the [`get_workspace()`](https://azure.github.io/azureml-sdk-for-r/reference/get_workspace.html) method.

``` {r get_workspace, eval=FALSE}
ws <- get_workspace(name = workspace_name,
                    subscription_id = subscription_id,
                    resource_group = resource_group)
```
@@ -0,0 +1,188 @@
---
title: "Deploy a web service to Azure Kubernetes Service"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Deploy a web service to Azure Kubernetes Service}
  %\VignetteEngine{knitr::rmarkdown}
  \use_package{UTF-8}
---

This tutorial demonstrates how to deploy a model as a web service on [Azure Kubernetes Service](https://azure.microsoft.com/en-us/services/kubernetes-service/) (AKS). AKS is good for high-scale production deployments; use it if you need one or more of the following capabilities:

* Fast response time
* Autoscaling of the deployed service
* Hardware acceleration options such as GPU

You will learn to:

* Set up your testing environment
* Register a model
* Provision an AKS cluster
* Deploy the model to AKS
* Test the deployed service

## Prerequisites
If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.

## Set up your testing environment
Start by setting up your environment. This includes importing the **azuremlsdk** package and connecting to your workspace.

### Import package
```{r import_package, eval=FALSE}
library(azuremlsdk)
```

### Load your workspace
Instantiate a workspace object from your existing workspace. The following code will load the workspace details from a **config.json** file if you previously wrote one out with `write_workspace_config()`.
```{r load_workspace, eval=FALSE}
ws <- load_workspace_from_config()
```

Or, you can retrieve a workspace by directly specifying your workspace details:
```{r get_workspace, eval=FALSE}
ws <- get_workspace("<your workspace name>", "<your subscription ID>", "<your resource group>")
```

## Register the model
In this tutorial we will deploy a model that was trained in one of the [samples](https://github.com/Azure/azureml-sdk-for-r/blob/master/samples/training/train-on-amlcompute/train-on-amlcompute.R). The model was trained with the Iris dataset and can be used to determine if a flower is one of three Iris flower species (setosa, versicolor, virginica). We have provided the model file (`model.rds`) for the tutorial; it is located in the "project_files" directory of this vignette.

First, register the model to your workspace with [`register_model()`](https://azure.github.io/azureml-sdk-for-r/reference/register_model.html). A registered model can be any collection of files, but in this case the R model file is sufficient. Azure ML will use the registered model for deployment.

```{r register_model, eval=FALSE}
model <- register_model(ws,
                        model_path = "project_files/model.rds",
                        model_name = "iris_model",
                        description = "Predict an Iris flower type")
```

## Provision an AKS cluster
When deploying a web service to AKS, you deploy to an AKS cluster that is connected to your workspace. There are two ways to connect an AKS cluster to your workspace:

* Create the AKS cluster. The process automatically connects the cluster to the workspace.
* Attach an existing AKS cluster to your workspace. You can attach a cluster with the [`attach_aks_compute()`](https://azure.github.io/azureml-sdk-for-r/reference/attach_aks_compute.html) method, as shown in the sketch after this list.
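A minimal sketch of attaching an existing cluster (the resource group and cluster names are placeholders; check the `attach_aks_compute()` reference for the full set of parameters):

```{r attach_aks, eval=FALSE}
aks_target <- attach_aks_compute(ws,
                                 resource_group = "<resource-group-of-cluster>",
                                 cluster_name = "<existing-cluster-name>")
```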
Creating or attaching an AKS cluster is a one-time process for your workspace. You can reuse this cluster for multiple deployments. If you delete the cluster or the resource group that contains it, you must create a new cluster the next time you need to deploy.

In this tutorial, we will go with the first method of provisioning a new cluster. See the [`create_aks_compute()`](https://azure.github.io/azureml-sdk-for-r/reference/create_aks_compute.html) reference for the full set of configurable parameters. If you pick custom values for the `agent_count` and `vm_size` parameters, you need to make sure the cluster's total capacity (`agent_count` multiplied by the number of virtual CPUs per VM of the chosen `vm_size`) is greater than or equal to 12 virtual CPUs.

``` {r provision_cluster, eval=FALSE}
aks_target <- create_aks_compute(ws, cluster_name = 'myakscluster')

wait_for_provisioning_completion(aks_target, show_output = TRUE)
```

The Azure ML SDK does not provide support for scaling an AKS cluster. To scale the nodes in the cluster, use the UI for your AKS cluster in the Azure portal. You can only change the node count, not the VM size of the cluster.

## Deploy as a web service
### Define the inference dependencies
To deploy a model, you need an **inference configuration**, which describes the environment needed to host the model and web service. To create an inference config, you will first need a scoring script and an Azure ML environment.

The scoring script (`entry_script`) is an R script that will take as input variable values (in JSON format) and output a prediction from your model. For this tutorial, use the provided scoring file `score.R`. The scoring script must contain an `init()` method that loads your model and returns a function that uses the model to make a prediction based on the input data. See the [documentation](https://azure.github.io/azureml-sdk-for-r/reference/inference_config.html#details) for more details.
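For reference, the provided `score.R` (reproduced in full later in this diff) follows this shape:

```{r score_shape, eval=FALSE}
library(jsonlite)

init <- function() {
  # Load the registered model from the directory Azure ML mounts it into
  model <- readRDS(file.path(Sys.getenv("AZUREML_MODEL_DIR"), "model.rds"))

  # Return the scoring function: JSON in, JSON prediction out
  function(data) {
    plant <- as.data.frame(fromJSON(data))
    toJSON(as.character(predict(model, plant)))
  }
}
```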
Next, define an Azure ML **environment** for your script's package dependencies. With an environment, you specify R packages (from CRAN or elsewhere) that are needed for your script to run. You can also provide the values of environment variables that your script can reference to modify its behavior.

By default Azure ML will build a default Docker image that includes R, the Azure ML SDK, and additional required dependencies for deployment. See the documentation here for the full list of dependencies that will be installed in the default container. You can also specify additional packages to be installed at runtime, or even a custom Docker image to be used instead of the base image that will be built, using the other available parameters to [`r_environment()`](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html).

```{r create_env, eval=FALSE}
r_env <- r_environment(name = "deploy_env")
```

Now you have everything you need to create an inference config for encapsulating your scoring script and environment dependencies.

``` {r create_inference_config, eval=FALSE}
inference_config <- inference_config(
  entry_script = "score.R",
  source_directory = "project_files",
  environment = r_env)
```

### Deploy to AKS
Now, define the deployment configuration that describes the compute resources needed, for example, the number of cores and memory. See [`aks_webservice_deployment_config()`](https://azure.github.io/azureml-sdk-for-r/reference/aks_webservice_deployment_config.html) for the full set of configurable parameters.

``` {r deploy_config, eval=FALSE}
aks_config <- aks_webservice_deployment_config(cpu_cores = 1, memory_gb = 1)
```

Now, deploy your model as a web service to the AKS cluster you created earlier.

```{r deploy_service, eval=FALSE}
aks_service <- deploy_model(ws,
                            'my-new-aksservice',
                            models = list(model),
                            inference_config = inference_config,
                            deployment_config = aks_config,
                            deployment_target = aks_target)

wait_for_deployment(aks_service, show_output = TRUE)
```

To inspect the logs from the deployment:
```{r get_logs, eval=FALSE}
get_webservice_logs(aks_service)
```

If you encounter any issues in deploying the web service, please visit the [troubleshooting guide](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment).

## Test the deployed service
Now that your model is deployed as a service, you can test the service from R using [`invoke_webservice()`](https://azure.github.io/azureml-sdk-for-r/reference/invoke_webservice.html). Provide a new set of data to predict from, convert it to JSON, and send it to the service.

``` {r test_service, eval=FALSE}
library(jsonlite)
# versicolor
plant <- data.frame(Sepal.Length = 6.4,
                    Sepal.Width = 2.8,
                    Petal.Length = 4.6,
                    Petal.Width = 1.8)

# setosa
# plant <- data.frame(Sepal.Length = 5.1,
#                     Sepal.Width = 3.5,
#                     Petal.Length = 1.4,
#                     Petal.Width = 0.2)

# virginica
# plant <- data.frame(Sepal.Length = 6.7,
#                     Sepal.Width = 3.3,
#                     Petal.Length = 5.2,
#                     Petal.Width = 2.3)

predicted_val <- invoke_webservice(aks_service, toJSON(plant))
message(predicted_val)
```

You can also get the web service's HTTP endpoint, which accepts REST client calls. You can share this endpoint with anyone who wants to test the web service or integrate it into an application.

``` {r eval=FALSE}
aks_service$scoring_uri
```

## Web service authentication
When deploying to AKS, key-based authentication is enabled by default. You can also enable token-based authentication. Token-based authentication requires clients to use an Azure Active Directory account to request an authentication token, which is used to make requests to the deployed service.

To disable key-based auth, set the `auth_enabled = FALSE` parameter when creating the deployment configuration with [`aks_webservice_deployment_config()`](https://azure.github.io/azureml-sdk-for-r/reference/aks_webservice_deployment_config.html).
To enable token-based auth, set `token_auth_enabled = TRUE` when creating the deployment config.

### Key-based authentication
If key authentication is enabled, you can use the [`get_webservice_keys()`](https://azure.github.io/azureml-sdk-for-r/reference/get_webservice_keys.html) method to retrieve a primary and secondary authentication key. To generate a new key, use [`generate_new_webservice_key()`](https://azure.github.io/azureml-sdk-for-r/reference/generate_new_webservice_key.html).

### Token-based authentication
If token authentication is enabled, you can use the [`get_webservice_token()`](https://azure.github.io/azureml-sdk-for-r/reference/get_webservice_token.html) method to retrieve a JWT token and that token's expiration time. Make sure to request a new token after the token's expiration time.
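For example, a minimal sketch of retrieving credentials for this service (which call applies depends on the authentication mode you enabled):

```{r webservice_auth, eval=FALSE}
# Key-based auth (the default): retrieve the primary and secondary keys
keys <- get_webservice_keys(aks_service)

# Token-based auth: retrieve a JWT token and its expiration time
token <- get_webservice_token(aks_service)
```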
## Clean up resources
Delete the resources once you no longer need them. Do not delete any resource you plan on still using.

Delete the web service:
```{r delete_service, eval=FALSE}
delete_webservice(aks_service)
```

Delete the registered model:
```{r delete_model, eval=FALSE}
delete_model(model)
```

Delete the AKS cluster:
```{r delete_cluster, eval=FALSE}
delete_compute(aks_target)
```
Binary file not shown.
@@ -0,0 +1,17 @@
#' Copyright(c) Microsoft Corporation.
#' Licensed under the MIT license.

library(jsonlite)

init <- function() {
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))
  message("model is loaded")

  function(data) {
    plant <- as.data.frame(fromJSON(data))
    prediction <- predict(model, plant)
    result <- as.character(prediction)
    toJSON(result)
  }
}
@@ -0,0 +1,242 @@
---
title: "Hyperparameter tune a Keras model"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Hyperparameter tune a Keras model}
  %\VignetteEngine{knitr::rmarkdown}
  \use_package{UTF-8}
---

This tutorial demonstrates how you can efficiently tune hyperparameters for a model using HyperDrive, Azure ML's hyperparameter tuning functionality. You will train a Keras model on the CIFAR10 dataset, automate hyperparameter exploration, launch parallel jobs, log your results, and find the best run.

### What are hyperparameters?

Hyperparameters are configuration values you choose before training a model, as opposed to parameters learned from the data. Learning rate, number of epochs, and batch size are all examples of hyperparameters.

Using brute-force methods to find the optimal values for hyperparameters can be time-consuming, and poor-performing runs can result in wasted money. To avoid this, HyperDrive automates hyperparameter exploration in a time-saving and cost-effective manner by launching several parallel runs with different configurations and finding the configuration that results in the best performance on your primary metric.

Let's get started with the example to see how it works!

## Prerequisites

If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.

## Set up development environment
The setup for your development work in this tutorial includes the following actions:

* Import required packages
* Connect to a workspace
* Create an experiment to track your runs
* Create a remote compute target to use for training

### Import **azuremlsdk** package
```{r eval=FALSE}
library(azuremlsdk)
```

### Load your workspace
Instantiate a workspace object from your existing workspace. The following code will load the workspace details from a **config.json** file if you previously wrote one out with [`write_workspace_config()`](https://azure.github.io/azureml-sdk-for-r/reference/write_workspace_config.html).
```{r load_workpace, eval=FALSE}
ws <- load_workspace_from_config()
```

Or, you can retrieve a workspace by directly specifying your workspace details:
```{r get_workpace, eval=FALSE}
ws <- get_workspace("<your workspace name>", "<your subscription ID>", "<your resource group>")
```

### Create an experiment
An Azure ML **experiment** tracks a grouping of runs, typically from the same training script. Create an experiment to track hyperparameter tuning runs for the Keras model.

```{r create_experiment, eval=FALSE}
exp <- experiment(workspace = ws, name = 'hyperdrive-cifar10')
```

If you would like to track your runs in an existing experiment, simply specify that experiment's name to the `name` parameter of `experiment()`.

### Create a compute target
By using Azure Machine Learning Compute (AmlCompute), a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. In this tutorial, you create a GPU-enabled cluster as your training environment. The code below creates the compute cluster for you if it doesn't already exist in your workspace.

You may need to wait a few minutes for your compute cluster to be provisioned if it doesn't already exist.

```{r create_cluster, eval=FALSE}
cluster_name <- "gpucluster"

compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
  vm_size <- "STANDARD_NC6"
  compute_target <- create_aml_compute(workspace = ws,
                                       cluster_name = cluster_name,
                                       vm_size = vm_size,
                                       max_nodes = 4)

  wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
```

## Prepare the training script
A training script called `cifar10_cnn.R` has been provided for you in the "project_files" directory of this tutorial.

In order to leverage HyperDrive, the training script for your model must log the relevant metrics during model training. When you configure the hyperparameter tuning run, you specify the primary metric to use for evaluating run performance. You must log this metric so it is available to the hyperparameter tuning process.

In order to log the required metrics, you need to do the following **inside the training script**:

* Import the **azuremlsdk** package
```
library(azuremlsdk)
```

* Take the hyperparameters as command-line arguments to the script (see the sketch after this list). This is necessary so that when HyperDrive carries out the hyperparameter sweep, it can run the training script with different values for the hyperparameters as defined by the search space.

* Use the [`log_metric_to_run()`](https://azure.github.io/azureml-sdk-for-r/reference/log_metric_to_run.html) function to log the hyperparameters and the primary metric.
```
log_metric_to_run("batch_size", batch_size)
...
log_metric_to_run("epochs", epochs)
...
log_metric_to_run("lr", lr)
...
log_metric_to_run("decay", decay)
...
log_metric_to_run("Loss", results[[1]])
```
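For reference, a minimal sketch of the argument handling used by `cifar10_cnn.R` (shown in full later in this diff), assuming HyperDrive passes each hyperparameter as a `--name value` pair so that the values sit at the even positions of the argument vector:

```
args <- commandArgs(trailingOnly = TRUE)
batch_size <- as.numeric(args[2])
epochs <- as.numeric(args[4])
lr <- as.numeric(args[6])
decay <- as.numeric(args[8])
```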
## Create an estimator

An Azure ML **estimator** encapsulates the run configuration information needed for executing a training script on the compute target. Azure ML runs are executed as containerized jobs on the specified compute target. By default, the Docker image built for your training job will include R, the Azure ML SDK, and a set of commonly used R packages. See the full list of default packages included [here](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html). The estimator is used to define the configuration for each of the child runs that the parent HyperDrive run will kick off.

To create the estimator, define the following:

* The directory that contains your scripts needed for training (`source_directory`). All the files in this directory are uploaded to the cluster node(s) for execution. The directory must contain your training script and any additional scripts required.
* The training script that will be executed (`entry_script`).
* The compute target (`compute_target`), in this case the AmlCompute cluster you created earlier.
* Any environment dependencies required for training. Since the training script requires the Keras package, which is not included in the image by default, pass the package name to the `cran_packages` parameter to have it installed in the Docker container where the job will run. See the [`estimator()`](https://azure.github.io/azureml-sdk-for-r/reference/estimator.html) reference for the full set of configurable options.
* Set the `use_gpu = TRUE` flag so the default base GPU Docker image will be built, since the job will be run on a GPU cluster.

```{r create_estimator, eval=FALSE}
est <- estimator(source_directory = "project_files",
                 entry_script = "cifar10_cnn.R",
                 compute_target = compute_target,
                 cran_packages = c("keras"),
                 use_gpu = TRUE)
```

## Configure the HyperDrive run
To kick off hyperparameter tuning in Azure ML, you will need to configure a HyperDrive run, which will in turn launch individual child runs of the training scripts with the corresponding hyperparameter values.

### Define search space

In this experiment, we will use four hyperparameters: batch size, number of epochs, learning rate, and decay. In order to begin tuning, we must define the range of values we would like to explore and how they will be distributed. This is called a parameter space definition and can be created with discrete or continuous ranges.

__Discrete hyperparameters__ are specified as a choice among discrete values represented as a list.

Advanced discrete hyperparameters can also be specified using a distribution. The following distributions are supported:

* `quniform(low, high, q)`
* `qloguniform(low, high, q)`
* `qnormal(mu, sigma, q)`
* `qlognormal(mu, sigma, q)`

__Continuous hyperparameters__ are specified as a distribution over a continuous range of values. The following distributions are supported:

* `uniform(low, high)`
* `loguniform(low, high)`
* `normal(mu, sigma)`
* `lognormal(mu, sigma)`

Here, we will use the [`random_parameter_sampling()`](https://azure.github.io/azureml-sdk-for-r/reference/random_parameter_sampling.html) function to define the search space for each hyperparameter. `batch_size` and `epochs` will be chosen from discrete sets while `lr` and `decay` will be drawn from continuous distributions.

Other available sampling function options are:

* [`grid_parameter_sampling()`](https://azure.github.io/azureml-sdk-for-r/reference/grid_parameter_sampling.html)
* [`bayesian_parameter_sampling()`](https://azure.github.io/azureml-sdk-for-r/reference/bayesian_parameter_sampling.html)

```{r search_space, eval=FALSE}
sampling <- random_parameter_sampling(list(batch_size = choice(c(16, 32, 64)),
                                           epochs = choice(c(200, 350, 500)),
                                           lr = normal(0.0001, 0.005),
                                           decay = uniform(1e-6, 3e-6)))
```

### Define termination policy

To prevent resource waste, Azure ML can detect and terminate poorly performing runs. HyperDrive will do this automatically if you specify an early termination policy.

Here, you will use [`bandit_policy()`](https://azure.github.io/azureml-sdk-for-r/reference/bandit_policy.html), which terminates any runs where the primary metric is not within the specified slack factor with respect to the best-performing training run.

```{r termination_policy, eval=FALSE}
policy <- bandit_policy(slack_factor = 0.15)
```

Other termination policy options are:

* [`median_stopping_policy()`](https://azure.github.io/azureml-sdk-for-r/reference/median_stopping_policy.html)
* [`truncation_selection_policy()`](https://azure.github.io/azureml-sdk-for-r/reference/truncation_selection_policy.html)

If no policy is provided, all runs will continue to completion regardless of performance.

### Finalize configuration

Now, you can create a `HyperDriveConfig` object to define your HyperDrive run. Along with the sampling and policy definitions, you need to specify the name of the primary metric that you want to track and whether you want to maximize or minimize it. The `primary_metric_name` must correspond with the name of the primary metric you logged in your training script. `max_total_runs` specifies the total number of child runs to launch. See the [hyperdrive_config()](https://azure.github.io/azureml-sdk-for-r/reference/hyperdrive_config.html) reference for the full set of configurable parameters.

```{r create_config, eval=FALSE}
hyperdrive_config <- hyperdrive_config(hyperparameter_sampling = sampling,
                                       primary_metric_goal = primary_metric_goal("MINIMIZE"),
                                       primary_metric_name = "Loss",
                                       max_total_runs = 4,
                                       policy = policy,
                                       estimator = est)
```

## Submit the HyperDrive run

Finally, submit the experiment to run on your cluster. The parent HyperDrive run will launch the individual child runs. `submit_experiment()` will return a `HyperDriveRun` object that you will use to interface with the run. In this tutorial, since the cluster we created scales to a maximum of `4` nodes, all 4 child runs will be launched in parallel.

```{r submit_run, eval=FALSE}
hyperdrive_run <- submit_experiment(exp, hyperdrive_config)
```

You can view the HyperDrive run's details as a table. Clicking the "Web View" link provided will bring you to Azure Machine Learning studio, where you can monitor the run in the UI.

```{r eval=FALSE}
view_run_details(hyperdrive_run)
```

Wait until hyperparameter tuning is complete before you run more code.

```{r eval=FALSE}
wait_for_run_completion(hyperdrive_run, show_output = TRUE)
```

## Analyse runs by performance

Finally, you can view and compare the metrics collected during all of the child runs!

```{r analyse_runs, eval=FALSE}
# Get the metrics of all the child runs
child_run_metrics <- get_child_run_metrics(hyperdrive_run)
child_run_metrics

# Get the child run objects sorted in descending order by the best primary metric
child_runs <- get_child_runs_sorted_by_primary_metric(hyperdrive_run)
child_runs

# Directly get the run object of the best performing run
best_run <- get_best_run_by_primary_metric(hyperdrive_run)

# Get the metrics of the best performing run
metrics <- get_run_metrics(best_run)
metrics
```

The `metrics` variable will include the values of the hyperparameters that resulted in the best performing run.

## Clean up resources
Delete the resources once you no longer need them. Don't delete any resource you plan to still use.

Delete the compute cluster:
```{r delete_compute, eval=FALSE}
delete_compute(compute_target)
```
@@ -0,0 +1,124 @@
#' Modified from: "https://github.com/rstudio/keras/blob/master/vignettes/
#' examples/cifar10_cnn.R"
#'
#' Train a simple deep CNN on the CIFAR10 small images dataset.
#'
#' It gets down to 0.65 test logloss in 25 epochs, and down to 0.55 after 50
#' epochs, though it is still underfitting at that point.

library(keras)
install_keras()

library(azuremlsdk)

# Parameters --------------------------------------------------------------

# Hyperparameters arrive as "--name value" pairs, so the values sit at the
# even positions of the argument vector.
args <- commandArgs(trailingOnly = TRUE)

batch_size <- as.numeric(args[2])
log_metric_to_run("batch_size", batch_size)

epochs <- as.numeric(args[4])
log_metric_to_run("epochs", epochs)

lr <- as.numeric(args[6])
log_metric_to_run("lr", lr)

decay <- as.numeric(args[8])
log_metric_to_run("decay", decay)

data_augmentation <- TRUE


# Data Preparation --------------------------------------------------------

# See ?dataset_cifar10 for more info
cifar10 <- dataset_cifar10()

# Feature scale RGB values in test and train inputs
x_train <- cifar10$train$x / 255
x_test <- cifar10$test$x / 255
y_train <- to_categorical(cifar10$train$y, num_classes = 10)
y_test <- to_categorical(cifar10$test$y, num_classes = 10)


# Defining Model ----------------------------------------------------------

# Initialize sequential model
model <- keras_model_sequential()

model %>%

  # Start with hidden 2D convolutional layer being fed 32x32 pixel images
  layer_conv_2d(
    filter = 32, kernel_size = c(3, 3), padding = "same",
    input_shape = c(32, 32, 3)
  ) %>%
  layer_activation("relu") %>%

  # Second hidden layer
  layer_conv_2d(filter = 32, kernel_size = c(3, 3)) %>%
  layer_activation("relu") %>%

  # Use max pooling
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_dropout(0.25) %>%

  # 2 additional hidden 2D convolutional layers
  layer_conv_2d(filter = 32, kernel_size = c(3, 3), padding = "same") %>%
  layer_activation("relu") %>%
  layer_conv_2d(filter = 32, kernel_size = c(3, 3)) %>%
  layer_activation("relu") %>%

  # Use max pooling once more
  layer_max_pooling_2d(pool_size = c(2, 2)) %>%
  layer_dropout(0.25) %>%

  # Flatten max filtered output into feature vector
  # and feed into dense layer
  layer_flatten() %>%
  layer_dense(512) %>%
  layer_activation("relu") %>%
  layer_dropout(0.5) %>%

  # Outputs from dense layer are projected onto 10 unit output layer
  layer_dense(10) %>%
  layer_activation("softmax")

# Pass lr and decay by name: passed positionally, decay would bind to rho
opt <- optimizer_rmsprop(lr = lr, decay = decay)

model %>%
  compile(loss = "categorical_crossentropy",
          optimizer = opt,
          metrics = "accuracy"
  )


# Training ----------------------------------------------------------------

if (!data_augmentation) {

  model %>%
    fit(x_train,
        y_train,
        batch_size = batch_size,
        epochs = epochs,
        validation_data = list(x_test, y_test),
        shuffle = TRUE
    )

} else {

  datagen <- image_data_generator(rotation_range = 20,
                                  width_shift_range = 0.2,
                                  height_shift_range = 0.2,
                                  horizontal_flip = TRUE
  )

  datagen %>% fit_image_data_generator(x_train)

  results <- evaluate(model, x_train, y_train, batch_size)
  log_metric_to_run("Loss", results[[1]])
  cat("Loss: ", results[[1]], "\n")
  cat("Accuracy: ", results[[2]], "\n")
}
how-to-use-azureml/azureml-sdk-for-r/vignettes/installation.Rmd
@@ -0,0 +1,100 @@
|
|||||||
|
---
title: "Install the Azure ML SDK for R"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Install the Azure ML SDK for R}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

This article covers the step-by-step instructions for installing the Azure ML SDK for R.

You do not need to run this if you are working on an Azure Machine Learning Compute Instance, as the compute instance already has the Azure ML SDK preinstalled.

## Install Conda

If you do not have Conda already installed on your machine, you will first need to install it, since the Azure ML R SDK uses **reticulate** to bind to the Python SDK. We recommend installing [Miniconda](https://docs.conda.io/en/latest/miniconda.html), which is a smaller, lightweight version of Anaconda. Choose the 64-bit binary for Python 3.5 or later.
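If you are not sure whether R can already find a Conda installation, a quick check (a minimal sketch; it assumes the **reticulate** package is installed) is to ask **reticulate** for the conda binary it will use:
``` {r check_conda, eval=FALSE}
# Returns the path to the conda binary reticulate resolves,
# or raises an error if no Conda installation can be found
reticulate::conda_binary()
```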
## Install the **azuremlsdk** R package

You will need **remotes** to install **azuremlsdk** from CRAN.
``` {r install_remotes, eval=FALSE}
install.packages('remotes')
```

Then, you can use the `install_cran` function to install the package.
``` {r install_azuremlsdk, eval=FALSE}
remotes::install_cran('azuremlsdk', repos = 'https://cloud.r-project.org/')
```

If you are using R installed from CRAN, which comes with 32-bit and 64-bit binaries, you may need to specify the parameter `INSTALL_opts=c("--no-multiarch")` to only build for the current 64-bit architecture.
``` {r eval=FALSE}
remotes::install_cran('azuremlsdk', repos = 'https://cloud.r-project.org/', INSTALL_opts=c("--no-multiarch"))
```

## Install the Azure ML Python SDK

Lastly, use the **azuremlsdk** R library to install the Python SDK. By default, `azuremlsdk::install_azureml()` will install the [latest version of the Python SDK](https://pypi.org/project/azureml-sdk/) in a conda environment called `r-azureml` if reticulate < 1.14, or `r-reticulate` if reticulate >= 1.14.
``` {r install_pythonsdk, eval=FALSE}
azuremlsdk::install_azureml()
```

If you would like to override the default version, environment name, or Python version, you can pass in those arguments. If you would like to restart the R session after installation, or delete the conda environment if it already exists and create a new one, you can also do so:
``` {r eval=FALSE}
azuremlsdk::install_azureml(version = NULL,
                            custom_envname = "<your conda environment name>",
                            conda_python_version = "<desired python version>",
                            restart_session = TRUE,
                            remove_existing_env = TRUE)
```

## Test installation

You can confirm your installation worked by loading the library and successfully retrieving a run.
``` {r test_installation, eval=FALSE}
library(azuremlsdk)
get_current_run()
```

## Troubleshooting

- In step 3 of the installation, if you get SSL errors on Windows, they are due to an outdated OpenSSL binary. Install the latest OpenSSL binaries from [here](https://wiki.openssl.org/index.php/Binaries).

- If installation fails due to this error:

  ```R
  Error in strptime(xx, f, tz = tz) :
    (converted from warning) unable to identify current timezone 'C':
    please set environment variable 'TZ'
  In R CMD INSTALL
  Error in i.p(...) :
    (converted from warning) installation of package ‘C:/.../azureml_0.4.0.tar.gz’ had non-zero exit status
  ```

  You will need to set your time zone environment variable to GMT and restart the installation process.

  ```R
  Sys.setenv(TZ='GMT')
  ```

- If the following permission error occurs while installing in RStudio, change your RStudio session to administrator mode, and re-run the installation command.

  ```R
  Downloading GitHub repo Azure/azureml-sdk-for-r@master
  Skipping 2 packages ahead of CRAN: reticulate, rlang
  Running `R CMD build`...

  Error: (converted from warning) invalid package
    'C:/.../file2b441bf23631'
  In R CMD INSTALL
  Error in i.p(...) :
    (converted from warning) installation of package
    ‘C:/.../file2b441bf23631’ had non-zero exit status
  In addition: Warning messages:
  1: In file(con, "r") :
    cannot open file 'C:...\file2b44144a540f': Permission denied
  2: In file(con, "r") :
    cannot open file 'C:...\file2b4463c21577': Permission denied
  ```
@@ -0,0 +1,16 @@
#' Copyright(c) Microsoft Corporation.
#' Licensed under the MIT license.

library(jsonlite)

init <- function() {
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))
  message("logistic regression model loaded")

  function(data) {
    vars <- as.data.frame(fromJSON(data))
    prediction <- as.numeric(predict(model, vars, type = "response") * 100)
    toJSON(prediction)
  }
}
@@ -0,0 +1,33 @@
#' Copyright(c) Microsoft Corporation.
#' Licensed under the MIT license.

library(azuremlsdk)
library(optparse)
library(caret)

options <- list(
  make_option(c("-d", "--data_folder"))
)

opt_parser <- OptionParser(option_list = options)
opt <- parse_args(opt_parser)

paste(opt$data_folder)

accidents <- readRDS(file.path(opt$data_folder, "accidents.Rd"))
summary(accidents)

mod <- glm(dead ~ dvcat + seatbelt + frontal + sex + ageOFocc + yearVeh + airbag + occRole,
           family = binomial, data = accidents)
summary(mod)
predictions <- factor(ifelse(predict(mod) > 0.1, "dead", "alive"))
conf_matrix <- confusionMatrix(predictions, accidents$dead)
message(conf_matrix)

log_metric_to_run("Accuracy", conf_matrix$overall["Accuracy"])

output_dir <- "outputs"
if (!dir.exists(output_dir)) {
  dir.create(output_dir)
}
saveRDS(mod, file = "./outputs/model.rds")
message("Model saved")
@@ -0,0 +1,326 @@
---
title: "Train and deploy your first model with Azure ML"
author: "David Smith"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Train and deploy your first model with Azure ML}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

In this tutorial, you learn the foundational design patterns in Azure Machine Learning. You'll train and deploy a **caret** model to predict the likelihood of a fatality in an automobile accident. After completing this tutorial, you'll have the practical knowledge of the R SDK to scale up to developing more-complex experiments and workflows.

In this tutorial, you learn the following tasks:

* Connect your workspace
* Load data and prepare for training
* Upload data to the datastore so it is available for remote training
* Create a compute resource
* Train a caret model to predict probability of fatality
* Deploy a prediction endpoint
* Test the model from R

## Prerequisites

If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.

## Set up your development environment

The setup for your development work in this tutorial includes the following actions:

* Install required packages
* Connect to a workspace, so that your local computer can communicate with remote resources
* Create an experiment to track your runs
* Create a remote compute target to use for training

### Install required packages

This tutorial assumes you already have the Azure ML SDK installed. Go ahead and import the **azuremlsdk** package.

```{r eval=FALSE}
library(azuremlsdk)
```

The tutorial uses data from the [**DAAG** package](https://cran.r-project.org/package=DAAG). Install the package if you don't have it.

```{r eval=FALSE}
install.packages("DAAG")
```

The training and scoring scripts (`accidents.R` and `accident_predict.R`) have some additional dependencies. If you plan on running those scripts locally, make sure you have those required packages as well.

### Load your workspace

Instantiate a workspace object from your existing workspace. The following code will load the workspace details from the **config.json** file. You can also retrieve a workspace using [`get_workspace()`](https://azure.github.io/azureml-sdk-for-r/reference/get_workspace.html).

```{r load_workpace, eval=FALSE}
ws <- load_workspace_from_config()
```

### Create an experiment

An Azure ML experiment tracks a grouping of runs, typically from the same training script. Create an experiment to track the runs for training the caret model on the accidents data.

```{r create_experiment, eval=FALSE}
experiment_name <- "accident-logreg"
exp <- experiment(ws, experiment_name)
```

### Create a compute target

By using Azure Machine Learning Compute (AmlCompute), a managed service, data scientists can train machine learning models on clusters of Azure virtual machines, including VMs with GPU support. In this tutorial, you create a single-node AmlCompute cluster as your training environment. The code below creates the compute cluster for you if it doesn't already exist in your workspace.

You may need to wait a few minutes for your compute cluster to be provisioned if it doesn't already exist.

```{r create_cluster, eval=FALSE}
cluster_name <- "rcluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
  vm_size <- "STANDARD_D2_V2"
  compute_target <- create_aml_compute(workspace = ws,
                                       cluster_name = cluster_name,
                                       vm_size = vm_size,
                                       max_nodes = 1)

  wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
```

## Prepare data for training

This tutorial uses data from the **DAAG** package. This dataset includes data from over 25,000 car crashes in the US, with variables you can use to predict the likelihood of a fatality. First, import the data into R, transform it into a new dataframe `accidents` for analysis, and export it to an RDS file (`accidents.Rd`).

```{r load_data, eval=FALSE}
library(DAAG)
data(nassCDS)

accidents <- na.omit(nassCDS[, c("dead", "dvcat", "seatbelt", "frontal", "sex", "ageOFocc", "yearVeh", "airbag", "occRole")])
accidents$frontal <- factor(accidents$frontal, labels = c("notfrontal", "frontal"))
accidents$occRole <- factor(accidents$occRole)

saveRDS(accidents, file = "accidents.Rd")
```

### Upload data to the datastore

Upload data to the cloud so that it can be accessed by your remote training environment. Each Azure ML workspace comes with a default datastore that stores the connection information to the Azure blob container that is provisioned in the storage account attached to the workspace. The following code will upload the accidents data you created above to that datastore.

```{r upload_data, eval=FALSE}
ds <- get_default_datastore(ws)

target_path <- "accidentdata"
upload_files_to_datastore(ds,
                          list("./project_files/accidents.Rd"),
                          target_path = target_path,
                          overwrite = TRUE)
```


## Train a model

For this tutorial, fit a logistic regression model on your uploaded data using your remote compute cluster. To submit a job, you need to:

* Prepare the training script
* Create an estimator
* Submit the job

### Prepare the training script

A training script called `accidents.R` has been provided for you in the "project_files" directory of this tutorial. Notice the following details **inside the training script** that have been done to leverage the Azure ML service for training (a condensed sketch follows this list):

* The training script takes an argument `-d` to find the directory that contains the training data. When you define and submit your job later, you point to the datastore for this argument. Azure ML will mount the storage folder to the remote cluster for the training job.
* The training script logs the final accuracy as a metric to the run record in Azure ML using `log_metric_to_run()`. The Azure ML SDK provides a set of logging APIs for logging various metrics during training runs. These metrics are recorded and persisted in the experiment run record. The metrics can then be accessed at any time or viewed in the run details page in [Azure Machine Learning studio](http://ml.azure.com). See the [reference](https://azure.github.io/azureml-sdk-for-r/reference/index.html#section-training-experimentation) for the full set of logging methods `log_*()`.
* The training script saves your model into a directory named **outputs**. The `./outputs` folder receives special treatment by Azure ML. During training, files written to `./outputs` are automatically uploaded to your run record by Azure ML and persisted as artifacts. By saving the trained model to `./outputs`, you'll be able to access and retrieve your model file even after the run is over and you no longer have access to your remote training environment.
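Condensed from the `accidents.R` script shown earlier in this document, those three integration points look like this (a trimmed sketch, not the full script):

```{r eval=FALSE}
library(azuremlsdk)
library(optparse)

# 1. Accept the data folder as a command-line argument
options <- list(make_option(c("-d", "--data_folder")))
opt <- parse_args(OptionParser(option_list = options))

# ... fit the model `mod` and compute `conf_matrix` as in accidents.R ...

# 2. Log the final accuracy as a metric on the run record
log_metric_to_run("Accuracy", conf_matrix$overall["Accuracy"])

# 3. Save the trained model to ./outputs so it is persisted as an artifact
if (!dir.exists("outputs")) dir.create("outputs")
saveRDS(mod, file = "./outputs/model.rds")
```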
### Create an estimator

An Azure ML estimator encapsulates the run configuration information needed for executing a training script on the compute target. Azure ML runs are executed as containerized jobs on the specified compute target. By default, the Docker image built for your training job will include R, the Azure ML SDK, and a set of commonly used R packages. See the full list of default packages included [here](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html).

To create the estimator, define:

* The directory that contains your scripts needed for training (`source_directory`). All the files in this directory are uploaded to the cluster node(s) for execution. The directory must contain your training script and any additional scripts required.
* The training script that will be executed (`entry_script`).
* The compute target (`compute_target`), in this case the AmlCompute cluster you created earlier.
* The parameters required from the training script (`script_params`). Azure ML will run your training script as a command-line script with `Rscript`. In this tutorial you specify one argument to the script, the data directory mounting point, which you can access with `ds$path(target_path)`.
* Any environment dependencies required for training. The default Docker image built for training already contains the three packages (`caret`, `e1071`, and `optparse`) needed in the training script, so you don't need to specify additional information. If you are using R packages that are not included by default, use the estimator's `cran_packages` parameter to add additional CRAN packages. See the [`estimator()`](https://azure.github.io/azureml-sdk-for-r/reference/estimator.html) reference for the full set of configurable options.

```{r create_estimator, eval=FALSE}
est <- estimator(source_directory = "project_files",
                 entry_script = "accidents.R",
                 script_params = list("--data_folder" = ds$path(target_path)),
                 compute_target = compute_target
                 )
```

### Submit the job on the remote cluster

Finally, submit the job to run on your cluster. `submit_experiment()` returns a Run object that you then use to interface with the run. In total, the first run takes **about 10 minutes**. For later runs, as long as the script dependencies don't change, the same Docker image is reused; the image is cached and the container startup time is much faster.

```{r submit_job, eval=FALSE}
run <- submit_experiment(exp, est)
```

You can view a table of the run's details. Clicking the "Web View" link provided will bring you to Azure Machine Learning studio, where you can monitor the run in the UI.

```{r view_run, eval=FALSE}
view_run_details(run)
```

Model training happens in the background. Wait until the model has finished training before you run more code.

```{r wait_run, eval=FALSE}
wait_for_run_completion(run, show_output = TRUE)
```

You -- and colleagues with access to the workspace -- can submit multiple experiments in parallel, and Azure ML will take care of scheduling the tasks on the compute cluster. You can even configure the cluster to automatically scale up to multiple nodes, and scale back when there are no more compute tasks in the queue. This configuration is a cost-effective way for teams to share compute resources.

## Retrieve training results

Once your model has finished training, you can access the artifacts of your job that were persisted to the run record, including any metrics logged and the final trained model.

### Get the logged metrics

In the training script `accidents.R`, you logged a metric from your model: the accuracy of the predictions in the training data. You can see metrics in the [studio](https://ml.azure.com), or extract them to the local session as an R list as follows:

```{r metrics, eval=FALSE}
metrics <- get_run_metrics(run)
metrics
```

If you've run multiple experiments (say, using differing variables, algorithms, or hyperparameters), you can use the metrics from each run to compare and choose the model you'll use in production.
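For instance (a sketch; `run1` and `run2` stand in for two completed runs that each logged an "Accuracy" metric):

```{r eval=FALSE}
metrics1 <- get_run_metrics(run1)
metrics2 <- get_run_metrics(run2)
c(run1 = metrics1$Accuracy, run2 = metrics2$Accuracy)
```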
### Get the trained model

You can retrieve the trained model and look at the results in your local R session. The following code will download the contents of the `./outputs` directory, which includes the model file.

```{r retrieve_model, eval=FALSE}
download_files_from_run(run, prefix="outputs/")
accident_model <- readRDS("project_files/outputs/model.rds")
summary(accident_model)
```

You see some factors that contribute to an increase in the estimated probability of death:

* higher impact speed
* male driver
* older occupant
* passenger

You see lower probabilities of death with:

* presence of airbags
* presence of seatbelts
* frontal collision

The vehicle year of manufacture does not have a significant effect.

You can use this model to make new predictions:

```{r manual_predict, eval=FALSE}
newdata <- data.frame( # valid values shown below
 dvcat="10-24",        # "1-9km/h" "10-24" "25-39" "40-54" "55+"
 seatbelt="none",      # "none" "belted"
 frontal="frontal",    # "notfrontal" "frontal"
 sex="f",              # "f" "m"
 ageOFocc=16,          # age in years, 16-97
 yearVeh=2002,         # year of vehicle, 1955-2003
 airbag="none",        # "none" "airbag"
 occRole="pass"        # "driver" "pass"
)

## predicted probability of death for these variables, as a percentage
as.numeric(predict(accident_model, newdata, type="response") * 100)
```

## Deploy as a web service

With your model, you can predict the danger of death from a collision. Use Azure ML to deploy your model as a prediction service. In this tutorial, you will deploy the web service in [Azure Container Instances](https://docs.microsoft.com/en-us/azure/container-instances/) (ACI).

### Register the model

First, register the model you downloaded to your workspace with [`register_model()`](https://azure.github.io/azureml-sdk-for-r/reference/register_model.html). A registered model can be any collection of files, but in this case the R model object is sufficient. Azure ML will use the registered model for deployment.

```{r register_model, eval=FALSE}
model <- register_model(ws,
                        model_path = "project_files/outputs/model.rds",
                        model_name = "accidents_model",
                        description = "Predict probability of auto accident")
```

### Define the inference dependencies

To create a web service for your model, you first need to create a scoring script (`entry_script`), an R script that will take as input variable values (in JSON format) and output a prediction from your model. For this tutorial, use the provided scoring file `accident_predict.R`. The scoring script must contain an `init()` method that loads your model and returns a function that uses the model to make a prediction based on the input data. See the [documentation](https://azure.github.io/azureml-sdk-for-r/reference/inference_config.html#details) for more details.
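A trimmed sketch of that pattern, following the provided `accident_predict.R` shown earlier in this document:

```{r eval=FALSE}
library(jsonlite)

init <- function() {
  # Azure ML sets AZUREML_MODEL_DIR to the folder containing the registered model
  model <- readRDS(file.path(Sys.getenv("AZUREML_MODEL_DIR"), "model.rds"))

  # Return the scoring function: JSON in, JSON out
  function(data) {
    vars <- as.data.frame(fromJSON(data))
    toJSON(as.numeric(predict(model, vars, type = "response") * 100))
  }
}
```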
Next, define an Azure ML **environment** for your script's package dependencies. With an environment, you specify R packages (from CRAN or elsewhere) that are needed for your script to run. You can also provide the values of environment variables that your script can reference to modify its behavior. By default, Azure ML will build the same default Docker image used with the estimator for training. Since the tutorial has no special requirements, create an environment with no special attributes.

```{r create_environment, eval=FALSE}
r_env <- r_environment(name = "basic_env")
```

If you want to use your own Docker image for deployment instead, specify the `custom_docker_image` parameter. See the [`r_environment()`](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html) reference for the full set of configurable options for defining an environment.

Now you have everything you need to create an **inference config** for encapsulating your scoring script and environment dependencies.

``` {r create_inference_config, eval=FALSE}
inference_config <- inference_config(
  entry_script = "accident_predict.R",
  source_directory = "project_files",
  environment = r_env)
```

### Deploy to ACI

In this tutorial, you will deploy your service to ACI. This code provisions a single container to respond to inbound requests, which is suitable for testing and light loads. See [`aci_webservice_deployment_config()`](https://azure.github.io/azureml-sdk-for-r/reference/aci_webservice_deployment_config.html) for additional configurable options. (For production-scale deployments, you can also [deploy to Azure Kubernetes Service](https://azure.github.io/azureml-sdk-for-r/articles/deploy-to-aks/deploy-to-aks.html).)

``` {r create_aci_config, eval=FALSE}
aci_config <- aci_webservice_deployment_config(cpu_cores = 1, memory_gb = 0.5)
```

Now you deploy your model as a web service. Deployment **can take several minutes**.

```{r deploy_service, eval=FALSE}
aci_service <- deploy_model(ws,
                            'accident-pred',
                            list(model),
                            inference_config,
                            aci_config)

wait_for_deployment(aci_service, show_output = TRUE)
```

If you encounter any issues in deploying the web service, please visit the [troubleshooting guide](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment).

## Test the deployed service

Now that your model is deployed as a service, you can test the service from R using [`invoke_webservice()`](https://azure.github.io/azureml-sdk-for-r/reference/invoke_webservice.html). Provide a new set of data to predict from, convert it to JSON, and send it to the service.

```{r test_deployment, eval=FALSE}
library(jsonlite)

newdata <- data.frame( # valid values shown below
 dvcat="10-24",        # "1-9km/h" "10-24" "25-39" "40-54" "55+"
 seatbelt="none",      # "none" "belted"
 frontal="frontal",    # "notfrontal" "frontal"
 sex="f",              # "f" "m"
 ageOFocc=22,          # age in years, 16-97
 yearVeh=2002,         # year of vehicle, 1955-2003
 airbag="none",        # "none" "airbag"
 occRole="pass"        # "driver" "pass"
)

prob <- invoke_webservice(aci_service, toJSON(newdata))
prob
```

You can also get the web service's HTTP endpoint, which accepts REST client calls. You can share this endpoint with anyone who wants to test the web service or integrate it into an application.

```{r get_endpoint, eval=FALSE}
aci_service$scoring_uri
```
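For example, the endpoint can be called from R with a plain HTTP client instead of the SDK (a sketch; it assumes the **httr** package is installed and reuses the `newdata` data frame from above):

```{r call_endpoint, eval=FALSE}
library(httr)

# POST the serialized input to the scoring URI and read back the prediction
resp <- POST(aci_service$scoring_uri,
             body = toJSON(newdata),
             content_type_json())
content(resp)
```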
## Clean up resources

Delete the resources once you no longer need them. Don't delete any resource you plan to still use.

Delete the web service:
```{r delete_service, eval=FALSE}
delete_webservice(aci_service)
```

Delete the registered model:
```{r delete_model, eval=FALSE}
delete_model(model)
```

Delete the compute cluster:
```{r delete_compute, eval=FALSE}
delete_compute(compute_target)
```
@@ -0,0 +1,62 @@
# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
# Copyright 2016 RStudio, Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================


library(tensorflow)
install_tensorflow(version = "1.13.2-gpu")

library(azuremlsdk)

# Create the model
x <- tf$placeholder(tf$float32, shape(NULL, 784L))
W <- tf$Variable(tf$zeros(shape(784L, 10L)))
b <- tf$Variable(tf$zeros(shape(10L)))

y <- tf$nn$softmax(tf$matmul(x, W) + b)

# Define loss and optimizer
y_ <- tf$placeholder(tf$float32, shape(NULL, 10L))
cross_entropy <- tf$reduce_mean(-tf$reduce_sum(y_ * log(y),
                                               reduction_indices = 1L))
train_step <- tf$train$GradientDescentOptimizer(0.5)$minimize(cross_entropy)

# Create session and initialize variables
sess <- tf$Session()
sess$run(tf$global_variables_initializer())

# Load mnist data
datasets <- tf$contrib$learn$datasets
mnist <- datasets$mnist$read_data_sets("MNIST-data", one_hot = TRUE)

# Train
for (i in 1:1000) {
  batches <- mnist$train$next_batch(100L)
  batch_xs <- batches[[1]]
  batch_ys <- batches[[2]]
  sess$run(train_step,
           feed_dict = dict(x = batch_xs, y_ = batch_ys))
}

# Test trained model
correct_prediction <- tf$equal(tf$argmax(y, 1L), tf$argmax(y_, 1L))
accuracy <- tf$reduce_mean(tf$cast(correct_prediction, tf$float32))
cat("Accuracy: ", sess$run(accuracy,
                           feed_dict = dict(x = mnist$test$images,
                                            y_ = mnist$test$labels)))

log_metric_to_run("accuracy",
                  sess$run(accuracy, feed_dict = dict(x = mnist$test$images,
                                                      y_ = mnist$test$labels)))
@@ -0,0 +1,143 @@
---
title: "Train a TensorFlow model"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Train a TensorFlow model}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---

This tutorial demonstrates how to run a TensorFlow job at scale using Azure ML. You will train a TensorFlow model to classify handwritten digits (MNIST) using a deep neural network (DNN) and log your results to the Azure ML service.

## Prerequisites

If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.

## Set up development environment

The setup for your development work in this tutorial includes the following actions:

* Import required packages
* Connect to a workspace
* Create an experiment to track your runs
* Create a remote compute target to use for training

### Import **azuremlsdk** package

```{r eval=FALSE}
library(azuremlsdk)
```

### Load your workspace

Instantiate a workspace object from your existing workspace. The following code will load the workspace details from a **config.json** file if you previously wrote one out with [`write_workspace_config()`](https://azure.github.io/azureml-sdk-for-r/reference/write_workspace_config.html).

```{r load_workpace, eval=FALSE}
ws <- load_workspace_from_config()
```

Or, you can retrieve a workspace by directly specifying your workspace details:

```{r get_workpace, eval=FALSE}
ws <- get_workspace("<your workspace name>", "<your subscription ID>", "<your resource group>")
```

### Create an experiment

An Azure ML **experiment** tracks a grouping of runs, typically from the same training script. Create an experiment to track the runs for training the TensorFlow model on the MNIST data.

```{r create_experiment, eval=FALSE}
exp <- experiment(workspace = ws, name = "tf-mnist")
```

If you would like to track your runs in an existing experiment, simply specify that experiment's name to the `name` parameter of `experiment()`.
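For example (a sketch; `"my-existing-experiment"` is a placeholder for the name of an experiment already in your workspace):

```{r eval=FALSE}
exp <- experiment(workspace = ws, name = "my-existing-experiment")
```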
### Create a compute target

By using Azure Machine Learning Compute (AmlCompute), a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. In this tutorial, you create a GPU-enabled cluster as your training environment. The code below creates the compute cluster for you if it doesn't already exist in your workspace.

You may need to wait a few minutes for your compute cluster to be provisioned if it doesn't already exist.

```{r create_cluster, eval=FALSE}
cluster_name <- "gpucluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
  vm_size <- "STANDARD_NC6"
  compute_target <- create_aml_compute(workspace = ws,
                                       cluster_name = cluster_name,
                                       vm_size = vm_size,
                                       max_nodes = 4)

  wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
```

## Prepare the training script

A training script called `tf_mnist.R` has been provided for you in the "project_files" directory of this tutorial. The Azure ML SDK provides a set of logging APIs for logging various metrics during training runs. These metrics are recorded and persisted in the experiment run record, and can be accessed at any time or viewed in the run details page in [Azure Machine Learning studio](http://ml.azure.com/).

In order to collect and upload run metrics, you need to do the following **inside the training script**:

* Import the **azuremlsdk** package
```
library(azuremlsdk)
```

* Add the [`log_metric_to_run()`](https://azure.github.io/azureml-sdk-for-r/reference/log_metric_to_run.html) function to track our primary metric, "accuracy", for this experiment. If you have your own training script with several important metrics, simply create a logging call for each one within the script.
```
log_metric_to_run("accuracy",
                  sess$run(accuracy,
                           feed_dict = dict(x = mnist$test$images, y_ = mnist$test$labels)))
```

See the [reference](https://azure.github.io/azureml-sdk-for-r/reference/index.html#section-training-experimentation) for the full set of logging methods `log_*()` available from the R SDK.

## Create an estimator

An Azure ML **estimator** encapsulates the run configuration information needed for executing a training script on the compute target. Azure ML runs are executed as containerized jobs on the specified compute target. By default, the Docker image built for your training job will include R, the Azure ML SDK, and a set of commonly used R packages. See the full list of default packages included [here](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html).

To create the estimator, define the following:

* The directory that contains your scripts needed for training (`source_directory`). All the files in this directory are uploaded to the cluster node(s) for execution. The directory must contain your training script and any additional scripts required.
* The training script that will be executed (`entry_script`).
* The compute target (`compute_target`), in this case the AmlCompute cluster you created earlier.
* Any environment dependencies required for training. Since the training script requires the TensorFlow package, which is not included in the image by default, pass the package name to the `cran_packages` parameter to have it installed in the Docker container where the job will run. See the [`estimator()`](https://azure.github.io/azureml-sdk-for-r/reference/estimator.html) reference for the full set of configurable options.
* Set the `use_gpu = TRUE` flag so the default base GPU Docker image will be built, since the job will be run on a GPU cluster.

```{r create_estimator, eval=FALSE}
est <- estimator(source_directory = "project_files",
                 entry_script = "tf_mnist.R",
                 compute_target = compute_target,
                 cran_packages = c("tensorflow"),
                 use_gpu = TRUE)
```

## Submit the job

Finally, submit the job to run on your cluster. [`submit_experiment()`](https://azure.github.io/azureml-sdk-for-r/reference/submit_experiment.html) returns a `Run` object that you can then use to interface with the run.

```{r submit_job, eval=FALSE}
run <- submit_experiment(exp, est)
```

You can view the run's details as a table. Clicking the "Web View" link provided will bring you to Azure Machine Learning studio, where you can monitor the run in the UI.

```{r eval=FALSE}
view_run_details(run)
```

Model training happens in the background. Wait until the model has finished training before you run more code.

```{r eval=FALSE}
wait_for_run_completion(run, show_output = TRUE)
```

## View run metrics

Once your job has finished, you can view the metrics collected during your TensorFlow run.

```{r get_metrics, eval=FALSE}
metrics <- get_run_metrics(run)
metrics
```

## Clean up resources

Delete the resources once you no longer need them. Don't delete any resource you plan to still use.

Delete the compute cluster:

```{r delete_compute, eval=FALSE}
delete_compute(compute_target)
```
@@ -195,7 +195,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"You can now create and/or use an Environment object when deploying a Webservice. The Environment can have been previously registered with your Workspace, or it will be registered with it as a part of the Webservice deployment. Only Environments that were created using azureml-defaults version 1.0.48 or later will work with this new handling however.\n",
+"You can now create and/or use an Environment object when deploying a Webservice. The Environment can have been previously registered with your Workspace, or it will be registered with it as a part of the Webservice deployment. Please note that your environment must include azureml-defaults with version >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service.\n",
 "\n",
 "More information can be found in our [using environments notebook](../training/using-environments/using-environments.ipynb)."
 ]
@@ -221,23 +221,30 @@
 "## Create Inference Configuration\n",
 "\n",
 "There is now support for a source directory, you can upload an entire folder from your local machine as dependencies for the Webservice.\n",
-"Note: in that case, your entry_script, conda_file, and extra_docker_file_steps paths are relative paths to the source_directory path.\n",
+"Note: in that case, the environment's entry_script and file_path are relative paths to the source_directory path; myenv.docker.base_dockerfile is a string containing extra docker steps or the contents of the docker file.\n",
 "\n",
 "Sample code for using a source directory:\n",
 "\n",
 "```python\n",
+"from azureml.core.environment import Environment\n",
+"from azureml.core.model import InferenceConfig\n",
+"\n",
+"myenv = Environment.from_conda_specification(name='myenv', file_path='env/myenv.yml')\n",
+"\n",
+"# explicitly set base_image to None when setting base_dockerfile\n",
+"myenv.docker.base_image = None\n",
+"# add extra docker commands to execute\n",
+"myenv.docker.base_dockerfile = \"FROM ubuntu\\n RUN echo \\\"hello\\\"\"\n",
+"\n",
 "inference_config = InferenceConfig(source_directory=\"C:/abc\",\n",
-" runtime= \"python\", \n",
 " entry_script=\"x/y/score.py\",\n",
-" conda_file=\"env/myenv.yml\", \n",
-" extra_docker_file_steps=\"helloworld.txt\")\n",
+" environment=myenv)\n",
 "```\n",
 "\n",
-" - source_directory = holds source path as string, this entire folder gets added in image so its really easy to access any files within this folder or subfolder\n",
-" - runtime = Which runtime to use for the image. Current supported runtimes are 'spark-py' and 'python\n",
-" - entry_script = contains logic specific to initializing your model and running predictions\n",
-" - conda_file = manages conda and python package dependencies.\n",
-" - extra_docker_file_steps = optional: any extra steps you want to inject into docker file"
+" - file_path: input parameter to Environment constructor. Manages conda and python package dependencies.\n",
+" - env.docker.base_dockerfile: any extra steps you want to inject into docker file\n",
+" - source_directory: holds source path as string, this entire folder gets added in image so its really easy to access any files within this folder or subfolder\n",
+" - entry_script: contains logic specific to initializing your model and running predictions"
 ]
 },
 {
@@ -20,7 +20,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"# Register model and deploy as webservice\n",
+"# Register model and deploy as webservice in ACI\n",
 "\n",
 "Following this notebook, you will:\n",
 "\n",
@@ -45,6 +45,7 @@
 "source": [
 "import azureml.core\n",
 "\n",
+"\n",
 "# Check core SDK version number.\n",
 "print('SDK version:', azureml.core.VERSION)"
 ]
@@ -70,6 +71,7 @@
 "source": [
 "from azureml.core import Workspace\n",
 "\n",
+"\n",
 "ws = Workspace.from_config()\n",
 "print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
 ]
@@ -91,6 +93,7 @@
 "source": [
 "from azureml.core import Dataset\n",
 "\n",
+"\n",
 "datastore = ws.get_default_datastore()\n",
 "datastore.upload_files(files=['./features.csv', './labels.csv'],\n",
 " target_path='sklearn_regression/',\n",
@@ -125,6 +128,7 @@
 "from azureml.core import Model\n",
 "from azureml.core.resource_configuration import ResourceConfiguration\n",
 "\n",
+"\n",
 "model = Model.register(workspace=ws,\n",
 " model_name='my-sklearn-model', # Name of the registered model in your workspace.\n",
 " model_path='./sklearn_regression_model.pkl', # Local file to upload and register as a model.\n",
@@ -159,6 +163,8 @@
 "\n",
 "The Azure Machine Learning service provides a default environment for supported model frameworks, including scikit-learn, based on the metadata you provided when registering your model. This is the easiest way to deploy your model.\n",
 "\n",
+"Even when you deploy your model to ACI with a default environment you can still customize the deploy configuration (i.e. the number of cores and amount of memory made available for the deployment) using the [AciWebservice.deploy_configuration()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aci.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none--). Look at the \"Use a custom environment\" section of this notebook for more information on deploy configuration.\n",
+"\n",
 "**Note**: This step can take several minutes."
 ]
 },
@@ -171,6 +177,7 @@
 "from azureml.core import Webservice\n",
 "from azureml.exceptions import WebserviceException\n",
 "\n",
+"\n",
 "service_name = 'my-sklearn-service'\n",
 "\n",
 "# Remove any existing service under the same name.\n",
@@ -198,6 +205,7 @@
 "source": [
 "import json\n",
 "\n",
+"\n",
 "input_payload = json.dumps({\n",
 " 'data': [\n",
 " [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
@@ -231,9 +239,9 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"### Use a custom environment (for all models)\n",
+"### Use a custom environment\n",
 "\n",
-"If you want more control over how your model is run, if it uses another framework, or if it has special runtime requirements, you can instead specify your own environment and scoring method.\n",
+"If you want more control over how your model is run, if it uses another framework, or if it has special runtime requirements, you can instead specify your own environment and scoring method. Custom environments can be used for any model you want to deploy.\n",
 "\n",
 "Specify the model's runtime environment by creating an [Environment](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.environment%28class%29?view=azure-ml-py) object and providing the [CondaDependencies](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.conda_dependencies.condadependencies?view=azure-ml-py) needed by your model."
 ]
@@ -247,6 +255,7 @@
 "from azureml.core import Environment\n",
 "from azureml.core.conda_dependencies import CondaDependencies\n",
 "\n",
+"\n",
 "environment = Environment('my-sklearn-environment')\n",
 "environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
 " 'azureml-defaults',\n",
@@ -278,7 +287,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"Deploy your model in the custom environment by providing an [InferenceConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.inferenceconfig?view=azure-ml-py) object to [Model.deploy()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#deploy-workspace--name--models--inference-config--deployment-config-none--deployment-target-none-).\n",
+"Deploy your model in the custom environment by providing an [InferenceConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.inferenceconfig?view=azure-ml-py) object to [Model.deploy()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#deploy-workspace--name--models--inference-config--deployment-config-none--deployment-target-none-). In this case we are also using the [AciWebservice.deploy_configuration()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.webservice.aci.aciwebservice#deploy-configuration-cpu-cores-none--memory-gb-none--tags-none--properties-none--description-none--location-none--auth-enabled-none--ssl-enabled-none--enable-app-insights-none--ssl-cert-pem-file-none--ssl-key-pem-file-none--ssl-cname-none--dns-name-label-none--) method to generate a custom deploy configuration.\n",
 "\n",
 "**Note**: This step can take several minutes."
 ]
@@ -288,15 +297,18 @@
 "execution_count": null,
 "metadata": {
 "tags": [
-"azuremlexception-remarks-sample"
+"azuremlexception-remarks-sample",
+"sample-aciwebservice-deploy-config"
 ]
 },
 "outputs": [],
 "source": [
 "from azureml.core import Webservice\n",
 "from azureml.core.model import InferenceConfig\n",
+"from azureml.core.webservice import AciWebservice\n",
 "from azureml.exceptions import WebserviceException\n",
 "\n",
+"\n",
 "service_name = 'my-custom-env-service'\n",
 "\n",
 "# Remove any existing service under the same name.\n",
@@ -305,11 +317,14 @@
 "except WebserviceException:\n",
 " pass\n",
 "\n",
-"inference_config = InferenceConfig(entry_script='score.py',\n",
-" source_directory='.',\n",
-" environment=environment)\n",
+"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
+"aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
 "\n",
-"service = Model.deploy(ws, service_name, [model], inference_config)\n",
+"service = Model.deploy(workspace=ws,\n",
+" name=service_name,\n",
+" models=[model],\n",
+" inference_config=inference_config,\n",
+" deployment_config=aci_config)\n",
 "service.wait_for_deployment(show_output=True)"
 ]
 },
@@ -326,8 +341,6 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"import json\n",
-"\n",
 "input_payload = json.dumps({\n",
 " 'data': [\n",
 " [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
@@ -360,16 +373,101 @@
|
|||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"### Model profiling\n",
|
"### Model Profiling\n",
|
||||||
"\n",
|
"\n",
|
||||||
"You can also take advantage of the profiling feature to estimate CPU and memory requirements for models.\n",
|
"Profile your model to understand how much CPU and memory the service, created as a result of its deployment, will need. Profiling returns information such as CPU usage, memory usage, and response latency. It also provides a CPU and memory recommendation based on the resource usage. You can profile your model (or more precisely the service built based on your model) on any CPU and/or memory combination where 0.1 <= CPU <= 3.5 and 0.1GB <= memory <= 15GB. If you do not provide a CPU and/or memory requirement, we will test it on the default configuration of 3.5 CPU and 15GB memory.\n",
|
||||||
"\n",
|
"\n",
|
||||||
"```python\n",
|
"In order to profile your model you will need:\n",
|
||||||
"profile = Model.profile(ws, \"profilename\", [model], inference_config, test_sample)\n",
|
"- a registered model\n",
|
||||||
"profile.wait_for_profiling(True)\n",
|
"- an entry script\n",
|
||||||
"profiling_results = profile.get_results()\n",
|
"- an inference configuration\n",
|
||||||
"print(profiling_results)\n",
|
"- a single column tabular dataset, where each row contains a string representing sample request data sent to the service.\n",
|
||||||
"```"
|
"\n",
|
||||||
|
"At this point we only support profiling of services that expect their request data to be a string, for example: string serialized json, text, string serialized image, etc. The content of each row of the dataset (string) will be put into the body of the HTTP request and sent to the service encapsulating the model for scoring.\n",
|
||||||
|
"\n",
|
||||||
|
"Below is an example of how you can construct an input dataset to profile a service which expects its incoming requests to contain serialized json. In this case we created a dataset based one hundred instances of the same request data. In real world scenarios however, we suggest that you use larger datasets with various inputs, especially if your model resource usage/behavior is input dependent."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core import Datastore\n",
|
||||||
|
"from azureml.core.dataset import Dataset\n",
|
||||||
|
"from azureml.data import dataset_type_definitions\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"# create a string that can be utf-8 encoded and\n",
|
||||||
|
"# put in the body of the request\n",
|
||||||
|
"serialized_input_json = json.dumps({\n",
|
||||||
|
" 'data': [\n",
|
||||||
|
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
|
||||||
|
" -0.03482076, -0.04340085, -0.00259226, 0.01990842, -0.01764613]\n",
|
||||||
|
" ]\n",
|
||||||
|
"})\n",
|
||||||
|
"dataset_content = []\n",
|
||||||
|
"for i in range(100):\n",
|
||||||
|
" dataset_content.append(serialized_input_json)\n",
|
||||||
|
"dataset_content = '\\n'.join(dataset_content)\n",
|
||||||
|
"file_name = 'sample_request_data.txt'\n",
|
||||||
|
"f = open(file_name, 'w')\n",
|
||||||
|
"f.write(dataset_content)\n",
|
||||||
|
"f.close()\n",
|
||||||
|
"\n",
|
||||||
|
"# upload the txt file created above to the Datastore and create a dataset from it\n",
|
||||||
|
"data_store = Datastore.get_default(ws)\n",
|
||||||
|
"data_store.upload_files(['./' + file_name], target_path='sample_request_data')\n",
|
||||||
|
"datastore_path = [(data_store, 'sample_request_data' +'/' + file_name)]\n",
|
||||||
|
"sample_request_data = Dataset.Tabular.from_delimited_files(\n",
|
||||||
|
" datastore_path,\n",
|
||||||
|
" separator='\\n',\n",
|
||||||
|
" infer_column_types=True,\n",
|
||||||
|
" header=dataset_type_definitions.PromoteHeadersBehavior.NO_HEADERS)\n",
|
||||||
|
"sample_request_data = sample_request_data.register(workspace=ws,\n",
|
||||||
|
" name='diabetes_sample_request_data',\n",
|
||||||
|
" create_new_version=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Now that we have an input dataset we are ready to go ahead with profiling. In this case we are testing the previously introduced sklearn regression model on 1 CPU and 0.5 GB memory. The memory usage and recommendation presented in the result is measured in Gigabytes. The CPU usage and recommendation is measured in CPU cores."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from datetime import datetime\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"environment = Environment('my-sklearn-environment')\n",
|
||||||
|
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
|
||||||
|
" 'azureml-defaults',\n",
|
||||||
|
" 'inference-schema[numpy-support]',\n",
|
||||||
|
" 'joblib',\n",
|
||||||
|
" 'numpy',\n",
|
||||||
|
" 'scikit-learn'\n",
|
||||||
|
"])\n",
|
||||||
|
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
|
||||||
|
"# if cpu and memory_in_gb parameters are not provided\n",
|
||||||
|
"# the model will be profiled on default configuration of\n",
|
||||||
|
"# 3.5CPU and 15GB memory\n",
|
||||||
|
"profile = Model.profile(ws,\n",
|
||||||
|
" 'rgrsn-%s' % datetime.now().strftime('%m%d%Y-%H%M%S'),\n",
|
||||||
|
" [model],\n",
|
||||||
|
" inference_config,\n",
|
||||||
|
" input_dataset=sample_request_data,\n",
|
||||||
|
" cpu=1.0,\n",
|
||||||
|
" memory_in_gb=0.5)\n",
|
||||||
|
"\n",
|
||||||
|
"profile.wait_for_completion(True)\n",
|
||||||
|
"details = profile.get_details()"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
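As a side note on the profiling cell above: once `wait_for_completion` returns, the `details` dictionary carries the observed usage and the recommendation. A minimal sketch of inspecting it, assuming the run finished successfully; the exact key names (CPU/memory recommendations, latency) vary by SDK version, so enumerate them rather than hard-coding any:

```python
# Hedged sketch: dump the raw profiling result produced by the cell above.
# Key names are SDK-version dependent, so print them before relying on any.
for key, value in details.items():
    print(key, '=', value)
```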
{
|
{
|
||||||
@@ -405,7 +503,7 @@
|
|||||||
"\n",
|
"\n",
|
||||||
" - To run a production-ready web service, see the [notebook on deployment to Azure Kubernetes Service](../production-deploy-to-aks/production-deploy-to-aks.ipynb).\n",
|
" - To run a production-ready web service, see the [notebook on deployment to Azure Kubernetes Service](../production-deploy-to-aks/production-deploy-to-aks.ipynb).\n",
|
||||||
" - To run a local web service, see the [notebook on deployment to a local Docker container](../deploy-to-local/register-model-deploy-local.ipynb).\n",
|
" - To run a local web service, see the [notebook on deployment to a local Docker container](../deploy-to-local/register-model-deploy-local.ipynb).\n",
|
||||||
" - For more information on datasets, see the [notebook on training with datasets](../../work-with-data/datasets-tutorial/train-with-datasets.ipynb).\n",
|
" - For more information on datasets, see the [notebook on training with datasets](../../work-with-data/datasets-tutorial/train-with-datasets/train-with-datasets.ipynb).\n",
|
||||||
" - For more information on environments, see the [notebook on using environments](../../training/using-environments/using-environments.ipynb).\n",
|
" - For more information on environments, see the [notebook on using environments](../../training/using-environments/using-environments.ipynb).\n",
|
||||||
" - For information on all the available deployment targets, see [“How and where to deploy models”](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-deploy-and-where#choose-a-compute-target)."
|
" - For information on all the available deployment targets, see [“How and where to deploy models”](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-deploy-and-where#choose-a-compute-target)."
|
||||||
]
|
]
|
||||||
|
|||||||
@@ -189,6 +189,15 @@
|
|||||||
" return error"
|
" return error"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency for your environemnt. This package contains the functionality needed to host the model as a web service."
|
||||||
|
]
|
||||||
|
},
|
||||||
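A minimal sketch of what that pip dependency looks like in code, assuming a `CondaDependencies`-based environment (the version floor comes from the note above):

```python
from azureml.core.conda_dependencies import CondaDependencies

# azureml-defaults >= 1.0.45 provides the web-service hosting stack
conda_deps = CondaDependencies.create(pip_packages=['azureml-defaults>=1.0.45'])
```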
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": null,
|
"execution_count": null,
|
||||||
@@ -206,16 +215,6 @@
|
|||||||
" - inference-schema[numpy-support]"
|
" - inference-schema[numpy-support]"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
|
||||||
"cell_type": "code",
|
|
||||||
"execution_count": null,
|
|
||||||
"metadata": {},
|
|
||||||
"outputs": [],
|
|
||||||
"source": [
|
|
||||||
"%%writefile C:/abc/dockerstep/customDockerStep.txt\n",
|
|
||||||
"RUN echo \"this is test\""
|
|
||||||
]
|
|
||||||
},
|
|
||||||
{
|
{
|
||||||
"cell_type": "code",
|
"cell_type": "code",
|
||||||
"execution_count": null,
|
"execution_count": null,
|
||||||
@@ -240,11 +239,10 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"## Create Inference Configuration\n",
|
"## Create Inference Configuration\n",
|
||||||
"\n",
|
"\n",
|
||||||
" - source_directory = holds source path as string, this entire folder gets added in image so its really easy to access any files within this folder or subfolder\n",
|
" - file_path: input parameter to Environment constructor. Manages conda and python package dependencies.\n",
|
||||||
" - runtime = Which runtime to use for the image. Current supported runtimes are 'spark-py' and 'python\n",
|
" - env.docker.base_dockerfile: any extra steps you want to inject into docker file\n",
|
||||||
" - entry_script = contains logic specific to initializing your model and running predictions\n",
|
" - source_directory: holds source path as string, this entire folder gets added in image so its really easy to access any files within this folder or subfolder\n",
|
||||||
" - conda_file = manages conda and python package dependencies.\n",
|
" - entry_script: contains logic specific to initializing your model and running predictions"
|
||||||
" - extra_docker_file_steps = optional: any extra steps you want to inject into docker file"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -253,13 +251,19 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
"\n",
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"myenv = Environment.from_conda_specification(name='myenv', file_path='env/myenv.yml')\n",
|
||||||
|
"\n",
|
||||||
|
"# explicitly set base_image to None when setting base_dockerfile\n",
|
||||||
|
"myenv.docker.base_image = None\n",
|
||||||
|
"myenv.docker.base_dockerfile = \"RUN echo \\\"this is test\\\"\"\n",
|
||||||
|
"\n",
|
||||||
"inference_config = InferenceConfig(source_directory=\"C:/abc\",\n",
|
"inference_config = InferenceConfig(source_directory=\"C:/abc\",\n",
|
||||||
" runtime=\"python\", \n",
|
|
||||||
" entry_script=\"x/y/score.py\",\n",
|
" entry_script=\"x/y/score.py\",\n",
|
||||||
" conda_file=\"env/myenv.yml\", \n",
|
" environment=myenv)\n"
|
||||||
" extra_docker_file_steps=\"dockerstep/customDockerStep.txt\")"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
|
|||||||
@@ -145,6 +145,110 @@
|
|||||||
" environment=environment)"
|
" environment=environment)"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Model Profiling\n",
|
||||||
|
"\n",
|
||||||
|
"Profile your model to understand how much CPU and memory the service, created as a result of its deployment, will need. Profiling returns information such as CPU usage, memory usage, and response latency. It also provides a CPU and memory recommendation based on the resource usage. You can profile your model (or more precisely the service built based on your model) on any CPU and/or memory combination where 0.1 <= CPU <= 3.5 and 0.1GB <= memory <= 15GB. If you do not provide a CPU and/or memory requirement, we will test it on the default configuration of 3.5 CPU and 15GB memory.\n",
|
||||||
|
"\n",
|
||||||
|
"In order to profile your model you will need:\n",
|
||||||
|
"- a registered model\n",
|
||||||
|
"- an entry script\n",
|
||||||
|
"- an inference configuration\n",
|
||||||
|
"- a single column tabular dataset, where each row contains a string representing sample request data sent to the service.\n",
|
||||||
|
"\n",
|
||||||
|
"At this point we only support profiling of services that expect their request data to be a string, for example: string serialized json, text, string serialized image, etc. The content of each row of the dataset (string) will be put into the body of the HTTP request and sent to the service encapsulating the model for scoring.\n",
|
||||||
|
"\n",
|
||||||
|
"Below is an example of how you can construct an input dataset to profile a service which expects its incoming requests to contain serialized json. In this case we created a dataset based one hundred instances of the same request data. In real world scenarios however, we suggest that you use larger datasets with various inputs, especially if your model resource usage/behavior is input dependent."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import json\n",
|
||||||
|
"from azureml.core import Datastore\n",
|
||||||
|
"from azureml.core.dataset import Dataset\n",
|
||||||
|
"from azureml.data import dataset_type_definitions\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"# create a string that can be put in the body of the request\n",
|
||||||
|
"serialized_input_json = json.dumps({\n",
|
||||||
|
" 'data': [\n",
|
||||||
|
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
|
||||||
|
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
|
||||||
|
" ]\n",
|
||||||
|
"})\n",
|
||||||
|
"dataset_content = []\n",
|
||||||
|
"for i in range(100):\n",
|
||||||
|
" dataset_content.append(serialized_input_json)\n",
|
||||||
|
"dataset_content = '\\n'.join(dataset_content)\n",
|
||||||
|
"file_name = 'sample_request_data_diabetes.txt'\n",
|
||||||
|
"f = open(file_name, 'w')\n",
|
||||||
|
"f.write(dataset_content)\n",
|
||||||
|
"f.close()\n",
|
||||||
|
"\n",
|
||||||
|
"# upload the txt file created above to the Datastore and create a dataset from it\n",
|
||||||
|
"data_store = Datastore.get_default(ws)\n",
|
||||||
|
"data_store.upload_files(['./' + file_name], target_path='sample_request_data_diabetes')\n",
|
||||||
|
"datastore_path = [(data_store, 'sample_request_data_diabetes' +'/' + file_name)]\n",
|
||||||
|
"sample_request_data_diabetes = Dataset.Tabular.from_delimited_files(\n",
|
||||||
|
" datastore_path,\n",
|
||||||
|
" separator='\\n',\n",
|
||||||
|
" infer_column_types=True,\n",
|
||||||
|
" header=dataset_type_definitions.PromoteHeadersBehavior.NO_HEADERS)\n",
|
||||||
|
"sample_request_data_diabetes = sample_request_data_diabetes.register(workspace=ws,\n",
|
||||||
|
" name='sample_request_data_diabetes',\n",
|
||||||
|
" create_new_version=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Now that we have an input dataset we are ready to go ahead with profiling. In this case we are testing the previously introduced sklearn regression model on 1 CPU and 0.5 GB memory. The memory usage and recommendation presented in the result is measured in Gigabytes. The CPU usage and recommendation is measured in CPU cores."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from datetime import datetime\n",
|
||||||
|
"from azureml.core import Environment\n",
|
||||||
|
"from azureml.core.conda_dependencies import CondaDependencies\n",
|
||||||
|
"from azureml.core.model import Model, InferenceConfig\n",
|
||||||
|
"\n",
|
||||||
|
"\n",
|
||||||
|
"environment = Environment('my-sklearn-environment')\n",
|
||||||
|
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
|
||||||
|
" 'azureml-defaults',\n",
|
||||||
|
" 'inference-schema[numpy-support]',\n",
|
||||||
|
" 'joblib',\n",
|
||||||
|
" 'numpy',\n",
|
||||||
|
" 'scikit-learn'\n",
|
||||||
|
"])\n",
|
||||||
|
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
|
||||||
|
"# if cpu and memory_in_gb parameters are not provided\n",
|
||||||
|
"# the model will be profiled on default configuration of\n",
|
||||||
|
"# 3.5CPU and 15GB memory\n",
|
||||||
|
"profile = Model.profile(ws,\n",
|
||||||
|
" 'profile-%s' % datetime.now().strftime('%m%d%Y-%H%M%S'),\n",
|
||||||
|
" [model],\n",
|
||||||
|
" inference_config,\n",
|
||||||
|
" input_dataset=sample_request_data_diabetes,\n",
|
||||||
|
" cpu=1.0,\n",
|
||||||
|
" memory_in_gb=0.5)\n",
|
||||||
|
"\n",
|
||||||
|
"profile.wait_for_completion(True)\n",
|
||||||
|
"details = profile.get_details()"
|
||||||
|
]
|
||||||
|
},
|
||||||
{
|
{
|
||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
|
|||||||
@@ -0,0 +1,369 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||||
|
"\n",
|
||||||
|
"Licensed under the MIT License."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Deploy models to Azure Kubernetes Service (AKS) using controlled roll out\n",
|
||||||
|
"This notebook will show you how to deploy mulitple AKS webservices with the same scoring endpoint and how to roll out your models in a controlled manner by configuring % of scoring traffic going to each webservice. If you are using a Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) to install the Azure Machine Learning Python SDK and create an Azure ML Workspace."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Check for latest version\n",
|
||||||
|
"import azureml.core\n",
|
||||||
|
"print(azureml.core.VERSION)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Initialize workspace\n",
|
||||||
|
"Create a [Workspace](https://docs.microsoft.com/python/api/azureml-core/azureml.core.workspace%28class%29?view=azure-ml-py) object from your persisted configuration."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core.workspace import Workspace\n",
|
||||||
|
"\n",
|
||||||
|
"ws = Workspace.from_config()\n",
|
||||||
|
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Register the model\n",
|
||||||
|
"Register a file or folder as a model by calling [Model.register()](https://docs.microsoft.com/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#register-workspace--model-path--model-name--tags-none--properties-none--description-none--datasets-none--model-framework-none--model-framework-version-none--child-paths-none-).\n",
|
||||||
|
"In addition to the content of the model file itself, your registered model will also store model metadata -- model description, tags, and framework information -- that will be useful when managing and deploying models in your workspace. Using tags, for instance, you can categorize your models and apply filters when listing models in your workspace."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core import Model\n",
|
||||||
|
"\n",
|
||||||
|
"model = Model.register(workspace=ws,\n",
|
||||||
|
" model_name='sklearn_regression_model.pkl', # Name of the registered model in your workspace.\n",
|
||||||
|
" model_path='./sklearn_regression_model.pkl', # Local file to upload and register as a model.\n",
|
||||||
|
" model_framework=Model.Framework.SCIKITLEARN, # Framework used to create the model.\n",
|
||||||
|
" model_framework_version='0.19.1', # Version of scikit-learn used to create the model.\n",
|
||||||
|
" description='Ridge regression model to predict diabetes progression.',\n",
|
||||||
|
" tags={'area': 'diabetes', 'type': 'regression'})\n",
|
||||||
|
"\n",
|
||||||
|
"print('Name:', model.name)\n",
|
||||||
|
"print('Version:', model.version)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Register an environment (for all models)\n",
|
||||||
|
"\n",
|
||||||
|
"If you control over how your model is run, or if it has special runtime requirements, you can specify your own environment and scoring method.\n",
|
||||||
|
"\n",
|
||||||
|
"Specify the model's runtime environment by creating an [Environment](https://docs.microsoft.com/python/api/azureml-core/azureml.core.environment%28class%29?view=azure-ml-py) object and providing the [CondaDependencies](https://docs.microsoft.com/python/api/azureml-core/azureml.core.conda_dependencies.condadependencies?view=azure-ml-py) needed by your model."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core import Environment\n",
|
||||||
|
"from azureml.core.conda_dependencies import CondaDependencies\n",
|
||||||
|
"\n",
|
||||||
|
"environment=Environment('my-sklearn-environment')\n",
|
||||||
|
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
|
||||||
|
" 'azureml-defaults',\n",
|
||||||
|
" 'inference-schema[numpy-support]',\n",
|
||||||
|
" 'joblib',\n",
|
||||||
|
" 'numpy',\n",
|
||||||
|
" 'scikit-learn'\n",
|
||||||
|
"])"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"When using a custom environment, you must also provide Python code for initializing and running your model. An example script is included with this notebook."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"with open('score.py') as f:\n",
|
||||||
|
" print(f.read())"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Create the InferenceConfig\n",
|
||||||
|
"Create the inference configuration to reference your environment and entry script during deployment"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"\n",
|
||||||
|
"inference_config = InferenceConfig(entry_script='score.py', \n",
|
||||||
|
" source_directory='.',\n",
|
||||||
|
" environment=environment)\n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Provision the AKS Cluster\n",
|
||||||
|
"If you already have an AKS cluster attached to this workspace, skip the step below and provide the name of the cluster."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core.compute import AksCompute\n",
|
||||||
|
"from azureml.core.compute import ComputeTarget\n",
|
||||||
|
"# Use the default configuration (can also provide parameters to customize)\n",
|
||||||
|
"prov_config = AksCompute.provisioning_configuration()\n",
|
||||||
|
"\n",
|
||||||
|
"aks_name = 'my-aks' \n",
|
||||||
|
"# Create the cluster\n",
|
||||||
|
"aks_target = ComputeTarget.create(workspace = ws, \n",
|
||||||
|
" name = aks_name, \n",
|
||||||
|
" provisioning_configuration = prov_config) \n",
|
||||||
|
"aks_target.wait_for_completion(show_output=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
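If the cluster already exists, a short sketch of fetching it instead of provisioning a new one (assuming the attached cluster is named `my-aks`, as in the rest of this notebook):

```python
from azureml.core.compute import ComputeTarget

# fetch an AKS cluster that is already attached to the workspace
aks_target = ComputeTarget(workspace=ws, name='my-aks')
```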
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Create an Endpoint and add a version (AKS service)\n",
|
||||||
|
"This creates a new endpoint and adds a version behind it. By default the first version added is the default version. You can specify the traffic percentile a version takes behind an endpoint. \n"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# deploying the model and create a new endpoint\n",
|
||||||
|
"from azureml.core.webservice import AksEndpoint\n",
|
||||||
|
"# from azureml.core.compute import ComputeTarget\n",
|
||||||
|
"\n",
|
||||||
|
"#select a created compute\n",
|
||||||
|
"compute = ComputeTarget(ws, 'my-aks')\n",
|
||||||
|
"namespace_name=\"endpointnamespace\"\n",
|
||||||
|
"# define the endpoint name\n",
|
||||||
|
"endpoint_name = \"myendpoint1\"\n",
|
||||||
|
"# define the service name\n",
|
||||||
|
"version_name= \"versiona\"\n",
|
||||||
|
"\n",
|
||||||
|
"endpoint_deployment_config = AksEndpoint.deploy_configuration(tags = {'modelVersion':'firstversion', 'department':'finance'}, \n",
|
||||||
|
" description = \"my first version\", namespace = namespace_name, \n",
|
||||||
|
" version_name = version_name, traffic_percentile = 40)\n",
|
||||||
|
"\n",
|
||||||
|
"endpoint = Model.deploy(ws, endpoint_name, [model], inference_config, endpoint_deployment_config, compute)\n",
|
||||||
|
"endpoint.wait_for_deployment(True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"endpoint.get_logs()"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Add another version of the service to an existing endpoint\n",
|
||||||
|
"This adds another version behind an existing endpoint. You can specify the traffic percentile the new version takes. If no traffic_percentile is specified then it defaults to 0. All the unspecified traffic percentile (in this example 50) across all versions goes to default version."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Adding a new version to an existing Endpoint.\n",
|
||||||
|
"version_name_add=\"versionb\" \n",
|
||||||
|
"\n",
|
||||||
|
"endpoint.create_version(version_name = version_name_add, inference_config=inference_config, models=[model], tags = {'modelVersion':'secondversion', 'department':'finance'}, \n",
|
||||||
|
" description = \"my second version\", traffic_percentile = 10)\n",
|
||||||
|
"endpoint.wait_for_deployment(True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Update an existing version in an endpoint\n",
|
||||||
|
"There are two types of versions: control and treatment. An endpoint contains one or more treatment versions but only one control version. This categorization helps compare the different versions against the defined control version."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"endpoint.update_version(version_name=endpoint.versions[version_name_add].name, description=\"my second version update\", traffic_percentile=40, is_default=True, is_control_version_type=True)\n",
|
||||||
|
"endpoint.wait_for_deployment(True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
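A hedged sketch of verifying the rollout state after the update; `versions` maps version names to webservice objects, though attribute availability varies by SDK version:

```python
# re-fetch the endpoint and list its versions (a sketch, not part of this diff)
from azureml.core.webservice import AksEndpoint

endpoint = AksEndpoint(ws, endpoint_name)
for name, version in endpoint.versions.items():
    print(name, version.state)
```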
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Test the web service using run method\n",
|
||||||
|
"Test the web sevice by passing in data. Run() method retrieves API keys behind the scenes to make sure that call is authenticated."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Scoring on endpoint\n",
|
||||||
|
"import json\n",
|
||||||
|
"test_sample = json.dumps({'data': [\n",
|
||||||
|
" [1,2,3,4,5,6,7,8,9,10], \n",
|
||||||
|
" [10,9,8,7,6,5,4,3,2,1]\n",
|
||||||
|
"]})\n",
|
||||||
|
"\n",
|
||||||
|
"test_sample_encoded = bytes(test_sample, encoding='utf8')\n",
|
||||||
|
"prediction = endpoint.run(input_data=test_sample_encoded)\n",
|
||||||
|
"print(prediction)"
|
||||||
|
]
|
||||||
|
},
|
||||||
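For completeness, the same request can also be sent over raw HTTP instead of `run()`; a hedged sketch assuming key authentication is enabled on the endpoint (the default for AKS):

```python
import requests

# retrieve the endpoint's auth keys and score via the REST endpoint
key1, key2 = endpoint.get_keys()
headers = {'Content-Type': 'application/json', 'Authorization': 'Bearer ' + key1}
resp = requests.post(endpoint.scoring_uri, data=test_sample_encoded, headers=headers)
print(resp.json())
```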
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"## Delete Resources"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# deleting a version in an endpoint\n",
|
||||||
|
"endpoint.delete_version(version_name=version_name)\n",
|
||||||
|
"endpoint.wait_for_deployment(True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# deleting an endpoint, this will delete all versions in the endpoint and the endpoint itself\n",
|
||||||
|
"endpoint.delete()"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"authors": [
|
||||||
|
{
|
||||||
|
"name": "shipatel"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"category": "deployment",
|
||||||
|
"compute": [
|
||||||
|
"None"
|
||||||
|
],
|
||||||
|
"datasets": [
|
||||||
|
"Diabetes"
|
||||||
|
],
|
||||||
|
"deployment": [
|
||||||
|
"Azure Kubernetes Service"
|
||||||
|
],
|
||||||
|
"exclude_from_index": false,
|
||||||
|
"framework": [
|
||||||
|
"Scikit-learn"
|
||||||
|
],
|
||||||
|
"friendly_name": "Deploy models to AKS using controlled roll out",
|
||||||
|
"index_order": 3,
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3.6",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python36"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.0"
|
||||||
|
},
|
||||||
|
"star_tag": [
|
||||||
|
"featured"
|
||||||
|
],
|
||||||
|
"tags": [
|
||||||
|
"None"
|
||||||
|
],
|
||||||
|
"task": "Deploy a model with Azure Machine Learning"
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
||||||
@@ -0,0 +1,4 @@
|
|||||||
|
name: deploy-aks-with-controlled-rollout
|
||||||
|
dependencies:
|
||||||
|
- pip:
|
||||||
|
- azureml-sdk
|
||||||
@@ -0,0 +1,28 @@
|
|||||||
|
import pickle
|
||||||
|
import json
|
||||||
|
import numpy
|
||||||
|
from sklearn.externals import joblib
|
||||||
|
from sklearn.linear_model import Ridge
|
||||||
|
from azureml.core.model import Model
|
||||||
|
|
||||||
|
|
||||||
|
def init():
|
||||||
|
global model
|
||||||
|
# note here "sklearn_regression_model.pkl" is the name of the model registered under
|
||||||
|
# this is a different behavior than before when the code is run locally, even though the code is the same.
|
||||||
|
model_path = Model.get_model_path('sklearn_regression_model.pkl')
|
||||||
|
# deserialize the model file back into a sklearn model
|
||||||
|
model = joblib.load(model_path)
|
||||||
|
|
||||||
|
|
||||||
|
# note you can pass in multiple rows for scoring
|
||||||
|
def run(raw_data):
|
||||||
|
try:
|
||||||
|
data = json.loads(raw_data)['data']
|
||||||
|
data = numpy.array(data)
|
||||||
|
result = model.predict(data)
|
||||||
|
# you can return any data type as long as it is JSON-serializable
|
||||||
|
return result.tolist()
|
||||||
|
except Exception as e:
|
||||||
|
error = str(e)
|
||||||
|
return error
|
||||||
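A quick local smoke test for this entry script, a sketch assuming the registered model can be resolved on the local machine via `Model.get_model_path` (e.g. inside an Azure ML run context or with the model downloaded to `./azureml-models`):

```python
import json
from score import init, run  # the entry script added above

init()
sample = json.dumps({'data': [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]]})
print(run(sample))  # expect a list of predictions, or an error string
```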
@@ -158,7 +158,8 @@
|
|||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"## 5. *Create myenv.yml file*"
|
"## 5. *Create myenv.yml file*\n",
|
||||||
|
"Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -169,7 +170,8 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||||
"\n",
|
"\n",
|
||||||
"myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'])\n",
|
"myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'],\n",
|
||||||
|
" pip_packages=['azureml-defaults'])\n",
|
||||||
"\n",
|
"\n",
|
||||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||||
" f.write(myenv.serialize_to_string())"
|
" f.write(myenv.serialize_to_string())"
|
||||||
@@ -189,10 +191,11 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"\n",
|
"\n",
|
||||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
"\n",
|
||||||
" entry_script=\"score.py\",\n",
|
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||||
" conda_file=\"myenv.yml\")"
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
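To show where this configuration is consumed, a hedged sketch of the deployment step that typically follows (service name and ACI sizing are illustrative assumptions, not part of this diff; `model` is the model object registered earlier in the notebook):

```python
from azureml.core.model import Model
from azureml.core.webservice import AciWebservice

# illustrative ACI deployment using the inference_config built above
aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)
service = Model.deploy(ws, 'my-service', [model], inference_config, aci_config)
service.wait_for_deployment(show_output=True)
```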
{
|
{
|
||||||
|
|||||||
@@ -244,7 +244,7 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"### Setting up inference configuration\n",
|
"### Setting up inference configuration\n",
|
||||||
"First we create a YAML file that specifies which dependencies we would like to see in our container."
|
"First we create a YAML file that specifies which dependencies we would like to see in our container. Please note that you must include azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -255,7 +255,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||||
"\n",
|
"\n",
|
||||||
"myenv = CondaDependencies.create(pip_packages=[\"numpy\",\"onnxruntime==0.4.0\",\"azureml-core\"])\n",
|
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime==0.4.0\", \"azureml-core\", \"azureml-defaults\"])\n",
|
||||||
"\n",
|
"\n",
|
||||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||||
" f.write(myenv.serialize_to_string())"
|
" f.write(myenv.serialize_to_string())"
|
||||||
@@ -275,11 +275,11 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"\n",
|
"\n",
|
||||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
"\n",
|
||||||
" entry_script=\"score.py\",\n",
|
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||||
" conda_file=\"myenv.yml\",\n",
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
|
||||||
" extra_docker_file_steps = \"Dockerfile\")"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -373,7 +373,7 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"#aci_service.delete()"
|
"aci_service.delete()"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
|
|||||||
@@ -319,7 +319,8 @@
|
|||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"### Write Environment File"
|
"### Write Environment File\n",
|
||||||
|
"Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -330,7 +331,8 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||||
"\n",
|
"\n",
|
||||||
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime\", \"azureml-core\"])\n",
|
"\n",
|
||||||
|
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime\", \"azureml-core\", \"azureml-defaults\"])\n",
|
||||||
"\n",
|
"\n",
|
||||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||||
" f.write(myenv.serialize_to_string())"
|
" f.write(myenv.serialize_to_string())"
|
||||||
@@ -350,11 +352,11 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"\n",
|
"\n",
|
||||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
"\n",
|
||||||
" entry_script=\"score.py\",\n",
|
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||||
" conda_file=\"myenv.yml\",\n",
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
|
||||||
" extra_docker_file_steps = \"Dockerfile\")"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -724,7 +726,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"# remember to delete your service after you are done using it!\n",
|
"# remember to delete your service after you are done using it!\n",
|
||||||
"\n",
|
"\n",
|
||||||
"# aci_service.delete()"
|
"aci_service.delete()"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
|
|||||||
@@ -306,7 +306,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"### Write Environment File\n",
|
"### Write Environment File\n",
|
||||||
"\n",
|
"\n",
|
||||||
"This step creates a YAML environment file that specifies which dependencies we would like to see in our Linux Virtual Machine."
|
"This step creates a YAML environment file that specifies which dependencies we would like to see in our Linux Virtual Machine. Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -317,7 +317,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||||
"\n",
|
"\n",
|
||||||
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime\", \"azureml-core\"])\n",
|
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime\", \"azureml-core\", \"azureml-defaults\"])\n",
|
||||||
"\n",
|
"\n",
|
||||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||||
" f.write(myenv.serialize_to_string())"
|
" f.write(myenv.serialize_to_string())"
|
||||||
@@ -337,11 +337,11 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"\n",
|
"\n",
|
||||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
"\n",
|
||||||
" entry_script=\"score.py\",\n",
|
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||||
" extra_docker_file_steps = \"Dockerfile\",\n",
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
|
||||||
" conda_file=\"myenv.yml\")"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -733,7 +733,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"# remember to delete your service after you are done using it!\n",
|
"# remember to delete your service after you are done using it!\n",
|
||||||
"\n",
|
"\n",
|
||||||
"# aci_service.delete()"
|
"aci_service.delete()"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
|
|||||||
@@ -241,7 +241,8 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||||
"\n",
|
"\n",
|
||||||
"myenv = CondaDependencies.create(pip_packages=[\"numpy\",\"onnxruntime\",\"azureml-core\"])\n",
|
"\n",
|
||||||
|
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime\", \"azureml-core\", \"azureml-defaults\"])\n",
|
||||||
"\n",
|
"\n",
|
||||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||||
" f.write(myenv.serialize_to_string())"
|
" f.write(myenv.serialize_to_string())"
|
||||||
@@ -251,7 +252,7 @@
|
|||||||
"cell_type": "markdown",
|
"cell_type": "markdown",
|
||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"Create the inference configuration object"
|
"Create the inference configuration object. Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -261,11 +262,11 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"\n",
|
"\n",
|
||||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
"\n",
|
||||||
" entry_script=\"score.py\",\n",
|
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||||
" conda_file=\"myenv.yml\",\n",
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
|
||||||
" extra_docker_file_steps = \"Dockerfile\")"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -361,7 +362,7 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"#aci_service.delete()"
|
"aci_service.delete()"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
|
|||||||
@@ -405,7 +405,7 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"source": [
|
"source": [
|
||||||
"### Create inference configuration\n",
|
"### Create inference configuration\n",
|
||||||
"First we create a YAML file that specifies which dependencies we would like to see in our container."
|
"First we create a YAML file that specifies which dependencies we would like to see in our container. Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency, because it contains the functionality needed to host the model as a web service."
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -416,7 +416,7 @@
|
|||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
"from azureml.core.conda_dependencies import CondaDependencies \n",
|
||||||
"\n",
|
"\n",
|
||||||
"myenv = CondaDependencies.create(pip_packages=[\"numpy\",\"onnxruntime\",\"azureml-core\"])\n",
|
"myenv = CondaDependencies.create(pip_packages=[\"numpy\",\"onnxruntime\",\"azureml-core\", \"azureml-defaults\"])\n",
|
||||||
"\n",
|
"\n",
|
||||||
"with open(\"myenv.yml\",\"w\") as f:\n",
|
"with open(\"myenv.yml\",\"w\") as f:\n",
|
||||||
" f.write(myenv.serialize_to_string())"
|
" f.write(myenv.serialize_to_string())"
|
||||||
@@ -436,11 +436,11 @@
|
|||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"from azureml.core.model import InferenceConfig\n",
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.environment import Environment\n",
|
||||||
"\n",
|
"\n",
|
||||||
"inference_config = InferenceConfig(runtime= \"python\", \n",
|
"\n",
|
||||||
" entry_script=\"score.py\",\n",
|
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
|
||||||
" conda_file=\"myenv.yml\",\n",
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
|
||||||
" extra_docker_file_steps = \"Dockerfile\")"
|
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
@@ -537,7 +537,7 @@
|
|||||||
"metadata": {},
|
"metadata": {},
|
||||||
"outputs": [],
|
"outputs": [],
|
||||||
"source": [
|
"source": [
|
||||||
"#aci_service.delete()"
|
"aci_service.delete()"
|
||||||
]
|
]
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
|
|||||||
@@ -0,0 +1,314 @@
|
|||||||
|
{
|
||||||
|
"cells": [
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
|
||||||
|
"\n",
|
||||||
|
"Licensed under the MIT License."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
""
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Deploying a web service to Azure Kubernetes Service (AKS)\n",
|
||||||
|
"This notebook shows the steps for deploying a service: registering a model, creating an image, provisioning a cluster (one time action), and deploying a service to it. \n",
|
||||||
|
"We then test and delete the service, image and model."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"import azureml.core\n",
|
||||||
|
"print(azureml.core.VERSION)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Get workspace\n",
|
||||||
|
"Load existing workspace from the config file info."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core.workspace import Workspace\n",
|
||||||
|
"\n",
|
||||||
|
"ws = Workspace.from_config()\n",
|
||||||
|
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Register the model\n",
|
||||||
|
"Register an existing trained model, add descirption and tags. Prior to registering the model, you should have a TensorFlow [Saved Model](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md) in the `resnet50` directory. You can download a [pretrained resnet50](http://download.tensorflow.org/models/official/20181001_resnet/savedmodels/resnet_v1_fp32_savedmodel_NCHW_jpg.tar.gz) and unpack it to that directory."
|
||||||
|
]
|
||||||
|
},
|
||||||
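A hedged sketch of fetching and unpacking that pretrained model into the expected `resnet50` directory (the archive's internal layout may differ, so verify where `saved_model.pb` lands after extraction):

```python
import tarfile
import urllib.request

url = ('http://download.tensorflow.org/models/official/20181001_resnet/'
       'savedmodels/resnet_v1_fp32_savedmodel_NCHW_jpg.tar.gz')
urllib.request.urlretrieve(url, 'resnet50.tar.gz')
with tarfile.open('resnet50.tar.gz') as tar:
    tar.extractall('resnet50')  # check the extracted folder structure
```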
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"#Register the model\n",
|
||||||
|
"from azureml.core.model import Model\n",
|
||||||
|
"model = Model.register(model_path = \"resnet50\", # this points to a local file\n",
|
||||||
|
" model_name = \"resnet50\", # this is the name the model is registered as\n",
|
||||||
|
" tags = {'area': \"Image classification\", 'type': \"classification\"},\n",
|
||||||
|
" description = \"Image classification trained on Imagenet Dataset\",\n",
|
||||||
|
" workspace = ws)\n",
|
||||||
|
"\n",
|
||||||
|
"print(model.name, model.description, model.version)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Provision the AKS Cluster\n",
|
||||||
|
"This is a one time setup. You can reuse this cluster for multiple deployments after it has been created. If you delete the cluster or the resource group that contains it, then you would have to recreate it."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"from azureml.core.compute import ComputeTarget, AksCompute\n",
|
||||||
|
"from azureml.core.compute_target import ComputeTargetException\n",
|
||||||
|
"\n",
|
||||||
|
"# Choose a name for your GPU cluster\n",
|
||||||
|
"gpu_cluster_name = \"aks-gpu-cluster\"\n",
|
||||||
|
"\n",
|
||||||
|
"# Verify that cluster does not exist already\n",
|
||||||
|
"try:\n",
|
||||||
|
" gpu_cluster = ComputeTarget(workspace=ws, name=gpu_cluster_name)\n",
|
||||||
|
" print(\"Found existing gpu cluster\")\n",
|
||||||
|
"except ComputeTargetException:\n",
|
||||||
|
" print(\"Creating new gpu-cluster\")\n",
|
||||||
|
" \n",
|
||||||
|
" # Specify the configuration for the new cluster\n",
|
||||||
|
" compute_config = AksCompute.provisioning_configuration(cluster_purpose=AksCompute.ClusterPurpose.DEV_TEST,\n",
|
||||||
|
" agent_count=1,\n",
|
||||||
|
" vm_size=\"Standard_NV6\")\n",
|
||||||
|
" # Create the cluster with the specified name and configuration\n",
|
||||||
|
" gpu_cluster = ComputeTarget.create(ws, gpu_cluster_name, compute_config)\n",
|
||||||
|
"\n",
|
||||||
|
" # Wait for the cluster to complete, show the output log\n",
|
||||||
|
" gpu_cluster.wait_for_completion(show_output=True)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Deploy the model as a web service to AKS\n",
|
||||||
|
"\n",
|
||||||
|
"First create a scoring script"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%writefile score.py\n",
|
||||||
|
"import tensorflow as tf\n",
|
||||||
|
"import numpy as np\n",
|
||||||
|
"import json\n",
|
||||||
|
"import os\n",
|
||||||
|
"from azureml.contrib.services.aml_request import AMLRequest, rawhttp\n",
|
||||||
|
"from azureml.contrib.services.aml_response import AMLResponse\n",
|
||||||
|
"\n",
|
||||||
|
"def init():\n",
|
||||||
|
" global session\n",
|
||||||
|
" global input_name\n",
|
||||||
|
" global output_name\n",
|
||||||
|
" \n",
|
||||||
|
" session = tf.Session()\n",
|
||||||
|
"\n",
|
||||||
|
" # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
|
||||||
|
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
|
||||||
|
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
|
||||||
|
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'resnet50')\n",
|
||||||
|
" model = tf.saved_model.loader.load(session, ['serve'], model_path)\n",
|
||||||
|
" if len(model.signature_def['serving_default'].inputs) > 1:\n",
|
||||||
|
" raise ValueError(\"This score.py only supports one input\")\n",
|
||||||
|
" input_name = [tensor.name for tensor in model.signature_def['serving_default'].inputs.values()][0]\n",
|
||||||
|
" output_name = [tensor.name for tensor in model.signature_def['serving_default'].outputs.values()]\n",
|
||||||
|
" \n",
|
||||||
|
"\n",
|
||||||
|
"@rawhttp\n",
|
||||||
|
"def run(request):\n",
|
||||||
|
" if request.method == 'POST':\n",
|
||||||
|
" reqBody = request.get_data(False)\n",
|
||||||
|
" resp = score(reqBody)\n",
|
||||||
|
" return AMLResponse(resp, 200)\n",
|
||||||
|
" if request.method == 'GET':\n",
|
||||||
|
" respBody = str.encode(\"GET is not supported\")\n",
|
||||||
|
" return AMLResponse(respBody, 405)\n",
|
||||||
|
" return AMLResponse(\"bad request\", 500)\n",
|
||||||
|
"\n",
|
||||||
|
"def score(data):\n",
|
||||||
|
" result = session.run(output_name, {input_name: [data]})\n",
|
||||||
|
" return json.dumps(result[1].tolist())\n",
|
||||||
|
"\n",
|
||||||
|
"if __name__ == \"__main__\":\n",
|
||||||
|
" init()\n",
|
||||||
|
" with open(\"test_image.jpg\", 'rb') as f:\n",
|
||||||
|
" content = f.read()\n",
|
||||||
|
" print(score(content))"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"Now create the deployment configuration objects and deploy the model as a webservice."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"# Set the web service configuration (using default here)\n",
|
||||||
|
"from azureml.core.model import InferenceConfig\n",
|
||||||
|
"from azureml.core.webservice import AksWebservice\n",
|
||||||
|
"from azureml.core.conda_dependencies import CondaDependencies\n",
|
||||||
|
"from azureml.core.environment import Environment, DEFAULT_GPU_IMAGE\n",
|
||||||
|
"\n",
|
||||||
|
"env = Environment('deploytocloudenv')\n",
|
||||||
|
"# Please see [Azure ML Containers repository](https://github.com/Azure/AzureML-Containers#featured-tags)\n",
|
||||||
|
"# for open-sourced GPU base images.\n",
|
||||||
|
"env.docker.base_image = DEFAULT_GPU_IMAGE\n",
|
||||||
|
"env.python.conda_dependencies = CondaDependencies.create(conda_packages=['tensorflow-gpu==1.12.0','numpy'],\n",
|
||||||
|
" pip_packages=['azureml-contrib-services', 'azureml-defaults'])\n",
|
||||||
|
"\n",
|
||||||
|
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=env)\n",
|
||||||
|
"aks_config = AksWebservice.deploy_configuration()\n",
|
||||||
|
"\n",
|
||||||
|
"# # Enable token auth and disable (key) auth on the webservice\n",
|
||||||
|
"# aks_config = AksWebservice.deploy_configuration(token_auth_enabled=True, auth_enabled=False)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%time\n",
|
||||||
|
"aks_service_name ='gpu-rn50'\n",
|
||||||
|
"\n",
|
||||||
|
"aks_service = Model.deploy(workspace=ws,\n",
|
||||||
|
" name=aks_service_name,\n",
|
||||||
|
" models=[model],\n",
|
||||||
|
" inference_config=inference_config,\n",
|
||||||
|
" deployment_config=aks_config,\n",
|
||||||
|
" deployment_target=gpu_cluster)\n",
|
||||||
|
"\n",
|
||||||
|
"aks_service.wait_for_deployment(show_output = True)\n",
|
||||||
|
"print(aks_service.state)"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Test the web service\n",
|
||||||
|
"We test the web sevice by passing the test images content."
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%time\n",
|
||||||
|
"import requests\n",
|
||||||
|
"\n",
|
||||||
|
"# if (key) auth is enabled, fetch keys and include in the request\n",
|
||||||
|
"key1, key2 = aks_service.get_keys()\n",
|
||||||
|
"\n",
|
||||||
|
"headers = {'Content-Type':'application/json', 'Authorization': 'Bearer ' + key1}\n",
|
||||||
|
"\n",
|
||||||
|
"# # if token auth is enabled, fetch token and include in the request\n",
|
||||||
|
"# access_token, fetch_after = aks_service.get_token()\n",
|
||||||
|
"# headers = {'Content-Type':'application/json', 'Authorization': 'Bearer ' + access_token}\n",
|
||||||
|
"\n",
|
||||||
|
"test_sample = open('snowleopardgaze.jpg', 'rb').read()\n",
|
||||||
|
"resp = requests.post(aks_service.scoring_uri, test_sample, headers=headers)"
|
||||||
|
]
|
||||||
|
},
|
||||||
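Since the entry script returns JSON-serialized class scores, a short follow-up sketch of decoding the response (assuming the request above succeeded):

```python
print(resp.status_code)
if resp.ok:
    scores = resp.json()  # the list produced by score() in the entry script
    print(len(scores))
```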
|
{
|
||||||
|
"cell_type": "markdown",
|
||||||
|
"metadata": {},
|
||||||
|
"source": [
|
||||||
|
"# Clean up\n",
|
||||||
|
"Delete the service, image, model and compute target"
|
||||||
|
]
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"cell_type": "code",
|
||||||
|
"execution_count": null,
|
||||||
|
"metadata": {},
|
||||||
|
"outputs": [],
|
||||||
|
"source": [
|
||||||
|
"%%time\n",
|
||||||
|
"aks_service.delete()\n",
|
||||||
|
"model.delete()\n",
|
||||||
|
"gpu_cluster.delete()\n"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"metadata": {
|
||||||
|
"authors": [
|
||||||
|
{
|
||||||
|
"name": "aashishb"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"kernelspec": {
|
||||||
|
"display_name": "Python 3.6",
|
||||||
|
"language": "python",
|
||||||
|
"name": "python36"
|
||||||
|
},
|
||||||
|
"language_info": {
|
||||||
|
"codemirror_mode": {
|
||||||
|
"name": "ipython",
|
||||||
|
"version": 3
|
||||||
|
},
|
||||||
|
"file_extension": ".py",
|
||||||
|
"mimetype": "text/x-python",
|
||||||
|
"name": "python",
|
||||||
|
"nbconvert_exporter": "python",
|
||||||
|
"pygments_lexer": "ipython3",
|
||||||
|
"version": "3.6.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"nbformat": 4,
|
||||||
|
"nbformat_minor": 2
|
||||||
|
}
|
||||||
@@ -0,0 +1,5 @@
|
|||||||
|
name: production-deploy-to-aks-gpu
|
||||||
|
dependencies:
|
||||||
|
- pip:
|
||||||
|
- azureml-sdk
|
||||||
|
- tensorflow
|
||||||
Binary file not shown (image added, 61 KiB).
@@ -198,6 +198,106 @@
"inf_config = InferenceConfig(entry_script='score.py', environment=myenv)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Model Profiling\n",
"\n",
"Profile your model to understand how much CPU and memory the service, created as a result of its deployment, will need. Profiling returns information such as CPU usage, memory usage, and response latency. It also provides a CPU and memory recommendation based on the resource usage. You can profile your model (or, more precisely, the service built from your model) on any CPU and/or memory combination where 0.1 <= CPU <= 3.5 and 0.1GB <= memory <= 15GB. If you do not provide a CPU and/or memory requirement, we will test it on the default configuration of 3.5 CPU and 15GB memory.\n",
"\n",
"In order to profile your model you will need:\n",
"- a registered model\n",
"- an entry script\n",
"- an inference configuration\n",
"- a single-column tabular dataset, where each row contains a string representing sample request data sent to the service.\n",
"\n",
"At this point we only support profiling of services that expect their request data to be a string, for example: string serialized json, text, string serialized image, etc. The content of each row of the dataset (string) will be put into the body of the HTTP request and sent to the service encapsulating the model for scoring.\n",
"\n",
"Below is an example of how you can construct an input dataset to profile a service which expects its incoming requests to contain serialized json. In this case we created a dataset based on one hundred instances of the same request data. In real-world scenarios, however, we suggest that you use larger datasets with various inputs, especially if your model's resource usage or behavior is input dependent."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"from azureml.core import Datastore\n",
"from azureml.core.dataset import Dataset\n",
"from azureml.data import dataset_type_definitions\n",
"\n",
"input_json = {'data': [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
"                       [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]]}\n",
"# create a string that can be put in the body of the request\n",
"serialized_input_json = json.dumps(input_json)\n",
"dataset_content = []\n",
"for i in range(100):\n",
"    dataset_content.append(serialized_input_json)\n",
"sample_request_data = '\\n'.join(dataset_content)\n",
"file_name = 'sample_request_data.txt'\n",
"f = open(file_name, 'w')\n",
"f.write(sample_request_data)\n",
"f.close()\n",
"\n",
"# upload the txt file created above to the Datastore and create a dataset from it\n",
"data_store = Datastore.get_default(ws)\n",
"data_store.upload_files(['./' + file_name], target_path='sample_request_data')\n",
"datastore_path = [(data_store, 'sample_request_data' + '/' + file_name)]\n",
"sample_request_data = Dataset.Tabular.from_delimited_files(\n",
"    datastore_path,\n",
"    separator='\\n',\n",
"    infer_column_types=True,\n",
"    header=dataset_type_definitions.PromoteHeadersBehavior.NO_HEADERS)\n",
"sample_request_data = sample_request_data.register(workspace=ws,\n",
"                                                   name='sample_request_data',\n",
"                                                   create_new_version=True)"
]
},
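{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (a minimal sketch, assuming the registration above succeeded), you can pull the first few rows of the registered dataset back into a pandas DataFrame and confirm that each row holds one serialized request string."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# preview the first rows; each cell should contain one serialized JSON request\n",
"sample_request_data.take(3).to_pandas_dataframe()"
]
},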
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that we have an input dataset, we are ready to go ahead with profiling. In this case we are testing the previously introduced sklearn regression model on 1 CPU and 0.5 GB of memory. The memory usage and recommendation in the result are measured in gigabytes; the CPU usage and recommendation are measured in CPU cores."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from datetime import datetime\n",
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"from azureml.core.model import Model, InferenceConfig\n",
"\n",
"\n",
"environment = Environment('my-sklearn-environment')\n",
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
"    'azureml-defaults',\n",
"    'inference-schema[numpy-support]',\n",
"    'joblib',\n",
"    'numpy',\n",
"    'scikit-learn'\n",
"])\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"# if the cpu and memory_in_gb parameters are not provided,\n",
"# the model will be profiled on the default configuration of\n",
"# 3.5 CPU and 15 GB memory\n",
"profile = Model.profile(ws,\n",
"                        'sklearn-%s' % datetime.now().strftime('%m%d%Y-%H%M%S'),\n",
"                        [model],\n",
"                        inference_config,\n",
"                        input_dataset=sample_request_data,\n",
"                        cpu=1.0,\n",
"                        memory_in_gb=0.5)\n",
"\n",
"profile.wait_for_completion(True)\n",
"details = profile.get_details()"
]
},
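{
"cell_type": "markdown",
"metadata": {},
"source": [
"The exact keys of the profiling result can vary between SDK versions, so the sketch below simply prints the whole dictionary returned by `get_details()` rather than assuming specific field names for the CPU and memory recommendations."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# dump the full profiling result, including usage measurements and recommendations\n",
"print(details)"
]
},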
{
"cell_type": "markdown",
"metadata": {},
@@ -318,7 +418,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"sample-deploy-to-aks"
]
},
"outputs": [],
"source": [
"# Set the web service configuration (using default here)\n",
@@ -331,7 +435,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"sample-deploy-to-aks"
]
},
"outputs": [],
"source": [
"%%time\n",
@@ -1,457 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register Model, Create Image and Deploy Service\n",
"\n",
"This example shows how to deploy a web service in a step-by-step fashion:\n",
"\n",
" 1. Register model\n",
" 2. Query versions of models and select one to deploy\n",
" 3. Create Docker image\n",
" 4. Query versions of images\n",
" 5. Deploy the image as a web service\n",
" \n",
"**IMPORTANT**:\n",
" * This notebook requires you to first complete the [train-within-notebook](../../training/train-within-notebook/train-within-notebook.ipynb) example\n",
" \n",
"The train-within-notebook example taught you how to deploy a web service directly from a model in one step. This notebook shows a more advanced approach that gives you more control over model versions and Docker image versions."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the [configuration](../../../configuration.ipynb) notebook first if you haven't."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Check core SDK version number\n",
"import azureml.core\n",
"\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object from persisted configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create workspace"
]
},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Register Model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can add tags and descriptions to your models. Note you need to have a `sklearn_linreg_model.pkl` file in the current directory. This file is generated by the 01 notebook. The below call registers that file as a model with the same name `sklearn_linreg_model.pkl` in the workspace.\n",
|
|
||||||
"\n",
|
|
||||||
"Using tags, you can track useful information such as the name and version of the machine learning library used to train the model. Note that tags must be alphanumeric."
|
|
||||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"from azureml.core.model import Model\n",
"import sklearn\n",
"\n",
"library_version = \"sklearn\"+sklearn.__version__.replace(\".\",\"x\")\n",
"\n",
"model = Model.register(model_path = \"sklearn_regression_model.pkl\",\n",
"                       model_name = \"sklearn_regression_model.pkl\",\n",
"                       tags = {'area': \"diabetes\", 'type': \"regression\", 'version': library_version},\n",
"                       description = \"Ridge regression model to predict diabetes\",\n",
"                       workspace = ws)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can explore the registered models within your workspace and query by tag. Models are versioned. If you call the register_model command many times with same model name, you will get multiple versions of the model with increasing version numbers."
|
|
||||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"regression_models = Model.list(workspace=ws, tags=['area'])\n",
"for m in regression_models:\n",
"    print(\"Name:\", m.name, \"\\tVersion:\", m.version, \"\\tDescription:\", m.description, m.tags)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can pick a specific model to deploy."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(model.name, model.description, model.version, sep = '\\t')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create Docker Image"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Show `score.py`. Note that the `sklearn_regression_model.pkl` in the `get_model_path` call refers to the model named `sklearn_regression_model.pkl` registered in the workspace. It is NOT referencing the local file."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import os\n",
"import pickle\n",
"import json\n",
"import numpy\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"def init():\n",
"    global model\n",
"    # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
"    # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
"    # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
"    model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
"    # deserialize the model file back into a sklearn model\n",
"    model = joblib.load(model_path)\n",
"\n",
"# note you can pass in multiple rows for scoring\n",
"def run(raw_data):\n",
"    try:\n",
"        data = json.loads(raw_data)['data']\n",
"        data = numpy.array(data)\n",
"        result = model.predict(data)\n",
"        # you can return any datatype as long as it is JSON-serializable\n",
"        return result.tolist()\n",
"    except Exception as e:\n",
"        error = str(e)\n",
"        return error"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'])\n",
"\n",
"with open(\"myenv.yml\",\"w\") as f:\n",
"    f.write(myenv.serialize_to_string())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that following command can take few minutes. \n",
|
|
||||||
"\n",
|
|
||||||
"You can add tags and descriptions to images. Also, an image can contain multiple models."
|
|
||||||
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image",
"sample-image-create"
]
},
"outputs": [],
"source": [
"from azureml.core.image import Image, ContainerImage\n",
"\n",
"image_config = ContainerImage.image_configuration(runtime= \"python\",\n",
"                                 execution_script=\"score.py\",\n",
"                                 conda_file=\"myenv.yml\",\n",
"                                 tags = {'area': \"diabetes\", 'type': \"regression\"},\n",
"                                 description = \"Image with ridge regression model\")\n",
"\n",
"image = Image.create(name = \"myimage1\",\n",
"                     # this is the model object. note you can pass in 0-n models via this list-type parameter\n",
"                     # in case you need to reference multiple models, or none at all, in your scoring script.\n",
"                     models = [model],\n",
"                     image_config = image_config,\n",
"                     workspace = ws)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image"
]
},
"outputs": [],
"source": [
"image.wait_for_creation(show_output = True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Use a custom Docker image\n",
|
|
||||||
"\n",
|
|
||||||
"You can also specify a custom Docker image to be used as base image if you don't want to use the default base image provided by Azure ML. Please make sure the custom Docker image has Ubuntu >= 16.04, Conda >= 4.5.\\* and Python(3.5.\\* or 3.6.\\*).\n",
|
|
||||||
"\n",
|
|
||||||
"Only Supported for `ContainerImage`(from azureml.core.image) with `python` runtime.\n",
|
|
||||||
"```python\n",
|
|
||||||
"# use an image available in public Container Registry without authentication\n",
|
|
||||||
"image_config.base_image = \"mcr.microsoft.com/azureml/o16n-sample-user-base/ubuntu-miniconda\"\n",
|
|
||||||
"\n",
|
|
||||||
"# or, use an image available in a private Container Registry\n",
|
|
||||||
"image_config.base_image = \"myregistry.azurecr.io/mycustomimage:1.0\"\n",
|
|
||||||
"image_config.base_image_registry.address = \"myregistry.azurecr.io\"\n",
|
|
||||||
"image_config.base_image_registry.username = \"username\"\n",
|
|
||||||
"image_config.base_image_registry.password = \"password\"\n",
|
|
||||||
"\n",
|
|
||||||
"# or, use an image built during training.\n",
|
|
||||||
"image_config.base_image = run.properties[\"AzureML.DerivedImageName\"]\n",
|
|
||||||
"```\n",
|
|
||||||
"You can get the address of training image from the properties of a Run object. Only new runs submitted with azureml-sdk>=1.0.22 to AMLCompute targets will have the 'AzureML.DerivedImageName' property. Instructions on how to get a Run can be found in [manage-runs](../../training/manage-runs/manage-runs.ipynb). \n"
|
|
||||||
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"List images by tag and find out the detailed build log for debugging."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create image"
]
},
"outputs": [],
"source": [
"for i in Image.list(workspace = ws, tags = [\"area\"]):\n",
"    print('{}(v.{} [{}]) stored at {} with build log {}'.format(i.name, i.version, i.creation_state, i.image_location, i.image_build_log_uri))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploy image as web service on Azure Container Instance\n",
"\n",
"Note that the service creation can take a few minutes."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci",
"sample-aciwebservice-deploy-config"
]
},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1,\n",
"                                               memory_gb = 1,\n",
"                                               tags = {'area': \"diabetes\", 'type': \"regression\"},\n",
"                                               description = 'Predict diabetes using regression model')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci",
"sample-aciwebservice-deploy-from-image"
]
},
"outputs": [],
"source": [
"from azureml.core.webservice import Webservice\n",
"\n",
"aci_service_name = 'my-aci-service-2'\n",
"print(aci_service_name)\n",
"aci_service = Webservice.deploy_from_image(deployment_config = aciconfig,\n",
"                                           image = image,\n",
"                                           name = aci_service_name,\n",
"                                           workspace = ws)\n",
"aci_service.wait_for_deployment(True)\n",
"print(aci_service.state)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Test web service"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Call the web service with some dummy input data to get a prediction."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci"
]
},
"outputs": [],
"source": [
"import json\n",
"\n",
"test_sample = json.dumps({'data': [\n",
"    [1,2,3,4,5,6,7,8,9,10],\n",
"    [10,9,8,7,6,5,4,3,2,1]\n",
"]})\n",
"test_sample = bytes(test_sample, encoding = 'utf8')\n",
"\n",
"prediction = aci_service.run(input_data=test_sample)\n",
"print(prediction)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Delete ACI to clean up"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"deploy service",
"aci"
]
},
"outputs": [],
"source": [
"aci_service.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "aashishb"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.6"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
@@ -1,8 +0,0 @@
name: register-model-create-image-deploy-service
dependencies:
- pip:
  - azureml-sdk
  - matplotlib
  - tqdm
  - scipy
  - sklearn