Compare commits

...

66 Commits

Author SHA1 Message Date
vizhur
879a272a8d update samples from Release-52 as a part of SDK release 2020-05-18 19:21:05 +00:00
Harneet Virk
bc65bde097 Merge pull request #971 from Azure/release_update/Release-51
update samples from Release-51 as a part of  SDK release
2020-05-13 22:17:45 -07:00
vizhur
690bdfbdbe update samples from Release-51 as a part of SDK release 2020-05-14 05:03:47 +00:00
Harneet Virk
3c02bd8782 Merge pull request #967 from Azure/release_update/Release-50
update samples from Release-50 as a part of  SDK release
2020-05-12 19:57:40 -07:00
vizhur
5c14610a1c update samples from Release-50 as a part of SDK release 2020-05-13 02:45:40 +00:00
Harneet Virk
4e3afae6fb Merge pull request #965 from Azure/release_update/Release-49
update samples from Release-49 as a part of  SDK release
2020-05-11 19:25:28 -07:00
vizhur
a2144aa083 update samples from Release-49 as a part of SDK release 2020-05-12 02:24:34 +00:00
Harneet Virk
0e6334178f Merge pull request #963 from Azure/release_update/Release-46
update samples from Release-46 as a part of  SDK release
2020-05-11 14:49:34 -07:00
vizhur
4ec9178d22 update samples from Release-46 as a part of SDK release 2020-05-11 21:48:31 +00:00
Harneet Virk
2aa7c53b0c Merge pull request #962 from Azure/release_update_stablev2/Release-11
update samples from Release-11 as a part of 1.5.0 SDK stable release
2020-05-11 12:42:32 -07:00
vizhur
553fa43e17 update samples from Release-11 as a part of 1.5.0 SDK stable release 2020-05-11 18:59:22 +00:00
Harneet Virk
e98131729e Merge pull request #949 from Azure/release_update_stablev2/Release-8
update samples from Release-8 as a part of 1.4.0 SDK stable release
2020-04-27 11:00:37 -07:00
vizhur
fd2b09e2c2 update samples from Release-8 as a part of 1.4.0 SDK stable release 2020-04-27 17:44:41 +00:00
Harneet Virk
7970209069 Merge pull request #930 from Azure/release_update/Release-44
update samples from Release-44 as a part of  SDK release
2020-04-17 12:46:29 -07:00
vizhur
24f8651bb5 update samples from Release-44 as a part of SDK release 2020-04-17 19:45:37 +00:00
Harneet Virk
b881f78e46 Merge pull request #918 from Azure/release_update_stablev2/Release-6
update samples from Release-6 as a part of 1.3.0 SDK stable release
2020-04-13 09:23:38 -07:00
vizhur
057e22b253 update samples from Release-6 as a part of 1.3.0 SDK stable release 2020-04-13 16:22:23 +00:00
Harneet Virk
c520bd1d41 Merge pull request #884 from Azure/release_update/Release-43
update samples from Release-43 as a part of  SDK release
2020-03-23 16:49:27 -07:00
vizhur
d3f1212440 update samples from Release-43 as a part of SDK release 2020-03-23 23:39:45 +00:00
Harneet Virk
b95a65eef4 Merge pull request #883 from Azure/release_update_stablev2/Release-3
update samples from Release-3 as a part of 1.2.0 SDK stable release
2020-03-23 16:21:53 -07:00
vizhur
2218af619f update samples from Release-3 as a part of 1.2.0 SDK stable release 2020-03-23 23:11:53 +00:00
Harneet Virk
0401128638 Merge pull request #878 from Azure/release_update/Release-42
update samples from Release-42 as a part of  SDK release
2020-03-20 11:14:02 -07:00
vizhur
59fcb54998 update samples from Release-42 as a part of SDK release 2020-03-20 18:10:08 +00:00
Harneet Virk
e0ea99a6bb Merge pull request #862 from Azure/release_update/Release-41
update samples from Release-41 as a part of  SDK release
2020-03-13 14:57:58 -07:00
vizhur
b06f5ce269 update samples from Release-41 as a part of SDK release 2020-03-13 21:57:04 +00:00
Harneet Virk
ed0ce9e895 Merge pull request #856 from Azure/release_update/Release-40
update samples from Release-40 as a part of  SDK release
2020-03-12 12:28:18 -07:00
vizhur
71053d705b update samples from Release-40 as a part of SDK release 2020-03-12 19:25:26 +00:00
Harneet Virk
77f98bf75f Merge pull request #852 from Azure/release_update_stable/Release-6
update samples from Release-6 as a part of 1.1.5 SDK stable release
2020-03-11 15:37:59 -06:00
vizhur
e443fd1342 update samples from Release-6 as a part of 1.1.5rc0 SDK stable release 2020-03-11 19:51:02 +00:00
Harneet Virk
2165cf308e update samples from Release-25 as a part of 1.1.2rc0 SDK experimental release (#829)
Co-authored-by: vizhur <vizhur@live.com>
2020-03-02 15:42:04 -05:00
Harneet Virk
3d6caa10a3 Merge pull request #801 from Azure/release_update/Release-39
update samples from Release-39 as a part of  SDK release
2020-02-13 19:03:36 -07:00
vizhur
4df079db1c update samples from Release-39 as a part of SDK release 2020-02-14 02:01:41 +00:00
Sander Vanhove
67d0b02ef9 Fix broken link in README (#797) 2020-02-13 08:20:28 -05:00
Harneet Virk
4e7b3784d5 Merge pull request #788 from Azure/release_update/Release-38
update samples from Release-38 as a part of  SDK release
2020-02-11 13:16:15 -07:00
vizhur
ed91e39d7e update samples from Release-38 as a part of SDK release 2020-02-11 20:00:16 +00:00
Harneet Virk
a09a1a16a7 Merge pull request #780 from Azure/release_update/Release-37
update samples from Release-37 as a part of  SDK release
2020-02-07 21:52:34 -07:00
vizhur
9662505517 update samples from Release-37 as a part of SDK release 2020-02-08 04:49:27 +00:00
Harneet Virk
8e103c02ff Merge pull request #779 from Azure/release_update/Release-36
update samples from Release-36 as a part of  SDK release
2020-02-07 21:40:57 -07:00
vizhur
ecb5157add update samples from Release-36 as a part of SDK release 2020-02-08 04:35:14 +00:00
Shané Winner
d7d23d5e7c Update index.md 2020-02-05 22:41:22 -08:00
Harneet Virk
83a21ba53a update samples from Release-35 as a part of SDK release (#765)
Co-authored-by: vizhur <vizhur@live.com>
2020-02-05 20:03:41 -05:00
Harneet Virk
3c9cb89c1a update samples from Release-18 as a part of 1.1.0rc0 SDK experimental release (#760)
Co-authored-by: vizhur <vizhur@live.com>
2020-02-04 22:19:52 -05:00
Sheri Gilley
cca7c2e26f add cell metadata 2020-02-04 11:31:07 -06:00
Harneet Virk
e895d7c2bf update samples - test (#758)
Co-authored-by: vizhur <vizhur@live.com>
2020-01-31 15:19:58 -05:00
Shané Winner
3588eb9665 Update index.md 2020-01-23 15:46:43 -08:00
Harneet Virk
a09e726f31 update samples - test (#748)
Co-authored-by: vizhur <vizhur@live.com>
2020-01-23 16:50:29 -05:00
Shané Winner
4fb1d9ee5b Update index.md 2020-01-22 11:38:24 -08:00
Harneet Virk
b05ff80e9d update samples from Release-169 as a part of 1.0.85 SDK release (#742)
Co-authored-by: vizhur <vizhur@live.com>
2020-01-21 18:00:15 -05:00
Shané Winner
512630472b Update index.md 2020-01-08 14:52:23 -08:00
vizhur
ae1337fe70 Merge pull request #724 from Azure/release_update/Release-167
update samples from Release-167 as a part of 1.0.83 SDK release
2020-01-06 15:38:25 -05:00
vizhur
c95f970dc8 update samples from Release-167 as a part of 1.0.83 SDK release 2020-01-06 20:16:21 +00:00
Shané Winner
9b9d112719 Update index.md 2019-12-24 07:40:48 -08:00
vizhur
fe8fcd4b48 Merge pull request #712 from Azure/release_update/Release-31
update samples - test
2019-12-23 20:28:02 -05:00
vizhur
296ae01587 update samples - test 2019-12-24 00:42:48 +00:00
Shané Winner
8f4efe15eb Update index.md 2019-12-10 09:05:23 -08:00
vizhur
d179080467 Merge pull request #690 from Azure/release_update/Release-163
update samples from Release-163 as a part of 1.0.79 SDK release
2019-12-09 15:41:03 -05:00
vizhur
0040644e7a update samples from Release-163 as a part of 1.0.79 SDK release 2019-12-09 20:09:30 +00:00
Shané Winner
8aa04307fb Update index.md 2019-12-03 10:24:18 -08:00
Shané Winner
a525da4488 Update index.md 2019-11-27 13:08:21 -08:00
Shané Winner
e149565a8a Merge pull request #679 from Azure/release_update/Release-30
update samples - test
2019-11-27 13:05:00 -08:00
vizhur
75610ec31c update samples - test 2019-11-27 21:02:21 +00:00
Shané Winner
0c2c450b6b Update index.md 2019-11-25 14:34:48 -08:00
Shané Winner
0d548eabff Merge pull request #677 from Azure/release_update/Release-29
update samples - test
2019-11-25 14:31:50 -08:00
vizhur
e4029801e6 update samples - test 2019-11-25 22:24:09 +00:00
Shané Winner
156974ee7b Update index.md 2019-11-25 11:42:53 -08:00
Shané Winner
1f05157d24 Merge pull request #676 from Azure/release_update/Release-160
update samples from Release-160 as a part of 1.0.76 SDK release
2019-11-25 11:39:27 -08:00
305 changed files with 21412 additions and 11156 deletions

View File

@@ -2,7 +2,7 @@
This repository contains example notebooks demonstrating the [Azure Machine Learning](https://azure.microsoft.com/en-us/services/machine-learning-service/) Python SDK which allows you to build, train, deploy and manage machine learning solutions using Azure. The AML SDK allows you the choice of using local or cloud compute resources, while managing and maintaining the complete data science workflow from the cloud.
![Azure ML Workflow](https://raw.githubusercontent.com/MicrosoftDocs/azure-docs/master/articles/machine-learning/service/media/concept-azure-machine-learning-architecture/workflow.png)
![Azure ML Workflow](https://raw.githubusercontent.com/MicrosoftDocs/azure-docs/master/articles/machine-learning/media/concept-azure-machine-learning-architecture/workflow.png)
## Quick installation
@@ -13,15 +13,15 @@ Read more detailed instructions on [how to set up your environment](./NBSETUP.md
## How to navigate and use the example notebooks?
If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, you should always run the [Configuration](./configuration.ipynb) notebook first when setting up a notebook library on a new machine or in a new environment. It configures your notebook library to connect to an Azure Machine Learning workspace, and sets up your workspace and compute to be used by many of the other examples.
This [index](.index.md) should assist in navigating the Azure Machine Learning notebook samples and encourage efficient retrieval of topics and content.
This [index](./index.md) should assist in navigating the Azure Machine Learning notebook samples and encourage efficient retrieval of topics and content.
If you want to...
* ...try out and explore Azure ML, start with image classification tutorials: [Part 1 (Training)](./tutorials/img-classification-part1-training.ipynb) and [Part 2 (Deployment)](./tutorials/img-classification-part2-deploy.ipynb).
* ...try out and explore Azure ML, start with image classification tutorials: [Part 1 (Training)](./tutorials/image-classification-mnist-data/img-classification-part1-training.ipynb) and [Part 2 (Deployment)](./tutorials/image-classification-mnist-data/img-classification-part2-deploy.ipynb).
* ...learn about experimentation and tracking run history, first [train within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then try [training on remote VM](./how-to-use-azureml/training/train-on-remote-vm/train-on-remote-vm.ipynb) and [using logging APIs](./how-to-use-azureml/training/logging-api/logging-api.ipynb).
* ...train deep learning models at scale, first learn about [Machine Learning Compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and then try [distributed hyperparameter tuning](./how-to-use-azureml/training-with-deep-learning/train-hyperparameter-tune-deploy-with-pytorch/train-hyperparameter-tune-deploy-with-pytorch.ipynb) and [distributed training](./how-to-use-azureml/training-with-deep-learning/distributed-pytorch-with-horovod/distributed-pytorch-with-horovod.ipynb).
* ...deploy models as a realtime scoring service, first learn the basics by [training within Notebook and deploying to Azure Container Instance](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then learn how to [register and manage models, and create Docker images](./how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb), and [production deploy models on Azure Kubernetes Cluster](./how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb).
* ...deploy models as a batch scoring service, first [train a model within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), learn how to [register and manage models](./how-to-use-azureml/deployment/register-model-create-image-deploy-service/register-model-create-image-deploy-service.ipynb), then [create Machine Learning Compute for scoring compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and [use Machine Learning Pipelines to deploy your model](https://aka.ms/pl-batch-scoring).
* ...deploy models as a realtime scoring service, first learn the basics by [training within Notebook and deploying to Azure Container Instance](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then learn how to [production deploy models on Azure Kubernetes Cluster](./how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.ipynb).
* ...deploy models as a batch scoring service, first [train a model within Notebook](./how-to-use-azureml/training/train-within-notebook/train-within-notebook.ipynb), then [create Machine Learning Compute for scoring compute](./how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb), and [use Machine Learning Pipelines to deploy your model](https://aka.ms/pl-batch-scoring).
* ...monitor your deployed models, learn about using [App Insights](./how-to-use-azureml/deployment/enable-app-insights-in-production-service/enable-app-insights-in-production-service.ipynb).
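The Configuration notebook mentioned above amounts to writing a `config.json` and connecting to the workspace. A minimal sketch of that connection, assuming the config file has already been created:

```python
from azureml.core import Workspace

# Assumes configuration.ipynb has already written config.json for this workspace
ws = Workspace.from_config()
print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep="\n")
```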
## Tutorials

View File

@@ -103,7 +103,7 @@
"source": [
"import azureml.core\n",
"\n",
"print(\"This notebook was created using version 1.0.76 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},

View File

@@ -9,7 +9,6 @@ As a pre-requisite, run the [configuration Notebook](../configuration.ipynb) not
* [train-on-amlcompute](./training/train-on-amlcompute): Use a 1-n node Azure ML managed compute cluster for remote runs on Azure CPU or GPU infrastructure.
* [train-on-remote-vm](./training/train-on-remote-vm): Use Data Science Virtual Machine as a target for remote runs.
* [logging-api](./track-and-monitor-experiments/logging-api): Learn about the details of logging metrics to run history.
* [register-model-create-image-deploy-service](./deployment/register-model-create-image-deploy-service): Learn about the details of model management.
* [production-deploy-to-aks](./deployment/production-deploy-to-aks) Deploy a model to production at scale on Azure Kubernetes Service.
* [enable-app-insights-in-production-service](./deployment/enable-app-insights-in-production-service) Learn how to use App Insights with production web service.

View File

@@ -1,8 +1,8 @@
# Table of Contents
1. [Automated ML Introduction](#introduction)
1. [Setup using Azure Notebooks](#jupyter)
1. [Setup using Azure Databricks](#databricks)
1. [Setup using Compute Instances](#jupyter)
1. [Setup using a Local Conda environment](#localconda)
1. [Setup using Azure Databricks](#databricks)
1. [Automated ML SDK Sample Notebooks](#samples)
1. [Documentation](#documentation)
1. [Running using python command](#pythoncommand)
@@ -21,13 +21,13 @@ Below are the three execution environments supported by automated ML.
<a name="jupyter"></a>
## Setup using Notebook VMs - Jupyter based notebooks from a Azure VM
## Setup using Compute Instances - Jupyter based notebooks from an Azure Virtual Machine
1. Open the [ML Azure portal](https://ml.azure.com)
1. Select Compute
1. Select Notebook VMs
1. Select Compute Instances
1. Click New
1. Type a name for the Vm and select a VM type
1. Type a Compute Name, select a Virtual Machine type and select a Virtual Machine size
1. Click Create
<a name="localconda"></a>
@@ -117,7 +117,7 @@ jupyter notebook
- Simple example of using automated ML for regression
- Uses azure compute for training
- [auto-ml-regression-hardware-performance-explanation-and-featurization.ipynb](regression-hardware-performance-explanation-and-featurization/auto-ml-regression-hardware-performance-explanation-and-featurization.ipynb)
- [auto-ml-regression-explanation-featurization.ipynb](regression-explanation-featurization/auto-ml-regression-explanation-featurization.ipynb)
- Dataset: Hardware Performance Dataset
- Shows featurization and explanation
- Uses azure compute for training
@@ -144,7 +144,7 @@ jupyter notebook
- Dataset: forecasting for a bike-sharing
- Example of training an automated ML forecasting model on multiple time-series
- [automl-forecasting-function.ipynb](forecasting-high-frequency/automl-forecasting-function.ipynb)
- [auto-ml-forecasting-function.ipynb](forecasting-high-frequency/auto-ml-forecasting-function.ipynb)
- Example of training an automated ML forecasting model on multiple time-series
- [auto-ml-forecasting-beer-remote.ipynb](forecasting-beer-remote/auto-ml-forecasting-beer-remote.ipynb)
@@ -152,7 +152,7 @@ jupyter notebook
- Beer Production Forecasting
- [auto-ml-continuous-retraining.ipynb](continuous-retraining/auto-ml-continuous-retraining.ipynb)
- Continous retraining using Pipelines and Time-Series TabularDataset
- Continuous retraining using Pipelines and Time-Series TabularDataset
- [auto-ml-classification-text-dnn.ipynb](classification-text-dnn/auto-ml-classification-text-dnn.ipynb)
- Classification with text data using deep learning in AutoML
@@ -197,6 +197,17 @@ If automl_setup_linux.sh fails on Ubuntu Linux with the error: `unable to execut
4) Check that the region is one of the supported regions: `eastus2`, `eastus`, `westcentralus`, `southeastasia`, `westeurope`, `australiaeast`, `westus2`, `southcentralus`
5) Check that you have access to the region using the Azure Portal.
## import AutoMLConfig fails after upgrade from before 1.0.76 to 1.0.76 or later
There were package changes in automated machine learning version 1.0.76, which require the previous version to be uninstalled before upgrading to the new version.
If you have manually upgraded from a version of automated machine learning before 1.0.76 to 1.0.76 or later, you may get the error:
`ImportError: cannot import name 'AutoMLConfig'`
This can be resolved by running:
`pip uninstall azureml-train-automl` and then
`pip install azureml-train-automl`
The automl_setup.cmd script does this automatically.
## workspace.from_config fails
If the call `ws = Workspace.from_config()` fails:
1) Make sure that you have run the `configuration.ipynb` notebook successfully.
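If no `config.json` exists yet, a hedged workaround is to fetch the workspace explicitly and persist the config; the names below are placeholders, not values from this repository:

```python
from azureml.core import Workspace

# Placeholder values -- substitute your own subscription, resource group, and workspace name
ws = Workspace.get(name="myworkspace",
                   subscription_id="<subscription-id>",
                   resource_group="myresourcegroup")
ws.write_config()  # writes config.json so that Workspace.from_config() succeeds afterwards
```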

View File

@@ -2,21 +2,20 @@ name: azure_automl
dependencies:
# The python interpreter version.
# Currently Azure ML only supports 3.5.2 and later.
- pip
- pip<=19.3.1
- python>=3.5.2,<3.6.8
- nb_conda
- matplotlib==2.1.0
- numpy>=1.16.0,<=1.16.2
- cython
- urllib3<1.24
- scipy>=1.0.0,<=1.1.0
- scipy==1.4.1
- scikit-learn>=0.19.0,<=0.20.3
- pandas>=0.22.0,<=0.23.4
- py-xgboost<=0.80
- pyarrow>=0.11.0
- fbprophet==0.5
- pytorch=1.1.0
- cudatoolkit=9.0
- py-xgboost<=0.90
- conda-forge::fbprophet==0.5
- pytorch::pytorch=1.4.0
- cudatoolkit=10.1.243
- pip:
# Required packages for AzureML execution, history, and data preparation.
@@ -24,15 +23,9 @@ dependencies:
- azureml-train-automl
- azureml-train
- azureml-widgets
- azureml-explain-model
- azureml-pipeline
- azureml-contrib-interpret
- pytorch-transformers==1.0.0
- spacy==2.1.8
- joblib
- onnxruntime==0.4.0
- pyarrow==0.17.0
- https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz
channels:
- conda-forge
- pytorch

View File

@@ -2,7 +2,7 @@ name: azure_automl
dependencies:
# The python interpreter version.
# Currently Azure ML only supports 3.5.2 and later.
- pip
- pip<=19.3.1
- nomkl
- python>=3.5.2,<3.6.8
- nb_conda
@@ -10,13 +10,12 @@ dependencies:
- numpy>=1.16.0,<=1.16.2
- cython
- urllib3<1.24
- scipy>=1.0.0,<=1.1.0
- scipy==1.4.1
- scikit-learn>=0.19.0,<=0.20.3
- pandas>=0.22.0,<0.23.0
- py-xgboost<=0.80
- pyarrow>=0.11.0
- fbprophet==0.5
- pytorch=1.1.0
- pandas>=0.22.0,<=0.23.4
- py-xgboost<=0.90
- conda-forge::fbprophet==0.5
- pytorch::pytorch=1.4.0
- cudatoolkit=9.0
- pip:
@@ -25,15 +24,8 @@ dependencies:
- azureml-train-automl
- azureml-train
- azureml-widgets
- azureml-explain-model
- azureml-pipeline
- azureml-contrib-interpret
- pytorch-transformers==1.0.0
- spacy==2.1.8
- joblib
- onnxruntime==0.4.0
- pyarrow==0.17.0
- https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz
channels:
- conda-forge
- pytorch

View File

@@ -14,8 +14,9 @@ IF "%CONDA_EXE%"=="" GOTO CondaMissing
call conda activate %conda_env_name% 2>nul:
if not errorlevel 1 (
echo Upgrading azureml-sdk[automl,notebooks,explain] in existing conda environment %conda_env_name%
call pip install --upgrade azureml-sdk[automl,notebooks,explain]
echo Upgrading existing conda environment %conda_env_name%
call pip uninstall azureml-train-automl -y -q
call conda env update --name %conda_env_name% --file %automl_env_file%
if errorlevel 1 goto ErrorExit
) else (
call conda env create -f %automl_env_file% -n %conda_env_name%

View File

@@ -22,8 +22,9 @@ fi
if source activate $CONDA_ENV_NAME 2> /dev/null
then
echo "Upgrading azureml-sdk[automl,notebooks,explain] in existing conda environment" $CONDA_ENV_NAME
pip install --upgrade azureml-sdk[automl,notebooks,explain] &&
echo "Upgrading existing conda environment" $CONDA_ENV_NAME
pip uninstall azureml-train-automl -y -q
conda env update --name $CONDA_ENV_NAME --file $AUTOML_ENV_FILE &&
jupyter nbextension uninstall --user --py azureml.widgets
else
conda env create -f $AUTOML_ENV_FILE -n $CONDA_ENV_NAME &&

View File

@@ -22,8 +22,9 @@ fi
if source activate $CONDA_ENV_NAME 2> /dev/null
then
echo "Upgrading azureml-sdk[automl,notebooks,explain] in existing conda environment" $CONDA_ENV_NAME
pip install --upgrade azureml-sdk[automl,notebooks,explain] &&
echo "Upgrading existing conda environment" $CONDA_ENV_NAME
pip uninstall azureml-train-automl -y -q
conda env update --name $CONDA_ENV_NAME --file $AUTOML_ENV_FILE &&
jupyter nbextension uninstall --user --py azureml.widgets
else
conda env create -f $AUTOML_ENV_FILE -n $CONDA_ENV_NAME &&

View File

@@ -41,7 +41,7 @@
"\n",
"In this example we use the UCI Bank Marketing dataset to showcase how you can use AutoML for a classification problem and deploy it to an Azure Container Instance (ACI). The classification goal is to predict if the client will subscribe to a term deposit with the bank.\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"If you are using an Azure Machine Learning Compute Instance, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"\n",
"Please find the ONNX related documentations [here](https://github.com/onnx/onnx).\n",
"\n",
@@ -92,6 +92,49 @@
"from azureml.explain.model._internal.explanation_client import ExplanationClient"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Accessing the Azure ML workspace requires authentication with Azure.\n",
"\n",
"The default authentication is interactive authentication using the default tenant. Executing the `ws = Workspace.from_config()` line in the cell below will prompt for authentication the first time that it is run.\n",
"\n",
"If you have multiple Azure tenants, you can specify the tenant by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
"\n",
"```\n",
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"auth = InteractiveLoginAuthentication(tenant_id = 'mytenantid')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"\n",
"If you need to run in an environment where interactive login is not possible, you can use Service Principal authentication by replacing the `ws = Workspace.from_config()` line in the cell below with the following:\n",
"\n",
"```\n",
"from azureml.core.authentication import ServicePrincipalAuthentication\n",
"auth = auth = ServicePrincipalAuthentication('mytenantid', 'myappid', 'mypassword')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"For more details, see [aka.ms/aml-notebook-auth](http://aka.ms/aml-notebook-auth)"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -106,7 +149,6 @@
"experiment=Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -134,35 +176,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-4\"\n",
"# Choose a name for your CPU cluster\n",
"cpu_cluster_name = \"cpu-cluster-4\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 6)\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=cpu_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D2_V2',\n",
" max_nodes=6)\n",
" compute_target = ComputeTarget.create(ws, cpu_cluster_name, compute_config)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
" \n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -285,9 +314,10 @@
"|**task**|classification or regression or forecasting|\n",
"|**primary_metric**|This is the metric that you want to optimize. Classification supports the following primary metrics: <br><i>accuracy</i><br><i>AUC_weighted</i><br><i>average_precision_score_weighted</i><br><i>norm_macro_recall</i><br><i>precision_score_weighted</i>|\n",
"|**iteration_timeout_minutes**|Time limit in minutes for each iteration.|\n",
"|**blacklist_models** or **whitelist_models** |*List* of *strings* indicating machine learning algorithms for AutoML to avoid in this run.<br><br> Allowed values for **Classification**<br><i>LogisticRegression</i><br><i>SGD</i><br><i>MultinomialNaiveBayes</i><br><i>BernoulliNaiveBayes</i><br><i>SVM</i><br><i>LinearSVM</i><br><i>KNN</i><br><i>DecisionTree</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>GradientBoosting</i><br><i>TensorFlowDNN</i><br><i>TensorFlowLinearClassifier</i><br><br>Allowed values for **Regression**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><br>Allowed values for **Forecasting**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><i>Arima</i><br><i>Prophet</i>|\n",
"|**blacklist_models** | *List* of *strings* indicating machine learning algorithms for AutoML to avoid in this run. <br><br> Allowed values for **Classification**<br><i>LogisticRegression</i><br><i>SGD</i><br><i>MultinomialNaiveBayes</i><br><i>BernoulliNaiveBayes</i><br><i>SVM</i><br><i>LinearSVM</i><br><i>KNN</i><br><i>DecisionTree</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>GradientBoosting</i><br><i>TensorFlowDNN</i><br><i>TensorFlowLinearClassifier</i><br><br>Allowed values for **Regression**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><br>Allowed values for **Forecasting**<br><i>ElasticNet</i><br><i>GradientBoosting</i><br><i>DecisionTree</i><br><i>KNN</i><br><i>LassoLars</i><br><i>SGD</i><br><i>RandomForest</i><br><i>ExtremeRandomTrees</i><br><i>LightGBM</i><br><i>TensorFlowLinearRegressor</i><br><i>TensorFlowDNN</i><br><i>Arima</i><br><i>Prophet</i>|\n",
"| **whitelist_models** | *List* of *strings* indicating machine learning algorithms for AutoML to use in this run. Same values listed above for **blacklist_models** allowed for **whitelist_models**.|\n",
"|**experiment_exit_score**| Value indicating the target for *primary_metric*. <br>Once the target is surpassed the run terminates.|\n",
"|**experiment_timeout_minutes**| Maximum amount of time in minutes that all iterations combined can take before the experiment terminates.|\n",
"|**experiment_timeout_hours**| Maximum amount of time in hours that all iterations combined can take before the experiment terminates.|\n",
"|**enable_early_stopping**| Flag to enble early termination if the score is not improving in the short term.|\n",
"|**featurization**| 'auto' / 'off' Indicator for whether featurization step should be done automatically or not. Note: If the input data is sparse, featurization cannot be turned on.|\n",
"|**n_cross_validations**|Number of cross validation splits.|\n",
@@ -304,7 +334,7 @@
"outputs": [],
"source": [
"automl_settings = {\n",
" \"experiment_timeout_minutes\" : 20,\n",
" \"experiment_timeout_hours\" : 0.3,\n",
" \"enable_early_stopping\" : True,\n",
" \"iteration_timeout_minutes\": 5,\n",
" \"max_concurrent_iterations\": 4,\n",
@@ -367,8 +397,6 @@
"outputs": [],
"source": [
"#from azureml.train.automl.run import AutoMLRun\n",
"#experiment_name = 'automl-classification-bmarketing'\n",
"#experiment = Experiment(ws, experiment_name)\n",
"#remote_run = AutoMLRun(experiment=experiment, run_id='<run_ID_goes_here')\n",
"#remote_run"
]
@@ -456,6 +484,72 @@
"RunDetails(remote_run).show() "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Retrieve the Best Model's explanation\n",
"Retrieve the explanation from the best_run which includes explanations for engineered features and raw features. Make sure that the run for generating explanations for the best model is completed."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Wait for the best model explanation run to complete\n",
"from azureml.core.run import Run\n",
"model_explainability_run_id = remote_run.get_properties().get('ModelExplainRunId')\n",
"print(model_explainability_run_id)\n",
"if model_explainability_run_id is not None:\n",
" model_explainability_run = Run(experiment=experiment, run_id=model_explainability_run_id)\n",
" model_explainability_run.wait_for_completion()\n",
"\n",
"# Get the best run object\n",
"best_run, fitted_model = remote_run.get_output()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Download engineered feature importance from artifact store\n",
"You can use ExplanationClient to download the engineered feature explanations from the artifact store of the best_run."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"client = ExplanationClient.from_run(best_run)\n",
"engineered_explanations = client.download_model_explanation(raw=False)\n",
"exp_data = engineered_explanations.get_feature_importance_dict()\n",
"exp_data"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Download raw feature importance from artifact store\n",
"You can use ExplanationClient to download the raw feature explanations from the artifact store of the best_run."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"client = ExplanationClient.from_run(best_run)\n",
"engineered_explanations = client.download_model_explanation(raw=True)\n",
"exp_data = engineered_explanations.get_feature_importance_dict()\n",
"exp_data"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -549,7 +643,7 @@
"\n",
"### Retrieve the Best Model\n",
"\n",
"Below we select the best pipeline from our iterations. The `get_output` method on `automl_classifier` returns the best run and the fitted model for the last invocation. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
"Below we select the best pipeline from our iterations. The `get_output` method returns the best run and the fitted model. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
]
},
{
@@ -572,20 +666,6 @@
"best_run, fitted_model = remote_run.get_output()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import shutil\n",
"\n",
"sript_folder = os.path.join(os.getcwd(), 'inference')\n",
"project_folder = '/inference'\n",
"os.makedirs(project_folder, exist_ok=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -639,10 +719,10 @@
"from azureml.core.webservice import AciWebservice\n",
"from azureml.core.webservice import Webservice\n",
"from azureml.core.model import Model\n",
"from azureml.core.environment import Environment\n",
"\n",
"inference_config = InferenceConfig(runtime = \"python\", \n",
" entry_script = script_file_name,\n",
" conda_file = conda_env_file_name)\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=conda_env_file_name)\n",
"inference_config = InferenceConfig(entry_script=script_file_name, environment=myenv)\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
" memory_gb = 1, \n",

View File

@@ -2,12 +2,7 @@ name: auto-ml-classification-bank-marketing-all-features
dependencies:
- pip:
- azureml-sdk
- interpret
- azureml-defaults
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml
- onnxruntime==0.4.0
- azureml-explain-model
- azureml-contrib-interpret
- onnxruntime==1.0.0

View File

@@ -42,7 +42,7 @@
"\n",
"This notebook is using remote compute to train the model.\n",
"\n",
"If you are using an Azure Machine Learning [Notebook VM](https://docs.microsoft.com/en-us/azure/machine-learning/service/tutorial-1st-experiment-sdk-setup), you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"If you are using an Azure Machine Learning Compute Instance, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"\n",
"In this notebook you will learn how to:\n",
"1. Create an experiment using an existing workspace.\n",
@@ -80,6 +80,23 @@
"from azureml.train.automl import AutoMLConfig"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -94,7 +111,6 @@
"experiment=Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -122,35 +138,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your AmlCompute cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-1\"\n",
"# Choose a name for your CPU cluster\n",
"cpu_cluster_name = \"cpu-cluster-1\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'cpu-cluster-1':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_DS12_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 6)\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=cpu_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_DS12_V2',\n",
" max_nodes=6)\n",
" compute_target = ComputeTarget.create(ws, cpu_cluster_name, compute_config)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
"\n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -210,10 +213,9 @@
"automl_settings = {\n",
" \"n_cross_validations\": 3,\n",
" \"primary_metric\": 'average_precision_score_weighted',\n",
" \"preprocess\": True,\n",
" \"enable_early_stopping\": True,\n",
" \"max_concurrent_iterations\": 2, # This is a limit for testing purpose, please increase it as per cluster size\n",
" \"experiment_timeout_minutes\": 10, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ablity to find the best model possible\n",
" \"experiment_timeout_hours\": 0.25, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ablity to find the best model possible\n",
" \"verbosity\": logging.INFO,\n",
"}\n",
"\n",
@@ -283,7 +285,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"widget-rundetails-sample"
]
},
"outputs": [],
"source": [
"from azureml.widgets import RunDetails\n",
@@ -305,7 +311,7 @@
"source": [
"#### Explain model\n",
"\n",
"Automated ML models can be explained and visualized using the SDK Explainability library. [Learn how to use the explainer](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/model-explanation-remote-amlcompute/auto-ml-model-explanations-remote-compute.ipynb)."
"Automated ML models can be explained and visualized using the SDK Explainability library. "
]
},
{
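The explainability pattern shown earlier in this compare applies here as well; a minimal sketch, assuming `best_run` comes from `remote_run.get_output()` and its model explanation run has completed:

```python
from azureml.explain.model._internal.explanation_client import ExplanationClient

client = ExplanationClient.from_run(best_run)
raw_explanations = client.download_model_explanation(raw=True)
print(raw_explanations.get_feature_importance_dict())
```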
@@ -316,7 +322,7 @@
"\n",
"### Retrieve the Best Model\n",
"\n",
"Below we select the best pipeline from our iterations. The `get_output` method on `automl_classifier` returns the best run and the fitted model for the last invocation. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
"Below we select the best pipeline from our iterations. The `get_output` method returns the best run and the fitted model. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
]
},
{
@@ -334,17 +340,7 @@
"metadata": {},
"source": [
"#### Print the properties of the model\n",
"The fitted_model is a python object and you can read the different properties of the object.\n",
"See *Print the properties of the model* section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification/auto-ml-classification.ipynb)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploy\n",
"\n",
"To deploy the model into a web service endpoint, see _Deploy_ section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-with-deployment/auto-ml-classification-with-deployment.ipynb)"
"The fitted_model is a python object and you can read the different properties of the object.\n"
]
},
{

View File

@@ -2,10 +2,6 @@ name: auto-ml-classification-credit-card-fraud
dependencies:
- pip:
- azureml-sdk
- interpret
- azureml-defaults
- azureml-explain-model
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml

View File

@@ -47,8 +47,8 @@
"Notebook synopsis:\n",
"1. Creating an Experiment in an existing Workspace\n",
"2. Configuration and remote run of AutoML for a text dataset (20 Newsgroups dataset from scikit-learn) for classification\n",
"3. Evaluating the final model on a test set\n",
"4. Deploying the model on ACI"
"3. Registering the best model for future use\n",
"4. Evaluating the final model on a test set"
]
},
{
@@ -84,6 +84,23 @@
"from sklearn.datasets import fetch_20newsgroups"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -105,7 +122,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace Name'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -121,9 +137,9 @@
"metadata": {},
"source": [
"## Set up a compute cluster\n",
"This section uses a user-provided compute cluster (named \"cpu-cluster\" in this example). If a cluster with this name does not exist in the user's workspace, the below code will create a new cluster. You can choose the parameters of the cluster as mentioned in the comments.\n",
"This section uses a user-provided compute cluster (named \"dnntext-cluster\" in this example). If a cluster with this name does not exist in the user's workspace, the below code will create a new cluster. You can choose the parameters of the cluster as mentioned in the comments.\n",
"\n",
"Whether you provide/select a CPU or GPU cluster, AutoML will choose the appropriate DNN for that setup - BiLSTM or BERT text featurizer will be included in the candidate featurizers on CPU and GPU respectively."
"Whether you provide/select a CPU or GPU cluster, AutoML will choose the appropriate DNN for that setup - BiLSTM or BERT text featurizer will be included in the candidate featurizers on CPU and GPU respectively. If your goal is to obtain the most accurate model, we recommend you use GPU clusters since BERT featurizers usually outperform BiLSTM featurizers."
]
},
{
@@ -132,34 +148,25 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-dnntext\"\n",
"amlcompute_cluster_name = \"dnntext-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_NC6\", # CPU for BiLSTM, such as \"STANDARD_D2_V2\" \n",
" # To use BERT (this is recommended for best performance), select a GPU such as \"STANDARD_NC6\" \n",
" # or similar GPU option\n",
" # available in your workspace\n",
" max_nodes = 1)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # CPU for BiLSTM\n",
" # To use BERT, select a GPU such as \"STANDARD_NC6\" \n",
" # or similar GPU option\n",
" # available in your workspace\n",
" max_nodes = 6)\n",
"\n",
" # Create the cluster\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
"\n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
"\n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -187,8 +194,8 @@
" '''\n",
" remove = ('headers', 'footers', 'quotes')\n",
" categories = [\n",
" 'alt.atheism',\n",
" 'talk.religion.misc',\n",
" 'rec.sport.baseball',\n",
" 'rec.sport.hockey',\n",
" 'comp.graphics',\n",
" 'sci.space',\n",
" ]\n",
@@ -218,7 +225,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Featch data and upload to datastore for use in training"
"#### Fetch data and upload to datastore for use in training"
]
},
{
@@ -275,7 +282,6 @@
"automl_settings = {\n",
" \"experiment_timeout_minutes\": 20,\n",
" \"primary_metric\": 'accuracy',\n",
" \"preprocess\": True,\n",
" \"max_concurrent_iterations\": 4, \n",
" \"max_cores_per_iteration\": -1,\n",
" \"enable_dnn\": True,\n",
@@ -339,7 +345,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"You can test the model locally to get a feel of the input/output. This step may require additional package installations such as pytorch."
"You can test the model locally to get a feel of the input/output. When the model contains BERT, this step will require pytorch and pytorch-transformers installed in your local environment. The exact versions of these packages can be found in the **automl_env.yml** file located in the local copy of your MachineLearningNotebooks folder here:\n",
"MachineLearningNotebooks/how-to-use-azureml/automated-machine-learning/automl_env.yml"
]
},
{
@@ -348,15 +355,34 @@
"metadata": {},
"outputs": [],
"source": [
"#best_run, fitted_model = automl_run.get_output()"
"best_run, fitted_model = automl_run.get_output()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploying the model\n",
"We now use the best fitted model from the AutoML Run to make predictions on the test set. "
"You can now see what text transformations are used to convert text data to features for this dataset, including deep learning transformations based on BiLSTM or Transformer (BERT is one implementation of a Transformer) models."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"text_transformations_used = []\n",
"for column_group in fitted_model.named_steps['datatransformer'].get_featurization_summary():\n",
" text_transformations_used.extend(column_group['Transformations'])\n",
"text_transformations_used"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Registering the best model\n",
"We now register the best fitted model from the AutoML Run for use in future deployments. "
]
},
{
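A hedged sketch of the registration step described in the markdown cell above; the model name is a placeholder, not taken from this diff:

```python
# Assumes automl_run is the completed AutoML run from this experiment
model = automl_run.register_model(model_name="textDNN-20news",  # placeholder name
                                  description="AutoML text classification model")
print(model.name, model.version)
```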
@@ -456,7 +482,7 @@
"source": [
"script_folder = os.path.join(os.getcwd(), 'inference')\n",
"os.makedirs(script_folder, exist_ok=True)\n",
"shutil.copy2('infer.py', script_folder)"
"shutil.copy('infer.py', script_folder)"
]
},
{
@@ -519,12 +545,12 @@
"name": "anshirga"
}
],
"datasets": [
"None"
],
"compute": [
"AML Compute"
],
"datasets": [
"None"
],
"deployment": [
"None"
],

View File

@@ -3,8 +3,10 @@ dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-train
- azureml-widgets
- matplotlib
- pandas_ml
- statsmodels
- https://download.pytorch.org/whl/cpu/torch-1.1.0-cp35-cp35m-win_amd64.whl
- sentencepiece==0.1.82
- pytorch-transformers==1.0
- spacy==2.1.8
- https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz

View File

@@ -2,8 +2,7 @@ import numpy as np
import argparse
from azureml.core import Run
from sklearn.externals import joblib
from azureml.automl.core._vendor.automl.client.core.common import metrics
from automl.client.core.common import constants
from azureml.automl.core.shared import constants, metrics
from azureml.core.model import Model

View File

@@ -20,7 +20,7 @@
"metadata": {},
"source": [
"# Automated Machine Learning \n",
"**Continous retraining using Pipelines and Time-Series TabularDataset**\n",
"**Continuous retraining using Pipelines and Time-Series TabularDataset**\n",
"## Contents\n",
"1. [Introduction](#Introduction)\n",
"2. [Setup](#Setup)\n",
@@ -75,6 +75,23 @@
"from azureml.train.automl import AutoMLConfig"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -112,7 +129,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -143,33 +159,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute, ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-42\"\n",
"# Choose a name for your CPU cluster\n",
"amlcompute_cluster_name = \"cont-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 4)\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D2_V2',\n",
" max_nodes=4)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
" # Can poll for a minimum number of nodes and for a specific timeout.\n",
" # If no min_node_count is provided, it will use the scale settings for the cluster.\n",
" compute_target.wait_for_completion(show_output = True, min_node_count = 0, timeout_in_minutes = 10)\n",
" \n",
" # For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -197,7 +202,7 @@
"conda_run_config.environment.docker.base_image = azureml.core.runconfig.DEFAULT_CPU_IMAGE\n",
"\n",
"cd = CondaDependencies.create(pip_packages=['azureml-sdk[automl]', 'applicationinsights', 'azureml-opendatasets'], \n",
" conda_packages=['numpy', 'py-xgboost'], \n",
" conda_packages=['numpy==1.16.2'], \n",
" pin_sdk_version=False)\n",
"#cd.add_pip_package('azureml-explain-model')\n",
"conda_run_config.environment.python.conda_dependencies = cd\n",
@@ -210,7 +215,24 @@
"metadata": {},
"source": [
"## Data Ingestion Pipeline \n",
"For this demo, we will use NOAA weather data from [Azure Open Datasets](https://azure.microsoft.com/services/open-datasets/). You can replace this with your own dataset, or you can skip this pipeline if you already have a time-series based `TabularDataset`.\n",
"For this demo, we will use NOAA weather data from [Azure Open Datasets](https://azure.microsoft.com/services/open-datasets/). You can replace this with your own dataset, or you can skip this pipeline if you already have a time-series based `TabularDataset`.\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The name and target column of the Dataset to create \n",
"dataset = \"NOAA-Weather-DS4\"\n",
"target_column_name = \"temperature\""
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"### Upload Data Step\n",
"The data ingestion pipeline has a single step with a script to query the latest weather data and upload it to the blob store. During the first run, the script will create and register a time-series based `TabularDataset` with the past one week of weather data. For each subsequent run, the script will create a partition in the blob store by querying NOAA for new weather data since the last modified time of the dataset (`dataset.data_changed_time`) and creating a data.csv file."
@@ -225,8 +247,6 @@
"from azureml.pipeline.core import Pipeline, PipelineParameter\n",
"from azureml.pipeline.steps import PythonScriptStep\n",
"\n",
"# The name of the Dataset to create \n",
"dataset = \"NOAA-Weather-DS4\"\n",
"ds_name = PipelineParameter(name=\"ds_name\", default_value=dataset)\n",
"upload_data_step = PythonScriptStep(script_name=\"upload_weather_data.py\", \n",
" allow_reuse=False,\n",
@@ -262,7 +282,7 @@
"metadata": {},
"outputs": [],
"source": [
"data_pipeline_run.wait_for_completion()"
"data_pipeline_run.wait_for_completion(show_output=False)"
]
},
{
@@ -272,7 +292,7 @@
"## Training Pipeline\n",
"### Prepare Training Data Step\n",
"\n",
"Script to bring data into common X,y format. We need to set allow_reuse flag to False to allow the pipeline to run even when inputs don't change. We also need the name of the model to check the time the model was last trained."
"Script to check if new data is available since the model was last trained. If no new data is available, we cancel the remaining pipeline steps. We need to set allow_reuse flag to False to allow the pipeline to run even when inputs don't change. We also need the name of the model to check the time the model was last trained."
]
},
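As a rough sketch of the lookup the `check_data.py` script performs (the script itself appears later in this diff), the last training time can be read from the registered model:
```
# Sketch: compare the registered model's creation time with the dataset's
# data_changed_time to decide whether retraining is needed.
from azureml.core.model import Model

model = Model(ws, name='noaaweatherds')   # raises if the model is not registered yet
last_train_time = model.created_time      # timezone-aware datetime of registration
```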
{
@@ -283,11 +303,8 @@
"source": [
"from azureml.pipeline.core import PipelineData\n",
"\n",
"target_column = PipelineParameter(\"target_column\", default_value=\"y\")\n",
"# The model name with which to register the trained model in the workspace.\n",
"model_name = PipelineParameter(\"model_name\", default_value=\"y\")\n",
"output_x = PipelineData(\"output_x\", datastore=dstor)\n",
"output_y = PipelineData(\"output_y\", datastore=dstor)"
"model_name = PipelineParameter(\"model_name\", default_value=\"noaaweatherds\")"
]
},
{
@@ -299,16 +316,23 @@
"data_prep_step = PythonScriptStep(script_name=\"check_data.py\", \n",
" allow_reuse=False,\n",
" name=\"check_data\",\n",
" arguments=[\"--target_column\", target_column,\n",
" \"--output_x\", output_x,\n",
" \"--output_y\", output_y,\n",
" \"--ds_name\", ds_name,\n",
" \"--model_name\", model_name],\n",
" outputs=[output_x, output_y], \n",
" arguments=[\"--ds_name\", ds_name,\n",
" \"--model_name\", model_name],\n",
" compute_target=compute_target, \n",
" runconfig=conda_run_config)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Dataset\n",
"train_ds = Dataset.get_by_name(ws, dataset)\n",
"train_ds = train_ds.drop_columns([\"partition_date\"])"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -324,14 +348,13 @@
"outputs": [],
"source": [
"from azureml.train.automl import AutoMLConfig\n",
"from azureml.train.automl.runtime import AutoMLStep\n",
"from azureml.pipeline.steps import AutoMLStep\n",
"\n",
"automl_settings = {\n",
" \"iteration_timeout_minutes\": 20,\n",
" \"experiment_timeout_minutes\": 30,\n",
" \"iteration_timeout_minutes\": 10,\n",
" \"experiment_timeout_hours\": 0.25,\n",
" \"n_cross_validations\": 3,\n",
" \"primary_metric\": 'r2_score',\n",
" \"preprocess\": True,\n",
" \"max_concurrent_iterations\": 3,\n",
" \"max_cores_per_iteration\": -1,\n",
" \"verbosity\": logging.INFO,\n",
@@ -342,8 +365,8 @@
" debug_log = 'automl_errors.log',\n",
" path = \".\",\n",
" compute_target=compute_target,\n",
" run_configuration=conda_run_config,\n",
" data_script = \"get_data.py\",\n",
" training_data = train_ds,\n",
" label_column_name = target_column_name,\n",
" **automl_settings\n",
" )"
]
@@ -359,7 +382,7 @@
"metrics_output_name = 'metrics_output'\n",
"best_model_output_name = 'best_model_output'\n",
"\n",
"metirics_data = PipelineData(name='metrics_data',\n",
"metrics_data = PipelineData(name='metrics_data',\n",
" datastore=dstor,\n",
" pipeline_output_name=metrics_output_name,\n",
" training_output=TrainingOutput(type='Metrics'))\n",
@@ -378,8 +401,7 @@
"automl_step = AutoMLStep(\n",
" name='automl_module',\n",
" automl_config=automl_config,\n",
" inputs=[output_x, output_y],\n",
" outputs=[metirics_data, model_data],\n",
" outputs=[metrics_data, model_data],\n",
" allow_reuse=False)"
]
},
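After the training pipeline run submitted below completes, these named outputs can be retrieved from the run; a minimal sketch:
```
# Sketch: download the AutoMLStep outputs declared above once training_pipeline_run
# (created below) has finished.
metrics_port = training_pipeline_run.get_pipeline_output(metrics_output_name)
model_port = training_pipeline_run.get_pipeline_output(best_model_output_name)
metrics_port.download('.', show_progress=True)
model_port.download('.', show_progress=True)
```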
@@ -432,7 +454,7 @@
"outputs": [],
"source": [
"training_pipeline_run = experiment.submit(training_pipeline, pipeline_parameters={\n",
" \"target_column\": \"temperature\", \"ds_name\": dataset, \"model_name\": \"noaaweatherds\"})"
" \"ds_name\": dataset, \"model_name\": \"noaaweatherds\"})"
]
},
{
@@ -441,7 +463,7 @@
"metadata": {},
"outputs": [],
"source": [
"training_pipeline_run.wait_for_completion()"
"training_pipeline_run.wait_for_completion(show_output=False)"
]
},
{
@@ -475,7 +497,7 @@
"source": [
"from azureml.pipeline.core import Schedule\n",
"schedule = Schedule.create(workspace=ws, name=\"RetrainingSchedule\",\n",
" pipeline_parameters={\"target_column\": \"temperature\",\"ds_name\": dataset, \"model_name\": \"noaaweatherds\"},\n",
" pipeline_parameters={\"ds_name\": dataset, \"model_name\": \"noaaweatherds\"},\n",
" pipeline_id=published_pipeline.id, \n",
" experiment_name=experiment_name, \n",
" datastore=dstor,\n",

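Because the schedule above is created with a datastore, it is triggered when new or modified data is detected there. A minimal sketch of a purely time-based alternative, reusing the same names, could be:
```
# Sketch only: a weekly time-based schedule instead of the datastore-triggered one above.
from azureml.pipeline.core import Schedule, ScheduleRecurrence

recurrence = ScheduleRecurrence(frequency="Week", interval=1)
weekly_schedule = Schedule.create(workspace=ws, name="WeeklyRetrainingSchedule",
                                  pipeline_id=published_pipeline.id,
                                  experiment_name=experiment_name,
                                  recurrence=recurrence,
                                  pipeline_parameters={"ds_name": dataset,
                                                       "model_name": "noaaweatherds"})
```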
View File

@@ -3,7 +3,6 @@ dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-pipeline
- azureml-widgets
- matplotlib
- pandas_ml
- azureml-pipeline

View File

@@ -15,32 +15,16 @@ if type(run) == _OfflineRun:
else:
ws = run.experiment.workspace
def write_output(df, path):
os.makedirs(path, exist_ok=True)
print("%s created" % path)
df.to_csv(path + "/part-00000", index=False)
print("Check for new data and prepare the data")
print("Check for new data.")
parser = argparse.ArgumentParser("split")
parser.add_argument("--target_column", type=str, help="input split features")
parser.add_argument("--ds_name", help="input dataset name")
parser.add_argument("--model_name", help="name of the deployed model")
parser.add_argument("--output_x", type=str,
help="output features")
parser.add_argument("--output_y", type=str,
help="output labels")
args = parser.parse_args()
print("Argument 1(ds_name): %s" % args.ds_name)
print("Argument 2(target_column): %s" % args.target_column)
print("Argument 3(model_name): %s" % args.model_name)
print("Argument 4(output_x): %s" % args.output_x)
print("Argument 5(output_y): %s" % args.output_y)
print("Argument 2(model_name): %s" % args.model_name)
# Get the latest registered model
try:
@@ -54,22 +38,9 @@ except Exception as e:
train_ds = Dataset.get_by_name(ws, args.ds_name)
dataset_changed_time = train_ds.data_changed_time
if dataset_changed_time > last_train_time:
# New data is available since the model was last trained
print("Dataset was last updated on {0}. Retraining...".format(dataset_changed_time))
train_ds = train_ds.drop_columns(["partition_date"])
X_train = train_ds.drop_columns(
columns=[args.target_column]).to_pandas_dataframe()
y_train = train_ds.keep_columns(
columns=[args.target_column]).to_pandas_dataframe()
non_null = y_train[args.target_column].notnull()
y = y_train[non_null]
X = X_train[non_null]
if not (args.output_x is None and args.output_y is None):
write_output(X, args.output_x)
write_output(y, args.output_y)
else:
if not dataset_changed_time > last_train_time:
print("Cancelling run since there is no new data.")
run.parent.cancel()
else:
# New data is available since the model was last trained
print("Dataset was last updated on {0}. Retraining...".format(dataset_changed_time))

View File

@@ -1,15 +0,0 @@
import os
import pandas as pd
def get_data():
print("In get_data")
print(os.environ['AZUREML_DATAREFERENCE_output_x'])
X_train = pd.read_csv(
os.environ['AZUREML_DATAREFERENCE_output_x'] + "/part-00000")
y_train = pd.read_csv(
os.environ['AZUREML_DATAREFERENCE_output_y'] + "/part-00000")
print(X_train.head(3))
return {"X": X_train.values, "y": y_train.values.flatten()}

View File

@@ -58,7 +58,7 @@ except Exception as e:
print(traceback.format_exc())
print("Dataset with name {0} not found, registering new dataset.".format(args.ds_name))
register_dataset = True
end_time_last_slice = datetime.today() - relativedelta(weeks=1)
end_time_last_slice = datetime.today() - relativedelta(weeks=2)
end_time = datetime.utcnow()
train_df = get_noaa_data(end_time_last_slice, end_time)
@@ -80,10 +80,10 @@ if train_df.size > 0:
target_path=folder_name,
overwrite=True,
show_progress=True)
if register_dataset:
ds = Dataset.Tabular.from_delimited_files(dstor.path("{}/**/*.csv".format(
args.ds_name)), partition_format='/{partition_date:yyyy/MM/dd/hh/mm/ss}/data.csv')
ds.register(ws, name=args.ds_name)
else:
print("No new data since {0}.".format(end_time_last_slice))
if register_dataset:
ds = Dataset.Tabular.from_delimited_files(dstor.path("{}/**/*.csv".format(
args.ds_name)), partition_format='/{partition_date:yyyy/MM/dd/HH/mm/ss}/data.csv')
ds.register(ws, name=args.ds_name)

View File

@@ -101,6 +101,23 @@
"from azureml.train.estimator import Estimator"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "markdown",
"metadata": {
@@ -128,7 +145,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -163,7 +179,7 @@
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your CPU cluster\n",
"cpu_cluster_name = \"cpu-cluster\"\n",
"cpu_cluster_name = \"beer-cluster\"\n",
"\n",
"# Verify that cluster does not exist already\n",
"try:\n",
@@ -218,19 +234,18 @@
"import pandas as pd\n",
"from pandas import DataFrame\n",
"from pandas import Grouper\n",
"from matplotlib import pyplot\n",
"from pandas import concat\n",
"from matplotlib import pyplot\n",
"from pandas.plotting import register_matplotlib_converters\n",
"\n",
"register_matplotlib_converters()\n",
"plt.tight_layout()\n",
"plt.figure(figsize=(20, 10))\n",
"plt.tight_layout()\n",
"\n",
"plt.subplot(2, 1, 1)\n",
"plt.title('Beer Production By Year')\n",
"df = pd.read_csv(\"Beer_no_valid_split_train.csv\", parse_dates=True, index_col= 'DATE').drop(columns='grain')\n",
"test_df = pd.read_csv(\"Beer_no_valid_split_test.csv\", parse_dates=True, index_col= 'DATE').drop(columns='grain')\n",
"pyplot.plot(df)\n",
"plt.plot(df)\n",
"\n",
"plt.subplot(2, 1, 2)\n",
"plt.title('Beer Production By Month')\n",
@@ -239,7 +254,8 @@
"months = DataFrame(months)\n",
"months.columns = range(1,13)\n",
"months.boxplot()\n",
"pyplot.show()\n"
"\n",
"plt.show()"
]
},
{
@@ -358,7 +374,7 @@
"\n",
"automl_config = AutoMLConfig(task='forecasting', \n",
" primary_metric='normalized_root_mean_squared_error',\n",
" experiment_timeout_minutes = 60,\n",
" experiment_timeout_hours = 1,\n",
" training_data=train_dataset,\n",
" label_column_name=target_column_name,\n",
" validation_data=valid_dataset, \n",
@@ -538,7 +554,7 @@
"metadata": {},
"outputs": [],
"source": [
"compute_target = ws.compute_targets['cpu-cluster']\n",
"compute_target = ws.compute_targets['beer-cluster']\n",
"test_experiment = Experiment(ws, experiment_name + \"_test\")"
]
},
@@ -556,7 +572,7 @@
"\n",
"script_folder = os.path.join(os.getcwd(), 'inference')\n",
"os.makedirs(script_folder, exist_ok=True)\n",
"shutil.copy2('infer.py', script_folder)"
"shutil.copy('infer.py', script_folder)"
]
},
{

View File

@@ -1,12 +1,11 @@
name: auto-ml-forecasting-beer-remote
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-train
- azureml-widgets
- matplotlib
- pandas_ml
- statsmodels
- azureml-train

View File

@@ -76,9 +76,12 @@ def get_result_df(remote_run):
def run_inference(test_experiment, compute_target, script_folder, train_run,
test_dataset, lookback_dataset, max_horizon,
target_column_name, time_column_name, freq):
train_run.download_file('outputs/model.pkl', 'inference/model.pkl')
train_run.download_file('outputs/conda_env_v_1_0_0.yml',
'inference/condafile.yml')
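# Newer runs record the trained model's artifact path in run properties; fall back to 'model.pkl' otherwise.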
model_base_name = 'model.pkl'
if 'model_data_location' in train_run.properties:
model_location = train_run.properties['model_data_location']
_, model_base_name = model_location.rsplit('/', 1)
train_run.download_file('outputs/{}'.format(model_base_name), 'inference/{}'.format(model_base_name))
train_run.download_file('outputs/conda_env_v_1_0_0.yml', 'inference/condafile.yml')
inference_env = Environment("myenv")
inference_env.docker.enabled = True
@@ -91,7 +94,8 @@ def run_inference(test_experiment, compute_target, script_folder, train_run,
'--max_horizon': max_horizon,
'--target_column_name': target_column_name,
'--time_column_name': time_column_name,
'--frequency': freq
'--frequency': freq,
'--model_path': model_base_name
},
inputs=[test_dataset.as_named_input('test_data'),
lookback_dataset.as_named_input('lookback_data')],

View File

@@ -4,8 +4,7 @@ import argparse
from azureml.core import Run
from sklearn.externals import joblib
from sklearn.metrics import mean_absolute_error, mean_squared_error
from azureml.automl.core._vendor.automl.client.core.common import metrics
from automl.client.core.common import constants
from azureml.automl.core.shared import constants, metrics
from pandas.tseries.frequencies import to_offset
@@ -232,6 +231,9 @@ parser.add_argument(
parser.add_argument(
'--frequency', type=str, dest='freq',
help='Frequency of prediction')
parser.add_argument(
'--model_path', type=str, dest='model_path',
default='model.pkl', help='Filename of model to be loaded')
args = parser.parse_args()
@@ -239,6 +241,7 @@ max_horizon = args.max_horizon
target_column_name = args.target_column_name
time_column_name = args.time_column_name
freq = args.freq
model_path = args.model_path
print('args passed are: ')
@@ -246,6 +249,7 @@ print(max_horizon)
print(target_column_name)
print(time_column_name)
print(freq)
print(model_path)
run = Run.get_context()
# get input dataset by name
@@ -267,7 +271,8 @@ X_lookback_df = lookback_dataset.drop_columns(columns=[target_column_name])
y_lookback_df = lookback_dataset.with_timestamp_columns(
None).keep_columns(columns=[target_column_name])
fitted_model = joblib.load('model.pkl')
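# Load the model file named by --model_path (defaults to 'model.pkl').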
fitted_model = joblib.load(model_path)
if hasattr(fitted_model, 'get_lookback'):
lookback = fitted_model.get_lookback()

View File

@@ -42,7 +42,7 @@
"\n",
"AutoML highlights here include built-in holiday featurization, accessing engineered feature names, and working with the `forecast` function. Please also look at the additional forecasting notebooks, which document lagging, rolling windows, forecast quantiles, other ways to use the forecast function, and forecaster deployment.\n",
"\n",
"Make sure you have executed the [configuration](../configuration.ipynb) before running this notebook.\n",
"Make sure you have executed the [configuration notebook](../../../configuration.ipynb) before running this notebook.\n",
"\n",
"Notebook synopsis:\n",
"1. Creating an Experiment in an existing Workspace\n",
@@ -74,6 +74,23 @@
"from datetime import datetime"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -95,7 +112,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['SKU'] = ws.sku\n",
@@ -124,35 +140,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-bike\"\n",
"amlcompute_cluster_name = \"bike-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 4)\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D2_V2',\n",
" max_nodes=4)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
" \n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -202,7 +205,7 @@
"outputs": [],
"source": [
"dataset = Dataset.Tabular.from_delimited_files(path = [(datastore, 'dataset/bike-no.csv')]).with_timestamp_columns(fine_grain_timestamp=time_column_name) \n",
"dataset.take(5).to_pandas_dataframe()"
"dataset.take(5).to_pandas_dataframe().reset_index(drop=True)"
]
},
{
@@ -221,8 +224,8 @@
"outputs": [],
"source": [
"# select data that occurs before a specified date\n",
"train = dataset.time_before(datetime(2012, 9, 1))\n",
"train.to_pandas_dataframe().tail(5)"
"train = dataset.time_before(datetime(2012, 8, 31), include_boundary=True)\n",
"train.to_pandas_dataframe().tail(5).reset_index(drop=True)"
]
},
{
@@ -231,8 +234,8 @@
"metadata": {},
"outputs": [],
"source": [
"test = dataset.time_after(datetime(2012, 8, 31))\n",
"test.to_pandas_dataframe().head(5)"
"test = dataset.time_after(datetime(2012, 9, 1), include_boundary=True)\n",
"test.to_pandas_dataframe().head(5).reset_index(drop=True)"
]
},
{
@@ -247,8 +250,8 @@
"|-|-|\n",
"|**task**|forecasting|\n",
"|**primary_metric**|This is the metric that you want to optimize.<br> Forecasting supports the following primary metrics <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>\n",
"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl/azureml.train.automl.constants.supportedmodels.regression?view=azure-ml-py).|\n",
"|**experiment_timeout_minutes**|Experimentation timeout in minutes.|\n",
"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl-client/azureml.train.automl.constants.supportedmodels.forecasting?view=azure-ml-py).|\n",
"|**experiment_timeout_hours**|Experimentation timeout in hours.|\n",
"|**training_data**|Input dataset, containing both features and label column.|\n",
"|**label_column_name**|The name of the label column.|\n",
"|**compute_target**|The remote compute for training.|\n",
@@ -260,7 +263,7 @@
"|**target_lags**|The target_lags specifies how far back we will construct the lags of the target variable.|\n",
"|**drop_column_names**|Name(s) of columns to drop prior to modeling|\n",
"\n",
"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_minutes parameter value to get results."
"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_hours parameter value to get results."
]
},
{
@@ -305,7 +308,7 @@
"automl_config = AutoMLConfig(task='forecasting', \n",
" primary_metric='normalized_root_mean_squared_error',\n",
" blacklist_models = ['ExtremeRandomTrees'], \n",
" experiment_timeout_minutes=20,\n",
" experiment_timeout_hours=0.3,\n",
" training_data=train,\n",
" label_column_name=target_column_name,\n",
" compute_target=compute_target,\n",
@@ -450,8 +453,8 @@
"\n",
"script_folder = os.path.join(os.getcwd(), 'forecast')\n",
"os.makedirs(script_folder, exist_ok=True)\n",
"shutil.copy2('forecasting_script.py', script_folder)\n",
"shutil.copy2('forecasting_helper.py', script_folder)"
"shutil.copy('forecasting_script.py', script_folder)\n",
"shutil.copy('forecasting_helper.py', script_folder)"
]
},
{
@@ -507,10 +510,9 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core._vendor.automl.client.core.common import metrics\n",
"from azureml.automl.core.shared import constants, metrics\n",
"from sklearn.metrics import mean_absolute_error, mean_squared_error\n",
"from matplotlib import pyplot as plt\n",
"from automl.client.core.common import constants\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",

View File

@@ -1,11 +1,10 @@
name: auto-ml-forecasting-bike-share
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml
- statsmodels

View File

@@ -1,6 +1,6 @@
import argparse
import azureml.train.automl
from azureml.automl.runtime._vendor.automl.client.core.runtime import forecasting_models
from azureml.automl.runtime.shared import forecasting_models
from azureml.core import Run
from sklearn.externals import joblib
import forecasting_helper
@@ -32,18 +32,17 @@ test_dataset = run.input_datasets['test_data']
grain_column_names = []
df = test_dataset.to_pandas_dataframe()
df = test_dataset.to_pandas_dataframe().reset_index(drop=True)
X_test_df = test_dataset.drop_columns(columns=[target_column_name])
y_test_df = test_dataset.with_timestamp_columns(
None).keep_columns(columns=[target_column_name])
X_test_df = test_dataset.drop_columns(columns=[target_column_name]).to_pandas_dataframe().reset_index(drop=True)
y_test_df = test_dataset.with_timestamp_columns(None).keep_columns(columns=[target_column_name]).to_pandas_dataframe()
fitted_model = joblib.load('model.pkl')
df_all = forecasting_helper.do_rolling_forecast(
fitted_model,
X_test_df.to_pandas_dataframe(),
y_test_df.to_pandas_dataframe().values.T[0],
X_test_df,
y_test_df.values.T[0],
target_column_name,
time_column_name,
max_horizon,

View File

@@ -28,11 +28,10 @@
"1. [Setup](#Setup)\n",
"1. [Data and Forecasting Configurations](#Data)\n",
"1. [Train](#Train)\n",
"1. [Results](#Results)\n",
"\n",
"Advanced Forecasting\n",
"1. [Advanced Training](#Advanced Training)\n",
"1. [Advanced Results](#Advanced Results)"
"1. [Advanced Training](#advanced_training)\n",
"1. [Advanced Results](#advanced_results)"
]
},
{
@@ -43,7 +42,7 @@
"\n",
"In this example we use the associated New York City energy demand dataset to showcase how you can use AutoML for a simple forecasting problem and explore the results. The goal is predict the energy demand for the next 48 hours based on historic time-series data.\n",
"\n",
"If you are using an Azure Machine Learning [Notebook VM](https://docs.microsoft.com/en-us/azure/machine-learning/service/tutorial-1st-experiment-sdk-setup), you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) first, if you haven't already, to establish your connection to the AzureML Workspace.\n",
"If you are using an Azure Machine Learning Compute Instance, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) first, if you haven't already, to establish your connection to the AzureML Workspace.\n",
"\n",
"In this notebook you will learn how to:\n",
"1. Creating an Experiment using an existing Workspace\n",
@@ -85,6 +84,23 @@
"from datetime import datetime"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -109,7 +125,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -140,35 +155,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"aml-compute\"\n",
"amlcompute_cluster_name = \"energy-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_DS12_V2',\n",
" max_nodes=6)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_DS12_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 6)\n",
"\n",
" # Create the cluster.\\n\",\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
"\n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
"\n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -211,7 +213,7 @@
"outputs": [],
"source": [
"dataset = Dataset.Tabular.from_delimited_files(path = \"https://automlsamplenotebookdata.blob.core.windows.net/automl-sample-notebook-data/nyc_energy.csv\").with_timestamp_columns(fine_grain_timestamp=time_column_name) \n",
"dataset.take(5).to_pandas_dataframe()"
"dataset.take(5).to_pandas_dataframe().reset_index(drop=True)"
]
},
{
@@ -253,7 +255,7 @@
"source": [
"# split into train based on time\n",
"train = dataset.time_before(datetime(2017, 8, 8, 5), include_boundary=True)\n",
"train.to_pandas_dataframe().sort_values(time_column_name).tail(5)"
"train.to_pandas_dataframe().reset_index(drop=True).sort_values(time_column_name).tail(5)"
]
},
{
@@ -263,8 +265,8 @@
"outputs": [],
"source": [
"# split into test based on time\n",
"test = dataset.time_between(datetime(2017, 8, 8, 5), datetime(2017, 8, 10, 5))\n",
"test.to_pandas_dataframe().head(5)"
"test = dataset.time_between(datetime(2017, 8, 8, 6), datetime(2017, 8, 10, 5))\n",
"test.to_pandas_dataframe().reset_index(drop=True).head(5)"
]
},
{
@@ -301,8 +303,8 @@
"|-|-|\n",
"|**task**|forecasting|\n",
"|**primary_metric**|This is the metric that you want to optimize.<br> Forecasting supports the following primary metrics <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>|\n",
"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl/azureml.train.automl.constants.supportedmodels.regression?view=azure-ml-py).|\n",
"|**experiment_timeout_minutes**|Maximum amount of time in minutes that the experiment take before it terminates.|\n",
"|**blacklist_models**|Models in blacklist won't be used by AutoML. All supported models can be found at [here](https://docs.microsoft.com/en-us/python/api/azureml-train-automl-client/azureml.train.automl.constants.supportedmodels.forecasting?view=azure-ml-py).|\n",
"|**experiment_timeout_hours**|Maximum amount of time in hours that the experiment take before it terminates.|\n",
"|**training_data**|The training data to be used within the experiment.|\n",
"|**label_column_name**|The name of the label column.|\n",
"|**compute_target**|The remote compute for training.|\n",
@@ -316,7 +318,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_minutes parameter value to get results."
"This notebook uses the blacklist_models parameter to exclude some models that take a longer time to train on this dataset. You can choose to remove models from the blacklist_models list but you may need to increase the experiment_timeout_hours parameter value to get results."
]
},
{
@@ -333,7 +335,7 @@
"automl_config = AutoMLConfig(task='forecasting', \n",
" primary_metric='normalized_root_mean_squared_error',\n",
" blacklist_models = ['ExtremeRandomTrees', 'AutoArima', 'Prophet'], \n",
" experiment_timeout_minutes=20,\n",
" experiment_timeout_hours=0.3,\n",
" training_data=train,\n",
" label_column_name=target_column_name,\n",
" compute_target=compute_target,\n",
@@ -454,7 +456,7 @@
"metadata": {},
"outputs": [],
"source": [
"X_test = test.to_pandas_dataframe()\n",
"X_test = test.to_pandas_dataframe().reset_index(drop=True)\n",
"y_test = X_test.pop(target_column_name).values"
]
},
@@ -463,11 +465,7 @@
"metadata": {},
"source": [
"### Forecast Function\n",
"For forecasting, we will use the forecast function instead of the predict function. There are two reasons for this.\n",
"\n",
"We need to pass the recent values of the target variable y, whereas the scikit-compatible predict function only takes the non-target variables 'test'. In our case, the test data immediately follows the training data, and we fill the target variable with NaN. The NaN serves as a question mark for the forecaster to fill with the actuals. Using the forecast function will produce forecasts using the shortest possible forecast horizon. The last time at which a definite (non-NaN) value is seen is the forecast origin - the last time when the value of the target is known.\n",
"\n",
"Using the predict method would result in getting predictions for EVERY horizon the forecaster can predict at. This is useful when training and evaluating the performance of the forecaster at various horizons, but the level of detail is excessive for normal use."
"For forecasting, we will use the forecast function instead of the predict function. Using the predict method would result in getting predictions for EVERY horizon the forecaster can predict at. This is useful when training and evaluating the performance of the forecaster at various horizons, but the level of detail is excessive for normal use. Forecast function also can handle more complicated scenarios, see notebook on [high frequency forecasting](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/forecasting-high-frequency/auto-ml-forecasting-function.ipynb)."
]
},
{
@@ -476,15 +474,10 @@
"metadata": {},
"outputs": [],
"source": [
"# Replace ALL values in y by NaN.\n",
"# The forecast origin will be at the beginning of the first forecast period.\n",
"# (Which is the same time as the end of the last training period.)\n",
"y_query = y_test.copy().astype(np.float)\n",
"y_query.fill(np.nan)\n",
"# The featurized data, aligned to y, will also be returned.\n",
"# This contains the assumptions that were made in the forecast\n",
"# and helps align the forecast to the original data\n",
"y_predictions, X_trans = fitted_model.forecast(X_test, y_query)"
"y_predictions, X_trans = fitted_model.forecast(X_test)"
]
},
{
@@ -514,9 +507,8 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core._vendor.automl.client.core.common import metrics\n",
"from azureml.automl.core.shared import constants, metrics\n",
"from matplotlib import pyplot as plt\n",
"from automl.client.core.common import constants\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",
@@ -557,7 +549,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Advanced Training\n",
"## Advanced Training <a id=\"advanced_training\"></a>\n",
"We did not use lags in the previous model specification. In effect, the prediction was the result of a simple regression on date, grain and any additional features. This is often a very good prediction as common time series patterns like seasonality and trends can be captured in this manner. Such simple regression is horizon-less: it doesn't matter how far into the future we are predicting, because we are not using past data. In the previous example, the horizon was only used to split the data for cross-validation."
]
},
@@ -587,7 +579,7 @@
"automl_config = AutoMLConfig(task='forecasting', \n",
" primary_metric='normalized_root_mean_squared_error',\n",
" blacklist_models = ['ElasticNet','ExtremeRandomTrees','GradientBoosting','XGBoostRegressor','ExtremeRandomTrees', 'AutoArima', 'Prophet'], #These models are blacklisted for tutorial purposes, remove this for real use cases. \n",
" experiment_timeout_minutes=20,\n",
" experiment_timeout_hours=0.3,\n",
" training_data=train,\n",
" label_column_name=target_column_name,\n",
" compute_target=compute_target,\n",
@@ -642,7 +634,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Advanced Results\n",
"## Advanced Results<a id=\"advanced_results\"></a>\n",
"We did not use lags in the previous model specification. In effect, the prediction was the result of a simple regression on date, grain and any additional features. This is often a very good prediction as common time series patterns like seasonality and trends can be captured in this manner. Such simple regression is horizon-less: it doesn't matter how far into the future we are predicting, because we are not using past data. In the previous example, the horizon was only used to split the data for cross-validation."
]
},
@@ -652,15 +644,10 @@
"metadata": {},
"outputs": [],
"source": [
"# Replace ALL values in y by NaN.\n",
"# The forecast origin will be at the beginning of the first forecast period.\n",
"# (Which is the same time as the end of the last training period.)\n",
"y_query = y_test.copy().astype(np.float)\n",
"y_query.fill(np.nan)\n",
"# The featurized data, aligned to y, will also be returned.\n",
"# This contains the assumptions that were made in the forecast\n",
"# and helps align the forecast to the original data\n",
"y_predictions, X_trans = fitted_model_lags.forecast(X_test, y_query)"
"y_predictions, X_trans = fitted_model_lags.forecast(X_test)"
]
},
{
@@ -680,9 +667,8 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core._vendor.automl.client.core.common import metrics\n",
"from azureml.automl.core.shared import constants, metrics\n",
"from matplotlib import pyplot as plt\n",
"from automl.client.core.common import constants\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",

View File

@@ -2,11 +2,8 @@ name: auto-ml-forecasting-energy-demand
dependencies:
- pip:
- azureml-sdk
- interpret
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml
- statsmodels
- azureml-explain-model
- azureml-contrib-interpret

View File

@@ -1,551 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/automated-machine-learning/forecasting-grouping/auto-ml-forecasting-grouping.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Automated Machine Learning\n",
"\n",
"_**Forecasting with grouping using Pipelines**_\n",
"\n",
"## Contents\n",
"\n",
"1. [Introduction](#Introduction)\n",
"2. [Setup](#Setup)\n",
"3. [Data](#Data)\n",
"4. [Compute](#Compute)\n",
"4. [AutoMLConfig](#AutoMLConfig)\n",
"5. [Pipeline](#Pipeline)\n",
"5. [Train](#Train)\n",
"6. [Test](#Test)\n",
"\n",
"\n",
"## Introduction\n",
"In this example we use Automated ML and Pipelines to train, select, and operationalize forecasting models for multiple time-series.\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration notebook](../../../configuration.ipynb) first if you haven't already to establish your connection to the AzureML Workspace.\n",
"\n",
"In this notebook you will learn how to:\n",
"\n",
"* Create an Experiment in an existing Workspace.\n",
"* Configure AutoML using AutoMLConfig.\n",
"* Use our helper script to generate pipeline steps to split, train, and deploy the models.\n",
"* Explore the results.\n",
"* Test the models.\n",
"\n",
"It is advised you ensure your cluster has at least one node per group.\n",
"\n",
"An Enterprise workspace is required for this notebook. To learn more about creating an Enterprise workspace or upgrading to an Enterprise workspace from the Azure portal, please visit our [Workspace page.](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade)\n",
"\n",
"## Setup\n",
"As part of the setup you have already created an Azure ML `Workspace` object. For Automated ML you will need to create an `Experiment` object, which is a named object in a `Workspace` used to run experiments. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"import logging\n",
"import warnings\n",
"\n",
"import numpy as np\n",
"import pandas as pd\n",
"\n",
"import azureml.core\n",
"\n",
"from azureml.core.workspace import Workspace\n",
"from azureml.core.experiment import Experiment\n",
"from azureml.train.automl import AutoMLConfig"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Accessing the Azure ML workspace requires authentication with Azure.\n",
"\n",
"The default authentication is interactive authentication using the default tenant. Executing the ws = Workspace.from_config() line in the cell below will prompt for authentication the first time that it is run.\n",
"\n",
"If you have multiple Azure tenants, you can specify the tenant by replacing the ws = Workspace.from_config() line in the cell below with the following:\n",
"```\n",
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"auth = InteractiveLoginAuthentication(tenant_id = 'mytenantid')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"If you need to run in an environment where interactive login is not possible, you can use Service Principal authentication by replacing the ws = Workspace.from_config() line in the cell below with the following:\n",
"```\n",
"from azureml.core.authentication import ServicePrincipalAuthentication\n",
"auth = auth = ServicePrincipalAuthentication('mytenantid', 'myappid', 'mypassword')\n",
"ws = Workspace.from_config(auth = auth)\n",
"```\n",
"For more details, see aka.ms/aml-notebook-auth"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ws = Workspace.from_config()\n",
"ds = ws.get_default_datastore()\n",
"\n",
"# choose a name for the run history container in the workspace\n",
"experiment_name = 'automl-grouping-oj'\n",
"# project folder\n",
"project_folder = './sample_projects/{}'.format(experiment_name)\n",
"\n",
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
"output['Location'] = ws.location\n",
"output['Project Directory'] = project_folder\n",
"output['Run History Name'] = experiment_name\n",
"pd.set_option('display.max_colwidth', -1)\n",
"outputDf = pd.DataFrame(data = output, index = [''])\n",
"outputDf.T"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Data\n",
"Upload data to your default datastore and then load it as a `TabularDataset`"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.dataset import Dataset"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# upload training and test data to your default datastore\n",
"ds = ws.get_default_datastore()\n",
"ds.upload(src_dir='./data', target_path='groupdata', overwrite=True, show_progress=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# load data from your datastore\n",
"data = Dataset.Tabular.from_delimited_files(path=ds.path('groupdata/dominicks_OJ_2_5_8_train.csv'))\n",
"data_test = Dataset.Tabular.from_delimited_files(path=ds.path('groupdata/dominicks_OJ_2_5_8_test.csv'))\n",
"\n",
"data.take(5).to_pandas_dataframe()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Compute \n",
"\n",
"#### Create or Attach existing AmlCompute\n",
"\n",
"You will need to create a compute target for your automated ML run. In this tutorial, you create AmlCompute as your training compute resource.\n",
"#### Creation of AmlCompute takes approximately 5 minutes. \n",
"If the AmlCompute with that name is already in your workspace this code will skip the creation process.\n",
"As with other Azure services, there are limits on certain resources (e.g. AmlCompute) associated with the Azure Machine Learning service. Please read this article on the default limits and how to request more quota."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-11\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 6)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
" \n",
"# For a more detailed view of current AmlCompute status, use get_status()."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## AutoMLConfig\n",
"#### Create a base AutoMLConfig\n",
"This configuration will be used for all the groups in the pipeline."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"target_column = 'Quantity'\n",
"time_column_name = 'WeekStarting'\n",
"grain_column_names = ['Brand']\n",
"group_column_names = ['Store']\n",
"max_horizon = 20"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"automl_settings = {\n",
" \"iteration_timeout_minutes\" : 5,\n",
" \"experiment_timeout_minutes\" : 15,\n",
" \"primary_metric\" : 'normalized_mean_absolute_error',\n",
" \"time_column_name\": time_column_name,\n",
" \"grain_column_names\": grain_column_names,\n",
" \"max_horizon\": max_horizon,\n",
" \"drop_column_names\": ['logQuantity'],\n",
" \"max_concurrent_iterations\": 2,\n",
" \"max_cores_per_iteration\": -1\n",
"}\n",
"base_configuration = AutoMLConfig(task = 'forecasting',\n",
" path = project_folder,\n",
" n_cross_validations=3,\n",
" **automl_settings\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Pipeline\n",
"We've written a script to generate the individual pipeline steps used to create each automl step. Calling this script will return a list of PipelineSteps that will train multiple groups concurrently and then deploy these models.\n",
"\n",
"This step requires an Enterprise workspace to gain access to this feature. To learn more about creating an Enterprise workspace or upgrading to an Enterprise workspace from the Azure portal, please visit our [Workspace page.](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade).\n",
"\n",
"### Call the method to build pipeline steps\n",
"\n",
"`build_pipeline_steps()` takes as input:\n",
"* **automlconfig**: This is the configuration used for every automl step\n",
"* **df**: This is the dataset to be used for training\n",
"* **target_column**: This is the target column of the dataset\n",
"* **compute_target**: The compute to be used for training\n",
"* **deploy**: The option on to deploy the models after training, if set to true an extra step will be added to deploy a webservice with all the models (default is `True`)\n",
"* **service_name**: The service name for the model query endpoint\n",
"* **time_column_name**: The time column of the data"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.webservice import Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"service_name = 'grouped-model'\n",
"try:\n",
" # if you want to get existing service below is the command\n",
" # since aci name needs to be unique in subscription deleting existing aci if any\n",
" # we use aci_service_name to create azure aci\n",
" service = Webservice(ws, name=service_name)\n",
" if service:\n",
" service.delete()\n",
"except WebserviceException as e:\n",
" pass"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from build import build_pipeline_steps\n",
"\n",
"steps = build_pipeline_steps(\n",
" base_configuration, \n",
" data, \n",
" target_column,\n",
" compute_target, \n",
" group_column_names=group_column_names, \n",
" deploy=True, \n",
" service_name=service_name, \n",
" time_column_name=time_column_name\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Train\n",
"Use the list of steps generated from above to build the pipeline and submit it to your compute for remote training."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core import Pipeline\n",
"pipeline = Pipeline(\n",
" description=\"A pipeline with one model per data group using Automated ML.\",\n",
" workspace=ws, \n",
" steps=steps)\n",
"\n",
"pipeline_run = experiment.submit(pipeline)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.widgets import RunDetails\n",
"RunDetails(pipeline_run).show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run.wait_for_completion(show_output=False)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Test\n",
"\n",
"Now we can use the holdout set to test our models and ensure our web-service is running as expected."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice\n",
"service = AciWebservice(ws, service_name)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_test = data_test.to_pandas_dataframe()\n",
"# Drop the column we are trying to predict (target column)\n",
"x_pred = X_test.drop(target_column, inplace=False, axis=1)\n",
"x_pred.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Get Predictions\n",
"test_sample = X_test.drop(target_column, inplace=False, axis=1).to_json()\n",
"predictions = service.run(input_data=test_sample)\n",
"print(predictions)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Convert predictions from JSON to DataFrame\n",
"pred_dict =json.loads(predictions)\n",
"X_pred = pd.read_json(pred_dict['predictions'])\n",
"X_pred.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Fix the index\n",
"PRED = 'pred_target'\n",
"X_pred[time_column_name] = pd.to_datetime(X_pred[time_column_name], unit='ms')\n",
"\n",
"X_pred.set_index([time_column_name] + grain_column_names, inplace=True, drop=True)\n",
"X_pred.rename({'_automl_target_col': PRED}, inplace=True, axis=1)\n",
"# Drop all but the target column and index\n",
"X_pred.drop(list(set(X_pred.columns.values).difference({PRED})), axis=1, inplace=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_test[time_column_name] = pd.to_datetime(X_test[time_column_name])\n",
"X_test.set_index([time_column_name] + grain_column_names, inplace=True, drop=True)\n",
"# Merge predictions with raw features\n",
"pred_test = X_test.merge(X_pred, left_index=True, right_index=True)\n",
"pred_test.head()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.metrics import mean_absolute_error, mean_squared_error\n",
"def MAPE(actual, pred):\n",
" \"\"\"\n",
" Calculate mean absolute percentage error.\n",
" Remove NA and values where actual is close to zero\n",
" \"\"\"\n",
" not_na = ~(np.isnan(actual) | np.isnan(pred))\n",
" not_zero = ~np.isclose(actual, 0.0)\n",
" actual_safe = actual[not_na & not_zero]\n",
" pred_safe = pred[not_na & not_zero]\n",
" APE = 100*np.abs((actual_safe - pred_safe)/actual_safe)\n",
" return np.mean(APE)\n",
"\n",
"def get_metrics(actuals, preds):\n",
" return pd.Series(\n",
" {\n",
" \"RMSE\": np.sqrt(mean_squared_error(actuals, preds)),\n",
" \"NormRMSE\": np.sqrt(mean_squared_error(actuals, preds))/np.abs(actuals.max()-actuals.min()),\n",
" \"MAE\": mean_absolute_error(actuals, preds),\n",
" \"MAPE\": MAPE(actuals, preds)},\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"get_metrics(pred_test[PRED].values, pred_test[target_column].values)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"authors": [
{
"name": "alyerman"
}
],
"category": "other",
"compute": [
"AML Compute"
],
"datasets": [
"Orange Juice Sales"
],
"deployment": [
"Azure Container Instance"
],
"exclude_from_index": false,
"framework": [
"Scikit-learn",
"Pytorch"
],
"friendly_name": "Automated ML Grouping with Pipeline.",
"index_order": 10,
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.6"
},
"tags": [
"AutomatedML"
],
"task": "Use AzureML Pipeline to trigger multiple Automated ML runs."
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,142 +0,0 @@
from typing import List, Dict
import copy
import json
import pandas as pd
import re
from azureml.core import RunConfiguration
from azureml.core.compute import ComputeTarget
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.dataset import Dataset
from azureml.pipeline.core import PipelineData, PipelineParameter, TrainingOutput, StepSequence
from azureml.pipeline.steps import PythonScriptStep
from azureml.train.automl import AutoMLConfig
from azureml.train.automl.runtime import AutoMLStep
def _get_groups(data: Dataset, group_column_names: List[str]) -> pd.DataFrame:
    return data._dataflow.distinct(columns=group_column_names)\
        .keep_columns(columns=group_column_names).to_pandas_dataframe()


def _get_configs(automlconfig: AutoMLConfig,
                 data: Dataset,
                 target_column: str,
                 compute_target: ComputeTarget,
                 group_column_names: List[str]) -> Dict[str, AutoMLConfig]:
    # remove invalid characters regex
    valid_chars = re.compile('[^a-zA-Z0-9-]')
    groups = _get_groups(data, group_column_names)
    configs = {}
    for i, group in groups.iterrows():
        single = data
        group_name = "#####".join(str(x) for x in group.values)
        group_name = valid_chars.sub('', group_name)
        for key in group.index:
            single = single._dataflow.filter(data._dataflow[key] == group[key])
        group_conf = copy.deepcopy(automlconfig)
        group_conf.user_settings['training_data'] = single
        group_conf.user_settings['label_column_name'] = target_column
        group_conf.user_settings['compute_target'] = compute_target
        configs[group_name] = group_conf
    return configs


def build_pipeline_steps(automlconfig: AutoMLConfig,
                         data: Dataset,
                         target_column: str,
                         compute_target: ComputeTarget,
                         group_column_names: list,
                         time_column_name: str,
                         deploy: bool,
                         service_name: str = 'grouping-demo') -> StepSequence:
    steps = []
    metrics_output_name = 'metrics_{}'
    best_model_output_name = 'best_model_{}'
    count = 0
    model_names = []

    # get all automl configs by group
    configs = _get_configs(automlconfig, data, target_column, compute_target, group_column_names)

    # build a runconfig for register model
    register_config = RunConfiguration()
    cd = CondaDependencies()
    cd.add_pip_package('azureml-pipeline')
    register_config.environment.python.conda_dependencies = cd

    # create each automl step end-to-end (train, register)
    for group_name, conf in configs.items():
        # create automl metrics output
        metrics_data = PipelineData(
            name='metrics_data_{}'.format(group_name),
            pipeline_output_name=metrics_output_name.format(group_name),
            training_output=TrainingOutput(type='Metrics'))
        # create automl model output
        model_data = PipelineData(
            name='model_data_{}'.format(group_name),
            pipeline_output_name=best_model_output_name.format(group_name),
            training_output=TrainingOutput(type='Model', metric=conf.user_settings['primary_metric']))

        automl_step = AutoMLStep(
            name='automl_{}'.format(group_name),
            automl_config=conf,
            outputs=[metrics_data, model_data],
            allow_reuse=True)
        steps.append(automl_step)

        # pass the group name as a parameter to the register step ->
        # this will become the name of the model for this group.
        group_name_param = PipelineParameter("group_name_{}".format(count), default_value=group_name)
        count += 1

        reg_model_step = PythonScriptStep(
            'register.py',
            name='register_{}'.format(group_name),
            arguments=["--model_name", group_name_param, "--model_path", model_data],
            inputs=[model_data],
            compute_target=compute_target,
            runconfig=register_config,
            source_directory="register",
            allow_reuse=True
        )
        steps.append(reg_model_step)
        model_names.append(group_name)

    final_steps = steps
    if deploy:
        # modify the conda dependencies to ensure we pick up correct
        # versions of azureml-defaults and azureml-train-automl
        cd = CondaDependencies.create(pip_packages=['azureml-defaults', 'azureml-train-automl'])
        automl_deps = CondaDependencies(conda_dependencies_file_path='deploy/myenv.yml')
        cd._merge_dependencies(automl_deps)
        cd.save('deploy/myenv.yml')

        # add deployment step
        pp_group_column_names = PipelineParameter(
            "group_column_names",
            default_value="#####".join(list(reversed(group_column_names))))
        pp_model_names = PipelineParameter(
            "model_names",
            default_value=json.dumps(model_names))
        pp_service_name = PipelineParameter(
            "service_name",
            default_value=service_name)

        deployment_step = PythonScriptStep(
            'deploy.py',
            name='service_deploy',
            arguments=["--group_column_names", pp_group_column_names,
                       "--model_names", pp_model_names,
                       "--service_name", pp_service_name,
                       "--time_column_name", time_column_name],
            compute_target=compute_target,
            runconfig=RunConfiguration(),
            source_directory="deploy"
        )

        final_steps = StepSequence(steps=[steps, deployment_step])

    return final_steps
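
For orientation, below is a minimal usage sketch of build_pipeline_steps. It assumes a workspace config.json, an existing compute target and registered dataset, and column names taken from the orange-juice sales CSVs shown further down in this diff (WeekStarting, Store, Brand, Quantity). The resource names ('cpu-cluster', 'oj-sales-data', the experiment title) and the AutoML settings are illustrative assumptions, not taken from the sample itself.

# Illustrative sketch (not part of the original file). Resource names and
# AutoML settings below are assumptions; adjust them to your workspace.
from azureml.core import Dataset, Experiment, Workspace
from azureml.pipeline.core import Pipeline
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()                            # reads config.json
compute = ws.compute_targets['cpu-cluster']             # assumed AmlCompute target name
data = Dataset.get_by_name(ws, name='oj-sales-data')    # assumed registered TabularDataset

# minimal forecasting config; per-group training data, label column and
# compute target are injected by _get_configs above
automl_config = AutoMLConfig(task='forecasting',
                             primary_metric='normalized_root_mean_squared_error',
                             iterations=10,
                             iteration_timeout_minutes=10,
                             time_column_name='WeekStarting',
                             max_horizon=12)

steps = build_pipeline_steps(automl_config, data,
                             target_column='Quantity',
                             compute_target=compute,
                             group_column_names=['Store', 'Brand'],
                             time_column_name='WeekStarting',
                             deploy=True)

pipeline = Pipeline(workspace=ws, steps=steps)          # a StepSequence is accepted here
run = Experiment(ws, 'grouping-forecasting-demo').submit(pipeline)
run.wait_for_completion(show_output=True)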


@@ -1,61 +0,0 @@
WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
1992-08-20,2,minute.maid,23488,10.06424493,1,1.94,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,2,tropicana,13376,9.501217335,1,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,tropicana,8128,9.00307017,0,2.75,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,minute.maid,19008,9.852615222,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-27,2,dominicks,9024,9.107642974,0,1.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,tropicana,19456,9.875910785,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,minute.maid,11584,9.357380115,0,1.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-03,2,dominicks,2048,7.624618986000001,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,tropicana,10048,9.215128888999999,0,2.64,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,minute.maid,26752,10.19436452,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-10,2,dominicks,1984,7.592870287999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,tropicana,6336,8.754002933999999,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,minute.maid,3904,8.269756948,0,2.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-17,2,dominicks,4160,8.333270353,0,1.77,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,tropicana,16192,9.692272572,1,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,minute.maid,3712,8.219326094,0,2.67,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-09-24,2,dominicks,35264,10.47061789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,dominicks,8640,9.064157862,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,minute.maid,41216,10.62658181,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-10-01,2,tropicana,5824,8.66974259,0,2.97,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,5,tropicana,17728,9.78290059,1,2.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,5,minute.maid,27072,10.20625526,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,tropicana,9600,9.169518378,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,minute.maid,3840,8.253227646000001,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-27,5,dominicks,1856,7.526178913,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,tropicana,25664,10.15284451,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,minute.maid,6144,8.723231275,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-03,5,dominicks,3712,8.219326094,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,tropicana,9984,9.208739091,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,dominicks,2688,7.896552702,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-10,5,minute.maid,36416,10.50276352,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,tropicana,8576,9.056722882999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,minute.maid,5440,8.60153434,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-17,5,dominicks,6464,8.774003599999999,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,tropicana,13184,9.486759252,1,2.78,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,dominicks,40896,10.61878754,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-09-24,5,minute.maid,7680,8.946374826,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,dominicks,6144,8.723231275,0,1.85,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,minute.maid,50304,10.82583988,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-10-01,5,tropicana,7488,8.921057017999999,0,2.78,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,8,minute.maid,55552,10.9250748,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-20,8,tropicana,8576,9.056722882999999,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,tropicana,8000,8.987196821,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,minute.maid,18688,9.835636886,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-27,8,dominicks,19200,9.862665558,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,tropicana,21760,9.987828701,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,minute.maid,14656,9.592605087,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-03,8,dominicks,12800,9.45720045,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,tropicana,12800,9.45720045,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,minute.maid,30144,10.31374118,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-10,8,dominicks,15296,9.635346635,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,tropicana,10112,9.221478116,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,minute.maid,6208,8.733594062,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-17,8,dominicks,20992,9.951896692,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,tropicana,10304,9.240287448,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,minute.maid,7104,8.868413285,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-09-24,8,dominicks,73856,11.20987253,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,minute.maid,65856,11.09522582,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,dominicks,16192,9.692272572,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-10-01,8,tropicana,6400,8.764053269,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947


@@ -1,973 +0,0 @@
WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
1990-06-14,2,dominicks,10560,9.264828557000001,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,2,minute.maid,4480,8.407378325,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,2,tropicana,8256,9.018695487999999,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,dominicks,8000,8.987196821,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,tropicana,6144,8.723231275,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,tropicana,3840,8.253227646000001,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,minute.maid,20160,9.911455722000001,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,dominicks,6848,8.831711918,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,dominicks,2880,7.965545572999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,minute.maid,2688,7.896552702,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,tropicana,8000,8.987196821,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,dominicks,1600,7.377758908,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,minute.maid,3008,8.009030685,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,tropicana,8896,9.093357017,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,tropicana,7168,8.877381955,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,dominicks,25344,10.140297300000002,1,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,dominicks,10752,9.282847063,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,minute.maid,2752,7.920083199,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,tropicana,10880,9.29468152,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,minute.maid,26176,10.17259824,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,dominicks,6656,8.803273982999999,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,tropicana,7744,8.954673629,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,dominicks,6592,8.793612072,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,minute.maid,3712,8.219326094,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,tropicana,8512,9.049232212,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,tropicana,5504,8.61323038,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,minute.maid,30656,10.33058368,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,dominicks,1728,7.454719948999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,tropicana,5888,8.68067166,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,minute.maid,3840,8.253227646000001,0,2.98,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,dominicks,33792,10.42797937,1,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,tropicana,8384,9.034080407000001,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,minute.maid,2816,7.943072717000001,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,dominicks,1920,7.560080465,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,tropicana,5952,8.691482577,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,minute.maid,23104,10.04776104,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,dominicks,8960,9.100525506,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,dominicks,11392,9.340666634,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,tropicana,6848,8.831711918,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,minute.maid,3392,8.129174997,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,tropicana,9216,9.128696383,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,minute.maid,26304,10.1774763,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,dominicks,28416,10.25470765,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,dominicks,17152,9.749870064,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,tropicana,12160,9.405907156,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,minute.maid,6336,8.754002933999999,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,tropicana,12672,9.447150114,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,minute.maid,9920,9.2023082,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,dominicks,26560,10.1871616,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,dominicks,6336,8.754002933999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,minute.maid,25280,10.13776885,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,tropicana,6528,8.783855897,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,dominicks,26368,10.17990643,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,minute.maid,14848,9.605620455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,tropicana,21120,9.957975738,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,minute.maid,12288,9.416378455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,dominicks,896,6.797940412999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,tropicana,12416,9.426741242,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,minute.maid,6272,8.743850562,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,dominicks,1472,7.294377299,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,tropicana,9472,9.156095357,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,minute.maid,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,dominicks,1344,7.2034055210000005,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,tropicana,17920,9.793672686,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,minute.maid,4160,8.333270353,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,dominicks,111680,11.62339292,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,tropicana,9408,9.14931567,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,minute.maid,10176,9.227787286,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,dominicks,1856,7.526178913,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,minute.maid,29056,10.27698028,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,dominicks,5568,8.624791202,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,tropicana,6912,8.841014311,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,minute.maid,7104,8.868413285,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,dominicks,32064,10.37548918,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,tropicana,16768,9.727227587,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,dominicks,4352,8.378390789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,minute.maid,7488,8.921057017999999,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,minute.maid,4224,8.348537825,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,minute.maid,8960,9.100525506,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,dominicks,13760,9.529521112000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,minute.maid,22464,10.01966931,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,dominicks,43328,10.67655436,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,minute.maid,3840,8.253227646000001,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,dominicks,57600,10.96127785,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,tropicana,7808,8.962904128,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,minute.maid,12992,9.472089062,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,minute.maid,70144,11.15830555,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,dominicks,6016,8.702177866,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,tropicana,42176,10.64960662,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,dominicks,10368,9.246479419,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,minute.maid,21248,9.964018052,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,dominicks,12608,9.442086812000001,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,minute.maid,5696,8.647519453,1,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,tropicana,4928,8.502688505,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,tropicana,29504,10.29228113,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,minute.maid,7680,8.946374826,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,dominicks,6336,8.754002933999999,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,tropicana,9984,9.208739091,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,minute.maid,6336,8.754002933999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,dominicks,140736,11.85464107,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,tropicana,35200,10.46880136,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,dominicks,960,6.866933285,1,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,minute.maid,8576,9.056722882999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,dominicks,1216,7.103322062999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,minute.maid,15104,9.622714887999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,tropicana,23936,10.08313888,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,tropicana,7104,8.868413285,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,minute.maid,76480,11.24478455,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,dominicks,1664,7.416979621,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,dominicks,4992,8.51559191,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,tropicana,24512,10.10691807,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,tropicana,6336,8.754002933999999,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,dominicks,27968,10.23881628,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,dominicks,12160,9.405907156,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,tropicana,33536,10.42037477,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,minute.maid,4032,8.30201781,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,dominicks,2240,7.714231145,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,dominicks,5504,8.61323038,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,minute.maid,14784,9.601300794,1,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,tropicana,13248,9.491601877,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,tropicana,6208,8.733594062,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,dominicks,8832,9.086136769,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,minute.maid,12096,9.400630097999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,dominicks,2624,7.87245515,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,minute.maid,41792,10.64046021,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,tropicana,10624,9.270870872,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,tropicana,44672,10.70710219,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,minute.maid,10560,9.264828557000001,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,dominicks,10432,9.252633284,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,tropicana,20096,9.908276069,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,dominicks,8320,9.026417534,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,minute.maid,4224,8.348537825,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,dominicks,6784,8.822322178,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,minute.maid,2880,7.965545572999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,tropicana,9152,9.121727714,1,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,tropicana,21952,9.996613531,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,minute.maid,3968,8.286017467999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,dominicks,60544,11.01112565,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,dominicks,20608,9.933434629,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,minute.maid,3712,8.219326094,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,tropicana,13568,9.515469357999999,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,tropicana,4160,8.333270353,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,minute.maid,2816,7.943072717000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,dominicks,16064,9.684336023,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,tropicana,39424,10.58213005,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,minute.maid,4288,8.363575702999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,dominicks,12480,9.431882642,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,tropicana,5632,8.636219898,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,minute.maid,18240,9.811372264,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,dominicks,17024,9.742379392,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,dominicks,13440,9.505990614,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,minute.maid,7360,8.903815212,0,1.95,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,tropicana,9024,9.107642974,1,2.68,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,tropicana,6016,8.702177866,0,3.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,minute.maid,7808,8.962904128,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,dominicks,10112,9.221478116,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,dominicks,9088,9.114710141,0,1.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,minute.maid,13504,9.510741217,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,tropicana,7744,8.954673629,0,3.14,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,tropicana,6784,8.822322178,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,dominicks,22848,10.03661887,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-10,2,minute.maid,10048,9.215128888999999,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,dominicks,6976,8.850230966,0,1.65,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,minute.maid,135936,11.81993947,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-17,2,tropicana,6784,8.822322178,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,tropicana,6272,8.743850562,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-24,2,dominicks,4160,8.333270353,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,tropicana,5312,8.577723691000001,0,3.07,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,minute.maid,27968,10.23881628,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-31,2,dominicks,3328,8.110126802,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,tropicana,9216,9.128696383,0,3.11,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-07,2,dominicks,12096,9.400630097999999,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,tropicana,7296,8.895081532,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,minute.maid,7808,8.962904128,0,2.14,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-14,2,dominicks,6208,8.733594062,0,1.76,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,tropicana,34240,10.44114983,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,minute.maid,12480,9.431882642,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-21,2,dominicks,3008,8.009030685,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,dominicks,19456,9.875910785,1,1.5,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,minute.maid,9664,9.17616292,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-11-28,2,tropicana,7168,8.877381955,0,2.64,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,minute.maid,7168,8.877381955,0,2.06,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,dominicks,16768,9.727227587,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-05,2,tropicana,6080,8.712759975,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,dominicks,13568,9.515469357999999,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-12,2,tropicana,5120,8.540909718,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,tropicana,8320,9.026417534,0,2.74,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,minute.maid,5952,8.691482577,0,2.22,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-19,2,dominicks,6080,8.712759975,0,1.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,dominicks,10432,9.252633284,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,minute.maid,21696,9.984883191,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-12-26,2,tropicana,17728,9.78290059,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,minute.maid,12032,9.395325046,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,dominicks,11712,9.368369236,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-02,2,tropicana,13120,9.481893063,0,2.35,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,dominicks,4032,8.30201781,0,1.76,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,minute.maid,7040,8.859363449,0,2.12,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-09,2,tropicana,13120,9.481893063,0,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,dominicks,6336,8.754002933999999,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,tropicana,9792,9.189321005,0,2.43,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-16,2,minute.maid,10240,9.234056899,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,tropicana,3520,8.166216269,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,minute.maid,6848,8.831711918,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-23,2,dominicks,13632,9.520175249,0,1.47,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,tropicana,5504,8.61323038,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,minute.maid,3968,8.286017467999999,0,2.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-01-30,2,dominicks,45120,10.71708089,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,tropicana,6720,8.812843434,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,minute.maid,5888,8.68067166,0,2.26,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-06,2,dominicks,9984,9.208739091,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,tropicana,20224,9.914625297,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,dominicks,4800,8.476371197,0,1.82,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-13,2,minute.maid,6208,8.733594062,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,dominicks,11776,9.373818841,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,minute.maid,72256,11.18797065,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-20,2,tropicana,5056,8.528330936,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,tropicana,43584,10.68244539,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,minute.maid,11520,9.351839934,0,2.11,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-02-27,2,dominicks,11584,9.357380115,0,1.54,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,tropicana,25728,10.15533517,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,minute.maid,5824,8.66974259,0,2.35,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-05,2,dominicks,51264,10.84474403,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,tropicana,31808,10.36747311,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,minute.maid,19392,9.872615889,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-12,2,dominicks,14976,9.614204199,0,1.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,tropicana,20736,9.939626599,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,minute.maid,9536,9.162829389,0,2.1,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-19,2,dominicks,30784,10.33475035,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,tropicana,15168,9.626943225,0,2.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,minute.maid,5312,8.577723691000001,0,2.28,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-03-26,2,dominicks,12480,9.431882642,0,1.6,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,tropicana,28096,10.2433825,1,2.5,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,dominicks,3264,8.090708716,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-02,2,minute.maid,14528,9.583833101,1,1.9,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,dominicks,8768,9.078864009,0,1.48,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,minute.maid,12416,9.426741242,0,2.12,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-09,2,tropicana,12416,9.426741242,0,2.58,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,tropicana,5376,8.589699882,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,minute.maid,5376,8.589699882,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-16,2,dominicks,70848,11.16829202,1,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,tropicana,9792,9.189321005,0,2.67,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,minute.maid,19008,9.852615222,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-23,2,dominicks,18560,9.828764006,0,1.42,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,tropicana,16960,9.738612909,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,minute.maid,3904,8.269756948,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-04-30,2,dominicks,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,tropicana,8320,9.026417534,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,minute.maid,6336,8.754002933999999,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-07,2,dominicks,9600,9.169518378,0,2.0,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,tropicana,6912,8.841014311,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,minute.maid,5440,8.60153434,0,2.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-14,2,dominicks,4800,8.476371197,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,tropicana,6976,8.850230966,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,minute.maid,22400,10.01681624,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-21,2,dominicks,9664,9.17616292,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,minute.maid,3968,8.286017467999999,0,2.84,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,tropicana,7232,8.886270902,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-05-28,2,dominicks,45568,10.726961,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,tropicana,51520,10.84972536,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,minute.maid,3264,8.090708716,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-04,2,dominicks,20992,9.951896692,0,1.74,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,minute.maid,4352,8.378390789,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,tropicana,22272,10.01108556,0,2.21,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-11,2,dominicks,6592,8.793612072,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,dominicks,4992,8.51559191,0,2.05,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,minute.maid,4480,8.407378325,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-18,2,tropicana,46144,10.73952222,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,tropicana,4352,8.378390789,1,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,minute.maid,3840,8.253227646000001,0,2.52,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-06-25,2,dominicks,8064,8.99516499,0,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,tropicana,17280,9.757305042,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,minute.maid,13312,9.496421162999999,1,2.0,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-02,2,dominicks,7360,8.903815212,0,1.61,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,tropicana,5696,8.647519453,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,minute.maid,3776,8.236420527,1,2.33,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-09,2,dominicks,10048,9.215128888999999,0,1.4,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,tropicana,6848,8.831711918,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,dominicks,10112,9.221478116,0,1.91,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-16,2,minute.maid,4800,8.476371197,0,2.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,dominicks,9152,9.121727714,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,minute.maid,24960,10.12502982,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-23,2,tropicana,4416,8.392989587999999,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,tropicana,4672,8.449342525,0,3.16,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,minute.maid,4544,8.42156296,0,2.86,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-07-30,2,dominicks,36288,10.49924239,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,tropicana,7168,8.877381955,1,3.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,minute.maid,3968,8.286017467999999,1,2.81,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-06,2,dominicks,3776,8.236420527,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,tropicana,5056,8.528330936,0,3.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,dominicks,3328,8.110126802,0,1.97,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-13,2,minute.maid,49600,10.81174611,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1992-08-20,2,dominicks,13824,9.534161491,0,1.36,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,5,dominicks,1792,7.491087594,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,5,minute.maid,4224,8.348537825,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,5,tropicana,5888,8.68067166,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,minute.maid,4352,8.378390789,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,dominicks,2496,7.82244473,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-28,5,tropicana,6976,8.850230966,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,dominicks,2944,7.98752448,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,minute.maid,4928,8.502688505,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-05,5,tropicana,6528,8.783855897,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,dominicks,1024,6.931471806,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,minute.maid,31168,10.34714721,1,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-12,5,tropicana,4928,8.502688505,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,dominicks,4224,8.348537825,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,minute.maid,10048,9.215128888999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-07-26,5,tropicana,5312,8.577723691000001,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,minute.maid,21760,9.987828701,1,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,tropicana,5120,8.540909718,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-02,5,dominicks,4544,8.42156296,1,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,dominicks,1728,7.454719948999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,minute.maid,4544,8.42156296,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-09,5,tropicana,7936,8.979164649,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,tropicana,6080,8.712759975,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,minute.maid,52224,10.86329744,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-16,5,dominicks,1216,7.103322062999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,dominicks,1152,7.049254841000001,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,minute.maid,3584,8.184234774,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-23,5,tropicana,4160,8.333270353,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,minute.maid,5120,8.540909718,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,tropicana,5888,8.68067166,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-08-30,5,dominicks,30144,10.31374118,1,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,dominicks,8960,9.100525506,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,minute.maid,4416,8.392989587999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-06,5,tropicana,9536,9.162829389,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,tropicana,8320,9.026417534,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,dominicks,8192,9.010913347,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-13,5,minute.maid,30208,10.31586207,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,dominicks,6528,8.783855897,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,minute.maid,4160,8.333270353,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-20,5,tropicana,8000,8.987196821,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,dominicks,34688,10.45414909,1,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,minute.maid,4992,8.51559191,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-09-27,5,tropicana,5824,8.66974259,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,dominicks,4672,8.449342525,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,minute.maid,13952,9.543378146,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-04,5,tropicana,10624,9.270870872,1,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,tropicana,6656,8.803273982999999,0,3.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,dominicks,1088,6.992096427000001,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-11,5,minute.maid,47680,10.772267300000001,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,tropicana,5184,8.553332238,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,minute.maid,7616,8.938006577000001,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-18,5,dominicks,69440,11.14821835,1,1.24,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,tropicana,4928,8.502688505,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,minute.maid,8896,9.093357017,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-10-25,5,dominicks,1280,7.154615357000001,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,tropicana,5888,8.68067166,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,minute.maid,28544,10.25920204,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-01,5,dominicks,35456,10.47604777,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,tropicana,5312,8.577723691000001,0,3.51,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,dominicks,13824,9.534161491,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-08,5,minute.maid,5440,8.60153434,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,tropicana,9984,9.208739091,0,3.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,minute.maid,52416,10.86696717,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-15,5,dominicks,14208,9.561560465,0,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,tropicana,8448,9.041685006,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,dominicks,29312,10.28575227,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-22,5,minute.maid,11712,9.368369236,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,tropicana,10880,9.29468152,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,minute.maid,13952,9.543378146,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-11-29,5,dominicks,52992,10.87789624,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,dominicks,15680,9.660141293999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,minute.maid,36160,10.49570882,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-06,5,tropicana,5696,8.647519453,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,tropicana,5696,8.647519453,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,minute.maid,12864,9.462187991,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-13,5,dominicks,43520,10.68097588,1,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,tropicana,32384,10.38541975,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,minute.maid,22208,10.00820786,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-20,5,dominicks,3904,8.269756948,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,tropicana,10752,9.282847063,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,minute.maid,9984,9.208739091,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-12-27,5,dominicks,896,6.797940412999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,tropicana,6912,8.841014311,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,minute.maid,14016,9.547954812999999,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-03,5,dominicks,2240,7.714231145,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,tropicana,13440,9.505990614,0,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,minute.maid,6080,8.712759975,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-10,5,dominicks,125760,11.74213061,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,tropicana,7808,8.962904128,0,2.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,minute.maid,7808,8.962904128,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-17,5,dominicks,1408,7.249925537,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,tropicana,5248,8.565602331000001,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,minute.maid,40896,10.61878754,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-24,5,dominicks,7232,8.886270902,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,tropicana,6208,8.733594062,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,minute.maid,6272,8.743850562,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-01-31,5,dominicks,41216,10.62658181,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,tropicana,21440,9.973013615,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,minute.maid,7872,8.971067439,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-07,5,dominicks,9024,9.107642974,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,dominicks,1600,7.377758908,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,tropicana,7360,8.903815212,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-14,5,minute.maid,6144,8.723231275,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,tropicana,6720,8.812843434,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,minute.maid,8448,9.041685006,0,2.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-21,5,dominicks,2496,7.82244473,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,tropicana,6656,8.803273982999999,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,minute.maid,18688,9.835636886,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-02-28,5,dominicks,6336,8.754002933999999,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,tropicana,6016,8.702177866,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,minute.maid,6272,8.743850562,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-07,5,dominicks,56384,10.93994071,1,1.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,tropicana,6144,8.723231275,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,minute.maid,12096,9.400630097999999,0,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-14,5,dominicks,1600,7.377758908,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,tropicana,4928,8.502688505,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,minute.maid,73216,11.20116926,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-21,5,dominicks,2944,7.98752448,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,tropicana,67712,11.1230187,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,minute.maid,18944,9.849242538,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-03-28,5,dominicks,13504,9.510741217,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,dominicks,5376,8.589699882,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,tropicana,8640,9.064157862,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-04,5,minute.maid,6400,8.764053269,1,2.46,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,tropicana,35520,10.477851199999998,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,minute.maid,8640,9.064157862,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-11,5,dominicks,6656,8.803273982999999,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,tropicana,9664,9.17616292,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,minute.maid,7296,8.895081532,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-18,5,dominicks,95680,11.46876457,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,tropicana,49088,10.80136989,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,minute.maid,12480,9.431882642,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-04-25,5,dominicks,896,6.797940412999999,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,dominicks,1728,7.454719948999999,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,minute.maid,14144,9.557045785,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-02,5,tropicana,14912,9.609921537,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,minute.maid,88256,11.38799696,1,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,tropicana,6464,8.774003599999999,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-09,5,dominicks,1280,7.154615357000001,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,dominicks,5696,8.647519453,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,minute.maid,6848,8.831711918,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-16,5,tropicana,25024,10.12759064,1,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,minute.maid,7808,8.962904128,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,tropicana,6272,8.743850562,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-23,5,dominicks,28288,10.25019297,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,dominicks,4864,8.489616424,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,minute.maid,6272,8.743850562,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-05-30,5,tropicana,5056,8.528330936,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,minute.maid,6144,8.723231275,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,tropicana,47616,10.77092412,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-06,5,dominicks,2880,7.965545572999999,0,2.09,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,dominicks,5760,8.658692754,1,1.41,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,minute.maid,27776,10.23192762,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-13,5,tropicana,13888,9.538780437,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,tropicana,6144,8.723231275,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,minute.maid,20800,9.942708266,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-20,5,dominicks,15040,9.618468598,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,dominicks,5120,8.540909718,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,minute.maid,45696,10.72976605,1,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-06-27,5,tropicana,9344,9.142489705,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,minute.maid,14336,9.570529135,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,tropicana,32896,10.40110635,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-04,5,dominicks,3264,8.090708716,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,dominicks,9536,9.162829389,1,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,minute.maid,4928,8.502688505,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-11,5,tropicana,21056,9.954940834,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,tropicana,15360,9.639522007,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,minute.maid,4608,8.435549202,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-18,5,dominicks,6208,8.733594062,0,1.59,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,dominicks,6592,8.793612072,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,tropicana,8000,8.987196821,1,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-07-25,5,minute.maid,5248,8.565602331000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,tropicana,21120,9.957975738,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,dominicks,63552,11.05961375,1,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-01,5,minute.maid,4224,8.348537825,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,dominicks,27968,10.23881628,0,0.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,minute.maid,4288,8.363575702999999,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-08,5,tropicana,11904,9.384629757,0,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,minute.maid,16896,9.734832187,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,tropicana,5056,8.528330936,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-15,5,dominicks,21760,9.987828701,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,dominicks,2688,7.896552702,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,minute.maid,77184,11.25394746,1,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-22,5,tropicana,4608,8.435549202,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,tropicana,6016,8.702177866,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,minute.maid,5184,8.553332238,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-08-29,5,dominicks,10432,9.252633284,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,tropicana,50752,10.83470631,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,minute.maid,5248,8.565602331000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-05,5,dominicks,9792,9.189321005,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,minute.maid,20672,9.936535407000001,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,tropicana,5632,8.636219898,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-12,5,dominicks,8448,9.041685006,0,1.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,tropicana,6400,8.764053269,0,3.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,dominicks,6912,8.841014311,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-09-26,5,minute.maid,12352,9.421573272,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,dominicks,8256,9.018695487999999,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,minute.maid,12032,9.395325046,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-03,5,tropicana,5440,8.60153434,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,minute.maid,13440,9.505990614,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,dominicks,28672,10.26367632,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-10,5,tropicana,8128,9.00307017,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,tropicana,7232,8.886270902,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,minute.maid,5824,8.66974259,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-24,5,dominicks,4416,8.392989587999999,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,tropicana,7168,8.877381955,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,minute.maid,50112,10.82201578,0,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-10-31,5,dominicks,1856,7.526178913,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,minute.maid,5184,8.553332238,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,tropicana,7872,8.971067439,0,2.94,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-07,5,dominicks,6528,8.783855897,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,tropicana,7552,8.929567707999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,minute.maid,8384,9.034080407000001,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-14,5,dominicks,6080,8.712759975,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,tropicana,69504,11.14913958,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,dominicks,3456,8.14786713,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-21,5,minute.maid,10112,9.221478116,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,dominicks,25856,10.16029796,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,minute.maid,8384,9.034080407000001,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-11-28,5,tropicana,8960,9.100525506,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,tropicana,6912,8.841014311,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,dominicks,25728,10.15533517,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-05,5,minute.maid,11456,9.346268889,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,dominicks,23552,10.06696602,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,minute.maid,5952,8.691482577,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-12,5,tropicana,6656,8.803273982999999,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,tropicana,8192,9.010913347,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,dominicks,2944,7.98752448,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-19,5,minute.maid,8512,9.049232212,0,2.26,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,dominicks,5888,8.68067166,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,minute.maid,27968,10.23881628,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1991-12-26,5,tropicana,13440,9.505990614,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,tropicana,12160,9.405907156,0,2.39,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,dominicks,6848,8.831711918,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-02,5,minute.maid,24000,10.08580911,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,dominicks,1792,7.491087594,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,minute.maid,6848,8.831711918,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-09,5,tropicana,11840,9.379238908,0,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,tropicana,8640,9.064157862,0,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,dominicks,5248,8.565602331000001,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-16,5,minute.maid,15104,9.622714887999999,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,tropicana,5888,8.68067166,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,minute.maid,11392,9.340666634,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-23,5,dominicks,16768,9.727227587,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,tropicana,7424,8.912473275,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,minute.maid,5824,8.66974259,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-01-30,5,dominicks,52160,10.8620712,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,tropicana,5632,8.636219898,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,minute.maid,7488,8.921057017999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-06,5,dominicks,16640,9.719564714,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,tropicana,33600,10.42228135,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,minute.maid,8320,9.026417534,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-13,5,dominicks,1344,7.2034055210000005,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,dominicks,4608,8.435549202,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,tropicana,5376,8.589699882,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-20,5,minute.maid,99904,11.511965,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,tropicana,54272,10.90176372,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,minute.maid,6976,8.850230966,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-02-27,5,dominicks,12672,9.447150114,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,tropicana,33600,10.42228135,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,minute.maid,9984,9.208739091,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-05,5,dominicks,48640,10.79220152,1,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,tropicana,24448,10.10430369,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,minute.maid,32832,10.39915893,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-12,5,dominicks,13248,9.491601877,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,tropicana,22784,10.03381381,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,minute.maid,8128,9.00307017,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-19,5,dominicks,29248,10.28356647,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,tropicana,19008,9.852615222,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,minute.maid,6464,8.774003599999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-03-26,5,dominicks,4608,8.435549202,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,tropicana,15808,9.66827142,1,2.5,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,minute.maid,36800,10.51325312,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-02,5,dominicks,3136,8.050703382,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,dominicks,13184,9.486759252,0,1.58,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,tropicana,14144,9.557045785,0,2.5,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-09,5,minute.maid,12928,9.467150781,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,tropicana,9600,9.169518378,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,minute.maid,7424,8.912473275,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-16,5,dominicks,67712,11.1230187,1,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,tropicana,10112,9.221478116,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,minute.maid,34176,10.43927892,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-23,5,dominicks,18880,9.84585844,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,minute.maid,4160,8.333270353,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,tropicana,31872,10.36948316,1,2.24,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-04-30,5,dominicks,6208,8.733594062,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,tropicana,9280,9.135616826,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,minute.maid,5952,8.691482577,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-07,5,dominicks,5952,8.691482577,0,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,tropicana,7680,8.946374826,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,minute.maid,6528,8.783855897,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-14,5,dominicks,4160,8.333270353,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,tropicana,8704,9.071537969,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,minute.maid,30656,10.33058368,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-21,5,dominicks,23488,10.06424493,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,tropicana,9920,9.2023082,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,dominicks,60480,11.01006801,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-05-28,5,minute.maid,6656,8.803273982999999,0,2.66,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,tropicana,91968,11.42919597,1,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,minute.maid,4416,8.392989587999999,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-04,5,dominicks,20416,9.924074186,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,tropicana,44096,10.69412435,0,2.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,dominicks,6336,8.754002933999999,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-11,5,minute.maid,5696,8.647519453,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,minute.maid,5696,8.647519453,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,tropicana,7296,8.895081532,1,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-06-25,5,dominicks,1408,7.249925537,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,tropicana,12928,9.467150781,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,minute.maid,39680,10.58860256,1,2.01,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-02,5,dominicks,4672,8.449342525,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,tropicana,6848,8.831711918,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,minute.maid,6208,8.733594062,1,2.19,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-09,5,dominicks,19520,9.87919486,0,1.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,tropicana,8064,8.99516499,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,minute.maid,7872,8.971067439,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-16,5,dominicks,7872,8.971067439,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,dominicks,5184,8.553332238,0,1.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,tropicana,4992,8.51559191,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-23,5,minute.maid,54528,10.90646961,1,2.29,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,tropicana,7360,8.903815212,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,minute.maid,6400,8.764053269,0,2.69,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-07-30,5,dominicks,42240,10.65112292,1,1.49,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,tropicana,8384,9.034080407000001,1,2.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,minute.maid,5888,8.68067166,1,2.65,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-06,5,dominicks,6592,8.793612072,1,1.89,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,tropicana,8832,9.086136769,0,2.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,minute.maid,56384,10.93994071,1,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-13,5,dominicks,2112,7.655390645,0,1.99,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1992-08-20,5,dominicks,21248,9.964018052,0,1.79,0.117368032,0.32122573,10.92237097,0.535883355,0.103091585,0.053875277,0.410568032,3.801997814,0.681818182,1.600573425,0.736306837
1990-06-14,8,dominicks,14336,9.570529135,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-14,8,minute.maid,6080,8.712759975,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-14,8,tropicana,8896,9.093357017,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,dominicks,6400,8.764053269,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,minute.maid,51968,10.85838342,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-21,8,tropicana,7296,8.895081532,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,tropicana,10368,9.246479419,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,minute.maid,4928,8.502688505,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-06-28,8,dominicks,3968,8.286017467999999,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,dominicks,4352,8.378390789,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,minute.maid,5312,8.577723691000001,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-05,8,tropicana,6976,8.850230966,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,tropicana,6464,8.774003599999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,dominicks,3520,8.166216269,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-12,8,minute.maid,39424,10.58213005,1,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,tropicana,8192,9.010913347,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,dominicks,6464,8.774003599999999,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-19,8,minute.maid,5568,8.624791202,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,dominicks,5952,8.691482577,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,minute.maid,14592,9.588228712000001,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-07-26,8,tropicana,7936,8.979164649,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,tropicana,6656,8.803273982999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,minute.maid,22208,10.00820786,1,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-02,8,dominicks,8832,9.086136769,1,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,dominicks,7232,8.886270902,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,minute.maid,5760,8.658692754,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-09,8,tropicana,8256,9.018695487999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,tropicana,5568,8.624791202,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,minute.maid,54016,10.89703558,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-16,8,dominicks,5504,8.61323038,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,dominicks,4800,8.476371197,0,2.09,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,minute.maid,5824,8.66974259,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-23,8,tropicana,7488,8.921057017999999,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,tropicana,6144,8.723231275,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,minute.maid,6528,8.783855897,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-08-30,8,dominicks,52672,10.87183928,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,dominicks,16448,9.707959168,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,minute.maid,5440,8.60153434,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-06,8,tropicana,11008,9.30637756,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,minute.maid,36544,10.50627229,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,dominicks,19072,9.85597657,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-13,8,tropicana,5760,8.658692754,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,dominicks,13376,9.501217335,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,minute.maid,3776,8.236420527,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-20,8,tropicana,10112,9.221478116,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,tropicana,8448,9.041685006,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,minute.maid,5504,8.61323038,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-09-27,8,dominicks,61440,11.02581637,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,tropicana,8448,9.041685006,1,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,dominicks,13760,9.529521112000001,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-04,8,minute.maid,12416,9.426741242,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,minute.maid,53696,10.89109379,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,dominicks,3136,8.050703382,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-11,8,tropicana,7424,8.912473275,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,tropicana,5824,8.66974259,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,minute.maid,5696,8.647519453,0,2.51,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-18,8,dominicks,186176,12.13444774,1,1.14,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,tropicana,6656,8.803273982999999,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,minute.maid,4864,8.489616424,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-10-25,8,dominicks,3712,8.219326094,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,tropicana,6272,8.743850562,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,minute.maid,37184,10.52363384,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-01,8,dominicks,35776,10.48503256,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,tropicana,6912,8.841014311,0,3.04,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,minute.maid,5504,8.61323038,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-08,8,dominicks,26880,10.1991378,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,tropicana,10496,9.258749511,0,3.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,minute.maid,51008,10.83973776,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-15,8,dominicks,71680,11.17996705,0,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,tropicana,11840,9.379238908,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,minute.maid,11072,9.312174678,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-22,8,dominicks,25088,10.13014492,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,tropicana,9664,9.17616292,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,minute.maid,12160,9.405907156,0,2.62,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-11-29,8,dominicks,91456,11.42361326,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,minute.maid,30528,10.32639957,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,dominicks,23808,10.07777694,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-06,8,tropicana,6272,8.743850562,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,dominicks,89856,11.40596367,1,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,minute.maid,12096,9.400630097999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-13,8,tropicana,7168,8.877381955,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,minute.maid,16448,9.707959168,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,dominicks,12224,9.411156511,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-20,8,tropicana,29504,10.29228113,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,minute.maid,9344,9.142489705,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,dominicks,3776,8.236420527,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1990-12-27,8,tropicana,8704,9.071537969,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,tropicana,9280,9.135616826,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,minute.maid,16128,9.688312171,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-03,8,dominicks,13824,9.534161491,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,minute.maid,5376,8.589699882,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,dominicks,251072,12.43349503,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-10,8,tropicana,12224,9.411156511,0,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,minute.maid,6656,8.803273982999999,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,tropicana,10368,9.246479419,0,2.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-17,8,dominicks,4864,8.489616424,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,minute.maid,59712,10.99728828,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,dominicks,10176,9.227787286,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-24,8,tropicana,8128,9.00307017,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,tropicana,5952,8.691482577,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,minute.maid,9856,9.195835686,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-01-31,8,dominicks,105344,11.56498647,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,minute.maid,6720,8.812843434,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,dominicks,33600,10.42228135,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-07,8,tropicana,21696,9.984883191,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,dominicks,4736,8.462948177000001,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,minute.maid,4224,8.348537825,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-14,8,tropicana,7808,8.962904128,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,tropicana,8128,9.00307017,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,minute.maid,9728,9.182763604,0,2.12,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-21,8,dominicks,10304,9.240287448,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,tropicana,7424,8.912473275,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,minute.maid,40320,10.604602900000001,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-02-28,8,dominicks,5056,8.528330936,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,dominicks,179968,12.10053434,1,0.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,tropicana,5952,8.691482577,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-07,8,minute.maid,5120,8.540909718,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,minute.maid,19264,9.865993348,0,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,dominicks,4992,8.51559191,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-14,8,tropicana,7616,8.938006577000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,tropicana,5312,8.577723691000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,minute.maid,170432,12.04609167,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-21,8,dominicks,6400,8.764053269,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,minute.maid,39680,10.58860256,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,dominicks,14912,9.609921537,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-03-28,8,tropicana,161792,11.99406684,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,dominicks,34624,10.45230236,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,minute.maid,8128,9.00307017,1,2.17,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-04,8,tropicana,17280,9.757305042,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,tropicana,47040,10.75875358,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,minute.maid,9088,9.114710141,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-11,8,dominicks,10368,9.246479419,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,tropicana,14464,9.579418083,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,minute.maid,6720,8.812843434,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-18,8,dominicks,194880,12.18013926,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,tropicana,52928,10.87668778,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,dominicks,5696,8.647519453,1,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-04-25,8,minute.maid,7552,8.929567707999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,dominicks,7168,8.877381955,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,minute.maid,24768,10.11730778,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-02,8,tropicana,21184,9.961001459,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,tropicana,7360,8.903815212,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,minute.maid,183296,12.11885761,1,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-09,8,dominicks,2880,7.965545572999999,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,dominicks,12288,9.416378455,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,minute.maid,8896,9.093357017,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-05-16,8,tropicana,15744,9.664214619,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,dominicks,9280,9.135616826,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,tropicana,46912,10.75602879,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-06,8,minute.maid,6656,8.803273982999999,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,tropicana,18240,9.811372264,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,dominicks,25856,10.16029796,1,1.26,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-13,8,minute.maid,35456,10.47604777,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,dominicks,19264,9.865993348,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,minute.maid,17408,9.76468515,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-20,8,tropicana,6464,8.774003599999999,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,dominicks,6848,8.831711918,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,minute.maid,75520,11.2321528,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-06-27,8,tropicana,8512,9.049232212,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,tropicana,28416,10.25470765,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,minute.maid,21632,9.981928979,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-04,8,dominicks,12928,9.467150781,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,dominicks,44032,10.69267192,1,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,minute.maid,8384,9.034080407000001,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-11,8,tropicana,16960,9.738612909,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,minute.maid,9920,9.2023082,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,dominicks,25408,10.14281936,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-18,8,tropicana,8320,9.026417534,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-25,8,dominicks,38336,10.55414468,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-25,8,minute.maid,6592,8.793612072,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-07-25,8,tropicana,11136,9.317938383,1,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-01,8,tropicana,27712,10.22962081,0,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-01,8,minute.maid,7168,8.877381955,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-01,8,dominicks,152384,11.93415893,1,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-08,8,dominicks,54464,10.90529521,0,0.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-08,8,minute.maid,6208,8.733594062,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-08,8,tropicana,7744,8.954673629,0,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-15,8,minute.maid,30528,10.32639957,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-15,8,dominicks,47680,10.772267300000001,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-15,8,tropicana,5184,8.553332238,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-22,8,dominicks,14720,9.596962392,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-22,8,minute.maid,155840,11.95658512,1,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-22,8,tropicana,6272,8.743850562,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-29,8,tropicana,7744,8.954673629,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-29,8,dominicks,53248,10.88271552,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-08-29,8,minute.maid,10752,9.282847063,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-05,8,tropicana,53184,10.88151288,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-05,8,minute.maid,6976,8.850230966,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-05,8,dominicks,40576,10.61093204,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-12,8,dominicks,25856,10.16029796,0,1.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-12,8,tropicana,6784,8.822322178,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-12,8,minute.maid,31872,10.36948316,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-19,8,dominicks,24064,10.08847223,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-19,8,minute.maid,5312,8.577723691000001,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-19,8,tropicana,8000,8.987196821,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-26,8,tropicana,6592,8.793612072,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-26,8,minute.maid,33344,10.41463313,0,1.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-09-26,8,dominicks,15680,9.660141293999999,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-03,8,minute.maid,13504,9.510741217,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-03,8,dominicks,16576,9.715711145,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-03,8,tropicana,5248,8.565602331000001,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-10,8,dominicks,49664,10.8130356,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-10,8,tropicana,6592,8.793612072,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-10,8,minute.maid,13504,9.510741217,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-17,8,dominicks,10752,9.282847063,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-17,8,minute.maid,335808,12.72429485,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-17,8,tropicana,5888,8.68067166,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-24,8,tropicana,6336,8.754002933999999,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-24,8,dominicks,9792,9.189321005,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-24,8,minute.maid,13120,9.481893063,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-31,8,tropicana,5888,8.68067166,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-31,8,minute.maid,49664,10.8130356,0,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-10-31,8,dominicks,7104,8.868413285,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-07,8,dominicks,9216,9.128696383,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-07,8,tropicana,6080,8.712759975,0,2.94,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-07,8,minute.maid,10880,9.29468152,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-14,8,tropicana,6848,8.831711918,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-14,8,minute.maid,9984,9.208739091,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-14,8,dominicks,12608,9.442086812000001,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-21,8,tropicana,54016,10.89703558,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-21,8,minute.maid,9216,9.128696383,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-21,8,dominicks,16448,9.707959168,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-28,8,tropicana,10368,9.246479419,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-28,8,dominicks,27968,10.23881628,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-11-28,8,minute.maid,7680,8.946374826,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-05,8,minute.maid,7296,8.895081532,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-05,8,dominicks,37824,10.5406991,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-05,8,tropicana,5568,8.624791202,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-12,8,dominicks,33664,10.4241843,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-12,8,minute.maid,8192,9.010913347,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-12,8,tropicana,4864,8.489616424,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-19,8,tropicana,7232,8.886270902,0,2.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-19,8,minute.maid,6080,8.712759975,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-19,8,dominicks,17728,9.78290059,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-26,8,tropicana,15232,9.631153757,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-26,8,dominicks,25088,10.13014492,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1991-12-26,8,minute.maid,15040,9.618468598,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-02,8,minute.maid,9472,9.156095357,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-02,8,dominicks,13184,9.486759252,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-02,8,tropicana,47040,10.75875358,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-09,8,dominicks,3136,8.050703382,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-09,8,minute.maid,5888,8.68067166,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-09,8,tropicana,9280,9.135616826,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-16,8,tropicana,6720,8.812843434,0,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-16,8,minute.maid,14336,9.570529135,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-16,8,dominicks,5696,8.647519453,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-23,8,minute.maid,11712,9.368369236,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-23,8,dominicks,19008,9.852615222,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-23,8,tropicana,5056,8.528330936,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-30,8,minute.maid,7936,8.979164649,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-30,8,dominicks,121664,11.70901843,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-01-30,8,tropicana,6080,8.712759975,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-06,8,tropicana,10496,9.258749511,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-06,8,minute.maid,5184,8.553332238,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-06,8,dominicks,38848,10.56741187,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-13,8,minute.maid,7168,8.877381955,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-13,8,dominicks,6144,8.723231275,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-13,8,tropicana,39040,10.57234204,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-20,8,dominicks,13632,9.520175249,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-20,8,minute.maid,216064,12.28332994,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-20,8,tropicana,4480,8.407378325,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-27,8,tropicana,61760,11.03101119,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-27,8,minute.maid,15040,9.618468598,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-02-27,8,dominicks,9792,9.189321005,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-05,8,tropicana,15360,9.639522007,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-05,8,minute.maid,11840,9.379238908,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-05,8,dominicks,86912,11.37265139,1,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-12,8,minute.maid,25472,10.14533509,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-12,8,dominicks,24512,10.10691807,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-12,8,tropicana,54976,10.91465201,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-19,8,minute.maid,16384,9.704060528,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-19,8,dominicks,58048,10.96902553,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-19,8,tropicana,34368,10.44488118,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-26,8,tropicana,10752,9.282847063,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-26,8,minute.maid,20480,9.927204079,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-03-26,8,dominicks,13952,9.543378146,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-02,8,minute.maid,34688,10.45414909,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-02,8,dominicks,15168,9.626943225,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-02,8,tropicana,20096,9.908276069,1,2.5,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-09,8,dominicks,14592,9.588228712000001,0,1.58,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-09,8,minute.maid,22400,10.01681624,0,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-09,8,tropicana,16192,9.692272572,0,2.5,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-16,8,tropicana,6528,8.783855897,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-16,8,minute.maid,7808,8.962904128,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-16,8,dominicks,145088,11.88509573,1,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-23,8,tropicana,8320,9.026417534,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-23,8,minute.maid,48064,10.78028874,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-23,8,dominicks,43712,10.68537794,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-30,8,tropicana,30784,10.33475035,1,2.16,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-30,8,minute.maid,7360,8.903815212,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-04-30,8,dominicks,20608,9.933434629,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-07,8,tropicana,18048,9.800790154,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-07,8,minute.maid,6272,8.743850562,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-07,8,dominicks,18752,9.839055692,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-14,8,tropicana,12864,9.462187991,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-14,8,minute.maid,6400,8.764053269,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-14,8,dominicks,20160,9.911455722000001,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-21,8,tropicana,7168,8.877381955,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-21,8,minute.maid,54592,10.90764263,1,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-21,8,dominicks,18688,9.835636886,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-28,8,minute.maid,8128,9.00307017,0,2.39,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-28,8,tropicana,9024,9.107642974,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-05-28,8,dominicks,133824,11.80428078,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-04,8,tropicana,84992,11.35031241,1,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-04,8,minute.maid,4928,8.502688505,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-04,8,dominicks,63488,11.05860619,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-11,8,minute.maid,5440,8.60153434,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-11,8,tropicana,14144,9.557045785,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-11,8,dominicks,71040,11.17099838,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-25,8,tropicana,7488,8.921057017999999,1,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-25,8,minute.maid,5888,8.68067166,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-06-25,8,dominicks,15360,9.639522007,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-02,8,minute.maid,23872,10.0804615,1,2.02,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-02,8,dominicks,17728,9.78290059,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-02,8,tropicana,12352,9.421573272,0,2.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-09,8,tropicana,5696,8.647519453,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-09,8,minute.maid,6848,8.831711918,1,2.19,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-09,8,dominicks,24256,10.09641929,0,1.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-16,8,minute.maid,8192,9.010913347,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-16,8,dominicks,19968,9.901886271,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-16,8,tropicana,7680,8.946374826,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-23,8,dominicks,15936,9.67633598,0,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-23,8,minute.maid,55040,10.91581547,1,2.29,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-23,8,tropicana,5440,8.60153434,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-30,8,tropicana,5632,8.636219898,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-30,8,minute.maid,6528,8.783855897,0,2.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-07-30,8,dominicks,76352,11.24310951,1,1.49,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-06,8,tropicana,8960,9.100525506,1,2.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-06,8,minute.maid,6208,8.733594062,1,2.45,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-06,8,dominicks,17408,9.76468515,1,1.69,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-13,8,minute.maid,94720,11.45868045,1,1.99,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-13,8,tropicana,6080,8.712759975,0,2.89,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-13,8,dominicks,17536,9.77201119,0,1.79,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
1992-08-20,8,dominicks,31232,10.34919849,0,1.59,0.252394035,0.095173274,10.59700966,0.054227156,0.131749698,0.035243328,0.283074736,2.636332801,1.5,2.905384316,0.641015947
WeekStarting,Store,Brand,Quantity,logQuantity,Advert,Price,Age60,COLLEGE,INCOME,Hincome150,Large HH,Minorities,WorkingWoman,SSTRDIST,SSTRVOL,CPDIST5,CPWVOL5
1990-06-14,2,dominicks,10560,9.264828557000001,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,2,minute.maid,4480,8.407378325,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-06-14,2,tropicana,8256,9.018695487999999,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,dominicks,8000,8.987196821,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-07-26,2,tropicana,6144,8.723231275,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,tropicana,3840,8.253227646000001,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,minute.maid,20160,9.911455722000001,1,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-02,2,dominicks,6848,8.831711918,1,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,dominicks,2880,7.965545572999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,minute.maid,2688,7.896552702,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-09,2,tropicana,8000,8.987196821,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,dominicks,1600,7.377758908,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,minute.maid,3008,8.009030685,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-23,2,tropicana,8896,9.093357017,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,tropicana,7168,8.877381955,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,minute.maid,4672,8.449342525,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-08-30,2,dominicks,25344,10.140297300000002,1,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,dominicks,10752,9.282847063,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,minute.maid,2752,7.920083199,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-06,2,tropicana,10880,9.29468152,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,minute.maid,26176,10.17259824,1,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,dominicks,6656,8.803273982999999,0,1.89,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-13,2,tropicana,7744,8.954673629,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,dominicks,6592,8.793612072,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,minute.maid,3712,8.219326094,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-09-20,2,tropicana,8512,9.049232212,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,tropicana,5504,8.61323038,0,3.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,minute.maid,30656,10.33058368,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-11,2,dominicks,1728,7.454719948999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,tropicana,5888,8.68067166,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,minute.maid,3840,8.253227646000001,0,2.98,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-18,2,dominicks,33792,10.42797937,1,1.24,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,tropicana,8384,9.034080407000001,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,minute.maid,2816,7.943072717000001,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-10-25,2,dominicks,1920,7.560080465,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,tropicana,5952,8.691482577,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,minute.maid,23104,10.04776104,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-01,2,dominicks,8960,9.100525506,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,dominicks,11392,9.340666634,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,tropicana,6848,8.831711918,0,3.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-08,2,minute.maid,3392,8.129174997,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,tropicana,9216,9.128696383,0,3.87,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,minute.maid,26304,10.1774763,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-15,2,dominicks,28416,10.25470765,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,dominicks,17152,9.749870064,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,tropicana,12160,9.405907156,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-22,2,minute.maid,6336,8.754002933999999,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,tropicana,12672,9.447150114,0,2.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,minute.maid,9920,9.2023082,0,3.17,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-11-29,2,dominicks,26560,10.1871616,1,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,dominicks,6336,8.754002933999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,minute.maid,25280,10.13776885,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-06,2,tropicana,6528,8.783855897,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,dominicks,26368,10.17990643,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-13,2,minute.maid,14848,9.605620455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,tropicana,21120,9.957975738,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,minute.maid,12288,9.416378455,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-20,2,dominicks,896,6.797940412999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,tropicana,12416,9.426741242,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,minute.maid,6272,8.743850562,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1990-12-27,2,dominicks,1472,7.294377299,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,tropicana,9472,9.156095357,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,minute.maid,9152,9.121727714,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-03,2,dominicks,1344,7.2034055210000005,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,tropicana,17920,9.793672686,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,minute.maid,4160,8.333270353,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-10,2,dominicks,111680,11.62339292,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,tropicana,9408,9.14931567,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,minute.maid,10176,9.227787286,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-17,2,dominicks,1856,7.526178913,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,minute.maid,29056,10.27698028,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-24,2,dominicks,5568,8.624791202,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,tropicana,6912,8.841014311,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,minute.maid,7104,8.868413285,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-01-31,2,dominicks,32064,10.37548918,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,tropicana,16768,9.727227587,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,dominicks,4352,8.378390789,0,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-07,2,minute.maid,7488,8.921057017999999,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,minute.maid,4224,8.348537825,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-14,2,tropicana,6272,8.743850562,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,minute.maid,8960,9.100525506,0,2.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-21,2,dominicks,13760,9.529521112000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,tropicana,6144,8.723231275,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,minute.maid,22464,10.01966931,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-02-28,2,dominicks,43328,10.67655436,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,tropicana,7936,8.979164649,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,minute.maid,3840,8.253227646000001,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-07,2,dominicks,57600,10.96127785,1,1.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,tropicana,7808,8.962904128,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,minute.maid,12992,9.472089062,0,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-14,2,dominicks,704,6.556778356000001,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,minute.maid,70144,11.15830555,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-21,2,dominicks,6016,8.702177866,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,tropicana,42176,10.64960662,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,dominicks,10368,9.246479419,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-03-28,2,minute.maid,21248,9.964018052,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,dominicks,12608,9.442086812000001,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,minute.maid,5696,8.647519453,1,2.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-04,2,tropicana,4928,8.502688505,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,tropicana,29504,10.29228113,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,minute.maid,7680,8.946374826,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-11,2,dominicks,6336,8.754002933999999,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,tropicana,9984,9.208739091,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,minute.maid,6336,8.754002933999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-18,2,dominicks,140736,11.85464107,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,tropicana,35200,10.46880136,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,dominicks,960,6.866933285,1,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-04-25,2,minute.maid,8576,9.056722882999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,dominicks,1216,7.103322062999999,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,minute.maid,15104,9.622714887999999,0,2.09,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-02,2,tropicana,23936,10.08313888,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,tropicana,7104,8.868413285,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,minute.maid,76480,11.24478455,1,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-09,2,dominicks,1664,7.416979621,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,dominicks,4992,8.51559191,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,minute.maid,5056,8.528330936,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-16,2,tropicana,24512,10.10691807,1,2.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,tropicana,6336,8.754002933999999,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,minute.maid,4736,8.462948177000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-23,2,dominicks,27968,10.23881628,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,dominicks,12160,9.405907156,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,minute.maid,4480,8.407378325,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-05-30,2,tropicana,6080,8.712759975,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,tropicana,33536,10.42037477,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,minute.maid,4032,8.30201781,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-06,2,dominicks,2240,7.714231145,0,2.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,dominicks,5504,8.61323038,1,1.49,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,minute.maid,14784,9.601300794,1,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-13,2,tropicana,13248,9.491601877,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,tropicana,6208,8.733594062,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,dominicks,8832,9.086136769,0,1.29,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-20,2,minute.maid,12096,9.400630097999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,dominicks,2624,7.87245515,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,minute.maid,41792,10.64046021,1,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-06-27,2,tropicana,10624,9.270870872,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,tropicana,44672,10.70710219,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,minute.maid,10560,9.264828557000001,0,1.69,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-04,2,dominicks,10432,9.252633284,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,tropicana,20096,9.908276069,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,dominicks,8320,9.026417534,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-18,2,minute.maid,4224,8.348537825,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,dominicks,6784,8.822322178,0,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,minute.maid,2880,7.965545572999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-07-25,2,tropicana,9152,9.121727714,1,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,tropicana,21952,9.996613531,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,minute.maid,3968,8.286017467999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-01,2,dominicks,60544,11.01112565,1,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,dominicks,20608,9.933434629,0,0.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,minute.maid,3712,8.219326094,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-08,2,tropicana,13568,9.515469357999999,0,2.19,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,tropicana,4160,8.333270353,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,minute.maid,2816,7.943072717000001,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-08-29,2,dominicks,16064,9.684336023,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,tropicana,39424,10.58213005,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,minute.maid,4288,8.363575702999999,0,2.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-05,2,dominicks,12480,9.431882642,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,tropicana,5632,8.636219898,0,3.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,minute.maid,18240,9.811372264,1,1.99,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-12,2,dominicks,17024,9.742379392,0,1.39,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,dominicks,13440,9.505990614,1,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,minute.maid,7360,8.903815212,0,1.95,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-19,2,tropicana,9024,9.107642974,1,2.68,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,tropicana,6016,8.702177866,0,3.44,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,minute.maid,7808,8.962904128,0,1.83,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-09-26,2,dominicks,10112,9.221478116,0,1.59,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,dominicks,9088,9.114710141,0,1.56,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
1991-10-03,2,minute.maid,13504,9.510741217,0,1.79,0.232864734,0.248934934,10.55320518,0.463887065,0.103953406,0.114279949,0.303585347,2.110122129,1.142857143,1.927279669,0.37692661299999997
175 1991-10-03 2 tropicana 7744 8.954673629 0 3.14 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
176 1991-10-10 2 tropicana 6784 8.822322178 0 3.07 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
177 1991-10-10 2 dominicks 22848 10.03661887 1 1.49 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
178 1991-10-10 2 minute.maid 10048 9.215128888999999 0 1.91 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
179 1991-10-17 2 dominicks 6976 8.850230966 0 1.65 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
180 1991-10-17 2 minute.maid 135936 11.81993947 1 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
181 1991-10-17 2 tropicana 6784 8.822322178 0 3.07 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
182 1991-10-24 2 tropicana 6272 8.743850562 0 3.07 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
183 1991-10-24 2 minute.maid 5056 8.528330936 0 2.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
184 1991-10-24 2 dominicks 4160 8.333270353 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
185 1991-10-31 2 tropicana 5312 8.577723691000001 0 3.07 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
186 1991-10-31 2 minute.maid 27968 10.23881628 0 1.49 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
187 1991-10-31 2 dominicks 3328 8.110126802 0 1.83 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
188 1991-11-07 2 tropicana 9216 9.128696383 0 3.11 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
189 1991-11-07 2 minute.maid 4736 8.462948177000001 0 2.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
190 1991-11-07 2 dominicks 12096 9.400630097999999 1 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
191 1991-11-14 2 tropicana 7296 8.895081532 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
192 1991-11-14 2 minute.maid 7808 8.962904128 0 2.14 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
193 1991-11-14 2 dominicks 6208 8.733594062 0 1.76 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
194 1991-11-21 2 tropicana 34240 10.44114983 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
195 1991-11-21 2 minute.maid 12480 9.431882642 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
196 1991-11-21 2 dominicks 3008 8.009030685 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
197 1991-11-28 2 dominicks 19456 9.875910785 1 1.5 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
198 1991-11-28 2 minute.maid 9664 9.17616292 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
199 1991-11-28 2 tropicana 7168 8.877381955 0 2.64 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
200 1991-12-05 2 minute.maid 7168 8.877381955 0 2.06 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
201 1991-12-05 2 dominicks 16768 9.727227587 0 1.59 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
202 1991-12-05 2 tropicana 6080 8.712759975 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
203 1991-12-12 2 dominicks 13568 9.515469357999999 1 1.59 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
204 1991-12-12 2 minute.maid 4480 8.407378325 0 2.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
205 1991-12-12 2 tropicana 5120 8.540909718 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
206 1991-12-19 2 tropicana 8320 9.026417534 0 2.74 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
207 1991-12-19 2 minute.maid 5952 8.691482577 0 2.22 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
208 1991-12-19 2 dominicks 6080 8.712759975 0 1.61 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
209 1991-12-26 2 dominicks 10432 9.252633284 1 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
210 1991-12-26 2 minute.maid 21696 9.984883191 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
211 1991-12-26 2 tropicana 17728 9.78290059 0 2.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
212 1992-01-02 2 minute.maid 12032 9.395325046 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
213 1992-01-02 2 dominicks 11712 9.368369236 0 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
214 1992-01-02 2 tropicana 13120 9.481893063 0 2.35 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
215 1992-01-09 2 dominicks 4032 8.30201781 0 1.76 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
216 1992-01-09 2 minute.maid 7040 8.859363449 0 2.12 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
217 1992-01-09 2 tropicana 13120 9.481893063 0 2.29 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
218 1992-01-16 2 dominicks 6336 8.754002933999999 0 1.82 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
219 1992-01-16 2 tropicana 9792 9.189321005 0 2.43 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
220 1992-01-16 2 minute.maid 10240 9.234056899 1 2.49 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
221 1992-01-23 2 tropicana 3520 8.166216269 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
222 1992-01-23 2 minute.maid 6848 8.831711918 1 2.49 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
223 1992-01-23 2 dominicks 13632 9.520175249 0 1.47 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
224 1992-01-30 2 tropicana 5504 8.61323038 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
225 1992-01-30 2 minute.maid 3968 8.286017467999999 0 2.61 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
226 1992-01-30 2 dominicks 45120 10.71708089 0 1.29 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
227 1992-02-06 2 tropicana 6720 8.812843434 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
228 1992-02-06 2 minute.maid 5888 8.68067166 0 2.26 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
229 1992-02-06 2 dominicks 9984 9.208739091 0 1.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
230 1992-02-13 2 tropicana 20224 9.914625297 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
231 1992-02-13 2 dominicks 4800 8.476371197 0 1.82 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
232 1992-02-13 2 minute.maid 6208 8.733594062 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
233 1992-02-20 2 dominicks 11776 9.373818841 0 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
234 1992-02-20 2 minute.maid 72256 11.18797065 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
235 1992-02-20 2 tropicana 5056 8.528330936 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
236 1992-02-27 2 tropicana 43584 10.68244539 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
237 1992-02-27 2 minute.maid 11520 9.351839934 0 2.11 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
238 1992-02-27 2 dominicks 11584 9.357380115 0 1.54 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
239 1992-03-05 2 tropicana 25728 10.15533517 0 1.79 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
240 1992-03-05 2 minute.maid 5824 8.66974259 0 2.35 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
241 1992-03-05 2 dominicks 51264 10.84474403 1 1.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
242 1992-03-12 2 tropicana 31808 10.36747311 0 1.79 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
243 1992-03-12 2 minute.maid 19392 9.872615889 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
244 1992-03-12 2 dominicks 14976 9.614204199 0 1.44 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
245 1992-03-19 2 tropicana 20736 9.939626599 0 1.91 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
246 1992-03-19 2 minute.maid 9536 9.162829389 0 2.1 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
247 1992-03-19 2 dominicks 30784 10.33475035 0 1.59 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
248 1992-03-26 2 tropicana 15168 9.626943225 0 2.81 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
249 1992-03-26 2 minute.maid 5312 8.577723691000001 0 2.28 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
250 1992-03-26 2 dominicks 12480 9.431882642 0 1.6 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
251 1992-04-02 2 tropicana 28096 10.2433825 1 2.5 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
252 1992-04-02 2 dominicks 3264 8.090708716 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
253 1992-04-02 2 minute.maid 14528 9.583833101 1 1.9 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
254 1992-04-09 2 dominicks 8768 9.078864009 0 1.48 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
255 1992-04-09 2 minute.maid 12416 9.426741242 0 2.12 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
256 1992-04-09 2 tropicana 12416 9.426741242 0 2.58 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
257 1992-04-16 2 tropicana 5376 8.589699882 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
258 1992-04-16 2 minute.maid 5376 8.589699882 0 2.79 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
259 1992-04-16 2 dominicks 70848 11.16829202 1 1.29 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
260 1992-04-23 2 tropicana 9792 9.189321005 0 2.67 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
261 1992-04-23 2 minute.maid 19008 9.852615222 1 2.09 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
262 1992-04-23 2 dominicks 18560 9.828764006 0 1.42 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
263 1992-04-30 2 tropicana 16960 9.738612909 1 2.39 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
264 1992-04-30 2 minute.maid 3904 8.269756948 0 2.79 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
265 1992-04-30 2 dominicks 9152 9.121727714 0 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
266 1992-05-07 2 tropicana 8320 9.026417534 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
267 1992-05-07 2 minute.maid 6336 8.754002933999999 0 2.79 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
268 1992-05-07 2 dominicks 9600 9.169518378 0 2.0 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
269 1992-05-14 2 tropicana 6912 8.841014311 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
270 1992-05-14 2 minute.maid 5440 8.60153434 0 2.79 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
271 1992-05-14 2 dominicks 4800 8.476371197 0 2.09 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
272 1992-05-21 2 tropicana 6976 8.850230966 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
273 1992-05-21 2 minute.maid 22400 10.01681624 1 2.09 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
274 1992-05-21 2 dominicks 9664 9.17616292 0 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
275 1992-05-28 2 minute.maid 3968 8.286017467999999 0 2.84 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
276 1992-05-28 2 tropicana 7232 8.886270902 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
277 1992-05-28 2 dominicks 45568 10.726961 0 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
278 1992-06-04 2 tropicana 51520 10.84972536 1 2.49 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
279 1992-06-04 2 minute.maid 3264 8.090708716 0 2.89 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
280 1992-06-04 2 dominicks 20992 9.951896692 0 1.74 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
281 1992-06-11 2 minute.maid 4352 8.378390789 0 2.89 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
282 1992-06-11 2 tropicana 22272 10.01108556 0 2.21 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
283 1992-06-11 2 dominicks 6592 8.793612072 0 2.09 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
284 1992-06-18 2 dominicks 4992 8.51559191 0 2.05 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
285 1992-06-18 2 minute.maid 4480 8.407378325 0 2.89 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
286 1992-06-18 2 tropicana 46144 10.73952222 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
287 1992-06-25 2 tropicana 4352 8.378390789 1 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
288 1992-06-25 2 minute.maid 3840 8.253227646000001 0 2.52 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
289 1992-06-25 2 dominicks 8064 8.99516499 0 1.24 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
290 1992-07-02 2 tropicana 17280 9.757305042 0 2.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
291 1992-07-02 2 minute.maid 13312 9.496421162999999 1 2.0 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
292 1992-07-02 2 dominicks 7360 8.903815212 0 1.61 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
293 1992-07-09 2 tropicana 5696 8.647519453 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
294 1992-07-09 2 minute.maid 3776 8.236420527 1 2.33 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
295 1992-07-09 2 dominicks 10048 9.215128888999999 0 1.4 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
296 1992-07-16 2 tropicana 6848 8.831711918 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
297 1992-07-16 2 dominicks 10112 9.221478116 0 1.91 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
298 1992-07-16 2 minute.maid 4800 8.476371197 0 2.89 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
299 1992-07-23 2 dominicks 9152 9.121727714 0 1.69 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
300 1992-07-23 2 minute.maid 24960 10.12502982 1 2.29 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
301 1992-07-23 2 tropicana 4416 8.392989587999999 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
302 1992-07-30 2 tropicana 4672 8.449342525 0 3.16 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
303 1992-07-30 2 minute.maid 4544 8.42156296 0 2.86 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
304 1992-07-30 2 dominicks 36288 10.49924239 1 1.49 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
305 1992-08-06 2 tropicana 7168 8.877381955 1 3.09 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
306 1992-08-06 2 minute.maid 3968 8.286017467999999 1 2.81 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
307 1992-08-06 2 dominicks 3776 8.236420527 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
308 1992-08-13 2 tropicana 5056 8.528330936 0 3.19 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
309 1992-08-13 2 dominicks 3328 8.110126802 0 1.97 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
310 1992-08-13 2 minute.maid 49600 10.81174611 1 1.99 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
311 1992-08-20 2 dominicks 13824 9.534161491 0 1.36 0.232864734 0.248934934 10.55320518 0.463887065 0.103953406 0.114279949 0.303585347 2.110122129 1.142857143 1.927279669 0.37692661299999997
312 1990-06-14 5 dominicks 1792 7.491087594 1 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
313 1990-06-14 5 minute.maid 4224 8.348537825 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
314 1990-06-14 5 tropicana 5888 8.68067166 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
315 1990-06-28 5 minute.maid 4352 8.378390789 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
316 1990-06-28 5 dominicks 2496 7.82244473 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
317 1990-06-28 5 tropicana 6976 8.850230966 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
318 1990-07-05 5 dominicks 2944 7.98752448 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
319 1990-07-05 5 minute.maid 4928 8.502688505 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
320 1990-07-05 5 tropicana 6528 8.783855897 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
321 1990-07-12 5 dominicks 1024 6.931471806 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
322 1990-07-12 5 minute.maid 31168 10.34714721 1 2.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
323 1990-07-12 5 tropicana 4928 8.502688505 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
324 1990-07-26 5 dominicks 4224 8.348537825 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
325 1990-07-26 5 minute.maid 10048 9.215128888999999 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
326 1990-07-26 5 tropicana 5312 8.577723691000001 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
327 1990-08-02 5 minute.maid 21760 9.987828701 1 2.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
328 1990-08-02 5 tropicana 5120 8.540909718 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
329 1990-08-02 5 dominicks 4544 8.42156296 1 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
330 1990-08-09 5 dominicks 1728 7.454719948999999 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
331 1990-08-09 5 minute.maid 4544 8.42156296 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
332 1990-08-09 5 tropicana 7936 8.979164649 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
333 1990-08-16 5 tropicana 6080 8.712759975 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
334 1990-08-16 5 minute.maid 52224 10.86329744 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
335 1990-08-16 5 dominicks 1216 7.103322062999999 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
336 1990-08-23 5 dominicks 1152 7.049254841000001 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
337 1990-08-23 5 minute.maid 3584 8.184234774 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
338 1990-08-23 5 tropicana 4160 8.333270353 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
339 1990-08-30 5 minute.maid 5120 8.540909718 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
340 1990-08-30 5 tropicana 5888 8.68067166 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
341 1990-08-30 5 dominicks 30144 10.31374118 1 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
342 1990-09-06 5 dominicks 8960 9.100525506 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
343 1990-09-06 5 minute.maid 4416 8.392989587999999 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
344 1990-09-06 5 tropicana 9536 9.162829389 0 3.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
345 1990-09-13 5 tropicana 8320 9.026417534 0 3.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
346 1990-09-13 5 dominicks 8192 9.010913347 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
347 1990-09-13 5 minute.maid 30208 10.31586207 1 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
348 1990-09-20 5 dominicks 6528 8.783855897 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
349 1990-09-20 5 minute.maid 4160 8.333270353 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
350 1990-09-20 5 tropicana 8000 8.987196821 0 3.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
351 1990-09-27 5 dominicks 34688 10.45414909 1 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
352 1990-09-27 5 minute.maid 4992 8.51559191 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
353 1990-09-27 5 tropicana 5824 8.66974259 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
354 1990-10-04 5 dominicks 4672 8.449342525 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
355 1990-10-04 5 minute.maid 13952 9.543378146 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
356 1990-10-04 5 tropicana 10624 9.270870872 1 3.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
357 1990-10-11 5 tropicana 6656 8.803273982999999 0 3.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
358 1990-10-11 5 dominicks 1088 6.992096427000001 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
359 1990-10-11 5 minute.maid 47680 10.772267300000001 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
360 1990-10-18 5 tropicana 5184 8.553332238 0 3.51 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
361 1990-10-18 5 minute.maid 7616 8.938006577000001 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
362 1990-10-18 5 dominicks 69440 11.14821835 1 1.24 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
363 1990-10-25 5 tropicana 4928 8.502688505 0 3.51 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
364 1990-10-25 5 minute.maid 8896 9.093357017 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
365 1990-10-25 5 dominicks 1280 7.154615357000001 0 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
366 1990-11-01 5 tropicana 5888 8.68067166 0 3.51 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
367 1990-11-01 5 minute.maid 28544 10.25920204 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
368 1990-11-01 5 dominicks 35456 10.47604777 1 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
369 1990-11-08 5 tropicana 5312 8.577723691000001 0 3.51 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
370 1990-11-08 5 dominicks 13824 9.534161491 0 1.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
371 1990-11-08 5 minute.maid 5440 8.60153434 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
372 1990-11-15 5 tropicana 9984 9.208739091 0 3.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
373 1990-11-15 5 minute.maid 52416 10.86696717 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
374 1990-11-15 5 dominicks 14208 9.561560465 0 0.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
375 1990-11-22 5 tropicana 8448 9.041685006 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
376 1990-11-22 5 dominicks 29312 10.28575227 1 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
377 1990-11-22 5 minute.maid 11712 9.368369236 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
378 1990-11-29 5 tropicana 10880 9.29468152 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
379 1990-11-29 5 minute.maid 13952 9.543378146 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
380 1990-11-29 5 dominicks 52992 10.87789624 1 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
381 1990-12-06 5 dominicks 15680 9.660141293999999 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
382 1990-12-06 5 minute.maid 36160 10.49570882 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
383 1990-12-06 5 tropicana 5696 8.647519453 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
384 1990-12-13 5 tropicana 5696 8.647519453 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
385 1990-12-13 5 minute.maid 12864 9.462187991 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
386 1990-12-13 5 dominicks 43520 10.68097588 1 1.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
387 1990-12-20 5 tropicana 32384 10.38541975 0 2.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
388 1990-12-20 5 minute.maid 22208 10.00820786 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
389 1990-12-20 5 dominicks 3904 8.269756948 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
390 1990-12-27 5 tropicana 10752 9.282847063 0 2.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
391 1990-12-27 5 minute.maid 9984 9.208739091 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
392 1990-12-27 5 dominicks 896 6.797940412999999 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
393 1991-01-03 5 tropicana 6912 8.841014311 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
394 1991-01-03 5 minute.maid 14016 9.547954812999999 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
395 1991-01-03 5 dominicks 2240 7.714231145 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
396 1991-01-10 5 tropicana 13440 9.505990614 0 2.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
397 1991-01-10 5 minute.maid 6080 8.712759975 0 2.46 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
398 1991-01-10 5 dominicks 125760 11.74213061 1 0.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
399 1991-01-17 5 tropicana 7808 8.962904128 0 2.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
400 1991-01-17 5 minute.maid 7808 8.962904128 0 2.46 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
401 1991-01-17 5 dominicks 1408 7.249925537 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
402 1991-01-24 5 tropicana 5248 8.565602331000001 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
403 1991-01-24 5 minute.maid 40896 10.61878754 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
404 1991-01-24 5 dominicks 7232 8.886270902 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
405 1991-01-31 5 tropicana 6208 8.733594062 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
406 1991-01-31 5 minute.maid 6272 8.743850562 0 2.46 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
407 1991-01-31 5 dominicks 41216 10.62658181 1 1.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
408 1991-02-07 5 tropicana 21440 9.973013615 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
409 1991-02-07 5 minute.maid 7872 8.971067439 0 2.41 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
410 1991-02-07 5 dominicks 9024 9.107642974 0 1.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
411 1991-02-14 5 dominicks 1600 7.377758908 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
412 1991-02-14 5 tropicana 7360 8.903815212 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
413 1991-02-14 5 minute.maid 6144 8.723231275 0 2.41 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
414 1991-02-21 5 tropicana 6720 8.812843434 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
415 1991-02-21 5 minute.maid 8448 9.041685006 0 2.41 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
416 1991-02-21 5 dominicks 2496 7.82244473 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
417 1991-02-28 5 tropicana 6656 8.803273982999999 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
418 1991-02-28 5 minute.maid 18688 9.835636886 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
419 1991-02-28 5 dominicks 6336 8.754002933999999 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
420 1991-03-07 5 tropicana 6016 8.702177866 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
421 1991-03-07 5 minute.maid 6272 8.743850562 0 2.46 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
422 1991-03-07 5 dominicks 56384 10.93994071 1 1.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
423 1991-03-14 5 tropicana 6144 8.723231275 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
424 1991-03-14 5 minute.maid 12096 9.400630097999999 0 2.46 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
425 1991-03-14 5 dominicks 1600 7.377758908 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
426 1991-03-21 5 tropicana 4928 8.502688505 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
427 1991-03-21 5 minute.maid 73216 11.20116926 1 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
428 1991-03-21 5 dominicks 2944 7.98752448 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
429 1991-03-28 5 tropicana 67712 11.1230187 1 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
430 1991-03-28 5 minute.maid 18944 9.849242538 0 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
431 1991-03-28 5 dominicks 13504 9.510741217 1 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
432 1991-04-04 5 dominicks 5376 8.589699882 0 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
433 1991-04-04 5 tropicana 8640 9.064157862 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
434 1991-04-04 5 minute.maid 6400 8.764053269 1 2.46 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
435 1991-04-11 5 tropicana 35520 10.477851199999998 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
436 1991-04-11 5 minute.maid 8640 9.064157862 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
437 1991-04-11 5 dominicks 6656 8.803273982999999 0 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
438 1991-04-18 5 tropicana 9664 9.17616292 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
439 1991-04-18 5 minute.maid 7296 8.895081532 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
440 1991-04-18 5 dominicks 95680 11.46876457 1 0.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
441 1991-04-25 5 tropicana 49088 10.80136989 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
442 1991-04-25 5 minute.maid 12480 9.431882642 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
443 1991-04-25 5 dominicks 896 6.797940412999999 1 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
444 1991-05-02 5 dominicks 1728 7.454719948999999 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
445 1991-05-02 5 minute.maid 14144 9.557045785 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
446 1991-05-02 5 tropicana 14912 9.609921537 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
447 1991-05-09 5 minute.maid 88256 11.38799696 1 1.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
448 1991-05-09 5 tropicana 6464 8.774003599999999 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
449 1991-05-09 5 dominicks 1280 7.154615357000001 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
450 1991-05-16 5 dominicks 5696 8.647519453 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
451 1991-05-16 5 minute.maid 6848 8.831711918 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
452 1991-05-16 5 tropicana 25024 10.12759064 1 2.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
453 1991-05-23 5 minute.maid 7808 8.962904128 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
454 1991-05-23 5 tropicana 6272 8.743850562 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
455 1991-05-23 5 dominicks 28288 10.25019297 1 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
456 1991-05-30 5 dominicks 4864 8.489616424 0 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
457 1991-05-30 5 minute.maid 6272 8.743850562 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
458 1991-05-30 5 tropicana 5056 8.528330936 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
459 1991-06-06 5 minute.maid 6144 8.723231275 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
460 1991-06-06 5 tropicana 47616 10.77092412 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
461 1991-06-06 5 dominicks 2880 7.965545572999999 0 2.09 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
462 1991-06-13 5 dominicks 5760 8.658692754 1 1.41 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
463 1991-06-13 5 minute.maid 27776 10.23192762 1 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
464 1991-06-13 5 tropicana 13888 9.538780437 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
465 1991-06-20 5 tropicana 6144 8.723231275 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
466 1991-06-20 5 minute.maid 20800 9.942708266 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
467 1991-06-20 5 dominicks 15040 9.618468598 0 1.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
468 1991-06-27 5 dominicks 5120 8.540909718 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
469 1991-06-27 5 minute.maid 45696 10.72976605 1 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
470 1991-06-27 5 tropicana 9344 9.142489705 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
471 1991-07-04 5 minute.maid 14336 9.570529135 0 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
472 1991-07-04 5 tropicana 32896 10.40110635 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
473 1991-07-04 5 dominicks 3264 8.090708716 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
474 1991-07-11 5 dominicks 9536 9.162829389 1 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
475 1991-07-11 5 minute.maid 4928 8.502688505 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
476 1991-07-11 5 tropicana 21056 9.954940834 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
477 1991-07-18 5 tropicana 15360 9.639522007 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
478 1991-07-18 5 minute.maid 4608 8.435549202 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
479 1991-07-18 5 dominicks 6208 8.733594062 0 1.59 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
480 1991-07-25 5 dominicks 6592 8.793612072 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
481 1991-07-25 5 tropicana 8000 8.987196821 1 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
482 1991-07-25 5 minute.maid 5248 8.565602331000001 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
483 1991-08-01 5 tropicana 21120 9.957975738 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
484 1991-08-01 5 dominicks 63552 11.05961375 1 0.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
485 1991-08-01 5 minute.maid 4224 8.348537825 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
486 1991-08-08 5 dominicks 27968 10.23881628 0 0.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
487 1991-08-08 5 minute.maid 4288 8.363575702999999 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
488 1991-08-08 5 tropicana 11904 9.384629757 0 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
489 1991-08-15 5 minute.maid 16896 9.734832187 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
490 1991-08-15 5 tropicana 5056 8.528330936 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
491 1991-08-15 5 dominicks 21760 9.987828701 1 1.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
492 1991-08-22 5 dominicks 2688 7.896552702 0 1.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
493 1991-08-22 5 minute.maid 77184 11.25394746 1 1.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
494 1991-08-22 5 tropicana 4608 8.435549202 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
495 1991-08-29 5 tropicana 6016 8.702177866 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
496 1991-08-29 5 minute.maid 5184 8.553332238 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
497 1991-08-29 5 dominicks 10432 9.252633284 0 1.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
498 1991-09-05 5 tropicana 50752 10.83470631 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
499 1991-09-05 5 minute.maid 5248 8.565602331000001 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
500 1991-09-05 5 dominicks 9792 9.189321005 0 1.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
501 1991-09-12 5 minute.maid 20672 9.936535407000001 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
502 1991-09-12 5 tropicana 5632 8.636219898 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
503 1991-09-12 5 dominicks 8448 9.041685006 0 1.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
504 1991-09-26 5 tropicana 6400 8.764053269 0 3.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
505 1991-09-26 5 dominicks 6912 8.841014311 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
506 1991-09-26 5 minute.maid 12352 9.421573272 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
507 1991-10-03 5 dominicks 8256 9.018695487999999 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
508 1991-10-03 5 minute.maid 12032 9.395325046 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
509 1991-10-03 5 tropicana 5440 8.60153434 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
510 1991-10-10 5 minute.maid 13440 9.505990614 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
511 1991-10-10 5 dominicks 28672 10.26367632 1 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
512 1991-10-10 5 tropicana 8128 9.00307017 0 2.94 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
513 1991-10-24 5 tropicana 7232 8.886270902 0 2.94 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
514 1991-10-24 5 minute.maid 5824 8.66974259 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
515 1991-10-24 5 dominicks 4416 8.392989587999999 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
516 1991-10-31 5 tropicana 7168 8.877381955 0 2.94 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
517 1991-10-31 5 minute.maid 50112 10.82201578 0 1.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
518 1991-10-31 5 dominicks 1856 7.526178913 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
519 1991-11-07 5 minute.maid 5184 8.553332238 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
520 1991-11-07 5 tropicana 7872 8.971067439 0 2.94 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
521 1991-11-07 5 dominicks 6528 8.783855897 1 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
522 1991-11-14 5 tropicana 7552 8.929567707999999 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
523 1991-11-14 5 minute.maid 8384 9.034080407000001 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
524 1991-11-14 5 dominicks 6080 8.712759975 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
525 1991-11-21 5 tropicana 69504 11.14913958 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
526 1991-11-21 5 dominicks 3456 8.14786713 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
527 1991-11-21 5 minute.maid 10112 9.221478116 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
528 1991-11-28 5 dominicks 25856 10.16029796 1 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
529 1991-11-28 5 minute.maid 8384 9.034080407000001 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
530 1991-11-28 5 tropicana 8960 9.100525506 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
531 1991-12-05 5 tropicana 6912 8.841014311 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
532 1991-12-05 5 dominicks 25728 10.15533517 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
533 1991-12-05 5 minute.maid 11456 9.346268889 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
534 1991-12-12 5 dominicks 23552 10.06696602 1 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
535 1991-12-12 5 minute.maid 5952 8.691482577 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
536 1991-12-12 5 tropicana 6656 8.803273982999999 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
537 1991-12-19 5 tropicana 8192 9.010913347 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
538 1991-12-19 5 dominicks 2944 7.98752448 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
539 1991-12-19 5 minute.maid 8512 9.049232212 0 2.26 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
540 1991-12-26 5 dominicks 5888 8.68067166 1 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
541 1991-12-26 5 minute.maid 27968 10.23881628 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
542 1991-12-26 5 tropicana 13440 9.505990614 0 2.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
543 1992-01-02 5 tropicana 12160 9.405907156 0 2.39 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
544 1992-01-02 5 dominicks 6848 8.831711918 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
545 1992-01-02 5 minute.maid 24000 10.08580911 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
546 1992-01-09 5 dominicks 1792 7.491087594 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
547 1992-01-09 5 minute.maid 6848 8.831711918 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
548 1992-01-09 5 tropicana 11840 9.379238908 0 2.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
549 1992-01-16 5 tropicana 8640 9.064157862 0 2.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
550 1992-01-16 5 dominicks 5248 8.565602331000001 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
551 1992-01-16 5 minute.maid 15104 9.622714887999999 1 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
552 1992-01-23 5 tropicana 5888 8.68067166 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
553 1992-01-23 5 minute.maid 11392 9.340666634 1 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
554 1992-01-23 5 dominicks 16768 9.727227587 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
555 1992-01-30 5 tropicana 7424 8.912473275 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
556 1992-01-30 5 minute.maid 5824 8.66974259 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
557 1992-01-30 5 dominicks 52160 10.8620712 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
558 1992-02-06 5 tropicana 5632 8.636219898 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
559 1992-02-06 5 minute.maid 7488 8.921057017999999 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
560 1992-02-06 5 dominicks 16640 9.719564714 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
561 1992-02-13 5 tropicana 33600 10.42228135 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
562 1992-02-13 5 minute.maid 8320 9.026417534 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
563 1992-02-13 5 dominicks 1344 7.2034055210000005 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
564 1992-02-20 5 dominicks 4608 8.435549202 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
565 1992-02-20 5 tropicana 5376 8.589699882 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
566 1992-02-20 5 minute.maid 99904 11.511965 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
567 1992-02-27 5 tropicana 54272 10.90176372 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
568 1992-02-27 5 minute.maid 6976 8.850230966 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
569 1992-02-27 5 dominicks 12672 9.447150114 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
570 1992-03-05 5 tropicana 33600 10.42228135 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
571 1992-03-05 5 minute.maid 9984 9.208739091 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
572 1992-03-05 5 dominicks 48640 10.79220152 1 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
573 1992-03-12 5 tropicana 24448 10.10430369 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
574 1992-03-12 5 minute.maid 32832 10.39915893 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
575 1992-03-12 5 dominicks 13248 9.491601877 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
576 1992-03-19 5 tropicana 22784 10.03381381 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
577 1992-03-19 5 minute.maid 8128 9.00307017 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
578 1992-03-19 5 dominicks 29248 10.28356647 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
579 1992-03-26 5 tropicana 19008 9.852615222 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
580 1992-03-26 5 minute.maid 6464 8.774003599999999 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
581 1992-03-26 5 dominicks 4608 8.435549202 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
582 1992-04-02 5 tropicana 15808 9.66827142 1 2.5 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
583 1992-04-02 5 minute.maid 36800 10.51325312 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
584 1992-04-02 5 dominicks 3136 8.050703382 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
585 1992-04-09 5 dominicks 13184 9.486759252 0 1.58 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
586 1992-04-09 5 tropicana 14144 9.557045785 0 2.5 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
587 1992-04-09 5 minute.maid 12928 9.467150781 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
588 1992-04-16 5 tropicana 9600 9.169518378 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
589 1992-04-16 5 minute.maid 7424 8.912473275 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
590 1992-04-16 5 dominicks 67712 11.1230187 1 1.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
591 1992-04-23 5 tropicana 10112 9.221478116 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
592 1992-04-23 5 minute.maid 34176 10.43927892 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
593 1992-04-23 5 dominicks 18880 9.84585844 0 1.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
594 1992-04-30 5 minute.maid 4160 8.333270353 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
595 1992-04-30 5 tropicana 31872 10.36948316 1 2.24 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
596 1992-04-30 5 dominicks 6208 8.733594062 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
597 1992-05-07 5 tropicana 9280 9.135616826 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
598 1992-05-07 5 minute.maid 5952 8.691482577 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
599 1992-05-07 5 dominicks 5952 8.691482577 0 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
600 1992-05-14 5 tropicana 7680 8.946374826 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
601 1992-05-14 5 minute.maid 6528 8.783855897 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
602 1992-05-14 5 dominicks 4160 8.333270353 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
603 1992-05-21 5 tropicana 8704 9.071537969 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
604 1992-05-21 5 minute.maid 30656 10.33058368 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
605 1992-05-21 5 dominicks 23488 10.06424493 0 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
606 1992-05-28 5 tropicana 9920 9.2023082 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
607 1992-05-28 5 dominicks 60480 11.01006801 0 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
608 1992-05-28 5 minute.maid 6656 8.803273982999999 0 2.66 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
609 1992-06-04 5 tropicana 91968 11.42919597 1 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
610 1992-06-04 5 minute.maid 4416 8.392989587999999 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
611 1992-06-04 5 dominicks 20416 9.924074186 0 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
612 1992-06-11 5 tropicana 44096 10.69412435 0 2.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
613 1992-06-11 5 dominicks 6336 8.754002933999999 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
614 1992-06-11 5 minute.maid 5696 8.647519453 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
615 1992-06-25 5 minute.maid 5696 8.647519453 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
616 1992-06-25 5 tropicana 7296 8.895081532 1 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
617 1992-06-25 5 dominicks 1408 7.249925537 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
618 1992-07-02 5 tropicana 12928 9.467150781 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
619 1992-07-02 5 minute.maid 39680 10.58860256 1 2.01 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
620 1992-07-02 5 dominicks 4672 8.449342525 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
621 1992-07-09 5 tropicana 6848 8.831711918 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
622 1992-07-09 5 minute.maid 6208 8.733594062 1 2.19 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
623 1992-07-09 5 dominicks 19520 9.87919486 0 1.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
624 1992-07-16 5 tropicana 8064 8.99516499 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
625 1992-07-16 5 minute.maid 7872 8.971067439 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
626 1992-07-16 5 dominicks 7872 8.971067439 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
627 1992-07-23 5 dominicks 5184 8.553332238 0 1.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
628 1992-07-23 5 tropicana 4992 8.51559191 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
629 1992-07-23 5 minute.maid 54528 10.90646961 1 2.29 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
630 1992-07-30 5 tropicana 7360 8.903815212 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
631 1992-07-30 5 minute.maid 6400 8.764053269 0 2.69 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
632 1992-07-30 5 dominicks 42240 10.65112292 1 1.49 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
633 1992-08-06 5 tropicana 8384 9.034080407000001 1 2.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
634 1992-08-06 5 minute.maid 5888 8.68067166 1 2.65 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
635 1992-08-06 5 dominicks 6592 8.793612072 1 1.89 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
636 1992-08-13 5 tropicana 8832 9.086136769 0 2.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
637 1992-08-13 5 minute.maid 56384 10.93994071 1 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
638 1992-08-13 5 dominicks 2112 7.655390645 0 1.99 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
639 1992-08-20 5 dominicks 21248 9.964018052 0 1.79 0.117368032 0.32122573 10.92237097 0.535883355 0.103091585 0.053875277 0.410568032 3.801997814 0.681818182 1.600573425 0.736306837
640 1990-06-14 8 dominicks 14336 9.570529135 1 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
641 1990-06-14 8 minute.maid 6080 8.712759975 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
642 1990-06-14 8 tropicana 8896 9.093357017 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
643 1990-06-21 8 dominicks 6400 8.764053269 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
644 1990-06-21 8 minute.maid 51968 10.85838342 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
645 1990-06-21 8 tropicana 7296 8.895081532 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
646 1990-06-28 8 tropicana 10368 9.246479419 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
647 1990-06-28 8 minute.maid 4928 8.502688505 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
648 1990-06-28 8 dominicks 3968 8.286017467999999 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
649 1990-07-05 8 dominicks 4352 8.378390789 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
650 1990-07-05 8 minute.maid 5312 8.577723691000001 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
651 1990-07-05 8 tropicana 6976 8.850230966 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
652 1990-07-12 8 tropicana 6464 8.774003599999999 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
653 1990-07-12 8 dominicks 3520 8.166216269 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
654 1990-07-12 8 minute.maid 39424 10.58213005 1 2.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
655 1990-07-19 8 tropicana 8192 9.010913347 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
656 1990-07-19 8 dominicks 6464 8.774003599999999 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
657 1990-07-19 8 minute.maid 5568 8.624791202 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
658 1990-07-26 8 dominicks 5952 8.691482577 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
659 1990-07-26 8 minute.maid 14592 9.588228712000001 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
660 1990-07-26 8 tropicana 7936 8.979164649 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
661 1990-08-02 8 tropicana 6656 8.803273982999999 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
662 1990-08-02 8 minute.maid 22208 10.00820786 1 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
663 1990-08-02 8 dominicks 8832 9.086136769 1 2.09 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
664 1990-08-09 8 dominicks 7232 8.886270902 0 2.09 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
665 1990-08-09 8 minute.maid 5760 8.658692754 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
666 1990-08-09 8 tropicana 8256 9.018695487999999 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
667 1990-08-16 8 tropicana 5568 8.624791202 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
668 1990-08-16 8 minute.maid 54016 10.89703558 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
669 1990-08-16 8 dominicks 5504 8.61323038 0 2.09 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
670 1990-08-23 8 dominicks 4800 8.476371197 0 2.09 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
671 1990-08-23 8 minute.maid 5824 8.66974259 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
672 1990-08-23 8 tropicana 7488 8.921057017999999 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
673 1990-08-30 8 tropicana 6144 8.723231275 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
674 1990-08-30 8 minute.maid 6528 8.783855897 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
675 1990-08-30 8 dominicks 52672 10.87183928 1 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
676 1990-09-06 8 dominicks 16448 9.707959168 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
677 1990-09-06 8 minute.maid 5440 8.60153434 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
678 1990-09-06 8 tropicana 11008 9.30637756 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
679 1990-09-13 8 minute.maid 36544 10.50627229 1 2.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
680 1990-09-13 8 dominicks 19072 9.85597657 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
681 1990-09-13 8 tropicana 5760 8.658692754 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
682 1990-09-20 8 dominicks 13376 9.501217335 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
683 1990-09-20 8 minute.maid 3776 8.236420527 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
684 1990-09-20 8 tropicana 10112 9.221478116 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
685 1990-09-27 8 tropicana 8448 9.041685006 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
686 1990-09-27 8 minute.maid 5504 8.61323038 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
687 1990-09-27 8 dominicks 61440 11.02581637 1 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
688 1990-10-04 8 tropicana 8448 9.041685006 1 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
689 1990-10-04 8 dominicks 13760 9.529521112000001 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
690 1990-10-04 8 minute.maid 12416 9.426741242 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
691 1990-10-11 8 minute.maid 53696 10.89109379 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
692 1990-10-11 8 dominicks 3136 8.050703382 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
693 1990-10-11 8 tropicana 7424 8.912473275 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
694 1990-10-18 8 tropicana 5824 8.66974259 0 3.04 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
695 1990-10-18 8 minute.maid 5696 8.647519453 0 2.51 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
696 1990-10-18 8 dominicks 186176 12.13444774 1 1.14 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
697 1990-10-25 8 tropicana 6656 8.803273982999999 0 3.04 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
698 1990-10-25 8 minute.maid 4864 8.489616424 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
699 1990-10-25 8 dominicks 3712 8.219326094 0 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
700 1990-11-01 8 tropicana 6272 8.743850562 0 3.04 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
701 1990-11-01 8 minute.maid 37184 10.52363384 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
702 1990-11-01 8 dominicks 35776 10.48503256 1 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
703 1990-11-08 8 tropicana 6912 8.841014311 0 3.04 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
704 1990-11-08 8 minute.maid 5504 8.61323038 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
705 1990-11-08 8 dominicks 26880 10.1991378 0 1.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
706 1990-11-15 8 tropicana 10496 9.258749511 0 3.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
707 1990-11-15 8 minute.maid 51008 10.83973776 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
708 1990-11-15 8 dominicks 71680 11.17996705 0 0.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
709 1990-11-22 8 tropicana 11840 9.379238908 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
710 1990-11-22 8 minute.maid 11072 9.312174678 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
711 1990-11-22 8 dominicks 25088 10.13014492 1 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
712 1990-11-29 8 tropicana 9664 9.17616292 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
713 1990-11-29 8 minute.maid 12160 9.405907156 0 2.62 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
714 1990-11-29 8 dominicks 91456 11.42361326 1 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
715 1990-12-06 8 minute.maid 30528 10.32639957 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
716 1990-12-06 8 dominicks 23808 10.07777694 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
717 1990-12-06 8 tropicana 6272 8.743850562 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
718 1990-12-13 8 dominicks 89856 11.40596367 1 1.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
719 1990-12-13 8 minute.maid 12096 9.400630097999999 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
720 1990-12-13 8 tropicana 7168 8.877381955 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
721 1990-12-20 8 minute.maid 16448 9.707959168 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
722 1990-12-20 8 dominicks 12224 9.411156511 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
723 1990-12-20 8 tropicana 29504 10.29228113 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
724 1990-12-27 8 minute.maid 9344 9.142489705 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
725 1990-12-27 8 dominicks 3776 8.236420527 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
726 1990-12-27 8 tropicana 8704 9.071537969 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
727 1991-01-03 8 tropicana 9280 9.135616826 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
728 1991-01-03 8 minute.maid 16128 9.688312171 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
729 1991-01-03 8 dominicks 13824 9.534161491 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
730 1991-01-10 8 minute.maid 5376 8.589699882 0 2.17 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
731 1991-01-10 8 dominicks 251072 12.43349503 1 0.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
732 1991-01-10 8 tropicana 12224 9.411156511 0 2.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
733 1991-01-17 8 minute.maid 6656 8.803273982999999 0 2.17 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
734 1991-01-17 8 tropicana 10368 9.246479419 0 2.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
735 1991-01-17 8 dominicks 4864 8.489616424 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
736 1991-01-24 8 minute.maid 59712 10.99728828 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
737 1991-01-24 8 dominicks 10176 9.227787286 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
738 1991-01-24 8 tropicana 8128 9.00307017 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
739 1991-01-31 8 tropicana 5952 8.691482577 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
740 1991-01-31 8 minute.maid 9856 9.195835686 0 2.17 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
741 1991-01-31 8 dominicks 105344 11.56498647 1 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
742 1991-02-07 8 minute.maid 6720 8.812843434 0 2.12 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
743 1991-02-07 8 dominicks 33600 10.42228135 0 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
744 1991-02-07 8 tropicana 21696 9.984883191 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
745 1991-02-14 8 dominicks 4736 8.462948177000001 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
746 1991-02-14 8 minute.maid 4224 8.348537825 0 2.12 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
747 1991-02-14 8 tropicana 7808 8.962904128 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
748 1991-02-21 8 tropicana 8128 9.00307017 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
749 1991-02-21 8 minute.maid 9728 9.182763604 0 2.12 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
750 1991-02-21 8 dominicks 10304 9.240287448 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
751 1991-02-28 8 tropicana 7424 8.912473275 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
752 1991-02-28 8 minute.maid 40320 10.604602900000001 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
753 1991-02-28 8 dominicks 5056 8.528330936 1 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
754 1991-03-07 8 dominicks 179968 12.10053434 1 0.94 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
755 1991-03-07 8 tropicana 5952 8.691482577 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
756 1991-03-07 8 minute.maid 5120 8.540909718 0 2.17 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
757 1991-03-14 8 minute.maid 19264 9.865993348 0 2.17 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
758 1991-03-14 8 dominicks 4992 8.51559191 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
759 1991-03-14 8 tropicana 7616 8.938006577000001 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
760 1991-03-21 8 tropicana 5312 8.577723691000001 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
761 1991-03-21 8 minute.maid 170432 12.04609167 1 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
762 1991-03-21 8 dominicks 6400 8.764053269 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
763 1991-03-28 8 minute.maid 39680 10.58860256 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
764 1991-03-28 8 dominicks 14912 9.609921537 1 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
765 1991-03-28 8 tropicana 161792 11.99406684 1 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
766 1991-04-04 8 dominicks 34624 10.45230236 0 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
767 1991-04-04 8 minute.maid 8128 9.00307017 1 2.17 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
768 1991-04-04 8 tropicana 17280 9.757305042 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
769 1991-04-11 8 tropicana 47040 10.75875358 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
770 1991-04-11 8 minute.maid 9088 9.114710141 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
771 1991-04-11 8 dominicks 10368 9.246479419 0 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
772 1991-04-18 8 tropicana 14464 9.579418083 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
773 1991-04-18 8 minute.maid 6720 8.812843434 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
774 1991-04-18 8 dominicks 194880 12.18013926 1 0.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
775 1991-04-25 8 tropicana 52928 10.87668778 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
776 1991-04-25 8 dominicks 5696 8.647519453 1 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
777 1991-04-25 8 minute.maid 7552 8.929567707999999 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
778 1991-05-02 8 dominicks 7168 8.877381955 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
779 1991-05-02 8 minute.maid 24768 10.11730778 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
780 1991-05-02 8 tropicana 21184 9.961001459 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
781 1991-05-09 8 tropicana 7360 8.903815212 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
782 1991-05-09 8 minute.maid 183296 12.11885761 1 1.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
783 1991-05-09 8 dominicks 2880 7.965545572999999 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
784 1991-05-16 8 dominicks 12288 9.416378455 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
785 1991-05-16 8 minute.maid 8896 9.093357017 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
786 1991-05-16 8 tropicana 15744 9.664214619 1 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
787 1991-06-06 8 dominicks 9280 9.135616826 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
788 1991-06-06 8 tropicana 46912 10.75602879 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
789 1991-06-06 8 minute.maid 6656 8.803273982999999 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
790 1991-06-13 8 tropicana 18240 9.811372264 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
791 1991-06-13 8 dominicks 25856 10.16029796 1 1.26 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
792 1991-06-13 8 minute.maid 35456 10.47604777 1 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
793 1991-06-20 8 dominicks 19264 9.865993348 0 1.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
794 1991-06-20 8 minute.maid 17408 9.76468515 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
795 1991-06-20 8 tropicana 6464 8.774003599999999 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
796 1991-06-27 8 dominicks 6848 8.831711918 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
797 1991-06-27 8 minute.maid 75520 11.2321528 1 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
798 1991-06-27 8 tropicana 8512 9.049232212 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
799 1991-07-04 8 tropicana 28416 10.25470765 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
800 1991-07-04 8 minute.maid 21632 9.981928979 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
801 1991-07-04 8 dominicks 12928 9.467150781 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
802 1991-07-11 8 dominicks 44032 10.69267192 1 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
803 1991-07-11 8 minute.maid 8384 9.034080407000001 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
804 1991-07-11 8 tropicana 16960 9.738612909 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
805 1991-07-18 8 minute.maid 9920 9.2023082 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
806 1991-07-18 8 dominicks 25408 10.14281936 0 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
807 1991-07-18 8 tropicana 8320 9.026417534 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
808 1991-07-25 8 dominicks 38336 10.55414468 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
809 1991-07-25 8 minute.maid 6592 8.793612072 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
810 1991-07-25 8 tropicana 11136 9.317938383 1 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
811 1991-08-01 8 tropicana 27712 10.22962081 0 2.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
812 1991-08-01 8 minute.maid 7168 8.877381955 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
813 1991-08-01 8 dominicks 152384 11.93415893 1 0.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
814 1991-08-08 8 dominicks 54464 10.90529521 0 0.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
815 1991-08-08 8 minute.maid 6208 8.733594062 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
816 1991-08-08 8 tropicana 7744 8.954673629 0 2.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
817 1991-08-15 8 minute.maid 30528 10.32639957 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
818 1991-08-15 8 dominicks 47680 10.772267300000001 1 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
819 1991-08-15 8 tropicana 5184 8.553332238 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
820 1991-08-22 8 dominicks 14720 9.596962392 0 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
821 1991-08-22 8 minute.maid 155840 11.95658512 1 1.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
822 1991-08-22 8 tropicana 6272 8.743850562 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
823 1991-08-29 8 tropicana 7744 8.954673629 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
824 1991-08-29 8 dominicks 53248 10.88271552 0 1.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
825 1991-08-29 8 minute.maid 10752 9.282847063 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
826 1991-09-05 8 tropicana 53184 10.88151288 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
827 1991-09-05 8 minute.maid 6976 8.850230966 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
828 1991-09-05 8 dominicks 40576 10.61093204 0 1.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
829 1991-09-12 8 dominicks 25856 10.16029796 0 1.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
830 1991-09-12 8 tropicana 6784 8.822322178 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
831 1991-09-12 8 minute.maid 31872 10.36948316 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
832 1991-09-19 8 dominicks 24064 10.08847223 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
833 1991-09-19 8 minute.maid 5312 8.577723691000001 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
834 1991-09-19 8 tropicana 8000 8.987196821 1 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
835 1991-09-26 8 tropicana 6592 8.793612072 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
836 1991-09-26 8 minute.maid 33344 10.41463313 0 1.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
837 1991-09-26 8 dominicks 15680 9.660141293999999 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
838 1991-10-03 8 minute.maid 13504 9.510741217 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
839 1991-10-03 8 dominicks 16576 9.715711145 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
840 1991-10-03 8 tropicana 5248 8.565602331000001 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
841 1991-10-10 8 dominicks 49664 10.8130356 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
842 1991-10-10 8 tropicana 6592 8.793612072 0 2.94 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
843 1991-10-10 8 minute.maid 13504 9.510741217 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
844 1991-10-17 8 dominicks 10752 9.282847063 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
845 1991-10-17 8 minute.maid 335808 12.72429485 1 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
846 1991-10-17 8 tropicana 5888 8.68067166 0 2.94 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
847 1991-10-24 8 tropicana 6336 8.754002933999999 0 2.94 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
848 1991-10-24 8 dominicks 9792 9.189321005 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
849 1991-10-24 8 minute.maid 13120 9.481893063 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
850 1991-10-31 8 tropicana 5888 8.68067166 0 2.94 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
851 1991-10-31 8 minute.maid 49664 10.8130356 0 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
852 1991-10-31 8 dominicks 7104 8.868413285 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
853 1991-11-07 8 dominicks 9216 9.128696383 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
854 1991-11-07 8 tropicana 6080 8.712759975 0 2.94 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
855 1991-11-07 8 minute.maid 10880 9.29468152 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
856 1991-11-14 8 tropicana 6848 8.831711918 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
857 1991-11-14 8 minute.maid 9984 9.208739091 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
858 1991-11-14 8 dominicks 12608 9.442086812000001 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
859 1991-11-21 8 tropicana 54016 10.89703558 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
860 1991-11-21 8 minute.maid 9216 9.128696383 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
861 1991-11-21 8 dominicks 16448 9.707959168 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
862 1991-11-28 8 tropicana 10368 9.246479419 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
863 1991-11-28 8 dominicks 27968 10.23881628 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
864 1991-11-28 8 minute.maid 7680 8.946374826 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
865 1991-12-05 8 minute.maid 7296 8.895081532 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
866 1991-12-05 8 dominicks 37824 10.5406991 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
867 1991-12-05 8 tropicana 5568 8.624791202 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
868 1991-12-12 8 dominicks 33664 10.4241843 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
869 1991-12-12 8 minute.maid 8192 9.010913347 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
870 1991-12-12 8 tropicana 4864 8.489616424 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
871 1991-12-19 8 tropicana 7232 8.886270902 0 2.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
872 1991-12-19 8 minute.maid 6080 8.712759975 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
873 1991-12-19 8 dominicks 17728 9.78290059 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
874 1991-12-26 8 tropicana 15232 9.631153757 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
875 1991-12-26 8 dominicks 25088 10.13014492 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
876 1991-12-26 8 minute.maid 15040 9.618468598 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
877 1992-01-02 8 minute.maid 9472 9.156095357 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
878 1992-01-02 8 dominicks 13184 9.486759252 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
879 1992-01-02 8 tropicana 47040 10.75875358 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
880 1992-01-09 8 dominicks 3136 8.050703382 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
881 1992-01-09 8 minute.maid 5888 8.68067166 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
882 1992-01-09 8 tropicana 9280 9.135616826 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
883 1992-01-16 8 tropicana 6720 8.812843434 0 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
884 1992-01-16 8 minute.maid 14336 9.570529135 1 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
885 1992-01-16 8 dominicks 5696 8.647519453 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
886 1992-01-23 8 minute.maid 11712 9.368369236 1 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
887 1992-01-23 8 dominicks 19008 9.852615222 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
888 1992-01-23 8 tropicana 5056 8.528330936 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
889 1992-01-30 8 minute.maid 7936 8.979164649 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
890 1992-01-30 8 dominicks 121664 11.70901843 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
891 1992-01-30 8 tropicana 6080 8.712759975 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
892 1992-02-06 8 tropicana 10496 9.258749511 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
893 1992-02-06 8 minute.maid 5184 8.553332238 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
894 1992-02-06 8 dominicks 38848 10.56741187 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
895 1992-02-13 8 minute.maid 7168 8.877381955 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
896 1992-02-13 8 dominicks 6144 8.723231275 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
897 1992-02-13 8 tropicana 39040 10.57234204 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
898 1992-02-20 8 dominicks 13632 9.520175249 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
899 1992-02-20 8 minute.maid 216064 12.28332994 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
900 1992-02-20 8 tropicana 4480 8.407378325 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
901 1992-02-27 8 tropicana 61760 11.03101119 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
902 1992-02-27 8 minute.maid 15040 9.618468598 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
903 1992-02-27 8 dominicks 9792 9.189321005 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
904 1992-03-05 8 tropicana 15360 9.639522007 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
905 1992-03-05 8 minute.maid 11840 9.379238908 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
906 1992-03-05 8 dominicks 86912 11.37265139 1 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
907 1992-03-12 8 minute.maid 25472 10.14533509 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
908 1992-03-12 8 dominicks 24512 10.10691807 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
909 1992-03-12 8 tropicana 54976 10.91465201 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
910 1992-03-19 8 minute.maid 16384 9.704060528 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
911 1992-03-19 8 dominicks 58048 10.96902553 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
912 1992-03-19 8 tropicana 34368 10.44488118 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
913 1992-03-26 8 tropicana 10752 9.282847063 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
914 1992-03-26 8 minute.maid 20480 9.927204079 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
915 1992-03-26 8 dominicks 13952 9.543378146 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
916 1992-04-02 8 minute.maid 34688 10.45414909 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
917 1992-04-02 8 dominicks 15168 9.626943225 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
918 1992-04-02 8 tropicana 20096 9.908276069 1 2.5 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
919 1992-04-09 8 dominicks 14592 9.588228712000001 0 1.58 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
920 1992-04-09 8 minute.maid 22400 10.01681624 0 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
921 1992-04-09 8 tropicana 16192 9.692272572 0 2.5 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
922 1992-04-16 8 tropicana 6528 8.783855897 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
923 1992-04-16 8 minute.maid 7808 8.962904128 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
924 1992-04-16 8 dominicks 145088 11.88509573 1 1.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
925 1992-04-23 8 tropicana 8320 9.026417534 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
926 1992-04-23 8 minute.maid 48064 10.78028874 1 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
927 1992-04-23 8 dominicks 43712 10.68537794 0 1.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
928 1992-04-30 8 tropicana 30784 10.33475035 1 2.16 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
929 1992-04-30 8 minute.maid 7360 8.903815212 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
930 1992-04-30 8 dominicks 20608 9.933434629 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
931 1992-05-07 8 tropicana 18048 9.800790154 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
932 1992-05-07 8 minute.maid 6272 8.743850562 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
933 1992-05-07 8 dominicks 18752 9.839055692 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
934 1992-05-14 8 tropicana 12864 9.462187991 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
935 1992-05-14 8 minute.maid 6400 8.764053269 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
936 1992-05-14 8 dominicks 20160 9.911455722000001 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
937 1992-05-21 8 tropicana 7168 8.877381955 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
938 1992-05-21 8 minute.maid 54592 10.90764263 1 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
939 1992-05-21 8 dominicks 18688 9.835636886 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
940 1992-05-28 8 minute.maid 8128 9.00307017 0 2.39 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
941 1992-05-28 8 tropicana 9024 9.107642974 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
942 1992-05-28 8 dominicks 133824 11.80428078 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
943 1992-06-04 8 tropicana 84992 11.35031241 1 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
944 1992-06-04 8 minute.maid 4928 8.502688505 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
945 1992-06-04 8 dominicks 63488 11.05860619 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
946 1992-06-11 8 minute.maid 5440 8.60153434 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
947 1992-06-11 8 tropicana 14144 9.557045785 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
948 1992-06-11 8 dominicks 71040 11.17099838 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
949 1992-06-25 8 tropicana 7488 8.921057017999999 1 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
950 1992-06-25 8 minute.maid 5888 8.68067166 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
951 1992-06-25 8 dominicks 15360 9.639522007 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
952 1992-07-02 8 minute.maid 23872 10.0804615 1 2.02 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
953 1992-07-02 8 dominicks 17728 9.78290059 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
954 1992-07-02 8 tropicana 12352 9.421573272 0 2.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
955 1992-07-09 8 tropicana 5696 8.647519453 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
956 1992-07-09 8 minute.maid 6848 8.831711918 1 2.19 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
957 1992-07-09 8 dominicks 24256 10.09641929 0 1.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
958 1992-07-16 8 minute.maid 8192 9.010913347 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
959 1992-07-16 8 dominicks 19968 9.901886271 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
960 1992-07-16 8 tropicana 7680 8.946374826 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
961 1992-07-23 8 dominicks 15936 9.67633598 0 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
962 1992-07-23 8 minute.maid 55040 10.91581547 1 2.29 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
963 1992-07-23 8 tropicana 5440 8.60153434 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
964 1992-07-30 8 tropicana 5632 8.636219898 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
965 1992-07-30 8 minute.maid 6528 8.783855897 0 2.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
966 1992-07-30 8 dominicks 76352 11.24310951 1 1.49 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
967 1992-08-06 8 tropicana 8960 9.100525506 1 2.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
968 1992-08-06 8 minute.maid 6208 8.733594062 1 2.45 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
969 1992-08-06 8 dominicks 17408 9.76468515 1 1.69 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
970 1992-08-13 8 minute.maid 94720 11.45868045 1 1.99 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
971 1992-08-13 8 tropicana 6080 8.712759975 0 2.89 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
972 1992-08-13 8 dominicks 17536 9.77201119 0 1.79 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947
973 1992-08-20 8 dominicks 31232 10.34919849 0 1.59 0.252394035 0.095173274 10.59700966 0.054227156 0.131749698 0.035243328 0.283074736 2.636332801 1.5 2.905384316 0.641015947


@@ -1,66 +0,0 @@
import argparse
import json
from azureml.core import Run, Model, Workspace
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice

script_file_name = 'score.py'
conda_env_file_name = 'myenv.yml'

print("In deploy.py")

parser = argparse.ArgumentParser()
parser.add_argument("--time_column_name", type=str, help="time column name")
parser.add_argument("--group_column_names", type=str, help="group column names")
parser.add_argument("--model_names", type=str, help="model names")
parser.add_argument("--service_name", type=str, help="service name")
args = parser.parse_args()

# replace the group column names in scoring script to the ones set by user
print("Update group_column_names")
print(args.group_column_names)
with open(script_file_name, 'r') as cefr:
    content = cefr.read()
with open(script_file_name, 'w') as cefw:
    content = content.replace('<<groups>>', args.group_column_names.rstrip())
    cefw.write(content.replace('<<time_colname>>', args.time_column_name.rstrip()))
with open(script_file_name, 'r') as cefr1:
    content1 = cefr1.read()
print(content1)

model_list = json.loads(args.model_names)
print(model_list)

run = Run.get_context()
ws = run.experiment.workspace

deployment_config = AciWebservice.deploy_configuration(
    cpu_cores=1,
    memory_gb=2,
    tags={"method": "grouping"},
    description='grouping demo aci deployment'
)
inference_config = InferenceConfig(
    entry_script=script_file_name,
    runtime='python',
    conda_file=conda_env_file_name
)

models = []
for model_name in model_list:
    models.append(Model(ws, name=model_name))

service = Model.deploy(
    ws,
    name=args.service_name,
    models=models,
    inference_config=inference_config,
    deployment_config=deployment_config
)
service.wait_for_deployment(True)
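
Once service.wait_for_deployment(True) returns, the ACI endpoint can be queried with a JSON-serialized dataframe; the score.py script further down parses the payload with pd.read_json. A minimal, hypothetical sketch (the workspace config, service name and column names below are assumptions, not part of the original pipeline script):

import pandas as pd
from azureml.core import Workspace
from azureml.core.webservice import Webservice

ws = Workspace.from_config()                    # assumes a config.json is available
service = Webservice(ws, name='grouping-demo')  # hypothetical service name
# The scoring script reads the request with pd.read_json; pandas serializes
# datetimes to epoch milliseconds, which matches the unit='ms' parsing there.
test_df = pd.DataFrame({'WeekStarting': pd.to_datetime(['1992-09-03']),
                        'Store': [8],
                        'Brand': ['tropicana']})
print(service.run(test_df.to_json()))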


@@ -1,11 +0,0 @@
name: automl_grouping_env
dependencies:
# The python interpreter version.
# Currently Azure ML only supports 3.5.2 and later.
- python=3.6.2
- numpy>=1.16.0,<=1.16.2
- scikit-learn>=0.19.0,<=0.20.3
- conda-forge::fbprophet==0.5


@@ -1,55 +0,0 @@
import json
import pickle
import re
import numpy as np
import pandas as pd
from sklearn.externals import joblib
from sklearn.linear_model import Ridge
from azureml.core.model import Model
import azureml.train.automl


def init():
    global models
    models = {}
    global group_columns_str
    group_columns_str = "<<groups>>"
    global time_column_name
    time_column_name = "<<time_colname>>"
    global group_columns
    group_columns = group_columns_str.split("#####")
    global valid_chars
    valid_chars = re.compile('[^a-zA-Z0-9-]')


def run(raw_data):
    try:
        data = pd.read_json(raw_data)
        # Make sure we have correct time points.
        data[time_column_name] = pd.to_datetime(data[time_column_name], unit='ms')
        dfs = []
        for grain, df_one in data.groupby(group_columns):
            if isinstance(grain, int):
                cur_group = str(grain)
            elif isinstance(grain, str):
                cur_group = grain
            else:
                cur_group = "#####".join(list(grain))
            cur_group = valid_chars.sub('', cur_group)
            print("Query model for group {}".format(cur_group))
            if cur_group not in models:
                model_path = Model.get_model_path(cur_group)
                model = joblib.load(model_path)
                models[cur_group] = model
            _, xtrans = models[cur_group].forecast(df_one, np.repeat(np.nan, len(df_one)))
            dfs.append(xtrans)
        df_ret = pd.concat(dfs)
        df_ret.reset_index(drop=False, inplace=True)
        return json.dumps({'predictions': df_ret.to_json()})
    except Exception as e:
        error = str(e)
        return error
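
The model lookup key in run() is the group values joined with '#####' and then stripped of any character outside [a-zA-Z0-9-], which is the name used to fetch the registered model for that group. A small standalone sketch of that key construction (column names and values are hypothetical):

import re
import pandas as pd

valid_chars = re.compile('[^a-zA-Z0-9-]')
df = pd.DataFrame({'Store': ['8', '8'], 'Brand': ['minute.maid', 'tropicana']})
for grain, _ in df.groupby(['Store', 'Brand']):
    cur_group = "#####".join(list(grain))   # same join as in run() above
    print(valid_chars.sub('', cur_group))   # prints '8minutemaid', then '8tropicana'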


@@ -1,22 +0,0 @@
import argparse
from azureml.core import Run, Model
parser = argparse.ArgumentParser()
parser.add_argument("--model_name")
parser.add_argument("--model_path")
args = parser.parse_args()
run = Run.get_context()
ws = run.experiment.workspace
print('retrieved ws: {}'.format(ws))
print('begin register model')
model = Model.register(
    workspace=ws,
    model_path=args.model_path,
    model_name=args.model_name
)
print('model registered: {}'.format(model))
print('complete')


@@ -68,6 +68,7 @@
"import logging\n",
"import warnings\n",
"\n",
"import azureml.core\n",
"from azureml.core.dataset import Dataset\n",
"from pandas.tseries.frequencies import to_offset\n",
"from azureml.core.compute import AmlCompute\n",
@@ -81,13 +82,29 @@
"np.set_printoptions(precision=4, suppress=True, linewidth=120)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"from azureml.core.workspace import Workspace\n",
"from azureml.core.experiment import Experiment\n",
"from azureml.train.automl import AutoMLConfig\n",
@@ -100,7 +117,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['SKU'] = ws.sku\n",
@@ -258,29 +274,22 @@
"metadata": {},
"outputs": [],
"source": [
"amlcompute_cluster_name = \"cpu-cluster-fcfn\"\n",
" \n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 6)\n",
"# Choose a name for your CPU cluster\n",
"amlcompute_cluster_name = \"fcfn-cluster\"\n",
"\n",
" # Create the cluster.\\n\",\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D2_V2',\n",
" max_nodes=6)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)"
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -335,7 +344,7 @@
"automl_config = AutoMLConfig(task='forecasting',\n",
" debug_log='automl_forecasting_function.log',\n",
" primary_metric='normalized_root_mean_squared_error',\n",
" experiment_timeout_minutes=15,\n",
" experiment_timeout_hours=0.25,\n",
" enable_early_stopping=True,\n",
" training_data=train_data,\n",
" compute_target=compute_target,\n",
@@ -346,9 +355,24 @@
" label_column_name=target_label,\n",
" **time_series_settings)\n",
"\n",
"remote_run = experiment.submit(automl_config, show_output=False)\n",
"remote_run.wait_for_completion()\n",
"\n",
"remote_run = experiment.submit(automl_config, show_output=False)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"remote_run.wait_for_completion()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Retrieve the best model to use it further.\n",
"_, fitted_model = remote_run.get_output()"
]
@@ -377,9 +401,7 @@
"\n",
"![Forecasting after training](forecast_function_at_train.png)\n",
"\n",
"The `X_test` and `y_query` below, taken together, form the **forecast request**. The two are interpreted as aligned - `y_query` could actally be a column in `X_test`. `NaN`s in `y_query` are the question marks. These will be filled with the forecasts.\n",
"\n",
"When the forecast period immediately follows the training period, the models retain the last few points of data. You can simply fill `y_query` filled with question marks - the model has the data for the lookback already.\n"
"We use `X_test` as a **forecast request** to generate the predictions."
]
},
{
@@ -408,8 +430,7 @@
"metadata": {},
"outputs": [],
"source": [
"y_query = np.repeat(np.NaN, X_test.shape[0])\n",
"y_pred_no_gap, xy_nogap = fitted_model.forecast(X_test, y_query)\n",
"y_pred_no_gap, xy_nogap = fitted_model.forecast(X_test)\n",
"\n",
"# xy_nogap contains the predictions in the _automl_target_col column.\n",
"# Those same numbers are output in y_pred_no_gap\n",
@@ -437,7 +458,7 @@
"metadata": {},
"outputs": [],
"source": [
"quantiles = fitted_model.forecast_quantiles(X_test, y_query)\n",
"quantiles = fitted_model.forecast_quantiles(X_test)\n",
"quantiles"
]
},
@@ -460,10 +481,10 @@
"# specify which quantiles you would like \n",
"fitted_model.quantiles = [0.01, 0.5, 0.95]\n",
"# use forecast_quantiles function, not the forecast() one\n",
"y_pred_quantiles = fitted_model.forecast_quantiles(X_test, y_query)\n",
"y_pred_quantiles = fitted_model.forecast_quantiles(X_test)\n",
"\n",
"# it all nicely aligns column-wise\n",
"pd.concat([X_test.reset_index(), pd.DataFrame({'query' : y_query}), y_pred_quantiles], axis=1)"
"# quantile forecasts returned in a Dataframe along with the time and grain columns \n",
"y_pred_quantiles"
]
},
{
@@ -539,9 +560,7 @@
"outputs": [],
"source": [
"try: \n",
" y_query = y_away.copy()\n",
" y_query.fill(np.NaN)\n",
" y_pred_away, xy_away = fitted_model.forecast(X_away, y_query)\n",
" y_pred_away, xy_away = fitted_model.forecast(X_away)\n",
" xy_away\n",
"except Exception as e:\n",
" print(e)"
@@ -551,7 +570,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"How should we read that eror message? The forecast origin is at the last time the model saw an actual value of `y` (the target). That was at the end of the training data! Because the model received all `NaN` (and not an actual target value), it is attempting to forecast from the end of training data. But the requested forecast periods are past the maximum horizon. We need to provide a define `y` value to establish the forecast origin.\n",
"How should we read that eror message? The forecast origin is at the last time the model saw an actual value of `y` (the target). That was at the end of the training data! The model is attempting to forecast from the end of training data. But the requested forecast periods are past the maximum horizon. We need to provide a define `y` value to establish the forecast origin.\n",
"\n",
"We will use this helper function to take the required amount of context from the data preceding the testing data. It's definition is intentionally simplified to keep the idea in the clear."
]
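
In code terms, the idea is to pass recent known target values together with NaNs for the periods to be forecast, so the forecast origin moves to the end of the provided context. A hedged sketch, not the notebook's actual helper (the variable names here are assumptions):

import numpy as np
import pandas as pd

def forecast_with_context(fitted_model, X_context, y_context, X_away):
    # Known y values anchor the forecast origin; NaNs mark the periods to predict.
    X_pred = pd.concat([X_context, X_away])
    y_pred_in = np.concatenate([y_context, np.full(len(X_away), np.nan)])
    return fitted_model.forecast(X_pred, y_pred_in)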
@@ -706,7 +725,7 @@
"metadata": {
"authors": [
{
"name": "erwright, nirovins"
"name": "erwright"
}
],
"category": "tutorial",
@@ -740,7 +759,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
"version": "3.6.8"
},
"tags": [
"Forecasting",


@@ -0,0 +1,10 @@
name: auto-ml-forecasting-function
dependencies:
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib
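
If needed, a conda spec like the one above can be turned into an Azure ML Environment; a hypothetical sketch (the local file name is an assumption):

from azureml.core import Environment

env = Environment.from_conda_specification(
    name='auto-ml-forecasting-function',
    file_path='auto-ml-forecasting-function.yml')  # assumed file name for the spec above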


@@ -1,11 +0,0 @@
name: automl-forecasting-function
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- pandas_ml
- statsmodels
- matplotlib

Binary file not shown (image added, 24 KiB).

Binary file not shown (image added, 24 KiB).


@@ -40,7 +40,7 @@
"## Introduction\n",
"In this example, we use AutoML to train, select, and operationalize a time-series forecasting model for multiple time-series.\n",
"\n",
"Make sure you have executed the [configuration notebook](../configuration.ipynb) before running this notebook.\n",
"Make sure you have executed the [configuration notebook](../../../configuration.ipynb) before running this notebook.\n",
"\n",
"The examples in the follow code samples use the University of Chicago's Dominick's Finer Foods dataset to forecast orange juice sales. Dominick's was a grocery chain in the Chicago metropolitan area."
]
@@ -65,7 +65,25 @@
"\n",
"from azureml.core.workspace import Workspace\n",
"from azureml.core.experiment import Experiment\n",
"from azureml.train.automl import AutoMLConfig"
"from azureml.train.automl import AutoMLConfig\n",
"from azureml.automl.core.featurization import FeaturizationConfig"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
@@ -89,7 +107,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['SKU'] = ws.sku\n",
@@ -118,35 +135,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-oj\"\n",
"# Choose a name for your CPU cluster\n",
"amlcompute_cluster_name = \"oj-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 6)\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D2_V2',\n",
" max_nodes=6)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
" \n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -315,17 +319,54 @@
"target_column_name = 'Quantity'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Customization\n",
"\n",
"The featurization customization in forecasting is an advanced feature in AutoML which allows our customers to change the default forecasting featurization behaviors and column types through `FeaturizationConfig`. The supported scenarios include,\n",
"1. Column purposes update: Override feature type for the specified column. Currently supports DateTime, Categorical and Numeric. This customization can be used in the scenario that the type of the column cannot correctly reflect its purpose. Some numerical columns, for instance, can be treated as Categorical columns which need to be converted to categorical while some can be treated as epoch timestamp which need to be converted to datetime. To tell our SDK to correctly preprocess these columns, a configuration need to be add with the columns and their desired types.\n",
"2. Transformer parameters update: Currently supports parameter change for Imputer only. User can customize imputation methods, the supported methods are constant for target data and mean, median, most frequent and constant for training data. This customization can be used for the scenario that our customers know which imputation methods fit best to the input data. For instance, some datasets use NaN to represent 0 which the correct behavior should impute all the missing value with 0. To achieve this behavior, these columns need to be configured as constant imputation with `fill_value` 0.\n",
"3. Drop columns: Columns to drop from being featurized. These usually are the columns which are leaky or the columns contain no useful data.\n",
"\n",
"This step requires an Enterprise workspace to gain access to this feature. To learn more about creating an Enterprise workspace or upgrading to an Enterprise workspace from the Azure portal, please visit our [Workspace page.](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"featurization_config = FeaturizationConfig()\n",
"featurization_config.drop_columns = ['logQuantity'] # 'logQuantity' is a leaky feature, so we remove it.\n",
"# Force the CPWVOL5 feature to be numeric type.\n",
"featurization_config.add_column_purpose('CPWVOL5', 'Numeric')\n",
"# Fill missing values in the target column, Quantity, with zeros.\n",
"featurization_config.add_transformer_params('Imputer', ['Quantity'], {\"strategy\": \"constant\", \"fill_value\": 0})\n",
"# Fill missing values in the INCOME column with median value.\n",
"featurization_config.add_transformer_params('Imputer', ['INCOME'], {\"strategy\": \"median\"})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Train\n",
"\n",
"The AutoMLConfig object defines the settings and data for an AutoML training job. Here, we set necessary inputs like the task type, the number of AutoML iterations to try, the training data, and cross-validation parameters. \n",
"The [AutoMLConfig](https://docs.microsoft.com/en-us/python/api/azureml-train-automl-client/azureml.train.automl.automlconfig.automlconfig?view=azure-ml-py) object defines the settings and data for an AutoML training job. Here, we set necessary inputs like the task type, the number of AutoML iterations to try, the training data, and cross-validation parameters.\n",
"\n",
"For forecasting tasks, there are some additional parameters that can be set: the name of the column holding the date/time, the grain column names, and the maximum forecast horizon. A time column is required for forecasting, while the grain is optional. If a grain is not given, AutoML assumes that the whole dataset is a single time-series. We also pass a list of columns to drop prior to modeling. The _logQuantity_ column is completely correlated with the target quantity, so it must be removed to prevent a target leak.\n",
"For forecasting tasks, there are some additional parameters that can be set: the name of the column holding the date/time, the grain column names, and the maximum forecast horizon. A time column is required for forecasting, while the grain is optional. If grain columns are not given, AutoML assumes that the whole dataset is a single time-series. We also pass a list of columns to drop prior to modeling. The _logQuantity_ column is completely correlated with the target quantity, so it must be removed to prevent a target leak.\n",
"\n",
"The forecast horizon is given in units of the time-series frequency; for instance, the OJ series frequency is weekly, so a horizon of 20 means that a trained model will estimate sales up to 20 weeks beyond the latest date in the training data for each series. In this example, we set the maximum horizon to the number of samples per series in the test set (n_test_periods). Generally, the value of this parameter will be dictated by business needs. For example, a demand planning application that estimates the next month of sales should set the horizon according to suitable planning time-scales. Please see the [energy_demand notebook](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/automated-machine-learning/forecasting-energy-demand) for more discussion of forecast horizon.\n",
"\n",
"We note here that AutoML can sweep over two types of time-series models:\n",
"* Models that are trained for each series such as ARIMA and Facebook's Prophet. Note that these models are only available for [Enterprise Edition Workspaces](https://docs.microsoft.com/en-us/azure/machine-learning/how-to-manage-workspace#upgrade).\n",
"* Models trained across multiple time-series using a regression approach.\n",
"\n",
"In the first case, AutoML loops over all time-series in your dataset and trains one model (e.g. AutoArima or Prophet, as the case may be) for each series. This can result in long runtimes to train these models if there are a lot of series in the data. One way to mitigate this problem is to fit models for different series in parallel if you have multiple compute cores available. To enable this behavior, set the `max_cores_per_iteration` parameter in your AutoMLConfig as shown in the example in the next cell. \n",
"\n",
"The forecast horizon is given in units of the time-series frequency; for instance, the OJ series frequency is weekly, so a horizon of 20 means that a trained model will estimate sales up to 20 weeks beyond the latest date in the training data for each series. In this example, we set the maximum horizon to the number of samples per series in the test set (n_test_periods). Generally, the value of this parameter will be dictated by business needs. For example, a demand planning organizaion that needs to estimate the next month of sales would set the horizon accordingly. Please see the [energy_demand notebook](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/automated-machine-learning/forecasting-energy-demand) for more discussion of forecast horizon.\n",
"\n",
"Finally, a note about the cross-validation (CV) procedure for time-series data. AutoML uses out-of-sample error estimates to select a best pipeline/model, so it is important that the CV fold splitting is done correctly. Time-series can violate the basic statistical assumptions of the canonical K-Fold CV strategy, so AutoML implements a [rolling origin validation](https://robjhyndman.com/hyndsight/tscv/) procedure to create CV folds for time-series data. To use this procedure, you just need to specify the desired number of CV folds in the AutoMLConfig object. It is also possible to bypass CV and use your own validation set by setting the *validation_data* parameter of AutoMLConfig.\n",
"\n",
@@ -335,7 +376,7 @@
"|-|-|\n",
"|**task**|forecasting|\n",
"|**primary_metric**|This is the metric that you want to optimize.<br> Forecasting supports the following primary metrics <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>\n",
"|**experiment_timeout_minutes**|Experimentation timeout in minutes.|\n",
"|**experiment_timeout_hours**|Experimentation timeout in hours.|\n",
"|**enable_early_stopping**|If early stopping is on, training will stop when the primary metric is no longer improving.|\n",
"|**training_data**|Input dataset, containing both features and label column.|\n",
"|**label_column_name**|The name of the label column.|\n",
@@ -346,8 +387,9 @@
"|**debug_log**|Log file path for writing debugging information|\n",
"|**time_column_name**|Name of the datetime column in the input data|\n",
"|**grain_column_names**|Name(s) of the columns defining individual series in the input data|\n",
"|**drop_column_names**|Name(s) of columns to drop prior to modeling|\n",
"|**max_horizon**|Maximum desired forecast horizon in units of time-series frequency|"
"|**max_horizon**|Maximum desired forecast horizon in units of time-series frequency|\n",
"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether featurization step should be done automatically or not, or whether customized featurization should be used. Setting this enables AutoML to perform featurization on the input to handle *missing data*, and to perform some common *feature extraction*.|\n",
"|**max_cores_per_iteration**|Maximum number of cores to utilize per iteration. A value of -1 indicates all available cores should be used.|"
]
},
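As a minimal sketch of the validation_data alternative mentioned above, cross-validation can be bypassed by supplying an explicit validation set. `valid_dataset` here is an assumed, pre-split TabularDataset whose dates follow the training dates for every series; the remaining names reuse values defined in the next cell.

    # Sketch only: bypass rolling-origin CV with an explicit validation set.
    # `valid_dataset` is an assumed TabularDataset, not created in this notebook.
    automl_config_holdout = AutoMLConfig(task='forecasting',
                                         primary_metric='normalized_mean_absolute_error',
                                         training_data=train_dataset,
                                         validation_data=valid_dataset,  # replaces n_cross_validations
                                         label_column_name=target_column_name,
                                         compute_target=compute_target,
                                         **time_series_settings)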
{
@@ -359,20 +401,21 @@
"time_series_settings = {\n",
" 'time_column_name': time_column_name,\n",
" 'grain_column_names': grain_column_names,\n",
" 'drop_column_names': ['logQuantity'], # 'logQuantity' is a leaky feature, so we remove it.\n",
" 'max_horizon': n_test_periods\n",
"}\n",
"\n",
"automl_config = AutoMLConfig(task='forecasting',\n",
" debug_log='automl_oj_sales_errors.log',\n",
" primary_metric='normalized_mean_absolute_error',\n",
" experiment_timeout_minutes=15,\n",
" experiment_timeout_hours=0.25,\n",
" training_data=train_dataset,\n",
" label_column_name=target_column_name,\n",
" compute_target=compute_target,\n",
" enable_early_stopping=True,\n",
" featurization=featurization_config,\n",
" n_cross_validations=3,\n",
" verbosity=logging.INFO,\n",
" max_cores_per_iteration=-1,\n",
" **time_series_settings)"
]
},
@@ -422,6 +465,33 @@
"model_name = best_run.properties['model_name']"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Transparency\n",
"\n",
"View updated featurization summary"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"custom_featurizer = fitted_model.named_steps['timeseriestransformer']"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"custom_featurizer.get_featurization_summary()"
]
},
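get_featurization_summary() returns a list of per-column records, so as a small convenience sketch (assuming pandas is imported as in the setup cells of this notebook) it can be rendered as a DataFrame for easier reading.

    import pandas as pd
    # Each record describes one raw column: the type detected, whether it was
    # dropped, and how many engineered features were generated from it.
    summary = custom_featurizer.get_featurization_summary()
    print(pd.DataFrame.from_records(summary))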
{
"cell_type": "markdown",
"metadata": {},
@@ -454,9 +524,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"To produce predictions on the test set, we need to know the feature values at all dates in the test set. This requirement is somewhat reasonable for the OJ sales data since the features mainly consist of price, which is usually set in advance, and customer demographics which are approximately constant for each store over the 20 week forecast horizon in the testing data. \n",
"\n",
"We will first create a query `y_query`, which is aligned index-for-index to `X_test`. This is a vector of target values where each `NaN` serves the function of the question mark to be replaced by forecast. Passing definite values in the `y` argument allows the `forecast` function to make predictions on data that does not immediately follow the train data which contains `y`. In each grain, the last time point where the model sees a definite value of `y` is that grain's _forecast origin_."
"To produce predictions on the test set, we need to know the feature values at all dates in the test set. This requirement is somewhat reasonable for the OJ sales data since the features mainly consist of price, which is usually set in advance, and customer demographics which are approximately constant for each store over the 20 week forecast horizon in the testing data."
]
},
{
@@ -465,15 +533,10 @@
"metadata": {},
"outputs": [],
"source": [
"# Replace ALL values in y by NaN.\n",
"# The forecast origin will be at the beginning of the first forecast period.\n",
"# (Which is the same time as the end of the last training period.)\n",
"y_query = y_test.copy().astype(np.float)\n",
"y_query.fill(np.nan)\n",
"# The featurized data, aligned to y, will also be returned.\n",
"# This contains the assumptions that were made in the forecast\n",
"# and helps align the forecast to the original data\n",
"y_predictions, X_trans = fitted_model.forecast(X_test, y_query)"
"y_predictions, X_trans = fitted_model.forecast(X_test)"
]
},
{
@@ -482,7 +545,7 @@
"source": [
"If you are used to scikit pipelines, perhaps you expected `predict(X_test)`. However, forecasting requires a more general interface that also supplies the past target `y` values. Please use `forecast(X,y)` as `predict(X)` is reserved for internal purposes on forecasting models.\n",
"\n",
"The [energy demand forecasting notebook](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/automated-machine-learning/forecasting-energy-demand) demonstrates the use of the forecast function in more detail in the context of using lags and rolling window features. "
"The [forecast function notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/forecasting-high-frequency/auto-ml-forecasting-function.ipynb) demonstrates the use of the forecast function for a variety of use cases. Also, please see the [API documentation for the forecast function](https://docs.microsoft.com/en-us/python/api/azureml-automl-runtime/azureml.automl.runtime.shared.model_wrappers.forecastingpipelinewrapper?view=azure-ml-py#forecast-x-pred--typing-union-pandas-core-frame-dataframe--nonetype----none--y-pred--typing-union-pandas-core-frame-dataframe--numpy-ndarray--nonetype----none--forecast-destination--typing-union-pandas--libs-tslibs-timestamps-timestamp--nonetype----none--ignore-data-errors--bool---false-----typing-tuple-numpy-ndarray--pandas-core-frame-dataframe-)."
]
},
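When the prediction data does not immediately follow the training data, the explicit-query pattern removed from the forecasting cell above still illustrates the contract well; this minimal sketch assumes `X_test` and `y_test` as defined earlier in the notebook.

    import numpy as np
    # Every NaN in y_query marks a date whose value should be forecast;
    # any definite values move that grain's forecast origin forward.
    y_query = y_test.copy().astype(float)
    y_query.fill(np.nan)
    y_predictions, X_trans = fitted_model.forecast(X_test, y_query)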
{
@@ -513,9 +576,8 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core._vendor.automl.client.core.common import metrics\n",
"from azureml.automl.core.shared import constants, metrics\n",
"from matplotlib import pyplot as plt\n",
"from automl.client.core.common import constants\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",
@@ -638,9 +700,7 @@
"outputs": [],
"source": [
"import json\n",
"# The request data frame needs to have y_query column which corresponds to query.\n",
"X_query = X_test.copy()\n",
"X_query['y_query'] = y_query\n",
"# We have to convert datetime to string, because Timestamps cannot be serialized to JSON.\n",
"X_query[time_column_name] = X_query[time_column_name].astype(str)\n",
"# The Service object accept the complex dictionary, which is internally converted to JSON string.\n",
@@ -705,9 +765,6 @@
"framework": [
"Azure ML AutoML"
],
"tags": [
"None"
],
"friendly_name": "Forecasting orange juice sales with deployment",
"index_order": 1,
"kernelspec": {
@@ -725,8 +782,11 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
"version": "3.6.8"
},
"tags": [
"None"
],
"task": "Forecasting"
},
"nbformat": 4,

View File

@@ -1,11 +1,10 @@
name: auto-ml-forecasting-orange-juice-sales
dependencies:
- fbprophet==0.5
- py-xgboost<=0.80
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml
- statsmodels

View File

@@ -42,14 +42,16 @@
"\n",
"This notebook is using the local machine compute to train the model.\n",
"\n",
"If you are using an Azure Machine Learning [Notebook VM](https://docs.microsoft.com/en-us/azure/machine-learning/service/tutorial-1st-experiment-sdk-setup), you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"If you are using an Azure Machine Learning Compute Instance, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"\n",
"In this notebook you will learn how to:\n",
"1. Create an experiment using an existing workspace.\n",
"2. Configure AutoML using `AutoMLConfig`.\n",
"3. Train the model.\n",
"4. Explore the results.\n",
"5. Test the fitted model."
"5. Visualization model's feature importance in azure portal\n",
"6. Explore any model's explanation and explore feature importance in azure portal\n",
"7. Test the fitted model."
]
},
{
@@ -71,13 +73,30 @@
"\n",
"from matplotlib import pyplot as plt\n",
"import pandas as pd\n",
"import os\n",
"\n",
"import azureml.core\n",
"from azureml.core.experiment import Experiment\n",
"from azureml.core.workspace import Workspace\n",
"from azureml.core.dataset import Dataset\n",
"from azureml.train.automl import AutoMLConfig"
"from azureml.train.automl import AutoMLConfig\n",
"from azureml.explain.model._internal.explanation_client import ExplanationClient"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
@@ -94,7 +113,6 @@
"experiment=Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -155,8 +173,7 @@
"automl_settings = {\n",
" \"n_cross_validations\": 3,\n",
" \"primary_metric\": 'average_precision_score_weighted',\n",
" \"preprocess\": True,\n",
" \"experiment_timeout_minutes\": 10, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ablity to find the best model possible\n",
" \"experiment_timeout_hours\": 0.25, # This is a time limit for testing purposes, remove it for real use cases, this will drastically limit ability to find the best model possible\n",
" \"verbosity\": logging.INFO,\n",
" \"enable_stack_ensemble\": False\n",
"}\n",
@@ -260,17 +277,135 @@
"metadata": {},
"source": [
"#### Print the properties of the model\n",
"The fitted_model is a python object and you can read the different properties of the object.\n",
"See *Print the properties of the model* section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification/auto-ml-classification.ipynb)."
"The fitted_model is a python object and you can read the different properties of the object.\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Deploy\n",
"## Best Model 's explanation\n",
"Retrieve the explanation from the best_run which includes explanations for engineered features and raw features.\n",
"\n",
"To deploy the model into a web service endpoint, see _Deploy_ section in [this sample notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/classification-with-deployment/auto-ml-classification-with-deployment.ipynb)"
"#### Download engineered feature importance from artifact store\n",
"You can use ExplanationClient to download the engineered feature explanations from the artifact store of the best_run. You can also use azure portal url to view the dash board visualization of the feature importance values of the engineered features."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"client = ExplanationClient.from_run(best_run)\n",
"engineered_explanations = client.download_model_explanation(raw=False)\n",
"print(engineered_explanations.get_feature_importance_dict())\n",
"print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + best_run.get_portal_url())"
]
},
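Raw-feature importances can be downloaded the same way by flipping the `raw` flag; the sketch below reuses the `client` and `best_run` from the cell above and mirrors the call used in the regression notebook later in this diff.

    # Download raw (pre-featurization) feature importances from the same run.
    raw_explanations = client.download_model_explanation(raw=True)
    print(raw_explanations.get_feature_importance_dict())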
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Explanations\n",
"In this section, we will show how to compute model explanations and visualize the explanations using azureml-explain-model package. Besides retrieving an existing model explanation for an AutoML model, you can also explain your AutoML model with different test data. The following steps will allow you to compute and visualize engineered feature importance based on your test data."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Retrieve any other AutoML model from training"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"automl_run, fitted_model = local_run.get_output(metric='accuracy')"
]
},
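As a side note, `get_output` can also select a child run by iteration number instead of by metric; the sketch below reuses the same `local_run` object, and the iteration index is purely illustrative.

    # Retrieve the model trained in a particular iteration (index is illustrative).
    iteration_run, iteration_model = local_run.get_output(iteration=3)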
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Setup the model explanations for AutoML models\n",
"The fitted_model can generate the following which will be used for getting the engineered explanations using automl_setup_model_explanations:-\n",
"\n",
"1. Featurized data from train samples/test samples\n",
"2. Gather engineered name lists\n",
"3. Find the classes in your labeled column in classification scenarios\n",
"\n",
"The automl_explainer_setup_obj contains all the structures from above list."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_train = training_data.drop_columns(columns=[label_column_name])\n",
"y_train = training_data.keep_columns(columns=[label_column_name], validate=True)\n",
"X_test = validation_data.drop_columns(columns=[label_column_name])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.train.automl.runtime.automl_explain_utilities import automl_setup_model_explanations\n",
"\n",
"automl_explainer_setup_obj = automl_setup_model_explanations(fitted_model, X=X_train, \n",
" X_test=X_test, y=y_train, \n",
" task='classification')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Initialize the Mimic Explainer for feature importance\n",
"For explaining the AutoML models, use the MimicWrapper from azureml.explain.model package. The MimicWrapper can be initialized with fields in automl_explainer_setup_obj, your workspace and a surrogate model to explain the AutoML model (fitted_model here). The MimicWrapper also takes the automl_run object where engineered explanations will be uploaded."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.explain.model.mimic_wrapper import MimicWrapper\n",
"explainer = MimicWrapper(ws, automl_explainer_setup_obj.automl_estimator,\n",
" explainable_model=automl_explainer_setup_obj.surrogate_model, \n",
" init_dataset=automl_explainer_setup_obj.X_transform, run=automl_run,\n",
" features=automl_explainer_setup_obj.engineered_feature_names, \n",
" feature_maps=[automl_explainer_setup_obj.feature_map],\n",
" classes=automl_explainer_setup_obj.classes,\n",
" explainer_kwargs=automl_explainer_setup_obj.surrogate_model_params)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Use Mimic Explainer for computing and visualizing engineered feature importance\n",
"The explain() method in MimicWrapper can be called with the transformed test samples to get the feature importance for the generated engineered features. You can also use azure portal url to view the dash board visualization of the feature importance values of the engineered features."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"engineered_explanations = explainer.explain(['local', 'global'], eval_dataset=automl_explainer_setup_obj.X_test_transform)\n",
"print(engineered_explanations.get_feature_importance_dict())\n",
"print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
]
},
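Raw-feature importances can also be computed from the same explainer; this sketch reuses the `explainer` and `automl_explainer_setup_obj` objects above and mirrors the `get_raw=True` call made in the train_explainer.py script shown later in this diff.

    # Map engineered importances back to the original columns using the stored
    # feature map and raw feature names, then print the raw importances.
    raw_explanations = explainer.explain(['local', 'global'], get_raw=True,
                                         raw_feature_names=automl_explainer_setup_obj.raw_feature_names,
                                         eval_dataset=automl_explainer_setup_obj.X_test_transform)
    print(raw_explanations.get_feature_importance_dict())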
{
@@ -369,7 +504,7 @@
"metadata": {
"authors": [
{
"name": "tzvikei"
"name": "anumamah"
}
],
"category": "tutorial",

View File

@@ -2,10 +2,6 @@ name: auto-ml-classification-credit-card-fraud-local
dependencies:
- pip:
- azureml-sdk
- interpret
- azureml-defaults
- azureml-explain-model
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml

View File

@@ -40,7 +40,7 @@
"In this example we use the Hardware Performance Dataset to showcase how you can use AutoML for a simple regression problem. The Regression goal is to predict the performance of certain combinations of hardware parts.\n",
"After training AutoML models for this regression data set, we show how you can compute model explanations on your remote compute using a sample explainer script.\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"If you are using an Azure Machine Learning Compute Instance, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"\n",
"An Enterprise workspace is required for this notebook. To learn more about creating an Enterprise workspace or upgrading to an Enterprise workspace from the Azure portal, please visit our [Workspace page.](https://docs.microsoft.com/azure/machine-learning/service/concept-workspace#upgrade) \n",
"\n",
@@ -51,8 +51,8 @@
"4. Explore the results and featurization transparency options\n",
"5. Setup remote compute for computing the model explanations for a given AutoML model.\n",
"6. Start an AzureML experiment on your remote compute to compute explanations for an AutoML model.\n",
"7. Download the feature importance for engineered features and visualize the explanations for engineered features. \n",
"8. Download the feature importance for raw features and visualize the explanations for raw features. \n"
"7. Download the feature importance for engineered features and visualize the explanations for engineered features on azure portal. \n",
"8. Download the feature importance for raw features and visualize the explanations for raw features on azure portal. \n"
]
},
{
@@ -85,6 +85,23 @@
"from azureml.core.dataset import Dataset"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -98,7 +115,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace Name'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -127,35 +143,22 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute\n",
"from azureml.core.compute import ComputeTarget\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster-5\"\n",
"amlcompute_cluster_name = \"hardware-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" compute_target = ComputeTarget(workspace=ws, name=amlcompute_cluster_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D2_V2',\n",
" max_nodes=4)\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, compute_config)\n",
"\n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 4)\n",
"\n",
" # Create the cluster.\\n\",\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
"\n",
"print('Checking cluster status...')\n",
"# Can poll for a minimum number of nodes and for a specific timeout.\n",
"# If no min_node_count is provided, it will use the scale settings for the cluster.\n",
"compute_target.wait_for_completion(show_output = True, min_node_count = None, timeout_in_minutes = 20)\n",
"\n",
"# For a more detailed view of current AmlCompute status, use get_status()."
"compute_target.wait_for_completion(show_output=True)"
]
},
{
@@ -206,9 +209,9 @@
"|-|-|\n",
"|**task**|classification, regression or forecasting|\n",
"|**primary_metric**|This is the metric that you want to optimize. Regression supports the following primary metrics: <br><i>spearman_correlation</i><br><i>normalized_root_mean_squared_error</i><br><i>r2_score</i><br><i>normalized_mean_absolute_error</i>|\n",
"|**experiment_timeout_minutes**| Maximum amount of time in minutes that all iterations combined can take before the experiment terminates.|\n",
"|**experiment_timeout_hours**| Maximum amount of time in hours that all iterations combined can take before the experiment terminates.|\n",
"|**enable_early_stopping**| Flag to enble early termination if the score is not improving in the short term.|\n",
"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether featurization step should be done automatically or not, or whether customized featurization should be used. Note: If the input data is sparse, featurization cannot be turned on.|\n",
"|**featurization**| 'auto' / 'off' / FeaturizationConfig Indicator for whether featurization step should be done automatically or not, or whether customized featurization should be used. Setting this enables AutoML to perform featurization on the input to handle *missing data*, and to perform some common *feature extraction*. Note: If the input data is sparse, featurization cannot be turned on.|\n",
"|**n_cross_validations**|Number of cross validation splits.|\n",
"|**training_data**|(sparse) array-like, shape = [n_samples, n_features]|\n",
"|**label_column_name**|(sparse) array-like, shape = [n_samples, ], targets values.|"
@@ -244,7 +247,7 @@
"source": [
"featurization_config = FeaturizationConfig()\n",
"featurization_config.blocked_transformers = ['LabelEncoder']\n",
"#featurization_config.drop_columns = ['ERP', 'MMIN']\n",
"#featurization_config.drop_columns = ['MMIN']\n",
"featurization_config.add_column_purpose('MYCT', 'Numeric')\n",
"featurization_config.add_column_purpose('VendorName', 'CategoricalHash')\n",
"#default strategy mean, add transformer param for for 3 columns\n",
@@ -262,7 +265,7 @@
"source": [
"automl_settings = {\n",
" \"enable_early_stopping\": True, \n",
" \"experiment_timeout_minutes\" : 10,\n",
" \"experiment_timeout_hours\" : 0.25,\n",
" \"max_concurrent_iterations\": 4,\n",
" \"max_cores_per_iteration\": -1,\n",
" \"n_cross_validations\": 5,\n",
@@ -320,8 +323,6 @@
"outputs": [],
"source": [
"#from azureml.train.automl.run import AutoMLRun\n",
"#experiment_name = 'automl-regression-hardware'\n",
"#experiment = Experiment(ws, experiment_name)\n",
"#remote_run = AutoMLRun(experiment=experiment, run_id='<run_ID_goes_here')\n",
"#remote_run"
]
@@ -514,7 +515,7 @@
" content = cefr.read()\n",
"\n",
"# Replace the values in train_explainer.py file with the appropriate values\n",
"content = content.replace('<<experimnet_name>>', automl_run.experiment.name) # your experiment name.\n",
"content = content.replace('<<experiment_name>>', automl_run.experiment.name) # your experiment name.\n",
"content = content.replace('<<run_id>>', automl_run.id) # Run-id of the AutoML run for which you want to explain the model.\n",
"content = content.replace('<<target_column_name>>', 'ERP') # Your target column name\n",
"content = content.replace('<<task>>', 'regression') # Training task type\n",
@@ -532,8 +533,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Create conda configuration for model explanations experiment\n",
"We need `azureml-explain-model`, `azureml-train-automl` and `azureml-core` packages for computing model explanations for your AutoML model on remote compute."
"#### Create conda configuration for model explanations experiment from automl_run object"
]
},
{
@@ -552,14 +552,9 @@
"# Set compute target to AmlCompute\n",
"conda_run_config.target = compute_target\n",
"conda_run_config.environment.docker.enabled = True\n",
"azureml_pip_packages = [\n",
" 'azureml-train-automl', 'azureml-core', 'azureml-explain-model'\n",
"]\n",
"\n",
"# specify CondaDependencies obj\n",
"conda_run_config.environment.python.conda_dependencies = CondaDependencies.create(\n",
" conda_packages=['scikit-learn', 'numpy','py-xgboost<=0.80'],\n",
" pip_packages=azureml_pip_packages)"
"conda_run_config.environment.python.conda_dependencies = automl_run.get_environment().python.conda_dependencies"
]
},
{
@@ -604,38 +599,8 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Feature importance and explanation dashboard\n",
"In this section we describe how you can download the explanation results from the explanations experiment and visualize the feature importance for your AutoML model. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Setup for visualizing the model explanation results\n",
"For visualizing the explanation results for the *fitted_model* we need to perform the following steps:-\n",
"1. Featurize test data samples.\n",
"\n",
"The *automl_explainer_setup_obj* contains all the structures from above list. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_test = test_data.drop_columns([label]).to_pandas_dataframe()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, automl_setup_model_explanations\n",
"explainer_setup_class = automl_setup_model_explanations(fitted_model, 'regression', X_test=X_test)"
"### Feature importance and visualizing explanation dashboard\n",
"In this section we describe how you can download the explanation results from the explanations experiment and visualize the feature importance for your AutoML model on the azure portal."
]
},
{
@@ -643,7 +608,7 @@
"metadata": {},
"source": [
"#### Download engineered feature importance from artifact store\n",
"You can use *ExplanationClient* to download the engineered feature explanations from the artifact store of the *automl_run*. You can also use ExplanationDashboard to view the dash board visualization of the feature importance values of the engineered features."
"You can use *ExplanationClient* to download the engineered feature explanations from the artifact store of the *automl_run*. You can also use azure portal url to view the dash board visualization of the feature importance values of the engineered features."
]
},
{
@@ -653,11 +618,10 @@
"outputs": [],
"source": [
"from azureml.explain.model._internal.explanation_client import ExplanationClient\n",
"from interpret_community.widget import ExplanationDashboard\n",
"client = ExplanationClient.from_run(automl_run)\n",
"engineered_explanations = client.download_model_explanation(raw=False)\n",
"engineered_explanations = client.download_model_explanation(raw=False, comment='engineered explanations')\n",
"print(engineered_explanations.get_feature_importance_dict())\n",
"ExplanationDashboard(engineered_explanations, explainer_setup_class.automl_estimator, datasetX=explainer_setup_class.X_test_transform)"
"print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
]
},
{
@@ -665,7 +629,7 @@
"metadata": {},
"source": [
"#### Download raw feature importance from artifact store\n",
"You can use *ExplanationClient* to download the raw feature explanations from the artifact store of the *automl_run*. You can also use ExplanationDashboard to view the dash board visualization of the feature importance values of the raw features."
"You can use *ExplanationClient* to download the raw feature explanations from the artifact store of the *automl_run*. You can also use azure portal url to view the dash board visualization of the feature importance values of the raw features."
]
},
{
@@ -674,9 +638,9 @@
"metadata": {},
"outputs": [],
"source": [
"raw_explanations = client.download_model_explanation(raw=True)\n",
"raw_explanations = client.download_model_explanation(raw=True, comment='raw explanations')\n",
"print(raw_explanations.get_feature_importance_dict())\n",
"ExplanationDashboard(raw_explanations, explainer_setup_class.automl_pipeline, datasetX=explainer_setup_class.X_test_raw)"
"print(\"You can visualize the raw explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
]
},
{
@@ -718,20 +682,10 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"azureml_pip_packages = [\n",
" 'azureml-explain-model', 'azureml-train-automl', 'azureml-defaults'\n",
"]\n",
" \n",
"\n",
"# specify CondaDependencies obj\n",
"myenv = CondaDependencies.create(conda_packages=['scikit-learn', 'pandas', 'numpy', 'py-xgboost<=0.80'],\n",
" pip_packages=azureml_pip_packages,\n",
" pin_sdk_version=True)\n",
"conda_dep = automl_run.get_environment().python.conda_dependencies\n",
"\n",
"with open(\"myenv.yml\",\"w\") as f:\n",
" f.write(myenv.serialize_to_string())\n",
" f.write(conda_dep.serialize_to_string())\n",
"\n",
"with open(\"myenv.yml\",\"r\") as f:\n",
" print(f.read())"
@@ -772,6 +726,7 @@
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.webservice import AciWebservice\n",
"from azureml.core.model import Model\n",
"from azureml.core.environment import Environment\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores=1, \n",
" memory_gb=1, \n",
@@ -779,9 +734,8 @@
" \"method\" : \"local_explanation\"}, \n",
" description='Get local explanations for Machine test data')\n",
"\n",
"inference_config = InferenceConfig(runtime= \"python\", \n",
" entry_script=\"score_explain.py\",\n",
" conda_file=\"myenv.yml\")\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
"inference_config = InferenceConfig(entry_script=\"score_explain.py\", environment=myenv)\n",
"\n",
"# Use configs and models generated above\n",
"service = Model.deploy(ws, 'model-scoring', [scoring_explainer_model, original_model], inference_config, aciconfig)\n",
@@ -819,6 +773,7 @@
"outputs": [],
"source": [
"if service.state == 'Healthy':\n",
" X_test = test_data.drop_columns([label]).to_pandas_dataframe()\n",
" # Serialize the first row of the test data into json\n",
" X_test_json = X_test[:1].to_json(orient='records')\n",
" print(X_test_json)\n",

View File

@@ -1,10 +1,7 @@
name: auto-ml-forecasting-grouping
name: auto-ml-regression-explanation-featurization
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-pipeline
- azureml-widgets
- pandas_ml
- statsmodels
- matplotlib

View File

@@ -7,10 +7,10 @@ from azureml.core.experiment import Experiment
from sklearn.externals import joblib
from azureml.core.dataset import Dataset
from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, \
automl_setup_model_explanations
automl_setup_model_explanations, automl_check_model_if_explainable
from azureml.explain.model.mimic.models.lightgbm_model import LGBMExplainableModel
from azureml.explain.model.mimic_wrapper import MimicWrapper
from automl.client.core.common.constants import MODEL_PATH
from azureml.automl.core.shared.constants import MODEL_PATH
from azureml.explain.model.scoring.scoring_explainer import TreeScoringExplainer, save
@@ -22,9 +22,14 @@ run = Run.get_context()
ws = run.experiment.workspace
# Get the AutoML run object from the experiment name and the workspace
experiment = Experiment(ws, '<<experimnet_name>>')
experiment = Experiment(ws, '<<experiment_name>>')
automl_run = Run(experiment=experiment, run_id='<<run_id>>')
# Check if this AutoML model is explainable
if not automl_check_model_if_explainable(automl_run):
raise Exception("Model explanations is currently not supported for " + automl_run.get_properties().get(
'run_algorithm'))
# Download the best model from the artifact store
automl_run.download_file(name=MODEL_PATH, output_file_path='model.pkl')
@@ -55,17 +60,16 @@ explainer = MimicWrapper(ws, automl_explainer_setup_obj.automl_estimator, LGBMEx
classes=automl_explainer_setup_obj.classes)
# Compute the engineered explanations
engineered_explanations = explainer.explain(['local', 'global'],
engineered_explanations = explainer.explain(['local', 'global'], tag='engineered explanations',
eval_dataset=automl_explainer_setup_obj.X_test_transform)
# Compute the raw explanations
raw_explanations = explainer.explain(['local', 'global'], get_raw=True,
raw_explanations = explainer.explain(['local', 'global'], get_raw=True, tag='raw explanations',
raw_feature_names=automl_explainer_setup_obj.raw_feature_names,
eval_dataset=automl_explainer_setup_obj.X_test_transform)
print("Engineered and raw explanations computed successfully")
# Initialize the ScoringExplainer
scoring_explainer = TreeScoringExplainer(explainer.explainer, feature_maps=[automl_explainer_setup_obj.feature_map])

View File

@@ -1,13 +0,0 @@
name: auto-ml-regression-hardware-performance-explanation-and-featurization
dependencies:
- pip:
- azureml-sdk
- interpret
- azureml-defaults
- azureml-explain-model
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml
- azureml-explain-model
- azureml-contrib-interpret

View File

@@ -40,7 +40,7 @@
"## Introduction\n",
"In this example we use the Hardware Performance Dataset to showcase how you can use AutoML for a simple regression problem. The Regression goal is to predict the performance of certain combinations of hardware parts.\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"If you are using an Azure Machine Learning Compute Instance, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace. \n",
"\n",
"In this notebook you will learn how to:\n",
"1. Create an `Experiment` in an existing `Workspace`.\n",
@@ -79,6 +79,23 @@
"from azureml.train.automl import AutoMLConfig"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This sample notebook may use features that are not available in previous versions of the Azure ML SDK."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -93,7 +110,6 @@
"experiment = Experiment(ws, experiment_name)\n",
"\n",
"output = {}\n",
"output['SDK version'] = azureml.core.VERSION\n",
"output['Subscription ID'] = ws.subscription_id\n",
"output['Workspace'] = ws.name\n",
"output['Resource Group'] = ws.resource_group\n",
@@ -122,7 +138,7 @@
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your CPU cluster\n",
"cpu_cluster_name = \"cpu-cluster-2\"\n",
"cpu_cluster_name = \"reg-cluster\"\n",
"\n",
"# Verify that cluster does not exist already\n",
"try:\n",
@@ -188,15 +204,18 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"automlconfig-remarks-sample"
]
},
"outputs": [],
"source": [
"automl_settings = {\n",
" \"n_cross_validations\": 3,\n",
" \"primary_metric\": 'r2_score',\n",
" \"preprocess\": True,\n",
" \"enable_early_stopping\": True, \n",
" \"experiment_timeout_minutes\": 20, #for real scenarios we reccommend a timeout of at least one hour \n",
" \"experiment_timeout_hours\": 0.3, #for real scenarios we reccommend a timeout of at least one hour \n",
" \"max_concurrent_iterations\": 4,\n",
" \"max_cores_per_iteration\": -1,\n",
" \"verbosity\": logging.INFO,\n",

View File

@@ -2,8 +2,7 @@ name: auto-ml-regression
dependencies:
- pip:
- azureml-sdk
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib
- pandas_ml
- paramiko<2.5.0

View File

@@ -1,23 +0,0 @@
-- This shows using the AutoMLForecast stored procedure to predict using a forecasting model for the nyc_energy dataset.
DECLARE @Model NVARCHAR(MAX) = (SELECT TOP 1 Model FROM dbo.aml_model
WHERE ExperimentName = 'automl-sql-forecast'
ORDER BY CreatedDate DESC)
DECLARE @max_horizon INT = 48
DECLARE @split_time NVARCHAR(22) = (SELECT DATEADD(hour, -@max_horizon, MAX(timeStamp)) FROM nyc_energy WHERE demand IS NOT NULL)
DECLARE @TestDataQuery NVARCHAR(MAX) = '
SELECT CAST(timeStamp AS NVARCHAR(30)) AS timeStamp,
demand,
precip,
temp
FROM nyc_energy
WHERE demand IS NOT NULL AND precip IS NOT NULL AND temp IS NOT NULL
AND timeStamp > ''' + @split_time + ''''
EXEC dbo.AutoMLForecast @input_query=@TestDataQuery,
@label_column='demand',
@time_column_name='timeStamp',
@model=@model
WITH RESULT SETS ((timeStamp DATETIME, grain NVARCHAR(255), predicted_demand FLOAT, precip FLOAT, temp FLOAT, actual_demand FLOAT))

View File

@@ -1,10 +0,0 @@
-- This lists all the metrics for all iterations for the most recent run.
DECLARE @RunId NVARCHAR(43)
DECLARE @ExperimentName NVARCHAR(255)
SELECT TOP 1 @ExperimentName=ExperimentName, @RunId=SUBSTRING(RunId, 1, 43)
FROM aml_model
ORDER BY CreatedDate DESC
EXEC dbo.AutoMLGetMetrics @RunId, @ExperimentName

View File

@@ -1,25 +0,0 @@
-- This shows using the AutoMLTrain stored procedure to create a forecasting model for the nyc_energy dataset.
DECLARE @max_horizon INT = 48
DECLARE @split_time NVARCHAR(22) = (SELECT DATEADD(hour, -@max_horizon, MAX(timeStamp)) FROM nyc_energy WHERE demand IS NOT NULL)
DECLARE @TrainDataQuery NVARCHAR(MAX) = '
SELECT CAST(timeStamp as NVARCHAR(30)) as timeStamp,
demand,
precip,
temp
FROM nyc_energy
WHERE demand IS NOT NULL AND precip IS NOT NULL AND temp IS NOT NULL
and timeStamp < ''' + @split_time + ''''
INSERT INTO dbo.aml_model(RunId, ExperimentName, Model, LogFileText, WorkspaceName)
EXEC dbo.AutoMLTrain @input_query= @TrainDataQuery,
@label_column='demand',
@task='forecasting',
@iterations=10,
@iteration_timeout_minutes=5,
@time_column_name='timeStamp',
@max_horizon=@max_horizon,
@experiment_name='automl-sql-forecast',
@primary_metric='normalized_root_mean_squared_error'

View File

@@ -1,161 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Train a model and use it for prediction\r\n",
"\r\n",
"Before running this notebook, run the auto-ml-sql-setup.ipynb notebook."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/automated-machine-learning/sql-server/energy-demand/auto-ml-sql-energy-demand.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Set the default database"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"USE [automl]\r\n",
"GO"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Use the AutoMLTrain stored procedure to create a forecasting model for the nyc_energy dataset."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"INSERT INTO dbo.aml_model(RunId, ExperimentName, Model, LogFileText, WorkspaceName)\r\n",
"EXEC dbo.AutoMLTrain @input_query='\r\n",
"SELECT CAST(timeStamp as NVARCHAR(30)) as timeStamp,\r\n",
" demand,\r\n",
"\t precip,\r\n",
"\t temp,\r\n",
"\t CASE WHEN timeStamp < ''2017-01-01'' THEN 0 ELSE 1 END AS is_validate_column\r\n",
"FROM nyc_energy\r\n",
"WHERE demand IS NOT NULL AND precip IS NOT NULL AND temp IS NOT NULL\r\n",
"and timeStamp < ''2017-02-01''',\r\n",
"@label_column='demand',\r\n",
"@task='forecasting',\r\n",
"@iterations=10,\r\n",
"@iteration_timeout_minutes=5,\r\n",
"@time_column_name='timeStamp',\r\n",
"@is_validate_column='is_validate_column',\r\n",
"@experiment_name='automl-sql-forecast',\r\n",
"@primary_metric='normalized_root_mean_squared_error'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Use the AutoMLPredict stored procedure to predict using the forecasting model for the nyc_energy dataset."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"DECLARE @Model NVARCHAR(MAX) = (SELECT TOP 1 Model FROM dbo.aml_model\r\n",
" WHERE ExperimentName = 'automl-sql-forecast'\r\n",
"\t\t\t\t\t\t\t\tORDER BY CreatedDate DESC)\r\n",
"\r\n",
"EXEC dbo.AutoMLPredict @input_query='\r\n",
"SELECT CAST(timeStamp AS NVARCHAR(30)) AS timeStamp,\r\n",
" demand,\r\n",
"\t precip,\r\n",
"\t temp\r\n",
"FROM nyc_energy\r\n",
"WHERE demand IS NOT NULL AND precip IS NOT NULL AND temp IS NOT NULL\r\n",
"AND timeStamp >= ''2017-02-01''',\r\n",
"@label_column='demand',\r\n",
"@model=@model\r\n",
"WITH RESULT SETS ((timeStamp NVARCHAR(30), actual_demand FLOAT, precip FLOAT, temp FLOAT, predicted_demand FLOAT))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## List all the metrics for all iterations for the most recent training run."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"DECLARE @RunId NVARCHAR(43)\r\n",
"DECLARE @ExperimentName NVARCHAR(255)\r\n",
"\r\n",
"SELECT TOP 1 @ExperimentName=ExperimentName, @RunId=SUBSTRING(RunId, 1, 43)\r\n",
"FROM aml_model\r\n",
"ORDER BY CreatedDate DESC\r\n",
"\r\n",
"EXEC dbo.AutoMLGetMetrics @RunId, @ExperimentName"
]
}
],
"metadata": {
"authors": [
{
"name": "jeffshep"
}
],
"category": "tutorial",
"compute": [
"Local"
],
"datasets": [
"NYC Energy"
],
"deployment": [
"None"
],
"exclude_from_index": false,
"framework": [
"Azure ML AutoML"
],
"tags": [
""
],
"friendly_name": "Forecasting with automated ML SQL integration",
"index_order": 1,
"kernelspec": {
"display_name": "Python 3.6",
"language": "sql",
"name": "python36"
},
"language_info": {
"name": "sql",
"version": ""
},
"task": "Forecasting"
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,92 +0,0 @@
-- This procedure forecast values based on a forecasting model returned by AutoMLTrain.
-- It returns a dataset with the forecasted values.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE OR ALTER PROCEDURE [dbo].[AutoMLForecast]
(
@input_query NVARCHAR(MAX), -- A SQL query returning data to predict on.
@model NVARCHAR(MAX), -- A model returned from AutoMLTrain.
@time_column_name NVARCHAR(255)='', -- The name of the timestamp column for forecasting.
@label_column NVARCHAR(255)='', -- Optional name of the column from input_query, which should be ignored when predicting
@y_query_column NVARCHAR(255)='', -- Optional value column that can be used for predicting.
-- If specified, this can contain values for past times (after the model was trained)
-- and contain Nan for future times.
@forecast_column_name NVARCHAR(255) = 'predicted'
-- The name of the output column containing the forecast value.
) AS
BEGIN
EXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd
import azureml.core
import numpy as np
from azureml.train.automl import AutoMLConfig
import pickle
import codecs
model_obj = pickle.loads(codecs.decode(model.encode(), "base64"))
test_data = input_data.copy()
if label_column != "" and label_column is not None:
y_test = test_data.pop(label_column).values
else:
y_test = None
if y_query_column != "" and y_query_column is not None:
y_query = test_data.pop(y_query_column).values
else:
y_query = np.repeat(np.nan, len(test_data))
X_test = test_data
if time_column_name != "" and time_column_name is not None:
X_test[time_column_name] = pd.to_datetime(X_test[time_column_name])
y_fcst, X_trans = model_obj.forecast(X_test, y_query)
def align_outputs(y_forecast, X_trans, X_test, y_test, forecast_column_name):
# Demonstrates how to get the output aligned to the inputs
# using pandas indexes. Helps understand what happened if
# the output shape differs from the input shape, or if
# the data got re-sorted by time and grain during forecasting.
# Typical causes of misalignment are:
# * we predicted some periods that were missing in actuals -> drop from eval
# * model was asked to predict past max_horizon -> increase max horizon
# * data at start of X_test was needed for lags -> provide previous periods
df_fcst = pd.DataFrame({forecast_column_name : y_forecast})
# y and X outputs are aligned by forecast() function contract
df_fcst.index = X_trans.index
# align original X_test to y_test
X_test_full = X_test.copy()
if y_test is not None:
X_test_full[label_column] = y_test
# X_test_full does not include origin, so reset for merge
df_fcst.reset_index(inplace=True)
X_test_full = X_test_full.reset_index().drop(columns=''index'')
together = df_fcst.merge(X_test_full, how=''right'')
# drop rows where prediction or actuals are nan
# happens because of missing actuals
# or at edges of time due to lags/rolling windows
clean = together[together[[label_column, forecast_column_name]].notnull().all(axis=1)]
return(clean)
combined_output = align_outputs(y_fcst, X_trans, X_test, y_test, forecast_column_name)
'
, @input_data_1 = @input_query
, @input_data_1_name = N'input_data'
, @output_data_1_name = N'combined_output'
, @params = N'@model NVARCHAR(MAX), @time_column_name NVARCHAR(255), @label_column NVARCHAR(255), @y_query_column NVARCHAR(255), @forecast_column_name NVARCHAR(255)'
, @model = @model
, @time_column_name = @time_column_name
, @label_column = @label_column
, @y_query_column = @y_query_column
, @forecast_column_name = @forecast_column_name
END

View File

@@ -1,70 +0,0 @@
-- This procedure returns a list of metrics for each iteration of a run.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE OR ALTER PROCEDURE [dbo].[AutoMLGetMetrics]
(
@run_id NVARCHAR(250), -- The RunId
@experiment_name NVARCHAR(32)='automl-sql-test', -- This can be used to find the experiment in the Azure Portal.
@connection_name NVARCHAR(255)='default' -- The AML connection to use.
) AS
BEGIN
DECLARE @tenantid NVARCHAR(255)
DECLARE @appid NVARCHAR(255)
DECLARE @password NVARCHAR(255)
DECLARE @config_file NVARCHAR(255)
SELECT @tenantid=TenantId, @appid=AppId, @password=Password, @config_file=ConfigFile
FROM aml_connection
WHERE ConnectionName = @connection_name;
EXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd
import logging
import azureml.core
import numpy as np
from azureml.core.experiment import Experiment
from azureml.train.automl.run import AutoMLRun
from azureml.core.authentication import ServicePrincipalAuthentication
from azureml.core.workspace import Workspace
auth = ServicePrincipalAuthentication(tenantid, appid, password)
ws = Workspace.from_config(path=config_file, auth=auth)
experiment = Experiment(ws, experiment_name)
ml_run = AutoMLRun(experiment = experiment, run_id = run_id)
children = list(ml_run.get_children())
iterationlist = []
metricnamelist = []
metricvaluelist = []
for run in children:
properties = run.get_properties()
if "iteration" in properties:
iteration = int(properties["iteration"])
for metric_name, metric_value in run.get_metrics().items():
if isinstance(metric_value, float):
iterationlist.append(iteration)
metricnamelist.append(metric_name)
metricvaluelist.append(metric_value)
metrics = pd.DataFrame({"iteration": iterationlist, "metric_name": metricnamelist, "metric_value": metricvaluelist})
'
, @output_data_1_name = N'metrics'
, @params = N'@run_id NVARCHAR(250),
@experiment_name NVARCHAR(32),
@tenantid NVARCHAR(255),
@appid NVARCHAR(255),
@password NVARCHAR(255),
@config_file NVARCHAR(255)'
, @run_id = @run_id
, @experiment_name = @experiment_name
, @tenantid = @tenantid
, @appid = @appid
, @password = @password
, @config_file = @config_file
WITH RESULT SETS ((iteration INT, metric_name NVARCHAR(100), metric_value FLOAT))
END

View File

@@ -1,41 +0,0 @@
-- This procedure predicts values based on a model returned by AutoMLTrain and a dataset.
-- It returns the dataset with a new column added, which is the predicted value.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE OR ALTER PROCEDURE [dbo].[AutoMLPredict]
(
@input_query NVARCHAR(MAX), -- A SQL query returning data to predict on.
@model NVARCHAR(MAX), -- A model returned from AutoMLTrain.
@label_column NVARCHAR(255)='' -- Optional name of the column from input_query, which should be ignored when predicting
) AS
BEGIN
EXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd
import azureml.core
import numpy as np
from azureml.train.automl import AutoMLConfig
import pickle
import codecs
model_obj = pickle.loads(codecs.decode(model.encode(), "base64"))
test_data = input_data.copy()
if label_column != "" and label_column is not None:
y_test = test_data.pop(label_column).values
X_test = test_data
predicted = model_obj.predict(X_test)
combined_output = input_data.assign(predicted=predicted)
'
, @input_data_1 = @input_query
, @input_data_1_name = N'input_data'
, @output_data_1_name = N'combined_output'
, @params = N'@model NVARCHAR(MAX), @label_column NVARCHAR(255)'
, @model = @model
, @label_column = @label_column
END

View File

@@ -1,240 +0,0 @@
-- This stored procedure uses automated machine learning to train several models
-- and returns the best model.
--
-- The result set has several columns:
-- best_run - iteration ID for the best model
-- experiment_name - experiment name pass in with the @experiment_name parameter
-- fitted_model - best model found
-- log_file_text - AutoML debug_log contents
-- workspace - name of the Azure ML workspace where run history is stored
--
-- An example call for a classification problem is:
-- insert into dbo.aml_model(RunId, ExperimentName, Model, LogFileText, WorkspaceName)
-- exec dbo.AutoMLTrain @input_query='
-- SELECT top 100000
-- CAST([pickup_datetime] AS NVARCHAR(30)) AS pickup_datetime
-- ,CAST([dropoff_datetime] AS NVARCHAR(30)) AS dropoff_datetime
-- ,[passenger_count]
-- ,[trip_time_in_secs]
-- ,[trip_distance]
-- ,[payment_type]
-- ,[tip_class]
-- FROM [dbo].[nyctaxi_sample] order by [hack_license] ',
-- @label_column = 'tip_class',
-- @iterations=10
--
-- An example call for forecasting is:
-- insert into dbo.aml_model(RunId, ExperimentName, Model, LogFileText, WorkspaceName)
-- exec dbo.AutoMLTrain @input_query='
-- select cast(timeStamp as nvarchar(30)) as timeStamp,
-- demand,
-- precip,
-- temp,
-- case when timeStamp < ''2017-01-01'' then 0 else 1 end as is_validate_column
-- from nyc_energy
-- where demand is not null and precip is not null and temp is not null
-- and timeStamp < ''2017-02-01''',
-- @label_column='demand',
-- @task='forecasting',
-- @iterations=10,
-- @iteration_timeout_minutes=5,
-- @time_column_name='timeStamp',
-- @is_validate_column='is_validate_column',
-- @experiment_name='automl-sql-forecast',
-- @primary_metric='normalized_root_mean_squared_error'
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE OR ALTER PROCEDURE [dbo].[AutoMLTrain]
(
@input_query NVARCHAR(MAX), -- The SQL Query that will return the data to train and validate the model.
@label_column NVARCHAR(255)='Label', -- The name of the column in the result of @input_query that is the label.
@primary_metric NVARCHAR(40)='AUC_weighted', -- The metric to optimize.
@iterations INT=100, -- The maximum number of pipelines to train.
@task NVARCHAR(40)='classification', -- The type of task. Can be classification, regression or forecasting.
@experiment_name NVARCHAR(32)='automl-sql-test', -- This can be used to find the experiment in the Azure Portal.
@iteration_timeout_minutes INT = 15, -- The maximum time in minutes for training a single pipeline.
@experiment_timeout_minutes INT = 60, -- The maximum time in minutes for training all pipelines.
@n_cross_validations INT = 3, -- The number of cross validations.
@blacklist_models NVARCHAR(MAX) = '', -- A comma separated list of algos that will not be used.
-- The list of possible models can be found at:
-- https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-configure-auto-train#configure-your-experiment-settings
@whitelist_models NVARCHAR(MAX) = '', -- A comma separated list of algos that can be used.
-- The list of possible models can be found at:
-- https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-configure-auto-train#configure-your-experiment-settings
@experiment_exit_score FLOAT = 0, -- Stop the experiment if this score is acheived.
@sample_weight_column NVARCHAR(255)='', -- The name of the column in the result of @input_query that gives a sample weight.
@is_validate_column NVARCHAR(255)='', -- The name of the column in the result of @input_query that indicates if the row is for training or validation.
-- In the values of the column, 0 means for training and 1 means for validation.
@time_column_name NVARCHAR(255)='', -- The name of the timestamp column for forecasting.
@connection_name NVARCHAR(255)='default', -- The AML connection to use.
@max_horizon INT = 0 -- A forecast horizon is a time span into the future (or just beyond the latest date in the training data)
-- where forecasts of the target quantity are needed.
-- For example, if data is recorded daily and max_horizon is 5, we will predict 5 days ahead.
) AS
BEGIN
DECLARE @tenantid NVARCHAR(255)
DECLARE @appid NVARCHAR(255)
DECLARE @password NVARCHAR(255)
DECLARE @config_file NVARCHAR(255)
SELECT @tenantid=TenantId, @appid=AppId, @password=Password, @config_file=ConfigFile
FROM aml_connection
WHERE ConnectionName = @connection_name;
EXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd
import logging
import azureml.core
import pandas as pd
import numpy as np
from azureml.core.experiment import Experiment
from azureml.train.automl import AutoMLConfig
from sklearn import datasets
import pickle
import codecs
from azureml.core.authentication import ServicePrincipalAuthentication
from azureml.core.workspace import Workspace
if __name__.startswith("sqlindb"):
auth = ServicePrincipalAuthentication(tenantid, appid, password)
ws = Workspace.from_config(path=config_file, auth=auth)
project_folder = "./sample_projects/" + experiment_name
experiment = Experiment(ws, experiment_name)
data_train = input_data
X_valid = None
y_valid = None
sample_weight_valid = None
if is_validate_column != "" and is_validate_column is not None:
data_train = input_data[input_data[is_validate_column] <= 0]
data_valid = input_data[input_data[is_validate_column] > 0]
data_train.pop(is_validate_column)
data_valid.pop(is_validate_column)
y_valid = data_valid.pop(label_column).values
if sample_weight_column != "" and sample_weight_column is not None:
sample_weight_valid = data_valid.pop(sample_weight_column).values
X_valid = data_valid
n_cross_validations = None
y_train = data_train.pop(label_column).values
sample_weight = None
if sample_weight_column != "" and sample_weight_column is not None:
sample_weight = data_train.pop(sample_weight_column).values
X_train = data_train
if experiment_timeout_minutes == 0:
experiment_timeout_minutes = None
if experiment_exit_score == 0:
experiment_exit_score = None
if blacklist_models == "":
blacklist_models = None
if blacklist_models is not None:
blacklist_models = blacklist_models.replace(" ", "").split(",")
if whitelist_models == "":
whitelist_models = None
if whitelist_models is not None:
whitelist_models = whitelist_models.replace(" ", "").split(",")
automl_settings = {}
preprocess = True
if time_column_name != "" and time_column_name is not None:
automl_settings = { "time_column_name": time_column_name }
preprocess = False
if max_horizon > 0:
automl_settings["max_horizon"] = max_horizon
log_file_name = "automl_sqlindb_errors.log"
automl_config = AutoMLConfig(task = task,
debug_log = log_file_name,
primary_metric = primary_metric,
iteration_timeout_minutes = iteration_timeout_minutes,
experiment_timeout_minutes = experiment_timeout_minutes,
iterations = iterations,
n_cross_validations = n_cross_validations,
preprocess = preprocess,
verbosity = logging.INFO,
X = X_train,
y = y_train,
path = project_folder,
blacklist_models = blacklist_models,
whitelist_models = whitelist_models,
experiment_exit_score = experiment_exit_score,
sample_weight = sample_weight,
X_valid = X_valid,
y_valid = y_valid,
sample_weight_valid = sample_weight_valid,
**automl_settings)
local_run = experiment.submit(automl_config, show_output = True)
best_run, fitted_model = local_run.get_output()
pickled_model = codecs.encode(pickle.dumps(fitted_model), "base64").decode()
log_file_text = ""
try:
with open(log_file_name, "r") as log_file:
log_file_text = log_file.read()
except:
log_file_text = "Log file not found"
returned_model = pd.DataFrame({"best_run": [best_run.id], "experiment_name": [experiment_name], "fitted_model": [pickled_model], "log_file_text": [log_file_text], "workspace": [ws.name]}, dtype=np.dtype(np.str))
'
, @input_data_1 = @input_query
, @input_data_1_name = N'input_data'
, @output_data_1_name = N'returned_model'
, @params = N'@label_column NVARCHAR(255),
@primary_metric NVARCHAR(40),
@iterations INT, @task NVARCHAR(40),
@experiment_name NVARCHAR(32),
@iteration_timeout_minutes INT,
@experiment_timeout_minutes INT,
@n_cross_validations INT,
@blacklist_models NVARCHAR(MAX),
@whitelist_models NVARCHAR(MAX),
@experiment_exit_score FLOAT,
@sample_weight_column NVARCHAR(255),
@is_validate_column NVARCHAR(255),
@time_column_name NVARCHAR(255),
@tenantid NVARCHAR(255),
@appid NVARCHAR(255),
@password NVARCHAR(255),
@config_file NVARCHAR(255),
@max_horizon INT'
, @label_column = @label_column
, @primary_metric = @primary_metric
, @iterations = @iterations
, @task = @task
, @experiment_name = @experiment_name
, @iteration_timeout_minutes = @iteration_timeout_minutes
, @experiment_timeout_minutes = @experiment_timeout_minutes
, @n_cross_validations = @n_cross_validations
, @blacklist_models = @blacklist_models
, @whitelist_models = @whitelist_models
, @experiment_exit_score = @experiment_exit_score
, @sample_weight_column = @sample_weight_column
, @is_validate_column = @is_validate_column
, @time_column_name = @time_column_name
, @tenantid = @tenantid
, @appid = @appid
, @password = @password
, @config_file = @config_file
, @max_horizon = @max_horizon
WITH RESULT SETS ((best_run NVARCHAR(250), experiment_name NVARCHAR(100), fitted_model VARCHAR(MAX), log_file_text NVARCHAR(MAX), workspace NVARCHAR(100)))
END
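GO
-- Example follow-up (a sketch, assuming the dbo.aml_model table and the
-- dbo.AutoMLGetMetrics procedure defined elsewhere in this sample have been created):
-- list the per-iteration metrics for the most recently stored training run.
DECLARE @latest_run_id NVARCHAR(250);
SELECT TOP 1 @latest_run_id = RunId FROM dbo.aml_model ORDER BY CreatedDate DESC;
EXEC dbo.AutoMLGetMetrics @run_id = @latest_run_id, @experiment_name = 'automl-sql-test';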

View File

@@ -1,18 +0,0 @@
-- This is a table to store the Azure ML connection information.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[aml_connection](
[Id] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
[ConnectionName] [nvarchar](255) NULL,
[TenantId] [nvarchar](255) NULL,
[AppId] [nvarchar](255) NULL,
[Password] [nvarchar](255) NULL,
[ConfigFile] [nvarchar](255) NULL
) ON [PRIMARY]
GO
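-- Example row (a sketch mirroring the setup notebook in this sample; the tenant id,
-- app id, password, and config path are placeholders to replace with the values
-- returned by "az ad sp create-for-rbac" and your own workspace config location):
INSERT INTO [dbo].[aml_connection] (ConnectionName, TenantId, AppId, Password, ConfigFile)
VALUES (N'Default',
        N'11111111-2222-3333-4444-555555555555', -- tenant
        N'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee', -- appId
        N'insertpasswordhere',                   -- password
        N'/tmp/aml/config.json');                -- path to the workspace config.json
GO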

View File

@@ -1,22 +0,0 @@
-- This is a table to hold the results from the AutoMLTrain procedure.
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[aml_model](
[Id] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,
[Model] [varchar](max) NOT NULL, -- The model, which can be passed to AutoMLPredict for testing or prediction.
[RunId] [nvarchar](250) NULL, -- The RunId, which can be used to view the model in the Azure Portal.
[CreatedDate] [datetime] NULL,
[ExperimentName] [nvarchar](100) NULL, -- Azure ML Experiment Name
[WorkspaceName] [nvarchar](100) NULL, -- Azure ML Workspace Name
[LogFileText] [nvarchar](max) NULL
)
GO
ALTER TABLE [dbo].[aml_model] ADD DEFAULT (getutcdate()) FOR [CreatedDate]
GO
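-- Example usage (a sketch; AutoMLPredict is defined in the setup notebook in this
-- sample, and the input query and label column below are illustrative, mirroring the
-- classification example in the AutoMLTrain header): score new rows with the most
-- recently stored model.
DECLARE @latest_model NVARCHAR(MAX);
SELECT TOP 1 @latest_model = Model FROM dbo.aml_model ORDER BY CreatedDate DESC;
EXEC dbo.AutoMLPredict
    @input_query = 'SELECT TOP 100
                        CAST([pickup_datetime] AS NVARCHAR(30)) AS pickup_datetime,
                        CAST([dropoff_datetime] AS NVARCHAR(30)) AS dropoff_datetime,
                        [passenger_count], [trip_time_in_secs], [trip_distance],
                        [payment_type], [tip_class]
                    FROM [dbo].[nyctaxi_sample]',
    @model = @latest_model,
    @label_column = 'tip_class';
GO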

View File

@@ -1,581 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Set up Azure ML Automated Machine Learning on SQL Server 2019 CTP 2.4 big data cluster\r\n",
"\r\n",
"\\# Prerequisites: \r\n",
"\\# - An Azure subscription and resource group \r\n",
"\\# - An Azure Machine Learning workspace \r\n",
"\\# - A SQL Server 2019 CTP 2.4 big data cluster with Internet access and a database named 'automl' \r\n",
"\\# - Azure CLI \r\n",
"\\# - kubectl command \r\n",
"\\# - The https://github.com/Azure/MachineLearningNotebooks repository downloaded (cloned) to your local machine\r\n",
"\r\n",
"\\# In the 'automl' database, create a table named 'dbo.nyc_energy' as follows: \r\n",
"\\# - In SQL Server Management Studio, right-click the 'automl' database, select Tasks, then Import Flat File. \r\n",
"\\# - Select the file AzureMlCli\\notebooks\\how-to-use-azureml\\automated-machine-learning\\forecasting-energy-demand\\nyc_energy.csv. \r\n",
"\\# - Using the \"Modify Columns\" page, allow nulls for all columns. \r\n",
"\r\n",
"\\# Create an Azure Machine Learning Workspace using the instructions at https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-manage-workspace \r\n",
"\r\n",
"\\# Create an Azure service principal. You can do this with the following commands: \r\n",
"\r\n",
"az login \r\n",
"az account set --subscription *subscriptionid* \r\n",
"\r\n",
"\\# The following command prints out the **appId** and **tenant**, \r\n",
"\\# which you insert into the indicated cell later in this notebook \r\n",
"\\# to allow AutoML to authenticate with Azure: \r\n",
"\r\n",
"az ad sp create-for-rbac --name *principlename* --password *password*\r\n",
"\r\n",
"\\# Log into the master instance of SQL Server 2019 CTP 2.4: \r\n",
"kubectl exec -it mssql-master-pool-0 -n *clustername* -c mssql-server -- /bin/bash\r\n",
"\r\n",
"mkdir /tmp/aml\r\n",
"\r\n",
"cd /tmp/aml\r\n",
"\r\n",
"\\# **Modify** the following with your subscription_id, resource_group, and workspace_name: \r\n",
"cat > config.json << EOF \r\n",
"{ \r\n",
" \"subscription_id\": \"123456ab-78cd-0123-45ef-abcd12345678\", \r\n",
" \"resource_group\": \"myrg1\", \r\n",
" \"workspace_name\": \"myws1\" \r\n",
"} \r\n",
"EOF\r\n",
"\r\n",
"\\# The directory referenced below is appropriate for the master instance of SQL Server 2019 CTP 2.4.\r\n",
"\r\n",
"cd /opt/mssql/mlservices/runtime/python/bin\r\n",
"\r\n",
"./python -m pip install azureml-sdk[automl]\r\n",
"\r\n",
"./python -m pip install --upgrade numpy \r\n",
"\r\n",
"./python -m pip install --upgrade sklearn\r\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/automated-machine-learning/sql-server/setup/auto-ml-sql-setup.png)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- Enable external scripts to allow invoking Python\r\n",
"sp_configure 'external scripts enabled',1 \r\n",
"reconfigure with override \r\n",
"GO\r\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- Use database 'automl'\r\n",
"USE [automl]\r\n",
"GO"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- This is a table to hold the Azure ML connection information.\r\n",
"SET ANSI_NULLS ON\r\n",
"GO\r\n",
"\r\n",
"SET QUOTED_IDENTIFIER ON\r\n",
"GO\r\n",
"\r\n",
"CREATE TABLE [dbo].[aml_connection](\r\n",
" [Id] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,\r\n",
"\t[ConnectionName] [nvarchar](255) NULL,\r\n",
"\t[TenantId] [nvarchar](255) NULL,\r\n",
"\t[AppId] [nvarchar](255) NULL,\r\n",
"\t[Password] [nvarchar](255) NULL,\r\n",
"\t[ConfigFile] [nvarchar](255) NULL\r\n",
") ON [PRIMARY]\r\n",
"GO"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Copy the values from create-for-rbac above into the cell below"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- Use the following values:\r\n",
"-- Leave the name as 'Default'\r\n",
"-- Insert <tenant> returned by create-for-rbac above\r\n",
"-- Insert <AppId> returned by create-for-rbac above\r\n",
"-- Insert <password> used in create-for-rbac above\r\n",
"-- Leave <path> as '/tmp/aml/config.json'\r\n",
"INSERT INTO [dbo].[aml_connection] \r\n",
"VALUES (\r\n",
" N'Default', -- Name\r\n",
" N'11111111-2222-3333-4444-555555555555', -- Tenant\r\n",
" N'aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee', -- AppId\r\n",
" N'insertpasswordhere', -- Password\r\n",
" N'/tmp/aml/config.json' -- Path\r\n",
" );\r\n",
"GO"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- This is a table to hold the results from the AutoMLTrain procedure.\r\n",
"SET ANSI_NULLS ON\r\n",
"GO\r\n",
"\r\n",
"SET QUOTED_IDENTIFIER ON\r\n",
"GO\r\n",
"\r\n",
"CREATE TABLE [dbo].[aml_model](\r\n",
" [Id] [int] IDENTITY(1,1) NOT NULL PRIMARY KEY,\r\n",
" [Model] [varchar](max) NOT NULL, -- The model, which can be passed to AutoMLPredict for testing or prediction.\r\n",
" [RunId] [nvarchar](250) NULL, -- The RunId, which can be used to view the model in the Azure Portal.\r\n",
" [CreatedDate] [datetime] NULL,\r\n",
" [ExperimentName] [nvarchar](100) NULL, -- Azure ML Experiment Name\r\n",
" [WorkspaceName] [nvarchar](100) NULL, -- Azure ML Workspace Name\r\n",
"\t[LogFileText] [nvarchar](max) NULL\r\n",
") \r\n",
"GO\r\n",
"\r\n",
"ALTER TABLE [dbo].[aml_model] ADD DEFAULT (getutcdate()) FOR [CreatedDate]\r\n",
"GO\r\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- This stored procedure uses automated machine learning to train several models\r\n",
"-- and return the best model.\r\n",
"--\r\n",
"-- The result set has several columns:\r\n",
"-- best_run - ID of the best model found\r\n",
"-- experiment_name - training run name\r\n",
"-- fitted_model - best model found\r\n",
"-- log_file_text - console output\r\n",
"-- workspace - name of the Azure ML workspace where run history is stored\r\n",
"--\r\n",
"-- An example call for a classification problem is:\r\n",
"-- insert into dbo.aml_model(RunId, ExperimentName, Model, LogFileText, WorkspaceName)\r\n",
"-- exec dbo.AutoMLTrain @input_query='\r\n",
"-- SELECT top 100000 \r\n",
"-- CAST([pickup_datetime] AS NVARCHAR(30)) AS pickup_datetime\r\n",
"-- ,CAST([dropoff_datetime] AS NVARCHAR(30)) AS dropoff_datetime\r\n",
"-- ,[passenger_count]\r\n",
"-- ,[trip_time_in_secs]\r\n",
"-- ,[trip_distance]\r\n",
"-- ,[payment_type]\r\n",
"-- ,[tip_class]\r\n",
"-- FROM [dbo].[nyctaxi_sample] order by [hack_license] ',\r\n",
"-- @label_column = 'tip_class',\r\n",
"-- @iterations=10\r\n",
"-- \r\n",
"-- An example call for forecasting is:\r\n",
"-- insert into dbo.aml_model(RunId, ExperimentName, Model, LogFileText, WorkspaceName)\r\n",
"-- exec dbo.AutoMLTrain @input_query='\r\n",
"-- select cast(timeStamp as nvarchar(30)) as timeStamp,\r\n",
"-- demand,\r\n",
"-- \t precip,\r\n",
"-- \t temp,\r\n",
"-- case when timeStamp < ''2017-01-01'' then 0 else 1 end as is_validate_column\r\n",
"-- from nyc_energy\r\n",
"-- where demand is not null and precip is not null and temp is not null\r\n",
"-- and timeStamp < ''2017-02-01''',\r\n",
"-- @label_column='demand',\r\n",
"-- @task='forecasting',\r\n",
"-- @iterations=10,\r\n",
"-- @iteration_timeout_minutes=5,\r\n",
"-- @time_column_name='timeStamp',\r\n",
"-- @is_validate_column='is_validate_column',\r\n",
"-- @experiment_name='automl-sql-forecast',\r\n",
"-- @primary_metric='normalized_root_mean_squared_error'\r\n",
"\r\n",
"SET ANSI_NULLS ON\r\n",
"GO\r\n",
"SET QUOTED_IDENTIFIER ON\r\n",
"GO\r\n",
"CREATE OR ALTER PROCEDURE [dbo].[AutoMLTrain]\r\n",
" (\r\n",
" @input_query NVARCHAR(MAX), -- The SQL Query that will return the data to train and validate the model.\r\n",
" @label_column NVARCHAR(255)='Label', -- The name of the column in the result of @input_query that is the label.\r\n",
" @primary_metric NVARCHAR(40)='AUC_weighted', -- The metric to optimize.\r\n",
" @iterations INT=100, -- The maximum number of pipelines to train.\r\n",
" @task NVARCHAR(40)='classification', -- The type of task. Can be classification, regression or forecasting.\r\n",
" @experiment_name NVARCHAR(32)='automl-sql-test', -- This can be used to find the experiment in the Azure Portal.\r\n",
" @iteration_timeout_minutes INT = 15, -- The maximum time in minutes for training a single pipeline. \r\n",
" @experiment_timeout_minutes INT = 60, -- The maximum time in minutes for training all pipelines.\r\n",
" @n_cross_validations INT = 3, -- The number of cross validations.\r\n",
" @blacklist_models NVARCHAR(MAX) = '', -- A comma separated list of algos that will not be used.\r\n",
" -- The list of possible models can be found at:\r\n",
" -- https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-configure-auto-train#configure-your-experiment-settings\r\n",
" @whitelist_models NVARCHAR(MAX) = '', -- A comma separated list of algos that can be used.\r\n",
" -- The list of possible models can be found at:\r\n",
" -- https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-configure-auto-train#configure-your-experiment-settings\r\n",
" @experiment_exit_score FLOAT = 0, -- Stop the experiment if this score is acheived.\r\n",
" @sample_weight_column NVARCHAR(255)='', -- The name of the column in the result of @input_query that gives a sample weight.\r\n",
" @is_validate_column NVARCHAR(255)='', -- The name of the column in the result of @input_query that indicates if the row is for training or validation.\r\n",
"\t -- In the values of the column, 0 means for training and 1 means for validation.\r\n",
" @time_column_name NVARCHAR(255)='', -- The name of the timestamp column for forecasting.\r\n",
"\t@connection_name NVARCHAR(255)='default' -- The AML connection to use.\r\n",
" ) AS\r\n",
"BEGIN\r\n",
"\r\n",
" DECLARE @tenantid NVARCHAR(255)\r\n",
" DECLARE @appid NVARCHAR(255)\r\n",
" DECLARE @password NVARCHAR(255)\r\n",
" DECLARE @config_file NVARCHAR(255)\r\n",
"\r\n",
"\tSELECT @tenantid=TenantId, @appid=AppId, @password=Password, @config_file=ConfigFile\r\n",
"\tFROM aml_connection\r\n",
"\tWHERE ConnectionName = @connection_name;\r\n",
"\r\n",
"\tEXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd\r\n",
"import logging \r\n",
"import azureml.core \r\n",
"import pandas as pd\r\n",
"import numpy as np\r\n",
"from azureml.core.experiment import Experiment \r\n",
"from azureml.train.automl import AutoMLConfig \r\n",
"from sklearn import datasets \r\n",
"import pickle\r\n",
"import codecs\r\n",
"from azureml.core.authentication import ServicePrincipalAuthentication \r\n",
"from azureml.core.workspace import Workspace \r\n",
"\r\n",
"if __name__.startswith(\"sqlindb\"):\r\n",
" auth = ServicePrincipalAuthentication(tenantid, appid, password) \r\n",
" \r\n",
" ws = Workspace.from_config(path=config_file, auth=auth) \r\n",
" \r\n",
" project_folder = \"./sample_projects/\" + experiment_name\r\n",
" \r\n",
" experiment = Experiment(ws, experiment_name) \r\n",
"\r\n",
" data_train = input_data\r\n",
" X_valid = None\r\n",
" y_valid = None\r\n",
" sample_weight_valid = None\r\n",
"\r\n",
" if is_validate_column != \"\" and is_validate_column is not None:\r\n",
" data_train = input_data[input_data[is_validate_column] <= 0]\r\n",
" data_valid = input_data[input_data[is_validate_column] > 0]\r\n",
" data_train.pop(is_validate_column)\r\n",
" data_valid.pop(is_validate_column)\r\n",
" y_valid = data_valid.pop(label_column).values\r\n",
" if sample_weight_column != \"\" and sample_weight_column is not None:\r\n",
" sample_weight_valid = data_valid.pop(sample_weight_column).values\r\n",
" X_valid = data_valid\r\n",
" n_cross_validations = None\r\n",
"\r\n",
" y_train = data_train.pop(label_column).values\r\n",
"\r\n",
" sample_weight = None\r\n",
" if sample_weight_column != \"\" and sample_weight_column is not None:\r\n",
" sample_weight = data_train.pop(sample_weight_column).values\r\n",
"\r\n",
" X_train = data_train\r\n",
"\r\n",
" if experiment_timeout_minutes == 0:\r\n",
" experiment_timeout_minutes = None\r\n",
"\r\n",
" if experiment_exit_score == 0:\r\n",
" experiment_exit_score = None\r\n",
"\r\n",
" if blacklist_models == \"\":\r\n",
" blacklist_models = None\r\n",
"\r\n",
" if blacklist_models is not None:\r\n",
" blacklist_models = blacklist_models.replace(\" \", \"\").split(\",\")\r\n",
"\r\n",
" if whitelist_models == \"\":\r\n",
" whitelist_models = None\r\n",
"\r\n",
" if whitelist_models is not None:\r\n",
" whitelist_models = whitelist_models.replace(\" \", \"\").split(\",\")\r\n",
"\r\n",
" automl_settings = {}\r\n",
" preprocess = True\r\n",
" if time_column_name != \"\" and time_column_name is not None:\r\n",
" automl_settings = { \"time_column_name\": time_column_name }\r\n",
" preprocess = False\r\n",
"\r\n",
" log_file_name = \"automl_errors.log\"\r\n",
"\t \r\n",
" automl_config = AutoMLConfig(task = task, \r\n",
" debug_log = log_file_name, \r\n",
" primary_metric = primary_metric, \r\n",
" iteration_timeout_minutes = iteration_timeout_minutes, \r\n",
" experiment_timeout_minutes = experiment_timeout_minutes,\r\n",
" iterations = iterations, \r\n",
" n_cross_validations = n_cross_validations, \r\n",
" preprocess = preprocess,\r\n",
" verbosity = logging.INFO, \r\n",
" X = X_train, \r\n",
" y = y_train, \r\n",
" path = project_folder,\r\n",
" blacklist_models = blacklist_models,\r\n",
" whitelist_models = whitelist_models,\r\n",
" experiment_exit_score = experiment_exit_score,\r\n",
" sample_weight = sample_weight,\r\n",
" X_valid = X_valid,\r\n",
" y_valid = y_valid,\r\n",
" sample_weight_valid = sample_weight_valid,\r\n",
" **automl_settings) \r\n",
" \r\n",
" local_run = experiment.submit(automl_config, show_output = True) \r\n",
"\r\n",
" best_run, fitted_model = local_run.get_output()\r\n",
"\r\n",
" pickled_model = codecs.encode(pickle.dumps(fitted_model), \"base64\").decode()\r\n",
"\r\n",
" log_file_text = \"\"\r\n",
"\r\n",
" try:\r\n",
" with open(log_file_name, \"r\") as log_file:\r\n",
" log_file_text = log_file.read()\r\n",
" except:\r\n",
" log_file_text = \"Log file not found\"\r\n",
"\r\n",
" returned_model = pd.DataFrame({\"best_run\": [best_run.id], \"experiment_name\": [experiment_name], \"fitted_model\": [pickled_model], \"log_file_text\": [log_file_text], \"workspace\": [ws.name]}, dtype=np.dtype(np.str))\r\n",
"'\r\n",
"\t, @input_data_1 = @input_query\r\n",
"\t, @input_data_1_name = N'input_data'\r\n",
"\t, @output_data_1_name = N'returned_model'\r\n",
"\t, @params = N'@label_column NVARCHAR(255), \r\n",
"\t @primary_metric NVARCHAR(40),\r\n",
"\t\t\t\t @iterations INT, @task NVARCHAR(40),\r\n",
"\t\t\t\t @experiment_name NVARCHAR(32),\r\n",
"\t\t\t\t @iteration_timeout_minutes INT,\r\n",
"\t\t\t\t @experiment_timeout_minutes INT,\r\n",
"\t\t\t\t @n_cross_validations INT,\r\n",
"\t\t\t\t @blacklist_models NVARCHAR(MAX),\r\n",
"\t\t\t\t @whitelist_models NVARCHAR(MAX),\r\n",
"\t\t\t\t @experiment_exit_score FLOAT,\r\n",
"\t\t\t\t @sample_weight_column NVARCHAR(255),\r\n",
"\t\t\t\t @is_validate_column NVARCHAR(255),\r\n",
"\t\t\t\t @time_column_name NVARCHAR(255),\r\n",
"\t\t\t\t @tenantid NVARCHAR(255),\r\n",
"\t\t\t\t @appid NVARCHAR(255),\r\n",
"\t\t\t\t @password NVARCHAR(255),\r\n",
"\t\t\t\t @config_file NVARCHAR(255)'\r\n",
"\t, @label_column = @label_column\r\n",
"\t, @primary_metric = @primary_metric\r\n",
"\t, @iterations = @iterations\r\n",
"\t, @task = @task\r\n",
"\t, @experiment_name = @experiment_name\r\n",
"\t, @iteration_timeout_minutes = @iteration_timeout_minutes\r\n",
"\t, @experiment_timeout_minutes = @experiment_timeout_minutes\r\n",
"\t, @n_cross_validations = @n_cross_validations\r\n",
"\t, @blacklist_models = @blacklist_models\r\n",
"\t, @whitelist_models = @whitelist_models\r\n",
"\t, @experiment_exit_score = @experiment_exit_score\r\n",
"\t, @sample_weight_column = @sample_weight_column\r\n",
"\t, @is_validate_column = @is_validate_column\r\n",
"\t, @time_column_name = @time_column_name\r\n",
"\t, @tenantid = @tenantid\r\n",
"\t, @appid = @appid\r\n",
"\t, @password = @password\r\n",
"\t, @config_file = @config_file\r\n",
"WITH RESULT SETS ((best_run NVARCHAR(250), experiment_name NVARCHAR(100), fitted_model VARCHAR(MAX), log_file_text NVARCHAR(MAX), workspace NVARCHAR(100)))\r\n",
"END"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- This procedure returns a list of metrics for each iteration of a training run.\r\n",
"SET ANSI_NULLS ON\r\n",
"GO\r\n",
"SET QUOTED_IDENTIFIER ON\r\n",
"GO\r\n",
"CREATE OR ALTER PROCEDURE [dbo].[AutoMLGetMetrics]\r\n",
" (\r\n",
"\t@run_id NVARCHAR(250), -- The RunId\r\n",
" @experiment_name NVARCHAR(32)='automl-sql-test', -- This can be used to find the experiment in the Azure Portal.\r\n",
" @connection_name NVARCHAR(255)='default' -- The AML connection to use.\r\n",
" ) AS\r\n",
"BEGIN\r\n",
" DECLARE @tenantid NVARCHAR(255)\r\n",
" DECLARE @appid NVARCHAR(255)\r\n",
" DECLARE @password NVARCHAR(255)\r\n",
" DECLARE @config_file NVARCHAR(255)\r\n",
"\r\n",
"\tSELECT @tenantid=TenantId, @appid=AppId, @password=Password, @config_file=ConfigFile\r\n",
"\tFROM aml_connection\r\n",
"\tWHERE ConnectionName = @connection_name;\r\n",
"\r\n",
" EXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd\r\n",
"import logging \r\n",
"import azureml.core \r\n",
"import numpy as np\r\n",
"from azureml.core.experiment import Experiment \r\n",
"from azureml.train.automl.run import AutoMLRun\r\n",
"from azureml.core.authentication import ServicePrincipalAuthentication \r\n",
"from azureml.core.workspace import Workspace \r\n",
"\r\n",
"auth = ServicePrincipalAuthentication(tenantid, appid, password) \r\n",
" \r\n",
"ws = Workspace.from_config(path=config_file, auth=auth) \r\n",
" \r\n",
"experiment = Experiment(ws, experiment_name) \r\n",
"\r\n",
"ml_run = AutoMLRun(experiment = experiment, run_id = run_id)\r\n",
"\r\n",
"children = list(ml_run.get_children())\r\n",
"iterationlist = []\r\n",
"metricnamelist = []\r\n",
"metricvaluelist = []\r\n",
"\r\n",
"for run in children:\r\n",
" properties = run.get_properties()\r\n",
" if \"iteration\" in properties:\r\n",
" iteration = int(properties[\"iteration\"])\r\n",
" for metric_name, metric_value in run.get_metrics().items():\r\n",
" if isinstance(metric_value, float):\r\n",
" iterationlist.append(iteration)\r\n",
" metricnamelist.append(metric_name)\r\n",
" metricvaluelist.append(metric_value)\r\n",
" \r\n",
"metrics = pd.DataFrame({\"iteration\": iterationlist, \"metric_name\": metricnamelist, \"metric_value\": metricvaluelist})\r\n",
"'\r\n",
" , @output_data_1_name = N'metrics'\r\n",
"\t, @params = N'@run_id NVARCHAR(250), \r\n",
"\t\t\t\t @experiment_name NVARCHAR(32),\r\n",
" \t\t\t\t @tenantid NVARCHAR(255),\r\n",
"\t\t\t\t @appid NVARCHAR(255),\r\n",
"\t\t\t\t @password NVARCHAR(255),\r\n",
"\t\t\t\t @config_file NVARCHAR(255)'\r\n",
" , @run_id = @run_id\r\n",
"\t, @experiment_name = @experiment_name\r\n",
"\t, @tenantid = @tenantid\r\n",
"\t, @appid = @appid\r\n",
"\t, @password = @password\r\n",
"\t, @config_file = @config_file\r\n",
"WITH RESULT SETS ((iteration INT, metric_name NVARCHAR(100), metric_value FLOAT))\r\n",
"END"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"-- This procedure predicts values based on a model returned by AutoMLTrain and a dataset.\r\n",
"-- It returns the dataset with a new column added, which is the predicted value.\r\n",
"SET ANSI_NULLS ON\r\n",
"GO\r\n",
"SET QUOTED_IDENTIFIER ON\r\n",
"GO\r\n",
"CREATE OR ALTER PROCEDURE [dbo].[AutoMLPredict]\r\n",
" (\r\n",
" @input_query NVARCHAR(MAX), -- A SQL query returning data to predict on.\r\n",
" @model NVARCHAR(MAX), -- A model returned from AutoMLTrain.\r\n",
" @label_column NVARCHAR(255)='' -- Optional name of the column from input_query, which should be ignored when predicting\r\n",
" ) AS \r\n",
"BEGIN \r\n",
" \r\n",
" EXEC sp_execute_external_script @language = N'Python', @script = N'import pandas as pd \r\n",
"import azureml.core \r\n",
"import numpy as np \r\n",
"from azureml.train.automl import AutoMLConfig \r\n",
"import pickle \r\n",
"import codecs \r\n",
" \r\n",
"model_obj = pickle.loads(codecs.decode(model.encode(), \"base64\")) \r\n",
" \r\n",
"test_data = input_data.copy() \r\n",
"\r\n",
"if label_column != \"\" and label_column is not None:\r\n",
" y_test = test_data.pop(label_column).values \r\n",
"X_test = test_data \r\n",
" \r\n",
"predicted = model_obj.predict(X_test) \r\n",
" \r\n",
"combined_output = input_data.assign(predicted=predicted)\r\n",
" \r\n",
"' \r\n",
" , @input_data_1 = @input_query \r\n",
" , @input_data_1_name = N'input_data' \r\n",
" , @output_data_1_name = N'combined_output' \r\n",
" , @params = N'@model NVARCHAR(MAX), @label_column NVARCHAR(255)' \r\n",
" , @model = @model \r\n",
"\t, @label_column = @label_column\r\n",
"END"
]
}
],
"metadata": {
"authors": [
{
"name": "jeffshep"
}
],
"category": "tutorial",
"compute": [
"None"
],
"datasets": [
"None"
],
"deployment": [
"None"
],
"exclude_from_index": false,
"framework": [
"Azure ML AutoML"
],
"tags": [
""
],
"friendly_name": "Setup automated ML SQL integration",
"index_order": 1,
"kernelspec": {
"display_name": "Python 3.6",
"language": "sql",
"name": "python36"
},
"language_info": {
"name": "sql",
"version": ""
},
"task": "None"
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -11,6 +11,13 @@
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Register Azure Databricks trained model and deploy it to ACI\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -161,9 +168,9 @@
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) #showing how to add libs as an eg. - not needed for this model.\n",
"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) # showing how to add libs as an eg. - not needed for this model.\n",
"\n",
"with open(\"mydeployenv.yml\",\"w\") as f:\n",
"with open(\"myenv.yml\",\"w\") as f:\n",
" f.write(myacienv.serialize_to_string())"
]
},
@@ -177,6 +184,9 @@
"from azureml.core.webservice import AciWebservice, Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.environment import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"\n",
"myaci_config = AciWebservice.deploy_configuration(cpu_cores = 2, \n",
" memory_gb = 2, \n",
@@ -191,9 +201,16 @@
"except WebserviceException:\n",
" pass\n",
"\n",
"inference_config = InferenceConfig(runtime= 'spark-py', \n",
" entry_script='score_sparkml.py',\n",
" conda_file='mydeployenv.yml')\n",
"myenv = Environment.get(ws, name='AzureML-PySpark-MmlSpark-0.15')\n",
"# we need to add extra packages to procured environment\n",
"# in order to deploy amended environment we need to rename it\n",
"myenv.name = 'myenv'\n",
"model_dependencies = CondaDependencies('myenv.yml')\n",
"for pip_dep in model_dependencies.pip_packages:\n",
" myenv.python.conda_dependencies.add_pip_package(pip_dep)\n",
"for conda_dep in model_dependencies.conda_packages:\n",
" myenv.python.conda_dependencies.add_conda_package(conda_dep)\n",
"inference_config = InferenceConfig(entry_script='score_sparkml.py', environment=myenv)\n",
"\n",
"myservice = Model.deploy(ws, service_name, [mymodel], inference_config, myaci_config)\n",
"myservice.wait_for_deployment(show_output=True)"
@@ -255,6 +272,15 @@
"myservice.delete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploying to other types of computes\n",
"\n",
"In order to learn how to deploy to other types of compute targets, such as AKS, please take a look at the set of notebooks in the [deployment](https://github.com/Azure/MachineLearningNotebooks/tree/master/how-to-use-azureml/deployment) folder."
]
},
{
"cell_type": "markdown",
"metadata": {},

View File

@@ -1,312 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Azure ML & Azure Databricks notebooks by Parashar Shah.\n",
"\n",
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This notebook uses image from ACI notebook for deploying to AKS."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"\n",
"# Check core SDK version number\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Set auth to be used by workspace related APIs.\n",
"# For automation or CI/CD ServicePrincipalAuthentication can be used.\n",
"# https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.authentication.serviceprincipalauthentication?view=azure-ml-py\n",
"auth = None"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config(auth = auth)\n",
"print('Workspace name: ' + ws.name, \n",
" 'Azure region: ' + ws.location, \n",
" 'Subscription id: ' + ws.subscription_id, \n",
" 'Resource group: ' + ws.resource_group, sep = '\\n')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#Register the model\n",
"import os\n",
"from azureml.core.model import Model\n",
"\n",
"model_name = \"AdultCensus_runHistory_aks.mml\" # \n",
"model_name_dbfs = os.path.join(\"/dbfs\", model_name)\n",
"\n",
"print(\"copy model from dbfs to local\")\n",
"model_local = \"file:\" + os.getcwd() + \"/\" + model_name\n",
"dbutils.fs.cp(model_name, model_local, True)\n",
"\n",
"mymodel = Model.register(model_path = model_name, # this points to a local file\n",
" model_name = model_name, # this is the name the model is registered as, am using same name for both path and name. \n",
" description = \"ADB trained model by Parashar\",\n",
" workspace = ws)\n",
"\n",
"print(mymodel.name, mymodel.description, mymodel.version)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#%%writefile score_sparkml.py\n",
"score_sparkml = \"\"\"\n",
" \n",
"import json\n",
" \n",
"def init():\n",
" # One-time initialization of PySpark and predictive model\n",
" import pyspark\n",
" from azureml.core.model import Model\n",
" from pyspark.ml import PipelineModel\n",
" \n",
" global trainedModel\n",
" global spark\n",
" \n",
" spark = pyspark.sql.SparkSession.builder.appName(\"ADB and AML notebook by Parashar\").getOrCreate()\n",
" model_name = \"{model_name}\" #interpolated\n",
" model_path = Model.get_model_path(model_name)\n",
" trainedModel = PipelineModel.load(model_path)\n",
" \n",
"def run(input_json):\n",
" if isinstance(trainedModel, Exception):\n",
" return json.dumps({{\"trainedModel\":str(trainedModel)}})\n",
" \n",
" try:\n",
" sc = spark.sparkContext\n",
" input_list = json.loads(input_json)\n",
" input_rdd = sc.parallelize(input_list)\n",
" input_df = spark.read.json(input_rdd)\n",
" \n",
" # Compute prediction\n",
" prediction = trainedModel.transform(input_df)\n",
" #result = prediction.first().prediction\n",
" predictions = prediction.collect()\n",
" \n",
" #Get each scored result\n",
" preds = [str(x['prediction']) for x in predictions]\n",
" result = \",\".join(preds)\n",
" # you can return any data type as long as it is JSON-serializable\n",
" return result.tolist()\n",
" except Exception as e:\n",
" result = str(e)\n",
" return result\n",
" \n",
"\"\"\".format(model_name=model_name)\n",
" \n",
"exec(score_sparkml)\n",
" \n",
"with open(\"score_sparkml.py\", \"w\") as file:\n",
" file.write(score_sparkml)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"myacienv = CondaDependencies.create(conda_packages=['scikit-learn','numpy','pandas']) #showing how to add libs as an eg. - not needed for this model.\n",
"\n",
"with open(\"mydeployenv.yml\",\"w\") as f:\n",
" f.write(myacienv.serialize_to_string())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#create AKS compute\n",
"#it may take 20-25 minutes to create a new cluster\n",
"\n",
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"aks_name = 'ps-aks-demo2' \n",
"\n",
"try:\n",
" aks_target = ComputeTarget(workspace=ws, name=aks_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" # Use the default configuration (can also provide parameters to customize)\n",
" prov_config = AksCompute.provisioning_configuration()\n",
" \n",
" # Create the cluster\n",
" aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = prov_config)\n",
"\n",
"aks_target.wait_for_completion(show_output = True)\n",
"\n",
"print(aks_target.provisioning_state)\n",
"print(aks_target.provisioning_errors)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#deploy to AKS\n",
"from azureml.core.webservice import AksWebservice, Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"from azureml.core.model import InferenceConfig\n",
"\n",
"aks_config = AksWebservice.deploy_configuration(enable_app_insights=True)\n",
"\n",
"service_name = 'ps-aks-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"inference_config = InferenceConfig(runtime = 'spark-py', \n",
" entry_script ='score_sparkml.py',\n",
" conda_file ='mydeployenv.yml')\n",
"\n",
"aks_service = Model.deploy(ws, service_name, [mymodel], inference_config, aks_config, aks_target)\n",
"aks_service.wait_for_deployment(show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.deployment_status"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#for using the Web HTTP API \n",
"print(aks_service.scoring_uri)\n",
"print(aks_service.get_keys())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"\n",
"#get the some sample data\n",
"test_data_path = \"AdultCensusIncomeTest\"\n",
"test = spark.read.parquet(test_data_path).limit(5)\n",
"\n",
"test_json = json.dumps(test.toJSON().collect())\n",
"\n",
"print(test_json)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#using data defined above predict if income is >50K (1) or <=50K (0)\n",
"aks_service.run(input_data=test_json)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#comment to not delete the web service\n",
"aks_service.delete()\n",
"#model.delete()\n",
"aks_target.delete() "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/azure-databricks/amlsdk/deploy-to-aks-existingimage-05.png)"
]
}
],
"metadata": {
"authors": [
{
"name": "pasha"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
},
"name": "deploy-to-aks-existingimage-05",
"notebookId": 1030695628045968
},
"nbformat": 4,
"nbformat_minor": 1
}

View File

@@ -512,9 +512,11 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Retrieve the Best Model after the above run is complete \n",
"## Deploy\n",
"\n",
"Below we select the best pipeline from our iterations. The `get_output` method returns the best run and the fitted model. The Model includes the pipeline and any pre-processing. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
"### Retrieve the Best Model\n",
"\n",
"Below we select the best pipeline from our iterations. The `get_output` method on `automl_classifier` returns the best run and the fitted model for the last invocation. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
]
},
{
@@ -523,17 +525,15 @@
"metadata": {},
"outputs": [],
"source": [
"best_run, fitted_model = local_run.get_output()\n",
"print(best_run)\n",
"print(fitted_model)"
"best_run, fitted_model = local_run.get_output()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Best Model Based on Any Other Metric after the above run is complete based on the child run\n",
"Show the run and the model that has the smallest `log_loss` value:"
"### Download the conda environment file\n",
"From the *best_run* download the conda environment file that was used to train the AutoML model."
]
},
{
@@ -542,10 +542,34 @@
"metadata": {},
"outputs": [],
"source": [
"lookup_metric = \"log_loss\"\n",
"best_run, fitted_model = local_run.get_output(metric = lookup_metric)\n",
"print(best_run)\n",
"print(fitted_model)"
"from azureml.automl.core.shared import constants\n",
"conda_env_file_name = 'conda_env.yml'\n",
"best_run.download_file(name=\"outputs/conda_env_v_1_0_0.yml\", output_file_path=conda_env_file_name)\n",
"with open(conda_env_file_name, \"r\") as conda_file:\n",
" conda_file_contents = conda_file.read()\n",
" print(conda_file_contents)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Download the model scoring file\n",
"From the *best_run* download the scoring file to get the predictions from the AutoML model."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core.shared import constants\n",
"script_file_name = 'scoring_file.py'\n",
"best_run.download_file(name=\"outputs/scoring_file_v_1_0_0.py\", output_file_path=script_file_name)\n",
"with open(script_file_name, \"r\") as scoring_file:\n",
" scoring_file_contents = scoring_file.read()\n",
" print(scoring_file_contents)"
]
},
{
@@ -572,8 +596,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create Scoring Script\n",
"Replace model_id with name of model from output of above register cell"
"### Deploy the model as a Web Service on Azure Container Instance\n",
"\n",
"Create the configuration needed for deploying the model as a web service service."
]
},
{
@@ -582,113 +607,17 @@
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import pickle\n",
"import json\n",
"import numpy as np\n",
"import azureml.train.automl\n",
"from sklearn.externals import joblib\n",
"from azureml.core.model import Model\n",
"import pandas as pd\n",
"\n",
"def init():\n",
" global model\n",
" model_path = Model.get_model_path(model_name = '<<model_id>>') # this name is model.id of model that we want to deploy\n",
" # deserialize the model file back into a sklearn model\n",
" model = joblib.load(model_path)\n",
"\n",
"def run(raw_data):\n",
" try:\n",
" data = (pd.DataFrame(np.array(json.loads(raw_data)['data']), columns=[str(i) for i in range(0,64)]))\n",
" result = model.predict(data)\n",
" except Exception as e:\n",
" result = str(e)\n",
" return json.dumps({\"error\": result})\n",
" return json.dumps({\"result\":result.tolist()})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#Replace <<model_id>>\n",
"content = \"\"\n",
"with open(\"score.py\", \"r\") as fo:\n",
" content = fo.read()\n",
"\n",
"new_content = content.replace(\"<<model_id>>\", local_run.model_id)\n",
"with open(\"score.py\", \"w\") as fw:\n",
" fw.write(new_content)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Create a YAML File for the Environment"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'], pip_packages=['azureml-defaults', 'azureml-sdk[automl]'])\n",
"\n",
"conda_env_file_name = 'mydeployenv.yml'\n",
"myenv.save_to_file('.', conda_env_file_name)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploy the model as a Web Service on Azure Container Instance\n",
"Replace servicename with any meaningful name of service"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# this will take 10-15 minutes to finish\n",
"\n",
"from azureml.core.webservice import AciWebservice, Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.model import Model\n",
"import uuid\n",
"from azureml.core.webservice import AciWebservice\n",
"from azureml.core.environment import Environment\n",
"\n",
"myaci_config = AciWebservice.deploy_configuration(\n",
" cpu_cores = 2, \n",
" memory_gb = 2, \n",
" tags = {'name':'Databricks Azure ML ACI'}, \n",
" description = 'This is for ADB and AutoML example.')\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=conda_env_file_name)\n",
"inference_config = InferenceConfig(entry_script=script_file_name, environment=myenv)\n",
"\n",
"inference_config = InferenceConfig(runtime= 'spark-py', \n",
" entry_script='score.py',\n",
" conda_file='mydeployenv.yml')\n",
"\n",
"guid = str(uuid.uuid4()).split(\"-\")[0]\n",
"service_name = \"myservice-{}\".format(guid)\n",
"\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"print(\"Creating service with name: {}\".format(service_name))\n",
"\n",
"myservice = Model.deploy(ws, service_name, [model], inference_config, myaci_config)\n",
"myservice.wait_for_deployment(show_output=True)"
"aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
" memory_gb = 1, \n",
" tags = {'area': \"digits\", 'type': \"automl_classification\"}, \n",
" description = 'sample service for Automl Classification')"
]
},
{
@@ -697,8 +626,14 @@
"metadata": {},
"outputs": [],
"source": [
"#for using the Web HTTP API \n",
"print(myservice.scoring_uri)"
"from azureml.core.webservice import Webservice\n",
"from azureml.core.model import Model\n",
"\n",
"aci_service_name = 'automl-databricks-local'\n",
"print(aci_service_name)\n",
"aci_service = Model.deploy(ws, aci_service_name, [model], inference_config, aciconfig)\n",
"aci_service.wait_for_deployment(True)\n",
"print(aci_service.state)"
]
},
{
@@ -742,7 +677,7 @@
"for index in np.random.choice(len(y_test), 2, replace = False):\n",
" print(index)\n",
" test_sample = json.dumps({'data':X_test[index:index + 1].values.tolist()})\n",
" predicted = myservice.run(input_data = test_sample)\n",
" predicted = aci_service.run(input_data = test_sample)\n",
" label = y_test.values[index]\n",
" predictedDict = json.loads(predicted)\n",
" title = \"Label value = %d Predicted value = %s \" % ( label,predictedDict['result'][0]) \n",

View File

@@ -0,0 +1,36 @@
## Examples to get started with Azure Machine Learning SDK for R
Learn how to use the Azure Machine Learning SDK for R for experimentation and model management.
As a prerequisite, go through the [Installation](vignettes/installation.Rmd) and [Configuration](vignettes/configuration.Rmd) vignettes to install the package and set up your Azure Machine Learning workspace, unless you are running these examples on an Azure Machine Learning compute instance. Compute instances have the Azure Machine Learning SDK pre-installed and your workspace details pre-configured.
### Samples
* Deployment
* [deploy-to-aci](./samples/deployment/deploy-to-aci): Deploy a model as a web service to Azure Container Instances (ACI).
* [deploy-to-local](./samples/deployment/deploy-to-local): Deploy a model as a web service locally.
* Training
* [train-on-amlcompute](./samples/training/train-on-amlcompute): Train a model on a remote AmlCompute cluster.
* [train-on-local](./samples/training/train-on-local): Train a model locally with Docker.
### Vignettes
* [deploy-to-aks](./vignettes/deploy-to-aks): Production deploy a model as a web service to Azure Kubernetes Service (AKS).
* [hyperparameter-tune-with-keras](./vignettes/hyperparameter-tune-with-keras): Hyperparameter tune a Keras model using HyperDrive, Azure ML's hyperparameter tuning functionality.
* [train-and-deploy-to-aci](./vignettes/train-and-deploy-to-aci): Train a caret model and deploy as a web service to Azure Container Instances (ACI).
* [train-with-tensorflow](./vignettes/train-with-tensorflow): Train a deep learning TensorFlow model with Azure ML.
Find more information on the [official documentation site for Azure Machine Learning SDK for R](https://azure.github.io/azureml-sdk-for-r/).
### Troubleshooting
- If the following error occurs when submitting an experiment using RStudio:
```R
Error in py_call_impl(callable, dots$args, dots$keywords) :
PermissionError: [Errno 13] Permission denied
```
Move the files for your project into a subdirectory and reset the working directory to that directory before re-submitting.
In order to submit an experiment, the Azure ML SDK must create a .zip file of the project directory to send to the service. However,
the SDK does not have permission to write into the .Rproj.user subdirectory that is automatically created during an RStudio
session. For this reason, the recommended best practice is to isolate project files into their own directory.
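For example, a minimal sketch of the workaround (the directory and file names are illustrative):
```R
# Isolate the project files in their own subdirectory and make it the working
# directory before re-submitting the experiment.
dir.create("my-project", showWarnings = FALSE)
file.copy(c("train.R", "config.json"), "my-project")
setwd("my-project")
```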

View File

@@ -0,0 +1,11 @@
## Azure Machine Learning samples
These samples are short code examples for using Azure Machine Learning SDK for R. If you are new to the R SDK, we recommend that you first take a look at the more detailed end-to-end [vignettes](../vignettes).
Before running a sample, set the working directory in RStudio to the folder that contains the sample script, either with `setwd(dirname)` or via Session -> Set Working Directory -> To Source File Location (see the sketch after the list below). Each sample assumes that the data and scripts are in the current working directory.
1. [train-on-amlcompute](training/train-on-amlcompute): Train a model on a remote AmlCompute cluster.
2. [train-on-local](training/train-on-local): Train a model locally with Docker.
3. [deploy-to-aci](deployment/deploy-to-aci): Deploy a model as a web service to Azure Container Instances (ACI).
4. [deploy-to-local](deployment/deploy-to-local): Deploy a model as a web service locally.
> Before you run these samples, make sure you have an Azure Machine Learning workspace. You can follow the [configuration vignette](../vignettes/configuration.Rmd) to set up a workspace. (You do not need to do this if you are running these examples on an Azure Machine Learning compute instance).
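For example, a minimal sketch (the folder and script file name are illustrative):
```R
# Run a sample from the folder that contains it
setwd("training/train-on-amlcompute")
source("train-on-amlcompute.R")
```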

View File

@@ -0,0 +1,59 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
library(azuremlsdk)
library(jsonlite)
ws <- load_workspace_from_config()
# Register the model
model <- register_model(ws, model_path = "project_files/model.rds",
model_name = "model.rds")
# Create environment
r_env <- r_environment(name = "r_env")
# Create inference config
inference_config <- inference_config(
entry_script = "score.R",
source_directory = "project_files",
environment = r_env)
# Create ACI deployment config
deployment_config <- aci_webservice_deployment_config(cpu_cores = 1,
memory_gb = 1)
# Deploy the web service
service <- deploy_model(ws,
'rservice',
list(model),
inference_config,
deployment_config)
wait_for_deployment(service, show_output = TRUE)
# If you encounter any issue in deploying the webservice, please visit
# https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment
# Inferencing
# Pick one flower to score; the other two are provided as alternatives.
# versicolor
# plant <- data.frame(Sepal.Length = 6.4,
#                     Sepal.Width = 2.8,
#                     Petal.Length = 4.6,
#                     Petal.Width = 1.8)
# setosa
# plant <- data.frame(Sepal.Length = 5.1,
#                     Sepal.Width = 3.5,
#                     Petal.Length = 1.4,
#                     Petal.Width = 0.2)
# virginica
plant <- data.frame(Sepal.Length = 6.7,
                    Sepal.Width = 3.3,
                    Petal.Length = 5.2,
                    Petal.Width = 2.3)
# Test the web service
predicted_val <- invoke_webservice(service, toJSON(plant))
predicted_val
# Delete the web service
delete_webservice(service)

View File

@@ -0,0 +1,17 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
library(jsonlite)
init <- function() {
model_path <- Sys.getenv("AZUREML_MODEL_DIR")
model <- readRDS(file.path(model_path, "model.rds"))
message("model is loaded")
function(data) {
plant <- as.data.frame(fromJSON(data))
prediction <- predict(model, plant)
result <- as.character(prediction)
toJSON(result)
}
}

View File

@@ -0,0 +1,112 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
# Register model and deploy locally
# This example shows how to deploy a web service in step-by-step fashion:
#
# 1) Register model
# 2) Deploy the model as a web service in a local Docker container.
# 3) Invoke web service with SDK or call web service with raw HTTP call.
# 4) Quickly test changes to your entry script by reloading the local service.
# 5) Optionally, you can also make changes to model and update the local service.
library(azuremlsdk)
library(jsonlite)
ws <- load_workspace_from_config()
# Register the model
model <- register_model(ws, model_path = "project_files/model.rds",
model_name = "model.rds")
# Create environment
r_env <- r_environment(name = "r_env")
# Create inference config
inference_config <- inference_config(
entry_script = "score.R",
source_directory = "project_files",
environment = r_env)
# Create local deployment config
local_deployment_config <- local_webservice_deployment_config()
# Deploy the web service
# NOTE:
# The Docker image runs as a Linux container. If you are running Docker for Windows, you need to ensure the Linux Engine is running:
# # PowerShell command to switch to Linux engine
# & 'C:\Program Files\Docker\Docker\DockerCli.exe' -SwitchLinuxEngine
service <- deploy_model(ws,
'rservice-local',
list(model),
inference_config,
local_deployment_config)
# Wait for deployment
wait_for_deployment(service, show_output = TRUE)
# Show the port of local service
message(service$port)
# If you encounter any issue in deploying the webservice, please visit
# https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment
# Inferencing
# versicolor
# plant <- data.frame(Sepal.Length = 6.4,
# Sepal.Width = 2.8,
# Petal.Length = 4.6,
# Petal.Width = 1.8)
# setosa
plant <- data.frame(Sepal.Length = 5.1,
Sepal.Width = 3.5,
Petal.Length = 1.4,
Petal.Width = 0.2)
# # virginica
# plant <- data.frame(Sepal.Length = 6.7,
# Sepal.Width = 3.3,
# Petal.Length = 5.2,
# Petal.Width = 2.3)
#Test the web service
invoke_webservice(service, toJSON(plant))
## The last few lines of the logs should have the correct prediction and should display -> R[write to console]: "setosa"
cat(gsub(pattern = "\n", replacement = " \n", x = get_webservice_logs(service)))
## Test the web service with a raw HTTP request
#
# NOTE:
# To test the service locally, use the http://localhost:<local_service$port> URL
# Import the request library
library(httr)
# Get the service scoring URL from the service object; this URL is for testing locally
local_service_url <- service$scoring_uri # Same as http://localhost:<local_service$port>
# POST request to the web service
resp <- POST(local_service_url, body = plant, encode = "json", verbose())
## The last few lines of the logs should have the correct prediction and should display -> R[write to console]: "setosa"
cat(gsub(pattern = "\n", replacement = " \n", x = get_webservice_logs(service)))
# Optional, use a new scoring script
inference_config <- inference_config(
entry_script = "score_new.R",
source_directory = "project_files",
environment = r_env)
## Then reload the service to see the changes made
reload_local_webservice_assets(service)
## Check reloaded service, you will see the last line will say "this is a new scoring script! I was reloaded"
invoke_webservice(service, toJSON(plant))
cat(gsub(pattern = "\n", replacement = " \n", x = get_webservice_logs(service)))
# Update service
# If you want to change your model(s), environment, or deployment configuration, call update() to rebuild the Docker image.
# update_local_webservice(service, models = [NewModelObject], deployment_config = deployment_config, wait = FALSE, inference_config = inference_config)
# Delete service
delete_local_webservice(service)

View File

@@ -0,0 +1,18 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
library(jsonlite)
init <- function() {
model_path <- Sys.getenv("AZUREML_MODEL_DIR")
model <- readRDS(file.path(model_path, "model.rds"))
message("model is loaded")
function(data) {
plant <- as.data.frame(fromJSON(data))
prediction <- predict(model, plant)
result <- as.character(prediction)
message(result)
toJSON(result)
}
}

View File

@@ -0,0 +1,19 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
library(jsonlite)
init <- function() {
model_path <- Sys.getenv("AZUREML_MODEL_DIR")
model <- readRDS(file.path(model_path, "model.rds"))
message("model is loaded")
function(data) {
plant <- as.data.frame(fromJSON(data))
prediction <- predict(model, plant)
result <- as.character(prediction)
message(result)
message("this is a new scoring script! I was reloaded")
toJSON(result)
}
}

View File

@@ -0,0 +1,34 @@
# This script loads a dataset whose last column is assumed to be the
# class label and logs the accuracy
library(azuremlsdk)
library(caret)
library(optparse)
library(datasets)
data(iris)
iris_data <- iris
summary(iris_data)
in_train <- createDataPartition(y = iris_data$Species, p = .8, list = FALSE)
train_data <- iris_data[in_train,]
test_data <- iris_data[-in_train,]
# Run algorithms using 10-fold cross validation
control <- trainControl(method = "cv", number = 10)
metric <- "Accuracy"
set.seed(7)
model <- train(Species ~ .,
data = train_data,
method = "lda",
metric = metric,
trControl = control)
predictions <- predict(model, test_data)
conf_matrix <- confusionMatrix(predictions, test_data$Species)
message(conf_matrix)
log_metric_to_run(metric, conf_matrix$overall["Accuracy"])
# Ensure the outputs directory exists; files written here are uploaded to the run record
if (!dir.exists("outputs")) dir.create("outputs")
saveRDS(model, file = "./outputs/model.rds")
message("Model saved")

View File

@@ -0,0 +1,41 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
# Reminder: set working directory to current file location prior to running this script
library(azuremlsdk)
ws <- load_workspace_from_config()
# Create AmlCompute cluster
cluster_name <- "r-cluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
vm_size <- "STANDARD_D2_V2"
compute_target <- create_aml_compute(workspace = ws,
cluster_name = cluster_name,
vm_size = vm_size,
max_nodes = 1)
wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
# Define estimator
est <- estimator(source_directory = "scripts",
entry_script = "train.R",
compute_target = compute_target)
experiment_name <- "train-r-script-on-amlcompute"
exp <- experiment(ws, experiment_name)
# Submit job and display the run details
run <- submit_experiment(exp, est)
view_run_details(run)
wait_for_run_completion(run, show_output = TRUE)
# Get the run metrics
metrics <- get_run_metrics(run)
metrics
# Delete cluster
delete_compute(compute_target)

View File

@@ -0,0 +1,28 @@
# This script loads a dataset whose last column is assumed to be the
# class label and logs the accuracy
library(azuremlsdk)
library(caret)
library(datasets)
data(iris)
iris_data <- iris
summary(iris_data)
in_train <- createDataPartition(y = iris_data$Species, p = .8, list = FALSE)
train_data <- iris_data[in_train,]
test_data <- iris_data[-in_train,]
# Run algorithms using 10-fold cross validation
control <- trainControl(method = "cv", number = 10)
metric <- "Accuracy"
set.seed(7)
model <- train(Species ~ .,
data = train_data,
method = "lda",
metric = metric,
trControl = control)
predictions <- predict(model, test_data)
conf_matrix <- confusionMatrix(predictions, test_data$Species)
message(conf_matrix)
log_metric_to_run(metric, conf_matrix$overall["Accuracy"])

View File

@@ -0,0 +1,26 @@
# Copyright(c) Microsoft Corporation.
# Licensed under the MIT license.
# Reminder: set working directory to current file location prior to running this script
library(azuremlsdk)
ws <- load_workspace_from_config()
# Define estimator
est <- estimator(source_directory = "scripts",
entry_script = "train.R",
compute_target = "local")
# Initialize experiment
experiment_name <- "train-r-script-on-local"
exp <- experiment(ws, experiment_name)
# Submit job and display the run details
run <- submit_experiment(exp, est)
view_run_details(run)
wait_for_run_completion(run, show_output = TRUE)
# Get the run metrics
metrics <- get_run_metrics(run)
metrics

View File

@@ -0,0 +1,17 @@
## Azure Machine Learning vignettes
These vignettes are end-to-end tutorials for using Azure Machine Learning SDK for R.
Before running a vignette in RStudio, set the working directory to the folder that contains the vignette file (.Rmd file) in RStudio using `setwd(dirname)` or Session -> Set Working Directory -> To Source File Location. Each vignette assumes that the data and scripts are in the current working directory.
The following vignettes are included:
1. [installation](installation.Rmd): Install the Azure ML SDK for R.
2. [configuration](configuration.Rmd): Set up an Azure ML workspace.
3. [train-and-deploy-to-aci](train-and-deploy-to-aci): Train a caret model and deploy as a web service to Azure Container Instances (ACI).
4. [train-with-tensorflow](train-with-tensorflow/): Train a deep learning TensorFlow model with Azure ML.
5. [hyperparameter-tune-with-keras](hyperparameter-tune-with-keras/): Hyperparameter tune a Keras model using HyperDrive, Azure ML's hyperparameter tuning functionality.
6. [deploy-to-aks](deploy-to-aks/): Production deploy a model as a web service to Azure Kubernetes Service (AKS).
> Before you run these samples, make sure you have an Azure Machine Learning workspace. You can follow the [configuration vignette](../vignettes/configuration.Rmd) to set up a workspace. (You do not need to do this if you are running these examples on an Azure Machine Learning compute instance).
For additional examples on using the R SDK, see the [samples](../samples) folder.

View File

@@ -0,0 +1,108 @@
---
title: "Set up an Azure ML workspace"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{Set up an Azure ML workspace}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
This tutorial gets you started with the Azure Machine Learning service by walking through the requirements and instructions for setting up a workspace, the top-level resource for Azure ML.
You do not need to run this if you are working on an Azure Machine Learning Compute Instance, as the compute instance is already associated with an existing workspace.
## What is an Azure ML workspace?
The workspace is the top-level resource for Azure ML, providing a centralized place to work with all the artifacts you create when you use Azure ML. The workspace keeps a history of all training runs, including logs, metrics, output, and a snapshot of your scripts.
When you create a new workspace, it automatically creates several Azure resources that are used by the workspace:
* Azure Container Registry: Registers Docker images that you use during training and when you deploy a model. To minimize costs, ACR is lazy-loaded until deployment images are created.
* Azure Storage account: Used as the default datastore for the workspace.
* Azure Application Insights: Stores monitoring information about your models.
* Azure Key Vault: Stores secrets that are used by compute targets and other sensitive information that's needed by the workspace.
## Setup
This section describes the steps required before you can access any Azure ML service functionality.
### Azure subscription
In order to create an Azure ML workspace, first you need access to an Azure subscription. An Azure subscription allows you to manage storage, compute, and other assets in the Azure cloud. You can [create a new subscription](https://azure.microsoft.com/en-us/free/) or access existing subscription information from the [Azure portal](https://portal.azure.com/). Later in this tutorial you will need information such as your subscription ID in order to create and access workspaces.
### Azure ML SDK installation
Follow the [installation guide](https://azure.github.io/azureml-sdk-for-r/articles/installation.html) to install **azuremlsdk** on your machine.
## Configure your workspace
### Workspace parameters
To use an Azure ML workspace, you will need to supply the following information:
* Your subscription ID
* A resource group name
* (Optional) The region that will host your workspace
* A name for your workspace
You can get your subscription ID from the [Azure portal](https://portal.azure.com/).
You will also need access to a [resource group](https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-overview#resource-groups), which organizes Azure resources and provides a default region for the resources in a group. You can see which resource groups you have access to, or create a new one, in the Azure portal. If you don't have a resource group, the `create_workspace()` method will create one for you using the name you provide.
The region to host your workspace will be used if you are creating a new workspace. You do not need to specify this if you are using an existing workspace. You can find the list of supported regions [here](https://azure.microsoft.com/en-us/global-infrastructure/services/?products=machine-learning-service). You should pick a region that is close to your location or that contains your data.
The name for your workspace is unique within the subscription and should be descriptive enough to discern among other workspaces. The subscription may be used only by you, or it may be used by your department or your entire enterprise, so choose a name that makes sense for your situation.
The following code chunk allows you to specify your workspace parameters. It uses `Sys.getenv` to read values from environment variables, which is useful for automation. If no environment variable exists, the parameters will be set to the specified default values. Replace the default values in the code below with your own values.
``` {r configure_parameters, eval=FALSE}
subscription_id <- Sys.getenv("SUBSCRIPTION_ID", unset = "<my-subscription-id>")
resource_group <- Sys.getenv("RESOURCE_GROUP", unset = "<my-resource-group>")
workspace_name <- Sys.getenv("WORKSPACE_NAME", unset = "<my-workspace-name>")
workspace_region <- Sys.getenv("WORKSPACE_REGION", unset = "eastus2")
```
### Create a new workspace
If you don't have an existing workspace and are the owner of the subscription or resource group, you can create a new workspace. If you don't have a resource group, `create_workspace()` will create one for you using the name you provide. If you don't want it to do so, set the `create_resource_group = FALSE` parameter.
Note: As with other Azure services, there are limits on certain resources (e.g. AmlCompute quota) associated with the Azure ML service. Please read this [article](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-manage-quotas) on the default limits and how to request more quota.
This cell will create an Azure ML workspace for you in a subscription, provided you have the correct permissions.
This will fail if:
* You do not have permission to create a workspace in the resource group.
* You do not have permission to create a resource group if it does not exist.
* You are not a subscription owner or contributor and no Azure ML workspaces have ever been created in this subscription.
If workspace creation fails, please work with your IT admin to provide you with the appropriate permissions or to provision the required resources.
There are additional parameters that are not shown below that can be configured when creating a workspace. Please see [`create_workspace()`](https://azure.github.io/azureml-sdk-for-r/reference/create_workspace.html) for more details.
``` {r create_workspace, eval=FALSE}
library(azuremlsdk)
ws <- create_workspace(name = workspace_name,
subscription_id = subscription_id,
resource_group = resource_group,
location = workspace_region,
exist_ok = TRUE)
```
You can write out the workspace ARM properties to a config file with [`write_workspace_config()`](https://azure.github.io/azureml-sdk-for-r/reference/write_workspace_config.html). The method provides a simple way of reusing the same workspace across multiple files or projects. Users can save the workspace details with `write_workspace_config()`, and use [`load_workspace_from_config()`](https://azure.github.io/azureml-sdk-for-r/reference/load_workspace_from_config.html) to load the same workspace in different files or projects without retyping the workspace ARM properties. The method defaults to writing out the config file to the current working directory with "config.json" as the file name. To specify a different path or file name, set the `path` and `file_name` parameters.
``` {r write_config, eval=FALSE}
write_workspace_config(ws)
```
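If you want the config file somewhere other than the working directory, a minimal sketch using the `path` and `file_name` parameters mentioned above (the folder and file names are just examples):
``` {r write_config_custom, eval=FALSE}
# Write the workspace details to aml_config/ws_config.json (example names)
write_workspace_config(ws, path = "aml_config", file_name = "ws_config.json")
```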
### Access an existing workspace
You can access an existing workspace in a couple of ways. If your workspace properties were previously saved to a config file, you can load the workspace as follows:
``` {r load_config, eval=FALSE}
ws <- load_workspace_from_config()
```
If Azure ML cannot find the config file, specify the path to the config file with the `path` parameter. The method defaults to starting the search in the current directory.
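For example, if the config file was written to a subfolder (the folder name below is an example):
``` {r load_config_from_path, eval=FALSE}
# Start the search for config.json in the aml_config folder (example name)
ws <- load_workspace_from_config(path = "aml_config")
```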
You can also initialize a workspace using the [`get_workspace()`](https://azure.github.io/azureml-sdk-for-r/reference/get_workspace.html) method.
``` {r get_workspace, eval=FALSE}
ws <- get_workspace(name = workspace_name,
subscription_id = subscription_id,
resource_group = resource_group)
```

View File

@@ -0,0 +1,188 @@
---
title: "Deploy a web service to Azure Kubernetes Service"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{Deploy a web service to Azure Kubernetes Service}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
This tutorial demonstrates how to deploy a model as a web service on [Azure Kubernetes Service](https://azure.microsoft.com/en-us/services/kubernetes-service/) (AKS). AKS is good for high-scale production deployments; use it if you need one or more of the following capabilities:
* Fast response time
* Autoscaling of the deployed service
* Hardware acceleration options such as GPU
You will learn to:
* Set up your testing environment
* Register a model
* Provision an AKS cluster
* Deploy the model to AKS
* Test the deployed service
## Prerequisites
If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.
## Set up your testing environment
Start by setting up your environment. This includes importing the **azuremlsdk** package and connecting to your workspace.
### Import package
```{r import_package, eval=FALSE}
library(azuremlsdk)
```
### Load your workspace
Instantiate a workspace object from your existing workspace. The following code will load the workspace details from a **config.json** file if you previously wrote one out with `write_workspace_config()`.
```{r load_workspace, eval=FALSE}
ws <- load_workspace_from_config()
```
Or, you can retrieve a workspace by directly specifying your workspace details:
```{r get_workspace, eval=FALSE}
ws <- get_workspace("<your workspace name>", "<your subscription ID>", "<your resource group>")
```
## Register the model
In this tutorial we will deploy a model that was trained in one of the [samples](https://github.com/Azure/azureml-sdk-for-r/blob/master/samples/training/train-on-amlcompute/train-on-amlcompute.R). The model was trained with the Iris dataset and can be used to determine if a flower is one of three Iris flower species (setosa, versicolor, virginica). We have provided the model file (`model.rds`) for the tutorial; it is located in the "project_files" directory of this vignette.
First, register the model to your workspace with [`register_model()`](https://azure.github.io/azureml-sdk-for-r/reference/register_model.html). A registered model can be any collection of files, but in this case the R model file is sufficient. Azure ML will use the registered model for deployment.
```{r register_model, eval=FALSE}
model <- register_model(ws,
model_path = "project_files/model.rds",
model_name = "iris_model",
description = "Predict an Iris flower type")
```
## Provision an AKS cluster
When deploying a web service to AKS, you deploy to an AKS cluster that is connected to your workspace. There are two ways to connect an AKS cluster to your workspace:
* Create the AKS cluster. The process automatically connects the cluster to the workspace.
* Attach an existing AKS cluster to your workspace. You can attach a cluster with the [`attach_aks_compute()`](https://azure.github.io/azureml-sdk-for-r/reference/attach_aks_compute.html) method.
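For the second option, a hedged sketch of attaching a cluster that already exists in your subscription (the names below are placeholders, and the exact arguments should be checked against the `attach_aks_compute()` reference):
```{r attach_aks, eval=FALSE}
# Attach an existing AKS cluster to the workspace (resource group and
# cluster name are example values)
aks_target <- attach_aks_compute(ws,
                                 resource_group = "<my-resource-group>",
                                 cluster_name = "my-existing-aks")
```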
Creating or attaching an AKS cluster is a one-time process for your workspace. You can reuse this cluster for multiple deployments. If you delete the cluster or the resource group that contains it, you must create a new cluster the next time you need to deploy.
In this tutorial, we will go with the first method of provisioning a new cluster. See the [`create_aks_compute()`](https://azure.github.io/azureml-sdk-for-r/reference/create_aks_compute.html) reference for the full set of configurable parameters. If you pick custom values for the `agent_count` and `vm_size` parameters, make sure that `agent_count` multiplied by the number of virtual CPUs in the chosen `vm_size` is greater than or equal to 12 (see the sketch after the next code chunk).
``` {r provision_cluster, eval=FALSE}
aks_target <- create_aks_compute(ws, cluster_name = 'myakscluster')
wait_for_provisioning_completion(aks_target, show_output = TRUE)
```
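If you do choose custom values, a hedged sketch that meets the 12-vCPU minimum (Standard_D3_v2 VMs have 4 vCPUs each, so 3 nodes x 4 vCPUs = 12):
```{r provision_cluster_custom, eval=FALSE}
# 3 nodes x 4 vCPUs (Standard_D3_v2) = 12 vCPUs total
aks_target <- create_aks_compute(ws,
                                 cluster_name = 'myakscluster',
                                 agent_count = 3,
                                 vm_size = "Standard_D3_v2")
wait_for_provisioning_completion(aks_target, show_output = TRUE)
```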
The Azure ML SDK does not provide support for scaling an AKS cluster. To scale the nodes in the cluster, use the UI for your AKS cluster in the Azure portal. You can only change the node count, not the VM size of the cluster.
## Deploy as a web service
### Define the inference dependencies
To deploy a model, you need an **inference configuration**, which describes the environment needed to host the model and web service. To create an inference config, you will first need a scoring script and an Azure ML environment.
The scoring script (`entry_script`) is an R script that will take as input variable values (in JSON format) and output a prediction from your model. For this tutorial, use the provided scoring file `score.R`. The scoring script must contain an `init()` method that loads your model and returns a function that uses the model to make a prediction based on the input data. See the [documentation](https://azure.github.io/azureml-sdk-for-r/reference/inference_config.html#details) for more details.
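For reference, the provided `score.R` follows this pattern (a condensed sketch of the file that ships with the vignette):
```{r score_script_sketch, eval=FALSE}
library(jsonlite)

init <- function() {
  # Load the registered model from the directory Azure ML mounts it into
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))

  # Return the scoring function: JSON in, JSON prediction out
  function(data) {
    plant <- as.data.frame(fromJSON(data))
    prediction <- predict(model, plant)
    toJSON(as.character(prediction))
  }
}
```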
Next, define an Azure ML **environment** for your script's package dependencies. With an environment, you specify R packages (from CRAN or elsewhere) that are needed for your script to run. You can also provide the values of environment variables that your script can reference to modify its behavior.
By default, Azure ML will build a default Docker image that includes R, the Azure ML SDK, and additional required dependencies for deployment. See the [`r_environment()`](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html) documentation for the full list of dependencies that will be installed in the default container. You can also specify additional packages to be installed at runtime, or even a custom Docker image to be used instead of the base image that will be built, using the other available parameters to `r_environment()`.
```{r create_env, eval=FALSE}
r_env <- r_environment(name = "deploy_env")
```
Now you have everything you need to create an inference config for encapsulating your scoring script and environment dependencies.
``` {r create_inference_config, eval=FALSE}
inference_config <- inference_config(
entry_script = "score.R",
source_directory = "project_files",
environment = r_env)
```
### Deploy to AKS
Now, define the deployment configuration that describes the compute resources needed, for example, the number of cores and memory. See the [`aks_webservice_deployment_config()`](https://azure.github.io/azureml-sdk-for-r/reference/aks_webservice_deployment_config.html) for the full set of configurable parameters.
``` {r deploy_config, eval=FALSE}
aks_config <- aks_webservice_deployment_config(cpu_cores = 1, memory_gb = 1)
```
Now, deploy your model as a web service to the AKS cluster you created earlier.
```{r deploy_service, eval=FALSE}
aks_service <- deploy_model(ws,
'my-new-aksservice',
models = list(model),
inference_config = inference_config,
deployment_config = aks_config,
deployment_target = aks_target)
wait_for_deployment(aks_service, show_output = TRUE)
```
To inspect the logs from the deployment:
```{r get_logs, eval=FALSE}
get_webservice_logs(aks_service)
```
If you encounter any issue in deploying the web service, please visit the [troubleshooting guide](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment).
## Test the deployed service
Now that your model is deployed as a service, you can test the service from R using [`invoke_webservice()`](https://azure.github.io/azureml-sdk-for-r/reference/invoke_webservice.html). Provide a new set of data to predict from, convert it to JSON, and send it to the service.
``` {r test_service, eval=FALSE}
library(jsonlite)
# versicolor
plant <- data.frame(Sepal.Length = 6.4,
Sepal.Width = 2.8,
Petal.Length = 4.6,
Petal.Width = 1.8)
# setosa
# plant <- data.frame(Sepal.Length = 5.1,
# Sepal.Width = 3.5,
# Petal.Length = 1.4,
# Petal.Width = 0.2)
# virginica
# plant <- data.frame(Sepal.Length = 6.7,
# Sepal.Width = 3.3,
# Petal.Length = 5.2,
# Petal.Width = 2.3)
predicted_val <- invoke_webservice(aks_service, toJSON(plant))
message(predicted_val)
```
You can also get the web service's HTTP endpoint, which accepts REST client calls. You can share this endpoint with anyone who wants to test the web service or integrate it into an application.
``` {r eval=FALSE}
aks_service$scoring_uri
```
## Web service authentication
When deploying to AKS, key-based authentication is enabled by default. You can also enable token-based authentication. Token-based authentication requires clients to use an Azure Active Directory account to request an authentication token, which is used to make requests to the deployed service.
To disable key-based auth, set the `auth_enabled = FALSE` parameter when creating the deployment configuration with [`aks_webservice_deployment_config()`](https://azure.github.io/azureml-sdk-for-r/reference/aks_webservice_deployment_config.html).
To enable token-based auth, set `token_auth_enabled = TRUE` when creating the deployment config.
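For example, a hedged sketch of a deployment config with token auth on and key auth off, using the parameters described above:
```{r auth_deploy_config, eval=FALSE}
# Token-based auth enabled, key-based auth disabled
aks_config <- aks_webservice_deployment_config(cpu_cores = 1,
                                               memory_gb = 1,
                                               auth_enabled = FALSE,
                                               token_auth_enabled = TRUE)
```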
### Key-based authentication
If key authentication is enabled, you can use the [`get_webservice_keys()`](https://azure.github.io/azureml-sdk-for-r/reference/get_webservice_keys.html) method to retrieve a primary and secondary authentication key. To generate a new key, use [`generate_new_webservice_key()`](https://azure.github.io/azureml-sdk-for-r/reference/generate_new_webservice_key.html).
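A minimal sketch of retrieving and regenerating keys; the `key_type` value below is an assumption, so check the function reference:
```{r webservice_keys, eval=FALSE}
# Retrieve the primary and secondary authentication keys
keys <- get_webservice_keys(aks_service)

# Regenerate the primary key ("Primary" is an assumed argument value)
generate_new_webservice_key(aks_service, key_type = "Primary")
```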
### Token-based authentication
If token authentication is enabled, you can use the [`get_webservice_token()`](https://azure.github.io/azureml-sdk-for-r/reference/get_webservice_token.html) method to retrieve a JWT token and that token's expiration time. Make sure to request a new token after the token's expiration time.
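A minimal sketch, assuming the service was deployed with `token_auth_enabled = TRUE`:
```{r webservice_token, eval=FALSE}
# Request an access token for the deployed service; the returned object also
# carries the expiration details mentioned above
token <- get_webservice_token(aks_service)
```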
## Clean up resources
Delete the resources once you no longer need them. Do not delete any resource you plan on still using.
Delete the web service:
```{r delete_service, eval=FALSE}
delete_webservice(aks_service)
```
Delete the registered model:
```{r delete_model, eval=FALSE}
delete_model(model)
```
Delete the AKS cluster:
```{r delete_cluster, eval=FALSE}
delete_compute(aks_target)
```

View File

@@ -0,0 +1,17 @@
#' Copyright(c) Microsoft Corporation.
#' Licensed under the MIT license.
library(jsonlite)
init <- function() {
model_path <- Sys.getenv("AZUREML_MODEL_DIR")
model <- readRDS(file.path(model_path, "model.rds"))
message("model is loaded")
function(data) {
plant <- as.data.frame(fromJSON(data))
prediction <- predict(model, plant)
result <- as.character(prediction)
toJSON(result)
}
}

View File

@@ -0,0 +1,242 @@
---
title: "Hyperparameter tune a Keras model"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{Hyperparameter tune a Keras model}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
This tutorial demonstrates how you can efficiently tune hyperparameters for a model using HyperDrive, Azure ML's hyperparameter tuning functionality. You will train a Keras model on the CIFAR10 dataset, automate hyperparameter exploration, launch parallel jobs, log your results, and find the best run.
### What are hyperparameters?
Hyperparameters are variable parameters chosen to train a model. Learning rate, number of epochs, and batch size are all examples of hyperparameters.
Using brute-force methods to find the optimal values for parameters can be time-consuming, and poor-performing runs can result in wasted money. To avoid this, HyperDrive automates hyperparameter exploration in a time-saving and cost-effective manner by launching several parallel runs with different configurations and finding the configuration that results in best performance on your primary metric.
Let's get started with the example to see how it works!
## Prerequisites
If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.
## Set up development environment
The setup for your development work in this tutorial includes the following actions:
* Import required packages
* Connect to a workspace
* Create an experiment to track your runs
* Create a remote compute target to use for training
### Import **azuremlsdk** package
```{r eval=FALSE}
library(azuremlsdk)
```
### Load your workspace
Instantiate a workspace object from your existing workspace. The following code will load the workspace details from a **config.json** file if you previously wrote one out with [`write_workspace_config()`](https://azure.github.io/azureml-sdk-for-r/reference/write_workspace_config.html).
```{r load_workpace, eval=FALSE}
ws <- load_workspace_from_config()
```
Or, you can retrieve a workspace by directly specifying your workspace details:
```{r get_workpace, eval=FALSE}
ws <- get_workspace("<your workspace name>", "<your subscription ID>", "<your resource group>")
```
### Create an experiment
An Azure ML **experiment** tracks a grouping of runs, typically from the same training script. Create an experiment to track hyperparameter tuning runs for the Keras model.
```{r create_experiment, eval=FALSE}
exp <- experiment(workspace = ws, name = 'hyperdrive-cifar10')
```
If you would like to track your runs in an existing experiment, simply specify that experiment's name to the `name` parameter of `experiment()`.
### Create a compute target
By using Azure Machine Learning Compute (AmlCompute), a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. In this tutorial, you create a GPU-enabled cluster as your training environment. The code below creates the compute cluster for you if it doesn't already exist in your workspace.
You may need to wait a few minutes for your compute cluster to be provisioned if it doesn't already exist.
```{r create_cluster, eval=FALSE}
cluster_name <- "gpucluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
vm_size <- "STANDARD_NC6"
compute_target <- create_aml_compute(workspace = ws,
cluster_name = cluster_name,
vm_size = vm_size,
max_nodes = 4)
wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
```
## Prepare the training script
A training script called `cifar10_cnn.R` has been provided for you in the "project_files" directory of this tutorial.
In order to leverage HyperDrive, the training script for your model must log the relevant metrics during model training. When you configure the hyperparameter tuning run, you specify the primary metric to use for evaluating run performance. You must log this metric so it is available to the hyperparameter tuning process.
In order to log the required metrics, you need to do the following **inside the training script**:
* Import the **azuremlsdk** package
```
library(azuremlsdk)
```
* Take the hyperparameters as command-line arguments to the script. This is necessary so that when HyperDrive carries out the hyperparameter sweep, it can run the training script with different values for the hyperparameters as defined by the search space (a short sketch of this follows the code snippet below).
* Use the [`log_metric_to_run()`](https://azure.github.io/azureml-sdk-for-r/reference/log_metric_to_run.html) function to log the hyperparameters and the primary metric.
```
log_metric_to_run("batch_size", batch_size)
...
log_metric_to_run("epochs", epochs)
...
log_metric_to_run("lr", lr)
...
log_metric_to_run("decay", decay)
...
log_metric_to_run("Loss", results[[1]])
```
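For example, a minimal sketch of reading the hyperparameters from the command line, assuming HyperDrive passes them as `--name value` pairs (this mirrors what the provided `cifar10_cnn.R` does):
```
args <- commandArgs(trailingOnly = TRUE)
# args alternates flag and value, e.g. c("--batch_size", "32", "--epochs", "350", ...)
batch_size <- as.numeric(args[2])
epochs <- as.numeric(args[4])
```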
## Create an estimator
An Azure ML **estimator** encapsulates the run configuration information needed for executing a training script on the compute target. Azure ML runs are run as containerized jobs on the specified compute target. By default, the Docker image built for your training job will include R, the Azure ML SDK, and a set of commonly used R packages. See the full list of default packages included [here](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html). The estimator is used to define the configuration for each of the child runs that the parent HyperDrive run will kick off.
To create the estimator, define the following:
* The directory that contains your scripts needed for training (`source_directory`). All the files in this directory are uploaded to the cluster node(s) for execution. The directory must contain your training script and any additional scripts required.
* The training script that will be executed (`entry_script`).
* The compute target (`compute_target`), in this case the AmlCompute cluster you created earlier.
* Any environment dependencies required for training. Since the training script requires the Keras package, which is not included in the image by default, pass the package name to the `cran_packages` parameter to have it installed in the Docker container where the job will run. See the [`estimator()`](https://azure.github.io/azureml-sdk-for-r/reference/estimator.html) reference for the full set of configurable options.
* Set the `use_gpu = TRUE` flag so the default base GPU Docker image will be built, since the job will be run on a GPU cluster.
```{r create_estimator, eval=FALSE}
est <- estimator(source_directory = "project_files",
entry_script = "cifar10_cnn.R",
compute_target = compute_target,
cran_packages = c("keras"),
use_gpu = TRUE)
```
## Configure the HyperDrive run
To kick off hyperparameter tuning in Azure ML, you will need to configure a HyperDrive run, which will in turn launch individual child runs of the training script with the corresponding hyperparameter values.
### Define search space
In this experiment, we will use four hyperparameters: batch size, number of epochs, learning rate, and decay. In order to begin tuning, we must define the range of values we would like to explore and how they will be sampled. This is called a parameter space definition and can be created with discrete or continuous ranges.
__Discrete hyperparameters__ are specified as a choice among discrete values represented as a list.
Advanced discrete hyperparameters can also be specified using a distribution. The following distributions are supported:
* `quniform(low, high, q)`
* `qloguniform(low, high, q)`
* `qnormal(mu, sigma, q)`
* `qlognormal(mu, sigma, q)`
__Continuous hyperparameters__ are specified as a distribution over a continuous range of values. The following distributions are supported:
* `uniform(low, high)`
* `loguniform(low, high)`
* `normal(mu, sigma)`
* `lognormal(mu, sigma)`
Here, we will use the [`random_parameter_sampling()`](https://azure.github.io/azureml-sdk-for-r/reference/random_parameter_sampling.html) function to define the search space for each hyperparameter. `batch_size` and `epochs` will be chosen from discrete sets while `lr` and `decay` will be drawn from continuous distributions.
Other available sampling function options are:
* [`grid_parameter_sampling()`](https://azure.github.io/azureml-sdk-for-r/reference/grid_parameter_sampling.html)
* [`bayesian_parameter_sampling()`](https://azure.github.io/azureml-sdk-for-r/reference/bayesian_parameter_sampling.html)
```{r search_space, eval=FALSE}
sampling <- random_parameter_sampling(list(batch_size = choice(c(16, 32, 64)),
epochs = choice(c(200, 350, 500)),
lr = normal(0.0001, 0.005),
decay = uniform(1e-6, 3e-6)))
```
### Define termination policy
To prevent resource waste, Azure ML can detect and terminate poorly performing runs. HyperDrive will do this automatically if you specify an early termination policy.
Here, you will use the [`bandit_policy()`](https://azure.github.io/azureml-sdk-for-r/reference/bandit_policy.html), which terminates any runs where the primary metric is not within the specified slack factor with respect to the best performing training run.
```{r termination_policy, eval=FALSE}
policy <- bandit_policy(slack_factor = 0.15)
```
Other termination policy options are:
* [`median_stopping_policy()`](https://azure.github.io/azureml-sdk-for-r/reference/median_stopping_policy.html)
* [`truncation_selection_policy()`](https://azure.github.io/azureml-sdk-for-r/reference/truncation_selection_policy.html)
If no policy is provided, all runs will continue to completion regardless of performance.
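For instance, a hedged sketch of a median stopping policy; the parameter names mirror the Python SDK and are assumptions to verify against the reference linked above:
```{r median_policy, eval=FALSE}
# Stop runs whose best primary metric falls below the median of the running
# averages, starting after a delay (values and parameter names are examples)
median_policy <- median_stopping_policy(evaluation_interval = 1L,
                                        delay_evaluation = 5L)
```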
### Finalize configuration
Now, you can create a `HyperDriveConfig` object to define your HyperDrive run. Along with the sampling and policy definitions, you need to specify the name of the primary metric that you want to track and whether you want to maximize or minimize it. The `primary_metric_name` must correspond with the name of the primary metric you logged in your training script. `max_total_runs` specifies the total number of child runs to launch. See the [hyperdrive_config()](https://azure.github.io/azureml-sdk-for-r/reference/hyperdrive_config.html) reference for the full set of configurable parameters.
```{r create_config, eval=FALSE}
hyperdrive_config <- hyperdrive_config(hyperparameter_sampling = sampling,
primary_metric_goal = primary_metric_goal("MINIMIZE"),
primary_metric_name = "Loss",
max_total_runs = 4,
policy = policy,
estimator = est)
```
## Submit the HyperDrive run
Finally submit the experiment to run on your cluster. The parent HyperDrive run will launch the individual child runs. `submit_experiment()` will return a `HyperDriveRun` object that you will use to interface with the run. In this tutorial, since the cluster we created scales to a max of `4` nodes, all 4 child runs will be launched in parallel.
```{r submit_run, eval=FALSE}
hyperdrive_run <- submit_experiment(exp, hyperdrive_config)
```
You can view the HyperDrive run's details as a table. Clicking the "Web View" link provided will bring you to Azure Machine Learning studio, where you can monitor the run in the UI.
```{r eval=FALSE}
view_run_details(hyperdrive_run)
```
Wait until hyperparameter tuning is complete before you run more code.
```{r eval=FALSE}
wait_for_run_completion(hyperdrive_run, show_output = TRUE)
```
## Analyse runs by performance
Finally, you can view and compare the metrics collected during all of the child runs!
```{r analyse_runs, eval=FALSE}
# Get the metrics of all the child runs
child_run_metrics <- get_child_run_metrics(hyperdrive_run)
child_run_metrics
# Get the child run objects sorted in descending order by the best primary metric
child_runs <- get_child_runs_sorted_by_primary_metric(hyperdrive_run)
child_runs
# Directly get the run object of the best performing run
best_run <- get_best_run_by_primary_metric(hyperdrive_run)
# Get the metrics of the best performing run
metrics <- get_run_metrics(best_run)
metrics
```
The `metrics` variable will include the values of the hyperparameters that resulted in the best performing run.
## Clean up resources
Delete the resources once you no longer need them. Don't delete any resource you plan to still use.
Delete the compute cluster:
```{r delete_compute, eval=FALSE}
delete_compute(compute_target)
```

View File

@@ -0,0 +1,124 @@
#' Modified from: "https://github.com/rstudio/keras/blob/master/vignettes/
#' examples/cifar10_cnn.R"
#'
#' Train a simple deep CNN on the CIFAR10 small images dataset.
#'
#' It gets down to 0.65 test logloss in 25 epochs, and down to 0.55 after 50
#' epochs, though it is still underfitting at that point.
library(keras)
install_keras()
library(azuremlsdk)
# Parameters --------------------------------------------------------------
args <- commandArgs(trailingOnly = TRUE)
batch_size <- as.numeric(args[2])
log_metric_to_run("batch_size", batch_size)
epochs <- as.numeric(args[4])
log_metric_to_run("epochs", epochs)
lr <- as.numeric(args[6])
log_metric_to_run("lr", lr)
decay <- as.numeric(args[8])
log_metric_to_run("decay", decay)
data_augmentation <- TRUE
# Data Preparation --------------------------------------------------------
# See ?dataset_cifar10 for more info
cifar10 <- dataset_cifar10()
# Feature scale RGB values in test and train inputs
x_train <- cifar10$train$x / 255
x_test <- cifar10$test$x / 255
y_train <- to_categorical(cifar10$train$y, num_classes = 10)
y_test <- to_categorical(cifar10$test$y, num_classes = 10)
# Defining Model ----------------------------------------------------------
# Initialize sequential model
model <- keras_model_sequential()
model %>%
# Start with hidden 2D convolutional layer being fed 32x32 pixel images
layer_conv_2d(
filter = 32, kernel_size = c(3, 3), padding = "same",
input_shape = c(32, 32, 3)
) %>%
layer_activation("relu") %>%
# Second hidden layer
layer_conv_2d(filter = 32, kernel_size = c(3, 3)) %>%
layer_activation("relu") %>%
# Use max pooling
layer_max_pooling_2d(pool_size = c(2, 2)) %>%
layer_dropout(0.25) %>%
# 2 additional hidden 2D convolutional layers
layer_conv_2d(filter = 32, kernel_size = c(3, 3), padding = "same") %>%
layer_activation("relu") %>%
layer_conv_2d(filter = 32, kernel_size = c(3, 3)) %>%
layer_activation("relu") %>%
# Use max pooling once more
layer_max_pooling_2d(pool_size = c(2, 2)) %>%
layer_dropout(0.25) %>%
# Flatten max filtered output into feature vector
# and feed into dense layer
layer_flatten() %>%
layer_dense(512) %>%
layer_activation("relu") %>%
layer_dropout(0.5) %>%
# Outputs from dense layer are projected onto 10 unit output layer
layer_dense(10) %>%
layer_activation("softmax")
opt <- optimizer_rmsprop(lr = lr, decay = decay)
model %>%
compile(loss = "categorical_crossentropy",
optimizer = opt,
metrics = "accuracy"
)
# Training ----------------------------------------------------------------
if (!data_augmentation) {
model %>%
fit(x_train,
y_train,
batch_size = batch_size,
epochs = epochs,
validation_data = list(x_test, y_test),
shuffle = TRUE
)
} else {
datagen <- image_data_generator(rotation_range = 20,
width_shift_range = 0.2,
height_shift_range = 0.2,
horizontal_flip = TRUE
)
datagen %>% fit_image_data_generator(x_train)
results <- evaluate(model, x_train, y_train, batch_size)
log_metric_to_run("Loss", results[[1]])
cat("Loss: ", results[[1]], "\n")
cat("Accuracy: ", results[[2]], "\n")
}

View File

@@ -0,0 +1,100 @@
---
title: "Install the Azure ML SDK for R"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{Install the Azure ML SDK for R}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
This article covers the step-by-step instructions for installing the Azure ML SDK for R.
You do not need to run this if you are working on an Azure Machine Learning Compute Instance, as the compute instance already has the Azure ML SDK preinstalled.
## Install Conda
If you do not have Conda already installed on your machine, you will first need to install it, since the Azure ML R SDK uses **reticulate** to bind to the Python SDK. We recommend installing [Miniconda](https://docs.conda.io/en/latest/miniconda.html), which is a smaller, lightweight version of Anaconda. Choose the 64-bit binary for Python 3.5 or later.
## Install the **azuremlsdk** R package
You will need **remotes** to install **azuremlsdk** from the GitHub repo.
``` {r install_remotes, eval=FALSE}
install.packages('remotes')
```
Then, you can use the `install_cran` function to install the package from CRAN.
``` {r install_azuremlsdk, eval=FALSE}
remotes::install_cran('azuremlsdk', repos = 'https://cloud.r-project.org/')
```
If you are using R installed from CRAN, which comes with 32-bit and 64-bit binaries, you may need to specify the parameter `INSTALL_opts=c("--no-multiarch")` to only build for the current 64-bit architecture.
``` {r eval=FALSE}
remotes::install_cran('azuremlsdk', repos = 'https://cloud.r-project.org/', INSTALL_opts=c("--no-multiarch"))
```
## Install the Azure ML Python SDK
Lastly, use the **azuremlsdk** R library to install the Python SDK. By default, `azuremlsdk::install_azureml()` will install the [latest version of the Python SDK](https://pypi.org/project/azureml-sdk/) in a conda environment called `r-azureml` if reticulate < 1.14 or `r-reticulate` if reticulate ≥ 1.14.
``` {r install_pythonsdk, eval=FALSE}
azuremlsdk::install_azureml()
```
If you would like to override the default version, environment name, or Python version, you can pass in those arguments. If you would like to restart the R session after installation or delete the conda environment if it already exists and create a new environment, you can also do so:
``` {r eval=FALSE}
azuremlsdk::install_azureml(version = NULL,
custom_envname = "<your conda environment name>",
conda_python_version = "<desired python version>",
restart_session = TRUE,
remove_existing_env = TRUE)
```
## Test installation
You can confirm your installation worked by loading the library and successfully retrieving a run.
``` {r test_installation, eval=FALSE}
library(azuremlsdk)
get_current_run()
```
## Troubleshooting
- In step 3 of the installation, if you get SSL errors on Windows, it is due to an
outdated OpenSSL binary. Install the latest OpenSSL binaries from
[here](https://wiki.openssl.org/index.php/Binaries).
- If installation fails due to this error:
```R
Error in strptime(xx, f, tz = tz) :
(converted from warning) unable to identify current timezone 'C':
please set environment variable 'TZ'
In R CMD INSTALL
Error in i.p(...) :
(converted from warning) installation of package C:/.../azureml_0.4.0.tar.gz had non-zero exit
status
```
You will need to set your time zone environment variable to GMT and restart the installation process.
```R
Sys.setenv(TZ='GMT')
```
- If the following permission error occurs while installing in RStudio,
change your RStudio session to administrator mode, and re-run the installation command.
```R
Downloading GitHub repo Azure/azureml-sdk-for-r@master
Skipping 2 packages ahead of CRAN: reticulate, rlang
Running `R CMD build`...
Error: (converted from warning) invalid package
'C:/.../file2b441bf23631'
In R CMD INSTALL
Error in i.p(...) :
(converted from warning) installation of package
C:/.../file2b441bf23631 had non-zero exit status
In addition: Warning messages:
1: In file(con, "r") :
cannot open file 'C:...\file2b44144a540f': Permission denied
2: In file(con, "r") :
cannot open file 'C:...\file2b4463c21577': Permission denied
```

View File

@@ -0,0 +1,16 @@
#' Copyright(c) Microsoft Corporation.
#' Licensed under the MIT license.
library(jsonlite)
init <- function() {
model_path <- Sys.getenv("AZUREML_MODEL_DIR")
model <- readRDS(file.path(model_path, "model.rds"))
message("logistic regression model loaded")
function(data) {
vars <- as.data.frame(fromJSON(data))
prediction <- as.numeric(predict(model, vars, type = "response") * 100)
toJSON(prediction)
}
}

View File

@@ -0,0 +1,33 @@
#' Copyright(c) Microsoft Corporation.
#' Licensed under the MIT license.
library(azuremlsdk)
library(optparse)
library(caret)
options <- list(
make_option(c("-d", "--data_folder"))
)
opt_parser <- OptionParser(option_list = options)
opt <- parse_args(opt_parser)
paste(opt$data_folder)
accidents <- readRDS(file.path(opt$data_folder, "accidents.Rd"))
summary(accidents)
mod <- glm(dead ~ dvcat + seatbelt + frontal + sex + ageOFocc + yearVeh + airbag + occRole, family = binomial, data = accidents)
summary(mod)
predictions <- factor(ifelse(predict(mod) > 0.1, "dead", "alive"))
conf_matrix <- confusionMatrix(predictions, accidents$dead)
message(conf_matrix)
log_metric_to_run("Accuracy", conf_matrix$overall["Accuracy"])
output_dir <- "outputs"
if (!dir.exists(output_dir)) {
dir.create(output_dir)
}
saveRDS(mod, file = "./outputs/model.rds")
message("Model saved")

View File

@@ -0,0 +1,326 @@
---
title: "Train and deploy your first model with Azure ML"
author: "David Smith"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{Train and deploy your first model with Azure ML}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
In this tutorial, you learn the foundational design patterns in Azure Machine Learning. You'll train and deploy a **caret** model to predict the likelihood of a fatality in an automobile accident. After completing this tutorial, you'll have the practical knowledge of the R SDK to scale up to developing more-complex experiments and workflows.
In this tutorial, you learn the following tasks:
* Connect your workspace
* Load data and prepare for training
* Upload data to the datastore so it is available for remote training
* Create a compute resource
* Train a caret model to predict probability of fatality
* Deploy a prediction endpoint
* Test the model from R
## Prerequisites
If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.
## Set up your development environment
The setup for your development work in this tutorial includes the following actions:
* Install required packages
* Connect to a workspace, so that your local computer can communicate with remote resources
* Create an experiment to track your runs
* Create a remote compute target to use for training
### Install required packages
This tutorial assumes you already have the Azure ML SDK installed. Go ahead and import the **azuremlsdk** package.
```{r eval=FALSE}
library(azuremlsdk)
```
The tutorial uses data from the [**DAAG** package](https://cran.r-project.org/package=DAAG). Install the package if you don't have it.
```{r eval=FALSE}
install.packages("DAAG")
```
The training and scoring scripts (`accidents.R` and `accident_predict.R`) have some additional dependencies. If you plan on running those scripts locally, make sure you have those required packages as well.
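If you do plan to run them locally, a hedged sketch of installing those dependencies (this list matches the packages mentioned later in this tutorial and used by the scoring script):
```{r eval=FALSE}
# Packages used by accidents.R and accident_predict.R when run locally
install.packages(c("caret", "e1071", "optparse", "jsonlite"))
```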
### Load your workspace
Instantiate a workspace object from your existing workspace. The following code will load the workspace details from the **config.json** file. You can also retrieve a workspace using [`get_workspace()`](https://azure.github.io/azureml-sdk-for-r/reference/get_workspace.html).
```{r load_workpace, eval=FALSE}
ws <- load_workspace_from_config()
```
### Create an experiment
An Azure ML experiment tracks a grouping of runs, typically from the same training script. Create an experiment to track the runs for training the caret model on the accidents data.
```{r create_experiment, eval=FALSE}
experiment_name <- "accident-logreg"
exp <- experiment(ws, experiment_name)
```
### Create a compute target
By using Azure Machine Learning Compute (AmlCompute), a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. Examples include VMs with GPU support. In this tutorial, you create a single-node AmlCompute cluster as your training environment. The code below creates the compute cluster for you if it doesn't already exist in your workspace.
You may need to wait a few minutes for your compute cluster to be provisioned if it doesn't already exist.
```{r create_cluster, eval=FALSE}
cluster_name <- "rcluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
vm_size <- "STANDARD_D2_V2"
compute_target <- create_aml_compute(workspace = ws,
cluster_name = cluster_name,
vm_size = vm_size,
max_nodes = 1)
wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
```
## Prepare data for training
This tutorial uses data from the **DAAG** package. This dataset includes data from over 25,000 car crashes in the US, with variables you can use to predict the likelihood of a fatality. First, import the data into R and transform it into a new dataframe `accidents` for analysis, and export it to an `Rdata` file.
```{r load_data, eval=FALSE}
library(DAAG)
data(nassCDS)
accidents <- na.omit(nassCDS[,c("dead","dvcat","seatbelt","frontal","sex","ageOFocc","yearVeh","airbag","occRole")])
accidents$frontal <- factor(accidents$frontal, labels=c("notfrontal","frontal"))
accidents$occRole <- factor(accidents$occRole)
saveRDS(accidents, file="accidents.Rd")
```
### Upload data to the datastore
Upload data to the cloud so that it can be accessed by your remote training environment. Each Azure ML workspace comes with a default datastore that stores the connection information to the Azure blob container that is provisioned in the storage account attached to the workspace. The following code will upload the accidents data you created above to that datastore.
```{r upload_data, eval=FALSE}
ds <- get_default_datastore(ws)
target_path <- "accidentdata"
upload_files_to_datastore(ds,
list("./project_files/accidents.Rd"),
target_path = target_path,
overwrite = TRUE)
```
## Train a model
For this tutorial, fit a logistic regression model on your uploaded data using your remote compute cluster. To submit a job, you need to:
* Prepare the training script
* Create an estimator
* Submit the job
### Prepare the training script
A training script called `accidents.R` has been provided for you in the "project_files" directory of this tutorial. Notice the following details **inside the training script** that have been done to leverage the Azure ML service for training:
* The training script takes an argument `-d` to find the directory that contains the training data. When you define and submit your job later, you point to the datastore for this argument. Azure ML will mount the storage folder to the remote cluster for the training job.
* The training script logs the final accuracy as a metric to the run record in Azure ML using `log_metric_to_run()`. The Azure ML SDK provides a set of logging APIs for logging various metrics during training runs. These metrics are recorded and persisted in the experiment run record. The metrics can then be accessed at any time or viewed in the run details page in [Azure Machine Learning studio](http://ml.azure.com). See the [reference](https://azure.github.io/azureml-sdk-for-r/reference/index.html#section-training-experimentation) for the full set of logging methods `log_*()`.
* The training script saves your model into a directory named **outputs**. The `./outputs` folder receives special treatment by Azure ML. During training, files written to `./outputs` are automatically uploaded to your run record by Azure ML and persisted as artifacts. By saving the trained model to `./outputs`, you'll be able to access and retrieve your model file even after the run is over and you no longer have access to your remote training environment.
### Create an estimator
An Azure ML estimator encapsulates the run configuration information needed for executing a training script on the compute target. Azure ML runs are run as containerized jobs on the specified compute target. By default, the Docker image built for your training job will include R, the Azure ML SDK, and a set of commonly used R packages. See the full list of default packages included [here](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html).
To create the estimator, define:
* The directory that contains your scripts needed for training (`source_directory`). All the files in this directory are uploaded to the cluster node(s) for execution. The directory must contain your training script and any additional scripts required.
* The training script that will be executed (`entry_script`).
* The compute target (`compute_target`), in this case the AmlCompute cluster you created earlier.
* The parameters required from the training script (`script_params`). Azure ML will run your training script as a command-line script with `Rscript`. In this tutorial you specify one argument to the script, the data directory mounting point, which you can access with `ds$path(target_path)`.
* Any environment dependencies required for training. The default Docker image built for training already contains the three packages (`caret`, `e1071`, and `optparse`) needed in the training script. So you don't need to specify additional information. If you are using R packages that are not included by default, use the estimator's `cran_packages` parameter to add additional CRAN packages. See the [`estimator()`](https://azure.github.io/azureml-sdk-for-r/reference/estimator.html) reference for the full set of configurable options.
```{r create_estimator, eval=FALSE}
est <- estimator(source_directory = "project_files",
entry_script = "accidents.R",
script_params = list("--data_folder" = ds$path(target_path)),
compute_target = compute_target
)
```
### Submit the job on the remote cluster
Finally submit the job to run on your cluster. `submit_experiment()` returns a Run object that you then use to interface with the run. In total, the first run takes **about 10 minutes**. But for later runs, the same Docker image is reused as long as the script dependencies don't change. In this case, the image is cached and the container startup time is much faster.
```{r submit_job, eval=FALSE}
run <- submit_experiment(exp, est)
```
You can view a table of the run's details. Clicking the "Web View" link provided will bring you to Azure Machine Learning studio, where you can monitor the run in the UI.
```{r view_run, eval=FALSE}
view_run_details(run)
```
Model training happens in the background. Wait until the model has finished training before you run more code.
```{r wait_run, eval=FALSE}
wait_for_run_completion(run, show_output = TRUE)
```
You, and colleagues with access to the workspace, can submit multiple experiments in parallel, and Azure ML will take care of scheduling the tasks on the compute cluster. You can even configure the cluster to automatically scale up to multiple nodes, and scale back down when there are no more compute tasks in the queue. This configuration is a cost-effective way for teams to share compute resources.
## Retrieve training results
Once your model has finished training, you can access the artifacts of your job that were persisted to the run record, including any metrics logged and the final trained model.
### Get the logged metrics
In the training script `accidents.R`, you logged a metric from your model: the accuracy of the predictions in the training data. You can see metrics in the [studio](https://ml.azure.com), or extract them to the local session as an R list as follows:
```{r metrics, eval=FALSE}
metrics <- get_run_metrics(run)
metrics
```
If you've run multiple experiments (say, using differing variables, algorithms, or hyperparameters), you can use the metrics from each run to compare and choose the model you'll use in production.
### Get the trained model
You can retrieve the trained model and look at the results in your local R session. The following code will download the contents of the `./outputs` directory, which includes the model file.
```{r retrieve_model, eval=FALSE}
download_files_from_run(run, prefix="outputs/")
accident_model <- readRDS("project_files/outputs/model.rds")
summary(accident_model)
```
You see some factors that contribute to an increase in the estimated probability of death:
* higher impact speed
* male driver
* older occupant
* passenger
You see lower probabilities of death with:
* presence of airbags
* presence of seatbelts
* frontal collision
The vehicle year of manufacture does not have a significant effect.
You can use this model to make new predictions:
```{r manual_predict, eval=FALSE}
newdata <- data.frame( # valid values shown below
dvcat="10-24", # "1-9km/h" "10-24" "25-39" "40-54" "55+"
seatbelt="none", # "none" "belted"
frontal="frontal", # "notfrontal" "frontal"
sex="f", # "f" "m"
ageOFocc=16, # age in years, 16-97
yearVeh=2002, # year of vehicle, 1955-2003
airbag="none", # "none" "airbag"
occRole="pass" # "driver" "pass"
)
## predicted probability of death for these variables, as a percentage
as.numeric(predict(accident_model, newdata, type = "response") * 100)
```
## Deploy as a web service
With your model, you can predict the danger of death from a collision. Use Azure ML to deploy your model as a prediction service. In this tutorial, you will deploy the web service in [Azure Container Instances](https://docs.microsoft.com/en-us/azure/container-instances/) (ACI).
### Register the model
First, register the model you downloaded to your workspace with [`register_model()`](https://azure.github.io/azureml-sdk-for-r/reference/register_model.html). A registered model can be any collection of files, but in this case the R model object is sufficient. Azure ML will use the registered model for deployment.
```{r register_model, eval=FALSE}
model <- register_model(ws,
                        model_path = "project_files/outputs/model.rds",
                        model_name = "accidents_model",
                        description = "Predict probability of auto accident")
```
### Define the inference dependencies
To create a web service for your model, you first need to create a scoring script (`entry_script`), an R script that takes variable values (in JSON format) as input and outputs a prediction from your model. For this tutorial, use the provided scoring file `accident_predict.R`. The scoring script must contain an `init()` method that loads your model and returns a function that uses the model to make a prediction based on the input data. See the [documentation](https://azure.github.io/azureml-sdk-for-r/reference/inference_config.html#details) for more details.
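To make the expected shape concrete, here is a minimal sketch of such a scoring script. The provided `accident_predict.R` may differ in its details; the sketch assumes Azure ML exposes the registered model's directory through the `AZUREML_MODEL_DIR` environment variable.
```
library(jsonlite)

init <- function() {
  # Load the registered model from the directory Azure ML provides to the service
  model_path <- Sys.getenv("AZUREML_MODEL_DIR")
  model <- readRDS(file.path(model_path, "model.rds"))

  # Return the scoring function: JSON in, predicted probability (as JSON) out
  function(data) {
    vars <- as.data.frame(fromJSON(data))
    prediction <- as.numeric(predict(model, vars, type = "response") * 100)
    toJSON(prediction)
  }
}
```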
Next, define an Azure ML **environment** for your script's package dependencies. With an environment, you specify R packages (from CRAN or elsewhere) that are needed for your script to run. You can also provide the values of environment variables that your script can reference to modify its behavior. By default, Azure ML will build the same default Docker image used with the estimator for training. Since the tutorial has no special requirements, create an environment with no special attributes.
```{r create_environment, eval=FALSE}
r_env <- r_environment(name = "basic_env")
```
If you want to use your own Docker image for deployment instead, specify the `custom_docker_image` parameter. See the [`r_environment()`](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html) reference for the full set of configurable options for defining an environment.
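For instance, here is a hedged sketch of an environment that adds a CRAN package, sets an environment variable, and points at your own image. The parameter names follow the `r_environment()`/`cran_package()` reference, and the image name and environment variable are placeholders.
```{r custom_environment, eval=FALSE}
custom_env <- r_environment(name = "custom_env",
                            cran_packages = list(cran_package("jsonlite")),
                            environment_variables = list(LOG_LEVEL = "verbose"),               # placeholder variable
                            custom_docker_image = "myregistry.azurecr.io/r-inference:latest")  # placeholder image
```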
Now you have everything you need to create an **inference config** for encapsulating your scoring script and environment dependencies.
``` {r create_inference_config, eval=FALSE}
inference_config <- inference_config(
  entry_script = "accident_predict.R",
  source_directory = "project_files",
  environment = r_env)
```
### Deploy to ACI
In this tutorial, you will deploy your service to ACI. This code provisions a single container to respond to inbound requests, which is suitable for testing and light loads. See [`aci_webservice_deployment_config()`](https://azure.github.io/azureml-sdk-for-r/reference/aci_webservice_deployment_config.html) for additional configurable options. (For production-scale deployments, you can also [deploy to Azure Kubernetes Service](https://azure.github.io/azureml-sdk-for-r/articles/deploy-to-aks/deploy-to-aks.html).)
``` {r create_aci_config, eval=FALSE}
aci_config <- aci_webservice_deployment_config(cpu_cores = 1, memory_gb = 0.5)
```
Now you deploy your model as a web service. Deployment **can take several minutes**.
```{r deploy_service, eval=FALSE}
aci_service <- deploy_model(ws,
                            'accident-pred',
                            list(model),
                            inference_config,
                            aci_config)
wait_for_deployment(aci_service, show_output = TRUE)
```
If you encounter any issues when deploying the web service, see the [troubleshooting guide](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-troubleshoot-deployment).
## Test the deployed service
Now that your model is deployed as a service, you can test the service from R using [`invoke_webservice()`](https://azure.github.io/azureml-sdk-for-r/reference/invoke_webservice.html). Provide a new set of data to predict from, convert it to JSON, and send it to the service.
```{r test_deployment, eval=FALSE}
library(jsonlite)
newdata <- data.frame( # valid values shown below
  dvcat = "10-24",     # "1-9km/h" "10-24" "25-39" "40-54" "55+"
  seatbelt = "none",   # "none" "belted"
  frontal = "frontal", # "notfrontal" "frontal"
  sex = "f",           # "f" "m"
  ageOFocc = 22,       # age in years, 16-97
  yearVeh = 2002,      # year of vehicle, 1955-2003
  airbag = "none",     # "none" "airbag"
  occRole = "pass"     # "driver" "pass"
)
prob <- invoke_webservice(aci_service, toJSON(newdata))
prob
```
You can also get the web service's HTTP endpoint, which accepts REST client calls. You can share this endpoint with anyone who wants to test the web service or integrate it into an application.
```{r get_endpoint, eval=FALSE}
aci_service$scoring_uri
```
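Any REST client can POST JSON to that URI. As a sketch, the same prediction can be requested with the **httr** package; if you enabled authentication on the service, you would also need to pass a service key in an `Authorization` header.
```{r call_endpoint, eval=FALSE}
library(httr)
library(jsonlite)

# Send the same JSON payload used above to the scoring endpoint
response <- POST(aci_service$scoring_uri,
                 body = toJSON(newdata),
                 content_type_json())
content(response)
```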
## Clean up resources
Delete the resources once you no longer need them. Don't delete any resource you plan to still use.
Delete the web service:
```{r delete_service, eval=FALSE}
delete_webservice(aci_service)
```
Delete the registered model:
```{r delete_model, eval=FALSE}
delete_model(model)
```
Delete the compute cluster:
```{r delete_compute, eval=FALSE}
delete_compute(compute_target)
```

View File

@@ -0,0 +1,62 @@
# Copyright 2015 The TensorFlow Authors. All Rights Reserved.
# Copyright 2016 RStudio, Inc. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
library(tensorflow)
install_tensorflow(version = "1.13.2-gpu")
library(azuremlsdk)
# Create the model
x <- tf$placeholder(tf$float32, shape(NULL, 784L))
W <- tf$Variable(tf$zeros(shape(784L, 10L)))
b <- tf$Variable(tf$zeros(shape(10L)))
y <- tf$nn$softmax(tf$matmul(x, W) + b)
# Define loss and optimizer
y_ <- tf$placeholder(tf$float32, shape(NULL, 10L))
cross_entropy <- tf$reduce_mean(-tf$reduce_sum(y_ * log(y),
                                               reduction_indices = 1L))
train_step <- tf$train$GradientDescentOptimizer(0.5)$minimize(cross_entropy)
# Create session and initialize variables
sess <- tf$Session()
sess$run(tf$global_variables_initializer())
# Load MNIST data
datasets <- tf$contrib$learn$datasets
mnist <- datasets$mnist$read_data_sets("MNIST-data", one_hot = TRUE)
# Train
for (i in 1:1000) {
  batches <- mnist$train$next_batch(100L)
  batch_xs <- batches[[1]]
  batch_ys <- batches[[2]]
  sess$run(train_step,
           feed_dict = dict(x = batch_xs, y_ = batch_ys))
}
# Test trained model
correct_prediction <- tf$equal(tf$argmax(y, 1L), tf$argmax(y_, 1L))
accuracy <- tf$reduce_mean(tf$cast(correct_prediction, tf$float32))
cat("Accuracy: ", sess$run(accuracy,
feed_dict = dict(x = mnist$test$images,
y_ = mnist$test$labels)))
log_metric_to_run("accuracy",
sess$run(accuracy, feed_dict = dict(x = mnist$test$images,
y_ = mnist$test$labels)))

View File

@@ -0,0 +1,143 @@
---
title: "Train a TensorFlow model"
date: "`r Sys.Date()`"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Train a TensorFlow model}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---
This tutorial demonstrates how to run a TensorFlow job at scale using Azure ML. You will train a TensorFlow model to classify handwritten digits (MNIST) using a deep neural network (DNN) and log your results to the Azure ML service.
## Prerequisites
If you don't have access to an Azure ML workspace, follow the [setup tutorial](https://azure.github.io/azureml-sdk-for-r/articles/configuration.html) to configure and create a workspace.
## Set up development environment
The setup for your development work in this tutorial includes the following actions:
* Import required packages
* Connect to a workspace
* Create an experiment to track your runs
* Create a remote compute target to use for training
### Import **azuremlsdk** package
```{r eval=FALSE}
library(azuremlsdk)
```
### Load your workspace
Instantiate a workspace object from your existing workspace. The following code will load the workspace details from a **config.json** file if you previously wrote one out with [`write_workspace_config()`](https://azure.github.io/azureml-sdk-for-r/reference/write_workspace_config.html).
```{r load_workpace, eval=FALSE}
ws <- load_workspace_from_config()
```
Or, you can retrieve a workspace by directly specifying your workspace details:
```{r get_workpace, eval=FALSE}
ws <- get_workspace("<your workspace name>", "<your subscription ID>", "<your resource group>")
```
### Create an experiment
An Azure ML **experiment** tracks a grouping of runs, typically from the same training script. Create an experiment to track the runs for training the TensorFlow model on the MNIST data.
```{r create_experiment, eval=FALSE}
exp <- experiment(workspace = ws, name = "tf-mnist")
```
If you would like to track your runs in an existing experiment, simply specify that experiment's name to the `name` parameter of `experiment()`.
### Create a compute target
By using Azure Machine Learning Compute (AmlCompute), a managed service, data scientists can train machine learning models on clusters of Azure virtual machines. In this tutorial, you create a GPU-enabled cluster as your training environment. The code below creates the compute cluster for you if it doesn't already exist in your workspace.
Provisioning a new cluster may take a few minutes.
```{r create_cluster, eval=FALSE}
cluster_name <- "gpucluster"
compute_target <- get_compute(ws, cluster_name = cluster_name)
if (is.null(compute_target)) {
  vm_size <- "STANDARD_NC6"
  compute_target <- create_aml_compute(workspace = ws,
                                       cluster_name = cluster_name,
                                       vm_size = vm_size,
                                       max_nodes = 4)
  wait_for_provisioning_completion(compute_target, show_output = TRUE)
}
```
## Prepare the training script
A training script called `tf_mnist.R` has been provided for you in the "project_files" directory of this tutorial. The Azure ML SDK provides a set of logging APIs for logging various metrics during training runs. These metrics are recorded and persisted in the experiment run record, and can be accessed at any time or viewed in the run details page in [Azure Machine Learning studio](http://ml.azure.com/).
In order to collect and upload run metrics, you need to do the following **inside the training script**:
* Import the **azuremlsdk** package
```
library(azuremlsdk)
```
* Add the [`log_metric_to_run()`](https://azure.github.io/azureml-sdk-for-r/reference/log_metric_to_run.html) function to track our primary metric, "accuracy", for this experiment. If you have your own training script with several important metrics, simply create a logging call for each one within the script.
```
log_metric_to_run("accuracy",
sess$run(accuracy,
feed_dict = dict(x = mnist$test$images, y_ = mnist$test$labels)))
```
See the [reference](https://azure.github.io/azureml-sdk-for-r/reference/index.html#section-training-experimentation) for the full set of logging methods `log_*()` available from the R SDK.
## Create an estimator
An Azure ML **estimator** encapsulates the run configuration information needed for executing a training script on the compute target. Azure ML runs are run as containerized jobs on the specified compute target. By default, the Docker image built for your training job will include R, the Azure ML SDK, and a set of commonly used R packages. See the full list of default packages included [here](https://azure.github.io/azureml-sdk-for-r/reference/r_environment.html).
To create the estimator, define the following:
* The directory that contains your scripts needed for training (`source_directory`). All the files in this directory are uploaded to the cluster node(s) for execution. The directory must contain your training script and any additional scripts required.
* The training script that will be executed (`entry_script`).
* The compute target (`compute_target`), in this case the AmlCompute cluster you created earlier.
* Any environment dependencies required for training. Since the training script requires the TensorFlow package, which is not included in the image by default, pass the package name to the `cran_packages` parameter to have it installed in the Docker container where the job will run. See the [`estimator()`](https://azure.github.io/azureml-sdk-for-r/reference/estimator.html) reference for the full set of configurable options.
* Set the `use_gpu = TRUE` flag so the default base GPU Docker image will be built, since the job will be run on a GPU cluster.
```{r create_estimator, eval=FALSE}
est <- estimator(source_directory = "project_files",
                 entry_script = "tf_mnist.R",
                 compute_target = compute_target,
                 cran_packages = c("tensorflow"),
                 use_gpu = TRUE)
```
## Submit the job
Finally submit the job to run on your cluster. [`submit_experiment()`](https://azure.github.io/azureml-sdk-for-r/reference/submit_experiment.html) returns a `Run` object that you can then use to interface with the run.
```{r submit_job, eval=FALSE}
run <- submit_experiment(exp, est)
```
You can view the run's details as a table. Clicking the "Web View" link provided will bring you to Azure Machine Learning studio, where you can monitor the run in the UI.
```{r eval=FALSE}
view_run_details(run)
```
Model training happens in the background. Wait until the model has finished training before you run more code.
```{r eval=FALSE}
wait_for_run_completion(run, show_output = TRUE)
```
## View run metrics
Once your job has finished, you can view the metrics collected during your TensorFlow run.
```{r get_metrics, eval=FALSE}
metrics <- get_run_metrics(run)
metrics
```
## Clean up resources
Delete the resources once you no longer need them. Don't delete any resource you plan to still use.
Delete the compute cluster:
```{r delete_compute, eval=FALSE}
delete_compute(compute_target)
```

View File

@@ -1,497 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/deployment/accelerated-models/accelerated-models-object-detection.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Azure ML Hardware Accelerated Object Detection"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This tutorial will show you how to deploy an object detection service based on the SSD-VGG model in just a few minutes using the Azure Machine Learning Accelerated AI service.\n",
"\n",
"We will use the SSD-VGG model accelerated on an FPGA. Our Accelerated Models Service handles translating deep neural networks (DNN) into an FPGA program.\n",
"\n",
"The steps in this notebook are: \n",
"1. [Setup Environment](#set-up-environment)\n",
"* [Construct Model](#construct-model)\n",
" * Image Preprocessing\n",
" * Featurizer\n",
" * Save Model\n",
" * Save input and output tensor names\n",
"* [Create Image](#create-image)\n",
"* [Deploy Image](#deploy-image)\n",
"* [Test the Service](#test-service)\n",
" * Create Client\n",
" * Serve the model\n",
"* [Cleanup](#cleanup)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"set-up-environment\"></a>\n",
"## 1. Set up Environment\n",
"### 1.a. Imports"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import tensorflow as tf"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 1.b. Retrieve Workspace\n",
"If you haven't created a Workspace, please follow [this notebook](\"../../../configuration.ipynb\") to do so. If you have, run the codeblock below to retrieve it. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"construct-model\"></a>\n",
"## 2. Construct model\n",
"### 2.a. Image preprocessing\n",
"We'd like our service to accept JPEG images as input. However the input to SSD-VGG is a float tensor of shape \\[1, 300, 300, 3\\]. The first dimension is batch, then height, width, and channels (i.e. NHWC). To bridge this gap, we need code that decodes JPEG images and resizes them appropriately for input to SSD-VGG. The Accelerated AI service can execute TensorFlow graphs as part of the service and we'll use that ability to do the image preprocessing. This code defines a TensorFlow graph that preprocesses an array of JPEG images (as TensorFlow strings) and produces a tensor that is ready to be featurized by SSD-VGG.\n",
"\n",
"**Note:** Expect to see TF deprecation warnings until we port our SDK over to use Tensorflow 2.0."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Input images as a two-dimensional tensor containing an arbitrary number of images represented a strings\n",
"import azureml.accel.models.utils as utils\n",
"tf.reset_default_graph()\n",
"\n",
"in_images = tf.placeholder(tf.string)\n",
"image_tensors = utils.preprocess_array(in_images, output_width=300, output_height=300, preserve_aspect_ratio=False)\n",
"print(image_tensors.shape)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.b. Featurizer\n",
"The SSD-VGG model is different from our other models in that it generates 12 tensor outputs. These corresponds to x,y displacements of the anchor boxes and the detection confidence (for 21 classes). Because these outputs are not convenient to work with, we will later use a pre-defined post-processing utility to transform the outputs into a simplified list of bounding boxes with their respective class and confidence.\n",
"\n",
"For more information about the output tensors, take this example: the output tensor 'ssd_300_vgg/block4_box/Reshape_1:0' has a shape of [None, 37, 37, 4, 21]. This gives the pre-softmax confidence for 4 anchor boxes situated at each site of a 37 x 37 grid imposed on the image, one confidence score for each of the 21 classes. The first dimension is the batch dimension. Likewise, 'ssd_300_vgg/block4_box/Reshape:0' has shape [None, 37, 37, 4, 4] and encodes the (cx, cy) center shift and rescaling (sw, sh) relative to each anchor box. Refer to the [SSD-VGG paper](https://arxiv.org/abs/1512.02325) to understand how these are computed. The other 10 tensors are defined similarly."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.accel.models import SsdVgg\n",
"\n",
"saved_model_dir = os.path.join(os.path.expanduser('~'), 'models')\n",
"model_graph = SsdVgg(saved_model_dir, is_frozen = True)\n",
"\n",
"print('SSD-VGG Input Tensors:')\n",
"for idx, input_name in enumerate(model_graph.input_tensor_list):\n",
" print('{}, {}'.format(input_name, model_graph.get_input_dims(idx)))\n",
" \n",
"print('SSD-VGG Output Tensors:')\n",
"for idx, output_name in enumerate(model_graph.output_tensor_list):\n",
" print('{}, {}'.format(output_name, model_graph.get_output_dims(idx)))\n",
"\n",
"ssd_outputs = model_graph.import_graph_def(image_tensors, is_training=False)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.c. Save Model\n",
"Now that we loaded both parts of the tensorflow graph (preprocessor and SSD-VGG featurizer), we can save the graph and associated variables to a directory which we can register as an Azure ML Model."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model_name = \"ssdvgg\"\n",
"model_save_path = os.path.join(saved_model_dir, model_name, \"saved_model\")\n",
"print(\"Saving model in {}\".format(model_save_path))\n",
"\n",
"output_map = {}\n",
"for i, output in enumerate(ssd_outputs):\n",
" output_map['out_{}'.format(i)] = output\n",
"\n",
"with tf.Session() as sess:\n",
" model_graph.restore_weights(sess)\n",
" tf.saved_model.simple_save(sess, \n",
" model_save_path, \n",
" inputs={'images': in_images}, \n",
" outputs=output_map)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.d. Important! Save names of input and output tensors\n",
"\n",
"These input and output tensors that were created during the preprocessing and classifier steps are also going to be used when **converting the model** to an Accelerated Model that can run on FPGA's and for **making an inferencing request**. It is very important to save this information!"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"input_tensors = in_images.name\n",
"# We will use the list of output tensors during inferencing\n",
"output_tensors = [output.name for output in ssd_outputs]\n",
"# However, for multiple output tensors, our AccelOnnxConverter will \n",
"# accept comma-delimited strings (lists will cause error)\n",
"output_tensors_str = \",\".join(output_tensors)\n",
"\n",
"print(input_tensors)\n",
"print(output_tensors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"create-image\"></a>\n",
"## 3. Create AccelContainerImage\n",
"Below we will execute all the same steps as in the [Quickstart](./accelerated-models-quickstart.ipynb#create-image) to package the model we have saved locally into an accelerated Docker image saved in our workspace. To complete all the steps, it may take a few minutes. For more details on each step, check out the [Quickstart section on model registration](./accelerated-models-quickstart.ipynb#register-model)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"from azureml.core.model import Model\n",
"from azureml.core.image import Image\n",
"from azureml.accel import AccelOnnxConverter\n",
"from azureml.accel import AccelContainerImage\n",
"\n",
"# Retrieve workspace\n",
"ws = Workspace.from_config()\n",
"print(\"Successfully retrieved workspace:\", ws.name, ws.resource_group, ws.location, ws.subscription_id, '\\n')\n",
"\n",
"# Register model\n",
"registered_model = Model.register(workspace = ws,\n",
" model_path = model_save_path,\n",
" model_name = model_name)\n",
"print(\"Successfully registered: \", registered_model.name, registered_model.description, registered_model.version, '\\n', sep = '\\t')\n",
"\n",
"# Convert model\n",
"convert_request = AccelOnnxConverter.convert_tf_model(ws, registered_model, input_tensors, output_tensors_str)\n",
"if convert_request.wait_for_completion(show_output = False):\n",
" # If the above call succeeded, get the converted model\n",
" converted_model = convert_request.result\n",
" print(\"\\nSuccessfully converted: \", converted_model.name, converted_model.url, converted_model.version, \n",
" converted_model.id, converted_model.created_time, '\\n')\n",
"else:\n",
" print(\"Model conversion failed. Showing output.\")\n",
" convert_request.wait_for_completion(show_output = True)\n",
"\n",
"# Package into AccelContainerImage\n",
"image_config = AccelContainerImage.image_configuration()\n",
"# Image name must be lowercase\n",
"image_name = \"{}-image\".format(model_name)\n",
"image = Image.create(name = image_name,\n",
" models = [converted_model],\n",
" image_config = image_config, \n",
" workspace = ws)\n",
"image.wait_for_creation()\n",
"print(\"Created AccelContainerImage: {} {} {}\\n\".format(image.name, image.creation_state, image.image_location))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"deploy-image\"></a>\n",
"## 4. Deploy image\n",
"Once you have an Azure ML Accelerated Image in your Workspace, you can deploy it to two destinations, to a Databox Edge machine or to an AKS cluster. \n",
"\n",
"### 4.a. Deploy to Databox Edge Machine using IoT Hub\n",
"See the sample [here](https://github.com/Azure-Samples/aml-real-time-ai/) for using the Azure IoT CLI extension for deploying your Docker image to your Databox Edge Machine.\n",
"\n",
"### 4.b. Deploy to AKS Cluster\n",
"Same as in the [Quickstart section on image deployment](./accelerated-models-quickstart.ipynb#deploy-image), we are going to create an AKS cluster with FPGA-enabled machines, then deploy our service to it.\n",
"#### Create AKS ComputeTarget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"\n",
"# Uses the specific FPGA enabled VM (sku: Standard_PB6s)\n",
"# Standard_PB6s are available in: eastus, westus2, westeurope, southeastasia\n",
"prov_config = AksCompute.provisioning_configuration(vm_size = \"Standard_PB6s\",\n",
" agent_count = 1, \n",
" location = \"eastus\")\n",
"\n",
"aks_name = 'aks-pb6-obj'\n",
"# Create the cluster\n",
"aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = prov_config)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Provisioning an AKS cluster might take awhile (15 or so minutes), and we want to wait until it's successfully provisioned before we can deploy a service to it. If you interrupt this cell, provisioning of the cluster will continue. You can re-run it or check the status in your Workspace under Compute."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"aks_target.wait_for_completion(show_output = True)\n",
"print(aks_target.provisioning_state)\n",
"print(aks_target.provisioning_errors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Deploy AccelContainerImage to AKS ComputeTarget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"from azureml.core.webservice import Webservice, AksWebservice\n",
"\n",
"# Set the web service configuration (for creating a test service, we don't want autoscale enabled)\n",
"# Authentication is enabled by default, but for testing we specify False\n",
"aks_config = AksWebservice.deploy_configuration(autoscale_enabled=False,\n",
" num_replicas=1,\n",
" auth_enabled = False)\n",
"\n",
"aks_service_name ='my-aks-service-3'\n",
"\n",
"aks_service = Webservice.deploy_from_image(workspace = ws,\n",
" name = aks_service_name,\n",
" image = image,\n",
" deployment_config = aks_config,\n",
" deployment_target = aks_target)\n",
"aks_service.wait_for_deployment(show_output = True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"test-service\"></a>\n",
"## 5. Test the service\n",
"<a id=\"create-client\"></a>\n",
"### 5.a. Create Client\n",
"The image supports gRPC and the TensorFlow Serving \"predict\" API. We will create a PredictionClient from the Webservice object that can call into the docker image to get predictions. If you do not have the Webservice object, you can also create [PredictionClient](https://docs.microsoft.com/en-us/python/api/azureml-accel-models/azureml.accel.predictionclient?view=azure-ml-py) directly.\n",
"\n",
"**Note:** If you chose to use auth_enabled=True when creating your AksWebservice.deploy_configuration(), see documentation [here](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#get-keys--) on how to retrieve your keys and use either key as an argument to PredictionClient(...,access_token=key).\n",
"**WARNING:** If you are running on Azure Notebooks free compute, you will not be able to make outgoing calls to your service. Try locating your client on a different machine to consume it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Using the grpc client in AzureML Accelerated Models SDK\n",
"from azureml.accel import client_from_service\n",
"\n",
"# Initialize AzureML Accelerated Models client\n",
"client = client_from_service(aks_service)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can adapt the client [code](https://github.com/Azure/aml-real-time-ai/blob/master/pythonlib/amlrealtimeai/client.py) to meet your needs. There is also an example C# [client](https://github.com/Azure/aml-real-time-ai/blob/master/sample-clients/csharp).\n",
"\n",
"The service provides an API that is compatible with TensorFlow Serving. There are instructions to download a sample client [here](https://www.tensorflow.org/serving/setup)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"serve-model\"></a>\n",
"### 5.b. Serve the model\n",
"The SSD-VGG model returns the confidence and bounding boxes for all possible anchor boxes. As mentioned earlier, we will use a post-processing routine to transform this into a list of bounding boxes (y1, x1, y2, x2) where x, y are fractional coordinates measured from left and top respectively. A respective list of classes and scores is also returned to tag each bounding box. Below we make use of this information to draw the bounding boxes on top the original image. Note that in the post-processing routine we select a confidence threshold of 0.5."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import cv2\n",
"from matplotlib import pyplot as plt\n",
"\n",
"colors_tableau = [(255, 255, 255), (31, 119, 180), (174, 199, 232), (255, 127, 14), (255, 187, 120),\n",
" (44, 160, 44), (152, 223, 138), (214, 39, 40), (255, 152, 150),\n",
" (148, 103, 189), (197, 176, 213), (140, 86, 75), (196, 156, 148),\n",
" (227, 119, 194), (247, 182, 210), (127, 127, 127), (199, 199, 199),\n",
" (188, 189, 34), (219, 219, 141), (23, 190, 207), (158, 218, 229)]\n",
"\n",
"\n",
"def draw_boxes_on_img(img, classes, scores, bboxes, thickness=2):\n",
" shape = img.shape\n",
" for i in range(bboxes.shape[0]):\n",
" bbox = bboxes[i]\n",
" color = colors_tableau[classes[i]]\n",
" # Draw bounding box...\n",
" p1 = (int(bbox[0] * shape[0]), int(bbox[1] * shape[1]))\n",
" p2 = (int(bbox[2] * shape[0]), int(bbox[3] * shape[1]))\n",
" cv2.rectangle(img, p1[::-1], p2[::-1], color, thickness)\n",
" # Draw text...\n",
" s = '%s/%.3f' % (classes[i], scores[i])\n",
" p1 = (p1[0]-5, p1[1])\n",
" cv2.putText(img, s, p1[::-1], cv2.FONT_HERSHEY_DUPLEX, 0.4, color, 1)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.accel._external.ssdvgg_utils as ssdvgg_utils\n",
"\n",
"result = client.score_file(path=\"meeting.jpg\", input_name=input_tensors, outputs=output_tensors)\n",
"classes, scores, bboxes = ssdvgg_utils.postprocess(result, select_threshold=0.5)\n",
"\n",
"img = cv2.imread('meeting.jpg', 1)\n",
"img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)\n",
"draw_boxes_on_img(img, classes, scores, bboxes)\n",
"plt.imshow(img)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"cleanup\"></a>\n",
"## 6. Cleanup\n",
"It's important to clean up your resources, so that you won't incur unnecessary costs. In the [next notebook](./accelerated-models-training.ipynb) you will learn how to train a classfier on a new dataset using transfer learning."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.delete()\n",
"aks_target.delete()\n",
"image.delete()\n",
"registered_model.delete()\n",
"converted_model.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "coverste"
},
{
"name": "paledger"
},
{
"name": "sukha"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.6"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,7 +0,0 @@
name: accelerated-models-object-detection
dependencies:
- pip:
- azureml-sdk
- azureml-accel-models[cpu]
- opencv-python
- matplotlib

View File

@@ -1,555 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/deployment/accelerated-models/accelerated-models-quickstart.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Azure ML Hardware Accelerated Models Quickstart"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This tutorial will show you how to deploy an image recognition service based on the ResNet 50 classifier using the Azure Machine Learning Accelerated Models service. Get more information about our service from our [documentation](https://docs.microsoft.com/en-us/azure/machine-learning/service/concept-accelerate-with-fpgas), [API reference](https://docs.microsoft.com/en-us/python/api/azureml-accel-models/azureml.accel?view=azure-ml-py), or [forum](https://aka.ms/aml-forum).\n",
"\n",
"We will use an accelerated ResNet50 featurizer running on an FPGA. Our Accelerated Models Service handles translating deep neural networks (DNN) into an FPGA program.\n",
"\n",
"For more information about using other models besides Resnet50, see the [README](./README.md).\n",
"\n",
"The steps covered in this notebook are: \n",
"1. [Set up environment](#set-up-environment)\n",
"* [Construct model](#construct-model)\n",
" * Image Preprocessing\n",
" * Featurizer (Resnet50)\n",
" * Classifier\n",
" * Save Model\n",
"* [Register Model](#register-model)\n",
"* [Convert into Accelerated Model](#convert-model)\n",
"* [Create Image](#create-image)\n",
"* [Deploy](#deploy-image)\n",
"* [Test service](#test-service)\n",
"* [Clean-up](#clean-up)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"set-up-environment\"></a>\n",
"## 1. Set up environment"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import tensorflow as tf"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Retrieve Workspace\n",
"If you haven't created a Workspace, please follow [this notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/configuration.ipynb) to do so. If you have, run the codeblock below to retrieve it. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"construct-model\"></a>\n",
"## 2. Construct model\n",
"\n",
"There are three parts to the model we are deploying: pre-processing, featurizer with ResNet50, and classifier with ImageNet dataset. Then we will save this complete Tensorflow model graph locally before registering it to your Azure ML Workspace.\n",
"\n",
"### 2.a. Image preprocessing\n",
"We'd like our service to accept JPEG images as input. However the input to ResNet50 is a tensor. So we need code that decodes JPEG images and does the preprocessing required by ResNet50. The Accelerated AI service can execute TensorFlow graphs as part of the service and we'll use that ability to do the image preprocessing. This code defines a TensorFlow graph that preprocesses an array of JPEG images (as strings) and produces a tensor that is ready to be featurized by ResNet50.\n",
"\n",
"**Note:** Expect to see TF deprecation warnings until we port our SDK over to use Tensorflow 2.0."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Input images as a two-dimensional tensor containing an arbitrary number of images represented a strings\n",
"import azureml.accel.models.utils as utils\n",
"tf.reset_default_graph()\n",
"\n",
"in_images = tf.placeholder(tf.string)\n",
"image_tensors = utils.preprocess_array(in_images)\n",
"print(image_tensors.shape)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.b. Featurizer\n",
"We use ResNet50 as a featurizer. In this step we initialize the model. This downloads a TensorFlow checkpoint of the quantized ResNet50."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.accel.models import QuantizedResnet50\n",
"save_path = os.path.expanduser('~/models')\n",
"model_graph = QuantizedResnet50(save_path, is_frozen = True)\n",
"feature_tensor = model_graph.import_graph_def(image_tensors)\n",
"print(model_graph.version)\n",
"print(feature_tensor.name)\n",
"print(feature_tensor.shape)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.c. Classifier\n",
"The model we downloaded includes a classifier which takes the output of the ResNet50 and identifies an image. This classifier is trained on the ImageNet dataset. We are going to use this classifier for our service. The next [notebook](./accelerated-models-training.ipynb) shows how to train a classifier for a different data set. The input to the classifier is a tensor matching the output of our ResNet50 featurizer."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"classifier_output = model_graph.get_default_classifier(feature_tensor)\n",
"print(classifier_output)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.d. Save Model\n",
"Now that we loaded all three parts of the tensorflow graph (preprocessor, resnet50 featurizer, and the classifier), we can save the graph and associated variables to a directory which we can register as an Azure ML Model."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# model_name must be lowercase\n",
"model_name = \"resnet50\"\n",
"model_save_path = os.path.join(save_path, model_name)\n",
"print(\"Saving model in {}\".format(model_save_path))\n",
"\n",
"with tf.Session() as sess:\n",
" model_graph.restore_weights(sess)\n",
" tf.saved_model.simple_save(sess, model_save_path,\n",
" inputs={'images': in_images},\n",
" outputs={'output_alias': classifier_output})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 2.e. Important! Save names of input and output tensors\n",
"\n",
"These input and output tensors that were created during the preprocessing and classifier steps are also going to be used when **converting the model** to an Accelerated Model that can run on FPGA's and for **making an inferencing request**. It is very important to save this information! You can see our defaults for all the models in the [README](./README.md).\n",
"\n",
"By default for Resnet50, these are the values you should see when running the cell below: \n",
"* input_tensors = \"Placeholder:0\"\n",
"* output_tensors = \"classifier/resnet_v1_50/predictions/Softmax:0\""
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"input_tensors = in_images.name\n",
"output_tensors = classifier_output.name\n",
"\n",
"print(input_tensors)\n",
"print(output_tensors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"register-model\"></a>\n",
"## 3. Register Model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can add tags and descriptions to your models. Using tags, you can track useful information such as the name and version of the machine learning library used to train the model. Note that tags must be alphanumeric."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"from azureml.core.model import Model\n",
"\n",
"registered_model = Model.register(workspace = ws,\n",
" model_path = model_save_path,\n",
" model_name = model_name)\n",
"\n",
"print(\"Successfully registered: \", registered_model.name, registered_model.description, registered_model.version, sep = '\\t')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"convert-model\"></a>\n",
"## 4. Convert Model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For conversion you need to provide names of input and output tensors. This information can be found from the model_graph you saved in step 2.e. above.\n",
"\n",
"**Note**: Conversion may take a while and on average for FPGA model it is about 1-3 minutes and it depends on model type."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"from azureml.accel import AccelOnnxConverter\n",
"\n",
"convert_request = AccelOnnxConverter.convert_tf_model(ws, registered_model, input_tensors, output_tensors)\n",
"\n",
"if convert_request.wait_for_completion(show_output = False):\n",
" # If the above call succeeded, get the converted model\n",
" converted_model = convert_request.result\n",
" print(\"\\nSuccessfully converted: \", converted_model.name, converted_model.url, converted_model.version, \n",
" converted_model.id, converted_model.created_time, '\\n')\n",
"else:\n",
" print(\"Model conversion failed. Showing output.\")\n",
" convert_request.wait_for_completion(show_output = True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"create-image\"></a>\n",
"## 5. Package the model into an Image"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can add tags and descriptions to image. Also, for FPGA model an image can only contain **single** model.\n",
"\n",
"**Note**: The following command can take few minutes. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.image import Image\n",
"from azureml.accel import AccelContainerImage\n",
"\n",
"image_config = AccelContainerImage.image_configuration()\n",
"# Image name must be lowercase\n",
"image_name = \"{}-image\".format(model_name)\n",
"\n",
"image = Image.create(name = image_name,\n",
" models = [converted_model],\n",
" image_config = image_config, \n",
" workspace = ws)\n",
"image.wait_for_creation(show_output = False)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"deploy-image\"></a>\n",
"## 6. Deploy\n",
"Once you have an Azure ML Accelerated Image in your Workspace, you can deploy it to two destinations, to a Databox Edge machine or to an AKS cluster. \n",
"\n",
"### 6.a. Databox Edge Machine using IoT Hub\n",
"See the sample [here](https://github.com/Azure-Samples/aml-real-time-ai/) for using the Azure IoT CLI extension for deploying your Docker image to your Databox Edge Machine.\n",
"\n",
"### 6.b. Azure Kubernetes Service (AKS) using Azure ML Service\n",
"We are going to create an AKS cluster with FPGA-enabled machines, then deploy our service to it. For more information, see [AKS official docs](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-deploy-and-where#aks).\n",
"\n",
"#### Create AKS ComputeTarget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"sample-akscompute-provision"
]
},
"outputs": [],
"source": [
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"\n",
"# Uses the specific FPGA enabled VM (sku: Standard_PB6s)\n",
"# Standard_PB6s are available in: eastus, westus2, westeurope, southeastasia\n",
"prov_config = AksCompute.provisioning_configuration(vm_size = \"Standard_PB6s\",\n",
" agent_count = 1, \n",
" location = \"eastus\")\n",
"\n",
"aks_name = 'my-aks-pb6'\n",
"# Create the cluster\n",
"aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = prov_config)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Provisioning an AKS cluster might take awhile (15 or so minutes), and we want to wait until it's successfully provisioned before we can deploy a service to it. If you interrupt this cell, provisioning of the cluster will continue. You can also check the status in your Workspace under Compute."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"aks_target.wait_for_completion(show_output = True)\n",
"print(aks_target.provisioning_state)\n",
"print(aks_target.provisioning_errors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Deploy AccelContainerImage to AKS ComputeTarget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"from azureml.core.webservice import Webservice, AksWebservice\n",
"\n",
"# Set the web service configuration (for creating a test service, we don't want autoscale enabled)\n",
"# Authentication is enabled by default, but for testing we specify False\n",
"aks_config = AksWebservice.deploy_configuration(autoscale_enabled=False,\n",
" num_replicas=1,\n",
" auth_enabled = False)\n",
"\n",
"aks_service_name ='my-aks-service-1'\n",
"\n",
"aks_service = Webservice.deploy_from_image(workspace = ws,\n",
" name = aks_service_name,\n",
" image = image,\n",
" deployment_config = aks_config,\n",
" deployment_target = aks_target)\n",
"aks_service.wait_for_deployment(show_output = True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"test-service\"></a>\n",
"## 7. Test the service"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 7.a. Create Client\n",
"The image supports gRPC and the TensorFlow Serving \"predict\" API. We will create a PredictionClient from the Webservice object that can call into the docker image to get predictions. If you do not have the Webservice object, you can also create [PredictionClient](https://docs.microsoft.com/en-us/python/api/azureml-accel-models/azureml.accel.predictionclient?view=azure-ml-py) directly.\n",
"\n",
"**Note:** If you chose to use auth_enabled=True when creating your AksWebservice, see documentation [here](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#get-keys--) on how to retrieve your keys and use either key as an argument to PredictionClient(...,access_token=key).\n",
"**WARNING:** If you are running on Azure Notebooks free compute, you will not be able to make outgoing calls to your service. Try locating your client on a different machine to consume it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Using the grpc client in AzureML Accelerated Models SDK\n",
"from azureml.accel import client_from_service\n",
"\n",
"# Initialize AzureML Accelerated Models client\n",
"client = client_from_service(aks_service)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You can adapt the client [code](https://github.com/Azure/aml-real-time-ai/blob/master/pythonlib/amlrealtimeai/client.py) to meet your needs. There is also an example C# [client](https://github.com/Azure/aml-real-time-ai/blob/master/sample-clients/csharp).\n",
"\n",
"The service provides an API that is compatible with TensorFlow Serving. There are instructions to download a sample client [here](https://www.tensorflow.org/serving/setup)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 7.b. Serve the model\n",
"To understand the results we need a mapping to the human readable imagenet classes"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"classes_entries = requests.get(\"https://raw.githubusercontent.com/Lasagne/Recipes/master/examples/resnet50/imagenet_classes.txt\").text.splitlines()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Score image with input and output tensor names\n",
"results = client.score_file(path=\"./snowleopardgaze.jpg\", \n",
" input_name=input_tensors, \n",
" outputs=output_tensors)\n",
"\n",
"# map results [class_id] => [confidence]\n",
"results = enumerate(results)\n",
"# sort results by confidence\n",
"sorted_results = sorted(results, key=lambda x: x[1], reverse=True)\n",
"# print top 5 results\n",
"for top in sorted_results[:5]:\n",
" print(classes_entries[top[0]], 'confidence:', top[1])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"clean-up\"></a>\n",
"## 8. Clean-up\n",
"Run the cell below to delete your webservice, image, and model (must be done in that order). In the [next notebook](./accelerated-models-training.ipynb) you will learn how to train a classfier on a new dataset using transfer learning and finetune the weights."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.delete()\n",
"aks_target.delete()\n",
"image.delete()\n",
"registered_model.delete()\n",
"converted_model.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "coverste"
},
{
"name": "paledger"
},
{
"name": "aibhalla"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,5 +0,0 @@
name: accelerated-models-quickstart
dependencies:
- pip:
- azureml-sdk
- azureml-accel-models[cpu]

View File

@@ -1,870 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/deployment/accelerated-models/accelerated-models-training.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Training with the Azure Machine Learning Accelerated Models Service"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This notebook will introduce how to apply common machine learning techniques, like transfer learning, custom weights, and unquantized vs. quantized models, when working with our Azure Machine Learning Accelerated Models Service (Azure ML Accel Models).\n",
"\n",
"We will use Tensorflow for the preprocessing steps, ResNet50 for the featurizer, and the Keras API (built on Tensorflow backend) to build the classifier layers instead of the default ImageNet classifier used in Quickstart. Then we will train the model, evaluate it, and deploy it to run on an FPGA.\n",
"\n",
"#### Transfer Learning and Custom weights\n",
"We will walk you through two ways to build and train a ResNet50 model on the Kaggle Cats and Dogs dataset: transfer learning only and then transfer learning with custom weights.\n",
"\n",
"In using transfer learning, our goal is to re-purpose the ResNet50 model already trained on the [ImageNet image dataset](http://www.image-net.org/) as a basis for our training of the Kaggle Cats and Dogs dataset. The ResNet50 featurizer will be imported as frozen, so only the Keras classifier will be trained.\n",
"\n",
"With the addition of custom weights, we will build the model so that the ResNet50 featurizer weights as not frozen. This will let us retrain starting with custom weights trained with ImageNet on ResNet50 and then use the Kaggle Cats and Dogs dataset to retrain and fine-tune the quantized version of the model.\n",
"\n",
"#### Unquantized vs. Quantized models\n",
"The unquantized version of our models (ie. Resnet50, Resnet152, Densenet121, Vgg16, SsdVgg) uses native float precision (32-bit floats), which will be faster at training. We will use this for our first run through, then fine-tune the weights with the quantized version. The quantized version of our models (i.e. QuantizedResnet50, QuantizedResnet152, QuantizedDensenet121, QuantizedVgg16, QuantizedSsdVgg) will have the same node names as the unquantized version, but use quantized operations and will match the performance of the model when running on an FPGA.\n",
"\n",
"#### Contents\n",
"1. [Setup Environment](#setup)\n",
"* [Prepare Data](#prepare-data)\n",
"* [Construct Model](#construct-model)\n",
" * Preprocessor\n",
" * Classifier\n",
" * Model construction\n",
"* [Train Model](#train-model)\n",
"* [Test Model](#test-model)\n",
"* [Execution](#execution)\n",
" * [Transfer Learning](#transfer-learning)\n",
" * [Transfer Learning with Custom Weights](#custom-weights)\n",
"* [Create Image](#create-image)\n",
"* [Deploy Image](#deploy-image)\n",
"* [Test the service](#test-service)\n",
"* [Clean-up](#cleanup)\n",
"* [Appendix](#appendix)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"setup\"></a>\n",
"## 1. Setup Environment\n",
"#### 1.a. Please set up your environment as described in the [Quickstart](./accelerated-models-quickstart.ipynb), meaning:\n",
"* Make sure your Workspace config.json exists and has the correct info\n",
"* Install Tensorflow\n",
"\n",
"#### 1.b. Download dataset into ~/catsanddogs \n",
"The dataset we will be using for training can be downloaded [here](https://www.microsoft.com/en-us/download/details.aspx?id=54765). Download the zip and extract to a directory named 'catsanddogs' under your user directory (\"~/catsanddogs\"). \n",
"\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 1.c. Import packages"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import sys\n",
"import tensorflow as tf\n",
"import numpy as np\n",
"from keras import backend as K\n",
"import sklearn\n",
"import tqdm"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### 1.d. Create directories for later use\n",
"After you train your model in float32, you'll write the weights to a place on disk. We also need a location to store the models that get downloaded."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"custom_weights_dir = os.path.expanduser(\"~/custom-weights\")\n",
"saved_model_dir = os.path.expanduser(\"~/models\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"prepare-data\"></a>\n",
"## 2. Prepare Data\n",
"Load the files we are going to use for training and testing. By default this notebook uses only a very small subset of the Cats and Dogs dataset. That makes it run relatively quickly."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import glob\n",
"import imghdr\n",
"datadir = os.path.expanduser(\"~/catsanddogs\")\n",
"\n",
"cat_files = glob.glob(os.path.join(datadir, 'PetImages', 'Cat', '*.jpg'))\n",
"dog_files = glob.glob(os.path.join(datadir, 'PetImages', 'Dog', '*.jpg'))\n",
"\n",
"# Limit the data set to make the notebook execute quickly.\n",
"cat_files = cat_files[:64]\n",
"dog_files = dog_files[:64]\n",
"\n",
"# The data set has a few images that are not jpeg. Remove them.\n",
"cat_files = [f for f in cat_files if imghdr.what(f) == 'jpeg']\n",
"dog_files = [f for f in dog_files if imghdr.what(f) == 'jpeg']\n",
"\n",
"if(not len(cat_files) or not len(dog_files)):\n",
" print(\"Please download the Kaggle Cats and Dogs dataset form https://www.microsoft.com/en-us/download/details.aspx?id=54765 and extract the zip to \" + datadir) \n",
" raise ValueError(\"Data not found\")\n",
"else:\n",
" print(cat_files[0])\n",
" print(dog_files[0])"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Construct a numpy array as labels\n",
"image_paths = cat_files + dog_files\n",
"total_files = len(cat_files) + len(dog_files)\n",
"labels = np.zeros(total_files)\n",
"labels[len(cat_files):] = 1"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Split images data as training data and test data\n",
"from sklearn.model_selection import train_test_split\n",
"onehot_labels = np.array([[0,1] if i else [1,0] for i in labels])\n",
"img_train, img_test, label_train, label_test = train_test_split(image_paths, onehot_labels, random_state=42, shuffle=True)\n",
"\n",
"print(len(img_train), len(img_test), label_train.shape, label_test.shape)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"construct-model\"></a>\n",
"## 3. Construct Model\n",
"We will define the functions to handle creating the preprocessor and the classifier first, and then run them together to actually construct the model with the Resnet50 featurizer in a single Tensorflow session in a separate cell.\n",
"\n",
"We use ResNet50 for the featurizer and build our own classifier using Keras layers. We train the featurizer and the classifier as one model. We will provide parameters to determine whether we are using the quantized version and whether we are using custom weights in training or not."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.a. Define image preprocessing step\n",
"Same as in the Quickstart, before passing image dataset to the ResNet50 featurizer, we need to preprocess the input file to get it into the form expected by ResNet50. ResNet50 expects float tensors representing the images in BGR, channel last order. We've provided a default implementation of the preprocessing that you can use.\n",
"\n",
"**Note:** Expect to see TF deprecation warnings until we port our SDK over to use Tensorflow 2.0."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.accel.models.utils as utils\n",
"\n",
"def preprocess_images(scaling_factor=1.0):\n",
" # Convert images to 3D tensors [width,height,channel] - channels are in BGR order.\n",
" in_images = tf.placeholder(tf.string)\n",
" image_tensors = utils.preprocess_array(in_images, 'RGB', scaling_factor)\n",
" return in_images, image_tensors"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.b. Define classifier\n",
"We use Keras layer APIs to construct the classifier. Because we're using the tensorflow backend, we can train this classifier in one session with our Resnet50 model."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def construct_classifier(in_tensor, seed=None):\n",
" from keras.layers import Dropout, Dense, Flatten\n",
" from keras.initializers import glorot_uniform\n",
" K.set_session(tf.get_default_session())\n",
"\n",
" FC_SIZE = 1024\n",
" NUM_CLASSES = 2\n",
"\n",
" x = Dropout(0.2, input_shape=(1, 1, int(in_tensor.shape[3]),), seed=seed)(in_tensor)\n",
" x = Dense(FC_SIZE, activation='relu', input_dim=(1, 1, int(in_tensor.shape[3]),),\n",
" kernel_initializer=glorot_uniform(seed=seed), bias_initializer='zeros')(x)\n",
" x = Flatten()(x)\n",
" preds = Dense(NUM_CLASSES, activation='softmax', input_dim=FC_SIZE, name='classifier_output',\n",
" kernel_initializer=glorot_uniform(seed=seed), bias_initializer='zeros')(x)\n",
" return preds"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.c. Define model construction\n",
"Now that the preprocessor and classifier for the model are defined, we can define how we want to construct the model. \n",
"\n",
"Constructing the model has these steps: \n",
"1. Get preprocessing steps\n",
"* Get featurizer using the Azure ML Accel Models SDK:\n",
" * import the graph definition\n",
" * restore the weights of the model into a Tensorflow session\n",
"* Get classifier\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def construct_model(quantized, starting_weights_directory = None):\n",
" from azureml.accel.models import Resnet50, QuantizedResnet50\n",
" \n",
" # Convert images to 3D tensors [width,height,channel]\n",
" in_images, image_tensors = preprocess_images(1.0)\n",
"\n",
" # Construct featurizer using quantized or unquantized ResNet50 model\n",
" if not quantized:\n",
" featurizer = Resnet50(saved_model_dir)\n",
" else:\n",
" featurizer = QuantizedResnet50(saved_model_dir, custom_weights_directory = starting_weights_directory)\n",
"\n",
" features = featurizer.import_graph_def(input_tensor=image_tensors)\n",
" \n",
" # Construct classifier\n",
" preds = construct_classifier(features)\n",
" \n",
" # Initialize weights\n",
" sess = tf.get_default_session()\n",
" tf.global_variables_initializer().run()\n",
"\n",
" featurizer.restore_weights(sess)\n",
"\n",
" return in_images, image_tensors, features, preds, featurizer"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"train-model\"></a>\n",
"## 4. Train Model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def read_files(files):\n",
" \"\"\" Read files to array\"\"\"\n",
" contents = []\n",
" for path in files:\n",
" with open(path, 'rb') as f:\n",
" contents.append(f.read())\n",
" return contents"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def train_model(preds, in_images, img_train, label_train, is_retrain = False, train_epoch = 10, learning_rate=None):\n",
" \"\"\" training model \"\"\"\n",
" from keras.objectives import binary_crossentropy\n",
" from tqdm import tqdm\n",
" \n",
" learning_rate = learning_rate if learning_rate else 0.001 if is_retrain else 0.01\n",
" \n",
" # Specify the loss function\n",
" in_labels = tf.placeholder(tf.float32, shape=(None, 2)) \n",
" cross_entropy = tf.reduce_mean(binary_crossentropy(in_labels, preds))\n",
" optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cross_entropy)\n",
"\n",
" def chunks(a, b, n):\n",
" \"\"\"Yield successive n-sized chunks from a and b.\"\"\"\n",
" if (len(a) != len(b)):\n",
" print(\"a and b are not equal in chunks(a,b,n)\")\n",
" raise ValueError(\"Parameter error\")\n",
"\n",
" for i in range(0, len(a), n):\n",
" yield a[i:i + n], b[i:i + n]\n",
"\n",
" chunk_size = 16\n",
" chunk_num = len(label_train) / chunk_size\n",
"\n",
" sess = tf.get_default_session()\n",
" for epoch in range(train_epoch):\n",
" avg_loss = 0\n",
" for img_chunk, label_chunk in tqdm(chunks(img_train, label_train, chunk_size)):\n",
" contents = read_files(img_chunk)\n",
" _, loss = sess.run([optimizer, cross_entropy],\n",
" feed_dict={in_images: contents,\n",
" in_labels: label_chunk,\n",
" K.learning_phase(): 1})\n",
" avg_loss += loss / chunk_num\n",
" print(\"Epoch:\", (epoch + 1), \"loss = \", \"{:.3f}\".format(avg_loss))\n",
" \n",
" # Reach desired performance\n",
" if (avg_loss < 0.001):\n",
" break"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"test-model\"></a>\n",
"## 5. Test Model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def test_model(preds, in_images, img_test, label_test):\n",
" \"\"\"Test the model\"\"\"\n",
" from keras.metrics import categorical_accuracy\n",
"\n",
" in_labels = tf.placeholder(tf.float32, shape=(None, 2))\n",
" accuracy = tf.reduce_mean(categorical_accuracy(in_labels, preds))\n",
" contents = read_files(img_test)\n",
"\n",
" accuracy = accuracy.eval(feed_dict={in_images: contents,\n",
" in_labels: label_test,\n",
" K.learning_phase(): 0})\n",
" return accuracy"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"execution\"></a>\n",
"## 6. Execute steps\n",
"You can run through the Transfer Learning section, then skip to Create AccelContainerImage. By default, because the custom weights section takes much longer for training twice, it is not saved as executable cells. You can copy the code or change cell type to 'Code'.\n",
"\n",
"<a id=\"transfer-learning\"></a>\n",
"### 6.a. Training using Transfer Learning"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"# Launch the training\n",
"tf.reset_default_graph()\n",
"sess = tf.Session(graph=tf.get_default_graph())\n",
"\n",
"with sess.as_default():\n",
" in_images, image_tensors, features, preds, featurizer = construct_model(quantized=True)\n",
" train_model(preds, in_images, img_train, label_train, is_retrain=False, train_epoch=10, learning_rate=0.01) \n",
" accuracy = test_model(preds, in_images, img_test, label_test) \n",
" print(\"Accuracy:\", accuracy)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Save Model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model_name = 'resnet50-catsanddogs-tl'\n",
"model_save_path = os.path.join(saved_model_dir, model_name)\n",
"\n",
"tf.saved_model.simple_save(sess, model_save_path,\n",
" inputs={'images': in_images},\n",
" outputs={'output_alias': preds})\n",
"\n",
"input_tensors = in_images.name\n",
"output_tensors = preds.name\n",
"\n",
"print(input_tensors)\n",
"print(output_tensors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"custom-weights\"></a>\n",
"### 6.b. Traning using Custom Weights\n",
"\n",
"Because the quantized graph defintion and the float32 graph defintion share the same node names in the graph definitions, we can initally train the weights in float32, and then reload them with the quantized operations (which take longer) to fine-tune the model.\n",
"\n",
"First we train the model with custom weights but without quantization. Training is done with native float precision (32-bit floats). We load the training data set and batch the training with 10 epochs. When the performance reaches desired level or starts decredation, we stop the training iteration and save the weights as tensorflow checkpoint files. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Launch the training\n",
"```\n",
"tf.reset_default_graph()\n",
"sess = tf.Session(graph=tf.get_default_graph())\n",
"\n",
"with sess.as_default():\n",
" in_images, image_tensors, features, preds, featurizer = construct_model(quantized=False)\n",
" train_model(preds, in_images, img_train, label_train, is_retrain=False, train_epoch=10) \n",
" accuracy = test_model(preds, in_images, img_test, label_test) \n",
" print(\"Accuracy:\", accuracy)\n",
" featurizer.save_weights(custom_weights_dir + \"/rn50\", tf.get_default_session())\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Test Model\n",
"After training, we evaluate the trained model's accuracy on test dataset with quantization. So that we know the model's performance if it is deployed on the FPGA."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```\n",
"tf.reset_default_graph()\n",
"sess = tf.Session(graph=tf.get_default_graph())\n",
"\n",
"with sess.as_default():\n",
" print(\"Testing trained model with quantization\")\n",
" in_images, image_tensors, features, preds, quantized_featurizer = construct_model(quantized=True, starting_weights_directory=custom_weights_dir)\n",
" accuracy = test_model(preds, in_images, img_test, label_test) \n",
" print(\"Accuracy:\", accuracy)\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Fine-Tune Model\n",
"Sometimes, the model's accuracy can drop significantly after quantization. In those cases, we need to retrain the model enabled with quantization to get better model accuracy."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```\n",
"if (accuracy < 0.93):\n",
" with sess.as_default():\n",
" print(\"Fine-tuning model with quantization\")\n",
" train_model(preds, in_images, img_train, label_train, is_retrain=True, train_epoch=10)\n",
" accuracy = test_model(preds, in_images, img_test, label_test) \n",
" print(\"Accuracy:\", accuracy)\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Save Model"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```\n",
"model_name = 'resnet50-catsanddogs-cw'\n",
"model_save_path = os.path.join(saved_model_dir, model_name)\n",
"\n",
"tf.saved_model.simple_save(sess, model_save_path,\n",
" inputs={'images': in_images},\n",
" outputs={'output_alias': preds})\n",
"\n",
"input_tensors = in_images.name\n",
"output_tensors = preds.name\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"create-image\"></a>\n",
"## 7. Create AccelContainerImage\n",
"\n",
"Below we will execute all the same steps as in the [Quickstart](./accelerated-models-quickstart.ipynb#create-image) to package the model we have saved locally into an accelerated Docker image saved in our workspace. To complete all the steps, it may take a few minutes. For more details on each step, check out the [Quickstart section on model registration](./accelerated-models-quickstart.ipynb#register-model)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"from azureml.core.model import Model\n",
"from azureml.core.image import Image\n",
"from azureml.accel import AccelOnnxConverter\n",
"from azureml.accel import AccelContainerImage\n",
"\n",
"# Retrieve workspace\n",
"ws = Workspace.from_config()\n",
"print(\"Successfully retrieved workspace:\", ws.name, ws.resource_group, ws.location, ws.subscription_id, '\\n')\n",
"\n",
"# Register model\n",
"registered_model = Model.register(workspace = ws,\n",
" model_path = model_save_path,\n",
" model_name = model_name)\n",
"print(\"Successfully registered: \", registered_model.name, registered_model.description, registered_model.version, '\\n', sep = '\\t')\n",
"\n",
"# Convert model\n",
"convert_request = AccelOnnxConverter.convert_tf_model(ws, registered_model, input_tensors, output_tensors)\n",
"if convert_request.wait_for_completion(show_output = False):\n",
" # If the above call succeeded, get the converted model\n",
" converted_model = convert_request.result\n",
" print(\"\\nSuccessfully converted: \", converted_model.name, converted_model.url, converted_model.version, \n",
" converted_model.id, converted_model.created_time, '\\n')\n",
"else:\n",
" print(\"Model conversion failed. Showing output.\")\n",
" convert_request.wait_for_completion(show_output = True)\n",
"\n",
"# Package into AccelContainerImage\n",
"image_config = AccelContainerImage.image_configuration()\n",
"# Image name must be lowercase\n",
"image_name = \"{}-image\".format(model_name)\n",
"image = Image.create(name = image_name,\n",
" models = [converted_model],\n",
" image_config = image_config, \n",
" workspace = ws)\n",
"image.wait_for_creation()\n",
"print(\"Created AccelContainerImage: {} {} {}\\n\".format(image.name, image.creation_state, image.image_location))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"deploy-image\"></a>\n",
"## 8. Deploy image\n",
"Once you have an Azure ML Accelerated Image in your Workspace, you can deploy it to two destinations, to a Databox Edge machine or to an AKS cluster. \n",
"\n",
"### 8.a. Deploy to Databox Edge Machine using IoT Hub\n",
"See the sample [here](https://github.com/Azure-Samples/aml-real-time-ai/) for using the Azure IoT CLI extension for deploying your Docker image to your Databox Edge Machine.\n",
"\n",
"### 8.b. Deploy to AKS Cluster"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Create AKS ComputeTarget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"\n",
"# Uses the specific FPGA enabled VM (sku: Standard_PB6s)\n",
"# Standard_PB6s are available in: eastus, westus2, westeurope, southeastasia\n",
"prov_config = AksCompute.provisioning_configuration(vm_size = \"Standard_PB6s\",\n",
" agent_count = 1,\n",
" location = \"eastus\")\n",
"\n",
"aks_name = 'aks-pb6-tl'\n",
"# Create the cluster\n",
"aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = prov_config)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Provisioning an AKS cluster might take awhile (15 or so minutes), and we want to wait until it's successfully provisioned before we can deploy a service to it. If you interrupt this cell, provisioning of the cluster will continue. You can re-run it or check the status in your Workspace under Compute."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"aks_target.wait_for_completion(show_output = True)\n",
"print(aks_target.provisioning_state)\n",
"print(aks_target.provisioning_errors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Deploy AccelContainerImage to AKS ComputeTarget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"sample-akswebservice-deploy-from-image"
]
},
"outputs": [],
"source": [
"%%time\n",
"from azureml.core.webservice import Webservice, AksWebservice\n",
"\n",
"# Set the web service configuration (for creating a test service, we don't want autoscale enabled)\n",
"# Authentication is enabled by default, but for testing we specify False\n",
"aks_config = AksWebservice.deploy_configuration(autoscale_enabled=False,\n",
" num_replicas=1,\n",
" auth_enabled = False)\n",
"\n",
"aks_service_name ='my-aks-service-2'\n",
"\n",
"aks_service = Webservice.deploy_from_image(workspace = ws,\n",
" name = aks_service_name,\n",
" image = image,\n",
" deployment_config = aks_config,\n",
" deployment_target = aks_target)\n",
"aks_service.wait_for_deployment(show_output = True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"test-service\"></a>\n",
"## 9. Test the service\n",
"\n",
"<a id=\"create-client\"></a>\n",
"### 9.a. Create Client\n",
"The image supports gRPC and the TensorFlow Serving \"predict\" API. We will create a PredictionClient from the Webservice object that can call into the docker image to get predictions. If you do not have the Webservice object, you can also create [PredictionClient](https://docs.microsoft.com/en-us/python/api/azureml-accel-models/azureml.accel.predictionclient?view=azure-ml-py) directly.\n",
"\n",
"**Note:** If you chose to use auth_enabled=True when creating your AksWebservice.deploy_configuration(), see documentation [here](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.webservice(class)?view=azure-ml-py#get-keys--) on how to retrieve your keys and use either key as an argument to PredictionClient(...,access_token=key).\n",
"**WARNING:** If you are running on Azure Notebooks free compute, you will not be able to make outgoing calls to your service. Try locating your client on a different machine to consume it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Using the grpc client in AzureML Accelerated Models SDK\n",
"from azureml.accel import client_from_service\n",
"\n",
"# Initialize AzureML Accelerated Models client\n",
"client = client_from_service(aks_service)"
]
},
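{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you do not have the Webservice object in your current session (for example, when calling the service from a different machine), you can construct the PredictionClient directly instead of using `client_from_service`. The sketch below is illustrative rather than exact: it assumes the constructor accepts the scoring address, port, an SSL flag, and an optional access token (the access_token argument is mentioned in the note above), and the address and key values shown are placeholders to replace with your own deployment's details.\n",
"```\n",
"from azureml.accel import PredictionClient\n",
"\n",
"# Placeholder values - substitute your own service address and (if auth is enabled) key\n",
"scoring_address = \"<your-aks-service-address>\"\n",
"use_ssl = False\n",
"key = None  # set to one of the service keys if auth_enabled=True\n",
"\n",
"client = PredictionClient(address=scoring_address,\n",
"                          port=443 if use_ssl else 80,\n",
"                          use_ssl=use_ssl,\n",
"                          access_token=key)\n",
"```"
]
},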
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"serve-model\"></a>\n",
"### 9.b. Serve the model\n",
"Let's see how our service does on a few images. It may get a few wrong."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Specify an image to classify\n",
"print('CATS')\n",
"for image_file in cat_files[:8]:\n",
" results = client.score_file(path=image_file, \n",
" input_name=input_tensors, \n",
" outputs=output_tensors)\n",
" result = 'CORRECT ' if results[0] > results[1] else 'WRONG '\n",
" print(result + str(results))\n",
"print('DOGS')\n",
"for image_file in dog_files[:8]:\n",
" results = client.score_file(path=image_file, \n",
" input_name=input_tensors, \n",
" outputs=output_tensors)\n",
" result = 'CORRECT ' if results[1] > results[0] else 'WRONG '\n",
" print(result + str(results))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"cleanup\"></a>\n",
"## 10. Cleanup\n",
"It's important to clean up your resources, so that you won't incur unnecessary costs."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.delete()\n",
"aks_target.delete()\n",
"image.delete()\n",
"registered_model.delete()\n",
"converted_model.delete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"appendix\"></a>\n",
"## 11. Appendix"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"License for plot_confusion_matrix:\n",
"\n",
"New BSD License\n",
"\n",
"Copyright (c) 2007-2018 The scikit-learn developers.\n",
"All rights reserved.\n",
"\n",
"\n",
"Redistribution and use in source and binary forms, with or without\n",
"modification, are permitted provided that the following conditions are met:\n",
"\n",
" a. Redistributions of source code must retain the above copyright notice,\n",
" this list of conditions and the following disclaimer.\n",
" b. Redistributions in binary form must reproduce the above copyright\n",
" notice, this list of conditions and the following disclaimer in the\n",
" documentation and/or other materials provided with the distribution.\n",
" c. Neither the name of the Scikit-learn Developers nor the names of\n",
" its contributors may be used to endorse or promote products\n",
" derived from this software without specific prior written\n",
" permission. \n",
"\n",
"\n",
"THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\n",
"AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\n",
"IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE\n",
"ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE FOR\n",
"ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\n",
"DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\n",
"SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\n",
"CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT\n",
"LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY\n",
"OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH\n",
"DAMAGE.\n"
]
}
],
"metadata": {
"authors": [
{
"name": "coverste"
},
{
"name": "paledger"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.6"
}
},
"nbformat": 4,
"nbformat_minor": 2
}


@@ -1,8 +0,0 @@
name: accelerated-models-training
dependencies:
- pip:
- azureml-sdk
- azureml-accel-models[cpu]
- keras
- tqdm
- sklearn
