Compare commits

...

13 Commits

Author SHA1 Message Date
amlrelsa-ms   6c629f1eda  update samples from Release-57 as a part of SDK release  2020-07-06 22:05:24 +00:00
Harneet Virk  053efde8c9  Merge pull request #1022 from Azure/release_update/Release-56 (update samples from Release-56 as a part of SDK release)  2020-06-22 11:12:31 -07:00
amlrelsa-ms   5189691f06  update samples from Release-56 as a part of SDK release  2020-06-22 18:11:40 +00:00
Harneet Virk  fb900916e3  Update README.md  2020-06-11 13:26:04 -07:00
Harneet Virk  738347f3da  Merge pull request #996 from Azure/release_update/Release-55 (update samples from Release-55 as a part of SDK release)  2020-06-08 15:31:35 -07:00
amlrelsa-ms   34a67c1f8b  update samples from Release-55 as a part of SDK release  2020-06-08 22:28:25 +00:00
Harneet Virk  34898828be  Merge pull request #992 from Azure/release_update/Release-54 (update samples from Release-54 as a part of SDK release)  2020-06-02 14:42:02 -07:00
vizhur        a7c3a0fdb8  update samples from Release-54 as a part of SDK release  2020-06-02 21:34:10 +00:00
Harneet Virk  6d11cdfa0a  Merge pull request #984 from Azure/release_update/Release-53 (update samples from Release-53 as a part of SDK release)  2020-05-26 19:59:58 -07:00
vizhur        11e8ed2bab  update samples from Release-53 as a part of SDK release  2020-05-27 02:45:07 +00:00
Harneet Virk  12c06a4168  Merge pull request #978 from ahcan76/patch-1 (Fix image paths in tutorial-1st-experiment-sdk-train.ipynb)  2020-05-18 12:58:21 -07:00
ahcan76       1f75dc9725  Update tutorial-1st-experiment-sdk-train.ipynb (Fix the image path)  2020-05-18 22:40:54 +03:00
Harneet Virk  1a1a42d525  Merge pull request #977 from Azure/release_update/Release-52 (update samples from Release-52 as a part of SDK release)  2020-05-18 12:22:48 -07:00
139 changed files with 5431 additions and 2160 deletions

View File

@@ -40,6 +40,7 @@ The [How to use Azure ML](./how-to-use-azureml) folder contains specific example
- [Deployment](./how-to-use-azureml/deployment) - Examples showing how to deploy and manage machine learning models and solutions
- [Azure Databricks](./how-to-use-azureml/azure-databricks) - Examples showing how to use Azure ML with Azure Databricks
- [Monitor Models](./how-to-use-azureml/monitor-models) - Examples showing how to enable model monitoring services such as DataDrift
- [Reinforcement Learning](./how-to-use-azureml/reinforcement-learning) - Examples showing how to train reinforcement learning agents
---
## Documentation

View File

@@ -103,7 +103,7 @@
"source": [
"import azureml.core\n",
"\n",
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
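A note on the hunk above: many of the notebook diffs below make the same version bump, from 1.5.0 to 1.9.0. A minimal sketch, assuming only that azureml-sdk is installed, of turning the notebook's informational print into an explicit warning (the expected-version constant is illustrative):

import azureml.core

EXPECTED_VERSION = "1.9.0"  # version these notebooks were authored against (illustrative)
if azureml.core.VERSION != EXPECTED_VERSION:
    print("Warning: notebook authored against SDK", EXPECTED_VERSION,
          "but version", azureml.core.VERSION, "is installed")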

View File

@@ -0,0 +1,549 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/contrib/fairness/fairlearn-azureml-mitigation.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Unfairness Mitigation with Fairlearn and Azure Machine Learning\n",
"**This notebook shows how to upload results from Fairlearn's GridSearch mitigation algorithm into a dashboard in Azure Machine Learning Studio**\n",
"\n",
"## Table of Contents\n",
"\n",
"1. [Introduction](#Introduction)\n",
"1. [Loading the Data](#LoadingData)\n",
"1. [Training an Unmitigated Model](#UnmitigatedModel)\n",
"1. [Mitigation with GridSearch](#Mitigation)\n",
"1. [Uploading a Fairness Dashboard to Azure](#AzureUpload)\n",
" 1. Registering models\n",
" 1. Computing Fairness Metrics\n",
" 1. Uploading to Azure\n",
"1. [Conclusion](#Conclusion)\n",
"\n",
"<a id=\"Introduction\"></a>\n",
"## Introduction\n",
"This notebook shows how to use [Fairlearn (an open source fairness assessment and unfairness mitigation package)](http://fairlearn.github.io) and Azure Machine Learning Studio for a binary classification problem. This example uses the well-known adult census dataset. For the purposes of this notebook, we shall treat this as a loan decision problem. We will pretend that the label indicates whether or not each individual repaid a loan in the past. We will use the data to train a predictor to predict whether previously unseen individuals will repay a loan or not. The assumption is that the model predictions are used to decide whether an individual should be offered a loan. Its purpose is purely illustrative of a workflow including a fairness dashboard - in particular, we do **not** include a full discussion of the detailed issues which arise when considering fairness in machine learning. For such discussions, please [refer to the Fairlearn website](http://fairlearn.github.io/).\n",
"\n",
"We will apply the [grid search algorithm](https://fairlearn.github.io/api_reference/fairlearn.reductions.html#fairlearn.reductions.GridSearch) from the Fairlearn package using a specific notion of fairness called Demographic Parity. This produces a set of models, and we will view these in a dashboard both locally and in the Azure Machine Learning Studio.\n",
"\n",
"### Setup\n",
"\n",
"To use this notebook, an Azure Machine Learning workspace is required.\n",
"Please see the [configuration notebook](../../configuration.ipynb) for information about creating one, if required.\n",
"This notebook also requires the following packages:\n",
"* `azureml-contrib-fairness`\n",
"* `fairlearn==0.4.6`\n",
"* `joblib`\n",
"* `shap`\n",
"\n",
"\n",
"<a id=\"LoadingData\"></a>\n",
"## Loading the Data\n",
"We use the well-known `adult` census dataset, which we load using `shap` (for convenience). We start with a fairly unremarkable set of imports:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from fairlearn.reductions import GridSearch, DemographicParity, ErrorRate\n",
"from fairlearn.widget import FairlearnDashboard\n",
"from sklearn import svm\n",
"from sklearn.preprocessing import LabelEncoder, StandardScaler\n",
"from sklearn.linear_model import LogisticRegression\n",
"import pandas as pd\n",
"import shap"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can now load and inspect the data from the `shap` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_raw, Y = shap.datasets.adult()\n",
"X_raw[\"Race\"].value_counts().to_dict()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We are going to treat the sex of each individual as a protected attribute (where 0 indicates female and 1 indicates male), and in this particular case we are going separate this attribute out and drop it from the main data (this is not always the best option - see the [Fairlearn website](http://fairlearn.github.io/) for further discussion). We also separate out the Race column, but we will not perform any mitigation based on it. Finally, we perform some standard data preprocessing steps to convert the data into a format suitable for the ML algorithms"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"A = X_raw[['Sex','Race']]\n",
"X = X_raw.drop(labels=['Sex', 'Race'],axis = 1)\n",
"X = pd.get_dummies(X)\n",
"\n",
"\n",
"le = LabelEncoder()\n",
"Y = le.fit_transform(Y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"With our data prepared, we can make the conventional split in to 'test' and 'train' subsets:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.model_selection import train_test_split\n",
"X_train, X_test, Y_train, Y_test, A_train, A_test = train_test_split(X_raw, \n",
" Y, \n",
" A,\n",
" test_size = 0.2,\n",
" random_state=0,\n",
" stratify=Y)\n",
"\n",
"# Work around indexing issue\n",
"X_train = X_train.reset_index(drop=True)\n",
"A_train = A_train.reset_index(drop=True)\n",
"X_test = X_test.reset_index(drop=True)\n",
"A_test = A_test.reset_index(drop=True)\n",
"\n",
"# Improve labels\n",
"A_test.Sex.loc[(A_test['Sex'] == 0)] = 'female'\n",
"A_test.Sex.loc[(A_test['Sex'] == 1)] = 'male'\n",
"\n",
"\n",
"A_test.Race.loc[(A_test['Race'] == 0)] = 'Amer-Indian-Eskimo'\n",
"A_test.Race.loc[(A_test['Race'] == 1)] = 'Asian-Pac-Islander'\n",
"A_test.Race.loc[(A_test['Race'] == 2)] = 'Black'\n",
"A_test.Race.loc[(A_test['Race'] == 3)] = 'Other'\n",
"A_test.Race.loc[(A_test['Race'] == 4)] = 'White'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"UnmitigatedModel\"></a>\n",
"## Training an Unmitigated Model\n",
"\n",
"So we have a point of comparison, we first train a model (specifically, logistic regression from scikit-learn) on the raw data, without applying any mitigation algorithm:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"unmitigated_predictor = LogisticRegression(solver='liblinear', fit_intercept=True)\n",
"\n",
"unmitigated_predictor.fit(X_train, Y_train)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can view this model in the fairness dashboard, and see the disparities which appear:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"FairlearnDashboard(sensitive_features=A_test, sensitive_feature_names=['Sex', 'Race'],\n",
" y_true=Y_test,\n",
" y_pred={\"unmitigated\": unmitigated_predictor.predict(X_test)})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Looking at the disparity in accuracy when we select 'Sex' as the sensitive feature, we see that males have an error rate about three times greater than the females. More interesting is the disparity in opportunitiy - males are offered loans at three times the rate of females.\n",
"\n",
"Despite the fact that we removed the feature from the training data, our predictor still discriminates based on sex. This demonstrates that simply ignoring a protected attribute when fitting a predictor rarely eliminates unfairness. There will generally be enough other features correlated with the removed attribute to lead to disparate impact."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"Mitigation\"></a>\n",
"## Mitigation with GridSearch\n",
"\n",
"The `GridSearch` class in `Fairlearn` implements a simplified version of the exponentiated gradient reduction of [Agarwal et al. 2018](https://arxiv.org/abs/1803.02453). The user supplies a standard ML estimator, which is treated as a blackbox - for this simple example, we shall use the logistic regression estimator from scikit-learn. `GridSearch` works by generating a sequence of relabellings and reweightings, and trains a predictor for each.\n",
"\n",
"For this example, we specify demographic parity (on the protected attribute of sex) as the fairness metric. Demographic parity requires that individuals are offered the opportunity (a loan in this example) independent of membership in the protected class (i.e., females and males should be offered loans at the same rate). *We are using this metric for the sake of simplicity* in this example; the appropriate fairness metric can only be selected after *careful examination of the broader context* in which the model is to be used."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sweep = GridSearch(LogisticRegression(solver='liblinear', fit_intercept=True),\n",
" constraints=DemographicParity(),\n",
" grid_size=71)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"With our estimator created, we can fit it to the data. After `fit()` completes, we extract the full set of predictors from the `GridSearch` object.\n",
"\n",
"The following cell trains a many copies of the underlying estimator, and may take a minute or two to run:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sweep.fit(X_train, Y_train,\n",
" sensitive_features=A_train.Sex)\n",
"\n",
"predictors = sweep._predictors"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We could load these predictors into the Fairness dashboard now. However, the plot would be somewhat confusing due to their number. In this case, we are going to remove the predictors which are dominated in the error-disparity space by others from the sweep (note that the disparity will only be calculated for the protected attribute; other potentially protected attributes will *not* be mitigated). In general, one might not want to do this, since there may be other considerations beyond the strict optimisation of error and disparity (of the given protected attribute)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"errors, disparities = [], []\n",
"for m in predictors:\n",
" classifier = lambda X: m.predict(X)\n",
" \n",
" error = ErrorRate()\n",
" error.load_data(X_train, pd.Series(Y_train), sensitive_features=A_train.Sex)\n",
" disparity = DemographicParity()\n",
" disparity.load_data(X_train, pd.Series(Y_train), sensitive_features=A_train.Sex)\n",
" \n",
" errors.append(error.gamma(classifier)[0])\n",
" disparities.append(disparity.gamma(classifier).max())\n",
" \n",
"all_results = pd.DataFrame( {\"predictor\": predictors, \"error\": errors, \"disparity\": disparities})\n",
"\n",
"dominant_models_dict = dict()\n",
"base_name_format = \"census_gs_model_{0}\"\n",
"row_id = 0\n",
"for row in all_results.itertuples():\n",
" model_name = base_name_format.format(row_id)\n",
" errors_for_lower_or_eq_disparity = all_results[\"error\"][all_results[\"disparity\"]<=row.disparity]\n",
" if row.error <= errors_for_lower_or_eq_disparity.min():\n",
" dominant_models_dict[model_name] = row.predictor\n",
" row_id = row_id + 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can construct predictions for the dominant models (we include the unmitigated predictor as well, for comparison):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"predictions_dominant = {\"census_unmitigated\": unmitigated_predictor.predict(X_test)}\n",
"models_dominant = {\"census_unmitigated\": unmitigated_predictor}\n",
"for name, predictor in dominant_models_dict.items():\n",
" value = predictor.predict(X_test)\n",
" predictions_dominant[name] = value\n",
" models_dominant[name] = predictor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"These predictions may then be viewed in the fairness dashboard. We include the race column from the dataset, as an alternative basis for assessing the models. However, since we have not based our mitigation on it, the variation in the models with respect to race can be large."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"FairlearnDashboard(sensitive_features=A_test, \n",
" sensitive_feature_names=['Sex', 'Race'],\n",
" y_true=Y_test.tolist(),\n",
" y_pred=predictions_dominant)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When using sex as the sensitive feature, we see a Pareto front forming - the set of predictors which represent optimal tradeoffs between accuracy and disparity in predictions. In the ideal case, we would have a predictor at (1,0) - perfectly accurate and without any unfairness under demographic parity (with respect to the protected attribute \"sex\"). The Pareto front represents the closest we can come to this ideal based on our data and choice of estimator. Note the range of the axes - the disparity axis covers more values than the accuracy, so we can reduce disparity substantially for a small loss in accuracy. Finally, we also see that the unmitigated model is towards the top right of the plot, with high accuracy, but worst disparity.\n",
"\n",
"By clicking on individual models on the plot, we can inspect their metrics for disparity and accuracy in greater detail. In a real example, we would then pick the model which represented the best trade-off between accuracy and disparity given the relevant business constraints."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"AzureUpload\"></a>\n",
"## Uploading a Fairness Dashboard to Azure\n",
"\n",
"Uploading a fairness dashboard to Azure is a two stage process. The `FairlearnDashboard` invoked in the previous section relies on the underlying Python kernel to compute metrics on demand. This is obviously not available when the fairness dashboard is rendered in AzureML Studio. By default, the dashboard in Azure Machine Learning Studio also requires the models to be registered. The required stages are therefore:\n",
"1. Register the dominant models\n",
"1. Precompute all the required metrics\n",
"1. Upload to Azure\n",
"\n",
"Before that, we need to connect to Azure Machine Learning Studio:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace, Experiment, Model\n",
"\n",
"ws = Workspace.from_config()\n",
"ws.get_details()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"RegisterModels\"></a>\n",
"### Registering Models\n",
"\n",
"The fairness dashboard is designed to integrate with registered models, so we need to do this for the models we want in the Studio portal. The assumption is that the names of the models specified in the dashboard dictionary correspond to the `id`s (i.e. `<name>:<version>` pairs) of registered models in the workspace. We register each of the models in the `models_dominant` dictionary into the workspace. For this, we have to save each model to a file, and then register that file:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"import os\n",
"\n",
"os.makedirs('models', exist_ok=True)\n",
"def register_model(name, model):\n",
" print(\"Registering \", name)\n",
" model_path = \"models/{0}.pkl\".format(name)\n",
" joblib.dump(value=model, filename=model_path)\n",
" registered_model = Model.register(model_path=model_path,\n",
" model_name=name,\n",
" workspace=ws)\n",
" print(\"Registered \", registered_model.id)\n",
" return registered_model.id\n",
"\n",
"model_name_id_mapping = dict()\n",
"for name, model in models_dominant.items():\n",
" m_id = register_model(name, model)\n",
" model_name_id_mapping[name] = m_id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, produce new predictions dictionaries, with the updated names:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"predictions_dominant_ids = dict()\n",
"for name, y_pred in predictions_dominant.items():\n",
" predictions_dominant_ids[model_name_id_mapping[name]] = y_pred"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"PrecomputeMetrics\"></a>\n",
"### Precomputing Metrics\n",
"\n",
"We create a _dashboard dictionary_ using Fairlearn's `metrics` package. The `_create_group_metric_set` method has arguments similar to the Dashboard constructor, except that the sensitive features are passed as a dictionary (to ensure that names are available), and we must specify the type of prediction. Note that we use the `predictions_dominant_ids` dictionary we just created:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sf = { 'sex': A_test.Sex, 'race': A_test.Race }\n",
"\n",
"from fairlearn.metrics._group_metric_set import _create_group_metric_set\n",
"\n",
"\n",
"dash_dict = _create_group_metric_set(y_true=Y_test,\n",
" predictions=predictions_dominant_ids,\n",
" sensitive_features=sf,\n",
" prediction_type='binary_classification')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"DashboardUpload\"></a>\n",
"### Uploading the Dashboard\n",
"\n",
"Now, we import our `contrib` package which contains the routine to perform the upload:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.fairness import upload_dashboard_dictionary, download_dashboard_by_upload_id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we can create an Experiment, then a Run, and upload our dashboard to it:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"exp = Experiment(ws, \"Test_Fairlearn_GridSearch_Census_Demo\")\n",
"print(exp)\n",
"\n",
"run = exp.start_logging()\n",
"try:\n",
" dashboard_title = \"Dominant Models from GridSearch\"\n",
" upload_id = upload_dashboard_dictionary(run,\n",
" dash_dict,\n",
" dashboard_name=dashboard_title)\n",
" print(\"\\nUploaded to id: {0}\\n\".format(upload_id))\n",
"\n",
" downloaded_dict = download_dashboard_by_upload_id(run, upload_id)\n",
"finally:\n",
" run.complete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The dashboard can be viewed in the Run Details page.\n",
"\n",
"Finally, we can verify that the dashboard dictionary which we downloaded matches our upload:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(dash_dict == downloaded_dict)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"Conclusion\"></a>\n",
"## Conclusion\n",
"\n",
"In this notebook we have demonstrated how to use the `GridSearch` algorithm from Fairlearn to generate a collection of models, and then present them in the fairness dashboard in Azure Machine Learning Studio. Please remember that this notebook has not attempted to discuss the many considerations which should be part of any approach to unfairness mitigation. The [Fairlearn website](http://fairlearn.github.io/) provides that discussion"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"authors": [
{
"name": "riedgar"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.10"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
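The dominant-model filtering cell in the notebook above is fairly dense, so here is a self-contained sketch of the same error-disparity dominance test on made-up numbers (the model names and values are illustrative, not taken from a real sweep):

import pandas as pd

# hypothetical error/disparity pairs from a sweep; lower is better on both axes
all_results = pd.DataFrame({
    "model": ["m0", "m1", "m2", "m3"],
    "error": [0.15, 0.18, 0.16, 0.20],
    "disparity": [0.30, 0.10, 0.35, 0.05],
})

dominant = []
for row in all_results.itertuples():
    # keep a model only if no model with equal-or-lower disparity has lower error
    best_rival_error = all_results["error"][all_results["disparity"] <= row.disparity].min()
    if row.error <= best_rival_error:
        dominant.append(row.model)

print(dominant)  # ['m0', 'm1', 'm3'] - m2 is dominated by m0 on both axes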

View File

@@ -0,0 +1,8 @@
name: fairlearn-azureml-mitigation
dependencies:
- pip:
- azureml-sdk
- azureml-contrib-fairness
- fairlearn==0.4.6
- joblib
- shap

View File

@@ -0,0 +1,494 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/contrib/fairness/upload-fairness-dashboard.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Upload a Fairness Dashboard to Azure Machine Learning Studio\n",
"**This notebook shows how to generate and upload a fairness assessment dashboard from Fairlearn to AzureML Studio**\n",
"\n",
"## Table of Contents\n",
"\n",
"1. [Introduction](#Introduction)\n",
"1. [Loading the Data](#LoadingData)\n",
"1. [Processing the Data](#ProcessingData)\n",
"1. [Training Models](#TrainingModels)\n",
"1. [Logging in to AzureML](#LoginAzureML)\n",
"1. [Registering the Models](#RegisterModels)\n",
"1. [Using the Fairlearn Dashboard](#LocalDashboard)\n",
"1. [Uploading a Fairness Dashboard to Azure](#AzureUpload)\n",
" 1. Computing Fairness Metrics\n",
" 1. Uploading to Azure\n",
"1. [Conclusion](#Conclusion)\n",
" \n",
"\n",
"<a id=\"Introduction\"></a>\n",
"## Introduction\n",
"\n",
"In this notebook, we walk through a simple example of using the `azureml-contrib-fairness` package to upload a collection of fairness statistics for a fairness dashboard. It is an example of integrating the [open source Fairlearn package](https://www.github.com/fairlearn/fairlearn) with Azure Machine Learning. This is not an example of fairness analysis or mitigation - this notebook simply shows how to get a fairness dashboard into the Azure Machine Learning portal. We will load the data and train a couple of simple models. We will then use Fairlearn to generate data for a Fairness dashboard, which we can upload to Azure Machine Learning portal and view there.\n",
"\n",
"### Setup\n",
"\n",
"To use this notebook, an Azure Machine Learning workspace is required.\n",
"Please see the [configuration notebook](../../configuration.ipynb) for information about creating one, if required.\n",
"This notebook also requires the following packages:\n",
"* `azureml-contrib-fairness`\n",
"* `fairlearn==0.4.6`\n",
"* `joblib`\n",
"* `shap`\n",
"\n",
"\n",
"\n",
"\n",
"<a id=\"LoadingData\"></a>\n",
"## Loading the Data\n",
"We use the well-known `adult` census dataset, which we load using `shap` (for convenience). We start with a fairly unremarkable set of imports:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn import svm\n",
"from sklearn.preprocessing import LabelEncoder, StandardScaler\n",
"from sklearn.linear_model import LogisticRegression\n",
"import pandas as pd\n",
"import shap"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we can load the data:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_raw, Y = shap.datasets.adult()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can take a look at some of the data. For example, the next cells shows the counts of the different races identified in the dataset:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(X_raw[\"Race\"].value_counts().to_dict())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"ProcessingData\"></a>\n",
"## Processing the Data\n",
"\n",
"With the data loaded, we process it for our needs. First, we extract the sensitive features of interest into `A` (conventionally used in the literature) and put the rest of the feature data into `X`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"A = X_raw[['Sex','Race']]\n",
"X = X_raw.drop(labels=['Sex', 'Race'],axis = 1)\n",
"X = pd.get_dummies(X)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Next, we apply a standard set of scalings:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sc = StandardScaler()\n",
"X_scaled = sc.fit_transform(X)\n",
"X_scaled = pd.DataFrame(X_scaled, columns=X.columns)\n",
"\n",
"le = LabelEncoder()\n",
"Y = le.fit_transform(Y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can then split our data into training and test sets, and also make the labels on our test portion of `A` human-readable:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.model_selection import train_test_split\n",
"X_train, X_test, Y_train, Y_test, A_train, A_test = train_test_split(X_scaled, \n",
" Y, \n",
" A,\n",
" test_size = 0.2,\n",
" random_state=0,\n",
" stratify=Y)\n",
"\n",
"# Work around indexing issue\n",
"X_train = X_train.reset_index(drop=True)\n",
"A_train = A_train.reset_index(drop=True)\n",
"X_test = X_test.reset_index(drop=True)\n",
"A_test = A_test.reset_index(drop=True)\n",
"\n",
"# Improve labels\n",
"A_test.Sex.loc[(A_test['Sex'] == 0)] = 'female'\n",
"A_test.Sex.loc[(A_test['Sex'] == 1)] = 'male'\n",
"\n",
"\n",
"A_test.Race.loc[(A_test['Race'] == 0)] = 'Amer-Indian-Eskimo'\n",
"A_test.Race.loc[(A_test['Race'] == 1)] = 'Asian-Pac-Islander'\n",
"A_test.Race.loc[(A_test['Race'] == 2)] = 'Black'\n",
"A_test.Race.loc[(A_test['Race'] == 3)] = 'Other'\n",
"A_test.Race.loc[(A_test['Race'] == 4)] = 'White'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"TrainingModels\"></a>\n",
"## Training Models\n",
"\n",
"We now train a couple of different models on our data. The `adult` census dataset is a classification problem - the goal is to predict whether a particular individual exceeds an income threshold. For the purpose of generating a dashboard to upload, it is sufficient to train two basic classifiers. First, a logistic regression classifier:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"lr_predictor = LogisticRegression(solver='liblinear', fit_intercept=True)\n",
"\n",
"lr_predictor.fit(X_train, Y_train)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And for comparison, a support vector classifier:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"svm_predictor = svm.SVC()\n",
"\n",
"svm_predictor.fit(X_train, Y_train)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"LoginAzureML\"></a>\n",
"## Logging in to AzureML\n",
"\n",
"With our two classifiers trained, we can log into our AzureML workspace:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace, Experiment, Model\n",
"\n",
"ws = Workspace.from_config()\n",
"ws.get_details()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"RegisterModels\"></a>\n",
"## Registering the Models\n",
"\n",
"Next, we register our models. By default, the subroutine which uploads the models checks that the names provided correspond to registered models in the workspace. We define a utility routine to do the registering:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"import os\n",
"\n",
"os.makedirs('models', exist_ok=True)\n",
"def register_model(name, model):\n",
" print(\"Registering \", name)\n",
" model_path = \"models/{0}.pkl\".format(name)\n",
" joblib.dump(value=model, filename=model_path)\n",
" registered_model = Model.register(model_path=model_path,\n",
" model_name=name,\n",
" workspace=ws)\n",
" print(\"Registered \", registered_model.id)\n",
" return registered_model.id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, we register the models. For convenience in subsequent method calls, we store the results in a dictionary, which maps the `id` of the registered model (a string in `name:version` format) to the predictor itself:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model_dict = {}\n",
"\n",
"lr_reg_id = register_model(\"fairness_linear_regression\", lr_predictor)\n",
"model_dict[lr_reg_id] = lr_predictor\n",
"svm_reg_id = register_model(\"fairness_svm\", svm_predictor)\n",
"model_dict[svm_reg_id] = svm_predictor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"LocalDashboard\"></a>\n",
"## Using the Fairlearn Dashboard\n",
"\n",
"We can now examine the fairness of the two models we have training, both as a function of race and (binary) sex. Before uploading the dashboard to the AzureML portal, we will first instantiate a local instance of the Fairlearn dashboard.\n",
"\n",
"Regardless of the viewing location, the dashboard is based on three things - the true values, the model predictions and the sensitive feature values. The dashboard can use predictions from multiple models and multiple sensitive features if desired (as we are doing here).\n",
"\n",
"Our first step is to generate a dictionary mapping the `id` of the registered model to the corresponding array of predictions:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ys_pred = {}\n",
"for n, p in model_dict.items():\n",
" ys_pred[n] = p.predict(X_test)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can examine these predictions in a locally invoked Fairlearn dashboard. This can be compared to the dashboard uploaded to the portal (in the next section):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from fairlearn.widget import FairlearnDashboard\n",
"\n",
"FairlearnDashboard(sensitive_features=A_test, \n",
" sensitive_feature_names=['Sex', 'Race'],\n",
" y_true=Y_test.tolist(),\n",
" y_pred=ys_pred)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"AzureUpload\"></a>\n",
"## Uploading a Fairness Dashboard to Azure\n",
"\n",
"Uploading a fairness dashboard to Azure is a two stage process. The `FairlearnDashboard` invoked in the previous section relies on the underlying Python kernel to compute metrics on demand. This is obviously not available when the fairness dashboard is rendered in AzureML Studio. The required stages are therefore:\n",
"1. Precompute all the required metrics\n",
"1. Upload to Azure\n",
"\n",
"\n",
"### Computing Fairness Metrics\n",
"We use Fairlearn to create a dictionary which contains all the data required to display a dashboard. This includes both the raw data (true values, predicted values and sensitive features), and also the fairness metrics. The API is similar to that used to invoke the Dashboard locally. However, there are a few minor changes to the API, and the type of problem being examined (binary classification, regression etc.) needs to be specified explicitly:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sf = { 'Race': A_test.Race, 'Sex': A_test.Sex }\n",
"\n",
"from fairlearn.metrics._group_metric_set import _create_group_metric_set\n",
"\n",
"dash_dict = _create_group_metric_set(y_true=Y_test,\n",
" predictions=ys_pred,\n",
" sensitive_features=sf,\n",
" prediction_type='binary_classification')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `_create_group_metric_set()` method is currently underscored since its exact design is not yet final in Fairlearn."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Uploading to Azure\n",
"\n",
"We can now import the `azureml.contrib.fairness` package itself. We will round-trip the data, so there are two required subroutines:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.fairness import upload_dashboard_dictionary, download_dashboard_by_upload_id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can upload the generated dictionary to AzureML. The upload method requires a run, so we first create an experiment and a run. The uploaded dashboard can be seen on the corresponding Run Details page in AzureML Studio. For completeness, we also download the dashboard dictionary which we uploaded."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"exp = Experiment(ws, \"notebook-01\")\n",
"print(exp)\n",
"\n",
"run = exp.start_logging()\n",
"try:\n",
" dashboard_title = \"Sample notebook upload\"\n",
" upload_id = upload_dashboard_dictionary(run,\n",
" dash_dict,\n",
" dashboard_name=dashboard_title)\n",
" print(\"\\nUploaded to id: {0}\\n\".format(upload_id))\n",
"\n",
" downloaded_dict = download_dashboard_by_upload_id(run, upload_id)\n",
"finally:\n",
" run.complete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can verify that the dashboard dictionary which we downloaded matches our upload:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(dash_dict == downloaded_dict)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"Conclusion\"></a>\n",
"## Conclusion\n",
"\n",
"In this notebook we have demonstrated how to generate and upload a fairness dashboard to AzureML Studio. We have not discussed how to analyse the results and apply mitigations. Those topics will be covered elsewhere."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"authors": [
{
"name": "riedgar"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
}
},
"nbformat": 4,
"nbformat_minor": 4
}
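Both fairness notebooks assume that the keys of the predictions dictionary match the ids (name:version) of models registered in the workspace. A small sketch, assuming the ws workspace and the ys_pred dictionary from the notebook above, that cross-checks this before uploading:

from azureml.core import Model

# ids (name:version) of every model registered in the workspace
registered_ids = {m.id for m in Model.list(ws)}

missing = [key for key in ys_pred if key not in registered_ids]
if missing:
    print("Predictions supplied for unregistered model ids:", missing)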

View File

@@ -0,0 +1,8 @@
name: upload-fairness-dashboard
dependencies:
- pip:
- azureml-sdk
- azureml-contrib-fairness
- fairlearn==0.4.6
- joblib
- shap

View File

@@ -144,7 +144,7 @@ jupyter notebook
- Dataset: forecasting for a bike-sharing
- Example of training an automated ML forecasting model on multiple time-series
- [auto-ml-forecasting-function.ipynb](forecasting-high-frequency/auto-ml-forecasting-function.ipynb)
- [auto-ml-forecasting-function.ipynb](forecasting-forecast-function/auto-ml-forecasting-function.ipynb)
- Example of training an automated ML forecasting model on multiple time-series
- [auto-ml-forecasting-beer-remote.ipynb](forecasting-beer-remote/auto-ml-forecasting-beer-remote.ipynb)

View File

@@ -105,7 +105,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -675,10 +675,8 @@
"model_name = best_run.properties['model_name']\n",
"\n",
"script_file_name = 'inference/score.py'\n",
"conda_env_file_name = 'inference/env.yml'\n",
"\n",
"best_run.download_file('outputs/scoring_file_v_1_0_0.py', 'inference/score.py')\n",
"best_run.download_file('outputs/conda_env_v_1_0_0.yml', 'inference/env.yml')"
"best_run.download_file('outputs/scoring_file_v_1_0_0.py', 'inference/score.py')"
]
},
{
@@ -721,8 +719,7 @@
"from azureml.core.model import Model\n",
"from azureml.core.environment import Environment\n",
"\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=conda_env_file_name)\n",
"inference_config = InferenceConfig(entry_script=script_file_name, environment=myenv)\n",
"inference_config = InferenceConfig(entry_script=script_file_name)\n",
"\n",
"aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
" memory_gb = 1, \n",

View File

@@ -2,7 +2,3 @@ name: auto-ml-classification-bank-marketing-all-features
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- matplotlib
- onnxruntime==1.0.0

View File

@@ -93,7 +93,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},

View File

@@ -2,6 +2,3 @@ name: auto-ml-classification-credit-card-fraud
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -97,7 +97,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -491,8 +491,8 @@
"metadata": {},
"outputs": [],
"source": [
"test_run = run_inference(test_experiment, compute_target, script_folder, best_dnn_run, test_dataset,\n",
" target_column_name, model_name)"
"test_run = run_inference(test_experiment, compute_target, script_folder, best_dnn_run,\n",
" train_dataset, test_dataset, target_column_name, model_name)"
]
},
{

View File

@@ -2,11 +2,3 @@ name: auto-ml-classification-text-dnn
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- matplotlib
- https://download.pytorch.org/whl/cpu/torch-1.1.0-cp35-cp35m-win_amd64.whl
- sentencepiece==0.1.82
- pytorch-transformers==1.0
- spacy==2.1.8
- https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz

View File

@@ -6,7 +6,7 @@ from azureml.core.run import Run
def run_inference(test_experiment, compute_target, script_folder, train_run,
test_dataset, target_column_name, model_name):
train_dataset, test_dataset, target_column_name, model_name):
train_run.download_file('outputs/conda_env_v_1_0_0.yml',
'inference/condafile.yml')
@@ -22,7 +22,10 @@ def run_inference(test_experiment, compute_target, script_folder, train_run,
'--target_column_name': target_column_name,
'--model_name': model_name
},
inputs=[test_dataset.as_named_input('test_data')],
inputs=[
train_dataset.as_named_input('train_data'),
test_dataset.as_named_input('test_data')
],
compute_target=compute_target,
environment_definition=inference_env)

View File

@@ -1,8 +1,11 @@
import numpy as np
import argparse
from azureml.core import Run
import numpy as np
from sklearn.externals import joblib
from azureml.automl.core.shared import constants, metrics
from azureml.automl.runtime.shared.score import scoring, constants
from azureml.core import Run
from azureml.core.model import Model
@@ -29,22 +32,26 @@ model = joblib.load(model_path)
run = Run.get_context()
# get input dataset by name
test_dataset = run.input_datasets['test_data']
train_dataset = run.input_datasets['train_data']
X_test_df = test_dataset.drop_columns(columns=[target_column_name]) \
.to_pandas_dataframe()
y_test_df = test_dataset.with_timestamp_columns(None) \
.keep_columns(columns=[target_column_name]) \
.to_pandas_dataframe()
y_train_df = train_dataset.with_timestamp_columns(None) \
.keep_columns(columns=[target_column_name]) \
.to_pandas_dataframe()
predicted = model.predict_proba(X_test_df)
# use automl metrics module
scores = metrics.compute_metrics_classification(
np.array(predicted),
np.array(y_test_df),
class_labels=model.classes_,
metrics=list(constants.Metric.SCALAR_CLASSIFICATION_SET)
)
# Use the AutoML scoring module
class_labels = np.unique(np.concatenate((y_train_df.values, y_test_df.values)))
train_labels = model.classes_
classification_metrics = list(constants.CLASSIFICATION_SCALAR_SET)
scores = scoring.score_classification(y_test_df.values, predicted,
classification_metrics,
class_labels, train_labels)
print("scores:")
print(scores)
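Consolidated, the replacement call above looks roughly like this - a sketch assuming the fitted model and the train/test dataframes from the script:

import numpy as np
from azureml.automl.runtime.shared.score import scoring, constants

predicted = model.predict_proba(X_test_df)  # class probabilities for the test set

# score against every label seen in either split, noting which labels the model trained on
class_labels = np.unique(np.concatenate((y_train_df.values, y_test_df.values)))
classification_metrics = list(constants.CLASSIFICATION_SCALAR_SET)

scores = scoring.score_classification(y_test_df.values, predicted,
                                      classification_metrics,
                                      class_labels, model.classes_)
print(scores)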

View File

@@ -88,7 +88,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -201,7 +201,7 @@
"conda_run_config.environment.docker.enabled = True\n",
"conda_run_config.environment.docker.base_image = azureml.core.runconfig.DEFAULT_CPU_IMAGE\n",
"\n",
"cd = CondaDependencies.create(pip_packages=['azureml-sdk[automl]', 'applicationinsights', 'azureml-opendatasets'], \n",
"cd = CondaDependencies.create(pip_packages=['azureml-sdk[automl]', 'applicationinsights', 'azureml-opendatasets', 'azureml-defaults'], \n",
" conda_packages=['numpy==1.16.2'], \n",
" pin_sdk_version=False)\n",
"#cd.add_pip_package('azureml-explain-model')\n",

View File

@@ -2,7 +2,3 @@ name: auto-ml-continuous-retraining
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- matplotlib
- azureml-pipeline

View File

@@ -114,7 +114,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},

View File

@@ -1,11 +1,4 @@
name: auto-ml-forecasting-beer-remote
dependencies:
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib
- azureml-train

View File

@@ -1,11 +1,14 @@
import pandas as pd
import numpy as np
import argparse
from azureml.core import Run
import numpy as np
import pandas as pd
from pandas.tseries.frequencies import to_offset
from sklearn.externals import joblib
from sklearn.metrics import mean_absolute_error, mean_squared_error
from azureml.automl.core.shared import constants, metrics
from pandas.tseries.frequencies import to_offset
from azureml.automl.runtime.shared.score import scoring, constants
from azureml.core import Run
def align_outputs(y_predicted, X_trans, X_test, y_test,
@@ -299,12 +302,11 @@ print(df_all[target_column_name])
print("predicted values:::")
print(df_all['predicted'])
# use automl metrics module
scores = metrics.compute_metrics_regression(
df_all['predicted'],
df_all[target_column_name],
list(constants.Metric.SCALAR_REGRESSION_SET),
None, None, None)
# Use the AutoML scoring module
regression_metrics = list(constants.REGRESSION_SCALAR_SET)
y_test = np.array(df_all[target_column_name])
y_pred = np.array(df_all['predicted'])
scores = scoring.score_regression(y_test, y_pred, regression_metrics)
print("scores:")
print(scores)

View File

@@ -87,7 +87,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -510,16 +510,16 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core.shared import constants, metrics\n",
"from azureml.automl.core.shared import constants\n",
"from azureml.automl.runtime.shared.score import scoring\n",
"from sklearn.metrics import mean_absolute_error, mean_squared_error\n",
"from matplotlib import pyplot as plt\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",
" df_all['predicted'],\n",
" df_all[target_column_name],\n",
" list(constants.Metric.SCALAR_REGRESSION_SET),\n",
" None, None, None)\n",
"scores = scoring.score_regression(\n",
" y_test=df_all[target_column_name],\n",
" y_pred=df_all['predicted'],\n",
" metrics=list(constants.Metric.SCALAR_REGRESSION_SET))\n",
"\n",
"print(\"[Test data scores]\\n\")\n",
"for key, value in scores.items(): \n",

View File

@@ -1,10 +1,4 @@
name: auto-ml-forecasting-bike-share
dependencies:
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -97,7 +97,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -465,7 +465,7 @@
"metadata": {},
"source": [
"### Forecast Function\n",
"For forecasting, we will use the forecast function instead of the predict function. Using the predict method would result in getting predictions for EVERY horizon the forecaster can predict at. This is useful when training and evaluating the performance of the forecaster at various horizons, but the level of detail is excessive for normal use. Forecast function also can handle more complicated scenarios, see notebook on [high frequency forecasting](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/forecasting-high-frequency/auto-ml-forecasting-function.ipynb)."
"For forecasting, we will use the forecast function instead of the predict function. Using the predict method would result in getting predictions for EVERY horizon the forecaster can predict at. This is useful when training and evaluating the performance of the forecaster at various horizons, but the level of detail is excessive for normal use. Forecast function also can handle more complicated scenarios, see the [forecast function notebook](../forecasting-forecast-function/auto-ml-forecasting-function.ipynb)."
]
},
{
@@ -507,15 +507,15 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core.shared import constants, metrics\n",
"from azureml.automl.core.shared import constants\n",
"from azureml.automl.runtime.shared.score import scoring\n",
"from matplotlib import pyplot as plt\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",
" df_all['predicted'],\n",
" df_all[target_column_name],\n",
" list(constants.Metric.SCALAR_REGRESSION_SET),\n",
" None, None, None)\n",
"scores = scoring.score_regression(\n",
" y_test=df_all[target_column_name],\n",
" y_pred=df_all['predicted'],\n",
" metrics=list(constants.Metric.SCALAR_REGRESSION_SET))\n",
"\n",
"print(\"[Test data scores]\\n\")\n",
"for key, value in scores.items(): \n",
@@ -667,15 +667,15 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core.shared import constants, metrics\n",
"from azureml.automl.core.shared import constants\n",
"from azureml.automl.runtime.shared.score import scoring\n",
"from matplotlib import pyplot as plt\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",
" df_all['predicted'],\n",
" df_all[target_column_name],\n",
" list(constants.Metric.SCALAR_REGRESSION_SET),\n",
" None, None, None)\n",
"scores = scoring.score_regression(\n",
" y_test=df_all[target_column_name],\n",
" y_pred=df_all['predicted'],\n",
" metrics=list(constants.Metric.SCALAR_REGRESSION_SET))\n",
"\n",
"print(\"[Test data scores]\\n\")\n",
"for key, value in scores.items(): \n",

View File

@@ -2,8 +2,3 @@ name: auto-ml-forecasting-energy-demand
dependencies:
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -35,7 +35,6 @@
"Terminology:\n",
"* forecast origin: the last period when the target value is known\n",
"* forecast periods(s): the period(s) for which the value of the target is desired.\n",
"* forecast horizon: the number of forecast periods\n",
"* lookback: how many past periods (before forecast origin) the model function depends on. The larger of number of lags and length of rolling window.\n",
"* prediction context: `lookback` periods immediately preceding the forecast origin\n",
"\n",
@@ -95,7 +94,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -720,6 +719,90 @@
"X_show[['date', 'grain', 'ext_predictor', '_automl_target_col']]\n",
"# prediction is in _automl_target_col"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Forecasting farther than the maximum horizon <a id=\"recursive forecasting\"></a>\n",
"When the forecast destination, or the latest date in the prediction data frame, is farther into the future than the specified maximum horizon, the `forecast()` function will still make point predictions out to the later date using a recursive operation mode. Internally, the method recursively applies the regular forecaster to generate context so that we can forecast further into the future. \n",
"\n",
"To illustrate the use-case and operation of recursive forecasting, we'll consider an example with a single time-series where the forecasting period directly follows the training period and is twice as long as the maximum horizon given at training time.\n",
"\n",
"![Recursive_forecast_overview](recursive_forecast_overview_small.png)\n",
"\n",
"Internally, we apply the forecaster in an iterative manner and finish the forecast task in two interations. In the first iteration, we apply the forecaster and get the prediction for the first max-horizon periods (y_pred1). In the second iteraction, y_pred1 is used as the context to produce the prediction for the next max-horizon periods (y_pred2). The combination of (y_pred1 and y_pred2) gives the results for the total forecast periods. \n",
"\n",
"A caveat: forecast accuracy will likely be worse the farther we predict into the future since errors are compounded with recursive application of the forecaster.\n",
"\n",
"![Recursive_forecast_iter1](recursive_forecast_iter1.png)\n",
"![Recursive_forecast_iter2](recursive_forecast_iter2.png)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# generate the same kind of test data we trained on, but with a single grain/time-series and test period twice as long as the max_horizon\n",
"_, _, X_test_long, y_test_long = get_timeseries(train_len=n_train_periods,\n",
" test_len=max_horizon*2,\n",
" time_column_name=TIME_COLUMN_NAME,\n",
" target_column_name=TARGET_COLUMN_NAME,\n",
" grain_column_name=GRAIN_COLUMN_NAME,\n",
" grains=1)\n",
"\n",
"print(X_test_long.groupby(GRAIN_COLUMN_NAME)[TIME_COLUMN_NAME].min())\n",
"print(X_test_long.groupby(GRAIN_COLUMN_NAME)[TIME_COLUMN_NAME].max())"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# forecast() function will invoke the recursive forecast method internally.\n",
"y_pred_long, X_trans_long = fitted_model.forecast(X_test_long)\n",
"y_pred_long"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# What forecast() function does in this case is equivalent to iterating it twice over the test set as the following. \n",
"y_pred1, _ = fitted_model.forecast(X_test_long[:max_horizon])\n",
"y_pred_all, _ = fitted_model.forecast(X_test_long, np.concatenate((y_pred1, np.full(max_horizon, np.nan))))\n",
"np.array_equal(y_pred_all, y_pred_long)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Confidence interval and distributional forecasts\n",
"AutoML cannot currently estimate forecast errors beyond the maximum horizon set during training, so the `forecast_quantiles()` function will return missing values for quantiles not equal to 0.5 beyond the maximum horizon. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fitted_model.forecast_quantiles(X_test_long)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Similarly with the simple senarios illustrated above, forecasting farther than the max horizon in other senarios like 'multiple grain', 'Destination-date forecast', and 'forecast away from the training data' are also automatically handled by the `forecast()` function. "
]
}
],
"metadata": {

View File

@@ -0,0 +1,4 @@
name: auto-ml-forecasting-function
dependencies:
- pip:
- azureml-sdk

(Two binary image files added, 26 KiB and 30 KiB; not shown.)

View File

@@ -1,10 +0,0 @@
name: auto-ml-forecasting-function
dependencies:
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -82,7 +82,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -336,7 +336,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"sample-featurizationconfig-remarks"
]
},
"outputs": [],
"source": [
"featurization_config = FeaturizationConfig()\n",
@@ -545,7 +549,7 @@
"source": [
"If you are used to scikit pipelines, perhaps you expected `predict(X_test)`. However, forecasting requires a more general interface that also supplies the past target `y` values. Please use `forecast(X,y)` as `predict(X)` is reserved for internal purposes on forecasting models.\n",
"\n",
"The [forecast function notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/automated-machine-learning/forecasting-high-frequency/auto-ml-forecasting-function.ipynb) demonstrates the use of the forecast function for a variety of use cases. Also, please see the [API documentation for the forecast function](https://docs.microsoft.com/en-us/python/api/azureml-automl-runtime/azureml.automl.runtime.shared.model_wrappers.forecastingpipelinewrapper?view=azure-ml-py#forecast-x-pred--typing-union-pandas-core-frame-dataframe--nonetype----none--y-pred--typing-union-pandas-core-frame-dataframe--numpy-ndarray--nonetype----none--forecast-destination--typing-union-pandas--libs-tslibs-timestamps-timestamp--nonetype----none--ignore-data-errors--bool---false-----typing-tuple-numpy-ndarray--pandas-core-frame-dataframe-)."
"The [forecast function notebook](../forecasting-forecast-function/auto-ml-forecasting-function.ipynb)."
]
},
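A brief sketch of that interface, following the pattern used elsewhere in these samples (variable names illustrative; `y_query` is assumed to be a NumPy array): NaN entries in the supplied target mark the rows to be forecast.

import numpy as np

# NaN entries in y_query mark the rows whose target must be forecast.
y_query = y_test.copy().astype(float)
y_query.fill(np.nan)
y_pred, X_trans = fitted_model.forecast(X_test, y_query)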
{
@@ -576,15 +580,15 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core.shared import constants, metrics\n",
"from azureml.automl.core.shared import constants\n",
"from azureml.automl.runtime.shared.score import scoring\n",
"from matplotlib import pyplot as plt\n",
"\n",
"# use automl metrics module\n",
"scores = metrics.compute_metrics_regression(\n",
" df_all['predicted'],\n",
" df_all[target_column_name],\n",
" list(constants.Metric.SCALAR_REGRESSION_SET),\n",
" None, None, None)\n",
"# use automl scoring module\n",
"scores = scoring.score_regression(\n",
" y_test=df_all[target_column_name],\n",
" y_pred=df_all['predicted'],\n",
" metrics=list(constants.Metric.SCALAR_REGRESSION_SET))\n",
"\n",
"print(\"[Test data scores]\\n\")\n",
"for key, value in scores.items(): \n",

View File

@@ -1,10 +1,4 @@
name: auto-ml-forecasting-orange-juice-sales
dependencies:
- py-xgboost<=0.90
- pip:
- azureml-sdk
- numpy==1.16.2
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -28,7 +28,8 @@
"1. [Setup](#Setup)\n",
"1. [Train](#Train)\n",
"1. [Results](#Results)\n",
"1. [Test](#Test)\n",
"1. [Test](#Tests)\n",
"1. [Explanation](#Explanation)\n",
"1. [Acknowledgements](#Acknowledgements)"
]
},
@@ -49,9 +50,9 @@
"2. Configure AutoML using `AutoMLConfig`.\n",
"3. Train the model.\n",
"4. Explore the results.\n",
"5. Visualization model's feature importance in azure portal\n",
"6. Explore any model's explanation and explore feature importance in azure portal\n",
"7. Test the fitted model."
"5. Test the fitted model.\n",
"6. Explore any model's explanation and explore feature importance in azure portal.\n",
"7. Create an AKS cluster, deploy the webservice of AutoML scoring model and the explainer model to the AKS and consume the web service."
]
},
{
@@ -95,7 +96,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -255,9 +256,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Analyze results\n",
"### Analyze results\n",
"\n",
"### Retrieve the Best Model\n",
"#### Retrieve the Best Model\n",
"\n",
"Below we select the best pipeline from our iterations. The `get_output` method on `automl_classifier` returns the best run and the fitted model for the last invocation. Overloads on `get_output` allow you to retrieve the best run and fitted model for *any* logged metric or for a particular *iteration*."
]
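A short sketch of those overloads, assuming the run handle `automl_classifier` from the cells above:

# Best run and model by the primary metric.
best_run, fitted_model = automl_classifier.get_output()

# Best run and model for a specific logged metric, or for a specific iteration.
auc_run, auc_model = automl_classifier.get_output(metric='AUC_weighted')
third_run, third_model = automl_classifier.get_output(iteration=3)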
@@ -284,9 +285,80 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Best Model 's explanation\n",
"Retrieve the explanation from the best_run which includes explanations for engineered features and raw features.\n",
"## Tests\n",
"\n",
"Now that the model is trained, split the data in the same way the data was split for training (The difference here is the data is being split locally) and then run the test data through the trained model to get the predicted values."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# convert the test data to dataframe\n",
"X_test_df = validation_data.drop_columns(columns=[label_column_name]).to_pandas_dataframe()\n",
"y_test_df = validation_data.keep_columns(columns=[label_column_name], validate=True).to_pandas_dataframe()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# call the predict functions on the model\n",
"y_pred = fitted_model.predict(X_test_df)\n",
"y_pred"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Calculate metrics for the prediction\n",
"\n",
"Now visualize the data on a scatter plot to show what our truth (actual) values are compared to the predicted values \n",
"from the trained model that was returned."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.metrics import confusion_matrix\n",
"import numpy as np\n",
"import itertools\n",
"\n",
"cf =confusion_matrix(y_test_df.values,y_pred)\n",
"plt.imshow(cf,cmap=plt.cm.Blues,interpolation='nearest')\n",
"plt.colorbar()\n",
"plt.title('Confusion Matrix')\n",
"plt.xlabel('Predicted')\n",
"plt.ylabel('Actual')\n",
"class_labels = ['False','True']\n",
"tick_marks = np.arange(len(class_labels))\n",
"plt.xticks(tick_marks,class_labels)\n",
"plt.yticks([-0.5,0,1,1.5],['','False','True',''])\n",
"# plotting text value inside cells\n",
"thresh = cf.max() / 2.\n",
"for i,j in itertools.product(range(cf.shape[0]),range(cf.shape[1])):\n",
" plt.text(j,i,format(cf[i,j],'d'),horizontalalignment='center',color='white' if cf[i,j] >thresh else 'black')\n",
"plt.show()"
]
},
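Beyond the plot, scalar metrics can be printed with scikit-learn; a minimal sketch using the frames defined above:

from sklearn.metrics import accuracy_score, classification_report

print('Accuracy: {:.3f}'.format(accuracy_score(y_test_df.values, y_pred)))
print(classification_report(y_test_df.values, y_pred))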
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Explanation\n",
"In this section, we will show how to compute model explanations and visualize the explanations using azureml-explain-model package. We will also show how to run the automl model and the explainer model through deploying an AKS web service.\n",
"\n",
"Besides retrieving an existing model explanation for an AutoML model, you can also explain your AutoML model with different test data. The following steps will allow you to compute and visualize engineered feature importance based on your test data.\n",
"\n",
"### Run the explanation\n",
"#### Download engineered feature importance from artifact store\n",
"You can use ExplanationClient to download the engineered feature explanations from the artifact store of the best_run. You can also use azure portal url to view the dash board visualization of the feature importance values of the engineered features."
]
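A minimal sketch of that download (the import path is an assumption based on the azureml-explain-model package this notebook uses):

from azureml.explain.model._internal.explanation_client import ExplanationClient

# Download the engineered-feature explanation that the AutoML run uploaded.
client = ExplanationClient.from_run(best_run)
engineered_explanations = client.download_model_explanation(raw=False)
print(engineered_explanations.get_feature_importance_dict())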
@@ -303,14 +375,6 @@
"print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + best_run.get_portal_url())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Explanations\n",
"In this section, we will show how to compute model explanations and visualize the explanations using azureml-explain-model package. Besides retrieving an existing model explanation for an AutoML model, you can also explain your AutoML model with different test data. The following steps will allow you to compute and visualize engineered feature importance based on your test data."
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -403,6 +467,7 @@
"metadata": {},
"outputs": [],
"source": [
"# Compute the engineered explanations\n",
"engineered_explanations = explainer.explain(['local', 'global'], eval_dataset=automl_explainer_setup_obj.X_test_transform)\n",
"print(engineered_explanations.get_feature_importance_dict())\n",
"print(\"You can visualize the engineered explanations under the 'Explanations (preview)' tab in the AutoML run at:-\\n\" + automl_run.get_portal_url())"
@@ -412,41 +477,37 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Test the fitted model\n",
"#### Initialize the scoring Explainer, save and upload it for later use in scoring explanation"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.explain.model.scoring.scoring_explainer import TreeScoringExplainer\n",
"import joblib\n",
"\n",
"Now that the model is trained, split the data in the same way the data was split for training (The difference here is the data is being split locally) and then run the test data through the trained model to get the predicted values."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# convert the test data to dataframe\n",
"X_test_df = validation_data.drop_columns(columns=[label_column_name]).to_pandas_dataframe()\n",
"y_test_df = validation_data.keep_columns(columns=[label_column_name], validate=True).to_pandas_dataframe()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# call the predict functions on the model\n",
"y_pred = fitted_model.predict(X_test_df)\n",
"y_pred"
"# Initialize the ScoringExplainer\n",
"scoring_explainer = TreeScoringExplainer(explainer.explainer, feature_maps=[automl_explainer_setup_obj.feature_map])\n",
"\n",
"# Pickle scoring explainer locally to './scoring_explainer.pkl'\n",
"scoring_explainer_file_name = 'scoring_explainer.pkl'\n",
"with open(scoring_explainer_file_name, 'wb') as stream:\n",
" joblib.dump(scoring_explainer, stream)\n",
"\n",
"# Upload the scoring explainer to the automl run\n",
"automl_run.upload_file('outputs/scoring_explainer.pkl', scoring_explainer_file_name)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Calculate metrics for the prediction\n",
"### Deploying the scoring and explainer models to a web service to Azure Kubernetes Service (AKS)\n",
"\n",
"Now visualize the data on a scatter plot to show what our truth (actual) values are compared to the predicted values \n",
"from the trained model that was returned."
"We use the TreeScoringExplainer from azureml.explain.model package to create the scoring explainer which will be used to compute the raw and engineered feature importances at the inference time. In the cell below, we register the AutoML model and the scoring explainer with the Model Management Service."
]
},
{
@@ -455,25 +516,238 @@
"metadata": {},
"outputs": [],
"source": [
"from sklearn.metrics import confusion_matrix\n",
"import numpy as np\n",
"import itertools\n",
"# Register trained automl model present in the 'outputs' folder in the artifacts\n",
"original_model = automl_run.register_model(model_name='automl_model', \n",
" model_path='outputs/model.pkl')\n",
"scoring_explainer_model = automl_run.register_model(model_name='scoring_explainer',\n",
" model_path='outputs/scoring_explainer.pkl')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Create the conda dependencies for setting up the service\n",
"\n",
"cf =confusion_matrix(y_test_df.values,y_pred)\n",
"plt.imshow(cf,cmap=plt.cm.Blues,interpolation='nearest')\n",
"plt.colorbar()\n",
"plt.title('Confusion Matrix')\n",
"plt.xlabel('Predicted')\n",
"plt.ylabel('Actual')\n",
"class_labels = ['False','True']\n",
"tick_marks = np.arange(len(class_labels))\n",
"plt.xticks(tick_marks,class_labels)\n",
"plt.yticks([-0.5,0,1,1.5],['','False','True',''])\n",
"# plotting text value inside cells\n",
"thresh = cf.max() / 2.\n",
"for i,j in itertools.product(range(cf.shape[0]),range(cf.shape[1])):\n",
" plt.text(j,i,format(cf[i,j],'d'),horizontalalignment='center',color='white' if cf[i,j] >thresh else 'black')\n",
"plt.show()"
"We need to create the conda dependencies comprising of the azureml-explain-model, azureml-train-automl and azureml-defaults packages."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.automl.core.shared import constants\n",
"from azureml.core.environment import Environment\n",
"\n",
"automl_run.download_file(constants.CONDA_ENV_FILE_PATH, 'myenv.yml')\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
"myenv"
]
},
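The environment downloaded from the run already carries the training-time dependencies; if any of the packages above were missing, they could be added with the CondaDependencies API, sketched here:

from azureml.core.conda_dependencies import CondaDependencies

conda_dep = CondaDependencies.create(pip_packages=['azureml-defaults',
                                                   'azureml-train-automl',
                                                   'azureml-explain-model'])
myenv.python.conda_dependencies = conda_dep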
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Write the Entry Script\n",
"Write the script that will be used to predict on your model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import numpy as np\n",
"import pandas as pd\n",
"import os\n",
"import pickle\n",
"import azureml.train.automl\n",
"import azureml.explain.model\n",
"from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, \\\n",
" automl_setup_model_explanations\n",
"import joblib\n",
"from azureml.core.model import Model\n",
"\n",
"\n",
"def init():\n",
" global automl_model\n",
" global scoring_explainer\n",
"\n",
" # Retrieve the path to the model file using the model name\n",
" # Assume original model is named original_prediction_model\n",
" automl_model_path = Model.get_model_path('automl_model')\n",
" scoring_explainer_path = Model.get_model_path('scoring_explainer')\n",
"\n",
" automl_model = joblib.load(automl_model_path)\n",
" scoring_explainer = joblib.load(scoring_explainer_path)\n",
"\n",
"\n",
"def run(raw_data):\n",
" data = pd.read_json(raw_data, orient='records') \n",
" # Make prediction\n",
" predictions = automl_model.predict(data)\n",
" # Setup for inferencing explanations\n",
" automl_explainer_setup_obj = automl_setup_model_explanations(automl_model,\n",
" X_test=data, task='classification')\n",
" # Retrieve model explanations for engineered explanations\n",
" engineered_local_importance_values = scoring_explainer.explain(automl_explainer_setup_obj.X_test_transform) \n",
" # You can return any data type as long as it is JSON-serializable\n",
" return {'predictions': predictions.tolist(),\n",
" 'engineered_local_importance_values': engineered_local_importance_values}\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Create the InferenceConfig \n",
"Create the inference config that will be used when deploying the model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.model import InferenceConfig\n",
"\n",
"inf_config = InferenceConfig(entry_script='score.py', environment=myenv)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Provision the AKS Cluster\n",
"This is a one time setup. You can reuse this cluster for multiple deployments after it has been created. If you delete the cluster or the resource group that contains it, then you would have to recreate it."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import ComputeTarget, AksCompute\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your cluster.\n",
"aks_name = 'scoring-explain'\n",
"\n",
"# Verify that cluster does not exist already\n",
"try:\n",
" aks_target = ComputeTarget(workspace=ws, name=aks_name)\n",
" print('Found existing cluster, use it.')\n",
"except ComputeTargetException:\n",
" prov_config = AksCompute.provisioning_configuration(vm_size='STANDARD_D3_V2')\n",
" aks_target = ComputeTarget.create(workspace=ws, \n",
" name=aks_name,\n",
" provisioning_configuration=prov_config)\n",
"aks_target.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Deploy web service to AKS"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Set the web service configuration (using default here)\n",
"from azureml.core.webservice import AksWebservice\n",
"from azureml.core.model import Model\n",
"\n",
"aks_config = AksWebservice.deploy_configuration()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service_name ='model-scoring-local-aks'\n",
"\n",
"aks_service = Model.deploy(workspace=ws,\n",
" name=aks_service_name,\n",
" models=[scoring_explainer_model, original_model],\n",
" inference_config=inf_config,\n",
" deployment_config=aks_config,\n",
" deployment_target=aks_target)\n",
"\n",
"aks_service.wait_for_deployment(show_output = True)\n",
"print(aks_service.state)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### View the service logs"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.get_logs()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Consume the web service using run method to do the scoring and explanation of scoring.\n",
"We test the web sevice by passing data. Run() method retrieves API keys behind the scenes to make sure that call is authenticated."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Serialize the first row of the test data into json\n",
"X_test_json = X_test_df[:1].to_json(orient='records')\n",
"print(X_test_json)\n",
"\n",
"# Call the service to get the predictions and the engineered and raw explanations\n",
"output = aks_service.run(X_test_json)\n",
"\n",
"# Print the predicted value\n",
"print('predictions:\\n{}\\n'.format(output['predictions']))\n",
"# Print the engineered feature importances for the predicted value\n",
"print('engineered_local_importance_values:\\n{}\\n'.format(output['engineered_local_importance_values']))"
]
},
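Because AKS web services have key authentication enabled by default, the same call can also be made over raw HTTP; a hedged sketch:

import requests

# Retrieve the primary key and call the scoring URI directly.
key1, _ = aks_service.get_keys()
headers = {'Content-Type': 'application/json',
           'Authorization': 'Bearer {}'.format(key1)}
resp = requests.post(aks_service.scoring_uri, data=X_test_json, headers=headers)
print(resp.json())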
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Clean up\n",
"Delete the service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"aks_service.delete()"
]
},
{

View File

@@ -2,6 +2,3 @@ name: auto-ml-classification-credit-card-fraud-local
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -98,7 +98,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},
@@ -242,7 +242,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"sample-featurizationconfig-remarks2"
]
},
"outputs": [],
"source": [
"featurization_config = FeaturizationConfig()\n",
@@ -260,7 +264,11 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"tags": [
"sample-featurizationconfig-remarks3"
]
},
"outputs": [],
"source": [
"automl_settings = {\n",

View File

@@ -2,6 +2,3 @@ name: auto-ml-regression-explanation-featurization
dependencies:
- pip:
- azureml-sdk
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -7,7 +7,7 @@ import azureml.train.automl
import azureml.explain.model
from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, \
automl_setup_model_explanations
from sklearn.externals import joblib
import joblib
from azureml.core.model import Model

View File

@@ -4,15 +4,14 @@ import os
from azureml.core.run import Run
from azureml.core.experiment import Experiment
from sklearn.externals import joblib
from azureml.core.dataset import Dataset
from azureml.train.automl.runtime.automl_explain_utilities import AutoMLExplainerSetupClass, \
automl_setup_model_explanations, automl_check_model_if_explainable
from azureml.explain.model.mimic.models.lightgbm_model import LGBMExplainableModel
from azureml.explain.model.mimic_wrapper import MimicWrapper
from azureml.automl.core.shared.constants import MODEL_PATH
from azureml.explain.model.scoring.scoring_explainer import TreeScoringExplainer, save
from azureml.explain.model.scoring.scoring_explainer import TreeScoringExplainer
import joblib
OUTPUT_DIR = './outputs/'
os.makedirs(OUTPUT_DIR, exist_ok=True)
@@ -74,7 +73,8 @@ print("Engineered and raw explanations computed successfully")
scoring_explainer = TreeScoringExplainer(explainer.explainer, feature_maps=[automl_explainer_setup_obj.feature_map])
# Pickle scoring explainer locally
save(scoring_explainer, exist_ok=True)
with open('scoring_explainer.pkl', 'wb') as stream:
joblib.dump(scoring_explainer, stream)
# Upload the scoring explainer to the automl run
automl_run.upload_file('outputs/scoring_explainer.pkl', 'scoring_explainer.pkl')

View File

@@ -92,7 +92,7 @@
"metadata": {},
"outputs": [],
"source": [
"print(\"This notebook was created using version 1.5.0 of the Azure ML SDK\")\n",
"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
"print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
]
},

View File

@@ -2,7 +2,3 @@ name: auto-ml-regression
dependencies:
- pip:
- azureml-sdk
- pandas==0.23.4
- azureml-train-automl
- azureml-widgets
- matplotlib

View File

@@ -50,10 +50,12 @@ pip install azureml-accel-models[gpu]
### Step 4: Follow our notebooks
The notebooks in this repo walk through the following scenarios:
* [Quickstart](accelerated-models-quickstart.ipynb), deploy and inference a ResNet50 model trained on ImageNet
* [Object Detection](accelerated-models-object-detection.ipynb), deploy and inference an SSD-VGG model that can do object detection
* [Training models](accelerated-models-training.ipynb), train one of our accelerated models on the Kaggle Cats and Dogs dataset to see how to improve accuracy on custom datasets
We provide notebooks to walk through the following scenarios, linked below:
* [Quickstart](https://github.com/Azure/MachineLearningNotebooks/blob/33d6def8c30d3dd3a5bfbea50b9c727788185faf/how-to-use-azureml/deployment/accelerated-models/accelerated-models-quickstart.ipynb), deploy and inference a ResNet50 model trained on ImageNet
* [Object Detection](https://github.com/Azure/MachineLearningNotebooks/blob/33d6def8c30d3dd3a5bfbea50b9c727788185faf/how-to-use-azureml/deployment/accelerated-models/accelerated-models-object-detection.ipynb), deploy and inference an SSD-VGG model that can do object detection
* [Training models](https://github.com/Azure/MachineLearningNotebooks/blob/33d6def8c30d3dd3a5bfbea50b9c727788185faf/how-to-use-azureml/deployment/accelerated-models/accelerated-models-training.ipynb), train one of our accelerated models on the Kaggle Cats and Dogs dataset to see how to improve accuracy on custom datasets
**Note**: the above notebooks work only with TensorFlow >=1.6,<2.0.
<a name="model-classes"></a>
## Model Classes

View File

@@ -86,7 +86,37 @@
"source": [
"In this example, we will be using and registering two models. \n",
"\n",
"You wil need to have a `first_model.pkl` file and `second_model.pkl` file in the current directory. The below call registers the files as Models with the names `my_first_model` and `my_second_model` in the workspace."
"First we will train two simple models on the [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset) included with scikit-learn, serializing them to files in the current directory."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"import sklearn\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import BayesianRidge, Ridge\n",
"\n",
"x, y = load_diabetes(return_X_y=True)\n",
"\n",
"first_model = Ridge().fit(x, y)\n",
"second_model = BayesianRidge().fit(x, y)\n",
"\n",
"joblib.dump(first_model, \"first_model.pkl\")\n",
"joblib.dump(second_model, \"second_model.pkl\")\n",
"\n",
"print(\"Trained models using scikit-learn {}.\".format(sklearn.__version__))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that we have our trained models locally, we will register them as Models with the names `my_first_model` and `my_second_model` in the workspace."
]
},
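The registration calls themselves are unchanged by this diff; they presumably look like the following sketch:

from azureml.core.model import Model

my_model_1 = Model.register(workspace=ws,
                            model_path="first_model.pkl",
                            model_name="my_first_model")
my_model_2 = Model.register(workspace=ws,
                            model_path="second_model.pkl",
                            model_name="my_second_model")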
{
@@ -149,25 +179,24 @@
"outputs": [],
"source": [
"%%writefile score.py\n",
"import pickle\n",
"import joblib\n",
"import json\n",
"import numpy as np\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"from azureml.core.model import Model\n",
"\n",
"def init():\n",
" global model_1, model_2\n",
" # note here \"my_first_model\" is the name of the model registered under the workspace\n",
" # this call should return the path to the model.pkl file on the local disk.\n",
" # Here \"my_first_model\" is the name of the model registered under the workspace.\n",
" # This call will return the path to the .pkl file on the local disk.\n",
" model_1_path = Model.get_model_path(model_name='my_first_model')\n",
" model_2_path = Model.get_model_path(model_name='my_second_model')\n",
" \n",
" # deserialize the model files back into a sklearn model\n",
" # Deserialize the model files back into scikit-learn models.\n",
" model_1 = joblib.load(model_1_path)\n",
" model_2 = joblib.load(model_2_path)\n",
"\n",
"# note you can pass in multiple rows for scoring\n",
"# Note you can pass in multiple rows for scoring.\n",
"def run(raw_data):\n",
" try:\n",
" data = json.loads(raw_data)['data']\n",
@@ -177,7 +206,7 @@
" result_1 = model_1.predict(data)\n",
" result_2 = model_2.predict(data)\n",
"\n",
" # you can return any data type as long as it is JSON-serializable\n",
" # You can return any JSON-serializable value.\n",
" return {\"prediction1\": result_1.tolist(), \"prediction2\": result_2.tolist()}\n",
" except Exception as e:\n",
" result = str(e)\n",
@@ -208,10 +237,10 @@
"source": [
"from azureml.core import Environment\n",
"\n",
"env = Environment.from_conda_specification(name='deploytocloudenv', file_path='myenv.yml')\n",
"\n",
"# This is optional at this point\n",
"# env.register(workspace=ws)"
"env = Environment(\"deploytocloudenv\")\n",
"env.python.conda_dependencies.add_pip_package(\"joblib\")\n",
"env.python.conda_dependencies.add_pip_package(\"numpy\")\n",
"env.python.conda_dependencies.add_pip_package(\"scikit-learn=={}\".format(sklearn.__version__))"
]
},
{
@@ -281,25 +310,15 @@
},
"outputs": [],
"source": [
"from azureml.core.webservice import AciWebservice, Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"from azureml.core.webservice import AciWebservice\n",
"\n",
"aci_service_name = \"aciservice-multimodel\"\n",
"\n",
"deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
"aci_service_name = 'aciservice-multimodel'\n",
"\n",
"try:\n",
" # if you want to get existing service below is the command\n",
" # since aci name needs to be unique in subscription deleting existing aci if any\n",
" # we use aci_service_name to create azure aci\n",
" service = Webservice(ws, name=aci_service_name)\n",
" if service:\n",
" service.delete()\n",
"except WebserviceException as e:\n",
" print()\n",
"\n",
"service = Model.deploy(ws, aci_service_name, [my_model_1, my_model_2], inference_config, deployment_config)\n",
"\n",
"service = Model.deploy(ws, aci_service_name, [my_model_1, my_model_2], inference_config, deployment_config, overwrite=True)\n",
"service.wait_for_deployment(True)\n",
"\n",
"print(service.state)"
]
},
@@ -317,13 +336,11 @@
"outputs": [],
"source": [
"import json\n",
"test_sample = json.dumps({'data': [\n",
" [1,2,3,4,5,6,7,8,9,10], \n",
" [10,9,8,7,6,5,4,3,2,1]\n",
"]})\n",
"\n",
"test_sample_encoded = bytes(test_sample, encoding='utf8')\n",
"prediction = service.run(input_data=test_sample_encoded)\n",
"test_sample = json.dumps({'data': x[0:2].tolist()})\n",
"\n",
"prediction = service.run(test_sample)\n",
"\n",
"print(prediction)"
]
},
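The same prediction can also be fetched over plain HTTP (ACI deployments have authentication disabled by default); a minimal sketch:

import requests

headers = {'Content-Type': 'application/json'}
resp = requests.post(service.scoring_uri, data=test_sample, headers=headers)
print(resp.json())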

View File

@@ -2,3 +2,5 @@ name: multi-model-register-and-deploy
dependencies:
- pip:
- azureml-sdk
- numpy
- scikit-learn

View File

@@ -1,8 +0,0 @@
name: project_environment
dependencies:
- python=3.6.2
- pip:
- azureml-defaults
- scikit-learn
- numpy
- inference-schema[numpy-support]

View File

@@ -1,442 +0,0 @@
(442 rows of 10-column normalized feature values from the scikit-learn diabetes dataset; deleted data file, contents omitted)
-9.632801625429950054e-02,-4.464163650698899782e-02,-6.979686649478139548e-02,-6.764228304218700139e-02,-1.945634697682600139e-02,-1.070833127990459925e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-4.687948284421659950e-02,-7.977772888232589898e-02
1.628067572730669890e-02,5.068011873981870252e-02,-2.129532317014089932e-02,-9.113481248670509197e-03,3.420581449301800248e-02,4.785043107473799934e-02,7.788079970179680352e-04,-2.592261998182820038e-03,-1.290794225416879923e-02,2.377494398854190089e-02
-4.183993948900609910e-02,5.068011873981870252e-02,-5.362968538656789907e-02,-4.009931749229690007e-02,-8.412613131227909824e-02,-7.177228132886340206e-02,-2.902829807069099918e-03,-3.949338287409189657e-02,-7.212845460195599356e-02,-3.007244590430930078e-02
-7.453278554818210111e-02,-4.464163650698899782e-02,4.337340126271319735e-02,-3.321357610482440076e-02,1.219056876180000040e-02,2.518648827290310109e-04,6.336665066649820044e-02,-3.949338287409189657e-02,-2.712864555432650121e-02,-4.664087356364819692e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,5.630714614928399725e-02,-3.665644679856060184e-02,-4.835135699904979933e-02,-4.296262284422640298e-02,-7.285394808472339667e-02,3.799897096531720114e-02,5.078151336297320045e-02,5.691179930721949887e-02
-9.269547780327989928e-02,-4.464163650698899782e-02,-8.165279930747129655e-02,-5.731367096097819691e-02,-6.073493272285990230e-02,-6.801449978738899338e-02,4.864009945014990260e-02,-7.639450375000099436e-02,-6.648814822283539983e-02,-2.178823207463989955e-02
5.383060374248070309e-03,-4.464163650698899782e-02,4.984027370599859730e-02,9.761551025715360652e-02,-1.532848840222260020e-02,-1.634500359211620013e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,1.703713241477999851e-02,-1.350401824497050006e-02
3.444336798240450054e-02,5.068011873981870252e-02,1.112755619172099975e-01,7.695828609473599757e-02,-3.183992270063620150e-02,-3.388131745233000092e-02,-2.131101882750449997e-02,-2.592261998182820038e-03,2.801650652326400162e-02,7.348022696655839847e-02
2.354575262934580082e-02,-4.464163650698899782e-02,6.169620651868849837e-02,5.285819123858220142e-02,-3.459182841703849903e-02,-4.891244361822749687e-02,-2.867429443567860031e-02,-2.592261998182820038e-03,5.472400334817909689e-02,-5.219804415301099697e-03
4.170844488444359899e-02,5.068011873981870252e-02,1.427247526792889930e-02,4.252957915737339695e-02,-3.046396984243510131e-02,-1.313877426218630021e-03,-4.340084565202689815e-02,-2.592261998182820038e-03,-3.324878724762579674e-02,1.549073015887240078e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,4.768464955823679963e-02,-4.698505887976939938e-02,3.420581449301800248e-02,5.724488492842390308e-02,-8.021722369289760457e-02,1.302517731550900115e-01,4.506616833626150148e-02,1.314697237742440128e-01
4.170844488444359899e-02,5.068011873981870252e-02,1.211685112016709989e-02,3.908670846363720280e-02,5.484510736603499803e-02,4.440579799505309927e-02,4.460445801105040325e-03,-2.592261998182820038e-03,4.560080841412490066e-02,-1.077697500466389974e-03
-3.094232413594750000e-02,-4.464163650698899782e-02,5.649978676881649634e-03,-9.113481248670509197e-03,1.907033305280559851e-02,6.827982580309210209e-03,7.441156407875940126e-02,-3.949338287409189657e-02,-4.118038518800790082e-02,-4.249876664881350324e-02
3.081082953138499989e-02,5.068011873981870252e-02,4.660683748435590079e-02,-1.599922263614299983e-02,2.044628591100669870e-02,5.066876723084379891e-02,-5.812739686837520292e-02,7.120997975363539678e-02,6.209315616505399656e-03,7.206516329203029904e-03
-4.183993948900609910e-02,-4.464163650698899782e-02,1.285205550993039902e-01,6.318680331979099896e-02,-3.321587555883730170e-02,-3.262872360517189707e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,-5.078298047848289754e-02
-3.094232413594750000e-02,5.068011873981870252e-02,5.954058237092670069e-02,1.215130832538269907e-03,1.219056876180000040e-02,3.156671106168230240e-02,-4.340084565202689815e-02,3.430885887772629900e-02,1.482271084126630077e-02,7.206516329203029904e-03
-5.637009329308430294e-02,-4.464163650698899782e-02,9.295275666123460623e-02,-1.944209332987930153e-02,1.494247447820220079e-02,2.342485105515439842e-02,-2.867429443567860031e-02,2.545258986750810123e-02,2.605608963368469949e-02,4.034337164788070335e-02
-6.000263174410389727e-02,5.068011873981870252e-02,1.535028734180979987e-02,-1.944209332987930153e-02,3.695772020942030001e-02,4.816357953652750101e-02,1.918699701745330000e-02,-2.592261998182820038e-03,-3.075120986455629965e-02,-1.077697500466389974e-03
-4.910501639104519755e-02,5.068011873981870252e-02,-5.128142061927360405e-03,-4.698505887976939938e-02,-2.083229983502719873e-02,-2.041593359538010008e-02,-6.917231028063640375e-02,7.120997975363539678e-02,6.123790751970099866e-02,-3.835665973397880263e-02
2.354575262934580082e-02,-4.464163650698899782e-02,7.031870310973570293e-02,2.531522568869210010e-02,-3.459182841703849903e-02,-1.446611282137899926e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,-1.919704761394450121e-02,-9.361911330135799444e-03
1.750521923228520000e-03,-4.464163650698899782e-02,-4.050329988046450294e-03,-5.670610554934250001e-03,-8.448724111216979540e-03,-2.386056667506489953e-02,5.232173725423699961e-02,-3.949338287409189657e-02,-8.944018957797799166e-03,-1.350401824497050006e-02
-3.457486258696700065e-02,5.068011873981870252e-02,-8.168937664037369826e-04,7.007254470726349826e-02,3.970962592582259754e-02,6.695248724389940564e-02,-6.549067247654929980e-02,1.081111006295440019e-01,2.671425763351279944e-02,7.348022696655839847e-02
4.170844488444359899e-02,5.068011873981870252e-02,-4.392937672163980262e-02,6.318680331979099896e-02,-4.320865536613589623e-03,1.622243643399520069e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-3.452371533034950118e-02,1.134862324403770016e-02
6.713621404158050254e-02,5.068011873981870252e-02,2.073934771121430098e-02,-5.670610554934250001e-03,2.044628591100669870e-02,2.624318721126020146e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,8.640282933063080789e-03,3.064409414368320182e-03
-2.730978568492789874e-02,5.068011873981870252e-02,6.061839444480759953e-02,4.941532054484590319e-02,8.511607024645979902e-02,8.636769187485039689e-02,-2.902829807069099918e-03,3.430885887772629900e-02,3.781447882634390162e-02,4.862758547755009764e-02
-1.641217033186929963e-02,-4.464163650698899782e-02,-1.051720243133190055e-02,1.215130832538269907e-03,-3.734373413344069942e-02,-3.576020822306719832e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-2.139368094035999993e-02,-3.421455281914410201e-02
-1.882016527791040067e-03,5.068011873981870252e-02,-3.315125598283080038e-02,-1.829446977677679984e-02,3.145390877661580209e-02,4.284005568610550069e-02,-1.394774321933030074e-02,1.991742173612169944e-02,1.022564240495780000e-02,2.791705090337660150e-02
-1.277963188084970010e-02,-4.464163650698899782e-02,-6.548561819925780014e-02,-6.993753018282070077e-02,1.182945896190920002e-03,1.684873335757430118e-02,-2.902829807069099918e-03,-7.020396503291909812e-03,-3.075120986455629965e-02,-5.078298047848289754e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,4.337340126271319735e-02,8.728689817594480205e-02,1.356652162000110060e-02,7.141131042098750048e-03,-1.394774321933030074e-02,-2.592261998182820038e-03,4.234489544960749752e-02,-1.764612515980519894e-02
-9.147093429830140468e-03,-4.464163650698899782e-02,-6.225218197761509670e-02,-7.452802442965950069e-02,-2.358420555142939912e-02,-1.321351897422090062e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-3.581672810154919867e-02,-4.664087356364819692e-02
-4.547247794002570037e-02,5.068011873981870252e-02,6.385183066645029604e-02,7.007254470726349826e-02,1.332744202834990066e-01,1.314610703725430096e-01,-3.971920784793980114e-02,1.081111006295440019e-01,7.573758845754760549e-02,8.590654771106250032e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,3.043965637614240091e-02,-7.452802442965950069e-02,-2.358420555142939912e-02,-1.133462820348369975e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,-3.075120986455629965e-02,-1.077697500466389974e-03
1.628067572730669890e-02,5.068011873981870252e-02,7.247432725749750060e-02,7.695828609473599757e-02,-8.448724111216979540e-03,5.575388733151089883e-03,-6.584467611156170040e-03,-2.592261998182820038e-03,-2.364455757213410059e-02,6.105390622205419948e-02
4.534098333546320025e-02,-4.464163650698899782e-02,-1.913969902237900103e-02,2.187235499495579841e-02,2.732605020201240090e-02,-1.352666743601040056e-02,1.001830287073690040e-01,-3.949338287409189657e-02,1.776347786711730131e-02,-1.350401824497050006e-02
-4.183993948900609910e-02,-4.464163650698899782e-02,-6.656343027313869898e-02,-4.698505887976939938e-02,-3.734373413344069942e-02,-4.327577130601600180e-02,4.864009945014990260e-02,-3.949338287409189657e-02,-5.615757309500619965e-02,-1.350401824497050006e-02
-5.637009329308430294e-02,5.068011873981870252e-02,-6.009655782985329903e-02,-3.665644679856060184e-02,-8.825398988688250290e-02,-7.083283594349480683e-02,-1.394774321933030074e-02,-3.949338287409189657e-02,-7.814091066906959926e-02,-1.046303703713340055e-01
7.076875249260000666e-02,-4.464163650698899782e-02,6.924089103585480409e-02,3.793908501382069892e-02,2.182223876920789951e-02,1.504458729887179960e-03,-3.603757004385269719e-02,3.910600459159439823e-02,7.763278919555950675e-02,1.066170822852360034e-01
1.750521923228520000e-03,5.068011873981870252e-02,5.954058237092670069e-02,-2.227739861197989939e-03,6.172487165704060308e-02,6.319470570242499696e-02,-5.812739686837520292e-02,1.081111006295440019e-01,6.898221163630259556e-02,1.273276168594099922e-01
-1.882016527791040067e-03,-4.464163650698899782e-02,-2.668438353954540043e-02,4.941532054484590319e-02,5.897296594063840269e-02,-1.603185513032660131e-02,-4.708248345611389801e-02,7.120997975363539678e-02,1.335989800130079896e-01,1.963283707370720027e-02
2.354575262934580082e-02,5.068011873981870252e-02,-2.021751109626000048e-02,-3.665644679856060184e-02,-1.395253554402150001e-02,-1.509240974495799914e-02,5.968501286241110343e-02,-3.949338287409189657e-02,-9.643322289178400675e-02,-1.764612515980519894e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,-4.608500086940160029e-02,-9.862811928581330378e-02,-7.587041416307230279e-02,-5.987263978086120042e-02,-1.762938102341739949e-02,-3.949338287409189657e-02,-5.140053526058249722e-02,-4.664087356364819692e-02
4.170844488444359899e-02,5.068011873981870252e-02,7.139651518361660176e-02,8.100872220010799790e-03,3.833367306762140020e-02,1.590928797220559840e-02,-1.762938102341739949e-02,3.430885887772629900e-02,7.341007804911610368e-02,8.590654771106250032e-02
-6.363517019512339445e-02,5.068011873981870252e-02,-7.949717515970949888e-02,-5.670610554934250001e-03,-7.174255558846899528e-02,-6.644875747844139480e-02,-1.026610541524320026e-02,-3.949338287409189657e-02,-1.811826730789670159e-02,-5.492508739331759815e-02
1.628067572730669890e-02,5.068011873981870252e-02,9.961226972405269262e-03,-4.354218818603310115e-02,-9.650970703608929835e-02,-9.463211903949929338e-02,-3.971920784793980114e-02,-3.949338287409189657e-02,1.703713241477999851e-02,7.206516329203029904e-03
6.713621404158050254e-02,-4.464163650698899782e-02,-3.854031635223530150e-02,-2.632783471735180084e-02,-3.183992270063620150e-02,-2.636575436938120090e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-2.712864555432650121e-02,3.064409414368320182e-03
4.534098333546320025e-02,5.068011873981870252e-02,1.966153563733339868e-02,3.908670846363720280e-02,2.044628591100669870e-02,2.593003874947069978e-02,8.142083605192099172e-03,-2.592261998182820038e-03,-3.303712578676999863e-03,1.963283707370720027e-02
4.897352178648269744e-02,-4.464163650698899782e-02,2.720622015449970094e-02,-2.518021116424929914e-02,2.319819162740899970e-02,1.841447566652189977e-02,-6.180903467246220279e-02,8.006624876385350087e-02,7.222365081991240221e-02,3.205915781821130212e-02
4.170844488444359899e-02,-4.464163650698899782e-02,-8.361578283570040432e-03,-2.632783471735180084e-02,2.457414448561009990e-02,1.622243643399520069e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-4.836172480289190057e-02,-3.007244590430930078e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,-1.590626280073640167e-02,-1.255635194240680048e-02,2.044628591100669870e-02,4.127431337715779802e-02,-4.340084565202689815e-02,3.430885887772629900e-02,1.407245251576850001e-02,-9.361911330135799444e-03
-3.820740103798660192e-02,5.068011873981870252e-02,4.572166603000769880e-03,3.564383776990089764e-02,-1.120062982761920074e-02,5.888537194940629722e-03,-4.708248345611389801e-02,3.430885887772629900e-02,1.630495279994180133e-02,-1.077697500466389974e-03
4.897352178648269744e-02,-4.464163650698899782e-02,-4.285156464775889684e-02,-5.387080026724189868e-02,4.521343735862710239e-02,5.004247030726469841e-02,3.391354823380159783e-02,-2.592261998182820038e-03,-2.595242443518940012e-02,-6.320930122298699938e-02
4.534098333546320025e-02,5.068011873981870252e-02,5.649978676881649634e-03,5.630106193231849965e-02,6.447677737344290061e-02,8.918602803095619647e-02,-3.971920784793980114e-02,7.120997975363539678e-02,1.556684454070180086e-02,-9.361911330135799444e-03
4.534098333546320025e-02,5.068011873981870252e-02,-3.530688013059259805e-02,6.318680331979099896e-02,-4.320865536613589623e-03,-1.627025888008149911e-03,-1.026610541524320026e-02,-2.592261998182820038e-03,1.556684454070180086e-02,5.691179930721949887e-02
1.628067572730669890e-02,-4.464163650698899782e-02,2.397278393285700096e-02,-2.288496402361559975e-02,-2.496015840963049931e-02,-2.605260590759169922e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,3.723201120896890010e-02,3.205915781821130212e-02
-7.453278554818210111e-02,5.068011873981870252e-02,-1.806188694849819934e-02,8.100872220010799790e-03,-1.945634697682600139e-02,-2.480001206043359885e-02,-6.549067247654929980e-02,3.430885887772629900e-02,6.731721791468489591e-02,-1.764612515980519894e-02
-8.179786245022120650e-02,5.068011873981870252e-02,4.229558918883229851e-02,-1.944209332987930153e-02,3.970962592582259754e-02,5.755803339021339782e-02,-6.917231028063640375e-02,1.081111006295440019e-01,4.718616788601970313e-02,-3.835665973397880263e-02
-6.726770864614299572e-02,-4.464163650698899782e-02,-5.470749746044879791e-02,-2.632783471735180084e-02,-7.587041416307230279e-02,-8.210618056791800512e-02,4.864009945014990260e-02,-7.639450375000099436e-02,-8.682899321629239386e-02,-1.046303703713340055e-01
5.383060374248070309e-03,-4.464163650698899782e-02,-2.972517914165530208e-03,4.941532054484590319e-02,7.410844738085080319e-02,7.071026878537380045e-02,4.495846164606279866e-02,-2.592261998182820038e-03,-1.498586820292070049e-03,-9.361911330135799444e-03
-1.882016527791040067e-03,-4.464163650698899782e-02,-6.656343027313869898e-02,1.215130832538269907e-03,-2.944912678412469915e-03,3.070201038834840124e-03,1.182372140927919965e-02,-2.592261998182820038e-03,-2.028874775162960165e-02,-2.593033898947460017e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-1.267282657909369996e-02,2.875809638242839833e-02,-1.808039411862490120e-02,-5.071658967693000106e-03,-4.708248345611389801e-02,3.430885887772629900e-02,2.337484127982079885e-02,-5.219804415301099697e-03
-5.514554978810590376e-03,5.068011873981870252e-02,-4.177375257387799801e-02,-4.354218818603310115e-02,-7.999827273767569358e-02,-7.615635979391689736e-02,-3.235593223976569732e-02,-3.949338287409189657e-02,1.022564240495780000e-02,-9.361911330135799444e-03
5.623859868852180283e-02,5.068011873981870252e-02,-3.099563183506899924e-02,8.100872220010799790e-03,1.907033305280559851e-02,2.123281182262769934e-02,3.391354823380159783e-02,-3.949338287409189657e-02,-2.952762274177360077e-02,-5.906719430815229877e-02
9.015598825267629943e-03,5.068011873981870252e-02,-5.128142061927360405e-03,-6.419941234845069622e-02,6.998058880624739853e-02,8.386250418053420308e-02,-3.971920784793980114e-02,7.120997975363539678e-02,3.953987807202419963e-02,1.963283707370720027e-02
-6.726770864614299572e-02,-4.464163650698899782e-02,-5.901874575597240019e-02,3.220096707616459941e-02,-5.110326271545199972e-02,-4.953874054180659736e-02,-1.026610541524320026e-02,-3.949338287409189657e-02,2.007840549823790115e-03,2.377494398854190089e-02
2.717829108036539862e-02,5.068011873981870252e-02,2.505059600673789980e-02,1.498661360748330083e-02,2.595009734381130070e-02,4.847672799831700269e-02,-3.971920784793980114e-02,3.430885887772629900e-02,7.837142301823850701e-03,2.377494398854190089e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,-4.608500086940160029e-02,-3.321357610482440076e-02,3.282986163481690228e-02,3.626393798852529937e-02,3.759518603788870178e-02,-2.592261998182820038e-03,-3.324878724762579674e-02,1.134862324403770016e-02
4.897352178648269744e-02,5.068011873981870252e-02,3.494354529119849794e-03,7.007254470726349826e-02,-8.448724111216979540e-03,1.340410027788939938e-02,-5.444575906428809897e-02,3.430885887772629900e-02,1.331596790892770020e-02,3.620126473304600273e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,5.415152200152219958e-02,-2.632783471735180084e-02,-5.523112129005539744e-02,-3.388131745233000092e-02,-1.394774321933030074e-02,-3.949338287409189657e-02,-7.408887149153539631e-02,-5.906719430815229877e-02
4.170844488444359899e-02,-4.464163650698899782e-02,-4.500718879552070145e-02,3.449621432008449784e-02,4.383748450042589812e-02,-1.571870666853709964e-02,3.759518603788870178e-02,-1.440062067847370023e-02,8.989869327767099905e-02,7.206516329203029904e-03
5.623859868852180283e-02,-4.464163650698899782e-02,-5.794093368209150136e-02,-7.965857695567990157e-03,5.209320164963270050e-02,4.910302492189610318e-02,5.600337505832399948e-02,-2.141183364489639834e-02,-2.832024254799870092e-02,4.448547856271539702e-02
-3.457486258696700065e-02,5.068011873981870252e-02,-5.578530953432969675e-02,-1.599922263614299983e-02,-9.824676969418109224e-03,-7.889995123798789270e-03,3.759518603788870178e-02,-3.949338287409189657e-02,-5.295879323920039961e-02,2.791705090337660150e-02
8.166636784565869944e-02,5.068011873981870252e-02,1.338730381358059929e-03,3.564383776990089764e-02,1.263946559924939983e-01,9.106491880169340081e-02,1.918699701745330000e-02,3.430885887772629900e-02,8.449528221240310000e-02,-3.007244590430930078e-02
-1.882016527791040067e-03,5.068011873981870252e-02,3.043965637614240091e-02,5.285819123858220142e-02,3.970962592582259754e-02,5.661858800484489973e-02,-3.971920784793980114e-02,7.120997975363539678e-02,2.539313491544940155e-02,2.791705090337660150e-02
1.107266754538149961e-01,5.068011873981870252e-02,6.727790750762559745e-03,2.875809638242839833e-02,-2.771206412603280031e-02,-7.263698200219739949e-03,-4.708248345611389801e-02,3.430885887772629900e-02,2.007840549823790115e-03,7.762233388139309909e-02
-3.094232413594750000e-02,-4.464163650698899782e-02,4.660683748435590079e-02,1.498661360748330083e-02,-1.670444126042380101e-02,-4.703355284749029946e-02,7.788079970179680352e-04,-2.592261998182820038e-03,6.345592137206540473e-02,-2.593033898947460017e-02
1.750521923228520000e-03,5.068011873981870252e-02,2.612840808061879863e-02,-9.113481248670509197e-03,2.457414448561009990e-02,3.845597722105199845e-02,-2.131101882750449997e-02,3.430885887772629900e-02,9.436409146079870192e-03,3.064409414368320182e-03
9.015598825267629943e-03,-4.464163650698899782e-02,4.552902541047500196e-02,2.875809638242839833e-02,1.219056876180000040e-02,-1.383981589779990050e-02,2.655027262562750096e-02,-3.949338287409189657e-02,4.613233103941480340e-02,3.620126473304600273e-02
3.081082953138499989e-02,-4.464163650698899782e-02,4.013996504107050084e-02,7.695828609473599757e-02,1.769438019460449832e-02,3.782968029747289795e-02,-2.867429443567860031e-02,3.430885887772629900e-02,-1.498586820292070049e-03,1.190434030297399942e-01
3.807590643342410180e-02,5.068011873981870252e-02,-1.806188694849819934e-02,6.662967401352719310e-02,-5.110326271545199972e-02,-1.665815205390569834e-02,-7.653558588881050062e-02,3.430885887772629900e-02,-1.190068480150809939e-02,-1.350401824497050006e-02
9.015598825267629943e-03,-4.464163650698899782e-02,1.427247526792889930e-02,1.498661360748330083e-02,5.484510736603499803e-02,4.722413415115889884e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-3.324878724762579674e-02,-5.906719430815229877e-02
9.256398319871740610e-02,-4.464163650698899782e-02,3.690652881942779739e-02,2.187235499495579841e-02,-2.496015840963049931e-02,-1.665815205390569834e-02,7.788079970179680352e-04,-3.949338287409189657e-02,-2.251217192966049885e-02,-2.178823207463989955e-02
6.713621404158050254e-02,-4.464163650698899782e-02,3.494354529119849794e-03,3.564383776990089764e-02,4.934129593323050011e-02,3.125356259989280072e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-6.092541861022970299e-04,1.963283707370720027e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-7.087467856866229432e-02,-2.288496402361559975e-02,-1.568959820211340015e-03,-1.000728964429089965e-03,2.655027262562750096e-02,-3.949338287409189657e-02,-2.251217192966049885e-02,7.206516329203029904e-03
3.081082953138499989e-02,-4.464163650698899782e-02,-3.315125598283080038e-02,-2.288496402361559975e-02,-4.697540414084860200e-02,-8.116673518254939601e-02,1.038646665114559969e-01,-7.639450375000099436e-02,-3.980959436433750137e-02,-5.492508739331759815e-02
2.717829108036539862e-02,5.068011873981870252e-02,9.403056873511560221e-02,9.761551025715360652e-02,-3.459182841703849903e-02,-3.200242668159279658e-02,-4.340084565202689815e-02,-2.592261998182820038e-03,3.664579779339879884e-02,1.066170822852360034e-01
1.264813727628719998e-02,5.068011873981870252e-02,3.582871674554689856e-02,4.941532054484590319e-02,5.346915450783389784e-02,7.415490186505870052e-02,-6.917231028063640375e-02,1.450122215054540087e-01,4.560080841412490066e-02,4.862758547755009764e-02
7.440129094361959405e-02,-4.464163650698899782e-02,3.151746845002330322e-02,1.010583809508899950e-01,4.658939021682820258e-02,3.689023491210430272e-02,1.550535921336619952e-02,-2.592261998182820038e-03,3.365681290238470291e-02,4.448547856271539702e-02
-4.183993948900609910e-02,-4.464163650698899782e-02,-6.548561819925780014e-02,-4.009931749229690007e-02,-5.696818394814720174e-03,1.434354566325799982e-02,-4.340084565202689815e-02,3.430885887772629900e-02,7.026862549151949647e-03,-1.350401824497050006e-02
-8.906293935226029801e-02,-4.464163650698899782e-02,-4.177375257387799801e-02,-1.944209332987930153e-02,-6.623874415566440021e-02,-7.427746902317970690e-02,8.142083605192099172e-03,-3.949338287409189657e-02,1.143797379512540100e-03,-3.007244590430930078e-02
2.354575262934580082e-02,5.068011873981870252e-02,-3.961812842611620034e-02,-5.670610554934250001e-03,-4.835135699904979933e-02,-3.325502052875090042e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.016435479455120028e-01,-6.735140813782170000e-02
-4.547247794002570037e-02,-4.464163650698899782e-02,-3.854031635223530150e-02,-2.632783471735180084e-02,-1.532848840222260020e-02,8.781618063081050515e-04,-3.235593223976569732e-02,-2.592261998182820038e-03,1.143797379512540100e-03,-3.835665973397880263e-02
-2.367724723390840155e-02,5.068011873981870252e-02,-2.560657146566450160e-02,4.252957915737339695e-02,-5.385516843185429725e-02,-4.765984977106939996e-02,-2.131101882750449997e-02,-3.949338287409189657e-02,1.143797379512540100e-03,1.963283707370720027e-02
-9.996055470531900466e-02,-4.464163650698899782e-02,-2.345094731790270046e-02,-6.419941234845069622e-02,-5.798302700645770191e-02,-6.018578824265070210e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.811826730789670159e-02,-5.078298047848289754e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,-6.656343027313869898e-02,-1.123996020607579971e-01,-4.972730985725089953e-02,-4.139688053527879746e-02,7.788079970179680352e-04,-3.949338287409189657e-02,-3.581672810154919867e-02,-9.361911330135799444e-03
3.081082953138499989e-02,5.068011873981870252e-02,3.259528052390420205e-02,4.941532054484590319e-02,-4.009563984984299695e-02,-4.358891976780549654e-02,-6.917231028063640375e-02,3.430885887772629900e-02,6.301661511474640487e-02,3.064409414368320182e-03
-1.035930931563389945e-01,5.068011873981870252e-02,-4.608500086940160029e-02,-2.632783471735180084e-02,-2.496015840963049931e-02,-2.480001206043359885e-02,3.023191042971450082e-02,-3.949338287409189657e-02,-3.980959436433750137e-02,-5.492508739331759815e-02
6.713621404158050254e-02,5.068011873981870252e-02,-2.991781976118810041e-02,5.744868538213489945e-02,-1.930069620102049918e-04,-1.571870666853709964e-02,7.441156407875940126e-02,-5.056371913686460301e-02,-3.845911230135379971e-02,7.206516329203029904e-03
-5.273755484206479882e-02,-4.464163650698899782e-02,-1.267282657909369996e-02,-6.075654165471439799e-02,-1.930069620102049918e-04,8.080576427467340075e-03,1.182372140927919965e-02,-2.592261998182820038e-03,-2.712864555432650121e-02,-5.078298047848289754e-02
-2.730978568492789874e-02,5.068011873981870252e-02,-1.590626280073640167e-02,-2.977070541108809906e-02,3.934851612593179802e-03,-6.875805026395569565e-04,4.127682384197570165e-02,-3.949338287409189657e-02,-2.364455757213410059e-02,1.134862324403770016e-02
-3.820740103798660192e-02,5.068011873981870252e-02,7.139651518361660176e-02,-5.731367096097819691e-02,1.539137131565160022e-01,1.558866503921270130e-01,7.788079970179680352e-04,7.194800217115350505e-02,5.027649338998960160e-02,6.933812005172369786e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-3.099563183506899924e-02,2.187235499495579841e-02,8.062710187196569719e-03,8.706873351046409346e-03,4.460445801105040325e-03,-2.592261998182820038e-03,9.436409146079870192e-03,1.134862324403770016e-02
1.264813727628719998e-02,5.068011873981870252e-02,2.609183074771409820e-04,-1.140872838930430053e-02,3.970962592582259754e-02,5.724488492842390308e-02,-3.971920784793980114e-02,5.608052019451260223e-02,2.405258322689299982e-02,3.205915781821130212e-02
6.713621404158050254e-02,-4.464163650698899782e-02,3.690652881942779739e-02,-5.042792957350569760e-02,-2.358420555142939912e-02,-3.450761437590899733e-02,4.864009945014990260e-02,-3.949338287409189657e-02,-2.595242443518940012e-02,-3.835665973397880263e-02
4.534098333546320025e-02,-4.464163650698899782e-02,3.906215296718960200e-02,4.597244985110970211e-02,6.686757328995440036e-03,-2.417371513685449835e-02,8.142083605192099172e-03,-1.255556463467829946e-02,6.432823302367089713e-02,5.691179930721949887e-02
6.713621404158050254e-02,5.068011873981870252e-02,-1.482845072685549936e-02,5.859630917623830093e-02,-5.935897986465880211e-02,-3.450761437590899733e-02,-6.180903467246220279e-02,1.290620876969899959e-02,-5.145307980263110273e-03,4.862758547755009764e-02
2.717829108036539862e-02,-4.464163650698899782e-02,6.727790750762559745e-03,3.564383776990089764e-02,7.961225881365530110e-02,7.071026878537380045e-02,1.550535921336619952e-02,3.430885887772629900e-02,4.067226371449769728e-02,1.134862324403770016e-02
5.623859868852180283e-02,-4.464163650698899782e-02,-6.871905442090049665e-02,-6.878990659528949614e-02,-1.930069620102049918e-04,-1.000728964429089965e-03,4.495846164606279866e-02,-3.764832683029650101e-02,-4.836172480289190057e-02,-1.077697500466389974e-03
3.444336798240450054e-02,5.068011873981870252e-02,-9.439390357450949676e-03,5.974393262605470073e-02,-3.596778127523959923e-02,-7.576846662009279788e-03,-7.653558588881050062e-02,7.120997975363539678e-02,1.100810104587249955e-02,-2.178823207463989955e-02
2.354575262934580082e-02,-4.464163650698899782e-02,1.966153563733339868e-02,-1.255635194240680048e-02,8.374011738825870577e-02,3.876912568284150012e-02,6.336665066649820044e-02,-2.592261998182820038e-03,6.604820616309839409e-02,4.862758547755009764e-02
4.897352178648269744e-02,5.068011873981870252e-02,7.462995140525929827e-02,6.662967401352719310e-02,-9.824676969418109224e-03,-2.253322811587220049e-03,-4.340084565202689815e-02,3.430885887772629900e-02,3.365681290238470291e-02,1.963283707370720027e-02
3.081082953138499989e-02,5.068011873981870252e-02,-8.361578283570040432e-03,4.658001526274530187e-03,1.494247447820220079e-02,2.749578105841839898e-02,8.142083605192099172e-03,-8.127430129569179762e-03,-2.952762274177360077e-02,5.691179930721949887e-02
-1.035930931563389945e-01,5.068011873981870252e-02,-2.345094731790270046e-02,-2.288496402361559975e-02,-8.687803702868139577e-02,-6.770135132559949864e-02,-1.762938102341739949e-02,-3.949338287409189657e-02,-7.814091066906959926e-02,-7.149351505265640061e-02
1.628067572730669890e-02,5.068011873981870252e-02,-4.608500086940160029e-02,1.154374291374709975e-02,-3.321587555883730170e-02,-1.603185513032660131e-02,-1.026610541524320026e-02,-2.592261998182820038e-03,-4.398540256559110156e-02,-4.249876664881350324e-02
-6.000263174410389727e-02,5.068011873981870252e-02,5.415152200152219958e-02,-1.944209332987930153e-02,-4.972730985725089953e-02,-4.891244361822749687e-02,2.286863482154040048e-02,-3.949338287409189657e-02,-4.398540256559110156e-02,-5.219804415301099697e-03
-2.730978568492789874e-02,-4.464163650698899782e-02,-3.530688013059259805e-02,-2.977070541108809906e-02,-5.660707414825649764e-02,-5.862004593370299943e-02,3.023191042971450082e-02,-3.949338287409189657e-02,-4.986846773523059828e-02,-1.294830118603420011e-01
4.170844488444359899e-02,-4.464163650698899782e-02,-3.207344390894990155e-02,-6.190416520781699683e-02,7.961225881365530110e-02,5.098191569263330059e-02,5.600337505832399948e-02,-9.972486173364639508e-03,4.506616833626150148e-02,-5.906719430815229877e-02
-8.179786245022120650e-02,-4.464163650698899782e-02,-8.165279930747129655e-02,-4.009931749229690007e-02,2.558898754392050119e-03,-1.853704282464289921e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-1.090443584737709956e-02,-9.220404962683000083e-02
-4.183993948900609910e-02,-4.464163650698899782e-02,4.768464955823679963e-02,5.974393262605470073e-02,1.277706088506949944e-01,1.280164372928579986e-01,-2.499265663159149983e-02,1.081111006295440019e-01,6.389312063683939835e-02,4.034337164788070335e-02
-1.277963188084970010e-02,-4.464163650698899782e-02,6.061839444480759953e-02,5.285819123858220142e-02,4.796534307502930278e-02,2.937467182915549924e-02,-1.762938102341739949e-02,3.430885887772629900e-02,7.021129819331020649e-02,7.206516329203029904e-03
6.713621404158050254e-02,-4.464163650698899782e-02,5.630714614928399725e-02,7.351541540099980343e-02,-1.395253554402150001e-02,-3.920484130275200124e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,7.573758845754760549e-02,3.620126473304600273e-02
-5.273755484206479882e-02,5.068011873981870252e-02,9.834181703063900326e-02,8.728689817594480205e-02,6.034891879883950289e-02,4.878987646010649742e-02,-5.812739686837520292e-02,1.081111006295440019e-01,8.449528221240310000e-02,4.034337164788070335e-02
5.383060374248070309e-03,-4.464163650698899782e-02,5.954058237092670069e-02,-5.616604740787570216e-02,2.457414448561009990e-02,5.286080646337049799e-02,-4.340084565202689815e-02,5.091436327188540029e-02,-4.219859706946029777e-03,-3.007244590430930078e-02
8.166636784565869944e-02,-4.464163650698899782e-02,3.367309259778510089e-02,8.100872220010799790e-03,5.209320164963270050e-02,5.661858800484489973e-02,-1.762938102341739949e-02,3.430885887772629900e-02,3.486419309615960277e-02,6.933812005172369786e-02
3.081082953138499989e-02,5.068011873981870252e-02,5.630714614928399725e-02,7.695828609473599757e-02,4.934129593323050011e-02,-1.227407358885230018e-02,-3.603757004385269719e-02,7.120997975363539678e-02,1.200533820015380060e-01,9.004865462589720093e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-6.548561819925780014e-02,-5.670610554934250001e-03,-7.072771253015849857e-03,-1.947648821001150138e-02,4.127682384197570165e-02,-3.949338287409189657e-02,-3.303712578676999863e-03,7.206516329203029904e-03
-4.910501639104519755e-02,-4.464163650698899782e-02,1.608549173157310108e-01,-4.698505887976939938e-02,-2.908801698423390050e-02,-1.978963667180099958e-02,-4.708248345611389801e-02,3.430885887772629900e-02,2.801650652326400162e-02,1.134862324403770016e-02
-2.730978568492789874e-02,5.068011873981870252e-02,-5.578530953432969675e-02,2.531522568869210010e-02,-7.072771253015849857e-03,-2.354741821327540133e-02,5.232173725423699961e-02,-3.949338287409189657e-02,-5.145307980263110273e-03,-5.078298047848289754e-02
7.803382939463919532e-02,5.068011873981870252e-02,-2.452875939178359929e-02,-4.239456463293059946e-02,6.686757328995440036e-03,5.286080646337049799e-02,-6.917231028063640375e-02,8.080427118137170628e-02,-3.712834601047360072e-02,5.691179930721949887e-02
1.264813727628719998e-02,-4.464163650698899782e-02,-3.638469220447349689e-02,4.252957915737339695e-02,-1.395253554402150001e-02,1.293437758520510003e-02,-2.683347553363510038e-02,5.156973385758089994e-03,-4.398540256559110156e-02,7.206516329203029904e-03
4.170844488444359899e-02,-4.464163650698899782e-02,-8.361578283570040432e-03,-5.731367096097819691e-02,8.062710187196569719e-03,-3.137612975801370302e-02,1.517259579645879874e-01,-7.639450375000099436e-02,-8.023654024890179703e-02,-1.764612515980519894e-02
4.897352178648269744e-02,-4.464163650698899782e-02,-4.177375257387799801e-02,1.045012516446259948e-01,3.558176735121919981e-02,-2.573945744580210040e-02,1.774974225931970073e-01,-7.639450375000099436e-02,-1.290794225416879923e-02,1.549073015887240078e-02
-1.641217033186929963e-02,5.068011873981870252e-02,1.274427430254229943e-01,9.761551025715360652e-02,1.631842733640340160e-02,1.747503028115330106e-02,-2.131101882750449997e-02,3.430885887772629900e-02,3.486419309615960277e-02,3.064409414368320182e-03
-7.453278554818210111e-02,5.068011873981870252e-02,-7.734155101194770121e-02,-4.698505887976939938e-02,-4.697540414084860200e-02,-3.262872360517189707e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-7.212845460195599356e-02,-1.764612515980519894e-02
3.444336798240450054e-02,5.068011873981870252e-02,2.828403222838059977e-02,-3.321357610482440076e-02,-4.559945128264750180e-02,-9.768885894535990141e-03,-5.076412126020100196e-02,-2.592261998182820038e-03,-5.947269741072230137e-02,-2.178823207463989955e-02
-3.457486258696700065e-02,5.068011873981870252e-02,-2.560657146566450160e-02,-1.714684618924559867e-02,1.182945896190920002e-03,-2.879619735166290186e-03,8.142083605192099172e-03,-1.550765430475099967e-02,1.482271084126630077e-02,4.034337164788070335e-02
-5.273755484206479882e-02,5.068011873981870252e-02,-6.225218197761509670e-02,1.154374291374709975e-02,-8.448724111216979540e-03,-3.669965360843580049e-02,1.222728555318910032e-01,-7.639450375000099436e-02,-8.682899321629239386e-02,3.064409414368320182e-03
5.987113713954139715e-02,-4.464163650698899782e-02,-8.168937664037369826e-04,-8.485663651086830517e-02,7.548440023905199359e-02,7.947842571548069390e-02,4.460445801105040325e-03,3.430885887772629900e-02,2.337484127982079885e-02,2.791705090337660150e-02
6.350367559056099842e-02,5.068011873981870252e-02,8.864150836571099701e-02,7.007254470726349826e-02,2.044628591100669870e-02,3.751653183568340322e-02,-5.076412126020100196e-02,7.120997975363539678e-02,2.930041326858690010e-02,7.348022696655839847e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-3.207344390894990155e-02,-2.632783471735180084e-02,4.246153164222479792e-02,-1.039518281811509931e-02,1.590892335727620011e-01,-7.639450375000099436e-02,-1.190068480150809939e-02,-3.835665973397880263e-02
5.383060374248070309e-03,5.068011873981870252e-02,3.043965637614240091e-02,8.384402748220859403e-02,-3.734373413344069942e-02,-4.734670130927989828e-02,1.550535921336619952e-02,-3.949338287409189657e-02,8.640282933063080789e-03,1.549073015887240078e-02
3.807590643342410180e-02,5.068011873981870252e-02,8.883414898524360018e-03,4.252957915737339695e-02,-4.284754556624519733e-02,-2.104223051895920057e-02,-3.971920784793980114e-02,-2.592261998182820038e-03,-1.811826730789670159e-02,7.206516329203029904e-03
1.264813727628719998e-02,-4.464163650698899782e-02,6.727790750762559745e-03,-5.616604740787570216e-02,-7.587041416307230279e-02,-6.644875747844139480e-02,-2.131101882750449997e-02,-3.764832683029650101e-02,-1.811826730789670159e-02,-9.220404962683000083e-02
7.440129094361959405e-02,5.068011873981870252e-02,-2.021751109626000048e-02,4.597244985110970211e-02,7.410844738085080319e-02,3.281930490884039930e-02,-3.603757004385269719e-02,7.120997975363539678e-02,1.063542767417259977e-01,3.620126473304600273e-02
1.628067572730669890e-02,-4.464163650698899782e-02,-2.452875939178359929e-02,3.564383776990089764e-02,-7.072771253015849857e-03,-3.192768196955810076e-03,-1.394774321933030074e-02,-2.592261998182820038e-03,1.556684454070180086e-02,1.549073015887240078e-02
-5.514554978810590376e-03,5.068011873981870252e-02,-1.159501450521270051e-02,1.154374291374709975e-02,-2.220825269322829892e-02,-1.540555820674759969e-02,-2.131101882750449997e-02,-2.592261998182820038e-03,1.100810104587249955e-02,6.933812005172369786e-02
1.264813727628719998e-02,-4.464163650698899782e-02,2.612840808061879863e-02,6.318680331979099896e-02,1.250187031342930022e-01,9.169121572527250130e-02,6.336665066649820044e-02,-2.592261998182820038e-03,5.757285620242599822e-02,-2.178823207463989955e-02
-3.457486258696700065e-02,-4.464163650698899782e-02,-5.901874575597240019e-02,1.215130832538269907e-03,-5.385516843185429725e-02,-7.803525056465400456e-02,6.704828847058519337e-02,-7.639450375000099436e-02,-2.139368094035999993e-02,1.549073015887240078e-02
6.713621404158050254e-02,5.068011873981870252e-02,-3.638469220447349689e-02,-8.485663651086830517e-02,-7.072771253015849857e-03,1.966706951368000014e-02,-5.444575906428809897e-02,3.430885887772629900e-02,1.143797379512540100e-03,3.205915781821130212e-02
3.807590643342410180e-02,5.068011873981870252e-02,-2.452875939178359929e-02,4.658001526274530187e-03,-2.633611126783170012e-02,-2.636575436938120090e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,-2.593033898947460017e-02
9.015598825267629943e-03,5.068011873981870252e-02,1.858372356345249984e-02,3.908670846363720280e-02,1.769438019460449832e-02,1.058576412178359981e-02,1.918699701745330000e-02,-2.592261998182820038e-03,1.630495279994180133e-02,-1.764612515980519894e-02
-9.269547780327989928e-02,5.068011873981870252e-02,-9.027529589851850111e-02,-5.731367096097819691e-02,-2.496015840963049931e-02,-3.043668437264510085e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,2.405258322689299982e-02,3.064409414368320182e-03
7.076875249260000666e-02,-4.464163650698899782e-02,-5.128142061927360405e-03,-5.670610554934250001e-03,8.786797596286209655e-02,1.029645603496960049e-01,1.182372140927919965e-02,3.430885887772629900e-02,-8.944018957797799166e-03,2.791705090337660150e-02
-1.641217033186929963e-02,-4.464163650698899782e-02,-5.255187331268700024e-02,-3.321357610482440076e-02,-4.422349842444640161e-02,-3.638650514664620167e-02,1.918699701745330000e-02,-3.949338287409189657e-02,-6.832974362442149896e-02,-3.007244590430930078e-02
4.170844488444359899e-02,5.068011873981870252e-02,-2.237313524402180162e-02,2.875809638242839833e-02,-6.623874415566440021e-02,-4.515466207675319921e-02,-6.180903467246220279e-02,-2.592261998182820038e-03,2.863770518940129874e-03,-5.492508739331759815e-02
1.264813727628719998e-02,-4.464163650698899782e-02,-2.021751109626000048e-02,-1.599922263614299983e-02,1.219056876180000040e-02,2.123281182262769934e-02,-7.653558588881050062e-02,1.081111006295440019e-01,5.988072306548120061e-02,-2.178823207463989955e-02
-3.820740103798660192e-02,-4.464163650698899782e-02,-5.470749746044879791e-02,-7.797089512339580586e-02,-3.321587555883730170e-02,-8.649025903297140327e-02,1.406810445523269948e-01,-7.639450375000099436e-02,-1.919704761394450121e-02,-5.219804415301099697e-03
4.534098333546320025e-02,-4.464163650698899782e-02,-6.205954135808240159e-03,-1.599922263614299983e-02,1.250187031342930022e-01,1.251981011367520047e-01,1.918699701745330000e-02,3.430885887772629900e-02,3.243322577960189995e-02,-5.219804415301099697e-03
7.076875249260000666e-02,5.068011873981870252e-02,-1.698407487461730050e-02,2.187235499495579841e-02,4.383748450042589812e-02,5.630543954305530091e-02,3.759518603788870178e-02,-2.592261998182820038e-03,-7.020931272868760620e-02,-1.764612515980519894e-02
-7.453278554818210111e-02,5.068011873981870252e-02,5.522933407540309841e-02,-4.009931749229690007e-02,5.346915450783389784e-02,5.317395492515999966e-02,-4.340084565202689815e-02,7.120997975363539678e-02,6.123790751970099866e-02,-3.421455281914410201e-02
5.987113713954139715e-02,5.068011873981870252e-02,7.678557555302109594e-02,2.531522568869210010e-02,1.182945896190920002e-03,1.684873335757430118e-02,-5.444575906428809897e-02,3.430885887772629900e-02,2.993564839653250001e-02,4.448547856271539702e-02
7.440129094361959405e-02,-4.464163650698899782e-02,1.858372356345249984e-02,6.318680331979099896e-02,6.172487165704060308e-02,4.284005568610550069e-02,8.142083605192099172e-03,-2.592261998182820038e-03,5.803912766389510147e-02,-5.906719430815229877e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-2.237313524402180162e-02,-3.206595255172180192e-02,-4.972730985725089953e-02,-6.864079671096809387e-02,7.809320188284639419e-02,-7.085933561861459951e-02,-6.291294991625119570e-02,-3.835665973397880263e-02
-7.090024709716259699e-02,-4.464163650698899782e-02,9.295275666123460623e-02,1.269136646684959971e-02,2.044628591100669870e-02,4.252690722431590187e-02,7.788079970179680352e-04,3.598276718899090076e-04,-5.454415271109520208e-02,-1.077697500466389974e-03
2.354575262934580082e-02,5.068011873981870252e-02,-3.099563183506899924e-02,-5.670610554934250001e-03,-1.670444126042380101e-02,1.778817874294279927e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,-7.408887149153539631e-02,-3.421455281914410201e-02
-5.273755484206479882e-02,5.068011873981870252e-02,3.906215296718960200e-02,-4.009931749229690007e-02,-5.696818394814720174e-03,-1.290037051243130006e-02,1.182372140927919965e-02,-3.949338287409189657e-02,1.630495279994180133e-02,3.064409414368320182e-03
6.713621404158050254e-02,-4.464163650698899782e-02,-6.117436990373419786e-02,-4.009931749229690007e-02,-2.633611126783170012e-02,-2.448686359864400003e-02,3.391354823380159783e-02,-3.949338287409189657e-02,-5.615757309500619965e-02,-5.906719430815229877e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-8.361578283570040432e-03,-6.419941234845069622e-02,-3.871968699164179961e-02,-2.448686359864400003e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-6.468302246445030435e-02,-5.492508739331759815e-02
2.354575262934580082e-02,5.068011873981870252e-02,-3.746250427835440266e-02,-4.698505887976939938e-02,-9.100589560328480043e-02,-7.553006287033779687e-02,-3.235593223976569732e-02,-3.949338287409189657e-02,-3.075120986455629965e-02,-1.350401824497050006e-02
3.807590643342410180e-02,5.068011873981870252e-02,-1.375063865297449991e-02,-1.599922263614299983e-02,-3.596778127523959923e-02,-2.198167590432769866e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-2.595242443518940012e-02,-1.077697500466389974e-03
1.628067572730669890e-02,-4.464163650698899782e-02,7.355213933137849658e-02,-4.124694104539940176e-02,-4.320865536613589623e-03,-1.352666743601040056e-02,-1.394774321933030074e-02,-1.116217163146459961e-03,4.289568789252869857e-02,4.448547856271539702e-02
-1.882016527791040067e-03,5.068011873981870252e-02,-2.452875939178359929e-02,5.285819123858220142e-02,2.732605020201240090e-02,3.000096875273459973e-02,3.023191042971450082e-02,-2.592261998182820038e-03,-2.139368094035999993e-02,3.620126473304600273e-02
1.264813727628719998e-02,-4.464163650698899782e-02,3.367309259778510089e-02,3.334859052598110329e-02,3.007795591841460128e-02,2.718263259662880016e-02,-2.902829807069099918e-03,8.847085473348980864e-03,3.119299070280229930e-02,2.791705090337660150e-02
7.440129094361959405e-02,-4.464163650698899782e-02,3.475090467166599972e-02,9.417263956341730136e-02,5.759701308243719842e-02,2.029336643725910064e-02,2.286863482154040048e-02,-2.592261998182820038e-03,7.380214692004880006e-02,-2.178823207463989955e-02
4.170844488444359899e-02,5.068011873981870252e-02,-3.854031635223530150e-02,5.285819123858220142e-02,7.686035309725310072e-02,1.164299442066459994e-01,-3.971920784793980114e-02,7.120997975363539678e-02,-2.251217192966049885e-02,-1.350401824497050006e-02
-9.147093429830140468e-03,5.068011873981870252e-02,-3.961812842611620034e-02,-4.009931749229690007e-02,-8.448724111216979540e-03,1.622243643399520069e-02,-6.549067247654929980e-02,7.120997975363539678e-02,1.776347786711730131e-02,-6.735140813782170000e-02
9.015598825267629943e-03,5.068011873981870252e-02,-1.894705840284650021e-03,2.187235499495579841e-02,-3.871968699164179961e-02,-2.480001206043359885e-02,-6.584467611156170040e-03,-3.949338287409189657e-02,-3.980959436433750137e-02,-1.350401824497050006e-02
6.713621404158050254e-02,5.068011873981870252e-02,-3.099563183506899924e-02,4.658001526274530187e-03,2.457414448561009990e-02,3.563764106494619888e-02,-2.867429443567860031e-02,3.430885887772629900e-02,2.337484127982079885e-02,8.176444079622779970e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-4.608500086940160029e-02,-3.321357610482440076e-02,-7.311850844667000526e-02,-8.147988364433890462e-02,4.495846164606279866e-02,-6.938329078357829971e-02,-6.117659509433449883e-02,-7.977772888232589898e-02
-9.147093429830140468e-03,5.068011873981870252e-02,1.338730381358059929e-03,-2.227739861197989939e-03,7.961225881365530110e-02,7.008397186179469995e-02,3.391354823380159783e-02,-2.592261998182820038e-03,2.671425763351279944e-02,8.176444079622779970e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,6.492964274033119487e-02,3.564383776990089764e-02,-1.568959820211340015e-03,1.496984258683710031e-02,-1.394774321933030074e-02,7.288388806489919797e-04,-1.811826730789670159e-02,3.205915781821130212e-02
9.619652164973699349e-02,-4.464163650698899782e-02,4.013996504107050084e-02,-5.731367096097819691e-02,4.521343735862710239e-02,6.068951800810880315e-02,-2.131101882750449997e-02,3.615391492152170150e-02,1.255315281338930007e-02,2.377494398854190089e-02
-7.453278554818210111e-02,-4.464163650698899782e-02,-2.345094731790270046e-02,-5.670610554934250001e-03,-2.083229983502719873e-02,-1.415296435958940044e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-3.845911230135379971e-02,-3.007244590430930078e-02
5.987113713954139715e-02,5.068011873981870252e-02,5.307370992764130074e-02,5.285819123858220142e-02,3.282986163481690228e-02,1.966706951368000014e-02,-1.026610541524320026e-02,3.430885887772629900e-02,5.520503808961670089e-02,-1.077697500466389974e-03
-2.367724723390840155e-02,-4.464163650698899782e-02,4.013996504107050084e-02,-1.255635194240680048e-02,-9.824676969418109224e-03,-1.000728964429089965e-03,-2.902829807069099918e-03,-2.592261998182820038e-03,-1.190068480150809939e-02,-3.835665973397880263e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-2.021751109626000048e-02,-5.387080026724189868e-02,3.145390877661580209e-02,2.060651489904859884e-02,5.600337505832399948e-02,-3.949338287409189657e-02,-1.090443584737709956e-02,-1.077697500466389974e-03
1.628067572730669890e-02,5.068011873981870252e-02,1.427247526792889930e-02,1.215130832538269907e-03,1.182945896190920002e-03,-2.135537898074869878e-02,-3.235593223976569732e-02,3.430885887772629900e-02,7.496833602773420036e-02,4.034337164788070335e-02
1.991321417832630017e-02,-4.464163650698899782e-02,-3.422906805671169922e-02,5.515343848250200270e-02,6.722868308984519814e-02,7.415490186505870052e-02,-6.584467611156170040e-03,3.283281404268990206e-02,2.472532334280450050e-02,6.933812005172369786e-02
8.893144474769780483e-02,-4.464163650698899782e-02,6.727790750762559745e-03,2.531522568869210010e-02,3.007795591841460128e-02,8.706873351046409346e-03,6.336665066649820044e-02,-3.949338287409189657e-02,9.436409146079870192e-03,3.205915781821130212e-02
1.991321417832630017e-02,-4.464163650698899782e-02,4.572166603000769880e-03,4.597244985110970211e-02,-1.808039411862490120e-02,-5.454911593043910295e-02,6.336665066649820044e-02,-3.949338287409189657e-02,2.866072031380889965e-02,6.105390622205419948e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,3.043965637614240091e-02,-5.670610554934250001e-03,8.236416453005759863e-02,9.200436418706199604e-02,-1.762938102341739949e-02,7.120997975363539678e-02,3.304707235493409972e-02,3.064409414368320182e-03
9.619652164973699349e-02,-4.464163650698899782e-02,5.199589785376040191e-02,7.925353333865589600e-02,5.484510736603499803e-02,3.657708645031480105e-02,-7.653558588881050062e-02,1.413221094178629955e-01,9.864637430492799453e-02,6.105390622205419948e-02
2.354575262934580082e-02,5.068011873981870252e-02,6.169620651868849837e-02,6.203917986997459916e-02,2.457414448561009990e-02,-3.607335668485669999e-02,-9.126213710515880539e-02,1.553445353507079962e-01,1.333957338374689994e-01,8.176444079622779970e-02
7.076875249260000666e-02,5.068011873981870252e-02,-7.283766209689159811e-03,4.941532054484590319e-02,6.034891879883950289e-02,-4.445362044113949918e-03,-5.444575906428809897e-02,1.081111006295440019e-01,1.290194116001679991e-01,5.691179930721949887e-02
3.081082953138499989e-02,-4.464163650698899782e-02,5.649978676881649634e-03,1.154374291374709975e-02,7.823630595545419397e-02,7.791268340653299818e-02,-4.340084565202689815e-02,1.081111006295440019e-01,6.604820616309839409e-02,1.963283707370720027e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,5.415152200152219958e-02,-6.649465948908450663e-02,7.273249452264969606e-02,5.661858800484489973e-02,-4.340084565202689815e-02,8.486339447772170419e-02,8.449528221240310000e-02,4.862758547755009764e-02
4.534098333546320025e-02,5.068011873981870252e-02,-8.361578283570040432e-03,-3.321357610482440076e-02,-7.072771253015849857e-03,1.191310268097639903e-03,-3.971920784793980114e-02,3.430885887772629900e-02,2.993564839653250001e-02,2.791705090337660150e-02
[Data file diff omitted: 115 comma-separated rows of a headerless numeric matrix, 10 standardized feature columns per row; the values appear to match the feature matrix of scikit-learn's diabetes dataset.]
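For orientation, a headerless 10-column CSV like the block above loads directly into a NumPy array. A minimal sketch, assuming the file is named `features.csv` (the actual path is not visible in this excerpt of the diff):

```python
import numpy as np

# Hypothetical path -- this excerpt of the diff does not show the file name.
FEATURES_PATH = "features.csv"

# Headerless, comma-separated matrix of standardized features
# (10 columns per row), read straight into a 2-D float array.
X = np.loadtxt(FEATURES_PATH, delimiter=",")

print(X.shape)  # (n_rows, 10)
print(X[0])     # first feature row
```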
[Data file diff omitted: rows 1-135 of a second, whitespace-delimited rendering of the same 10-column standardized feature matrix, each row prefixed with its 1-based index.]
136 -5.514554978810590376e-03 -4.464163650698899782e-02 5.630714614928399725e-02 -3.665644679856060184e-02 -4.835135699904979933e-02 -4.296262284422640298e-02 -7.285394808472339667e-02 3.799897096531720114e-02 5.078151336297320045e-02 5.691179930721949887e-02
137 -9.269547780327989928e-02 -4.464163650698899782e-02 -8.165279930747129655e-02 -5.731367096097819691e-02 -6.073493272285990230e-02 -6.801449978738899338e-02 4.864009945014990260e-02 -7.639450375000099436e-02 -6.648814822283539983e-02 -2.178823207463989955e-02
138 5.383060374248070309e-03 -4.464163650698899782e-02 4.984027370599859730e-02 9.761551025715360652e-02 -1.532848840222260020e-02 -1.634500359211620013e-02 -6.584467611156170040e-03 -2.592261998182820038e-03 1.703713241477999851e-02 -1.350401824497050006e-02
139 3.444336798240450054e-02 5.068011873981870252e-02 1.112755619172099975e-01 7.695828609473599757e-02 -3.183992270063620150e-02 -3.388131745233000092e-02 -2.131101882750449997e-02 -2.592261998182820038e-03 2.801650652326400162e-02 7.348022696655839847e-02
140 2.354575262934580082e-02 -4.464163650698899782e-02 6.169620651868849837e-02 5.285819123858220142e-02 -3.459182841703849903e-02 -4.891244361822749687e-02 -2.867429443567860031e-02 -2.592261998182820038e-03 5.472400334817909689e-02 -5.219804415301099697e-03
141 4.170844488444359899e-02 5.068011873981870252e-02 1.427247526792889930e-02 4.252957915737339695e-02 -3.046396984243510131e-02 -1.313877426218630021e-03 -4.340084565202689815e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 1.549073015887240078e-02
142 -2.730978568492789874e-02 -4.464163650698899782e-02 4.768464955823679963e-02 -4.698505887976939938e-02 3.420581449301800248e-02 5.724488492842390308e-02 -8.021722369289760457e-02 1.302517731550900115e-01 4.506616833626150148e-02 1.314697237742440128e-01
143 4.170844488444359899e-02 5.068011873981870252e-02 1.211685112016709989e-02 3.908670846363720280e-02 5.484510736603499803e-02 4.440579799505309927e-02 4.460445801105040325e-03 -2.592261998182820038e-03 4.560080841412490066e-02 -1.077697500466389974e-03
144 -3.094232413594750000e-02 -4.464163650698899782e-02 5.649978676881649634e-03 -9.113481248670509197e-03 1.907033305280559851e-02 6.827982580309210209e-03 7.441156407875940126e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 -4.249876664881350324e-02
145 3.081082953138499989e-02 5.068011873981870252e-02 4.660683748435590079e-02 -1.599922263614299983e-02 2.044628591100669870e-02 5.066876723084379891e-02 -5.812739686837520292e-02 7.120997975363539678e-02 6.209315616505399656e-03 7.206516329203029904e-03
146 -4.183993948900609910e-02 -4.464163650698899782e-02 1.285205550993039902e-01 6.318680331979099896e-02 -3.321587555883730170e-02 -3.262872360517189707e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 -5.078298047848289754e-02
147 -3.094232413594750000e-02 5.068011873981870252e-02 5.954058237092670069e-02 1.215130832538269907e-03 1.219056876180000040e-02 3.156671106168230240e-02 -4.340084565202689815e-02 3.430885887772629900e-02 1.482271084126630077e-02 7.206516329203029904e-03
148 -5.637009329308430294e-02 -4.464163650698899782e-02 9.295275666123460623e-02 -1.944209332987930153e-02 1.494247447820220079e-02 2.342485105515439842e-02 -2.867429443567860031e-02 2.545258986750810123e-02 2.605608963368469949e-02 4.034337164788070335e-02
149 -6.000263174410389727e-02 5.068011873981870252e-02 1.535028734180979987e-02 -1.944209332987930153e-02 3.695772020942030001e-02 4.816357953652750101e-02 1.918699701745330000e-02 -2.592261998182820038e-03 -3.075120986455629965e-02 -1.077697500466389974e-03
150 -4.910501639104519755e-02 5.068011873981870252e-02 -5.128142061927360405e-03 -4.698505887976939938e-02 -2.083229983502719873e-02 -2.041593359538010008e-02 -6.917231028063640375e-02 7.120997975363539678e-02 6.123790751970099866e-02 -3.835665973397880263e-02
151 2.354575262934580082e-02 -4.464163650698899782e-02 7.031870310973570293e-02 2.531522568869210010e-02 -3.459182841703849903e-02 -1.446611282137899926e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -1.919704761394450121e-02 -9.361911330135799444e-03
152 1.750521923228520000e-03 -4.464163650698899782e-02 -4.050329988046450294e-03 -5.670610554934250001e-03 -8.448724111216979540e-03 -2.386056667506489953e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -8.944018957797799166e-03 -1.350401824497050006e-02
153 -3.457486258696700065e-02 5.068011873981870252e-02 -8.168937664037369826e-04 7.007254470726349826e-02 3.970962592582259754e-02 6.695248724389940564e-02 -6.549067247654929980e-02 1.081111006295440019e-01 2.671425763351279944e-02 7.348022696655839847e-02
154 4.170844488444359899e-02 5.068011873981870252e-02 -4.392937672163980262e-02 6.318680331979099896e-02 -4.320865536613589623e-03 1.622243643399520069e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -3.452371533034950118e-02 1.134862324403770016e-02
155 6.713621404158050254e-02 5.068011873981870252e-02 2.073934771121430098e-02 -5.670610554934250001e-03 2.044628591100669870e-02 2.624318721126020146e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 8.640282933063080789e-03 3.064409414368320182e-03
156 -2.730978568492789874e-02 5.068011873981870252e-02 6.061839444480759953e-02 4.941532054484590319e-02 8.511607024645979902e-02 8.636769187485039689e-02 -2.902829807069099918e-03 3.430885887772629900e-02 3.781447882634390162e-02 4.862758547755009764e-02
157 -1.641217033186929963e-02 -4.464163650698899782e-02 -1.051720243133190055e-02 1.215130832538269907e-03 -3.734373413344069942e-02 -3.576020822306719832e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -2.139368094035999993e-02 -3.421455281914410201e-02
158 -1.882016527791040067e-03 5.068011873981870252e-02 -3.315125598283080038e-02 -1.829446977677679984e-02 3.145390877661580209e-02 4.284005568610550069e-02 -1.394774321933030074e-02 1.991742173612169944e-02 1.022564240495780000e-02 2.791705090337660150e-02
159 -1.277963188084970010e-02 -4.464163650698899782e-02 -6.548561819925780014e-02 -6.993753018282070077e-02 1.182945896190920002e-03 1.684873335757430118e-02 -2.902829807069099918e-03 -7.020396503291909812e-03 -3.075120986455629965e-02 -5.078298047848289754e-02
160 -5.514554978810590376e-03 -4.464163650698899782e-02 4.337340126271319735e-02 8.728689817594480205e-02 1.356652162000110060e-02 7.141131042098750048e-03 -1.394774321933030074e-02 -2.592261998182820038e-03 4.234489544960749752e-02 -1.764612515980519894e-02
161 -9.147093429830140468e-03 -4.464163650698899782e-02 -6.225218197761509670e-02 -7.452802442965950069e-02 -2.358420555142939912e-02 -1.321351897422090062e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -3.581672810154919867e-02 -4.664087356364819692e-02
162 -4.547247794002570037e-02 5.068011873981870252e-02 6.385183066645029604e-02 7.007254470726349826e-02 1.332744202834990066e-01 1.314610703725430096e-01 -3.971920784793980114e-02 1.081111006295440019e-01 7.573758845754760549e-02 8.590654771106250032e-02
163 -5.273755484206479882e-02 -4.464163650698899782e-02 3.043965637614240091e-02 -7.452802442965950069e-02 -2.358420555142939912e-02 -1.133462820348369975e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 -3.075120986455629965e-02 -1.077697500466389974e-03
164 1.628067572730669890e-02 5.068011873981870252e-02 7.247432725749750060e-02 7.695828609473599757e-02 -8.448724111216979540e-03 5.575388733151089883e-03 -6.584467611156170040e-03 -2.592261998182820038e-03 -2.364455757213410059e-02 6.105390622205419948e-02
165 4.534098333546320025e-02 -4.464163650698899782e-02 -1.913969902237900103e-02 2.187235499495579841e-02 2.732605020201240090e-02 -1.352666743601040056e-02 1.001830287073690040e-01 -3.949338287409189657e-02 1.776347786711730131e-02 -1.350401824497050006e-02
166 -4.183993948900609910e-02 -4.464163650698899782e-02 -6.656343027313869898e-02 -4.698505887976939938e-02 -3.734373413344069942e-02 -4.327577130601600180e-02 4.864009945014990260e-02 -3.949338287409189657e-02 -5.615757309500619965e-02 -1.350401824497050006e-02
167 -5.637009329308430294e-02 5.068011873981870252e-02 -6.009655782985329903e-02 -3.665644679856060184e-02 -8.825398988688250290e-02 -7.083283594349480683e-02 -1.394774321933030074e-02 -3.949338287409189657e-02 -7.814091066906959926e-02 -1.046303703713340055e-01
168 7.076875249260000666e-02 -4.464163650698899782e-02 6.924089103585480409e-02 3.793908501382069892e-02 2.182223876920789951e-02 1.504458729887179960e-03 -3.603757004385269719e-02 3.910600459159439823e-02 7.763278919555950675e-02 1.066170822852360034e-01
169 1.750521923228520000e-03 5.068011873981870252e-02 5.954058237092670069e-02 -2.227739861197989939e-03 6.172487165704060308e-02 6.319470570242499696e-02 -5.812739686837520292e-02 1.081111006295440019e-01 6.898221163630259556e-02 1.273276168594099922e-01
170 -1.882016527791040067e-03 -4.464163650698899782e-02 -2.668438353954540043e-02 4.941532054484590319e-02 5.897296594063840269e-02 -1.603185513032660131e-02 -4.708248345611389801e-02 7.120997975363539678e-02 1.335989800130079896e-01 1.963283707370720027e-02
171 2.354575262934580082e-02 5.068011873981870252e-02 -2.021751109626000048e-02 -3.665644679856060184e-02 -1.395253554402150001e-02 -1.509240974495799914e-02 5.968501286241110343e-02 -3.949338287409189657e-02 -9.643322289178400675e-02 -1.764612515980519894e-02
172 -2.004470878288880029e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -9.862811928581330378e-02 -7.587041416307230279e-02 -5.987263978086120042e-02 -1.762938102341739949e-02 -3.949338287409189657e-02 -5.140053526058249722e-02 -4.664087356364819692e-02
173 4.170844488444359899e-02 5.068011873981870252e-02 7.139651518361660176e-02 8.100872220010799790e-03 3.833367306762140020e-02 1.590928797220559840e-02 -1.762938102341739949e-02 3.430885887772629900e-02 7.341007804911610368e-02 8.590654771106250032e-02
174 -6.363517019512339445e-02 5.068011873981870252e-02 -7.949717515970949888e-02 -5.670610554934250001e-03 -7.174255558846899528e-02 -6.644875747844139480e-02 -1.026610541524320026e-02 -3.949338287409189657e-02 -1.811826730789670159e-02 -5.492508739331759815e-02
175 1.628067572730669890e-02 5.068011873981870252e-02 9.961226972405269262e-03 -4.354218818603310115e-02 -9.650970703608929835e-02 -9.463211903949929338e-02 -3.971920784793980114e-02 -3.949338287409189657e-02 1.703713241477999851e-02 7.206516329203029904e-03
176 6.713621404158050254e-02 -4.464163650698899782e-02 -3.854031635223530150e-02 -2.632783471735180084e-02 -3.183992270063620150e-02 -2.636575436938120090e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -2.712864555432650121e-02 3.064409414368320182e-03
177 4.534098333546320025e-02 5.068011873981870252e-02 1.966153563733339868e-02 3.908670846363720280e-02 2.044628591100669870e-02 2.593003874947069978e-02 8.142083605192099172e-03 -2.592261998182820038e-03 -3.303712578676999863e-03 1.963283707370720027e-02
178 4.897352178648269744e-02 -4.464163650698899782e-02 2.720622015449970094e-02 -2.518021116424929914e-02 2.319819162740899970e-02 1.841447566652189977e-02 -6.180903467246220279e-02 8.006624876385350087e-02 7.222365081991240221e-02 3.205915781821130212e-02
179 4.170844488444359899e-02 -4.464163650698899782e-02 -8.361578283570040432e-03 -2.632783471735180084e-02 2.457414448561009990e-02 1.622243643399520069e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -4.836172480289190057e-02 -3.007244590430930078e-02
180 -2.367724723390840155e-02 -4.464163650698899782e-02 -1.590626280073640167e-02 -1.255635194240680048e-02 2.044628591100669870e-02 4.127431337715779802e-02 -4.340084565202689815e-02 3.430885887772629900e-02 1.407245251576850001e-02 -9.361911330135799444e-03
181 -3.820740103798660192e-02 5.068011873981870252e-02 4.572166603000769880e-03 3.564383776990089764e-02 -1.120062982761920074e-02 5.888537194940629722e-03 -4.708248345611389801e-02 3.430885887772629900e-02 1.630495279994180133e-02 -1.077697500466389974e-03
182 4.897352178648269744e-02 -4.464163650698899782e-02 -4.285156464775889684e-02 -5.387080026724189868e-02 4.521343735862710239e-02 5.004247030726469841e-02 3.391354823380159783e-02 -2.592261998182820038e-03 -2.595242443518940012e-02 -6.320930122298699938e-02
183 4.534098333546320025e-02 5.068011873981870252e-02 5.649978676881649634e-03 5.630106193231849965e-02 6.447677737344290061e-02 8.918602803095619647e-02 -3.971920784793980114e-02 7.120997975363539678e-02 1.556684454070180086e-02 -9.361911330135799444e-03
184 4.534098333546320025e-02 5.068011873981870252e-02 -3.530688013059259805e-02 6.318680331979099896e-02 -4.320865536613589623e-03 -1.627025888008149911e-03 -1.026610541524320026e-02 -2.592261998182820038e-03 1.556684454070180086e-02 5.691179930721949887e-02
185 1.628067572730669890e-02 -4.464163650698899782e-02 2.397278393285700096e-02 -2.288496402361559975e-02 -2.496015840963049931e-02 -2.605260590759169922e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 3.723201120896890010e-02 3.205915781821130212e-02
186 -7.453278554818210111e-02 5.068011873981870252e-02 -1.806188694849819934e-02 8.100872220010799790e-03 -1.945634697682600139e-02 -2.480001206043359885e-02 -6.549067247654929980e-02 3.430885887772629900e-02 6.731721791468489591e-02 -1.764612515980519894e-02
187 -8.179786245022120650e-02 5.068011873981870252e-02 4.229558918883229851e-02 -1.944209332987930153e-02 3.970962592582259754e-02 5.755803339021339782e-02 -6.917231028063640375e-02 1.081111006295440019e-01 4.718616788601970313e-02 -3.835665973397880263e-02
188 -6.726770864614299572e-02 -4.464163650698899782e-02 -5.470749746044879791e-02 -2.632783471735180084e-02 -7.587041416307230279e-02 -8.210618056791800512e-02 4.864009945014990260e-02 -7.639450375000099436e-02 -8.682899321629239386e-02 -1.046303703713340055e-01
189 5.383060374248070309e-03 -4.464163650698899782e-02 -2.972517914165530208e-03 4.941532054484590319e-02 7.410844738085080319e-02 7.071026878537380045e-02 4.495846164606279866e-02 -2.592261998182820038e-03 -1.498586820292070049e-03 -9.361911330135799444e-03
190 -1.882016527791040067e-03 -4.464163650698899782e-02 -6.656343027313869898e-02 1.215130832538269907e-03 -2.944912678412469915e-03 3.070201038834840124e-03 1.182372140927919965e-02 -2.592261998182820038e-03 -2.028874775162960165e-02 -2.593033898947460017e-02
191 9.015598825267629943e-03 -4.464163650698899782e-02 -1.267282657909369996e-02 2.875809638242839833e-02 -1.808039411862490120e-02 -5.071658967693000106e-03 -4.708248345611389801e-02 3.430885887772629900e-02 2.337484127982079885e-02 -5.219804415301099697e-03
192 -5.514554978810590376e-03 5.068011873981870252e-02 -4.177375257387799801e-02 -4.354218818603310115e-02 -7.999827273767569358e-02 -7.615635979391689736e-02 -3.235593223976569732e-02 -3.949338287409189657e-02 1.022564240495780000e-02 -9.361911330135799444e-03
193 5.623859868852180283e-02 5.068011873981870252e-02 -3.099563183506899924e-02 8.100872220010799790e-03 1.907033305280559851e-02 2.123281182262769934e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -2.952762274177360077e-02 -5.906719430815229877e-02
194 9.015598825267629943e-03 5.068011873981870252e-02 -5.128142061927360405e-03 -6.419941234845069622e-02 6.998058880624739853e-02 8.386250418053420308e-02 -3.971920784793980114e-02 7.120997975363539678e-02 3.953987807202419963e-02 1.963283707370720027e-02
195 -6.726770864614299572e-02 -4.464163650698899782e-02 -5.901874575597240019e-02 3.220096707616459941e-02 -5.110326271545199972e-02 -4.953874054180659736e-02 -1.026610541524320026e-02 -3.949338287409189657e-02 2.007840549823790115e-03 2.377494398854190089e-02
196 2.717829108036539862e-02 5.068011873981870252e-02 2.505059600673789980e-02 1.498661360748330083e-02 2.595009734381130070e-02 4.847672799831700269e-02 -3.971920784793980114e-02 3.430885887772629900e-02 7.837142301823850701e-03 2.377494398854190089e-02
197 -2.367724723390840155e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -3.321357610482440076e-02 3.282986163481690228e-02 3.626393798852529937e-02 3.759518603788870178e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 1.134862324403770016e-02
198 4.897352178648269744e-02 5.068011873981870252e-02 3.494354529119849794e-03 7.007254470726349826e-02 -8.448724111216979540e-03 1.340410027788939938e-02 -5.444575906428809897e-02 3.430885887772629900e-02 1.331596790892770020e-02 3.620126473304600273e-02
199 -5.273755484206479882e-02 -4.464163650698899782e-02 5.415152200152219958e-02 -2.632783471735180084e-02 -5.523112129005539744e-02 -3.388131745233000092e-02 -1.394774321933030074e-02 -3.949338287409189657e-02 -7.408887149153539631e-02 -5.906719430815229877e-02
200 4.170844488444359899e-02 -4.464163650698899782e-02 -4.500718879552070145e-02 3.449621432008449784e-02 4.383748450042589812e-02 -1.571870666853709964e-02 3.759518603788870178e-02 -1.440062067847370023e-02 8.989869327767099905e-02 7.206516329203029904e-03
201 5.623859868852180283e-02 -4.464163650698899782e-02 -5.794093368209150136e-02 -7.965857695567990157e-03 5.209320164963270050e-02 4.910302492189610318e-02 5.600337505832399948e-02 -2.141183364489639834e-02 -2.832024254799870092e-02 4.448547856271539702e-02
202 -3.457486258696700065e-02 5.068011873981870252e-02 -5.578530953432969675e-02 -1.599922263614299983e-02 -9.824676969418109224e-03 -7.889995123798789270e-03 3.759518603788870178e-02 -3.949338287409189657e-02 -5.295879323920039961e-02 2.791705090337660150e-02
203 8.166636784565869944e-02 5.068011873981870252e-02 1.338730381358059929e-03 3.564383776990089764e-02 1.263946559924939983e-01 9.106491880169340081e-02 1.918699701745330000e-02 3.430885887772629900e-02 8.449528221240310000e-02 -3.007244590430930078e-02
204 -1.882016527791040067e-03 5.068011873981870252e-02 3.043965637614240091e-02 5.285819123858220142e-02 3.970962592582259754e-02 5.661858800484489973e-02 -3.971920784793980114e-02 7.120997975363539678e-02 2.539313491544940155e-02 2.791705090337660150e-02
205 1.107266754538149961e-01 5.068011873981870252e-02 6.727790750762559745e-03 2.875809638242839833e-02 -2.771206412603280031e-02 -7.263698200219739949e-03 -4.708248345611389801e-02 3.430885887772629900e-02 2.007840549823790115e-03 7.762233388139309909e-02
206 -3.094232413594750000e-02 -4.464163650698899782e-02 4.660683748435590079e-02 1.498661360748330083e-02 -1.670444126042380101e-02 -4.703355284749029946e-02 7.788079970179680352e-04 -2.592261998182820038e-03 6.345592137206540473e-02 -2.593033898947460017e-02
207 1.750521923228520000e-03 5.068011873981870252e-02 2.612840808061879863e-02 -9.113481248670509197e-03 2.457414448561009990e-02 3.845597722105199845e-02 -2.131101882750449997e-02 3.430885887772629900e-02 9.436409146079870192e-03 3.064409414368320182e-03
208 9.015598825267629943e-03 -4.464163650698899782e-02 4.552902541047500196e-02 2.875809638242839833e-02 1.219056876180000040e-02 -1.383981589779990050e-02 2.655027262562750096e-02 -3.949338287409189657e-02 4.613233103941480340e-02 3.620126473304600273e-02
209 3.081082953138499989e-02 -4.464163650698899782e-02 4.013996504107050084e-02 7.695828609473599757e-02 1.769438019460449832e-02 3.782968029747289795e-02 -2.867429443567860031e-02 3.430885887772629900e-02 -1.498586820292070049e-03 1.190434030297399942e-01
210 3.807590643342410180e-02 5.068011873981870252e-02 -1.806188694849819934e-02 6.662967401352719310e-02 -5.110326271545199972e-02 -1.665815205390569834e-02 -7.653558588881050062e-02 3.430885887772629900e-02 -1.190068480150809939e-02 -1.350401824497050006e-02
211 9.015598825267629943e-03 -4.464163650698899782e-02 1.427247526792889930e-02 1.498661360748330083e-02 5.484510736603499803e-02 4.722413415115889884e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -3.324878724762579674e-02 -5.906719430815229877e-02
212 9.256398319871740610e-02 -4.464163650698899782e-02 3.690652881942779739e-02 2.187235499495579841e-02 -2.496015840963049931e-02 -1.665815205390569834e-02 7.788079970179680352e-04 -3.949338287409189657e-02 -2.251217192966049885e-02 -2.178823207463989955e-02
213 6.713621404158050254e-02 -4.464163650698899782e-02 3.494354529119849794e-03 3.564383776990089764e-02 4.934129593323050011e-02 3.125356259989280072e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -6.092541861022970299e-04 1.963283707370720027e-02
214 1.750521923228520000e-03 -4.464163650698899782e-02 -7.087467856866229432e-02 -2.288496402361559975e-02 -1.568959820211340015e-03 -1.000728964429089965e-03 2.655027262562750096e-02 -3.949338287409189657e-02 -2.251217192966049885e-02 7.206516329203029904e-03
215 3.081082953138499989e-02 -4.464163650698899782e-02 -3.315125598283080038e-02 -2.288496402361559975e-02 -4.697540414084860200e-02 -8.116673518254939601e-02 1.038646665114559969e-01 -7.639450375000099436e-02 -3.980959436433750137e-02 -5.492508739331759815e-02
216 2.717829108036539862e-02 5.068011873981870252e-02 9.403056873511560221e-02 9.761551025715360652e-02 -3.459182841703849903e-02 -3.200242668159279658e-02 -4.340084565202689815e-02 -2.592261998182820038e-03 3.664579779339879884e-02 1.066170822852360034e-01
217 1.264813727628719998e-02 5.068011873981870252e-02 3.582871674554689856e-02 4.941532054484590319e-02 5.346915450783389784e-02 7.415490186505870052e-02 -6.917231028063640375e-02 1.450122215054540087e-01 4.560080841412490066e-02 4.862758547755009764e-02
218 7.440129094361959405e-02 -4.464163650698899782e-02 3.151746845002330322e-02 1.010583809508899950e-01 4.658939021682820258e-02 3.689023491210430272e-02 1.550535921336619952e-02 -2.592261998182820038e-03 3.365681290238470291e-02 4.448547856271539702e-02
219 -4.183993948900609910e-02 -4.464163650698899782e-02 -6.548561819925780014e-02 -4.009931749229690007e-02 -5.696818394814720174e-03 1.434354566325799982e-02 -4.340084565202689815e-02 3.430885887772629900e-02 7.026862549151949647e-03 -1.350401824497050006e-02
220 -8.906293935226029801e-02 -4.464163650698899782e-02 -4.177375257387799801e-02 -1.944209332987930153e-02 -6.623874415566440021e-02 -7.427746902317970690e-02 8.142083605192099172e-03 -3.949338287409189657e-02 1.143797379512540100e-03 -3.007244590430930078e-02
221 2.354575262934580082e-02 5.068011873981870252e-02 -3.961812842611620034e-02 -5.670610554934250001e-03 -4.835135699904979933e-02 -3.325502052875090042e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.016435479455120028e-01 -6.735140813782170000e-02
222 -4.547247794002570037e-02 -4.464163650698899782e-02 -3.854031635223530150e-02 -2.632783471735180084e-02 -1.532848840222260020e-02 8.781618063081050515e-04 -3.235593223976569732e-02 -2.592261998182820038e-03 1.143797379512540100e-03 -3.835665973397880263e-02
223 -2.367724723390840155e-02 5.068011873981870252e-02 -2.560657146566450160e-02 4.252957915737339695e-02 -5.385516843185429725e-02 -4.765984977106939996e-02 -2.131101882750449997e-02 -3.949338287409189657e-02 1.143797379512540100e-03 1.963283707370720027e-02
224 -9.996055470531900466e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -6.419941234845069622e-02 -5.798302700645770191e-02 -6.018578824265070210e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.811826730789670159e-02 -5.078298047848289754e-02
225 -2.730978568492789874e-02 -4.464163650698899782e-02 -6.656343027313869898e-02 -1.123996020607579971e-01 -4.972730985725089953e-02 -4.139688053527879746e-02 7.788079970179680352e-04 -3.949338287409189657e-02 -3.581672810154919867e-02 -9.361911330135799444e-03
226 3.081082953138499989e-02 5.068011873981870252e-02 3.259528052390420205e-02 4.941532054484590319e-02 -4.009563984984299695e-02 -4.358891976780549654e-02 -6.917231028063640375e-02 3.430885887772629900e-02 6.301661511474640487e-02 3.064409414368320182e-03
227 -1.035930931563389945e-01 5.068011873981870252e-02 -4.608500086940160029e-02 -2.632783471735180084e-02 -2.496015840963049931e-02 -2.480001206043359885e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -3.980959436433750137e-02 -5.492508739331759815e-02
228 6.713621404158050254e-02 5.068011873981870252e-02 -2.991781976118810041e-02 5.744868538213489945e-02 -1.930069620102049918e-04 -1.571870666853709964e-02 7.441156407875940126e-02 -5.056371913686460301e-02 -3.845911230135379971e-02 7.206516329203029904e-03
229 -5.273755484206479882e-02 -4.464163650698899782e-02 -1.267282657909369996e-02 -6.075654165471439799e-02 -1.930069620102049918e-04 8.080576427467340075e-03 1.182372140927919965e-02 -2.592261998182820038e-03 -2.712864555432650121e-02 -5.078298047848289754e-02
230 -2.730978568492789874e-02 5.068011873981870252e-02 -1.590626280073640167e-02 -2.977070541108809906e-02 3.934851612593179802e-03 -6.875805026395569565e-04 4.127682384197570165e-02 -3.949338287409189657e-02 -2.364455757213410059e-02 1.134862324403770016e-02
231 -3.820740103798660192e-02 5.068011873981870252e-02 7.139651518361660176e-02 -5.731367096097819691e-02 1.539137131565160022e-01 1.558866503921270130e-01 7.788079970179680352e-04 7.194800217115350505e-02 5.027649338998960160e-02 6.933812005172369786e-02
232 9.015598825267629943e-03 -4.464163650698899782e-02 -3.099563183506899924e-02 2.187235499495579841e-02 8.062710187196569719e-03 8.706873351046409346e-03 4.460445801105040325e-03 -2.592261998182820038e-03 9.436409146079870192e-03 1.134862324403770016e-02
233 1.264813727628719998e-02 5.068011873981870252e-02 2.609183074771409820e-04 -1.140872838930430053e-02 3.970962592582259754e-02 5.724488492842390308e-02 -3.971920784793980114e-02 5.608052019451260223e-02 2.405258322689299982e-02 3.205915781821130212e-02
234 6.713621404158050254e-02 -4.464163650698899782e-02 3.690652881942779739e-02 -5.042792957350569760e-02 -2.358420555142939912e-02 -3.450761437590899733e-02 4.864009945014990260e-02 -3.949338287409189657e-02 -2.595242443518940012e-02 -3.835665973397880263e-02
235 4.534098333546320025e-02 -4.464163650698899782e-02 3.906215296718960200e-02 4.597244985110970211e-02 6.686757328995440036e-03 -2.417371513685449835e-02 8.142083605192099172e-03 -1.255556463467829946e-02 6.432823302367089713e-02 5.691179930721949887e-02
236 6.713621404158050254e-02 5.068011873981870252e-02 -1.482845072685549936e-02 5.859630917623830093e-02 -5.935897986465880211e-02 -3.450761437590899733e-02 -6.180903467246220279e-02 1.290620876969899959e-02 -5.145307980263110273e-03 4.862758547755009764e-02
237 2.717829108036539862e-02 -4.464163650698899782e-02 6.727790750762559745e-03 3.564383776990089764e-02 7.961225881365530110e-02 7.071026878537380045e-02 1.550535921336619952e-02 3.430885887772629900e-02 4.067226371449769728e-02 1.134862324403770016e-02
238 5.623859868852180283e-02 -4.464163650698899782e-02 -6.871905442090049665e-02 -6.878990659528949614e-02 -1.930069620102049918e-04 -1.000728964429089965e-03 4.495846164606279866e-02 -3.764832683029650101e-02 -4.836172480289190057e-02 -1.077697500466389974e-03
239 3.444336798240450054e-02 5.068011873981870252e-02 -9.439390357450949676e-03 5.974393262605470073e-02 -3.596778127523959923e-02 -7.576846662009279788e-03 -7.653558588881050062e-02 7.120997975363539678e-02 1.100810104587249955e-02 -2.178823207463989955e-02
240 2.354575262934580082e-02 -4.464163650698899782e-02 1.966153563733339868e-02 -1.255635194240680048e-02 8.374011738825870577e-02 3.876912568284150012e-02 6.336665066649820044e-02 -2.592261998182820038e-03 6.604820616309839409e-02 4.862758547755009764e-02
241 4.897352178648269744e-02 5.068011873981870252e-02 7.462995140525929827e-02 6.662967401352719310e-02 -9.824676969418109224e-03 -2.253322811587220049e-03 -4.340084565202689815e-02 3.430885887772629900e-02 3.365681290238470291e-02 1.963283707370720027e-02
242 3.081082953138499989e-02 5.068011873981870252e-02 -8.361578283570040432e-03 4.658001526274530187e-03 1.494247447820220079e-02 2.749578105841839898e-02 8.142083605192099172e-03 -8.127430129569179762e-03 -2.952762274177360077e-02 5.691179930721949887e-02
243 -1.035930931563389945e-01 5.068011873981870252e-02 -2.345094731790270046e-02 -2.288496402361559975e-02 -8.687803702868139577e-02 -6.770135132559949864e-02 -1.762938102341739949e-02 -3.949338287409189657e-02 -7.814091066906959926e-02 -7.149351505265640061e-02
244 1.628067572730669890e-02 5.068011873981870252e-02 -4.608500086940160029e-02 1.154374291374709975e-02 -3.321587555883730170e-02 -1.603185513032660131e-02 -1.026610541524320026e-02 -2.592261998182820038e-03 -4.398540256559110156e-02 -4.249876664881350324e-02
245 -6.000263174410389727e-02 5.068011873981870252e-02 5.415152200152219958e-02 -1.944209332987930153e-02 -4.972730985725089953e-02 -4.891244361822749687e-02 2.286863482154040048e-02 -3.949338287409189657e-02 -4.398540256559110156e-02 -5.219804415301099697e-03
246 -2.730978568492789874e-02 -4.464163650698899782e-02 -3.530688013059259805e-02 -2.977070541108809906e-02 -5.660707414825649764e-02 -5.862004593370299943e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -4.986846773523059828e-02 -1.294830118603420011e-01
247 4.170844488444359899e-02 -4.464163650698899782e-02 -3.207344390894990155e-02 -6.190416520781699683e-02 7.961225881365530110e-02 5.098191569263330059e-02 5.600337505832399948e-02 -9.972486173364639508e-03 4.506616833626150148e-02 -5.906719430815229877e-02
248 -8.179786245022120650e-02 -4.464163650698899782e-02 -8.165279930747129655e-02 -4.009931749229690007e-02 2.558898754392050119e-03 -1.853704282464289921e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -1.090443584737709956e-02 -9.220404962683000083e-02
249 -4.183993948900609910e-02 -4.464163650698899782e-02 4.768464955823679963e-02 5.974393262605470073e-02 1.277706088506949944e-01 1.280164372928579986e-01 -2.499265663159149983e-02 1.081111006295440019e-01 6.389312063683939835e-02 4.034337164788070335e-02
250 -1.277963188084970010e-02 -4.464163650698899782e-02 6.061839444480759953e-02 5.285819123858220142e-02 4.796534307502930278e-02 2.937467182915549924e-02 -1.762938102341739949e-02 3.430885887772629900e-02 7.021129819331020649e-02 7.206516329203029904e-03
251 6.713621404158050254e-02 -4.464163650698899782e-02 5.630714614928399725e-02 7.351541540099980343e-02 -1.395253554402150001e-02 -3.920484130275200124e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 7.573758845754760549e-02 3.620126473304600273e-02
252 -5.273755484206479882e-02 5.068011873981870252e-02 9.834181703063900326e-02 8.728689817594480205e-02 6.034891879883950289e-02 4.878987646010649742e-02 -5.812739686837520292e-02 1.081111006295440019e-01 8.449528221240310000e-02 4.034337164788070335e-02
253 5.383060374248070309e-03 -4.464163650698899782e-02 5.954058237092670069e-02 -5.616604740787570216e-02 2.457414448561009990e-02 5.286080646337049799e-02 -4.340084565202689815e-02 5.091436327188540029e-02 -4.219859706946029777e-03 -3.007244590430930078e-02
254 8.166636784565869944e-02 -4.464163650698899782e-02 3.367309259778510089e-02 8.100872220010799790e-03 5.209320164963270050e-02 5.661858800484489973e-02 -1.762938102341739949e-02 3.430885887772629900e-02 3.486419309615960277e-02 6.933812005172369786e-02
255 3.081082953138499989e-02 5.068011873981870252e-02 5.630714614928399725e-02 7.695828609473599757e-02 4.934129593323050011e-02 -1.227407358885230018e-02 -3.603757004385269719e-02 7.120997975363539678e-02 1.200533820015380060e-01 9.004865462589720093e-02
256 1.750521923228520000e-03 -4.464163650698899782e-02 -6.548561819925780014e-02 -5.670610554934250001e-03 -7.072771253015849857e-03 -1.947648821001150138e-02 4.127682384197570165e-02 -3.949338287409189657e-02 -3.303712578676999863e-03 7.206516329203029904e-03
257 -4.910501639104519755e-02 -4.464163650698899782e-02 1.608549173157310108e-01 -4.698505887976939938e-02 -2.908801698423390050e-02 -1.978963667180099958e-02 -4.708248345611389801e-02 3.430885887772629900e-02 2.801650652326400162e-02 1.134862324403770016e-02
258 -2.730978568492789874e-02 5.068011873981870252e-02 -5.578530953432969675e-02 2.531522568869210010e-02 -7.072771253015849857e-03 -2.354741821327540133e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -5.145307980263110273e-03 -5.078298047848289754e-02
259 7.803382939463919532e-02 5.068011873981870252e-02 -2.452875939178359929e-02 -4.239456463293059946e-02 6.686757328995440036e-03 5.286080646337049799e-02 -6.917231028063640375e-02 8.080427118137170628e-02 -3.712834601047360072e-02 5.691179930721949887e-02
260 1.264813727628719998e-02 -4.464163650698899782e-02 -3.638469220447349689e-02 4.252957915737339695e-02 -1.395253554402150001e-02 1.293437758520510003e-02 -2.683347553363510038e-02 5.156973385758089994e-03 -4.398540256559110156e-02 7.206516329203029904e-03
261 4.170844488444359899e-02 -4.464163650698899782e-02 -8.361578283570040432e-03 -5.731367096097819691e-02 8.062710187196569719e-03 -3.137612975801370302e-02 1.517259579645879874e-01 -7.639450375000099436e-02 -8.023654024890179703e-02 -1.764612515980519894e-02
262 4.897352178648269744e-02 -4.464163650698899782e-02 -4.177375257387799801e-02 1.045012516446259948e-01 3.558176735121919981e-02 -2.573945744580210040e-02 1.774974225931970073e-01 -7.639450375000099436e-02 -1.290794225416879923e-02 1.549073015887240078e-02
263 -1.641217033186929963e-02 5.068011873981870252e-02 1.274427430254229943e-01 9.761551025715360652e-02 1.631842733640340160e-02 1.747503028115330106e-02 -2.131101882750449997e-02 3.430885887772629900e-02 3.486419309615960277e-02 3.064409414368320182e-03
264 -7.453278554818210111e-02 5.068011873981870252e-02 -7.734155101194770121e-02 -4.698505887976939938e-02 -4.697540414084860200e-02 -3.262872360517189707e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -7.212845460195599356e-02 -1.764612515980519894e-02
265 3.444336798240450054e-02 5.068011873981870252e-02 2.828403222838059977e-02 -3.321357610482440076e-02 -4.559945128264750180e-02 -9.768885894535990141e-03 -5.076412126020100196e-02 -2.592261998182820038e-03 -5.947269741072230137e-02 -2.178823207463989955e-02
266 -3.457486258696700065e-02 5.068011873981870252e-02 -2.560657146566450160e-02 -1.714684618924559867e-02 1.182945896190920002e-03 -2.879619735166290186e-03 8.142083605192099172e-03 -1.550765430475099967e-02 1.482271084126630077e-02 4.034337164788070335e-02
267 -5.273755484206479882e-02 5.068011873981870252e-02 -6.225218197761509670e-02 1.154374291374709975e-02 -8.448724111216979540e-03 -3.669965360843580049e-02 1.222728555318910032e-01 -7.639450375000099436e-02 -8.682899321629239386e-02 3.064409414368320182e-03
268 5.987113713954139715e-02 -4.464163650698899782e-02 -8.168937664037369826e-04 -8.485663651086830517e-02 7.548440023905199359e-02 7.947842571548069390e-02 4.460445801105040325e-03 3.430885887772629900e-02 2.337484127982079885e-02 2.791705090337660150e-02
269 6.350367559056099842e-02 5.068011873981870252e-02 8.864150836571099701e-02 7.007254470726349826e-02 2.044628591100669870e-02 3.751653183568340322e-02 -5.076412126020100196e-02 7.120997975363539678e-02 2.930041326858690010e-02 7.348022696655839847e-02
270 9.015598825267629943e-03 -4.464163650698899782e-02 -3.207344390894990155e-02 -2.632783471735180084e-02 4.246153164222479792e-02 -1.039518281811509931e-02 1.590892335727620011e-01 -7.639450375000099436e-02 -1.190068480150809939e-02 -3.835665973397880263e-02
271 5.383060374248070309e-03 5.068011873981870252e-02 3.043965637614240091e-02 8.384402748220859403e-02 -3.734373413344069942e-02 -4.734670130927989828e-02 1.550535921336619952e-02 -3.949338287409189657e-02 8.640282933063080789e-03 1.549073015887240078e-02
272 3.807590643342410180e-02 5.068011873981870252e-02 8.883414898524360018e-03 4.252957915737339695e-02 -4.284754556624519733e-02 -2.104223051895920057e-02 -3.971920784793980114e-02 -2.592261998182820038e-03 -1.811826730789670159e-02 7.206516329203029904e-03
273 1.264813727628719998e-02 -4.464163650698899782e-02 6.727790750762559745e-03 -5.616604740787570216e-02 -7.587041416307230279e-02 -6.644875747844139480e-02 -2.131101882750449997e-02 -3.764832683029650101e-02 -1.811826730789670159e-02 -9.220404962683000083e-02
274 7.440129094361959405e-02 5.068011873981870252e-02 -2.021751109626000048e-02 4.597244985110970211e-02 7.410844738085080319e-02 3.281930490884039930e-02 -3.603757004385269719e-02 7.120997975363539678e-02 1.063542767417259977e-01 3.620126473304600273e-02
275 1.628067572730669890e-02 -4.464163650698899782e-02 -2.452875939178359929e-02 3.564383776990089764e-02 -7.072771253015849857e-03 -3.192768196955810076e-03 -1.394774321933030074e-02 -2.592261998182820038e-03 1.556684454070180086e-02 1.549073015887240078e-02
276 -5.514554978810590376e-03 5.068011873981870252e-02 -1.159501450521270051e-02 1.154374291374709975e-02 -2.220825269322829892e-02 -1.540555820674759969e-02 -2.131101882750449997e-02 -2.592261998182820038e-03 1.100810104587249955e-02 6.933812005172369786e-02
277 1.264813727628719998e-02 -4.464163650698899782e-02 2.612840808061879863e-02 6.318680331979099896e-02 1.250187031342930022e-01 9.169121572527250130e-02 6.336665066649820044e-02 -2.592261998182820038e-03 5.757285620242599822e-02 -2.178823207463989955e-02
278 -3.457486258696700065e-02 -4.464163650698899782e-02 -5.901874575597240019e-02 1.215130832538269907e-03 -5.385516843185429725e-02 -7.803525056465400456e-02 6.704828847058519337e-02 -7.639450375000099436e-02 -2.139368094035999993e-02 1.549073015887240078e-02
279 6.713621404158050254e-02 5.068011873981870252e-02 -3.638469220447349689e-02 -8.485663651086830517e-02 -7.072771253015849857e-03 1.966706951368000014e-02 -5.444575906428809897e-02 3.430885887772629900e-02 1.143797379512540100e-03 3.205915781821130212e-02
280 3.807590643342410180e-02 5.068011873981870252e-02 -2.452875939178359929e-02 4.658001526274530187e-03 -2.633611126783170012e-02 -2.636575436938120090e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 -2.593033898947460017e-02
281 9.015598825267629943e-03 5.068011873981870252e-02 1.858372356345249984e-02 3.908670846363720280e-02 1.769438019460449832e-02 1.058576412178359981e-02 1.918699701745330000e-02 -2.592261998182820038e-03 1.630495279994180133e-02 -1.764612515980519894e-02
282 -9.269547780327989928e-02 5.068011873981870252e-02 -9.027529589851850111e-02 -5.731367096097819691e-02 -2.496015840963049931e-02 -3.043668437264510085e-02 -6.584467611156170040e-03 -2.592261998182820038e-03 2.405258322689299982e-02 3.064409414368320182e-03
283 7.076875249260000666e-02 -4.464163650698899782e-02 -5.128142061927360405e-03 -5.670610554934250001e-03 8.786797596286209655e-02 1.029645603496960049e-01 1.182372140927919965e-02 3.430885887772629900e-02 -8.944018957797799166e-03 2.791705090337660150e-02
284 -1.641217033186929963e-02 -4.464163650698899782e-02 -5.255187331268700024e-02 -3.321357610482440076e-02 -4.422349842444640161e-02 -3.638650514664620167e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -6.832974362442149896e-02 -3.007244590430930078e-02
285 4.170844488444359899e-02 5.068011873981870252e-02 -2.237313524402180162e-02 2.875809638242839833e-02 -6.623874415566440021e-02 -4.515466207675319921e-02 -6.180903467246220279e-02 -2.592261998182820038e-03 2.863770518940129874e-03 -5.492508739331759815e-02
286 1.264813727628719998e-02 -4.464163650698899782e-02 -2.021751109626000048e-02 -1.599922263614299983e-02 1.219056876180000040e-02 2.123281182262769934e-02 -7.653558588881050062e-02 1.081111006295440019e-01 5.988072306548120061e-02 -2.178823207463989955e-02
287 -3.820740103798660192e-02 -4.464163650698899782e-02 -5.470749746044879791e-02 -7.797089512339580586e-02 -3.321587555883730170e-02 -8.649025903297140327e-02 1.406810445523269948e-01 -7.639450375000099436e-02 -1.919704761394450121e-02 -5.219804415301099697e-03
288 4.534098333546320025e-02 -4.464163650698899782e-02 -6.205954135808240159e-03 -1.599922263614299983e-02 1.250187031342930022e-01 1.251981011367520047e-01 1.918699701745330000e-02 3.430885887772629900e-02 3.243322577960189995e-02 -5.219804415301099697e-03
289 7.076875249260000666e-02 5.068011873981870252e-02 -1.698407487461730050e-02 2.187235499495579841e-02 4.383748450042589812e-02 5.630543954305530091e-02 3.759518603788870178e-02 -2.592261998182820038e-03 -7.020931272868760620e-02 -1.764612515980519894e-02
290 -7.453278554818210111e-02 5.068011873981870252e-02 5.522933407540309841e-02 -4.009931749229690007e-02 5.346915450783389784e-02 5.317395492515999966e-02 -4.340084565202689815e-02 7.120997975363539678e-02 6.123790751970099866e-02 -3.421455281914410201e-02
291 5.987113713954139715e-02 5.068011873981870252e-02 7.678557555302109594e-02 2.531522568869210010e-02 1.182945896190920002e-03 1.684873335757430118e-02 -5.444575906428809897e-02 3.430885887772629900e-02 2.993564839653250001e-02 4.448547856271539702e-02
292 7.440129094361959405e-02 -4.464163650698899782e-02 1.858372356345249984e-02 6.318680331979099896e-02 6.172487165704060308e-02 4.284005568610550069e-02 8.142083605192099172e-03 -2.592261998182820038e-03 5.803912766389510147e-02 -5.906719430815229877e-02
293 9.015598825267629943e-03 -4.464163650698899782e-02 -2.237313524402180162e-02 -3.206595255172180192e-02 -4.972730985725089953e-02 -6.864079671096809387e-02 7.809320188284639419e-02 -7.085933561861459951e-02 -6.291294991625119570e-02 -3.835665973397880263e-02
294 -7.090024709716259699e-02 -4.464163650698899782e-02 9.295275666123460623e-02 1.269136646684959971e-02 2.044628591100669870e-02 4.252690722431590187e-02 7.788079970179680352e-04 3.598276718899090076e-04 -5.454415271109520208e-02 -1.077697500466389974e-03
295 2.354575262934580082e-02 5.068011873981870252e-02 -3.099563183506899924e-02 -5.670610554934250001e-03 -1.670444126042380101e-02 1.778817874294279927e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -7.408887149153539631e-02 -3.421455281914410201e-02
296 -5.273755484206479882e-02 5.068011873981870252e-02 3.906215296718960200e-02 -4.009931749229690007e-02 -5.696818394814720174e-03 -1.290037051243130006e-02 1.182372140927919965e-02 -3.949338287409189657e-02 1.630495279994180133e-02 3.064409414368320182e-03
297 6.713621404158050254e-02 -4.464163650698899782e-02 -6.117436990373419786e-02 -4.009931749229690007e-02 -2.633611126783170012e-02 -2.448686359864400003e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -5.615757309500619965e-02 -5.906719430815229877e-02
298 1.750521923228520000e-03 -4.464163650698899782e-02 -8.361578283570040432e-03 -6.419941234845069622e-02 -3.871968699164179961e-02 -2.448686359864400003e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -6.468302246445030435e-02 -5.492508739331759815e-02
299 2.354575262934580082e-02 5.068011873981870252e-02 -3.746250427835440266e-02 -4.698505887976939938e-02 -9.100589560328480043e-02 -7.553006287033779687e-02 -3.235593223976569732e-02 -3.949338287409189657e-02 -3.075120986455629965e-02 -1.350401824497050006e-02
300 3.807590643342410180e-02 5.068011873981870252e-02 -1.375063865297449991e-02 -1.599922263614299983e-02 -3.596778127523959923e-02 -2.198167590432769866e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -2.595242443518940012e-02 -1.077697500466389974e-03
301 1.628067572730669890e-02 -4.464163650698899782e-02 7.355213933137849658e-02 -4.124694104539940176e-02 -4.320865536613589623e-03 -1.352666743601040056e-02 -1.394774321933030074e-02 -1.116217163146459961e-03 4.289568789252869857e-02 4.448547856271539702e-02
302 -1.882016527791040067e-03 5.068011873981870252e-02 -2.452875939178359929e-02 5.285819123858220142e-02 2.732605020201240090e-02 3.000096875273459973e-02 3.023191042971450082e-02 -2.592261998182820038e-03 -2.139368094035999993e-02 3.620126473304600273e-02
303 1.264813727628719998e-02 -4.464163650698899782e-02 3.367309259778510089e-02 3.334859052598110329e-02 3.007795591841460128e-02 2.718263259662880016e-02 -2.902829807069099918e-03 8.847085473348980864e-03 3.119299070280229930e-02 2.791705090337660150e-02
304 7.440129094361959405e-02 -4.464163650698899782e-02 3.475090467166599972e-02 9.417263956341730136e-02 5.759701308243719842e-02 2.029336643725910064e-02 2.286863482154040048e-02 -2.592261998182820038e-03 7.380214692004880006e-02 -2.178823207463989955e-02
305 4.170844488444359899e-02 5.068011873981870252e-02 -3.854031635223530150e-02 5.285819123858220142e-02 7.686035309725310072e-02 1.164299442066459994e-01 -3.971920784793980114e-02 7.120997975363539678e-02 -2.251217192966049885e-02 -1.350401824497050006e-02
306 -9.147093429830140468e-03 5.068011873981870252e-02 -3.961812842611620034e-02 -4.009931749229690007e-02 -8.448724111216979540e-03 1.622243643399520069e-02 -6.549067247654929980e-02 7.120997975363539678e-02 1.776347786711730131e-02 -6.735140813782170000e-02
307 9.015598825267629943e-03 5.068011873981870252e-02 -1.894705840284650021e-03 2.187235499495579841e-02 -3.871968699164179961e-02 -2.480001206043359885e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -3.980959436433750137e-02 -1.350401824497050006e-02
308 6.713621404158050254e-02 5.068011873981870252e-02 -3.099563183506899924e-02 4.658001526274530187e-03 2.457414448561009990e-02 3.563764106494619888e-02 -2.867429443567860031e-02 3.430885887772629900e-02 2.337484127982079885e-02 8.176444079622779970e-02
309 1.750521923228520000e-03 -4.464163650698899782e-02 -4.608500086940160029e-02 -3.321357610482440076e-02 -7.311850844667000526e-02 -8.147988364433890462e-02 4.495846164606279866e-02 -6.938329078357829971e-02 -6.117659509433449883e-02 -7.977772888232589898e-02
310 -9.147093429830140468e-03 5.068011873981870252e-02 1.338730381358059929e-03 -2.227739861197989939e-03 7.961225881365530110e-02 7.008397186179469995e-02 3.391354823380159783e-02 -2.592261998182820038e-03 2.671425763351279944e-02 8.176444079622779970e-02
311 -5.514554978810590376e-03 -4.464163650698899782e-02 6.492964274033119487e-02 3.564383776990089764e-02 -1.568959820211340015e-03 1.496984258683710031e-02 -1.394774321933030074e-02 7.288388806489919797e-04 -1.811826730789670159e-02 3.205915781821130212e-02
312 9.619652164973699349e-02 -4.464163650698899782e-02 4.013996504107050084e-02 -5.731367096097819691e-02 4.521343735862710239e-02 6.068951800810880315e-02 -2.131101882750449997e-02 3.615391492152170150e-02 1.255315281338930007e-02 2.377494398854190089e-02
313 -7.453278554818210111e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -5.670610554934250001e-03 -2.083229983502719873e-02 -1.415296435958940044e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -3.845911230135379971e-02 -3.007244590430930078e-02
314 5.987113713954139715e-02 5.068011873981870252e-02 5.307370992764130074e-02 5.285819123858220142e-02 3.282986163481690228e-02 1.966706951368000014e-02 -1.026610541524320026e-02 3.430885887772629900e-02 5.520503808961670089e-02 -1.077697500466389974e-03
315 -2.367724723390840155e-02 -4.464163650698899782e-02 4.013996504107050084e-02 -1.255635194240680048e-02 -9.824676969418109224e-03 -1.000728964429089965e-03 -2.902829807069099918e-03 -2.592261998182820038e-03 -1.190068480150809939e-02 -3.835665973397880263e-02
316 9.015598825267629943e-03 -4.464163650698899782e-02 -2.021751109626000048e-02 -5.387080026724189868e-02 3.145390877661580209e-02 2.060651489904859884e-02 5.600337505832399948e-02 -3.949338287409189657e-02 -1.090443584737709956e-02 -1.077697500466389974e-03
317 1.628067572730669890e-02 5.068011873981870252e-02 1.427247526792889930e-02 1.215130832538269907e-03 1.182945896190920002e-03 -2.135537898074869878e-02 -3.235593223976569732e-02 3.430885887772629900e-02 7.496833602773420036e-02 4.034337164788070335e-02
318 1.991321417832630017e-02 -4.464163650698899782e-02 -3.422906805671169922e-02 5.515343848250200270e-02 6.722868308984519814e-02 7.415490186505870052e-02 -6.584467611156170040e-03 3.283281404268990206e-02 2.472532334280450050e-02 6.933812005172369786e-02
319 8.893144474769780483e-02 -4.464163650698899782e-02 6.727790750762559745e-03 2.531522568869210010e-02 3.007795591841460128e-02 8.706873351046409346e-03 6.336665066649820044e-02 -3.949338287409189657e-02 9.436409146079870192e-03 3.205915781821130212e-02
320 1.991321417832630017e-02 -4.464163650698899782e-02 4.572166603000769880e-03 4.597244985110970211e-02 -1.808039411862490120e-02 -5.454911593043910295e-02 6.336665066649820044e-02 -3.949338287409189657e-02 2.866072031380889965e-02 6.105390622205419948e-02
321 -2.367724723390840155e-02 -4.464163650698899782e-02 3.043965637614240091e-02 -5.670610554934250001e-03 8.236416453005759863e-02 9.200436418706199604e-02 -1.762938102341739949e-02 7.120997975363539678e-02 3.304707235493409972e-02 3.064409414368320182e-03
322 9.619652164973699349e-02 -4.464163650698899782e-02 5.199589785376040191e-02 7.925353333865589600e-02 5.484510736603499803e-02 3.657708645031480105e-02 -7.653558588881050062e-02 1.413221094178629955e-01 9.864637430492799453e-02 6.105390622205419948e-02
323 2.354575262934580082e-02 5.068011873981870252e-02 6.169620651868849837e-02 6.203917986997459916e-02 2.457414448561009990e-02 -3.607335668485669999e-02 -9.126213710515880539e-02 1.553445353507079962e-01 1.333957338374689994e-01 8.176444079622779970e-02
324 7.076875249260000666e-02 5.068011873981870252e-02 -7.283766209689159811e-03 4.941532054484590319e-02 6.034891879883950289e-02 -4.445362044113949918e-03 -5.444575906428809897e-02 1.081111006295440019e-01 1.290194116001679991e-01 5.691179930721949887e-02
325 3.081082953138499989e-02 -4.464163650698899782e-02 5.649978676881649634e-03 1.154374291374709975e-02 7.823630595545419397e-02 7.791268340653299818e-02 -4.340084565202689815e-02 1.081111006295440019e-01 6.604820616309839409e-02 1.963283707370720027e-02
326 -1.882016527791040067e-03 -4.464163650698899782e-02 5.415152200152219958e-02 -6.649465948908450663e-02 7.273249452264969606e-02 5.661858800484489973e-02 -4.340084565202689815e-02 8.486339447772170419e-02 8.449528221240310000e-02 4.862758547755009764e-02
327 4.534098333546320025e-02 5.068011873981870252e-02 -8.361578283570040432e-03 -3.321357610482440076e-02 -7.072771253015849857e-03 1.191310268097639903e-03 -3.971920784793980114e-02 3.430885887772629900e-02 2.993564839653250001e-02 2.791705090337660150e-02
328 7.440129094361959405e-02 -4.464163650698899782e-02 1.145089981388529993e-01 2.875809638242839833e-02 2.457414448561009990e-02 2.499059336410210108e-02 1.918699701745330000e-02 -2.592261998182820038e-03 -6.092541861022970299e-04 -5.219804415301099697e-03
329 -3.820740103798660192e-02 -4.464163650698899782e-02 6.708526688809300642e-02 -6.075654165471439799e-02 -2.908801698423390050e-02 -2.323426975148589965e-02 -1.026610541524320026e-02 -2.592261998182820038e-03 -1.498586820292070049e-03 1.963283707370720027e-02
330 -1.277963188084970010e-02 5.068011873981870252e-02 -5.578530953432969675e-02 -2.227739861197989939e-03 -2.771206412603280031e-02 -2.918409052548700047e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -1.705210460474350029e-02 4.448547856271539702e-02
331 9.015598825267629943e-03 5.068011873981870252e-02 3.043965637614240091e-02 4.252957915737339695e-02 -2.944912678412469915e-03 3.689023491210430272e-02 -6.549067247654929980e-02 7.120997975363539678e-02 -2.364455757213410059e-02 1.549073015887240078e-02
332 8.166636784565869944e-02 5.068011873981870252e-02 -2.560657146566450160e-02 -3.665644679856060184e-02 -7.036660273026780488e-02 -4.640725592391130305e-02 -3.971920784793980114e-02 -2.592261998182820038e-03 -4.118038518800790082e-02 -5.219804415301099697e-03
333 3.081082953138499989e-02 -4.464163650698899782e-02 1.048086894739250069e-01 7.695828609473599757e-02 -1.120062982761920074e-02 -1.133462820348369975e-02 -5.812739686837520292e-02 3.430885887772629900e-02 5.710418744784390155e-02 3.620126473304600273e-02
334 2.717829108036539862e-02 5.068011873981870252e-02 -6.205954135808240159e-03 2.875809638242839833e-02 -1.670444126042380101e-02 -1.627025888008149911e-03 -5.812739686837520292e-02 3.430885887772629900e-02 2.930041326858690010e-02 3.205915781821130212e-02
335 -6.000263174410389727e-02 5.068011873981870252e-02 -4.716281294328249912e-02 -2.288496402361559975e-02 -7.174255558846899528e-02 -5.768060054833450134e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -6.291294991625119570e-02 -5.492508739331759815e-02
336 5.383060374248070309e-03 -4.464163650698899782e-02 -4.824062501716339796e-02 -1.255635194240680048e-02 1.182945896190920002e-03 -6.637401276640669812e-03 6.336665066649820044e-02 -3.949338287409189657e-02 -5.140053526058249722e-02 -5.906719430815229877e-02
337 -2.004470878288880029e-02 -4.464163650698899782e-02 8.540807214406830050e-02 -3.665644679856060184e-02 9.199583453746550121e-02 8.949917649274570508e-02 -6.180903467246220279e-02 1.450122215054540087e-01 8.094791351127560153e-02 5.276969239238479825e-02
338 1.991321417832630017e-02 5.068011873981870252e-02 -1.267282657909369996e-02 7.007254470726349826e-02 -1.120062982761920074e-02 7.141131042098750048e-03 -3.971920784793980114e-02 3.430885887772629900e-02 5.384369968545729690e-03 3.064409414368320182e-03
339 -6.363517019512339445e-02 -4.464163650698899782e-02 -3.315125598283080038e-02 -3.321357610482440076e-02 1.182945896190920002e-03 2.405114797873349891e-02 -2.499265663159149983e-02 -2.592261998182820038e-03 -2.251217192966049885e-02 -5.906719430815229877e-02
340 2.717829108036539862e-02 -4.464163650698899782e-02 -7.283766209689159811e-03 -5.042792957350569760e-02 7.548440023905199359e-02 5.661858800484489973e-02 3.391354823380159783e-02 -2.592261998182820038e-03 4.344317225278129802e-02 1.549073015887240078e-02
341 -1.641217033186929963e-02 -4.464163650698899782e-02 -1.375063865297449991e-02 1.320442171945160059e-01 -9.824676969418109224e-03 -3.819065120534880214e-03 1.918699701745330000e-02 -3.949338287409189657e-02 -3.581672810154919867e-02 -3.007244590430930078e-02
342 3.081082953138499989e-02 5.068011873981870252e-02 5.954058237092670069e-02 5.630106193231849965e-02 -2.220825269322829892e-02 1.191310268097639903e-03 -3.235593223976569732e-02 -2.592261998182820038e-03 -2.479118743246069845e-02 -1.764612515980519894e-02
343 5.623859868852180283e-02 5.068011873981870252e-02 2.181715978509519982e-02 5.630106193231849965e-02 -7.072771253015849857e-03 1.810132720473240156e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -2.364455757213410059e-02 2.377494398854190089e-02
344 -2.004470878288880029e-02 -4.464163650698899782e-02 1.858372356345249984e-02 9.072976886968099619e-02 3.934851612593179802e-03 8.706873351046409346e-03 3.759518603788870178e-02 -3.949338287409189657e-02 -5.780006567561250114e-02 7.206516329203029904e-03
345 -1.072256316073579990e-01 -4.464163650698899782e-02 -1.159501450521270051e-02 -4.009931749229690007e-02 4.934129593323050011e-02 6.444729954958319795e-02 -1.394774321933030074e-02 3.430885887772629900e-02 7.026862549151949647e-03 -3.007244590430930078e-02
346 8.166636784565869944e-02 5.068011873981870252e-02 -2.972517914165530208e-03 -3.321357610482440076e-02 4.246153164222479792e-02 5.787118185200299664e-02 -1.026610541524320026e-02 3.430885887772629900e-02 -6.092541861022970299e-04 -1.077697500466389974e-03
347 5.383060374248070309e-03 5.068011873981870252e-02 1.750591148957160101e-02 3.220096707616459941e-02 1.277706088506949944e-01 1.273901403692790091e-01 -2.131101882750449997e-02 7.120997975363539678e-02 6.257518145805600340e-02 1.549073015887240078e-02
348 3.807590643342410180e-02 5.068011873981870252e-02 -2.991781976118810041e-02 -7.452802442965950069e-02 -1.257658268582039982e-02 -1.258722205064180012e-02 4.460445801105040325e-03 -2.592261998182820038e-03 3.711738233435969789e-03 -3.007244590430930078e-02
349 3.081082953138499989e-02 -4.464163650698899782e-02 -2.021751109626000048e-02 -5.670610554934250001e-03 -4.320865536613589623e-03 -2.949723898727649868e-02 7.809320188284639419e-02 -3.949338287409189657e-02 -1.090443584737709956e-02 -1.077697500466389974e-03
350 1.750521923228520000e-03 5.068011873981870252e-02 -5.794093368209150136e-02 -4.354218818603310115e-02 -9.650970703608929835e-02 -4.703355284749029946e-02 -9.862541271333299941e-02 3.430885887772629900e-02 -6.117659509433449883e-02 -7.149351505265640061e-02
351 -2.730978568492789874e-02 5.068011873981870252e-02 6.061839444480759953e-02 1.079441223383619947e-01 1.219056876180000040e-02 -1.759759743927430051e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 7.021129819331020649e-02 1.356118306890790048e-01
352 -8.543040090124079389e-02 5.068011873981870252e-02 -4.069594049999709917e-02 -3.321357610482440076e-02 -8.137422559587689785e-02 -6.958024209633670298e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -5.780006567561250114e-02 -4.249876664881350324e-02
353 1.264813727628719998e-02 5.068011873981870252e-02 -7.195249064254319316e-02 -4.698505887976939938e-02 -5.110326271545199972e-02 -9.713730673381550107e-02 1.185912177278039964e-01 -7.639450375000099436e-02 -2.028874775162960165e-02 -3.835665973397880263e-02
354 -5.273755484206479882e-02 -4.464163650698899782e-02 -5.578530953432969675e-02 -3.665644679856060184e-02 8.924392882106320368e-02 -3.192768196955810076e-03 8.142083605192099172e-03 3.430885887772629900e-02 1.323726493386760128e-01 3.064409414368320182e-03
355 -2.367724723390840155e-02 5.068011873981870252e-02 4.552902541047500196e-02 2.187235499495579841e-02 1.098832216940800049e-01 8.887287956916670173e-02 7.788079970179680352e-04 3.430885887772629900e-02 7.419253669003070262e-02 6.105390622205419948e-02
356 -7.453278554818210111e-02 5.068011873981870252e-02 -9.439390357450949676e-03 1.498661360748330083e-02 -3.734373413344069942e-02 -2.166852744253820046e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 1.134862324403770016e-02
357 -5.514554978810590376e-03 5.068011873981870252e-02 -3.315125598283080038e-02 -1.599922263614299983e-02 8.062710187196569719e-03 1.622243643399520069e-02 1.550535921336619952e-02 -2.592261998182820038e-03 -2.832024254799870092e-02 -7.563562196749110123e-02
358 -6.000263174410389727e-02 5.068011873981870252e-02 4.984027370599859730e-02 1.842948430121960079e-02 -1.670444126042380101e-02 -3.012353591085559917e-02 -1.762938102341739949e-02 -2.592261998182820038e-03 4.976865992074899769e-02 -5.906719430815229877e-02
359 -2.004470878288880029e-02 -4.464163650698899782e-02 -8.488623552911400694e-02 -2.632783471735180084e-02 -3.596778127523959923e-02 -3.419446591411950259e-02 4.127682384197570165e-02 -5.167075276314189725e-02 -8.238148325810279449e-02 -4.664087356364819692e-02
360 3.807590643342410180e-02 5.068011873981870252e-02 5.649978676881649634e-03 3.220096707616459941e-02 6.686757328995440036e-03 1.747503028115330106e-02 -2.499265663159149983e-02 3.430885887772629900e-02 1.482271084126630077e-02 6.105390622205419948e-02
361 1.628067572730669890e-02 -4.464163650698899782e-02 2.073934771121430098e-02 2.187235499495579841e-02 -1.395253554402150001e-02 -1.321351897422090062e-02 -6.584467611156170040e-03 -2.592261998182820038e-03 1.331596790892770020e-02 4.034337164788070335e-02
362 4.170844488444359899e-02 -4.464163650698899782e-02 -7.283766209689159811e-03 2.875809638242839833e-02 -4.284754556624519733e-02 -4.828614669464850045e-02 5.232173725423699961e-02 -7.639450375000099436e-02 -7.212845460195599356e-02 2.377494398854190089e-02
363 1.991321417832630017e-02 5.068011873981870252e-02 1.048086894739250069e-01 7.007254470726349826e-02 -3.596778127523959923e-02 -2.667890283117069911e-02 -2.499265663159149983e-02 -2.592261998182820038e-03 3.711738233435969789e-03 4.034337164788070335e-02
364 -4.910501639104519755e-02 5.068011873981870252e-02 -2.452875939178359929e-02 6.750727943574620551e-05 -4.697540414084860200e-02 -2.824464514011839830e-02 -6.549067247654929980e-02 2.840467953758080144e-02 1.919903307856710151e-02 1.134862324403770016e-02
365 1.750521923228520000e-03 5.068011873981870252e-02 -6.205954135808240159e-03 -1.944209332987930153e-02 -9.824676969418109224e-03 4.949091809572019746e-03 -3.971920784793980114e-02 3.430885887772629900e-02 1.482271084126630077e-02 9.833286845556660216e-02
366 3.444336798240450054e-02 -4.464163650698899782e-02 -3.854031635223530150e-02 -1.255635194240680048e-02 9.438663045397699403e-03 5.262240271361550044e-03 -6.584467611156170040e-03 -2.592261998182820038e-03 3.119299070280229930e-02 9.833286845556660216e-02
367 -4.547247794002570037e-02 5.068011873981870252e-02 1.371430516903520136e-01 -1.599922263614299983e-02 4.108557878402369773e-02 3.187985952347179713e-02 -4.340084565202689815e-02 7.120997975363539678e-02 7.102157794598219775e-02 4.862758547755009764e-02
368 -9.147093429830140468e-03 5.068011873981870252e-02 1.705552259806600024e-01 1.498661360748330083e-02 3.007795591841460128e-02 3.375875029420900147e-02 -2.131101882750449997e-02 3.430885887772629900e-02 3.365681290238470291e-02 3.205915781821130212e-02
369 -1.641217033186929963e-02 5.068011873981870252e-02 2.416542455238970041e-03 1.498661360748330083e-02 2.182223876920789951e-02 -1.008203435632550049e-02 -2.499265663159149983e-02 3.430885887772629900e-02 8.553312118743899850e-02 8.176444079622779970e-02
370 -9.147093429830140468e-03 -4.464163650698899782e-02 3.798434089330870317e-02 -4.009931749229690007e-02 -2.496015840963049931e-02 -3.819065120534880214e-03 -4.340084565202689815e-02 1.585829843977170153e-02 -5.145307980263110273e-03 2.791705090337660150e-02
371 1.991321417832630017e-02 -4.464163650698899782e-02 -5.794093368209150136e-02 -5.731367096097819691e-02 -1.568959820211340015e-03 -1.258722205064180012e-02 7.441156407875940126e-02 -3.949338287409189657e-02 -6.117659509433449883e-02 -7.563562196749110123e-02
372 5.260606023750229870e-02 5.068011873981870252e-02 -9.439390357450949676e-03 4.941532054484590319e-02 5.071724879143160031e-02 -1.916333974822199970e-02 -1.394774321933030074e-02 3.430885887772629900e-02 1.193439942037869961e-01 -1.764612515980519894e-02
373 -2.730978568492789874e-02 5.068011873981870252e-02 -2.345094731790270046e-02 -1.599922263614299983e-02 1.356652162000110060e-02 1.277780335431030062e-02 2.655027262562750096e-02 -2.592261998182820038e-03 -1.090443584737709956e-02 -2.178823207463989955e-02
374 -7.453278554818210111e-02 -4.464163650698899782e-02 -1.051720243133190055e-02 -5.670610554934250001e-03 -6.623874415566440021e-02 -5.705430362475540085e-02 -2.902829807069099918e-03 -3.949338287409189657e-02 -4.257210492279420166e-02 -1.077697500466389974e-03
375 -1.072256316073579990e-01 -4.464163650698899782e-02 -3.422906805671169922e-02 -6.764228304218700139e-02 -6.348683843926219983e-02 -7.051968748170529822e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -6.092541861022970299e-04 -7.977772888232589898e-02
376 4.534098333546320025e-02 5.068011873981870252e-02 -2.972517914165530208e-03 1.079441223383619947e-01 3.558176735121919981e-02 2.248540566978590033e-02 2.655027262562750096e-02 -2.592261998182820038e-03 2.801650652326400162e-02 1.963283707370720027e-02
377 -1.882016527791040067e-03 -4.464163650698899782e-02 6.816307896197400240e-02 -5.670610554934250001e-03 1.195148917014880047e-01 1.302084765253850029e-01 -2.499265663159149983e-02 8.670845052151719690e-02 4.613233103941480340e-02 -1.077697500466389974e-03
378 1.991321417832630017e-02 5.068011873981870252e-02 9.961226972405269262e-03 1.842948430121960079e-02 1.494247447820220079e-02 4.471894645684260094e-02 -6.180903467246220279e-02 7.120997975363539678e-02 9.436409146079870192e-03 -6.320930122298699938e-02
379 1.628067572730669890e-02 5.068011873981870252e-02 2.416542455238970041e-03 -5.670610554934250001e-03 -5.696818394814720174e-03 1.089891258357309975e-02 -5.076412126020100196e-02 3.430885887772629900e-02 2.269202256674450122e-02 -3.835665973397880263e-02
380 -1.882016527791040067e-03 -4.464163650698899782e-02 -3.854031635223530150e-02 2.187235499495579841e-02 -1.088932827598989989e-01 -1.156130659793979942e-01 2.286863482154040048e-02 -7.639450375000099436e-02 -4.687948284421659950e-02 2.377494398854190089e-02
381 1.628067572730669890e-02 -4.464163650698899782e-02 2.612840808061879863e-02 5.859630917623830093e-02 -6.073493272285990230e-02 -4.421521669138449989e-02 -1.394774321933030074e-02 -3.395821474270550172e-02 -5.140053526058249722e-02 -2.593033898947460017e-02
382 -7.090024709716259699e-02 5.068011873981870252e-02 -8.919748382463760228e-02 -7.452802442965950069e-02 -4.284754556624519733e-02 -2.573945744580210040e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -1.290794225416879923e-02 -5.492508739331759815e-02
383 4.897352178648269744e-02 -4.464163650698899782e-02 6.061839444480759953e-02 -2.288496402361559975e-02 -2.358420555142939912e-02 -7.271172671423199729e-02 -4.340084565202689815e-02 -2.592261998182820038e-03 1.041376113589790042e-01 3.620126473304600273e-02
384 5.383060374248070309e-03 5.068011873981870252e-02 -2.884000768730720157e-02 -9.113481248670509197e-03 -3.183992270063620150e-02 -2.887094206369749880e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -1.811826730789670159e-02 7.206516329203029904e-03
385 3.444336798240450054e-02 5.068011873981870252e-02 -2.991781976118810041e-02 4.658001526274530187e-03 9.337178739566659447e-02 8.699398879842949739e-02 3.391354823380159783e-02 -2.592261998182820038e-03 2.405258322689299982e-02 -3.835665973397880263e-02
386 2.354575262934580082e-02 5.068011873981870252e-02 -1.913969902237900103e-02 4.941532054484590319e-02 -6.348683843926219983e-02 -6.112523362801929733e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -2.595242443518940012e-02 -1.350401824497050006e-02
387 1.991321417832630017e-02 -4.464163650698899782e-02 -4.069594049999709917e-02 -1.599922263614299983e-02 -8.448724111216979540e-03 -1.759759743927430051e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -3.075120986455629965e-02 3.064409414368320182e-03
388 -4.547247794002570037e-02 -4.464163650698899782e-02 1.535028734180979987e-02 -7.452802442965950069e-02 -4.972730985725089953e-02 -1.728444897748479883e-02 -2.867429443567860031e-02 -2.592261998182820038e-03 -1.043648208321659998e-01 -7.563562196749110123e-02
389 5.260606023750229870e-02 5.068011873981870252e-02 -2.452875939178359929e-02 5.630106193231849965e-02 -7.072771253015849857e-03 -5.071658967693000106e-03 -2.131101882750449997e-02 -2.592261998182820038e-03 2.671425763351279944e-02 -3.835665973397880263e-02
390 -5.514554978810590376e-03 5.068011873981870252e-02 1.338730381358059929e-03 -8.485663651086830517e-02 -1.120062982761920074e-02 -1.665815205390569834e-02 4.864009945014990260e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 -8.806194271199530021e-02
391 9.015598825267629943e-03 5.068011873981870252e-02 6.924089103585480409e-02 5.974393262605470073e-02 1.769438019460449832e-02 -2.323426975148589965e-02 -4.708248345611389801e-02 3.430885887772629900e-02 1.032922649115240038e-01 7.348022696655839847e-02
392 -2.367724723390840155e-02 -4.464163650698899782e-02 -6.979686649478139548e-02 -6.419941234845069622e-02 -5.935897986465880211e-02 -5.047818592717519953e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -8.913686007934769340e-02 -5.078298047848289754e-02
393 -4.183993948900609910e-02 5.068011873981870252e-02 -2.991781976118810041e-02 -2.227739861197989939e-03 2.182223876920789951e-02 3.657708645031480105e-02 1.182372140927919965e-02 -2.592261998182820038e-03 -4.118038518800790082e-02 6.519601313688899724e-02
394 -7.453278554818210111e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -4.354218818603310115e-02 -2.908801698423390050e-02 -2.323426975148589965e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -3.980959436433750137e-02 -2.178823207463989955e-02
395 3.444336798240450054e-02 -4.464163650698899782e-02 1.858372356345249984e-02 5.630106193231849965e-02 1.219056876180000040e-02 -5.454911593043910295e-02 -6.917231028063640375e-02 7.120997975363539678e-02 1.300806095217529879e-01 7.206516329203029904e-03
396 -6.000263174410389727e-02 -4.464163650698899782e-02 1.338730381358059929e-03 -2.977070541108809906e-02 -7.072771253015849857e-03 -2.166852744253820046e-02 1.182372140927919965e-02 -2.592261998182820038e-03 3.181521750079859684e-02 -5.492508739331759815e-02
397 -8.543040090124079389e-02 5.068011873981870252e-02 -3.099563183506899924e-02 -2.288496402361559975e-02 -6.348683843926219983e-02 -5.423596746864960128e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -9.643322289178400675e-02 -3.421455281914410201e-02
398 5.260606023750229870e-02 -4.464163650698899782e-02 -4.050329988046450294e-03 -3.091832896419060075e-02 -4.697540414084860200e-02 -5.830689747191349775e-02 -1.394774321933030074e-02 -2.583996815000549896e-02 3.605579008983190309e-02 2.377494398854190089e-02
399 1.264813727628719998e-02 -4.464163650698899782e-02 1.535028734180979987e-02 -3.321357610482440076e-02 4.108557878402369773e-02 3.219300798526129881e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 4.506616833626150148e-02 -6.735140813782170000e-02
400 5.987113713954139715e-02 5.068011873981870252e-02 2.289497185897609866e-02 4.941532054484590319e-02 1.631842733640340160e-02 1.183835796894170019e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 3.953987807202419963e-02 1.963283707370720027e-02
401 -2.367724723390840155e-02 -4.464163650698899782e-02 4.552902541047500196e-02 9.072976886968099619e-02 -1.808039411862490120e-02 -3.544705976127759950e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -3.452371533034950118e-02 -9.361911330135799444e-03
402 1.628067572730669890e-02 -4.464163650698899782e-02 -4.500718879552070145e-02 -5.731367096097819691e-02 -3.459182841703849903e-02 -5.392281900686000246e-02 7.441156407875940126e-02 -7.639450375000099436e-02 -4.257210492279420166e-02 4.034337164788070335e-02
403 1.107266754538149961e-01 5.068011873981870252e-02 -3.315125598283080038e-02 -2.288496402361559975e-02 -4.320865536613589623e-03 2.029336643725910064e-02 -6.180903467246220279e-02 7.120997975363539678e-02 1.556684454070180086e-02 4.448547856271539702e-02
404 -2.004470878288880029e-02 -4.464163650698899782e-02 9.726400495675820157e-02 -5.670610554934250001e-03 -5.696818394814720174e-03 -2.386056667506489953e-02 -2.131101882750449997e-02 -2.592261998182820038e-03 6.168584882386619894e-02 4.034337164788070335e-02
405 -1.641217033186929963e-02 -4.464163650698899782e-02 5.415152200152219958e-02 7.007254470726349826e-02 -3.321587555883730170e-02 -2.793149667832890010e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -2.712864555432650121e-02 -9.361911330135799444e-03
406 4.897352178648269744e-02 5.068011873981870252e-02 1.231314947298999957e-01 8.384402748220859403e-02 -1.047654241852959967e-01 -1.008950882752900069e-01 -6.917231028063640375e-02 -2.592261998182820038e-03 3.664579779339879884e-02 -3.007244590430930078e-02
407 -5.637009329308430294e-02 -4.464163650698899782e-02 -8.057498723359039772e-02 -8.485663651086830517e-02 -3.734373413344069942e-02 -3.701280207022530216e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -5.615757309500619965e-02 -1.377672256900120129e-01
408 2.717829108036539862e-02 -4.464163650698899782e-02 9.295275666123460623e-02 -5.272317671413939699e-02 8.062710187196569719e-03 3.970857106821010230e-02 -2.867429443567860031e-02 2.102445536239900062e-02 -4.836172480289190057e-02 1.963283707370720027e-02
409 6.350367559056099842e-02 -4.464163650698899782e-02 -5.039624916492520257e-02 1.079441223383619947e-01 3.145390877661580209e-02 1.935392105189049847e-02 -1.762938102341739949e-02 2.360753382371260159e-02 5.803912766389510147e-02 4.034337164788070335e-02
410 -5.273755484206479882e-02 5.068011873981870252e-02 -1.159501450521270051e-02 5.630106193231849965e-02 5.622106022423609822e-02 7.290230801790049953e-02 -3.971920784793980114e-02 7.120997975363539678e-02 3.056648739841480097e-02 -5.219804415301099697e-03
411 -9.147093429830140468e-03 5.068011873981870252e-02 -2.776219561342629927e-02 8.100872220010799790e-03 4.796534307502930278e-02 3.720338337389379746e-02 -2.867429443567860031e-02 3.430885887772629900e-02 6.604820616309839409e-02 -4.249876664881350324e-02
412 5.383060374248070309e-03 -4.464163650698899782e-02 5.846277029704580186e-02 -4.354218818603310115e-02 -7.311850844667000526e-02 -7.239857825244250256e-02 1.918699701745330000e-02 -7.639450375000099436e-02 -5.140053526058249722e-02 -2.593033898947460017e-02
413 7.440129094361959405e-02 -4.464163650698899782e-02 8.540807214406830050e-02 6.318680331979099896e-02 1.494247447820220079e-02 1.309095181609989944e-02 1.550535921336619952e-02 -2.592261998182820038e-03 6.209315616505399656e-03 8.590654771106250032e-02
414 -5.273755484206479882e-02 -4.464163650698899782e-02 -8.168937664037369826e-04 -2.632783471735180084e-02 1.081461590359879960e-02 7.141131042098750048e-03 4.864009945014990260e-02 -3.949338287409189657e-02 -3.581672810154919867e-02 1.963283707370720027e-02
415 8.166636784565869944e-02 5.068011873981870252e-02 6.727790750762559745e-03 -4.522987001831730094e-03 1.098832216940800049e-01 1.170562411302250028e-01 -3.235593223976569732e-02 9.187460744414439884e-02 5.472400334817909689e-02 7.206516329203029904e-03
416 -5.514554978810590376e-03 -4.464163650698899782e-02 8.883414898524360018e-03 -5.042792957350569760e-02 2.595009734381130070e-02 4.722413415115889884e-02 -4.340084565202689815e-02 7.120997975363539678e-02 1.482271084126630077e-02 3.064409414368320182e-03
417 -2.730978568492789874e-02 -4.464163650698899782e-02 8.001901177466380632e-02 9.876313370696999938e-02 -2.944912678412469915e-03 1.810132720473240156e-02 -1.762938102341739949e-02 3.311917341962639788e-03 -2.952762274177360077e-02 3.620126473304600273e-02
418 -5.273755484206479882e-02 -4.464163650698899782e-02 7.139651518361660176e-02 -7.452802442965950069e-02 -1.532848840222260020e-02 -1.313877426218630021e-03 4.460445801105040325e-03 -2.141183364489639834e-02 -4.687948284421659950e-02 3.064409414368320182e-03
419 9.015598825267629943e-03 -4.464163650698899782e-02 -2.452875939178359929e-02 -2.632783471735180084e-02 9.887559882847110626e-02 9.419640341958869512e-02 7.072992627467229731e-02 -2.592261998182820038e-03 -2.139368094035999993e-02 7.206516329203029904e-03
420 -2.004470878288880029e-02 -4.464163650698899782e-02 -5.470749746044879791e-02 -5.387080026724189868e-02 -6.623874415566440021e-02 -5.736745208654490252e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -7.408887149153539631e-02 -5.219804415301099697e-03
421 2.354575262934580082e-02 -4.464163650698899782e-02 -3.638469220447349689e-02 6.750727943574620551e-05 1.182945896190920002e-03 3.469819567957759671e-02 -4.340084565202689815e-02 3.430885887772629900e-02 -3.324878724762579674e-02 6.105390622205419948e-02
422 3.807590643342410180e-02 5.068011873981870252e-02 1.642809941569069870e-02 2.187235499495579841e-02 3.970962592582259754e-02 4.503209491863210262e-02 -4.340084565202689815e-02 7.120997975363539678e-02 4.976865992074899769e-02 1.549073015887240078e-02
423 -7.816532399920170238e-02 5.068011873981870252e-02 7.786338762690199478e-02 5.285819123858220142e-02 7.823630595545419397e-02 6.444729954958319795e-02 2.655027262562750096e-02 -2.592261998182820038e-03 4.067226371449769728e-02 -9.361911330135799444e-03
424 9.015598825267629943e-03 5.068011873981870252e-02 -3.961812842611620034e-02 2.875809638242839833e-02 3.833367306762140020e-02 7.352860494147960002e-02 -7.285394808472339667e-02 1.081111006295440019e-01 1.556684454070180086e-02 -4.664087356364819692e-02
425 1.750521923228520000e-03 5.068011873981870252e-02 1.103903904628619932e-02 -1.944209332987930153e-02 -1.670444126042380101e-02 -3.819065120534880214e-03 -4.708248345611389801e-02 3.430885887772629900e-02 2.405258322689299982e-02 2.377494398854190089e-02
426 -7.816532399920170238e-02 -4.464163650698899782e-02 -4.069594049999709917e-02 -8.141376581713200000e-02 -1.006375656106929944e-01 -1.127947298232920004e-01 2.286863482154040048e-02 -7.639450375000099436e-02 -2.028874775162960165e-02 -5.078298047848289754e-02
427 3.081082953138499989e-02 5.068011873981870252e-02 -3.422906805671169922e-02 4.367720260718979675e-02 5.759701308243719842e-02 6.883137801463659611e-02 -3.235593223976569732e-02 5.755656502954899917e-02 3.546193866076970125e-02 8.590654771106250032e-02
428 -3.457486258696700065e-02 5.068011873981870252e-02 5.649978676881649634e-03 -5.670610554934250001e-03 -7.311850844667000526e-02 -6.269097593696699999e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -4.542095777704099890e-02 3.205915781821130212e-02
429 4.897352178648269744e-02 5.068011873981870252e-02 8.864150836571099701e-02 8.728689817594480205e-02 3.558176735121919981e-02 2.154596028441720101e-02 -2.499265663159149983e-02 3.430885887772629900e-02 6.604820616309839409e-02 1.314697237742440128e-01
430 -4.183993948900609910e-02 -4.464163650698899782e-02 -3.315125598283080038e-02 -2.288496402361559975e-02 4.658939021682820258e-02 4.158746183894729970e-02 5.600337505832399948e-02 -2.473293452372829840e-02 -2.595242443518940012e-02 -3.835665973397880263e-02
431 -9.147093429830140468e-03 -4.464163650698899782e-02 -5.686312160821060252e-02 -5.042792957350569760e-02 2.182223876920789951e-02 4.534524338042170144e-02 -2.867429443567860031e-02 3.430885887772629900e-02 -9.918957363154769225e-03 -1.764612515980519894e-02
432 7.076875249260000666e-02 5.068011873981870252e-02 -3.099563183506899924e-02 2.187235499495579841e-02 -3.734373413344069942e-02 -4.703355284749029946e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -1.495647502491130078e-02 -1.077697500466389974e-03
433 9.015598825267629943e-03 -4.464163650698899782e-02 5.522933407540309841e-02 -5.670610554934250001e-03 5.759701308243719842e-02 4.471894645684260094e-02 -2.902829807069099918e-03 2.323852261495349888e-02 5.568354770267369691e-02 1.066170822852360034e-01
434 -2.730978568492789874e-02 -4.464163650698899782e-02 -6.009655782985329903e-02 -2.977070541108809906e-02 4.658939021682820258e-02 1.998021797546959896e-02 1.222728555318910032e-01 -3.949338287409189657e-02 -5.140053526058249722e-02 -9.361911330135799444e-03
435 1.628067572730669890e-02 -4.464163650698899782e-02 1.338730381358059929e-03 8.100872220010799790e-03 5.310804470794310353e-03 1.089891258357309975e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -4.542095777704099890e-02 3.205915781821130212e-02
436 -1.277963188084970010e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -4.009931749229690007e-02 -1.670444126042380101e-02 4.635943347782499856e-03 -1.762938102341739949e-02 -2.592261998182820038e-03 -3.845911230135379971e-02 -3.835665973397880263e-02
437 -5.637009329308430294e-02 -4.464163650698899782e-02 -7.410811479030500470e-02 -5.042792957350569760e-02 -2.496015840963049931e-02 -4.703355284749029946e-02 9.281975309919469896e-02 -7.639450375000099436e-02 -6.117659509433449883e-02 -4.664087356364819692e-02
438 4.170844488444359899e-02 5.068011873981870252e-02 1.966153563733339868e-02 5.974393262605470073e-02 -5.696818394814720174e-03 -2.566471273376759888e-03 -2.867429443567860031e-02 -2.592261998182820038e-03 3.119299070280229930e-02 7.206516329203029904e-03
439 -5.514554978810590376e-03 5.068011873981870252e-02 -1.590626280073640167e-02 -6.764228304218700139e-02 4.934129593323050011e-02 7.916527725369119917e-02 -2.867429443567860031e-02 3.430885887772629900e-02 -1.811826730789670159e-02 4.448547856271539702e-02
440 4.170844488444359899e-02 5.068011873981870252e-02 -1.590626280073640167e-02 1.728186074811709910e-02 -3.734373413344069942e-02 -1.383981589779990050e-02 -2.499265663159149983e-02 -1.107951979964190078e-02 -4.687948284421659950e-02 1.549073015887240078e-02
441 -4.547247794002570037e-02 -4.464163650698899782e-02 3.906215296718960200e-02 1.215130832538269907e-03 1.631842733640340160e-02 1.528299104862660025e-02 -2.867429443567860031e-02 2.655962349378539894e-02 4.452837402140529671e-02 -2.593033898947460017e-02
442 -4.547247794002570037e-02 -4.464163650698899782e-02 -7.303030271642410587e-02 -8.141376581713200000e-02 8.374011738825870577e-02 2.780892952020790065e-02 1.738157847891100005e-01 -3.949338287409189657e-02 -4.219859706946029777e-03 3.064409414368320182e-03

View File

@@ -1,442 +0,0 @@
1.510000000000000000e+02
7.500000000000000000e+01
1.410000000000000000e+02
2.060000000000000000e+02
1.350000000000000000e+02
9.700000000000000000e+01
1.380000000000000000e+02
6.300000000000000000e+01
1.100000000000000000e+02
3.100000000000000000e+02
1.010000000000000000e+02
6.900000000000000000e+01
1.790000000000000000e+02
1.850000000000000000e+02
1.180000000000000000e+02
1.710000000000000000e+02
1.660000000000000000e+02
1.440000000000000000e+02
9.700000000000000000e+01
1.680000000000000000e+02
6.800000000000000000e+01
4.900000000000000000e+01
6.800000000000000000e+01
2.450000000000000000e+02
1.840000000000000000e+02
2.020000000000000000e+02
1.370000000000000000e+02
8.500000000000000000e+01
1.310000000000000000e+02
2.830000000000000000e+02
1.290000000000000000e+02
5.900000000000000000e+01
3.410000000000000000e+02
8.700000000000000000e+01
6.500000000000000000e+01
1.020000000000000000e+02
2.650000000000000000e+02
2.760000000000000000e+02
2.520000000000000000e+02
9.000000000000000000e+01
1.000000000000000000e+02
5.500000000000000000e+01
6.100000000000000000e+01
9.200000000000000000e+01
2.590000000000000000e+02
5.300000000000000000e+01
1.900000000000000000e+02
1.420000000000000000e+02
7.500000000000000000e+01
1.420000000000000000e+02
1.550000000000000000e+02
2.250000000000000000e+02
5.900000000000000000e+01
1.040000000000000000e+02
1.820000000000000000e+02
1.280000000000000000e+02
5.200000000000000000e+01
3.700000000000000000e+01
1.700000000000000000e+02
1.700000000000000000e+02
6.100000000000000000e+01
1.440000000000000000e+02
5.200000000000000000e+01
1.280000000000000000e+02
7.100000000000000000e+01
1.630000000000000000e+02
1.500000000000000000e+02
9.700000000000000000e+01
1.600000000000000000e+02
1.780000000000000000e+02
4.800000000000000000e+01
2.700000000000000000e+02
2.020000000000000000e+02
1.110000000000000000e+02
8.500000000000000000e+01
4.200000000000000000e+01
1.700000000000000000e+02
2.000000000000000000e+02
2.520000000000000000e+02
1.130000000000000000e+02
1.430000000000000000e+02
5.100000000000000000e+01
5.200000000000000000e+01
2.100000000000000000e+02
6.500000000000000000e+01
1.410000000000000000e+02
5.500000000000000000e+01
1.340000000000000000e+02
4.200000000000000000e+01
1.110000000000000000e+02
9.800000000000000000e+01
1.640000000000000000e+02
4.800000000000000000e+01
9.600000000000000000e+01
9.000000000000000000e+01
1.620000000000000000e+02
1.500000000000000000e+02
2.790000000000000000e+02
9.200000000000000000e+01
8.300000000000000000e+01
1.280000000000000000e+02
1.020000000000000000e+02
3.020000000000000000e+02
1.980000000000000000e+02
9.500000000000000000e+01
5.300000000000000000e+01
1.340000000000000000e+02
1.440000000000000000e+02
2.320000000000000000e+02
8.100000000000000000e+01
1.040000000000000000e+02
5.900000000000000000e+01
2.460000000000000000e+02
2.970000000000000000e+02
2.580000000000000000e+02
2.290000000000000000e+02
2.750000000000000000e+02
2.810000000000000000e+02
1.790000000000000000e+02
2.000000000000000000e+02
2.000000000000000000e+02
1.730000000000000000e+02
1.800000000000000000e+02
8.400000000000000000e+01
1.210000000000000000e+02
1.610000000000000000e+02
9.900000000000000000e+01
1.090000000000000000e+02
1.150000000000000000e+02
2.680000000000000000e+02
2.740000000000000000e+02
1.580000000000000000e+02
1.070000000000000000e+02
8.300000000000000000e+01
1.030000000000000000e+02
2.720000000000000000e+02
8.500000000000000000e+01
2.800000000000000000e+02
3.360000000000000000e+02
2.810000000000000000e+02
1.180000000000000000e+02
3.170000000000000000e+02
2.350000000000000000e+02
6.000000000000000000e+01
1.740000000000000000e+02
2.590000000000000000e+02
1.780000000000000000e+02
1.280000000000000000e+02
9.600000000000000000e+01
1.260000000000000000e+02
2.880000000000000000e+02
8.800000000000000000e+01
2.920000000000000000e+02
7.100000000000000000e+01
1.970000000000000000e+02
1.860000000000000000e+02
2.500000000000000000e+01
8.400000000000000000e+01
9.600000000000000000e+01
1.950000000000000000e+02
5.300000000000000000e+01
2.170000000000000000e+02
1.720000000000000000e+02
1.310000000000000000e+02
2.140000000000000000e+02
5.900000000000000000e+01
7.000000000000000000e+01
2.200000000000000000e+02
2.680000000000000000e+02
1.520000000000000000e+02
4.700000000000000000e+01
7.400000000000000000e+01
2.950000000000000000e+02
1.010000000000000000e+02
1.510000000000000000e+02
1.270000000000000000e+02
2.370000000000000000e+02
2.250000000000000000e+02
8.100000000000000000e+01
1.510000000000000000e+02
1.070000000000000000e+02
6.400000000000000000e+01
1.380000000000000000e+02
1.850000000000000000e+02
2.650000000000000000e+02
1.010000000000000000e+02
1.370000000000000000e+02
1.430000000000000000e+02
1.410000000000000000e+02
7.900000000000000000e+01
2.920000000000000000e+02
1.780000000000000000e+02
9.100000000000000000e+01
1.160000000000000000e+02
8.600000000000000000e+01
1.220000000000000000e+02
7.200000000000000000e+01
1.290000000000000000e+02
1.420000000000000000e+02
9.000000000000000000e+01
1.580000000000000000e+02
3.900000000000000000e+01
1.960000000000000000e+02
2.220000000000000000e+02
2.770000000000000000e+02
9.900000000000000000e+01
1.960000000000000000e+02
2.020000000000000000e+02
1.550000000000000000e+02
7.700000000000000000e+01
1.910000000000000000e+02
7.000000000000000000e+01
7.300000000000000000e+01
4.900000000000000000e+01
6.500000000000000000e+01
2.630000000000000000e+02
2.480000000000000000e+02
2.960000000000000000e+02
2.140000000000000000e+02
1.850000000000000000e+02
7.800000000000000000e+01
9.300000000000000000e+01
2.520000000000000000e+02
1.500000000000000000e+02
7.700000000000000000e+01
2.080000000000000000e+02
7.700000000000000000e+01
1.080000000000000000e+02
1.600000000000000000e+02
5.300000000000000000e+01
2.200000000000000000e+02
1.540000000000000000e+02
2.590000000000000000e+02
9.000000000000000000e+01
2.460000000000000000e+02
1.240000000000000000e+02
6.700000000000000000e+01
7.200000000000000000e+01
2.570000000000000000e+02
2.620000000000000000e+02
2.750000000000000000e+02
1.770000000000000000e+02
7.100000000000000000e+01
4.700000000000000000e+01
1.870000000000000000e+02
1.250000000000000000e+02
7.800000000000000000e+01
5.100000000000000000e+01
2.580000000000000000e+02
2.150000000000000000e+02
3.030000000000000000e+02
2.430000000000000000e+02
9.100000000000000000e+01
1.500000000000000000e+02
3.100000000000000000e+02
1.530000000000000000e+02
3.460000000000000000e+02
6.300000000000000000e+01
8.900000000000000000e+01
5.000000000000000000e+01
3.900000000000000000e+01
1.030000000000000000e+02
3.080000000000000000e+02
1.160000000000000000e+02
1.450000000000000000e+02
7.400000000000000000e+01
4.500000000000000000e+01
1.150000000000000000e+02
2.640000000000000000e+02
8.700000000000000000e+01
2.020000000000000000e+02
1.270000000000000000e+02
1.820000000000000000e+02
2.410000000000000000e+02
6.600000000000000000e+01
9.400000000000000000e+01
2.830000000000000000e+02
6.400000000000000000e+01
1.020000000000000000e+02
2.000000000000000000e+02
2.650000000000000000e+02
9.400000000000000000e+01
2.300000000000000000e+02
1.810000000000000000e+02
1.560000000000000000e+02
2.330000000000000000e+02
6.000000000000000000e+01
2.190000000000000000e+02
8.000000000000000000e+01
6.800000000000000000e+01
3.320000000000000000e+02
2.480000000000000000e+02
8.400000000000000000e+01
2.000000000000000000e+02
5.500000000000000000e+01
8.500000000000000000e+01
8.900000000000000000e+01
3.100000000000000000e+01
1.290000000000000000e+02
8.300000000000000000e+01
2.750000000000000000e+02
6.500000000000000000e+01
1.980000000000000000e+02
2.360000000000000000e+02
2.530000000000000000e+02
1.240000000000000000e+02
4.400000000000000000e+01
1.720000000000000000e+02
1.140000000000000000e+02
1.420000000000000000e+02
1.090000000000000000e+02
1.800000000000000000e+02
1.440000000000000000e+02
1.630000000000000000e+02
1.470000000000000000e+02
9.700000000000000000e+01
2.200000000000000000e+02
1.900000000000000000e+02
1.090000000000000000e+02
1.910000000000000000e+02
1.220000000000000000e+02
2.300000000000000000e+02
2.420000000000000000e+02
2.480000000000000000e+02
2.490000000000000000e+02
1.920000000000000000e+02
1.310000000000000000e+02
2.370000000000000000e+02
7.800000000000000000e+01
1.350000000000000000e+02
2.440000000000000000e+02
1.990000000000000000e+02
2.700000000000000000e+02
1.640000000000000000e+02
7.200000000000000000e+01
9.600000000000000000e+01
3.060000000000000000e+02
9.100000000000000000e+01
2.140000000000000000e+02
9.500000000000000000e+01
2.160000000000000000e+02
2.630000000000000000e+02
1.780000000000000000e+02
1.130000000000000000e+02
2.000000000000000000e+02
1.390000000000000000e+02
1.390000000000000000e+02
8.800000000000000000e+01
1.480000000000000000e+02
8.800000000000000000e+01
2.430000000000000000e+02
7.100000000000000000e+01
7.700000000000000000e+01
1.090000000000000000e+02
2.720000000000000000e+02
6.000000000000000000e+01
5.400000000000000000e+01
2.210000000000000000e+02
9.000000000000000000e+01
3.110000000000000000e+02
2.810000000000000000e+02
1.820000000000000000e+02
3.210000000000000000e+02
5.800000000000000000e+01
2.620000000000000000e+02
2.060000000000000000e+02
2.330000000000000000e+02
2.420000000000000000e+02
1.230000000000000000e+02
1.670000000000000000e+02
6.300000000000000000e+01
1.970000000000000000e+02
7.100000000000000000e+01
1.680000000000000000e+02
1.400000000000000000e+02
2.170000000000000000e+02
1.210000000000000000e+02
2.350000000000000000e+02
2.450000000000000000e+02
4.000000000000000000e+01
5.200000000000000000e+01
1.040000000000000000e+02
1.320000000000000000e+02
8.800000000000000000e+01
6.900000000000000000e+01
2.190000000000000000e+02
7.200000000000000000e+01
2.010000000000000000e+02
1.100000000000000000e+02
5.100000000000000000e+01
2.770000000000000000e+02
6.300000000000000000e+01
1.180000000000000000e+02
6.900000000000000000e+01
2.730000000000000000e+02
2.580000000000000000e+02
4.300000000000000000e+01
1.980000000000000000e+02
2.420000000000000000e+02
2.320000000000000000e+02
1.750000000000000000e+02
9.300000000000000000e+01
1.680000000000000000e+02
2.750000000000000000e+02
2.930000000000000000e+02
2.810000000000000000e+02
7.200000000000000000e+01
1.400000000000000000e+02
1.890000000000000000e+02
1.810000000000000000e+02
2.090000000000000000e+02
1.360000000000000000e+02
2.610000000000000000e+02
1.130000000000000000e+02
1.310000000000000000e+02
1.740000000000000000e+02
2.570000000000000000e+02
5.500000000000000000e+01
8.400000000000000000e+01
4.200000000000000000e+01
1.460000000000000000e+02
2.120000000000000000e+02
2.330000000000000000e+02
9.100000000000000000e+01
1.110000000000000000e+02
1.520000000000000000e+02
1.200000000000000000e+02
6.700000000000000000e+01
3.100000000000000000e+02
9.400000000000000000e+01
1.830000000000000000e+02
6.600000000000000000e+01
1.730000000000000000e+02
7.200000000000000000e+01
4.900000000000000000e+01
6.400000000000000000e+01
4.800000000000000000e+01
1.780000000000000000e+02
1.040000000000000000e+02
1.320000000000000000e+02
2.200000000000000000e+02
5.700000000000000000e+01

View File

@@ -80,9 +80,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register input and output datasets\n",
"## Create trained model\n",
"\n",
"For this example, we have provided a small model (`sklearn_regression_model.pkl` in the notebook's directory) that was trained on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). Here, you will register the data used to create this model in your workspace."
"For this example, we will train a small model on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). "
]
},
{
@@ -91,9 +91,42 @@
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"\n",
"dataset_x, dataset_y = load_diabetes(return_X_y=True)\n",
"\n",
"model = Ridge().fit(dataset_x, dataset_y)\n",
"\n",
"joblib.dump(model, 'sklearn_regression_model.pkl')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register input and output datasets\n",
"\n",
"Here, you will register the data used to create the model in your workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"from azureml.core import Dataset\n",
"\n",
"\n",
"np.savetxt('features.csv', dataset_x, delimiter=',')\n",
"np.savetxt('labels.csv', dataset_y, delimiter=',')\n",
"\n",
"datastore = ws.get_default_datastore()\n",
"datastore.upload_files(files=['./features.csv', './labels.csv'],\n",
" target_path='sklearn_regression/',\n",
@@ -125,6 +158,8 @@
},
"outputs": [],
"source": [
"import sklearn\n",
"\n",
"from azureml.core import Model\n",
"from azureml.core.resource_configuration import ResourceConfiguration\n",
"\n",
@@ -133,7 +168,7 @@
" model_name='my-sklearn-model', # Name of the registered model in your workspace.\n",
" model_path='./sklearn_regression_model.pkl', # Local file to upload and register as a model.\n",
" model_framework=Model.Framework.SCIKITLEARN, # Framework used to create the model.\n",
" model_framework_version='0.19.1', # Version of scikit-learn used to create the model.\n",
" model_framework_version=sklearn.__version__, # Version of scikit-learn used to create the model.\n",
" sample_input_dataset=input_dataset,\n",
" sample_output_dataset=output_dataset,\n",
" resource_configuration=ResourceConfiguration(cpu=1, memory_in_gb=0.5),\n",
@@ -174,19 +209,9 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"\n",
"service_name = 'my-sklearn-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"service = Model.deploy(ws, service_name, [model])\n",
"service = Model.deploy(ws, service_name, [model], overwrite=True)\n",
"service.wait_for_deployment(show_output=True)"
]
},
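The hunk above swaps the old delete-then-deploy idiom for the `overwrite=True` flag on `Model.deploy`. Pulled together into one runnable sketch — assuming a workspace loaded from a local config file and the `my-sklearn-model` registration from earlier in this notebook, not verbatim notebook code:

    from azureml.core import Workspace
    from azureml.core.model import Model

    ws = Workspace.from_config()                 # assumes a workspace config file on disk
    model = Model(ws, name='my-sklearn-model')   # the model registered earlier in this notebook

    # overwrite=True replaces any existing service with the same name,
    # so the old try/except Webservice deletion block is no longer needed.
    service = Model.deploy(ws, 'my-sklearn-service', [model], overwrite=True)
    service.wait_for_deployment(show_output=True)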
@@ -207,10 +232,7 @@
"\n",
"\n",
"input_payload = json.dumps({\n",
" 'data': [\n",
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
" -0.03482076, -0.04340085, -0.00259226, 0.01990842, -0.01764613]\n",
" ],\n",
" 'data': dataset_x[0:2].tolist(),\n",
" 'method': 'predict' # If you have a classification model, you can get probabilities by changing this to 'predict_proba'.\n",
"})\n",
"\n",
@@ -262,7 +284,7 @@
" 'inference-schema[numpy-support]',\n",
" 'joblib',\n",
" 'numpy',\n",
" 'scikit-learn'\n",
" 'scikit-learn=={}'.format(sklearn.__version__)\n",
"])"
]
},
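This change pins the web service environment's scikit-learn to the exact version that trained the model, rather than whatever an unpinned `scikit-learn` would resolve to; mismatched versions can fail to unpickle the estimator. A minimal standalone sketch of the same pattern (the environment name is illustrative, not from the notebook):

    import sklearn

    from azureml.core.environment import Environment
    from azureml.core.conda_dependencies import CondaDependencies

    env = Environment('diabetes-inference')  # hypothetical name
    env.python.conda_dependencies = CondaDependencies.create(pip_packages=[
        'azureml-defaults',                  # hosts the model as a web service
        'inference-schema[numpy-support]',
        'joblib',
        'numpy',
        # Pin to the training-time version so the pickled model loads cleanly.
        'scikit-learn=={}'.format(sklearn.__version__),
    ])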
@@ -303,20 +325,12 @@
},
"outputs": [],
"source": [
"from azureml.core import Webservice\n",
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.webservice import AciWebservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"\n",
"service_name = 'my-custom-env-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
"\n",
@@ -324,7 +338,8 @@
" name=service_name,\n",
" models=[model],\n",
" inference_config=inference_config,\n",
" deployment_config=aci_config)\n",
" deployment_config=aci_config,\n",
" overwrite=True)\n",
"service.wait_for_deployment(show_output=True)"
]
},
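These two hunks apply the same cleanup to the custom-environment path: the manual `Webservice(...).delete()` block goes away and `overwrite=True` is passed to `Model.deploy`. Stitched into one sketch (assuming the `ws`, `model`, and `environment` objects built above):

    from azureml.core.model import InferenceConfig, Model
    from azureml.core.webservice import AciWebservice

    inference_config = InferenceConfig(entry_script='score.py', environment=environment)
    aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

    service = Model.deploy(ws,
                           name='my-custom-env-service',
                           models=[model],
                           inference_config=inference_config,
                           deployment_config=aci_config,
                           overwrite=True)   # replaces any service already using this name
    service.wait_for_deployment(show_output=True)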
@@ -342,10 +357,7 @@
"outputs": [],
"source": [
"input_payload = json.dumps({\n",
" 'data': [\n",
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
" -0.03482076, -0.04340085, -0.00259226, 0.01990842, -0.01764613]\n",
" ]\n",
" 'data': dataset_x[0:2].tolist()\n",
"})\n",
"\n",
"output = service.run(input_payload)\n",
@@ -471,7 +483,7 @@
" 'inference-schema[numpy-support]',\n",
" 'joblib',\n",
" 'numpy',\n",
" 'scikit-learn'\n",
" 'scikit-learn=={}'.format(sklearn.__version__)\n",
"])\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"# if cpu and memory_in_gb parameters are not provided\n",

View File

@@ -2,3 +2,5 @@ name: model-register-and-deploy
dependencies:
- pip:
- azureml-sdk
- numpy
- scikit-learn

View File

@@ -1,8 +0,0 @@
name: project_environment
dependencies:
- python=3.6.2
- pip:
- azureml-defaults
- scikit-learn
- numpy
- inference-schema[numpy-support]

View File

@@ -75,6 +75,33 @@
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create trained model\n",
"\n",
"For this example, we will train a small model on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"dataset_x, dataset_y = load_diabetes(return_X_y=True)\n",
"\n",
"sk_model = Ridge().fit(dataset_x, dataset_y)\n",
"\n",
"joblib.dump(sk_model, \"sklearn_regression_model.pkl\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -148,13 +175,10 @@
"outputs": [],
"source": [
"%%writefile source_directory/x/y/score.py\n",
"import os\n",
"import pickle\n",
"import joblib\n",
"import json\n",
"import numpy as np\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"from azureml.core.model import Model\n",
"import os\n",
"\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
@@ -165,16 +189,17 @@
" # It holds the path to the directory that contains the deployed model (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # If there are multiple models, this value is the path to the directory containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n",
" # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n",
"\n",
" global name\n",
" # note here, entire source directory on inference config gets added into image\n",
" # bellow is the example how you can use any extra files in image\n",
" # Note here, the entire source directory from inference config gets added into image.\n",
" # Below is an example of how you can use any extra files in image.\n",
" with open('./source_directory/extradata.json') as json_file:\n",
" data = json.load(json_file)\n",
" name = data[\"people\"][0][\"name\"]\n",
"\n",
"input_sample = np.array([[10,9,8,7,6,5,4,3,2,1]])\n",
"input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n",
"\n",
"@input_schema('data', NumpyParameterType(input_sample))\n",
@@ -182,37 +207,13 @@
"def run(data):\n",
" try:\n",
" result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n",
" # You can return any JSON-serializable object.\n",
" return \"Hello \" + name + \" here is your result = \" + str(result)\n",
" except Exception as e:\n",
" error = str(e)\n",
" return error"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency for your environemnt. This package contains the functionality needed to host the model as a web service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile source_directory/env/myenv.yml\n",
"name: project_environment\n",
"dependencies:\n",
" - python=3.6.2\n",
" - pip:\n",
" - azureml-defaults\n",
" - scikit-learn\n",
" - numpy\n",
" - inference-schema[numpy-support]"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -249,11 +250,16 @@
"metadata": {},
"outputs": [],
"source": [
"import sklearn\n",
"\n",
"from azureml.core.environment import Environment\n",
"from azureml.core.model import InferenceConfig\n",
"\n",
"\n",
"myenv = Environment.from_conda_specification(name='myenv', file_path='myenv.yml')\n",
"myenv = Environment('myenv')\n",
"myenv.python.conda_dependencies.add_pip_package(\"inference-schema[numpy-support]\")\n",
"myenv.python.conda_dependencies.add_pip_package(\"joblib\")\n",
"myenv.python.conda_dependencies.add_pip_package(\"scikit-learn=={}\".format(sklearn.__version__))\n",
"\n",
"# explicitly set base_image to None when setting base_dockerfile\n",
"myenv.docker.base_image = None\n",
@@ -262,7 +268,7 @@
"\n",
"inference_config = InferenceConfig(source_directory=source_directory,\n",
" entry_script=\"x/y/score.py\",\n",
" environment=myenv)\n"
" environment=myenv)"
]
},
{
@@ -352,15 +358,10 @@
"import json\n",
"\n",
"sample_input = json.dumps({\n",
" 'data': [\n",
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
" ]\n",
" 'data': dataset_x[0:2].tolist()\n",
"})\n",
"\n",
"sample_input = bytes(sample_input, encoding='utf-8')\n",
"\n",
"print(local_service.run(input_data=sample_input))"
"print(local_service.run(sample_input))"
]
},
{
@@ -379,12 +380,10 @@
"outputs": [],
"source": [
"%%writefile source_directory/x/y/score.py\n",
"import os\n",
"import pickle\n",
"import joblib\n",
"import json\n",
"import numpy as np\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"import os\n",
"\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
@@ -395,17 +394,18 @@
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n",
" # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n",
"\n",
" global name, from_location\n",
" # note here, entire source directory on inference config gets added into image\n",
" # bellow is the example how you can use any extra files in image\n",
" # Note here, the entire source directory from inference config gets added into image.\n",
" # Below is an example of how you can use any extra files in image.\n",
" with open('source_directory/extradata.json') as json_file: \n",
" data = json.load(json_file)\n",
" name = data[\"people\"][0][\"name\"]\n",
" from_location = data[\"people\"][0][\"from\"]\n",
"\n",
"input_sample = np.array([[10,9,8,7,6,5,4,3,2,1]])\n",
"input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n",
"\n",
"@input_schema('data', NumpyParameterType(input_sample))\n",
@@ -413,7 +413,7 @@
"def run(data):\n",
" try:\n",
" result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n",
" # You can return any JSON-serializable object.\n",
" return \"Hello \" + name + \" from \" + from_location + \" here is your result = \" + str(result)\n",
" except Exception as e:\n",
" error = str(e)\n",
@@ -430,7 +430,7 @@
"print(\"--------------------------------------------------------------\")\n",
"\n",
"# After calling reload(), run() will return the updated message.\n",
"local_service.run(input_data=sample_input)"
"local_service.run(sample_input)"
]
},
{

View File

@@ -0,0 +1,5 @@
name: register-model-deploy-local-advanced
dependencies:
- pip:
- azureml-sdk
- scikit-learn

View File

@@ -71,6 +71,33 @@
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create trained model\n",
"\n",
"For this example, we will train a small model on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"dataset_x, dataset_y = load_diabetes(return_X_y=True)\n",
"\n",
"sk_model = Ridge().fit(dataset_x, dataset_y)\n",
"\n",
"joblib.dump(sk_model, \"sklearn_regression_model.pkl\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -82,9 +109,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"You can add tags and descriptions to your models. we are using `sklearn_regression_model.pkl` file in the current directory as a model with the name `sklearn_regression_model` in the workspace.\n",
"Here we are registering the serialized file `sklearn_regression_model.pkl` in the current directory as a model with the name `sklearn_regression_model` in the workspace.\n",
"\n",
"Using tags, you can track useful information such as the name and version of the machine learning library used to train the model, framework, category, target customer etc. Note that tags must be alphanumeric."
"You can add tags and descriptions to your models. Using tags, you can track useful information such as the name and version of the machine learning library used to train the model, framework, category, target customer etc. Note that tags must be alphanumeric."
]
},
{
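The registration call itself is outside this hunk; as a minimal sketch of the pattern described above (the tag values and description are illustrative), tags and a description are passed directly to `Model.register`:

```python
from azureml.core import Model

# Register the local pickle file under a workspace-level name, attaching
# alphanumeric tags and a free-form description for later filtering.
model = Model.register(model_path="sklearn_regression_model.pkl",
                       model_name="sklearn_regression_model",
                       tags={'area': "diabetes", 'type': "regression"},
                       description="Ridge regression model to predict diabetes",
                       workspace=ws)
```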
@@ -119,11 +146,62 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies\n",
"import sklearn\n",
"\n",
"from azureml.core.environment import Environment\n",
"\n",
"environment = Environment(\"LocalDeploy\")\n",
"environment.python.conda_dependencies = CondaDependencies(\"myenv.yml\")"
"environment.python.conda_dependencies.add_pip_package(\"inference-schema[numpy-support]\")\n",
"environment.python.conda_dependencies.add_pip_package(\"joblib\")\n",
"environment.python.conda_dependencies.add_pip_package(\"scikit-learn=={}\".format(sklearn.__version__))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Provide the Scoring Script\n",
"\n",
"This Python script handles the model execution inside the service container. The `init()` method loads the model file, and `run(data)` is called for every input to the service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import joblib\n",
"import json\n",
"import numpy as np\n",
"import os\n",
"\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
"\n",
"def init():\n",
" global model\n",
" # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n",
"\n",
"input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n",
"\n",
"@input_schema('data', NumpyParameterType(input_sample))\n",
"@output_schema(NumpyParameterType(output_sample))\n",
"def run(data):\n",
" try:\n",
" result = model.predict(data)\n",
" # You can return any JSON-serializable object.\n",
" return result.tolist()\n",
" except Exception as e:\n",
" error = str(e)\n",
" return error"
]
},
{
@@ -145,113 +223,6 @@
" environment=environment)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Model Profiling\n",
"\n",
"Profile your model to understand how much CPU and memory the service, created as a result of its deployment, will need. Profiling returns information such as CPU usage, memory usage, and response latency. It also provides a CPU and memory recommendation based on the resource usage. You can profile your model (or more precisely the service built based on your model) on any CPU and/or memory combination where 0.1 <= CPU <= 3.5 and 0.1GB <= memory <= 15GB. If you do not provide a CPU and/or memory requirement, we will test it on the default configuration of 3.5 CPU and 15GB memory.\n",
"\n",
"In order to profile your model you will need:\n",
"- a registered model\n",
"- an entry script\n",
"- an inference configuration\n",
"- a single column tabular dataset, where each row contains a string representing sample request data sent to the service.\n",
"\n",
"Please, note that profiling is a long running operation and can take up to 25 minutes depending on the size of the dataset.\n",
"\n",
"At this point we only support profiling of services that expect their request data to be a string, for example: string serialized json, text, string serialized image, etc. The content of each row of the dataset (string) will be put into the body of the HTTP request and sent to the service encapsulating the model for scoring.\n",
"\n",
"Below is an example of how you can construct an input dataset to profile a service which expects its incoming requests to contain serialized json. In this case we created a dataset based one hundred instances of the same request data. In real world scenarios however, we suggest that you use larger datasets with various inputs, especially if your model resource usage/behavior is input dependent."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"from azureml.core import Datastore\n",
"from azureml.core.dataset import Dataset\n",
"from azureml.data import dataset_type_definitions\n",
"\n",
"\n",
"# create a string that can be put in the body of the request\n",
"serialized_input_json = json.dumps({\n",
" 'data': [\n",
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
" ]\n",
"})\n",
"dataset_content = []\n",
"for i in range(100):\n",
" dataset_content.append(serialized_input_json)\n",
"dataset_content = '\\n'.join(dataset_content)\n",
"file_name = 'sample_request_data_diabetes.txt'\n",
"f = open(file_name, 'w')\n",
"f.write(dataset_content)\n",
"f.close()\n",
"\n",
"# upload the txt file created above to the Datastore and create a dataset from it\n",
"data_store = Datastore.get_default(ws)\n",
"data_store.upload_files(['./' + file_name], target_path='sample_request_data_diabetes')\n",
"datastore_path = [(data_store, 'sample_request_data_diabetes' +'/' + file_name)]\n",
"sample_request_data_diabetes = Dataset.Tabular.from_delimited_files(\n",
" datastore_path,\n",
" separator='\\n',\n",
" infer_column_types=True,\n",
" header=dataset_type_definitions.PromoteHeadersBehavior.NO_HEADERS)\n",
"sample_request_data_diabetes = sample_request_data_diabetes.register(workspace=ws,\n",
" name='sample_request_data_diabetes',\n",
" create_new_version=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that we have an input dataset we are ready to go ahead with profiling. In this case we are testing the previously introduced sklearn regression model on 1 CPU and 0.5 GB memory. The memory usage and recommendation presented in the result is measured in Gigabytes. The CPU usage and recommendation is measured in CPU cores."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from datetime import datetime\n",
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"from azureml.core.model import Model, InferenceConfig\n",
"\n",
"\n",
"environment = Environment('my-sklearn-environment')\n",
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
" 'azureml-defaults',\n",
" 'inference-schema[numpy-support]',\n",
" 'joblib',\n",
" 'numpy',\n",
" 'scikit-learn'\n",
"])\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"# if cpu and memory_in_gb parameters are not provided\n",
"# the model will be profiled on default configuration of\n",
"# 3.5CPU and 15GB memory\n",
"profile = Model.profile(ws,\n",
" 'profile-%s' % datetime.now().strftime('%m%d%Y-%H%M%S'),\n",
" [model],\n",
" inference_config,\n",
" input_dataset=sample_request_data_diabetes,\n",
" cpu=1.0,\n",
" memory_in_gb=0.5)\n",
"\n",
"# profiling is a long running operation and may take up to 25 min\n",
"profile.wait_for_completion(True)\n",
"details = profile.get_details()"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -338,15 +309,10 @@
"import json\n",
"\n",
"sample_input = json.dumps({\n",
" 'data': [\n",
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
" ]\n",
" 'data': dataset_x[0:2].tolist()\n",
"})\n",
"\n",
"sample_input = bytes(sample_input, encoding='utf-8')\n",
"\n",
"local_service.run(input_data=sample_input)"
"local_service.run(sample_input)"
]
},
{
@@ -365,12 +331,10 @@
"outputs": [],
"source": [
"%%writefile score.py\n",
"import os\n",
"import pickle\n",
"import joblib\n",
"import json\n",
"import numpy as np\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"import os\n",
"\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
@@ -381,10 +345,10 @@
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n",
" # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n",
"\n",
"input_sample = np.array([[10,9,8,7,6,5,4,3,2,1]])\n",
"input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n",
"\n",
"@input_schema('data', NumpyParameterType(input_sample))\n",
@@ -392,8 +356,8 @@
"def run(data):\n",
" try:\n",
" result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n",
" return 'hello from updated score.py'\n",
" # You can return any JSON-serializable object.\n",
" return 'Hello from the updated score.py: ' + str(result.tolist())\n",
" except Exception as e:\n",
" error = str(e)\n",
" return error"
@@ -409,7 +373,7 @@
"print(\"--------------------------------------------------------------\")\n",
"\n",
"# After calling reload(), run() will return the updated message.\n",
"local_service.run(input_data=sample_input)"
"local_service.run(sample_input)"
]
},
{

View File

@@ -0,0 +1,5 @@
name: register-model-deploy-local
dependencies:
- pip:
- azureml-sdk
- scikit-learn

View File

@@ -1,35 +0,0 @@
import os
import pickle
import json
import numpy as np
from sklearn.externals import joblib
from sklearn.linear_model import Ridge
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType
def init():
global model
# AZUREML_MODEL_DIR is an environment variable created during deployment.
# It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)
# For multiple models, it points to the folder containing all deployed models (./azureml-models)
model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')
# deserialize the model file back into a sklearn model
model = joblib.load(model_path)
input_sample = np.array([[10, 9, 8, 7, 6, 5, 4, 3, 2, 1]])
output_sample = np.array([3726.995])
@input_schema('data', NumpyParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
try:
result = model.predict(data)
# you can return any datatype as long as it is JSON-serializable
return result.tolist()
except Exception as e:
error = str(e)
return error

View File

@@ -108,9 +108,9 @@
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
" 'azureml-defaults',\n",
" 'inference-schema[numpy-support]',\n",
" 'joblib',\n",
" 'numpy',\n",
" 'scikit-learn'\n",
" 'scikit-learn==0.19.1',\n",
" 'scipy'\n",
"])"
]
},

View File

@@ -5,7 +5,7 @@
"metadata": {},
"source": [
"# Enabling App Insights for Services in Production\n",
"With this notebook, you can learn how to enable App Insights for standard service monitoring, plus, we provide examples for doing custom logging within a scoring files in a model. \n",
"With this notebook, you can learn how to enable App Insights for standard service monitoring, plus, we provide examples for doing custom logging within a scoring files in a model.\n",
"\n",
"\n",
"## What does Application Insights monitor?\n",
@@ -45,11 +45,13 @@
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"import json\n",
"\n",
"from azureml.core import Workspace\n",
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"from azureml.core.webservice import AksWebservice\n",
"import azureml.core\n",
"import json\n",
"\n",
"print(azureml.core.VERSION)"
]
},
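The custom logging shown later in this notebook relies on a simple mechanism: anything the scoring script writes to stdout is captured as a trace once App Insights is enabled on the service. A minimal sketch (the message text is illustrative):

```python
import time

def init():
    # Lines printed from init() or run() surface as custom traces in
    # Application Insights when the service has enable_app_insights=True.
    print("model initialized " + time.strftime("%H:%M:%S"))
```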
@@ -67,7 +69,7 @@
"outputs": [],
"source": [
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
]
},
{
@@ -84,13 +86,13 @@
"metadata": {},
"outputs": [],
"source": [
"#Register the model\n",
"from azureml.core.model import Model\n",
"model = Model.register(model_path = \"sklearn_regression_model.pkl\", # this points to a local file\n",
" model_name = \"sklearn_regression_model.pkl\", # this is the name the model is registered as\n",
" tags = {'area': \"diabetes\", 'type': \"regression\"},\n",
" description = \"Ridge regression model to predict diabetes\",\n",
" workspace = ws)\n",
"from azureml.core import Model\n",
"\n",
"model = Model.register(model_path=\"sklearn_regression_model.pkl\", # This points to a local file.\n",
" model_name=\"sklearn_regression_model.pkl\", # This is the name the model is registered as.\n",
" tags={'area': \"diabetes\", 'type': \"regression\"},\n",
" description=\"Ridge regression model to predict diabetes\",\n",
" workspace=ws)\n",
"\n",
"print(model.name, model.description, model.version)"
]
@@ -120,7 +122,7 @@
"import os\n",
"import pickle\n",
"import json\n",
"import numpy \n",
"import numpy\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"import time\n",
@@ -129,15 +131,15 @@
" global model\n",
" #Print statement for appinsights custom traces:\n",
" print (\"model initialized\" + time.strftime(\"%H:%M:%S\"))\n",
" \n",
"\n",
" # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" \n",
"\n",
" # deserialize the model file back into a sklearn model\n",
" model = joblib.load(model_path)\n",
" \n",
"\n",
"\n",
"# note you can pass in multiple rows for scoring\n",
"def run(raw_data):\n",
@@ -168,7 +170,7 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"myenv = CondaDependencies.create(conda_packages=['numpy','scikit-learn'],\n",
" pip_packages=['azureml-defaults'])\n",
@@ -190,9 +192,8 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.model import InferenceConfig\n",
"from azureml.core.environment import Environment\n",
"\n",
"from azureml.core.model import InferenceConfig\n",
"\n",
"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=\"myenv.yml\")\n",
"inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)"
@@ -213,11 +214,11 @@
"source": [
"from azureml.core.webservice import AciWebservice\n",
"\n",
"aci_deployment_config = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
" memory_gb = 1, \n",
" tags = {'area': \"diabetes\", 'type': \"regression\"}, \n",
" description = 'Predict diabetes using regression model',\n",
" enable_app_insights = True)"
"aci_deployment_config = AciWebservice.deploy_configuration(cpu_cores=1,\n",
" memory_gb=1,\n",
" tags={'area': \"diabetes\", 'type': \"regression\"},\n",
" description=\"Predict diabetes using regression model\",\n",
" enable_app_insights=True)"
]
},
{
@@ -226,29 +227,14 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.webservice import Webservice\n",
"aci_service_name = \"aci-service-appinsights\"\n",
"\n",
"aci_service = Model.deploy(ws, aci_service_name, [model], inference_config, aci_deployment_config, overwrite=True)\n",
"aci_service.wait_for_deployment(show_output=True)\n",
"\n",
"aci_service_name = 'my-aci-service-4'\n",
"aci_service = Model.deploy(ws, aci_service_name, [model], inference_config, aci_deployment_config)\n",
"aci_service.wait_for_deployment(True)\n",
"print(aci_service.state)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"\n",
"test_sample = json.dumps({'data': [\n",
" [1,28,13,45,54,6,57,8,8,10], \n",
" [101,9,8,37,6,45,4,3,2,41]\n",
"]})\n",
"test_sample = bytes(test_sample,encoding='utf8')"
]
},
{
"cell_type": "code",
"execution_count": null,
@@ -256,7 +242,15 @@
"outputs": [],
"source": [
"if aci_service.state == \"Healthy\":\n",
" prediction = aci_service.run(input_data=test_sample)\n",
" test_sample = json.dumps({\n",
" \"data\": [\n",
" [1,28,13,45,54,6,57,8,8,10],\n",
" [101,9,8,37,6,45,4,3,2,41]\n",
" ]\n",
" })\n",
"\n",
" prediction = aci_service.run(test_sample)\n",
"\n",
" print(prediction)\n",
"else:\n",
" raise ValueError(\"Service deployment isn't healthy, can't call the service. Error: \", aci_service.error)"
@@ -282,14 +276,21 @@
"metadata": {},
"outputs": [],
"source": [
"# Use the default configuration (can also provide parameters to customize)\n",
"prov_config = AksCompute.provisioning_configuration()\n",
"from azureml.exceptions import ComputeTargetException\n",
"\n",
"aks_name = 'my-aks-test3' \n",
"# Create the cluster\n",
"aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = prov_config)"
"aks_name = \"my-aks\"\n",
"\n",
"try:\n",
" aks_target = ComputeTarget(ws, aks_name)\n",
" print(\"Using existing AKS cluster {}.\".format(aks_name))\n",
"except ComputeTargetException:\n",
" print(\"Creating a new AKS cluster {}.\".format(aks_name))\n",
"\n",
" # Use the default configuration (can also provide parameters to customize).\n",
" prov_config = AksCompute.provisioning_configuration()\n",
" aks_target = ComputeTarget.create(workspace=ws,\n",
" name=aks_name,\n",
" provisioning_configuration=prov_config)"
]
},
{
@@ -299,7 +300,8 @@
"outputs": [],
"source": [
"%%time\n",
"aks_target.wait_for_completion(show_output = True)"
"if aks_target.provisioning_state != \"Succeeded\":\n",
" aks_target.wait_for_completion(show_output=True)"
]
},
{
@@ -323,13 +325,13 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"```python \n",
"```python\n",
"%%time\n",
"resource_id = '/subscriptions/<subscriptionid>/resourcegroups/<resourcegroupname>/providers/Microsoft.ContainerService/managedClusters/<aksservername>'\n",
"create_name= 'myaks4'\n",
"attach_config = AksCompute.attach_configuration(resource_id=resource_id)\n",
"aks_target = ComputeTarget.attach(workspace = ws, \n",
" name = create_name, \n",
"aks_target = ComputeTarget.attach(workspace=ws,\n",
" name=create_name,\n",
" attach_configuration=attach_config)\n",
"## Wait for the operation to complete\n",
"aks_target.wait_for_provisioning(True)```"
@@ -349,7 +351,7 @@
"metadata": {},
"outputs": [],
"source": [
"#Set the web service configuration\n",
"# Set the web service configuration.\n",
"aks_deployment_config = AksWebservice.deploy_configuration(enable_app_insights=True)"
]
},
@@ -366,15 +368,16 @@
"metadata": {},
"outputs": [],
"source": [
"if aks_target.provisioning_state== \"Succeeded\": \n",
" aks_service_name ='aks-w-dc5'\n",
"if aks_target.provisioning_state == \"Succeeded\":\n",
" aks_service_name = \"aks-service-appinsights\"\n",
" aks_service = Model.deploy(ws,\n",
" aks_service_name, \n",
" [model], \n",
" inference_config, \n",
" aks_deployment_config, \n",
" deployment_target = aks_target) \n",
" aks_service.wait_for_deployment(show_output = True)\n",
" aks_service_name,\n",
" [model],\n",
" inference_config,\n",
" aks_deployment_config,\n",
" deployment_target=aks_target,\n",
" overwrite=True)\n",
" aks_service.wait_for_deployment(show_output=True)\n",
" print(aks_service.state)\n",
"else:\n",
" raise ValueError(\"AKS provisioning failed. Error: \", aks_service.error)"
@@ -395,13 +398,14 @@
"source": [
"%%time\n",
"\n",
"test_sample = json.dumps({'data': [\n",
" [1,28,13,45,54,6,57,8,8,10], \n",
" [101,9,8,37,6,45,4,3,2,41]\n",
"]})\n",
"test_sample = bytes(test_sample,encoding='utf8')\n",
"\n",
"if aks_service.state == \"Healthy\":\n",
" test_sample = json.dumps({\n",
" \"data\": [\n",
" [1,28,13,45,54,6,57,8,8,10],\n",
" [101,9,8,37,6,45,4,3,2,41]\n",
" ]\n",
" })\n",
"\n",
" prediction = aks_service.run(input_data=test_sample)\n",
" print(prediction)\n",
"else:\n",
@@ -435,7 +439,7 @@
"outputs": [],
"source": [
"aks_service.update(enable_app_insights=False)\n",
"aks_service.wait_for_deployment(show_output = True)"
"aks_service.wait_for_deployment(show_output=True)"
]
},
{

View File

@@ -115,6 +115,11 @@
"# Convert from CoreML into ONNX\n",
"onnx_model = onnxmltools.convert_coreml(coreml_model, 'TinyYOLOv2')\n",
"\n",
"# Fix the preprocessor bias in the ImageScaler\n",
"for init in onnx_model.graph.initializer:\n",
" if init.name == 'scalerPreprocessor_bias':\n",
" init.dims[1] = 1\n",
"\n",
"# Save ONNX model\n",
"onnxmltools.utils.save_model(onnx_model, 'tinyyolov2.onnx')\n",
"\n",
@@ -255,7 +260,7 @@
"source": [
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime==0.4.0\", \"azureml-core\", \"azureml-defaults\"])\n",
"myenv = CondaDependencies.create(pip_packages=[\"numpy\", \"onnxruntime\", \"azureml-core\", \"azureml-defaults\"])\n",
"\n",
"with open(\"myenv.yml\",\"w\") as f:\n",
" f.write(myenv.serialize_to_string())"
@@ -316,7 +321,7 @@
"metadata": {},
"outputs": [],
"source": [
"aci_service_name = 'my-aci-service-15ad'\n",
"aci_service_name = 'my-aci-service-tiny-yolo'\n",
"print(\"Service\", aci_service_name)\n",
"aci_service = Model.deploy(ws, aci_service_name, [model], inference_config, aciconfig)\n",
"aci_service.wait_for_deployment(True)\n",

View File

@@ -4,4 +4,5 @@ dependencies:
- azureml-sdk
- numpy
- git+https://github.com/apple/coremltools@v2.1
- onnx<1.7.0
- onnxmltools

View File

@@ -5,5 +5,5 @@ dependencies:
- azureml-widgets
- matplotlib
- numpy
- onnx
- onnx<1.7.0
- opencv-python-headless

View File

@@ -5,5 +5,5 @@ dependencies:
- azureml-widgets
- matplotlib
- numpy
- onnx
- onnx<1.7.0
- opencv-python-headless

View File

@@ -0,0 +1,354 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/deployment/production-deploy-to-aks/production-deploy-to-aks.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Deploying a web service to Azure Kubernetes Service (AKS)\n",
"This notebook shows the steps for deploying a service: registering a model, provisioning a cluster with ssl (one time action), and deploying a service to it. \n",
"We then test and delete the service, image and model."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"from azureml.core.compute import AksCompute, ComputeTarget\n",
"from azureml.core.webservice import Webservice, AksWebservice\n",
"from azureml.core.model import Model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"print(azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Get workspace\n",
"Load existing workspace from the config file info."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.workspace import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Register the model\n",
"Register an existing trained model, add descirption and tags."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#Register the model\n",
"from azureml.core.model import Model\n",
"model = Model.register(model_path = \"sklearn_regression_model.pkl\", # this points to a local file\n",
" model_name = \"sklearn_model\", # this is the name the model is registered as\n",
" tags = {'area': \"diabetes\", 'type': \"regression\"},\n",
" description = \"Ridge regression model to predict diabetes\",\n",
" workspace = ws)\n",
"\n",
"print(model.name, model.description, model.version)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Create the Environment\n",
"Create an environment that the model will be deployed with"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"conda_deps = CondaDependencies.create(conda_packages=['numpy', 'scikit-learn==0.19.1', 'scipy'], pip_packages=['azureml-defaults', 'inference-schema'])\n",
"myenv = Environment(name='myenv')\n",
"myenv.python.conda_dependencies = conda_deps"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### Use a custom Docker image\n",
"\n",
"You can also specify a custom Docker image to be used as base image if you don't want to use the default base image provided by Azure ML. Please make sure the custom Docker image has Ubuntu >= 16.04, Conda >= 4.5.\\* and Python(3.5.\\* or 3.6.\\*).\n",
"\n",
"Only supported with `python` runtime.\n",
"```python\n",
"# use an image available in public Container Registry without authentication\n",
"myenv.docker.base_image = \"mcr.microsoft.com/azureml/o16n-sample-user-base/ubuntu-miniconda\"\n",
"\n",
"# or, use an image available in a private Container Registry\n",
"myenv.docker.base_image = \"myregistry.azurecr.io/mycustomimage:1.0\"\n",
"myenv.docker.base_image_registry.address = \"myregistry.azurecr.io\"\n",
"myenv.docker.base_image_registry.username = \"username\"\n",
"myenv.docker.base_image_registry.password = \"password\"\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Write the Entry Script\n",
"Write the script that will be used to predict on your model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import os\n",
"import pickle\n",
"import json\n",
"import numpy\n",
"from sklearn.externals import joblib\n",
"from sklearn.linear_model import Ridge\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.standard_py_parameter_type import StandardPythonParameterType\n",
"\n",
"def init():\n",
" global model\n",
" # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n",
" model = joblib.load(model_path)\n",
"\n",
"\n",
"standard_sample_input = {'a': 10, 'b': 9, 'c': 8, 'd': 7, 'e': 6, 'f': 5, 'g': 4, 'h': 3, 'i': 2, 'j': 1 }\n",
"standard_sample_output = {'outcome': 1}\n",
"\n",
"@input_schema('param', StandardPythonParameterType(standard_sample_input))\n",
"@output_schema(StandardPythonParameterType(standard_sample_output))\n",
"def run(param):\n",
" try:\n",
" raw_data = [param['a'], param['b'], param['c'], param['d'], param['e'], param['f'], param['g'], param['h'], param['i'], param['j']]\n",
" data = numpy.array([raw_data])\n",
" result = model.predict(data)\n",
" return { 'outcome' : result[0] }\n",
" except Exception as e:\n",
" error = str(e)\n",
" return error"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Create the InferenceConfig\n",
"Create the inference config that will be used when deploying the model"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.model import InferenceConfig\n",
"\n",
"inf_config = InferenceConfig(entry_script='score.py', environment=myenv)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Provision the AKS Cluster with SSL\n",
"This is a one time setup. You can reuse this cluster for multiple deployments after it has been created. If you delete the cluster or the resource group that contains it, then you would have to recreate it.\n",
"\n",
"See code snippet below. Check the documentation [here](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service) for more details"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Use the default configuration (can also provide parameters to customize)\n",
"\n",
"provisioning_config = AksCompute.provisioning_configuration()\n",
"# Leaf domain label generates a name using the formula\n",
"# \"<leaf-domain-label>######.<azure-region>.cloudapp.azure.net\"\n",
"# where \"######\" is a random series of characters\n",
"provisioning_config.enable_ssl(leaf_domain_label = \"contoso\")\n",
"\n",
"aks_name = 'my-aks-ssl-1' \n",
"# Create the cluster\n",
"aks_target = ComputeTarget.create(workspace = ws, \n",
" name = aks_name, \n",
" provisioning_configuration = provisioning_config)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"aks_target.wait_for_completion(show_output = True)\n",
"print(aks_target.provisioning_state)\n",
"print(aks_target.provisioning_errors)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Deploy web service to AKS"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"sample-deploy-to-aks"
]
},
"outputs": [],
"source": [
"%%time\n",
"\n",
"aks_config = AksWebservice.deploy_configuration()\n",
"\n",
"aks_service_name ='aks-service-ssl-1'\n",
"\n",
"aks_service = Model.deploy(workspace=ws,\n",
" name=aks_service_name,\n",
" models=[model],\n",
" inference_config=inf_config,\n",
" deployment_config=aks_config,\n",
" deployment_target=aks_target,\n",
" overwrite=True)\n",
"\n",
"aks_service.wait_for_deployment(show_output = True)\n",
"print(aks_service.state)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Test the web service using run method\n",
"We test the web sevice by passing data.\n",
"Run() method retrieves API keys behind the scenes to make sure that call is authenticated."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"import json\n",
"\n",
"standard_sample_input = json.dumps({'param': {'a': 10, 'b': 9, 'c': 8, 'd': 7, 'e': 6, 'f': 5, 'g': 4, 'h': 3, 'i': 2, 'j': 1 }})\n",
"\n",
"aks_service.run(input_data=standard_sample_input)"
]
},
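`run()` is a convenience wrapper around an authenticated HTTP call. A minimal sketch of the equivalent raw request, assuming key-based authentication is enabled on the service (the AKS default):

```python
import requests

# Fetch the primary key and post to the scoring endpoint directly,
# which is what run() does behind the scenes.
key = aks_service.get_keys()[0]
headers = {'Content-Type': 'application/json',
           'Authorization': 'Bearer ' + key}
response = requests.post(aks_service.scoring_uri,
                         data=standard_sample_input,
                         headers=headers)
print(response.status_code, response.json())
```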
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Clean up\n",
"Delete the service, image and model."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"aks_service.delete()\n",
"model.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "vaidyas"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.6"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -0,0 +1,8 @@
name: production-deploy-to-aks-ssl
dependencies:
- pip:
- azureml-sdk
- matplotlib
- tqdm
- scipy
- sklearn

View File

@@ -109,7 +109,7 @@
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies \n",
"\n",
"conda_deps = CondaDependencies.create(conda_packages=['numpy','scikit-learn'], pip_packages=['azureml-defaults'])\n",
"conda_deps = CondaDependencies.create(conda_packages=['numpy','scikit-learn==0.19.1','scipy'], pip_packages=['azureml-defaults'])\n",
"myenv = Environment(name='myenv')\n",
"myenv.python.conda_dependencies = conda_deps"
]
@@ -300,7 +300,8 @@
" 'inference-schema[numpy-support]',\n",
" 'joblib',\n",
" 'numpy',\n",
" 'scikit-learn'\n",
" 'scikit-learn==0.19.1',\n",
" 'scipy'\n",
"])\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"# if cpu and memory_in_gb parameters are not provided\n",

File diff suppressed because one or more lines are too long

View File

@@ -1,260 +0,0 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/deployment/tensorflow/tensorflow-model-register-and-deploy.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Register TensorFlow SavedModel and deploy as webservice\n",
"\n",
"Following this notebook, you will:\n",
"\n",
" - Learn how to register a TF SavedModel in your Azure Machine Learning Workspace.\n",
" - Deploy your model as a web service in an Azure Container Instance."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the [configuration notebook](../../../configuration.ipynb) to install the Azure Machine Learning Python SDK and create a workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"\n",
"# Check core SDK version number.\n",
"print('SDK version:', azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize workspace\n",
"\n",
"Create a [Workspace](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.workspace%28class%29?view=azure-ml-py) object from your persisted configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create workspace"
]
},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Download the Model\n",
"\n",
"Download and extract the model from https://amlsamplenotebooksdata.blob.core.windows.net/data/flowers_model.tar.gz to \"models\" directory"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import tarfile\n",
"import urllib.request\n",
"\n",
"# create directory for model\n",
"model_dir = 'models'\n",
"if not os.path.isdir(model_dir):\n",
" os.mkdir(model_dir)\n",
"\n",
"url=\"https://amlsamplenotebooksdata.blob.core.windows.net/data/flowers_model.tar.gz\"\n",
"response = urllib.request.urlretrieve(url, model_dir + \"/flowers_model.tar.gz\")\n",
"tar = tarfile.open(model_dir + \"/flowers_model.tar.gz\", \"r:gz\")\n",
"tar.extractall(model_dir)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register model\n",
"\n",
"Register a file or folder as a model by calling [Model.register()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#register-workspace--model-path--model-name--tags-none--properties-none--description-none--datasets-none--model-framework-none--model-framework-version-none--child-paths-none-). For this example, we have provided a TensorFlow SavedModel (`flowers_model` in the notebook's directory).\n",
"\n",
"In addition to the content of the model file itself, your registered model will also store model metadata -- model description, tags, and framework information -- that will be useful when managing and deploying models in your workspace. Using tags, for instance, you can categorize your models and apply filters when listing models in your workspace. Also, marking this model with the scikit-learn framework will simplify deploying it as a web service, as we'll see later."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"register model from file"
]
},
"outputs": [],
"source": [
"from azureml.core import Model\n",
"\n",
"model = Model.register(workspace=ws,\n",
" model_name='flowers', # Name of the registered model in your workspace.\n",
" model_path= model_dir + '/flowers_model', # Local Tensorflow SavedModel folder to upload and register as a model.\n",
" model_framework=Model.Framework.TENSORFLOW, # Framework used to create the model.\n",
" model_framework_version='1.14.0', # Version of Tensorflow used to create the model.\n",
" description='Flowers model')\n",
"\n",
"print('Name:', model.name)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deploy model\n",
"\n",
"Deploy your model as a web service using [Model.deploy()](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.core.model.model?view=azure-ml-py#deploy-workspace--name--models--inference-config--deployment-config-none--deployment-target-none-). Web services take one or more models, load them in an environment, and run them on one of several supported deployment targets.\n",
"\n",
"For this example, we will deploy your TensorFlow SavedModel to an Azure Container Instance (ACI)."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Use a default environment (for supported models)\n",
"\n",
"The Azure Machine Learning service provides a default environment for supported model frameworks, including TensorFlow, based on the metadata you provided when registering your model. This is the easiest way to deploy your model.\n",
"\n",
"**Note**: This step can take several minutes."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"service_name = 'tensorflow-flower-service'\n",
"\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"service = Model.deploy(ws, service_name, [model])\n",
"service.wait_for_deployment(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"After your model is deployed, perform a call to the web service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import requests\n",
"\n",
"headers = {'Content-Type': 'application/json'}\n",
"\n",
"if service.auth_enabled:\n",
" headers['Authorization'] = 'Bearer '+ service.get_keys()[0]\n",
"elif service.token_auth_enabled:\n",
" headers['Authorization'] = 'Bearer '+ service.get_token()[0]\n",
"\n",
"scoring_uri = service.scoring_uri # If you have a SavedModel with classify and regress, \n",
" # you can change the scoring_uri from 'uri:predict' to 'uri:classify' or 'uri:regress'.\n",
"print(scoring_uri)\n",
"\n",
"with open('tensorflow-flower-predict-input.json', 'rb') as data_file:\n",
" response = requests.post(\n",
" scoring_uri, data=data_file, headers=headers)\n",
"print(response.status_code)\n",
"print(response.elapsed)\n",
"print(response.json())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When you are finished testing your service, clean up the deployment."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"service.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "vaidyas"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.0"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,4 +0,0 @@
name: tensorflow-model-register-and-deploy
dependencies:
- pip:
- azureml-sdk

View File

@@ -58,7 +58,7 @@
"\n",
"Problem: Boston Housing Price Prediction with scikit-learn (train a model and run an explainer remotely via AMLCompute, and download and visualize the remotely-calculated explanations.)\n",
"\n",
"| ![explanations-run-history](./img/explanations-run-history.PNG) |\n",
"| ![explanations-run-history](./img/explanations-run-history.png) |\n",
"|:--:|\n"
]
},
@@ -261,6 +261,10 @@
"if pandas_ver:\n",
" pandas_dep = 'pandas=={}'.format(pandas_ver)\n",
"# specify CondaDependencies obj\n",
"# The CondaDependencies specifies the conda and pip packages that are installed in the environment\n",
"# the submitted job is run in. Note the remote environment(s) needs to be similar to the local\n",
"# environment, otherwise if a model is trained or deployed in a different environment this can\n",
"# cause errors. Please take extra care when specifying your dependencies in a production environment.\n",
"run_config.environment.python.conda_dependencies = CondaDependencies.create(conda_packages=[sklearn_dep, pandas_dep],\n",
" pip_packages=azureml_pip_packages)\n",
"\n",
@@ -379,6 +383,10 @@
"if pandas_ver:\n",
" pandas_dep = 'pandas=={}'.format(pandas_ver)\n",
"# specify CondaDependencies obj\n",
"# The CondaDependencies specifies the conda and pip packages that are installed in the environment\n",
"# the submitted job is run in. Note the remote environment(s) needs to be similar to the local\n",
"# environment, otherwise if a model is trained or deployed in a different environment this can\n",
"# cause errors. Please take extra care when specifying your dependencies in a production environment.\n",
"run_config.environment.python.conda_dependencies = CondaDependencies.create(conda_packages=[sklearn_dep, pandas_dep],\n",
" pip_packages=azureml_pip_packages)\n",
"\n",
@@ -509,6 +517,10 @@
"if pandas_ver:\n",
" pandas_dep = 'pandas=={}'.format(pandas_ver)\n",
"# specify CondaDependencies obj\n",
"# The CondaDependencies specifies the conda and pip packages that are installed in the environment\n",
"# the submitted job is run in. Note the remote environment(s) needs to be similar to the local\n",
"# environment, otherwise if a model is trained or deployed in a different environment this can\n",
"# cause errors. Please take extra care when specifying your dependencies in a production environment.\n",
"run_config.environment.python.conda_dependencies = CondaDependencies.create(conda_packages=[sklearn_dep, pandas_dep],\n",
" pip_packages=azureml_pip_packages)\n",
"\n",
@@ -672,7 +684,7 @@
"source": [
"# retrieve model for visualization and deployment\n",
"from azureml.core.model import Model\n",
"from sklearn.externals import joblib\n",
"import joblib\n",
"original_model = Model(ws, 'model_explain_model_on_amlcomp')\n",
"model_path = original_model.download(exist_ok=True)\n",
"original_model = joblib.load(model_path)"
@@ -692,7 +704,7 @@
"outputs": [],
"source": [
"# retrieve x_test for visualization\n",
"from sklearn.externals import joblib\n",
"import joblib\n",
"x_test_path = './x_test_boston_housing.pkl'\n",
"run.download_file('x_test_boston_housing.pkl', output_file_path=x_test_path)"
]

View File

@@ -7,7 +7,7 @@ from interpret.ext.blackbox import TabularExplainer
from azureml.contrib.interpret.explanation.explanation_client import ExplanationClient
from sklearn.model_selection import train_test_split
from azureml.core.run import Run
from sklearn.externals import joblib
import joblib
import os
import numpy as np

View File

@@ -3,7 +3,7 @@ import numpy as np
import pandas as pd
import os
import pickle
from sklearn.externals import joblib
import joblib
from sklearn.linear_model import LogisticRegression
from azureml.core.model import Model

View File

@@ -3,7 +3,7 @@ import numpy as np
import pandas as pd
import os
import pickle
from sklearn.externals import joblib
import joblib
from sklearn.linear_model import LogisticRegression
from azureml.core.model import Model

View File

@@ -165,7 +165,7 @@
"outputs": [],
"source": [
"from sklearn.model_selection import train_test_split\n",
"from sklearn.externals import joblib\n",
"import joblib\n",
"from sklearn.preprocessing import StandardScaler, OneHotEncoder\n",
"from sklearn.impute import SimpleImputer\n",
"from sklearn.pipeline import Pipeline\n",
@@ -346,6 +346,10 @@
"if pandas_ver:\n",
" pandas_dep = 'pandas=={}'.format(pandas_ver)\n",
"# specify CondaDependencies obj\n",
"# The CondaDependencies specifies the conda and pip packages that are installed in the environment\n",
"# the submitted job is run in. Note the remote environment(s) needs to be similar to the local\n",
"# environment, otherwise if a model is trained or deployed in a different environment this can\n",
"# cause errors. Please take extra care when specifying your dependencies in a production environment.\n",
"myenv = CondaDependencies.create(conda_packages=[sklearn_dep, pandas_dep],\n",
" pip_packages=['sklearn-pandas', 'pyyaml'] + azureml_pip_packages,\n",
" pin_sdk_version=False)\n",
@@ -413,9 +417,9 @@
"headers = {'Content-Type':'application/json'}\n",
"\n",
"# send request to service\n",
"print(\"POST to url\", service.scoring_uri)\n",
"resp = requests.post(service.scoring_uri, sample_data, headers=headers)\n",
"\n",
"print(\"POST to url\", service.scoring_uri)\n",
"# can covert back to Python objects from json string if desired\n",
"print(\"prediction:\", resp.text)\n",
"result = json.loads(resp.text)"

View File

@@ -63,7 +63,7 @@
"7.\tCreate an image and register it in the image registry.\n",
"8.\tDeploy the image as a web service in Azure.\n",
"\n",
"| ![azure-machine-learning-cycle](./img/azure-machine-learning-cycle.PNG) |\n",
"| ![azure-machine-learning-cycle](./img/azure-machine-learning-cycle.png) |\n",
"|:--:|"
]
},
@@ -264,6 +264,10 @@
"if pandas_ver:\n",
" pandas_dep = 'pandas=={}'.format(pandas_ver)\n",
"# specify CondaDependencies obj\n",
"# The CondaDependencies specifies the conda and pip packages that are installed in the environment\n",
"# the submitted job is run in. Note the remote environment(s) needs to be similar to the local\n",
"# environment, otherwise if a model is trained or deployed in a different environment this can\n",
"# cause errors. Please take extra care when specifying your dependencies in a production environment.\n",
"run_config.environment.python.conda_dependencies = CondaDependencies.create(conda_packages=[sklearn_dep, pandas_dep],\n",
" pip_packages=['sklearn_pandas', 'pyyaml'] + azureml_pip_packages,\n",
" pin_sdk_version=False)\n",
@@ -325,7 +329,7 @@
"source": [
"# retrieve model for visualization and deployment\n",
"from azureml.core.model import Model\n",
"from sklearn.externals import joblib\n",
"import joblib\n",
"original_model = Model(ws, 'amlcompute_deploy_model')\n",
"model_path = original_model.download(exist_ok=True)\n",
"original_svm_model = joblib.load(model_path)"
@@ -352,7 +356,7 @@
"outputs": [],
"source": [
"# retrieve x_test for visualization\n",
"from sklearn.externals import joblib\n",
"import joblib\n",
"x_test_path = './x_test.pkl'\n",
"run.download_file('x_test_ibm.pkl', output_file_path=x_test_path)\n",
"x_test = joblib.load(x_test_path)"
@@ -432,6 +436,10 @@
"if pandas_ver:\n",
" pandas_dep = 'pandas=={}'.format(pandas_ver)\n",
"# specify CondaDependencies obj\n",
"# The CondaDependencies specifies the conda and pip packages that are installed in the environment\n",
"# the submitted job is run in. Note the remote environment(s) needs to be similar to the local\n",
"# environment, otherwise if a model is trained or deployed in a different environment this can\n",
"# cause errors. Please take extra care when specifying your dependencies in a production environment.\n",
"myenv = CondaDependencies.create(conda_packages=[sklearn_dep, pandas_dep],\n",
" pip_packages=['sklearn-pandas', 'pyyaml'] + azureml_pip_packages,\n",
" pin_sdk_version=False)\n",
@@ -495,9 +503,9 @@
"headers = {'Content-Type':'application/json'}\n",
"\n",
"# send request to service\n",
"print(\"POST to url\", service.scoring_uri)\n",
"resp = requests.post(service.scoring_uri, input_data, headers=headers)\n",
"\n",
"print(\"POST to url\", service.scoring_uri)\n",
"# can covert back to Python objects from json string if desired\n",
"print(\"prediction:\", resp.text)"
]

View File

@@ -6,7 +6,7 @@ import os
import pandas as pd
import zipfile
from sklearn.model_selection import train_test_split
from sklearn.externals import joblib
import joblib
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline

View File

@@ -252,7 +252,7 @@
"source": [
"binaries_folder = \"azurebatch/job_binaries\"\n",
"if not os.path.isdir(binaries_folder):\n",
" os.mkdir(binaries_folder)\n",
" os.makedirs(binaries_folder)\n",
"\n",
"file_name=\"azurebatch.cmd\"\n",
"with open(path.join(binaries_folder, file_name), 'w') as f:\n",

View File

@@ -0,0 +1,510 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/aml-pipelines-with-data-dependency-steps.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Showcasing Dataset and PipelineParameter\n",
"\n",
"This notebook demonstrates how a [**FileDataset**](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.filedataset?view=azure-ml-py) or [**TabularDataset**](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py) can be parametrized with [**PipelineParameters**](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelineparameter?view=azure-ml-py) in an AML [Pipeline](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline(class)?view=azure-ml-py). By parametrizing datasets, you can dynamically run pipeline experiments with different datasets without any code change.\n",
"\n",
"A common use case is building a training pipeline with a sample of your training data for quick iterative development. When you're ready to test and deploy your pipeline at scale, you can pass in your full training dataset to the pipeline experiment without making any changes to your training script. \n",
" \n",
"To see more about how parameters work between steps, please refer [aml-pipelines-with-data-dependency-steps](https://aka.ms/pl-data-dep).\n",
"\n",
"* [How to create a Pipeline with a Dataset PipelineParameter](#index1)\n",
"* [How to submit a Pipeline with a Dataset PipelineParameter](#index2)\n",
"* [How to submit a Pipeline and change the Dataset PipelineParameter value from the sdk](#index3)\n",
"* [How to submit a Pipeline and change the Dataset PipelineParameter value using a REST call](#index4)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Azure Machine Learning and Pipeline SDK-specific imports"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import azureml.core\n",
"from azureml.core import Workspace, Experiment, Dataset\n",
"from azureml.core.compute import ComputeTarget, AmlCompute\n",
"from azureml.data.dataset_consumption_config import DatasetConsumptionConfig\n",
"from azureml.widgets import RunDetails\n",
"\n",
"from azureml.pipeline.core import PipelineParameter\n",
"from azureml.pipeline.core import Pipeline, PipelineRun\n",
"from azureml.pipeline.steps import PythonScriptStep\n",
"\n",
"# Check core SDK version number\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object from persisted configuration. If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure the config file is present at .\\config.json\n",
"\n",
"If you don't have a config.json file, go through the [configuration Notebook](https://aka.ms/pl-config) first.\n",
"\n",
"This sets you up with a working config file that has information on your workspace, subscription id, etc."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create an Azure ML experiment\n",
"\n",
"Let's create an experiment named \"showcasing-dataset\" and a folder to hold the training scripts. The script runs will be recorded under the experiment in Azure."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Choose a name for the run history container in the workspace.\n",
"experiment_name = 'showcasing-dataset'\n",
"source_directory = '.'\n",
"\n",
"experiment = Experiment(ws, experiment_name)\n",
"experiment"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create or Attach an AmlCompute cluster\n",
"You will need to create a [compute target](https://docs.microsoft.com/azure/machine-learning/service/concept-azure-machine-learning-architecture#compute-target) for your AutoML run. In this tutorial, you get the default `AmlCompute` as your training compute resource."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Choose a name for your cluster.\n",
"amlcompute_cluster_name = \"cpu-cluster\"\n",
"\n",
"found = False\n",
"# Check if this compute target already exists in the workspace.\n",
"cts = ws.compute_targets\n",
"if amlcompute_cluster_name in cts and cts[amlcompute_cluster_name].type == 'AmlCompute':\n",
" found = True\n",
" print('Found existing compute target.')\n",
" compute_target = cts[amlcompute_cluster_name]\n",
" \n",
"if not found:\n",
" print('Creating a new compute target...')\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_V2\", # for GPU, use \"STANDARD_NC6\"\n",
" #vm_priority = 'lowpriority', # optional\n",
" max_nodes = 4)\n",
"\n",
" # Create the cluster.\n",
" compute_target = ComputeTarget.create(ws, amlcompute_cluster_name, provisioning_config)\n",
" \n",
" # Can poll for a minimum number of nodes and for a specific timeout.\n",
" # If no min_node_count is provided, it will use the scale settings for the cluster.\n",
" compute_target.wait_for_completion(show_output = True, timeout_in_minutes = 10)\n",
" \n",
" # For a more detailed view of current AmlCompute status, use get_status()."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Dataset Configuration\n",
"\n",
"The following steps detail how to create a [FileDataset](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.filedataset?view=azure-ml-py) and [TabularDataset](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py) from an external CSV file, and configure them to be used by a [Pipeline](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline(class)?view=azure-ml-py):\n",
"\n",
"1. Create a dataset from a csv file\n",
"2. Create a [PipelineParameter](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelineparameter?view=azure-ml-py) object and set the `default_value` to the dataset. [PipelineParameter](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelineparameter?view=azure-ml-py) objects enabled arguments to be passed into Pipelines when they are resubmitted after creation. The `name` is referenced later on when we submit additional pipeline runs with different input datasets. \n",
"3. Create a [DatasetConsumptionConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.dataset_consumption_config.datasetconsumptionconfig?view=azure-ml-py) object from the [PiepelineParameter](https://docs.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipelineparameter?view=azure-ml-py). The [DatasetConsumptionConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.dataset_consumption_config.datasetconsumptionconfig?view=azure-ml-py) object specifies how the dataset should be used by the remote compute where the pipeline is run. **NOTE** only [DatasetConsumptionConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.dataset_consumption_config.datasetconsumptionconfig?view=azure-ml-py) objects built on [FileDataset](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.filedataset?view=azure-ml-py) can be set `as_mount()` or `as_download()` on the remote compute."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"datapath-remarks-sample"
]
},
"outputs": [],
"source": [
"file_dataset = Dataset.File.from_files('https://dprepdata.blob.core.windows.net/demo/Titanic.csv')\n",
"file_pipeline_param = PipelineParameter(name=\"file_ds_param\", default_value=file_dataset)\n",
"file_ds_consumption = DatasetConsumptionConfig(\"file_dataset\", file_pipeline_param).as_mount()\n",
"\n",
"tabular_dataset = Dataset.Tabular.from_delimited_files('https://dprepdata.blob.core.windows.net/demo/Titanic.csv')\n",
"tabular_pipeline_param = PipelineParameter(name=\"tabular_ds_param\", default_value=tabular_dataset)\n",
"tabular_ds_consumption = DatasetConsumptionConfig(\"tabular_dataset\", tabular_pipeline_param)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We will setup a training script to ingest our passed-in datasets and print their contents. **NOTE** the names of the datasets referenced inside the training script correspond to the `name` of their respective [DatasetConsumptionConfig](https://docs.microsoft.com/en-us/python/api/azureml-core/azureml.data.dataset_consumption_config.datasetconsumptionconfig?view=azure-ml-py) objects we defined above."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile train_with_dataset.py\n",
"from azureml.core import Run\n",
"\n",
"input_file_ds_path = Run.get_context().input_datasets['file_dataset']\n",
"with open(input_file_ds_path, 'r') as f:\n",
" content = f.read()\n",
" print(content)\n",
"\n",
"input_tabular_ds = Run.get_context().input_datasets['tabular_dataset']\n",
"tabular_df = input_tabular_ds.to_pandas_dataframe()\n",
"print(tabular_df)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id='index1'></a>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create a Pipeline with a Dataset PipelineParameter\n",
"\n",
"Note that the ```file_ds_consumption``` and ```tabular_ds_consumption``` are specified as both arguments and inputs to create a step."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"train_step = PythonScriptStep(\n",
" name=\"train_step\",\n",
" script_name=\"train_with_dataset.py\",\n",
" arguments=[\"--param1\", file_ds_consumption, \"--param2\", tabular_ds_consumption],\n",
" inputs=[file_ds_consumption, tabular_ds_consumption],\n",
" compute_target=compute_target,\n",
" source_directory=source_directory)\n",
"\n",
"print(\"train_step created\")\n",
"\n",
"pipeline = Pipeline(workspace=ws, steps=[train_step])\n",
"print(\"pipeline with the train_step created\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id='index2'></a>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Submit a Pipeline with a Dataset PipelineParameter\n",
"\n",
"Pipelines can be submitted with default values of PipelineParameters by not specifying any parameters."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Pipeline will run with default file_ds and tabular_ds\n",
"pipeline_run = experiment.submit(pipeline)\n",
"print(\"Pipeline is submitted for execution\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"RunDetails(pipeline_run).show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run.wait_for_completion()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id='index3'></a>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Submit a Pipeline with a different Dataset PipelineParameter value from the SDK\n",
"\n",
"The training pipeline can be reused with different input datasets by passing them in as PipelineParameters"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"iris_file_ds = Dataset.File.from_files('https://raw.githubusercontent.com/Azure/MachineLearningNotebooks/'\n",
" '4e7b3784d50e81c313c62bcdf9a330194153d9cd/how-to-use-azureml/work-with-data/'\n",
" 'datasets-tutorial/train-with-datasets/train-dataset/iris.csv')\n",
"\n",
"iris_tabular_ds = Dataset.Tabular.from_delimited_files('https://raw.githubusercontent.com/Azure/MachineLearningNotebooks/'\n",
" '4e7b3784d50e81c313c62bcdf9a330194153d9cd/how-to-use-azureml/work-with-data/'\n",
" 'datasets-tutorial/train-with-datasets/train-dataset/iris.csv')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run_with_params = experiment.submit(pipeline, pipeline_parameters={'file_ds_param': iris_file_ds, 'tabular_ds_param': iris_tabular_ds}) "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"RunDetails(pipeline_run_with_params).show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run_with_params.wait_for_completion()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id='index4'></a>"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Dynamically Set the Dataset PipelineParameter Values using a REST Call\n",
"\n",
"Let's publish the pipeline we created previously, so we can generate a pipeline endpoint. We can then submit the iris datasets to the pipeline REST endpoint by passing in their IDs. "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline = pipeline.publish(name=\"Dataset_Pipeline\", description=\"Pipeline to test Dataset PipelineParameter\", continue_on_step_failure=True)\n",
"published_pipeline"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline.submit(ws, experiment_name=\"publishedexperiment\", pipeline_parameters={'file_ds_param': iris_file_ds, 'tabular_ds_param': iris_tabular_ds})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"import requests\n",
"\n",
"auth = InteractiveLoginAuthentication()\n",
"aad_token = auth.get_authentication_header()\n",
"\n",
"rest_endpoint = published_pipeline.endpoint\n",
"\n",
"print(\"You can perform HTTP POST on URL {} to trigger this pipeline\".format(rest_endpoint))"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# specify the param when running the pipeline\n",
"response = requests.post(rest_endpoint, \n",
" headers=aad_token, \n",
" json={\"ExperimentName\": \"MyRestPipeline\",\n",
" \"RunSource\": \"SDK\",\n",
" \"DataSetDefinitionValueAssignments\": {\"file_ds_param\": {\"SavedDataSetReference\": {\"Id\": iris_file_ds.id}},\n",
" \"tabular_ds_param\": {\"SavedDataSetReference\": {\"Id\": iris_tabular_ds.id}}}\n",
" }\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_via_rest = PipelineRun(ws.experiments[\"MyRestPipeline\"], run_id)\n",
"RunDetails(published_pipeline_run_via_rest).show()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_via_rest.wait_for_completion()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id='index5'></a>"
]
}
],
"metadata": {
"authors": [
{
"name": "rafarmah"
}
],
"category": "tutorial",
"compute": [
"AML Compute"
],
"datasets": [
"Custom"
],
"deployment": [
"None"
],
"exclude_from_index": false,
"framework": [
"Azure ML"
],
"friendly_name": "How to use Dataset as a PipelineParameter",
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
},
"order_index": 13,
"star_tag": [
"featured"
],
"tags": [
"None"
],
"task": "Demonstrates the use of Dataset as a PipelineParameter"
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -0,0 +1,5 @@
name: aml-pipelines-showcasing-dataset-and-pipelineparameter
dependencies:
- pip:
- azureml-sdk
- azureml-widgets

View File

@@ -510,7 +510,7 @@
" inputs=[step_1_input],\n",
" num_workers=1,\n",
" python_script_path=python_script_path,\n",
" python_script_params={'arg1', pipeline_param, 'arg2},\n",
" python_script_params={'arg1', pipeline_param, 'arg2'},\n",
" run_name='DB_Python_demo',\n",
" compute_target=databricks_compute,\n",
" allow_reuse=True\n",

View File

@@ -279,8 +279,7 @@
"# Specify CondaDependencies obj, add necessary packages\n",
"aml_run_config.environment.python.conda_dependencies = CondaDependencies.create(\n",
" conda_packages=['pandas','scikit-learn'], \n",
" pip_packages=['azureml-sdk[automl,explain]', 'pyarrow'], \n",
" pin_sdk_version=False)\n",
" pip_packages=['azureml-sdk[automl,explain]', 'pyarrow'])\n",
"\n",
"print (\"Run configuration created.\")"
]
@@ -692,7 +691,6 @@
" debug_log = 'automated_ml_errors.log',\n",
" path = train_model_folder,\n",
" compute_target = aml_compute,\n",
" run_configuration = aml_run_config,\n",
" featurization = 'auto',\n",
" training_data = training_dataset,\n",
" label_column_name = 'cost',\n",
@@ -718,7 +716,6 @@
"\n",
"trainWithAutomlStep = AutoMLStep(name='AutoML_Regression',\n",
" automl_config=automl_config,\n",
" passthru_automl_config=False,\n",
" allow_reuse=True)\n",
"print(\"trainWithAutomlStep created.\")"
]

View File

@@ -13,7 +13,7 @@ def init():
global g_tf_sess
# pull down model from workspace
model_path = Model.get_model_path("mnist")
model_path = Model.get_model_path("mnist-prs")
# contruct graph to execute
tf.reset_default_graph()

View File

@@ -2,18 +2,16 @@
Azure Machine Learning Batch Inference targets large inference jobs that are not time-sensitive. Batch Inference provides cost-effective inference compute scaling, with unparalleled throughput for asynchronous applications. It is optimized for high-throughput, fire-and-forget inference over large collections of data.
# Getting Started with Batch Inference Public Preview
# Getting Started with Batch Inference
Batch inference public preview offers a platform in which to do large inference or generic parallel map-style operations. Below introduces the major steps to use this new functionality. For a quick try, please follow the prerequisites and simply run the sample notebooks provided in this directory.
Batch inference offers a platform in which to do large inference or generic parallel map-style operations. Below introduces the major steps to use this new functionality. For a quick try, please follow the prerequisites and simply run the sample notebooks provided in this directory.
## Prerequisites
### Python package installation
Following the convention of most AzureML Public Preview features, Batch Inference SDK is currently available as a contrib package.
If you're unfamiliar with creating a new Python environment, you may follow this example for [creating a conda environment](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-configure-environment#local). Batch Inference package can be installed through the following pip command.
```
pip install azureml-contrib-pipeline-steps
pip install azureml-pipeline-steps
```
### Creation of Azure Machine Learning Workspace
@@ -66,9 +64,8 @@ base_image_registry.password = "password"
## Create a batch inference job
**ParallelRunStep** is a newly added step in the azureml.contrib.pipeline.steps package. You will use it to add a step to create a batch inference job with your Azure machine learning pipeline. (Use batch inference without an Azure machine learning pipeline is not supported yet). ParallelRunStep has all the following parameters:
**ParallelRunStep** is a newly added step in the azureml.pipeline.steps package. You will use it to add a step to create a batch inference job with your Azure machine learning pipeline. (Use batch inference without an Azure machine learning pipeline is not supported yet). ParallelRunStep has all the following parameters:
- **name**: this name will be used to register batch inference service, has the following naming restrictions: (unique, 3-32 chars and regex ^\[a-z\]([-a-z0-9]*[a-z0-9])?$)
- **models**: zero or more model names already registered in Azure Machine Learning model registry.
- **parallel_run_config**: ParallelRunConfig as defined above.
- **inputs**: one or more Dataset objects.
- **output**: this should be a PipelineData object encapsulating an Azure BLOB container path.
@@ -123,6 +120,6 @@ pipeline_run.wait_for_completion(show_output=True)
- [file-dataset-image-inference-mnist.ipynb](./file-dataset-image-inference-mnist.ipynb) demonstrates how to run batch inference on an MNIST dataset using FileDataset.
- [tabular-dataset-inference-iris.ipynb](./tabular-dataset-inference-iris.ipynb) demonstrates how to run batch inference on an IRIS dataset using TabularDataset.
- [pipeline-style-transfer.ipynb](../pipeline-style-transfer/pipeline-style-transfer.ipynb) demonstrates using ParallelRunStep in multi-step pipeline and using output from one step as input to ParallelRunStep.
- [pipeline-style-transfer.ipynb](../pipeline-style-transfer/pipeline-style-transfer-parallel-run.ipynb) demonstrates using ParallelRunStep in multi-step pipeline and using output from one step as input to ParallelRunStep.
![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/parallel-run/README.png)

View File

@@ -23,11 +23,6 @@
"\n",
"In this notebook, we will demonstrate how to make predictions on large quantities of data asynchronously using the ML pipelines with Azure Machine Learning. Batch inference (or batch scoring) provides cost-effective inference, with unparalleled throughput for asynchronous applications. Batch prediction pipelines can scale to perform inference on terabytes of production data. Batch prediction is optimized for high throughput, fire-and-forget predictions for a large collection of data.\n",
"\n",
"> **Note**\n",
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook. Pandas is used to display job results.\n",
"```\n",
"pip install azureml-contrib-pipeline-steps pandas\n",
"```\n",
"> **Tip**\n",
"If your system requires low-latency processing (to process a single document or small set of documents quickly), use [real-time scoring](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-consume-web-service) instead of batch prediction.\n",
"\n",
@@ -86,7 +81,6 @@
"source": [
"import os\n",
"from azureml.core.compute import AmlCompute, ComputeTarget\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# choose a name for your cluster\n",
"compute_name = os.environ.get(\"AML_COMPUTE_CLUSTER_NAME\", \"cpu-cluster\")\n",
@@ -184,9 +178,20 @@
"mnist_ds_name = 'mnist_sample_data'\n",
"\n",
"path_on_datastore = mnist_data.path('mnist')\n",
"input_mnist_ds = Dataset.File.from_files(path=path_on_datastore, validate=False)\n",
"registered_mnist_ds = input_mnist_ds.register(ws, mnist_ds_name, create_new_version=True)\n",
"named_mnist_ds = registered_mnist_ds.as_named_input(mnist_ds_name)"
"input_mnist_ds = Dataset.File.from_files(path=path_on_datastore, validate=False)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.data.dataset_consumption_config import DatasetConsumptionConfig\n",
"from azureml.pipeline.core import PipelineParameter\n",
"\n",
"pipeline_param = PipelineParameter(name=\"mnist_param\", default_value=input_mnist_ds)\n",
"input_mnist_ds_consumption = DatasetConsumptionConfig(\"minist_param_config\", pipeline_param).as_mount()"
]
},
{
@@ -269,7 +274,7 @@
"\n",
"# register downloaded model \n",
"model = Model.register(model_path = \"models/\",\n",
" model_name = \"mnist\", # this is the name the model is registered as\n",
" model_name = \"mnist-prs\", # this is the name the model is registered as\n",
" tags = {'pretrained': \"mnist\"},\n",
" description = \"Mnist trained tensorflow model\",\n",
" workspace = ws)"
@@ -306,8 +311,6 @@
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"\n",
"scripts_folder = \"Code\"\n",
"script_file = \"digit_identification.py\"\n",
"\n",
@@ -341,8 +344,8 @@
"from azureml.core import Environment\n",
"from azureml.core.runconfig import CondaDependencies, DEFAULT_CPU_IMAGE\n",
"\n",
"batch_conda_deps = CondaDependencies.create(pip_packages=[\"tensorflow==1.15.2\", \"pillow\"])\n",
"\n",
"batch_conda_deps = CondaDependencies.create(pip_packages=[\"tensorflow==1.15.2\", \"pillow\", \n",
" \"azureml-core\", \"azureml-dataprep[fuse]\"])\n",
"batch_env = Environment(name=\"batch_environment\")\n",
"batch_env.python.conda_dependencies = batch_conda_deps\n",
"batch_env.docker.enabled = True\n",
@@ -362,17 +365,21 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.pipeline.steps import ParallelRunStep, ParallelRunConfig\n",
"from azureml.pipeline.core import PipelineParameter\n",
"from azureml.pipeline.steps import ParallelRunStep, ParallelRunConfig\n",
"\n",
"parallel_run_config = ParallelRunConfig(\n",
" source_directory=scripts_folder,\n",
" entry_script=script_file,\n",
" mini_batch_size=\"5\",\n",
" mini_batch_size=PipelineParameter(name=\"batch_size_param\", default_value=\"5\"),\n",
" error_threshold=10,\n",
" output_action=\"append_row\",\n",
" append_row_file_name=\"mnist_outputs.txt\",\n",
" environment=batch_env,\n",
" compute_target=compute_target,\n",
" node_count=2)"
" process_count_per_node=PipelineParameter(name=\"process_count_param\", default_value=2),\n",
" node_count=2\n",
")"
]
},
{
@@ -392,10 +399,8 @@
"parallelrun_step = ParallelRunStep(\n",
" name=\"predict-digits-mnist\",\n",
" parallel_run_config=parallel_run_config,\n",
" inputs=[ named_mnist_ds ],\n",
" inputs=[ input_mnist_ds_consumption ],\n",
" output=output_dir,\n",
" models=[ model ],\n",
" arguments=[ ],\n",
" allow_reuse=True\n",
")"
]
@@ -454,6 +459,46 @@
"pipeline_run.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Resubmit a with different dataset\n",
"Since we made the input a `PipelineParameter`, we can resubmit with a different dataset without having to create an entirely new experiment. We'll use the same datastore but use only a single image."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"path_on_datastore = mnist_data.path('mnist/0.png')\n",
"single_image_ds = Dataset.File.from_files(path=path_on_datastore, validate=False)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run_2 = experiment.submit(pipeline, \n",
" pipeline_parameters={\"mnist_param\": single_image_ds, \n",
" \"batch_size_param\": \"1\",\n",
" \"process_count_param\": 1}\n",
")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run_2.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
@@ -480,7 +525,7 @@
"\n",
"for root, dirs, files in os.walk(\"mnist_results\"):\n",
" for file in files:\n",
" if file.endswith('parallel_run_step.txt'):\n",
" if file.endswith('mnist_outputs.txt'):\n",
" result_file = os.path.join(root,file)\n",
"\n",
"df = pd.read_csv(result_file, delimiter=\":\", header=None)\n",

View File

@@ -2,6 +2,6 @@ name: file-dataset-image-inference-mnist
dependencies:
- pip:
- azureml-sdk
- azureml-contrib-pipeline-steps
- azureml-pipeline-steps
- azureml-widgets
- pandas

View File

@@ -23,11 +23,6 @@
"\n",
"In this notebook, we will demonstrate how to make predictions on large quantities of data asynchronously using the ML pipelines with Azure Machine Learning. Batch inference (or batch scoring) provides cost-effective inference, with unparalleled throughput for asynchronous applications. Batch prediction pipelines can scale to perform inference on terabytes of production data. Batch prediction is optimized for high throughput, fire-and-forget predictions for a large collection of data.\n",
"\n",
"> **Note**\n",
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook. Pandas is used to display job results.\n",
"```\n",
"pip install azureml-contrib-pipeline-steps pandas\n",
"```\n",
"> **Tip**\n",
"If your system requires low-latency processing (to process a single document or small set of documents quickly), use [real-time scoring](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-consume-web-service) instead of batch prediction.\n",
"\n",
@@ -84,7 +79,6 @@
"source": [
"import os\n",
"from azureml.core.compute import AmlCompute, ComputeTarget\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# choose a name for your cluster\n",
"compute_name = os.environ.get(\"AML_COMPUTE_CLUSTER_NAME\", \"cpu-cluster\")\n",
@@ -233,7 +227,7 @@
"\n",
"# register downloaded model\n",
"model = Model.register(model_path = \"iris_model.pkl/iris_model.pkl\",\n",
" model_name = \"iris\", # this is the name the model is registered as\n",
" model_name = \"iris-prs\", # this is the name the model is registered as\n",
" tags = {'pretrained': \"iris\"},\n",
" workspace = ws)"
]
@@ -304,7 +298,8 @@
"from azureml.core import Environment\n",
"from azureml.core.runconfig import CondaDependencies\n",
"\n",
"predict_conda_deps = CondaDependencies.create(pip_packages=[ \"scikit-learn==0.20.3\" ])\n",
"predict_conda_deps = CondaDependencies.create(pip_packages=[\"scikit-learn==0.20.3\",\n",
" \"azureml-core\", \"azureml-dataprep[pandas,fuse]\"])\n",
"\n",
"predict_env = Environment(name=\"predict_environment\")\n",
"predict_env.python.conda_dependencies = predict_conda_deps\n",
@@ -325,7 +320,7 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.pipeline.steps import ParallelRunStep, ParallelRunConfig\n",
"from azureml.pipeline.steps import ParallelRunStep, ParallelRunConfig\n",
"\n",
"# In a real-world scenario, you'll want to shape your process per node and nodes to fit your problem domain.\n",
"parallel_run_config = ParallelRunConfig(\n",
@@ -334,10 +329,12 @@
" mini_batch_size='5MB',\n",
" error_threshold=5,\n",
" output_action='append_row',\n",
" append_row_file_name=\"iris_outputs.txt\",\n",
" environment=predict_env,\n",
" compute_target=compute_target, \n",
" node_count=3,\n",
" run_invocation_timeout=600)"
" node_count=2,\n",
" run_invocation_timeout=600\n",
")"
]
},
{
@@ -359,8 +356,7 @@
" inputs=[named_iris_ds],\n",
" output=output_folder,\n",
" parallel_run_config=parallel_run_config,\n",
" models=[model],\n",
" arguments=['--model_name', 'iris'],\n",
" arguments=['--model_name', 'iris-prs'],\n",
" allow_reuse=True\n",
")"
]
@@ -384,7 +380,7 @@
"\n",
"pipeline = Pipeline(workspace=ws, steps=[distributed_csv_iris_step])\n",
"\n",
"pipeline_run = Experiment(ws, 'iris').submit(pipeline)"
"pipeline_run = Experiment(ws, 'iris-prs').submit(pipeline)"
]
},
{
@@ -453,7 +449,7 @@
"\n",
"for root, dirs, files in os.walk(\"iris_results\"):\n",
" for file in files:\n",
" if file.endswith('parallel_run_step.txt'):\n",
" if file.endswith('iris_outputs.txt'):\n",
" result_file = os.path.join(root,file)\n",
"\n",
"# cleanup output format\n",

View File

@@ -2,6 +2,6 @@ name: tabular-dataset-inference-iris
dependencies:
- pip:
- azureml-sdk
- azureml-contrib-pipeline-steps
- azureml-pipeline-steps
- azureml-widgets
- pandas

View File

@@ -0,0 +1,185 @@
# Original source: https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/neural_style.py
import argparse
import os
import sys
import re
from PIL import Image
import torch
from torchvision import transforms
def load_image(filename, size=None, scale=None):
img = Image.open(filename)
if size is not None:
img = img.resize((size, size), Image.ANTIALIAS)
elif scale is not None:
img = img.resize((int(img.size[0] / scale), int(img.size[1] / scale)), Image.ANTIALIAS)
return img
def save_image(filename, data):
img = data.clone().clamp(0, 255).numpy()
img = img.transpose(1, 2, 0).astype("uint8")
img = Image.fromarray(img)
img.save(filename)
class TransformerNet(torch.nn.Module):
def __init__(self):
super(TransformerNet, self).__init__()
# Initial convolution layers
self.conv1 = ConvLayer(3, 32, kernel_size=9, stride=1)
self.in1 = torch.nn.InstanceNorm2d(32, affine=True)
self.conv2 = ConvLayer(32, 64, kernel_size=3, stride=2)
self.in2 = torch.nn.InstanceNorm2d(64, affine=True)
self.conv3 = ConvLayer(64, 128, kernel_size=3, stride=2)
self.in3 = torch.nn.InstanceNorm2d(128, affine=True)
# Residual layers
self.res1 = ResidualBlock(128)
self.res2 = ResidualBlock(128)
self.res3 = ResidualBlock(128)
self.res4 = ResidualBlock(128)
self.res5 = ResidualBlock(128)
# Upsampling Layers
self.deconv1 = UpsampleConvLayer(128, 64, kernel_size=3, stride=1, upsample=2)
self.in4 = torch.nn.InstanceNorm2d(64, affine=True)
self.deconv2 = UpsampleConvLayer(64, 32, kernel_size=3, stride=1, upsample=2)
self.in5 = torch.nn.InstanceNorm2d(32, affine=True)
self.deconv3 = ConvLayer(32, 3, kernel_size=9, stride=1)
# Non-linearities
self.relu = torch.nn.ReLU()
def forward(self, X):
y = self.relu(self.in1(self.conv1(X)))
y = self.relu(self.in2(self.conv2(y)))
y = self.relu(self.in3(self.conv3(y)))
y = self.res1(y)
y = self.res2(y)
y = self.res3(y)
y = self.res4(y)
y = self.res5(y)
y = self.relu(self.in4(self.deconv1(y)))
y = self.relu(self.in5(self.deconv2(y)))
y = self.deconv3(y)
return y
class ConvLayer(torch.nn.Module):
def __init__(self, in_channels, out_channels, kernel_size, stride):
super(ConvLayer, self).__init__()
reflection_padding = kernel_size // 2
self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)
def forward(self, x):
out = self.reflection_pad(x)
out = self.conv2d(out)
return out
class ResidualBlock(torch.nn.Module):
"""ResidualBlock
introduced in: https://arxiv.org/abs/1512.03385
recommended architecture: http://torch.ch/blog/2016/02/04/resnets.html
"""
def __init__(self, channels):
super(ResidualBlock, self).__init__()
self.conv1 = ConvLayer(channels, channels, kernel_size=3, stride=1)
self.in1 = torch.nn.InstanceNorm2d(channels, affine=True)
self.conv2 = ConvLayer(channels, channels, kernel_size=3, stride=1)
self.in2 = torch.nn.InstanceNorm2d(channels, affine=True)
self.relu = torch.nn.ReLU()
def forward(self, x):
residual = x
out = self.relu(self.in1(self.conv1(x)))
out = self.in2(self.conv2(out))
out = out + residual
return out
class UpsampleConvLayer(torch.nn.Module):
"""UpsampleConvLayer
Upsamples the input and then does a convolution. This method gives better results
compared to ConvTranspose2d.
ref: http://distill.pub/2016/deconv-checkerboard/
"""
def __init__(self, in_channels, out_channels, kernel_size, stride, upsample=None):
super(UpsampleConvLayer, self).__init__()
self.upsample = upsample
if upsample:
self.upsample_layer = torch.nn.Upsample(mode='nearest', scale_factor=upsample)
reflection_padding = kernel_size // 2
self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)
def forward(self, x):
x_in = x
if self.upsample:
x_in = self.upsample_layer(x_in)
out = self.reflection_pad(x_in)
out = self.conv2d(out)
return out
def stylize(args):
device = torch.device("cuda" if args.cuda else "cpu")
with torch.no_grad():
style_model = TransformerNet()
state_dict = torch.load(os.path.join(args.model_dir, args.style + ".pth"))
# remove saved deprecated running_* keys in InstanceNorm from the checkpoint
for k in list(state_dict.keys()):
if re.search(r'in\d+\.running_(mean|var)$', k):
del state_dict[k]
style_model.load_state_dict(state_dict)
style_model.to(device)
filenames = os.listdir(args.content_dir)
for filename in filenames:
print("Processing {}".format(filename))
full_path = os.path.join(args.content_dir, filename)
content_image = load_image(full_path, scale=args.content_scale)
content_transform = transforms.Compose([
transforms.ToTensor(),
transforms.Lambda(lambda x: x.mul(255))
])
content_image = content_transform(content_image)
content_image = content_image.unsqueeze(0).to(device)
output = style_model(content_image).cpu()
output_path = os.path.join(args.output_dir, filename)
save_image(output_path, output[0])
def main():
arg_parser = argparse.ArgumentParser(description="parser for fast-neural-style")
arg_parser.add_argument("--content-scale", type=float, default=None,
help="factor for scaling down the content image")
arg_parser.add_argument("--model-dir", type=str, required=True,
help="saved model to be used for stylizing the image.")
arg_parser.add_argument("--cuda", type=int, required=True,
help="set it to 1 for running on GPU, 0 for CPU")
arg_parser.add_argument("--style", type=str,
help="style name")
arg_parser.add_argument("--content-dir", type=str, required=True,
help="directory holding the images")
arg_parser.add_argument("--output-dir", type=str, required=True,
help="directory holding the output images")
args = arg_parser.parse_args()
if args.cuda and not torch.cuda.is_available():
print("ERROR: cuda is not available, try running on CPU")
sys.exit(1)
os.makedirs(args.output_dir, exist_ok=True)
stylize(args)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,207 @@
# Original source: https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/neural_style.py
import argparse
import os
import sys
import re
from PIL import Image
import torch
from torchvision import transforms
from mpi4py import MPI
def load_image(filename, size=None, scale=None):
img = Image.open(filename)
if size is not None:
img = img.resize((size, size), Image.ANTIALIAS)
elif scale is not None:
img = img.resize((int(img.size[0] / scale), int(img.size[1] / scale)), Image.ANTIALIAS)
return img
def save_image(filename, data):
img = data.clone().clamp(0, 255).numpy()
img = img.transpose(1, 2, 0).astype("uint8")
img = Image.fromarray(img)
img.save(filename)
class TransformerNet(torch.nn.Module):
def __init__(self):
super(TransformerNet, self).__init__()
# Initial convolution layers
self.conv1 = ConvLayer(3, 32, kernel_size=9, stride=1)
self.in1 = torch.nn.InstanceNorm2d(32, affine=True)
self.conv2 = ConvLayer(32, 64, kernel_size=3, stride=2)
self.in2 = torch.nn.InstanceNorm2d(64, affine=True)
self.conv3 = ConvLayer(64, 128, kernel_size=3, stride=2)
self.in3 = torch.nn.InstanceNorm2d(128, affine=True)
# Residual layers
self.res1 = ResidualBlock(128)
self.res2 = ResidualBlock(128)
self.res3 = ResidualBlock(128)
self.res4 = ResidualBlock(128)
self.res5 = ResidualBlock(128)
# Upsampling Layers
self.deconv1 = UpsampleConvLayer(128, 64, kernel_size=3, stride=1, upsample=2)
self.in4 = torch.nn.InstanceNorm2d(64, affine=True)
self.deconv2 = UpsampleConvLayer(64, 32, kernel_size=3, stride=1, upsample=2)
self.in5 = torch.nn.InstanceNorm2d(32, affine=True)
self.deconv3 = ConvLayer(32, 3, kernel_size=9, stride=1)
# Non-linearities
self.relu = torch.nn.ReLU()
def forward(self, X):
y = self.relu(self.in1(self.conv1(X)))
y = self.relu(self.in2(self.conv2(y)))
y = self.relu(self.in3(self.conv3(y)))
y = self.res1(y)
y = self.res2(y)
y = self.res3(y)
y = self.res4(y)
y = self.res5(y)
y = self.relu(self.in4(self.deconv1(y)))
y = self.relu(self.in5(self.deconv2(y)))
y = self.deconv3(y)
return y
class ConvLayer(torch.nn.Module):
def __init__(self, in_channels, out_channels, kernel_size, stride):
super(ConvLayer, self).__init__()
reflection_padding = kernel_size // 2
self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)
def forward(self, x):
out = self.reflection_pad(x)
out = self.conv2d(out)
return out
class ResidualBlock(torch.nn.Module):
"""ResidualBlock
introduced in: https://arxiv.org/abs/1512.03385
recommended architecture: http://torch.ch/blog/2016/02/04/resnets.html
"""
def __init__(self, channels):
super(ResidualBlock, self).__init__()
self.conv1 = ConvLayer(channels, channels, kernel_size=3, stride=1)
self.in1 = torch.nn.InstanceNorm2d(channels, affine=True)
self.conv2 = ConvLayer(channels, channels, kernel_size=3, stride=1)
self.in2 = torch.nn.InstanceNorm2d(channels, affine=True)
self.relu = torch.nn.ReLU()
def forward(self, x):
residual = x
out = self.relu(self.in1(self.conv1(x)))
out = self.in2(self.conv2(out))
out = out + residual
return out
class UpsampleConvLayer(torch.nn.Module):
"""UpsampleConvLayer
Upsamples the input and then does a convolution. This method gives better results
compared to ConvTranspose2d.
ref: http://distill.pub/2016/deconv-checkerboard/
"""
def __init__(self, in_channels, out_channels, kernel_size, stride, upsample=None):
super(UpsampleConvLayer, self).__init__()
self.upsample = upsample
if upsample:
self.upsample_layer = torch.nn.Upsample(mode='nearest', scale_factor=upsample)
reflection_padding = kernel_size // 2
self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)
def forward(self, x):
x_in = x
if self.upsample:
x_in = self.upsample_layer(x_in)
out = self.reflection_pad(x_in)
out = self.conv2d(out)
return out
def stylize(args, comm):
rank = comm.Get_rank()
size = comm.Get_size()
device = torch.device("cuda" if args.cuda else "cpu")
with torch.no_grad():
style_model = TransformerNet()
state_dict = torch.load(os.path.join(args.model_dir, args.style + ".pth"))
# remove saved deprecated running_* keys in InstanceNorm from the checkpoint
for k in list(state_dict.keys()):
if re.search(r'in\d+\.running_(mean|var)$', k):
del state_dict[k]
style_model.load_state_dict(state_dict)
style_model.to(device)
filenames = os.listdir(args.content_dir)
filenames = sorted(filenames)
partition_size = len(filenames) // size
partitioned_filenames = filenames[rank * partition_size: (rank + 1) * partition_size]
print("RANK {} - is processing {} images out of the total {}".format(rank, len(partitioned_filenames),
len(filenames)))
output_paths = []
for filename in partitioned_filenames:
# print("Processing {}".format(filename))
full_path = os.path.join(args.content_dir, filename)
content_image = load_image(full_path, scale=args.content_scale)
content_transform = transforms.Compose([
transforms.ToTensor(),
transforms.Lambda(lambda x: x.mul(255))
])
content_image = content_transform(content_image)
content_image = content_image.unsqueeze(0).to(device)
output = style_model(content_image).cpu()
output_path = os.path.join(args.output_dir, filename)
save_image(output_path, output[0])
output_paths.append(output_path)
print("RANK {} - number of pre-aggregated output files {}".format(rank, len(output_paths)))
output_paths_list = comm.gather(output_paths, root=0)
if rank == 0:
print("RANK {} - number of aggregated output files {}".format(rank, len(output_paths_list)))
print("RANK {} - end".format(rank))
def main():
arg_parser = argparse.ArgumentParser(description="parser for fast-neural-style")
arg_parser.add_argument("--content-scale", type=float, default=None,
help="factor for scaling down the content image")
arg_parser.add_argument("--model-dir", type=str, required=True,
help="saved model to be used for stylizing the image.")
arg_parser.add_argument("--cuda", type=int, required=True,
help="set it to 1 for running on GPU, 0 for CPU")
arg_parser.add_argument("--style", type=str, help="style name")
arg_parser.add_argument("--content-dir", type=str, required=True,
help="directory holding the images")
arg_parser.add_argument("--output-dir", type=str, required=True,
help="directory holding the output images")
args = arg_parser.parse_args()
comm = MPI.COMM_WORLD
if args.cuda and not torch.cuda.is_available():
print("ERROR: cuda is not available, try running on CPU")
sys.exit(1)
os.makedirs(args.output_dir, exist_ok=True)
stylize(args, comm)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,728 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer-mpi.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Neural style transfer on video\n",
"Using modified code from `pytorch`'s neural style [example](https://pytorch.org/tutorials/advanced/neural_style_tutorial.html), we show how to setup a pipeline for doing style transfer on video. The pipeline has following steps:\n",
"1. Split a video into images\n",
"2. Run neural style on each image using one of the provided models (from `pytorch` pretrained models for this example).\n",
"3. Stitch the image back into a video."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the configuration Notebook located at https://github.com/Azure/MachineLearningNotebooks first if you haven't. This sets you up with a working config file that has information on your workspace, subscription id, etc. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object from persisted configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"from azureml.core import Workspace, Experiment\n",
"\n",
"ws = Workspace.from_config()\n",
"print('Workspace name: ' + ws.name, \n",
" 'Azure region: ' + ws.location, \n",
" 'Subscription id: ' + ws.subscription_id, \n",
" 'Resource group: ' + ws.resource_group, sep = '\\n')\n",
"\n",
"scripts_folder = \"mpi_scripts\"\n",
"\n",
"if not os.path.isdir(scripts_folder):\n",
" os.mkdir(scripts_folder)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute, ComputeTarget\n",
"from azureml.core.datastore import Datastore\n",
"from azureml.data.data_reference import DataReference\n",
"from azureml.pipeline.core import Pipeline, PipelineData\n",
"from azureml.pipeline.steps import PythonScriptStep, MpiStep\n",
"from azureml.core.runconfig import CondaDependencies, RunConfiguration\n",
"from azureml.core.compute_target import ComputeTargetException"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Create or use existing compute"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# AmlCompute\n",
"cpu_cluster_name = \"cpu-cluster\"\n",
"try:\n",
" cpu_cluster = AmlCompute(ws, cpu_cluster_name)\n",
" print(\"found existing cluster.\")\n",
"except ComputeTargetException:\n",
" print(\"creating new cluster\")\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_v2\",\n",
" max_nodes = 1)\n",
"\n",
" # create the cluster\n",
" cpu_cluster = ComputeTarget.create(ws, cpu_cluster_name, provisioning_config)\n",
" cpu_cluster.wait_for_completion(show_output=True)\n",
" \n",
"# AmlCompute\n",
"gpu_cluster_name = \"gpu-cluster\"\n",
"try:\n",
" gpu_cluster = AmlCompute(ws, gpu_cluster_name)\n",
" print(\"found existing cluster.\")\n",
"except ComputeTargetException:\n",
" print(\"creating new cluster\")\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_NC6\",\n",
" max_nodes = 3)\n",
"\n",
" # create the cluster\n",
" gpu_cluster = ComputeTarget.create(ws, gpu_cluster_name, provisioning_config)\n",
" gpu_cluster.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Python Scripts\n",
"We use an edited version of `neural_style_mpi.py` (original is [here](https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/neural_style.py)). Scripts to split and stitch the video are thin wrappers to calls to `ffmpeg`. These scripts are also located in the \"scripts_folder\".\n",
"\n",
"We install `ffmpeg` through conda dependencies."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile $scripts_folder/process_video.py\n",
"import argparse\n",
"import glob\n",
"import os\n",
"import subprocess\n",
"\n",
"parser = argparse.ArgumentParser(description=\"Process input video\")\n",
"parser.add_argument('--input_video', required=True)\n",
"parser.add_argument('--output_audio', required=True)\n",
"parser.add_argument('--output_images', required=True)\n",
"\n",
"args = parser.parse_args()\n",
"\n",
"os.makedirs(args.output_audio, exist_ok=True)\n",
"os.makedirs(args.output_images, exist_ok=True)\n",
"\n",
"subprocess.run(\"ffmpeg -i {} {}/video.aac\"\n",
" .format(args.input_video, args.output_audio),\n",
" shell=True, check=True\n",
" )\n",
"\n",
"subprocess.run(\"ffmpeg -i {} {}/%05d_video.jpg -hide_banner\"\n",
" .format(args.input_video, args.output_images),\n",
" shell=True, check=True\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile $scripts_folder/stitch_video.py\n",
"import argparse\n",
"import os\n",
"import subprocess\n",
"\n",
"parser = argparse.ArgumentParser(description=\"Process input video\")\n",
"parser.add_argument('--images_dir', required=True)\n",
"parser.add_argument('--input_audio', required=True)\n",
"parser.add_argument('--output_dir', required=True)\n",
"\n",
"args = parser.parse_args()\n",
"\n",
"os.makedirs(args.output_dir, exist_ok=True)\n",
"\n",
"subprocess.run(\"ffmpeg -framerate 30 -i {}/%05d_video.jpg -c:v libx264 -profile:v high -crf 20 -pix_fmt yuv420p \"\n",
" \"-y {}/video_without_audio.mp4\"\n",
" .format(args.images_dir, args.output_dir),\n",
" shell=True, check=True\n",
" )\n",
"\n",
"subprocess.run(\"ffmpeg -i {}/video_without_audio.mp4 -i {}/video.aac -map 0:0 -map 1:0 -vcodec \"\n",
" \"copy -acodec copy -y {}/video_with_audio.mp4\"\n",
" .format(args.output_dir, args.input_audio, args.output_dir),\n",
" shell=True, check=True\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The sample video **organutan.mp4** is stored at a publicly shared datastore. We are registering the datastore below. If you want to take a look at the original video, click here. (https://pipelinedata.blob.core.windows.net/sample-videos/orangutan.mp4)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# datastore for input video\n",
"account_name = \"pipelinedata\"\n",
"video_ds = Datastore.register_azure_blob_container(ws, \"videos\", \"sample-videos\",\n",
" account_name=account_name, overwrite=True)\n",
"\n",
"# datastore for models\n",
"models_ds = Datastore.register_azure_blob_container(ws, \"models\", \"styletransfer\", \n",
" account_name=\"pipelinedata\", \n",
" overwrite=True)\n",
" \n",
"# downloaded models from https://pytorch.org/tutorials/advanced/neural_style_tutorial.html are kept here\n",
"models_dir = DataReference(data_reference_name=\"models\", datastore=models_ds, \n",
" path_on_datastore=\"saved_models\", mode=\"download\")\n",
"\n",
"# the default blob store attached to a workspace\n",
"default_datastore = ws.get_default_datastore()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Sample video"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"video_name=os.getenv(\"STYLE_TRANSFER_VIDEO_NAME\", \"orangutan.mp4\") \n",
"orangutan_video = DataReference(datastore=video_ds,\n",
" data_reference_name=\"video\",\n",
" path_on_datastore=video_name, mode=\"download\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"cd = CondaDependencies()\n",
"\n",
"cd.add_channel(\"conda-forge\")\n",
"cd.add_conda_package(\"ffmpeg\")\n",
"\n",
"cd.add_channel(\"pytorch\")\n",
"cd.add_conda_package(\"pytorch\")\n",
"cd.add_conda_package(\"torchvision\")\n",
"\n",
"# Runconfig\n",
"amlcompute_run_config = RunConfiguration(conda_dependencies=cd)\n",
"amlcompute_run_config.environment.docker.enabled = True\n",
"amlcompute_run_config.environment.docker.base_image = \"pytorch/pytorch\"\n",
"amlcompute_run_config.environment.spark.precache_packages = False"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ffmpeg_audio = PipelineData(name=\"ffmpeg_audio\", datastore=default_datastore)\n",
"ffmpeg_images = PipelineData(name=\"ffmpeg_images\", datastore=default_datastore)\n",
"processed_images = PipelineData(name=\"processed_images\", datastore=default_datastore)\n",
"output_video = PipelineData(name=\"output_video\", datastore=default_datastore)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Define tweakable parameters to pipeline\n",
"These parameters can be changed when the pipeline is published and rerun from a REST call"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core.graph import PipelineParameter\n",
"# create a parameter for style (one of \"candy\", \"mosaic\", \"rain_princess\", \"udnie\") to transfer the images to\n",
"style_param = PipelineParameter(name=\"style\", default_value=\"mosaic\")\n",
"# create a parameter for the number of nodes to use in step no. 2 (style transfer)\n",
"nodecount_param = PipelineParameter(name=\"nodecount\", default_value=1)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"split_video_step = PythonScriptStep(\n",
" name=\"split video\",\n",
" script_name=\"process_video.py\",\n",
" arguments=[\"--input_video\", orangutan_video,\n",
" \"--output_audio\", ffmpeg_audio,\n",
" \"--output_images\", ffmpeg_images,\n",
" ],\n",
" compute_target=cpu_cluster,\n",
" inputs=[orangutan_video],\n",
" outputs=[ffmpeg_images, ffmpeg_audio],\n",
" runconfig=amlcompute_run_config,\n",
" source_directory=scripts_folder\n",
")\n",
"\n",
"# create a MPI step for distributing style transfer step across multiple nodes in AmlCompute \n",
"# using 'nodecount_param' PipelineParameter\n",
"distributed_style_transfer_step = MpiStep(\n",
" name=\"mpi style transfer\",\n",
" script_name=\"neural_style_mpi.py\",\n",
" arguments=[\"--content-dir\", ffmpeg_images,\n",
" \"--output-dir\", processed_images,\n",
" \"--model-dir\", models_dir,\n",
" \"--style\", style_param,\n",
" \"--cuda\", 1\n",
" ],\n",
" compute_target=gpu_cluster,\n",
" node_count=nodecount_param, \n",
" process_count_per_node=1,\n",
" inputs=[models_dir, ffmpeg_images],\n",
" outputs=[processed_images],\n",
" pip_packages=[\"mpi4py\", \"torch\", \"torchvision\"],\n",
" use_gpu=True,\n",
" source_directory=scripts_folder\n",
")\n",
"\n",
"stitch_video_step = PythonScriptStep(\n",
" name=\"stitch\",\n",
" script_name=\"stitch_video.py\",\n",
" arguments=[\"--images_dir\", processed_images, \n",
" \"--input_audio\", ffmpeg_audio, \n",
" \"--output_dir\", output_video],\n",
" compute_target=cpu_cluster,\n",
" inputs=[processed_images, ffmpeg_audio],\n",
" outputs=[output_video],\n",
" runconfig=amlcompute_run_config,\n",
" source_directory=scripts_folder\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Run the pipeline"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline = Pipeline(workspace=ws, steps=[stitch_video_step])\n",
"# submit the pipeline and provide values for the PipelineParameters used in the pipeline\n",
"pipeline_run = Experiment(ws, 'style_transfer').submit(pipeline, pipeline_parameters={\"style\": \"mosaic\", \"nodecount\": 3})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Monitor using widget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.widgets import RunDetails\n",
"RunDetails(pipeline_run).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Downloads the video in `output_video` folder"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Download output video"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def download_video(run, target_dir=None):\n",
" stitch_run = run.find_step_run(\"stitch\")[0]\n",
" port_data = stitch_run.get_output_data(\"output_video\")\n",
" port_data.download(target_dir, show_progress=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run.wait_for_completion()\n",
"download_video(pipeline_run, \"output_video_mosaic\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Publish pipeline"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline = pipeline_run.publish_pipeline(\n",
" name=\"batch score style transfer\", description=\"style transfer\", version=\"1.0\")\n",
"\n",
"published_pipeline"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get published pipeline\n",
"\n",
"You can get the published pipeline using **pipeline id**.\n",
"\n",
"To get all the published pipelines for a given workspace(ws): \n",
"```css\n",
"all_pub_pipelines = PublishedPipeline.get_all(ws)\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core import PublishedPipeline\n",
"\n",
"pipeline_id = published_pipeline.id # use your published pipeline id\n",
"published_pipeline = PublishedPipeline.get(ws, pipeline_id)\n",
"\n",
"published_pipeline"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Re-run pipeline through REST calls for other styles"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get AAD token\n",
"[This notebook](https://aka.ms/pl-restep-auth) shows how to authenticate to AML workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"import requests\n",
"\n",
"auth = InteractiveLoginAuthentication()\n",
"aad_token = auth.get_authentication_header()\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get endpoint URL"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"rest_endpoint = published_pipeline.endpoint"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Send request and monitor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Run the pipeline using PipelineParameter values style='candy' and nodecount=2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": \"style_transfer\",\n",
" \"ParameterAssignments\": {\"style\": \"candy\", \"nodecount\": 2}})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core.run import PipelineRun\n",
"published_pipeline_run_candy = PipelineRun(ws.experiments[\"style_transfer\"], run_id)\n",
"RunDetails(published_pipeline_run_candy).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Run the pipeline using PipelineParameter values style='rain_princess' and nodecount=3"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": \"style_transfer\",\n",
" \"ParameterAssignments\": {\"style\": \"rain_princess\", \"nodecount\": 3}})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_rain = PipelineRun(ws.experiments[\"style_transfer\"], run_id)\n",
"RunDetails(published_pipeline_run_rain).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Run the pipeline using PipelineParameter values style='udnie' and nodecount=4"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": \"style_transfer\",\n",
" \"ParameterAssignments\": {\"style\": \"udnie\", \"nodecount\": 3}})\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_udnie = PipelineRun(ws.experiments[\"style_transfer\"], run_id)\n",
"RunDetails(published_pipeline_run_udnie).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Download output from re-run"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_candy.wait_for_completion()\n",
"published_pipeline_run_rain.wait_for_completion()\n",
"published_pipeline_run_udnie.wait_for_completion()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"download_video(published_pipeline_run_candy, target_dir=\"output_video_candy\")\n",
"download_video(published_pipeline_run_rain, target_dir=\"output_video_rain_princess\")\n",
"download_video(published_pipeline_run_udnie, target_dir=\"output_video_udnie\")"
]
}
],
"metadata": {
"authors": [
{
"name": "balapv mabables"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,8 +1,7 @@
name: pipeline-style-transfer
name: pipeline-style-transfer-mpi
dependencies:
- pip:
- azureml-sdk
- azureml-contrib-pipeline-steps
- azureml-pipeline-steps
- azureml-widgets
- requests

View File

@@ -13,7 +13,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer.png)"
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer-parallel-run.png)"
]
},
{
@@ -26,11 +26,8 @@
"2. Run neural style on each image using one of the provided models (from `pytorch` pretrained models for this example).\n",
"3. Stitch the image back into a video.\n",
"\n",
"> **Note**\n",
"This notebook uses public preview functionality (ParallelRunStep). Please install azureml-contrib-pipeline-steps package before running this notebook.\n",
"```\n",
"pip install azureml-contrib-pipeline-steps\n",
"```"
"> **Tip**\n",
"If your system requires low-latency processing (to process a single document or small set of documents quickly), use [real-time scoring](https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-consume-web-service) instead of batch prediction."
]
},
{
@@ -356,7 +353,9 @@
"source": [
"from azureml.pipeline.core.graph import PipelineParameter\n",
"# create a parameter for style (one of \"candy\", \"mosaic\") to transfer the images to\n",
"style_param = PipelineParameter(name=\"style\", default_value=\"mosaic\")"
"style_param = PipelineParameter(name=\"style\", default_value=\"mosaic\")\n",
"# create a parameter for the number of nodes to use in step no. 2 (style transfer)\n",
"nodecount_param = PipelineParameter(name=\"nodecount\", default_value=2)"
]
},
{
@@ -415,6 +414,8 @@
"parallel_cd.add_conda_package(\"pytorch\")\n",
"parallel_cd.add_conda_package(\"torchvision\")\n",
"parallel_cd.add_conda_package(\"pillow<7\") # needed for torchvision==0.4.0\n",
"parallel_cd.add_pip_package(\"azureml-core\")\n",
"parallel_cd.add_pip_package(\"azureml-dataprep[fuse]\")\n",
"\n",
"styleenvironment = Environment(name=\"styleenvironment\")\n",
"styleenvironment.python.conda_dependencies=parallel_cd\n",
@@ -427,7 +428,8 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.pipeline.steps import ParallelRunConfig\n",
"from azureml.pipeline.core import PipelineParameter\n",
"from azureml.pipeline.steps import ParallelRunConfig\n",
"\n",
"parallel_run_config = ParallelRunConfig(\n",
" environment=styleenvironment,\n",
@@ -437,7 +439,9 @@
" error_threshold=1,\n",
" source_directory=scripts_folder,\n",
" compute_target=gpu_cluster, \n",
" node_count=3)"
" node_count=nodecount_param,\n",
" process_count_per_node=2\n",
")"
]
},
{
@@ -446,7 +450,7 @@
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.pipeline.steps import ParallelRunStep\n",
"from azureml.pipeline.steps import ParallelRunStep\n",
"from datetime import datetime\n",
"\n",
"parallel_step_name = 'styletransfer-' + datetime.now().strftime('%Y%m%d%H%M')\n",
@@ -455,9 +459,6 @@
" name=parallel_step_name,\n",
" inputs=[ffmpeg_images_file_dataset], # Input file share/blob container/file dataset\n",
" output=processed_images, # Output file share/blob container\n",
" models=[mosaic_model, candy_model],\n",
" tags = {'scenario': \"batch inference\", 'type': \"demo\"},\n",
" properties = {'area': \"style transfer\"},\n",
" arguments=[\"--style\", style_param],\n",
" parallel_run_config=parallel_run_config,\n",
" allow_reuse=True #[optional - default value True]\n",
@@ -666,7 +667,8 @@
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": experiment_name,\n",
" \"ParameterAssignments\": {\"style\": \"candy\", \"aml_node_count\": 2}})\n",
" \"ParameterAssignments\": {\"style\": \"candy\", \"NodeCount\": 3}})\n",
"\n",
"run_id = response.json()[\"Id\"]\n",
"\n",
"from azureml.pipeline.core.run import PipelineRun\n",

View File

@@ -0,0 +1,7 @@
name: pipeline-style-transfer-parallel-run
dependencies:
- pip:
- azureml-sdk
- azureml-pipeline-steps
- azureml-widgets
- requests

View File

@@ -381,7 +381,7 @@
"metadata": {},
"outputs": [],
"source": [
"run.cancel()"
"run.wait_for_completion(show_output=True)"
]
},
{

View File

@@ -456,6 +456,24 @@
"monitor.enable_schedule()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Delete the DataDriftDetector\n",
"\n",
"Invoking the `delete()` method on the object deletes the the drift monitor permanently and cannot be undone. You will no longer be able to find it in the UI and the `list()` or `get()` methods. The object on which delete() was called will have its state set to deleted and name suffixed with deleted. The baseline and target datasets and model data that was collected, if any, are not deleted. The compute is not deleted. The DataDrift schedule pipeline is disabled and archived."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"monitor.delete()"
]
},
{
"cell_type": "markdown",
"metadata": {},

View File

@@ -7,4 +7,5 @@ dependencies:
- azureml-monitoring
- scikit-learn
- numpy
- packaging
- inference-schema[numpy-support]

View File

@@ -4,7 +4,13 @@ import numpy as np
from azureml.monitoring import ModelDataCollector
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType
from inference_schema.schema_decorators import input_schema, output_schema
from sklearn.externals import joblib
# sklearn.externals.joblib is removed in 0.23
from sklearn import __version__ as sklearnver
from packaging.version import Version
if Version(sklearnver) < Version("0.23.0"):
from sklearn.externals import joblib
else:
import joblib
def init():

Some files were not shown because too many files have changed in this diff Show More