Compare commits

...

5 Commits

| Author | SHA1 | Message | Date |
|--------|------|---------|------|
| amlrelsa-ms | 6c629f1eda | update samples from Release-57 as a part of SDK release | 2020-07-06 22:05:24 +00:00 |
| Harneet Virk | 053efde8c9 | Merge pull request #1022 from Azure/release_update/Release-56 (update samples from Release-56 as a part of SDK release) | 2020-06-22 11:12:31 -07:00 |
| amlrelsa-ms | 5189691f06 | update samples from Release-56 as a part of SDK release | 2020-06-22 18:11:40 +00:00 |
| Harneet Virk | fb900916e3 | Update README.md | 2020-06-11 13:26:04 -07:00 |
| Harneet Virk | 738347f3da | Merge pull request #996 from Azure/release_update/Release-55 (update samples from Release-55 as a part of SDK release) | 2020-06-08 15:31:35 -07:00 |
80 changed files with 3004 additions and 1348 deletions

View File

@@ -103,7 +103,7 @@
 "source": [
 "import azureml.core\n",
 "\n",
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -0,0 +1,549 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/contrib/fairness/fairlearn-azureml-mitigation.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Unfairness Mitigation with Fairlearn and Azure Machine Learning\n",
"**This notebook shows how to upload results from Fairlearn's GridSearch mitigation algorithm into a dashboard in Azure Machine Learning Studio**\n",
"\n",
"## Table of Contents\n",
"\n",
"1. [Introduction](#Introduction)\n",
"1. [Loading the Data](#LoadingData)\n",
"1. [Training an Unmitigated Model](#UnmitigatedModel)\n",
"1. [Mitigation with GridSearch](#Mitigation)\n",
"1. [Uploading a Fairness Dashboard to Azure](#AzureUpload)\n",
" 1. Registering models\n",
" 1. Computing Fairness Metrics\n",
" 1. Uploading to Azure\n",
"1. [Conclusion](#Conclusion)\n",
"\n",
"<a id=\"Introduction\"></a>\n",
"## Introduction\n",
"This notebook shows how to use [Fairlearn (an open source fairness assessment and unfairness mitigation package)](http://fairlearn.github.io) and Azure Machine Learning Studio for a binary classification problem. This example uses the well-known adult census dataset. For the purposes of this notebook, we shall treat this as a loan decision problem. We will pretend that the label indicates whether or not each individual repaid a loan in the past. We will use the data to train a predictor to predict whether previously unseen individuals will repay a loan or not. The assumption is that the model predictions are used to decide whether an individual should be offered a loan. Its purpose is purely illustrative of a workflow including a fairness dashboard - in particular, we do **not** include a full discussion of the detailed issues which arise when considering fairness in machine learning. For such discussions, please [refer to the Fairlearn website](http://fairlearn.github.io/).\n",
"\n",
"We will apply the [grid search algorithm](https://fairlearn.github.io/api_reference/fairlearn.reductions.html#fairlearn.reductions.GridSearch) from the Fairlearn package using a specific notion of fairness called Demographic Parity. This produces a set of models, and we will view these in a dashboard both locally and in the Azure Machine Learning Studio.\n",
"\n",
"### Setup\n",
"\n",
"To use this notebook, an Azure Machine Learning workspace is required.\n",
"Please see the [configuration notebook](../../configuration.ipynb) for information about creating one, if required.\n",
"This notebook also requires the following packages:\n",
"* `azureml-contrib-fairness`\n",
"* `fairlearn==0.4.6`\n",
"* `joblib`\n",
"* `shap`\n",
"\n",
"\n",
"<a id=\"LoadingData\"></a>\n",
"## Loading the Data\n",
"We use the well-known `adult` census dataset, which we load using `shap` (for convenience). We start with a fairly unremarkable set of imports:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from fairlearn.reductions import GridSearch, DemographicParity, ErrorRate\n",
"from fairlearn.widget import FairlearnDashboard\n",
"from sklearn import svm\n",
"from sklearn.preprocessing import LabelEncoder, StandardScaler\n",
"from sklearn.linear_model import LogisticRegression\n",
"import pandas as pd\n",
"import shap"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can now load and inspect the data from the `shap` package:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_raw, Y = shap.datasets.adult()\n",
"X_raw[\"Race\"].value_counts().to_dict()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We are going to treat the sex of each individual as a protected attribute (where 0 indicates female and 1 indicates male), and in this particular case we are going separate this attribute out and drop it from the main data (this is not always the best option - see the [Fairlearn website](http://fairlearn.github.io/) for further discussion). We also separate out the Race column, but we will not perform any mitigation based on it. Finally, we perform some standard data preprocessing steps to convert the data into a format suitable for the ML algorithms"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"A = X_raw[['Sex','Race']]\n",
"X = X_raw.drop(labels=['Sex', 'Race'],axis = 1)\n",
"X = pd.get_dummies(X)\n",
"\n",
"\n",
"le = LabelEncoder()\n",
"Y = le.fit_transform(Y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"With our data prepared, we can make the conventional split in to 'test' and 'train' subsets:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.model_selection import train_test_split\n",
"X_train, X_test, Y_train, Y_test, A_train, A_test = train_test_split(X_raw, \n",
" Y, \n",
" A,\n",
" test_size = 0.2,\n",
" random_state=0,\n",
" stratify=Y)\n",
"\n",
"# Work around indexing issue\n",
"X_train = X_train.reset_index(drop=True)\n",
"A_train = A_train.reset_index(drop=True)\n",
"X_test = X_test.reset_index(drop=True)\n",
"A_test = A_test.reset_index(drop=True)\n",
"\n",
"# Improve labels\n",
"A_test.Sex.loc[(A_test['Sex'] == 0)] = 'female'\n",
"A_test.Sex.loc[(A_test['Sex'] == 1)] = 'male'\n",
"\n",
"\n",
"A_test.Race.loc[(A_test['Race'] == 0)] = 'Amer-Indian-Eskimo'\n",
"A_test.Race.loc[(A_test['Race'] == 1)] = 'Asian-Pac-Islander'\n",
"A_test.Race.loc[(A_test['Race'] == 2)] = 'Black'\n",
"A_test.Race.loc[(A_test['Race'] == 3)] = 'Other'\n",
"A_test.Race.loc[(A_test['Race'] == 4)] = 'White'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"UnmitigatedModel\"></a>\n",
"## Training an Unmitigated Model\n",
"\n",
"So we have a point of comparison, we first train a model (specifically, logistic regression from scikit-learn) on the raw data, without applying any mitigation algorithm:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"unmitigated_predictor = LogisticRegression(solver='liblinear', fit_intercept=True)\n",
"\n",
"unmitigated_predictor.fit(X_train, Y_train)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can view this model in the fairness dashboard, and see the disparities which appear:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"FairlearnDashboard(sensitive_features=A_test, sensitive_feature_names=['Sex', 'Race'],\n",
" y_true=Y_test,\n",
" y_pred={\"unmitigated\": unmitigated_predictor.predict(X_test)})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Looking at the disparity in accuracy when we select 'Sex' as the sensitive feature, we see that males have an error rate about three times greater than the females. More interesting is the disparity in opportunitiy - males are offered loans at three times the rate of females.\n",
"\n",
"Despite the fact that we removed the feature from the training data, our predictor still discriminates based on sex. This demonstrates that simply ignoring a protected attribute when fitting a predictor rarely eliminates unfairness. There will generally be enough other features correlated with the removed attribute to lead to disparate impact."
]
},
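{
"cell_type": "markdown",
"metadata": {},
"source": [
"As an aside (this sketch is not part of the original workflow), the disparity in opportunity reported by the dashboard can be checked directly: under demographic parity it is just the difference in selection rates - the fraction of each group predicted to repay - between the groups:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: per-group selection rates for the unmitigated model\n",
"selection_rates = pd.Series(unmitigated_predictor.predict(X_test)).groupby(A_test.Sex.values).mean()\n",
"print(selection_rates)\n",
"print(\"Demographic parity difference:\", selection_rates.max() - selection_rates.min())"
]
},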
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"Mitigation\"></a>\n",
"## Mitigation with GridSearch\n",
"\n",
"The `GridSearch` class in `Fairlearn` implements a simplified version of the exponentiated gradient reduction of [Agarwal et al. 2018](https://arxiv.org/abs/1803.02453). The user supplies a standard ML estimator, which is treated as a blackbox - for this simple example, we shall use the logistic regression estimator from scikit-learn. `GridSearch` works by generating a sequence of relabellings and reweightings, and trains a predictor for each.\n",
"\n",
"For this example, we specify demographic parity (on the protected attribute of sex) as the fairness metric. Demographic parity requires that individuals are offered the opportunity (a loan in this example) independent of membership in the protected class (i.e., females and males should be offered loans at the same rate). *We are using this metric for the sake of simplicity* in this example; the appropriate fairness metric can only be selected after *careful examination of the broader context* in which the model is to be used."
]
},
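{
"cell_type": "markdown",
"metadata": {},
"source": [
"Formally (a standard definition, stated here for reference), demographic parity requires the prediction $\\hat{Y}$ to be independent of the protected attribute $A$:\n",
"\n",
"$$P(\\hat{Y} = 1 \\mid A = a) = P(\\hat{Y} = 1) \\quad \\text{for all groups } a$$"
]
},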
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sweep = GridSearch(LogisticRegression(solver='liblinear', fit_intercept=True),\n",
" constraints=DemographicParity(),\n",
" grid_size=71)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"With our estimator created, we can fit it to the data. After `fit()` completes, we extract the full set of predictors from the `GridSearch` object.\n",
"\n",
"The following cell trains a many copies of the underlying estimator, and may take a minute or two to run:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sweep.fit(X_train, Y_train,\n",
" sensitive_features=A_train.Sex)\n",
"\n",
"predictors = sweep._predictors"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We could load these predictors into the Fairness dashboard now. However, the plot would be somewhat confusing due to their number. In this case, we are going to remove the predictors which are dominated in the error-disparity space by others from the sweep (note that the disparity will only be calculated for the protected attribute; other potentially protected attributes will *not* be mitigated). In general, one might not want to do this, since there may be other considerations beyond the strict optimisation of error and disparity (of the given protected attribute)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"errors, disparities = [], []\n",
"for m in predictors:\n",
" classifier = lambda X: m.predict(X)\n",
" \n",
" error = ErrorRate()\n",
" error.load_data(X_train, pd.Series(Y_train), sensitive_features=A_train.Sex)\n",
" disparity = DemographicParity()\n",
" disparity.load_data(X_train, pd.Series(Y_train), sensitive_features=A_train.Sex)\n",
" \n",
" errors.append(error.gamma(classifier)[0])\n",
" disparities.append(disparity.gamma(classifier).max())\n",
" \n",
"all_results = pd.DataFrame( {\"predictor\": predictors, \"error\": errors, \"disparity\": disparities})\n",
"\n",
"dominant_models_dict = dict()\n",
"base_name_format = \"census_gs_model_{0}\"\n",
"row_id = 0\n",
"for row in all_results.itertuples():\n",
" model_name = base_name_format.format(row_id)\n",
" errors_for_lower_or_eq_disparity = all_results[\"error\"][all_results[\"disparity\"]<=row.disparity]\n",
" if row.error <= errors_for_lower_or_eq_disparity.min():\n",
" dominant_models_dict[model_name] = row.predictor\n",
" row_id = row_id + 1"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can construct predictions for the dominant models (we include the unmitigated predictor as well, for comparison):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"predictions_dominant = {\"census_unmitigated\": unmitigated_predictor.predict(X_test)}\n",
"models_dominant = {\"census_unmitigated\": unmitigated_predictor}\n",
"for name, predictor in dominant_models_dict.items():\n",
" value = predictor.predict(X_test)\n",
" predictions_dominant[name] = value\n",
" models_dominant[name] = predictor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"These predictions may then be viewed in the fairness dashboard. We include the race column from the dataset, as an alternative basis for assessing the models. However, since we have not based our mitigation on it, the variation in the models with respect to race can be large."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"FairlearnDashboard(sensitive_features=A_test, \n",
" sensitive_feature_names=['Sex', 'Race'],\n",
" y_true=Y_test.tolist(),\n",
" y_pred=predictions_dominant)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"When using sex as the sensitive feature, we see a Pareto front forming - the set of predictors which represent optimal tradeoffs between accuracy and disparity in predictions. In the ideal case, we would have a predictor at (1,0) - perfectly accurate and without any unfairness under demographic parity (with respect to the protected attribute \"sex\"). The Pareto front represents the closest we can come to this ideal based on our data and choice of estimator. Note the range of the axes - the disparity axis covers more values than the accuracy, so we can reduce disparity substantially for a small loss in accuracy. Finally, we also see that the unmitigated model is towards the top right of the plot, with high accuracy, but worst disparity.\n",
"\n",
"By clicking on individual models on the plot, we can inspect their metrics for disparity and accuracy in greater detail. In a real example, we would then pick the model which represented the best trade-off between accuracy and disparity given the relevant business constraints."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"AzureUpload\"></a>\n",
"## Uploading a Fairness Dashboard to Azure\n",
"\n",
"Uploading a fairness dashboard to Azure is a two stage process. The `FairlearnDashboard` invoked in the previous section relies on the underlying Python kernel to compute metrics on demand. This is obviously not available when the fairness dashboard is rendered in AzureML Studio. By default, the dashboard in Azure Machine Learning Studio also requires the models to be registered. The required stages are therefore:\n",
"1. Register the dominant models\n",
"1. Precompute all the required metrics\n",
"1. Upload to Azure\n",
"\n",
"Before that, we need to connect to Azure Machine Learning Studio:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace, Experiment, Model\n",
"\n",
"ws = Workspace.from_config()\n",
"ws.get_details()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"RegisterModels\"></a>\n",
"### Registering Models\n",
"\n",
"The fairness dashboard is designed to integrate with registered models, so we need to do this for the models we want in the Studio portal. The assumption is that the names of the models specified in the dashboard dictionary correspond to the `id`s (i.e. `<name>:<version>` pairs) of registered models in the workspace. We register each of the models in the `models_dominant` dictionary into the workspace. For this, we have to save each model to a file, and then register that file:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"import os\n",
"\n",
"os.makedirs('models', exist_ok=True)\n",
"def register_model(name, model):\n",
" print(\"Registering \", name)\n",
" model_path = \"models/{0}.pkl\".format(name)\n",
" joblib.dump(value=model, filename=model_path)\n",
" registered_model = Model.register(model_path=model_path,\n",
" model_name=name,\n",
" workspace=ws)\n",
" print(\"Registered \", registered_model.id)\n",
" return registered_model.id\n",
"\n",
"model_name_id_mapping = dict()\n",
"for name, model in models_dominant.items():\n",
" m_id = register_model(name, model)\n",
" model_name_id_mapping[name] = m_id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, produce new predictions dictionaries, with the updated names:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"predictions_dominant_ids = dict()\n",
"for name, y_pred in predictions_dominant.items():\n",
" predictions_dominant_ids[model_name_id_mapping[name]] = y_pred"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"PrecomputeMetrics\"></a>\n",
"### Precomputing Metrics\n",
"\n",
"We create a _dashboard dictionary_ using Fairlearn's `metrics` package. The `_create_group_metric_set` method has arguments similar to the Dashboard constructor, except that the sensitive features are passed as a dictionary (to ensure that names are available), and we must specify the type of prediction. Note that we use the `predictions_dominant_ids` dictionary we just created:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sf = { 'sex': A_test.Sex, 'race': A_test.Race }\n",
"\n",
"from fairlearn.metrics._group_metric_set import _create_group_metric_set\n",
"\n",
"\n",
"dash_dict = _create_group_metric_set(y_true=Y_test,\n",
" predictions=predictions_dominant_ids,\n",
" sensitive_features=sf,\n",
" prediction_type='binary_classification')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"DashboardUpload\"></a>\n",
"### Uploading the Dashboard\n",
"\n",
"Now, we import our `contrib` package which contains the routine to perform the upload:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.fairness import upload_dashboard_dictionary, download_dashboard_by_upload_id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we can create an Experiment, then a Run, and upload our dashboard to it:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"exp = Experiment(ws, \"Test_Fairlearn_GridSearch_Census_Demo\")\n",
"print(exp)\n",
"\n",
"run = exp.start_logging()\n",
"try:\n",
" dashboard_title = \"Dominant Models from GridSearch\"\n",
" upload_id = upload_dashboard_dictionary(run,\n",
" dash_dict,\n",
" dashboard_name=dashboard_title)\n",
" print(\"\\nUploaded to id: {0}\\n\".format(upload_id))\n",
"\n",
" downloaded_dict = download_dashboard_by_upload_id(run, upload_id)\n",
"finally:\n",
" run.complete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The dashboard can be viewed in the Run Details page.\n",
"\n",
"Finally, we can verify that the dashboard dictionary which we downloaded matches our upload:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(dash_dict == downloaded_dict)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"Conclusion\"></a>\n",
"## Conclusion\n",
"\n",
"In this notebook we have demonstrated how to use the `GridSearch` algorithm from Fairlearn to generate a collection of models, and then present them in the fairness dashboard in Azure Machine Learning Studio. Please remember that this notebook has not attempted to discuss the many considerations which should be part of any approach to unfairness mitigation. The [Fairlearn website](http://fairlearn.github.io/) provides that discussion"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"authors": [
{
"name": "riedgar"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.10"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -0,0 +1,8 @@
name: fairlearn-azureml-mitigation
dependencies:
- pip:
  - azureml-sdk
  - azureml-contrib-fairness
  - fairlearn==0.4.6
  - joblib
  - shap

View File

@@ -0,0 +1,494 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved. \n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/contrib/fairness/upload-fairness-dashboard.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Upload a Fairness Dashboard to Azure Machine Learning Studio\n",
"**This notebook shows how to generate and upload a fairness assessment dashboard from Fairlearn to AzureML Studio**\n",
"\n",
"## Table of Contents\n",
"\n",
"1. [Introduction](#Introduction)\n",
"1. [Loading the Data](#LoadingData)\n",
"1. [Processing the Data](#ProcessingData)\n",
"1. [Training Models](#TrainingModels)\n",
"1. [Logging in to AzureML](#LoginAzureML)\n",
"1. [Registering the Models](#RegisterModels)\n",
"1. [Using the Fairlearn Dashboard](#LocalDashboard)\n",
"1. [Uploading a Fairness Dashboard to Azure](#AzureUpload)\n",
" 1. Computing Fairness Metrics\n",
" 1. Uploading to Azure\n",
"1. [Conclusion](#Conclusion)\n",
" \n",
"\n",
"<a id=\"Introduction\"></a>\n",
"## Introduction\n",
"\n",
"In this notebook, we walk through a simple example of using the `azureml-contrib-fairness` package to upload a collection of fairness statistics for a fairness dashboard. It is an example of integrating the [open source Fairlearn package](https://www.github.com/fairlearn/fairlearn) with Azure Machine Learning. This is not an example of fairness analysis or mitigation - this notebook simply shows how to get a fairness dashboard into the Azure Machine Learning portal. We will load the data and train a couple of simple models. We will then use Fairlearn to generate data for a Fairness dashboard, which we can upload to Azure Machine Learning portal and view there.\n",
"\n",
"### Setup\n",
"\n",
"To use this notebook, an Azure Machine Learning workspace is required.\n",
"Please see the [configuration notebook](../../configuration.ipynb) for information about creating one, if required.\n",
"This notebook also requires the following packages:\n",
"* `azureml-contrib-fairness`\n",
"* `fairlearn==0.4.6`\n",
"* `joblib`\n",
"* `shap`\n",
"\n",
"\n",
"\n",
"\n",
"<a id=\"LoadingData\"></a>\n",
"## Loading the Data\n",
"We use the well-known `adult` census dataset, which we load using `shap` (for convenience). We start with a fairly unremarkable set of imports:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn import svm\n",
"from sklearn.preprocessing import LabelEncoder, StandardScaler\n",
"from sklearn.linear_model import LogisticRegression\n",
"import pandas as pd\n",
"import shap"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we can load the data:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"X_raw, Y = shap.datasets.adult()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can take a look at some of the data. For example, the next cells shows the counts of the different races identified in the dataset:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(X_raw[\"Race\"].value_counts().to_dict())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"ProcessingData\"></a>\n",
"## Processing the Data\n",
"\n",
"With the data loaded, we process it for our needs. First, we extract the sensitive features of interest into `A` (conventionally used in the literature) and put the rest of the feature data into `X`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"A = X_raw[['Sex','Race']]\n",
"X = X_raw.drop(labels=['Sex', 'Race'],axis = 1)\n",
"X = pd.get_dummies(X)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Next, we apply a standard set of scalings:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sc = StandardScaler()\n",
"X_scaled = sc.fit_transform(X)\n",
"X_scaled = pd.DataFrame(X_scaled, columns=X.columns)\n",
"\n",
"le = LabelEncoder()\n",
"Y = le.fit_transform(Y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can then split our data into training and test sets, and also make the labels on our test portion of `A` human-readable:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from sklearn.model_selection import train_test_split\n",
"X_train, X_test, Y_train, Y_test, A_train, A_test = train_test_split(X_scaled, \n",
" Y, \n",
" A,\n",
" test_size = 0.2,\n",
" random_state=0,\n",
" stratify=Y)\n",
"\n",
"# Work around indexing issue\n",
"X_train = X_train.reset_index(drop=True)\n",
"A_train = A_train.reset_index(drop=True)\n",
"X_test = X_test.reset_index(drop=True)\n",
"A_test = A_test.reset_index(drop=True)\n",
"\n",
"# Improve labels\n",
"A_test.Sex.loc[(A_test['Sex'] == 0)] = 'female'\n",
"A_test.Sex.loc[(A_test['Sex'] == 1)] = 'male'\n",
"\n",
"\n",
"A_test.Race.loc[(A_test['Race'] == 0)] = 'Amer-Indian-Eskimo'\n",
"A_test.Race.loc[(A_test['Race'] == 1)] = 'Asian-Pac-Islander'\n",
"A_test.Race.loc[(A_test['Race'] == 2)] = 'Black'\n",
"A_test.Race.loc[(A_test['Race'] == 3)] = 'Other'\n",
"A_test.Race.loc[(A_test['Race'] == 4)] = 'White'"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"TrainingModels\"></a>\n",
"## Training Models\n",
"\n",
"We now train a couple of different models on our data. The `adult` census dataset is a classification problem - the goal is to predict whether a particular individual exceeds an income threshold. For the purpose of generating a dashboard to upload, it is sufficient to train two basic classifiers. First, a logistic regression classifier:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"lr_predictor = LogisticRegression(solver='liblinear', fit_intercept=True)\n",
"\n",
"lr_predictor.fit(X_train, Y_train)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And for comparison, a support vector classifier:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"svm_predictor = svm.SVC()\n",
"\n",
"svm_predictor.fit(X_train, Y_train)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"LoginAzureML\"></a>\n",
"## Logging in to AzureML\n",
"\n",
"With our two classifiers trained, we can log into our AzureML workspace:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Workspace, Experiment, Model\n",
"\n",
"ws = Workspace.from_config()\n",
"ws.get_details()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"RegisterModels\"></a>\n",
"## Registering the Models\n",
"\n",
"Next, we register our models. By default, the subroutine which uploads the models checks that the names provided correspond to registered models in the workspace. We define a utility routine to do the registering:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"import os\n",
"\n",
"os.makedirs('models', exist_ok=True)\n",
"def register_model(name, model):\n",
" print(\"Registering \", name)\n",
" model_path = \"models/{0}.pkl\".format(name)\n",
" joblib.dump(value=model, filename=model_path)\n",
" registered_model = Model.register(model_path=model_path,\n",
" model_name=name,\n",
" workspace=ws)\n",
" print(\"Registered \", registered_model.id)\n",
" return registered_model.id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, we register the models. For convenience in subsequent method calls, we store the results in a dictionary, which maps the `id` of the registered model (a string in `name:version` format) to the predictor itself:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model_dict = {}\n",
"\n",
"lr_reg_id = register_model(\"fairness_linear_regression\", lr_predictor)\n",
"model_dict[lr_reg_id] = lr_predictor\n",
"svm_reg_id = register_model(\"fairness_svm\", svm_predictor)\n",
"model_dict[svm_reg_id] = svm_predictor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"LocalDashboard\"></a>\n",
"## Using the Fairlearn Dashboard\n",
"\n",
"We can now examine the fairness of the two models we have training, both as a function of race and (binary) sex. Before uploading the dashboard to the AzureML portal, we will first instantiate a local instance of the Fairlearn dashboard.\n",
"\n",
"Regardless of the viewing location, the dashboard is based on three things - the true values, the model predictions and the sensitive feature values. The dashboard can use predictions from multiple models and multiple sensitive features if desired (as we are doing here).\n",
"\n",
"Our first step is to generate a dictionary mapping the `id` of the registered model to the corresponding array of predictions:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ys_pred = {}\n",
"for n, p in model_dict.items():\n",
" ys_pred[n] = p.predict(X_test)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can examine these predictions in a locally invoked Fairlearn dashboard. This can be compared to the dashboard uploaded to the portal (in the next section):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from fairlearn.widget import FairlearnDashboard\n",
"\n",
"FairlearnDashboard(sensitive_features=A_test, \n",
" sensitive_feature_names=['Sex', 'Race'],\n",
" y_true=Y_test.tolist(),\n",
" y_pred=ys_pred)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"AzureUpload\"></a>\n",
"## Uploading a Fairness Dashboard to Azure\n",
"\n",
"Uploading a fairness dashboard to Azure is a two stage process. The `FairlearnDashboard` invoked in the previous section relies on the underlying Python kernel to compute metrics on demand. This is obviously not available when the fairness dashboard is rendered in AzureML Studio. The required stages are therefore:\n",
"1. Precompute all the required metrics\n",
"1. Upload to Azure\n",
"\n",
"\n",
"### Computing Fairness Metrics\n",
"We use Fairlearn to create a dictionary which contains all the data required to display a dashboard. This includes both the raw data (true values, predicted values and sensitive features), and also the fairness metrics. The API is similar to that used to invoke the Dashboard locally. However, there are a few minor changes to the API, and the type of problem being examined (binary classification, regression etc.) needs to be specified explicitly:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"sf = { 'Race': A_test.Race, 'Sex': A_test.Sex }\n",
"\n",
"from fairlearn.metrics._group_metric_set import _create_group_metric_set\n",
"\n",
"dash_dict = _create_group_metric_set(y_true=Y_test,\n",
" predictions=ys_pred,\n",
" sensitive_features=sf,\n",
" prediction_type='binary_classification')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `_create_group_metric_set()` method is currently underscored since its exact design is not yet final in Fairlearn."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Uploading to Azure\n",
"\n",
"We can now import the `azureml.contrib.fairness` package itself. We will round-trip the data, so there are two required subroutines:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.contrib.fairness import upload_dashboard_dictionary, download_dashboard_by_upload_id"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can upload the generated dictionary to AzureML. The upload method requires a run, so we first create an experiment and a run. The uploaded dashboard can be seen on the corresponding Run Details page in AzureML Studio. For completeness, we also download the dashboard dictionary which we uploaded."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"exp = Experiment(ws, \"notebook-01\")\n",
"print(exp)\n",
"\n",
"run = exp.start_logging()\n",
"try:\n",
" dashboard_title = \"Sample notebook upload\"\n",
" upload_id = upload_dashboard_dictionary(run,\n",
" dash_dict,\n",
" dashboard_name=dashboard_title)\n",
" print(\"\\nUploaded to id: {0}\\n\".format(upload_id))\n",
"\n",
" downloaded_dict = download_dashboard_by_upload_id(run, upload_id)\n",
"finally:\n",
" run.complete()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, we can verify that the dashboard dictionary which we downloaded matches our upload:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(dash_dict == downloaded_dict)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<a id=\"Conclusion\"></a>\n",
"## Conclusion\n",
"\n",
"In this notebook we have demonstrated how to generate and upload a fairness dashboard to AzureML Studio. We have not discussed how to analyse the results and apply mitigations. Those topics will be covered elsewhere."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"authors": [
{
"name": "riedgar"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

View File

@@ -0,0 +1,8 @@
name: upload-fairness-dashboard
dependencies:
- pip:
  - azureml-sdk
  - azureml-contrib-fairness
  - fairlearn==0.4.6
  - joblib
  - shap

View File

@@ -105,7 +105,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },
@@ -675,10 +675,8 @@
 "model_name = best_run.properties['model_name']\n",
 "\n",
 "script_file_name = 'inference/score.py'\n",
-"conda_env_file_name = 'inference/env.yml'\n",
 "\n",
-"best_run.download_file('outputs/scoring_file_v_1_0_0.py', 'inference/score.py')\n",
-"best_run.download_file('outputs/conda_env_v_1_0_0.yml', 'inference/env.yml')"
+"best_run.download_file('outputs/scoring_file_v_1_0_0.py', 'inference/score.py')"
 ]
 },
 {
@@ -721,8 +719,7 @@
 "from azureml.core.model import Model\n",
 "from azureml.core.environment import Environment\n",
 "\n",
-"myenv = Environment.from_conda_specification(name=\"myenv\", file_path=conda_env_file_name)\n",
-"inference_config = InferenceConfig(entry_script=script_file_name, environment=myenv)\n",
+"inference_config = InferenceConfig(entry_script=script_file_name)\n",
 "\n",
 "aciconfig = AciWebservice.deploy_configuration(cpu_cores = 1, \n",
 " memory_gb = 1, \n",

View File

@@ -2,7 +2,3 @@ name: auto-ml-classification-bank-marketing-all-features
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib
-  - onnxruntime==1.0.0

View File

@@ -93,7 +93,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -2,6 +2,3 @@ name: auto-ml-classification-credit-card-fraud
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -97,7 +97,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -2,11 +2,3 @@ name: auto-ml-classification-text-dnn
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib
-  - https://download.pytorch.org/whl/cpu/torch-1.1.0-cp35-cp35m-win_amd64.whl
-  - sentencepiece==0.1.82
-  - pytorch-transformers==1.0
-  - spacy==2.1.8
-  - https://aka.ms/automl-resources/packages/en_core_web_sm-2.1.0.tar.gz

View File

@@ -88,7 +88,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },
@@ -201,7 +201,7 @@
 "conda_run_config.environment.docker.enabled = True\n",
 "conda_run_config.environment.docker.base_image = azureml.core.runconfig.DEFAULT_CPU_IMAGE\n",
 "\n",
-"cd = CondaDependencies.create(pip_packages=['azureml-sdk[automl]', 'applicationinsights', 'azureml-opendatasets'], \n",
+"cd = CondaDependencies.create(pip_packages=['azureml-sdk[automl]', 'applicationinsights', 'azureml-opendatasets', 'azureml-defaults'], \n",
 " conda_packages=['numpy==1.16.2'], \n",
 " pin_sdk_version=False)\n",
 "#cd.add_pip_package('azureml-explain-model')\n",

View File

@@ -2,7 +2,3 @@ name: auto-ml-continuous-retraining
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib
-  - azureml-pipeline

View File

@@ -114,7 +114,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -1,11 +1,4 @@
 name: auto-ml-forecasting-beer-remote
 dependencies:
-- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
-  - pandas==0.23.4
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib
-  - azureml-train

View File

@@ -87,7 +87,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -1,10 +1,4 @@
 name: auto-ml-forecasting-bike-share
 dependencies:
-- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
-  - pandas==0.23.4
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -97,7 +97,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -2,8 +2,3 @@ name: auto-ml-forecasting-energy-demand
 dependencies:
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
-  - pandas==0.23.4
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -94,7 +94,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -1,10 +1,4 @@
 name: auto-ml-forecasting-function
 dependencies:
-- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
-  - pandas==0.23.4
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -82,7 +82,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -1,10 +1,4 @@
 name: auto-ml-forecasting-orange-juice-sales
 dependencies:
-- py-xgboost<=0.90
 - pip:
   - azureml-sdk
-  - numpy==1.16.2
-  - pandas==0.23.4
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -96,7 +96,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -2,6 +2,3 @@ name: auto-ml-classification-credit-card-fraud-local
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -98,7 +98,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -2,6 +2,3 @@ name: auto-ml-regression-explanation-featurization
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -92,7 +92,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+"print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
 "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
 ]
 },

View File

@@ -2,7 +2,3 @@ name: auto-ml-regression
 dependencies:
 - pip:
   - azureml-sdk
-  - pandas==0.23.4
-  - azureml-train-automl
-  - azureml-widgets
-  - matplotlib

View File

@@ -50,10 +50,12 @@ pip install azureml-accel-models[gpu]
 ### Step 4: Follow our notebooks
-The notebooks in this repo walk through the following scenarios:
-* [Quickstart](accelerated-models-quickstart.ipynb), deploy and inference a ResNet50 model trained on ImageNet
-* [Object Detection](accelerated-models-object-detection.ipynb), deploy and inference an SSD-VGG model that can do object detection
-* [Training models](accelerated-models-training.ipynb), train one of our accelerated models on the Kaggle Cats and Dogs dataset to see how to improve accuracy on custom datasets
+We provide notebooks to walk through the following scenarios, linked below:
+* [Quickstart](https://github.com/Azure/MachineLearningNotebooks/blob/33d6def8c30d3dd3a5bfbea50b9c727788185faf/how-to-use-azureml/deployment/accelerated-models/accelerated-models-quickstart.ipynb), deploy and inference a ResNet50 model trained on ImageNet
+* [Object Detection](https://github.com/Azure/MachineLearningNotebooks/blob/33d6def8c30d3dd3a5bfbea50b9c727788185faf/how-to-use-azureml/deployment/accelerated-models/accelerated-models-object-detection.ipynb), deploy and inference an SSD-VGG model that can do object detection
+* [Training models](https://github.com/Azure/MachineLearningNotebooks/blob/33d6def8c30d3dd3a5bfbea50b9c727788185faf/how-to-use-azureml/deployment/accelerated-models/accelerated-models-training.ipynb), train one of our accelerated models on the Kaggle Cats and Dogs dataset to see how to improve accuracy on custom datasets
+
+**Note**: the above notebooks work only for tensorflow >= 1.6,<2.0.
 <a name="model-classes"></a>
 ## Model Classes

View File

@@ -86,7 +86,37 @@
 "source": [
 "In this example, we will be using and registering two models. \n",
 "\n",
-"You wil need to have a `first_model.pkl` file and `second_model.pkl` file in the current directory. The below call registers the files as Models with the names `my_first_model` and `my_second_model` in the workspace."
+"First we will train two simple models on the [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset) included with scikit-learn, serializing them to files in the current directory."
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": [
+"import joblib\n",
+"import sklearn\n",
+"\n",
+"from sklearn.datasets import load_diabetes\n",
+"from sklearn.linear_model import BayesianRidge, Ridge\n",
+"\n",
+"x, y = load_diabetes(return_X_y=True)\n",
+"\n",
+"first_model = Ridge().fit(x, y)\n",
+"second_model = BayesianRidge().fit(x, y)\n",
+"\n",
+"joblib.dump(first_model, \"first_model.pkl\")\n",
+"joblib.dump(second_model, \"second_model.pkl\")\n",
+"\n",
+"print(\"Trained models using scikit-learn {}.\".format(sklearn.__version__))"
+]
+},
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Now that we have our trained models locally, we will register them as Models with the names `my_first_model` and `my_second_model` in the workspace."
 ]
 },
 {
@@ -102,12 +132,12 @@
 "from azureml.core.model import Model\n",
 "\n",
 "my_model_1 = Model.register(model_path=\"first_model.pkl\",\n",
 " model_name=\"my_first_model\",\n",
 " workspace=ws)\n",
 "\n",
 "my_model_2 = Model.register(model_path=\"second_model.pkl\",\n",
 " model_name=\"my_second_model\",\n",
 " workspace=ws)"
 ]
 },
 {
@@ -149,25 +179,24 @@
 "outputs": [],
 "source": [
 "%%writefile score.py\n",
-"import pickle\n",
+"import joblib\n",
 "import json\n",
 "import numpy as np\n",
-"from sklearn.externals import joblib\n",
-"from sklearn.linear_model import Ridge\n",
+"\n",
 "from azureml.core.model import Model\n",
 "\n",
 "def init():\n",
 " global model_1, model_2\n",
-" # note here \"my_first_model\" is the name of the model registered under the workspace\n",
-" # this call should return the path to the model.pkl file on the local disk.\n",
+" # Here \"my_first_model\" is the name of the model registered under the workspace.\n",
+" # This call will return the path to the .pkl file on the local disk.\n",
 " model_1_path = Model.get_model_path(model_name='my_first_model')\n",
 " model_2_path = Model.get_model_path(model_name='my_second_model')\n",
 " \n",
-" # deserialize the model files back into a sklearn model\n",
+" # Deserialize the model files back into scikit-learn models.\n",
 " model_1 = joblib.load(model_1_path)\n",
 " model_2 = joblib.load(model_2_path)\n",
 "\n",
-"# note you can pass in multiple rows for scoring\n",
+"# Note you can pass in multiple rows for scoring.\n",
 "def run(raw_data):\n",
 " try:\n",
 " data = json.loads(raw_data)['data']\n",
@@ -177,7 +206,7 @@
 " result_1 = model_1.predict(data)\n",
 " result_2 = model_2.predict(data)\n",
 "\n",
-" # you can return any data type as long as it is JSON-serializable\n",
+" # You can return any JSON-serializable value.\n",
 " return {\"prediction1\": result_1.tolist(), \"prediction2\": result_2.tolist()}\n",
 " except Exception as e:\n",
 " result = str(e)\n",
@@ -208,10 +237,10 @@
 "source": [
 "from azureml.core import Environment\n",
 "\n",
-"env = Environment.from_conda_specification(name='deploytocloudenv', file_path='myenv.yml')\n",
-"\n",
-"# This is optional at this point\n",
-"# env.register(workspace=ws)"
+"env = Environment(\"deploytocloudenv\")\n",
+"env.python.conda_dependencies.add_pip_package(\"joblib\")\n",
+"env.python.conda_dependencies.add_pip_package(\"numpy\")\n",
+"env.python.conda_dependencies.add_pip_package(\"scikit-learn=={}\".format(sklearn.__version__))"
 ]
 },
 {
@@ -281,25 +310,15 @@
 },
 "outputs": [],
 "source": [
-"from azureml.core.webservice import AciWebservice, Webservice\n",
-"from azureml.exceptions import WebserviceException\n",
+"from azureml.core.webservice import AciWebservice\n",
+"\n",
+"aci_service_name = \"aciservice-multimodel\"\n",
 "\n",
 "deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
-"aci_service_name = 'aciservice-multimodel'\n",
-"\n",
-"try:\n",
-" # if you want to get existing service below is the command\n",
-" # since aci name needs to be unique in subscription deleting existing aci if any\n",
-" # we use aci_service_name to create azure aci\n",
-" service = Webservice(ws, name=aci_service_name)\n",
-" if service:\n",
-" service.delete()\n",
-"except WebserviceException as e:\n",
-" print()\n",
-"\n",
-"service = Model.deploy(ws, aci_service_name, [my_model_1, my_model_2], inference_config, deployment_config)\n",
 "\n",
+"service = Model.deploy(ws, aci_service_name, [my_model_1, my_model_2], inference_config, deployment_config, overwrite=True)\n",
 "service.wait_for_deployment(True)\n",
-"\n",
 "print(service.state)"
 ]
 },
@@ -317,13 +336,11 @@
 "outputs": [],
 "source": [
 "import json\n",
-"test_sample = json.dumps({'data': [\n",
-" [1,2,3,4,5,6,7,8,9,10], \n",
-" [10,9,8,7,6,5,4,3,2,1]\n",
-"]})\n",
 "\n",
-"test_sample_encoded = bytes(test_sample, encoding='utf8')\n",
-"prediction = service.run(input_data=test_sample_encoded)\n",
+"test_sample = json.dumps({'data': x[0:2].tolist()})\n",
+"\n",
+"prediction = service.run(test_sample)\n",
+"\n",
 "print(prediction)"
 ]
 },

View File

@@ -2,3 +2,5 @@ name: multi-model-register-and-deploy
 dependencies:
 - pip:
   - azureml-sdk
+  - numpy
+  - scikit-learn

View File

@@ -1,8 +0,0 @@
name: project_environment
dependencies:
- python=3.6.2
- pip:
  - azureml-defaults
  - scikit-learn
  - numpy
  - inference-schema[numpy-support]

View File

@@ -1,442 +0,0 @@
(deleted data file: 442 rows of 10-column numeric CSV - the scikit-learn diabetes feature matrix; row contents omitted)
4.534098333546320025e-02,5.068011873981870252e-02,6.061839444480759953e-02,3.105334362634819961e-02,2.870200306021350109e-02,-4.734670130927989828e-02,-5.444575906428809897e-02,7.120997975363539678e-02,1.335989800130079896e-01,1.356118306890790048e-01
-6.363517019512339445e-02,-4.464163650698899782e-02,3.582871674554689856e-02,-2.288496402361559975e-02,-3.046396984243510131e-02,-1.885019128643240088e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,-2.595242443518940012e-02,-5.492508739331759815e-02
-6.726770864614299572e-02,5.068011873981870252e-02,-1.267282657909369996e-02,-4.009931749229690007e-02,-1.532848840222260020e-02,4.635943347782499856e-03,-5.812739686837520292e-02,3.430885887772629900e-02,1.919903307856710151e-02,-3.421455281914410201e-02
-1.072256316073579990e-01,-4.464163650698899782e-02,-7.734155101194770121e-02,-2.632783471735180084e-02,-8.962994274508359616e-02,-9.619786134844690584e-02,2.655027262562750096e-02,-7.639450375000099436e-02,-4.257210492279420166e-02,-5.219804415301099697e-03
-2.367724723390840155e-02,-4.464163650698899782e-02,5.954058237092670069e-02,-4.009931749229690007e-02,-4.284754556624519733e-02,-4.358891976780549654e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,4.034337164788070335e-02
5.260606023750229870e-02,-4.464163650698899782e-02,-2.129532317014089932e-02,-7.452802442965950069e-02,-4.009563984984299695e-02,-3.763909899380440266e-02,-6.584467611156170040e-03,-3.949338287409189657e-02,-6.092541861022970299e-04,-5.492508739331759815e-02
6.713621404158050254e-02,5.068011873981870252e-02,-6.205954135808240159e-03,6.318680331979099896e-02,-4.284754556624519733e-02,-9.588471288665739722e-02,5.232173725423699961e-02,-7.639450375000099436e-02,5.942380044479410317e-02,5.276969239238479825e-02
-6.000263174410389727e-02,-4.464163650698899782e-02,4.445121333659410312e-02,-1.944209332987930153e-02,-9.824676969418109224e-03,-7.576846662009279788e-03,2.286863482154040048e-02,-3.949338287409189657e-02,-2.712864555432650121e-02,-9.361911330135799444e-03
-2.367724723390840155e-02,-4.464163650698899782e-02,-6.548561819925780014e-02,-8.141376581713200000e-02,-3.871968699164179961e-02,-5.360967054507050078e-02,5.968501286241110343e-02,-7.639450375000099436e-02,-3.712834601047360072e-02,-4.249876664881350324e-02
3.444336798240450054e-02,5.068011873981870252e-02,1.252871188776620015e-01,2.875809638242839833e-02,-5.385516843185429725e-02,-1.290037051243130006e-02,-1.023070505174200062e-01,1.081111006295440019e-01,2.714857279071319972e-04,2.791705090337660150e-02
3.081082953138499989e-02,-4.464163650698899782e-02,-5.039624916492520257e-02,-2.227739861197989939e-03,-4.422349842444640161e-02,-8.993489211265630334e-02,1.185912177278039964e-01,-7.639450375000099436e-02,-1.811826730789670159e-02,3.064409414368320182e-03
1.628067572730669890e-02,-4.464163650698899782e-02,-6.332999405149600247e-02,-5.731367096097819691e-02,-5.798302700645770191e-02,-4.891244361822749687e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-5.947269741072230137e-02,-6.735140813782170000e-02
4.897352178648269744e-02,5.068011873981870252e-02,-3.099563183506899924e-02,-4.928030602040309877e-02,4.934129593323050011e-02,-4.132213582324419619e-03,1.333177689441520097e-01,-5.351580880693729975e-02,2.131084656824479978e-02,1.963283707370720027e-02
1.264813727628719998e-02,-4.464163650698899782e-02,2.289497185897609866e-02,5.285819123858220142e-02,8.062710187196569719e-03,-2.855779360190789998e-02,3.759518603788870178e-02,-3.949338287409189657e-02,5.472400334817909689e-02,-2.593033898947460017e-02
-9.147093429830140468e-03,-4.464163650698899782e-02,1.103903904628619932e-02,-5.731367096097819691e-02,-2.496015840963049931e-02,-4.296262284422640298e-02,3.023191042971450082e-02,-3.949338287409189657e-02,1.703713241477999851e-02,-5.219804415301099697e-03
-1.882016527791040067e-03,5.068011873981870252e-02,7.139651518361660176e-02,9.761551025715360652e-02,8.786797596286209655e-02,7.540749571221680436e-02,-2.131101882750449997e-02,7.120997975363539678e-02,7.142403278057639360e-02,2.377494398854190089e-02
-1.882016527791040067e-03,5.068011873981870252e-02,1.427247526792889930e-02,-7.452802442965950069e-02,2.558898754392050119e-03,6.201685656730160021e-03,-1.394774321933030074e-02,-2.592261998182820038e-03,1.919903307856710151e-02,3.064409414368320182e-03
5.383060374248070309e-03,5.068011873981870252e-02,-8.361578283570040432e-03,2.187235499495579841e-02,5.484510736603499803e-02,7.321545647968999426e-02,-2.499265663159149983e-02,3.430885887772629900e-02,1.255315281338930007e-02,9.419076154073199869e-02
-9.996055470531900466e-02,-4.464163650698899782e-02,-6.764124234701959781e-02,-1.089567313670219972e-01,-7.449446130487119566e-02,-7.271172671423199729e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-4.986846773523059828e-02,-9.361911330135799444e-03
-6.000263174410389727e-02,5.068011873981870252e-02,-1.051720243133190055e-02,-1.485159908304049987e-02,-4.972730985725089953e-02,-2.354741821327540133e-02,-5.812739686837520292e-02,1.585829843977170153e-02,-9.918957363154769225e-03,-3.421455281914410201e-02
1.991321417832630017e-02,-4.464163650698899782e-02,-2.345094731790270046e-02,-7.108515373592319553e-02,2.044628591100669870e-02,-1.008203435632550049e-02,1.185912177278039964e-01,-7.639450375000099436e-02,-4.257210492279420166e-02,7.348022696655839847e-02
4.534098333546320025e-02,5.068011873981870252e-02,6.816307896197400240e-02,8.100872220010799790e-03,-1.670444126042380101e-02,4.635943347782499856e-03,-7.653558588881050062e-02,7.120997975363539678e-02,3.243322577960189995e-02,-1.764612515980519894e-02
2.717829108036539862e-02,5.068011873981870252e-02,-3.530688013059259805e-02,3.220096707616459941e-02,-1.120062982761920074e-02,1.504458729887179960e-03,-1.026610541524320026e-02,-2.592261998182820038e-03,-1.495647502491130078e-02,-5.078298047848289754e-02
-5.637009329308430294e-02,-4.464163650698899782e-02,-1.159501450521270051e-02,-3.321357610482440076e-02,-4.697540414084860200e-02,-4.765984977106939996e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-7.979397554541639223e-03,-8.806194271199530021e-02
-7.816532399920170238e-02,-4.464163650698899782e-02,-7.303030271642410587e-02,-5.731367096097819691e-02,-8.412613131227909824e-02,-7.427746902317970690e-02,-2.499265663159149983e-02,-3.949338287409189657e-02,-1.811826730789670159e-02,-8.391983579716059960e-02
6.713621404158050254e-02,5.068011873981870252e-02,-4.177375257387799801e-02,1.154374291374709975e-02,2.558898754392050119e-03,5.888537194940629722e-03,4.127682384197570165e-02,-3.949338287409189657e-02,-5.947269741072230137e-02,-2.178823207463989955e-02
-4.183993948900609910e-02,5.068011873981870252e-02,1.427247526792889930e-02,-5.670610554934250001e-03,-1.257658268582039982e-02,6.201685656730160021e-03,-7.285394808472339667e-02,7.120997975363539678e-02,3.546193866076970125e-02,-1.350401824497050006e-02
3.444336798240450054e-02,-4.464163650698899782e-02,-7.283766209689159811e-03,1.498661360748330083e-02,-4.422349842444640161e-02,-3.732595053201490098e-02,-2.902829807069099918e-03,-3.949338287409189657e-02,-2.139368094035999993e-02,7.206516329203029904e-03
5.987113713954139715e-02,5.068011873981870252e-02,1.642809941569069870e-02,2.875809638242839833e-02,-4.147159270804409714e-02,-2.918409052548700047e-02,-2.867429443567860031e-02,-2.592261998182820038e-03,-2.396681493414269844e-03,-2.178823207463989955e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,-9.439390357450949676e-03,-5.670610554934250001e-03,3.970962592582259754e-02,4.471894645684260094e-02,2.655027262562750096e-02,-2.592261998182820038e-03,-1.811826730789670159e-02,-1.350401824497050006e-02
-9.147093429830140468e-03,-4.464163650698899782e-02,-1.590626280073640167e-02,7.007254470726349826e-02,1.219056876180000040e-02,2.217225720799630151e-02,1.550535921336619952e-02,-2.592261998182820038e-03,-3.324878724762579674e-02,4.862758547755009764e-02
-4.910501639104519755e-02,-4.464163650698899782e-02,2.505059600673789980e-02,8.100872220010799790e-03,2.044628591100669870e-02,1.778817874294279927e-02,5.232173725423699961e-02,-3.949338287409189657e-02,-4.118038518800790082e-02,7.206516329203029904e-03
-4.183993948900609910e-02,-4.464163650698899782e-02,-4.931843709104429679e-02,-3.665644679856060184e-02,-7.072771253015849857e-03,-2.260797282790679916e-02,8.545647749102060209e-02,-3.949338287409189657e-02,-6.648814822283539983e-02,7.206516329203029904e-03
-4.183993948900609910e-02,-4.464163650698899782e-02,4.121777711495139968e-02,-2.632783471735180084e-02,-3.183992270063620150e-02,-3.043668437264510085e-02,-3.603757004385269719e-02,2.942906133203560069e-03,3.365681290238470291e-02,-1.764612515980519894e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,-6.332999405149600247e-02,-5.042792957350569760e-02,-8.962994274508359616e-02,-1.043397213549750041e-01,5.232173725423699961e-02,-7.639450375000099436e-02,-5.615757309500619965e-02,-6.735140813782170000e-02
4.170844488444359899e-02,-4.464163650698899782e-02,-6.440780612537699845e-02,3.564383776990089764e-02,1.219056876180000040e-02,-5.799374901012400302e-02,1.811790603972839864e-01,-7.639450375000099436e-02,-6.092541861022970299e-04,-5.078298047848289754e-02
6.350367559056099842e-02,5.068011873981870252e-02,-2.560657146566450160e-02,1.154374291374709975e-02,6.447677737344290061e-02,4.847672799831700269e-02,3.023191042971450082e-02,-2.592261998182820038e-03,3.839324821169769891e-02,1.963283707370720027e-02
-7.090024709716259699e-02,-4.464163650698899782e-02,-4.050329988046450294e-03,-4.009931749229690007e-02,-6.623874415566440021e-02,-7.866154748823310505e-02,5.232173725423699961e-02,-7.639450375000099436e-02,-5.140053526058249722e-02,-3.421455281914410201e-02
-4.183993948900609910e-02,5.068011873981870252e-02,4.572166603000769880e-03,-5.387080026724189868e-02,-4.422349842444640161e-02,-2.730519975474979960e-02,-8.021722369289760457e-02,7.120997975363539678e-02,3.664579779339879884e-02,1.963283707370720027e-02
-2.730978568492789874e-02,5.068011873981870252e-02,-7.283766209689159811e-03,-4.009931749229690007e-02,-1.120062982761920074e-02,-1.383981589779990050e-02,5.968501286241110343e-02,-3.949338287409189657e-02,-8.238148325810279449e-02,-2.593033898947460017e-02
-3.457486258696700065e-02,-4.464163650698899782e-02,-3.746250427835440266e-02,-6.075654165471439799e-02,2.044628591100669870e-02,4.346635260968449710e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-3.075120986455629965e-02,-7.149351505265640061e-02
6.713621404158050254e-02,5.068011873981870252e-02,-2.560657146566450160e-02,-4.009931749229690007e-02,-6.348683843926219983e-02,-5.987263978086120042e-02,-2.902829807069099918e-03,-3.949338287409189657e-02,-1.919704761394450121e-02,1.134862324403770016e-02
-4.547247794002570037e-02,5.068011873981870252e-02,-2.452875939178359929e-02,5.974393262605470073e-02,5.310804470794310353e-03,1.496984258683710031e-02,-5.444575906428809897e-02,7.120997975363539678e-02,4.234489544960749752e-02,1.549073015887240078e-02
-9.147093429830140468e-03,5.068011873981870252e-02,-1.806188694849819934e-02,-3.321357610482440076e-02,-2.083229983502719873e-02,1.215150643073130074e-02,-7.285394808472339667e-02,7.120997975363539678e-02,2.714857279071319972e-04,1.963283707370720027e-02
4.170844488444359899e-02,5.068011873981870252e-02,-1.482845072685549936e-02,-1.714684618924559867e-02,-5.696818394814720174e-03,8.393724889256879915e-03,-1.394774321933030074e-02,-1.854239580664649974e-03,-1.190068480150809939e-02,3.064409414368320182e-03
3.807590643342410180e-02,5.068011873981870252e-02,-2.991781976118810041e-02,-4.009931749229690007e-02,-3.321587555883730170e-02,-2.417371513685449835e-02,-1.026610541524320026e-02,-2.592261998182820038e-03,-1.290794225416879923e-02,3.064409414368320182e-03
1.628067572730669890e-02,-4.464163650698899782e-02,-4.608500086940160029e-02,-5.670610554934250001e-03,-7.587041416307230279e-02,-6.143838208980879900e-02,-1.394774321933030074e-02,-3.949338287409189657e-02,-5.140053526058249722e-02,1.963283707370720027e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,-6.979686649478139548e-02,-1.255635194240680048e-02,-1.930069620102049918e-04,-9.142588970956939953e-03,7.072992627467229731e-02,-3.949338287409189657e-02,-6.291294991625119570e-02,4.034337164788070335e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,3.367309259778510089e-02,1.251584758070440062e-01,2.457414448561009990e-02,2.624318721126020146e-02,-1.026610541524320026e-02,-2.592261998182820038e-03,2.671425763351279944e-02,6.105390622205419948e-02
6.350367559056099842e-02,5.068011873981870252e-02,-4.050329988046450294e-03,-1.255635194240680048e-02,1.030034574030749966e-01,4.878987646010649742e-02,5.600337505832399948e-02,-2.592261998182820038e-03,8.449528221240310000e-02,-1.764612515980519894e-02
1.264813727628719998e-02,5.068011873981870252e-02,-2.021751109626000048e-02,-2.227739861197989939e-03,3.833367306762140020e-02,5.317395492515999966e-02,-6.584467611156170040e-03,3.430885887772629900e-02,-5.145307980263110273e-03,-9.361911330135799444e-03
1.264813727628719998e-02,5.068011873981870252e-02,2.416542455238970041e-03,5.630106193231849965e-02,2.732605020201240090e-02,1.716188181936379939e-02,4.127682384197570165e-02,-3.949338287409189657e-02,3.711738233435969789e-03,7.348022696655839847e-02
-9.147093429830140468e-03,5.068011873981870252e-02,-3.099563183506899924e-02,-2.632783471735180084e-02,-1.120062982761920074e-02,-1.000728964429089965e-03,-2.131101882750449997e-02,-2.592261998182820038e-03,6.209315616505399656e-03,2.791705090337660150e-02
-3.094232413594750000e-02,5.068011873981870252e-02,2.828403222838059977e-02,7.007254470726349826e-02,-1.267806699165139883e-01,-1.068449090492910036e-01,-5.444575906428809897e-02,-4.798064067555100204e-02,-3.075120986455629965e-02,1.549073015887240078e-02
-9.632801625429950054e-02,-4.464163650698899782e-02,-3.638469220447349689e-02,-7.452802442965950069e-02,-3.871968699164179961e-02,-2.761834821653930128e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-7.408887149153539631e-02,-1.077697500466389974e-03
5.383060374248070309e-03,-4.464163650698899782e-02,-5.794093368209150136e-02,-2.288496402361559975e-02,-6.761469701386560449e-02,-6.832764824917850199e-02,-5.444575906428809897e-02,-2.592261998182820038e-03,4.289568789252869857e-02,-8.391983579716059960e-02
-1.035930931563389945e-01,-4.464163650698899782e-02,-3.746250427835440266e-02,-2.632783471735180084e-02,2.558898754392050119e-03,1.998021797546959896e-02,1.182372140927919965e-02,-2.592261998182820038e-03,-6.832974362442149896e-02,-2.593033898947460017e-02
7.076875249260000666e-02,-4.464163650698899782e-02,1.211685112016709989e-02,4.252957915737339695e-02,7.135654166444850566e-02,5.348710338694950134e-02,5.232173725423699961e-02,-2.592261998182820038e-03,2.539313491544940155e-02,-5.219804415301099697e-03
1.264813727628719998e-02,5.068011873981870252e-02,-2.237313524402180162e-02,-2.977070541108809906e-02,1.081461590359879960e-02,2.843522644378690054e-02,-2.131101882750449997e-02,3.430885887772629900e-02,-6.080248196314420352e-03,-1.077697500466389974e-03
-1.641217033186929963e-02,-4.464163650698899782e-02,-3.530688013059259805e-02,-2.632783471735180084e-02,3.282986163481690228e-02,1.716188181936379939e-02,1.001830287073690040e-01,-3.949338287409189657e-02,-7.020931272868760620e-02,-7.977772888232589898e-02
-3.820740103798660192e-02,-4.464163650698899782e-02,9.961226972405269262e-03,-4.698505887976939938e-02,-5.935897986465880211e-02,-5.298337362149149743e-02,-1.026610541524320026e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,-4.249876664881350324e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-3.961812842611620034e-02,-1.009233664264470032e-01,-2.908801698423390050e-02,-3.012353591085559917e-02,4.495846164606279866e-02,-5.019470792810550031e-02,-6.832974362442149896e-02,-1.294830118603420011e-01
4.534098333546320025e-02,-4.464163650698899782e-02,7.139651518361660176e-02,1.215130832538269907e-03,-9.824676969418109224e-03,-1.000728964429089965e-03,1.550535921336619952e-02,-3.949338287409189657e-02,-4.118038518800790082e-02,-7.149351505265640061e-02
-7.090024709716259699e-02,5.068011873981870252e-02,-7.518592686418590354e-02,-4.009931749229690007e-02,-5.110326271545199972e-02,-1.509240974495799914e-02,-3.971920784793980114e-02,-2.592261998182820038e-03,-9.643322289178400675e-02,-3.421455281914410201e-02
4.534098333546320025e-02,-4.464163650698899782e-02,-6.205954135808240159e-03,1.154374291374709975e-02,6.310082451524179348e-02,1.622243643399520069e-02,9.650139090328180291e-02,-3.949338287409189657e-02,4.289568789252869857e-02,-3.835665973397880263e-02
-5.273755484206479882e-02,5.068011873981870252e-02,-4.069594049999709917e-02,-6.764228304218700139e-02,-3.183992270063620150e-02,-3.701280207022530216e-02,3.759518603788870178e-02,-3.949338287409189657e-02,-3.452371533034950118e-02,6.933812005172369786e-02
-4.547247794002570037e-02,-4.464163650698899782e-02,-4.824062501716339796e-02,-1.944209332987930153e-02,-1.930069620102049918e-04,-1.603185513032660131e-02,6.704828847058519337e-02,-3.949338287409189657e-02,-2.479118743246069845e-02,1.963283707370720027e-02
1.264813727628719998e-02,-4.464163650698899782e-02,-2.560657146566450160e-02,-4.009931749229690007e-02,-3.046396984243510131e-02,-4.515466207675319921e-02,7.809320188284639419e-02,-7.639450375000099436e-02,-7.212845460195599356e-02,1.134862324403770016e-02
4.534098333546320025e-02,-4.464163650698899782e-02,5.199589785376040191e-02,-5.387080026724189868e-02,6.310082451524179348e-02,6.476044801137270657e-02,-1.026610541524320026e-02,3.430885887772629900e-02,3.723201120896890010e-02,1.963283707370720027e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,4.572166603000769880e-03,9.761551025715360652e-02,5.310804470794310353e-03,-2.072908205716959829e-02,6.336665066649820044e-02,-3.949338287409189657e-02,1.255315281338930007e-02,1.134862324403770016e-02
-4.910501639104519755e-02,-4.464163650698899782e-02,-6.440780612537699845e-02,-1.020709899795499975e-01,-2.944912678412469915e-03,-1.540555820674759969e-02,6.336665066649820044e-02,-4.724261825803279663e-02,-3.324878724762579674e-02,-5.492508739331759815e-02
-7.816532399920170238e-02,-4.464163650698899782e-02,-1.698407487461730050e-02,-1.255635194240680048e-02,-1.930069620102049918e-04,-1.352666743601040056e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-4.118038518800790082e-02,-9.220404962683000083e-02
-7.090024709716259699e-02,-4.464163650698899782e-02,-5.794093368209150136e-02,-8.141376581713200000e-02,-4.559945128264750180e-02,-2.887094206369749880e-02,-4.340084565202689815e-02,-2.592261998182820038e-03,1.143797379512540100e-03,-5.219804415301099697e-03
5.623859868852180283e-02,5.068011873981870252e-02,9.961226972405269262e-03,4.941532054484590319e-02,-4.320865536613589623e-03,-1.227407358885230018e-02,-4.340084565202689815e-02,3.430885887772629900e-02,6.078775415074400001e-02,3.205915781821130212e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,8.864150836571099701e-02,-2.518021116424929914e-02,2.182223876920789951e-02,4.252690722431590187e-02,-3.235593223976569732e-02,3.430885887772629900e-02,2.863770518940129874e-03,7.762233388139309909e-02
1.750521923228520000e-03,5.068011873981870252e-02,-5.128142061927360405e-03,-1.255635194240680048e-02,-1.532848840222260020e-02,-1.383981589779990050e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-6.080248196314420352e-03,-6.735140813782170000e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,-6.440780612537699845e-02,1.154374291374709975e-02,2.732605020201240090e-02,3.751653183568340322e-02,-1.394774321933030074e-02,3.430885887772629900e-02,1.178390038357590014e-02,-5.492508739331759815e-02
1.628067572730669890e-02,-4.464163650698899782e-02,1.750591148957160101e-02,-2.288496402361559975e-02,6.034891879883950289e-02,4.440579799505309927e-02,3.023191042971450082e-02,-2.592261998182820038e-03,3.723201120896890010e-02,-1.077697500466389974e-03
1.628067572730669890e-02,5.068011873981870252e-02,-4.500718879552070145e-02,6.318680331979099896e-02,1.081461590359879960e-02,-3.744320408500199904e-04,6.336665066649820044e-02,-3.949338287409189657e-02,-3.075120986455629965e-02,3.620126473304600273e-02
-9.269547780327989928e-02,-4.464163650698899782e-02,2.828403222838059977e-02,-1.599922263614299983e-02,3.695772020942030001e-02,2.499059336410210108e-02,5.600337505832399948e-02,-3.949338287409189657e-02,-5.145307980263110273e-03,-1.077697500466389974e-03
5.987113713954139715e-02,5.068011873981870252e-02,4.121777711495139968e-02,1.154374291374709975e-02,4.108557878402369773e-02,7.071026878537380045e-02,-3.603757004385269719e-02,3.430885887772629900e-02,-1.090443584737709956e-02,-3.007244590430930078e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,6.492964274033119487e-02,-2.227739861197989939e-03,-2.496015840963049931e-02,-1.728444897748479883e-02,2.286863482154040048e-02,-3.949338287409189657e-02,-6.117659509433449883e-02,-6.320930122298699938e-02
2.354575262934580082e-02,5.068011873981870252e-02,-3.207344390894990155e-02,-4.009931749229690007e-02,-3.183992270063620150e-02,-2.166852744253820046e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-1.090443584737709956e-02,1.963283707370720027e-02
-9.632801625429950054e-02,-4.464163650698899782e-02,-7.626373893806680238e-02,-4.354218818603310115e-02,-4.559945128264750180e-02,-3.482076283769860309e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-5.947269741072230137e-02,-8.391983579716059960e-02
2.717829108036539862e-02,-4.464163650698899782e-02,4.984027370599859730e-02,-5.501842382034440038e-02,-2.944912678412469915e-03,4.064801645357869753e-02,-5.812739686837520292e-02,5.275941931568080279e-02,-5.295879323920039961e-02,-5.219804415301099697e-03
1.991321417832630017e-02,5.068011873981870252e-02,4.552902541047500196e-02,2.990571983224480160e-02,-6.211088558106100249e-02,-5.580170977759729700e-02,-7.285394808472339667e-02,2.692863470254440103e-02,4.560080841412490066e-02,4.034337164788070335e-02
3.807590643342410180e-02,5.068011873981870252e-02,-9.439390357450949676e-03,2.362754385640800005e-03,1.182945896190920002e-03,3.751653183568340322e-02,-5.444575906428809897e-02,5.017634085436720182e-02,-2.595242443518940012e-02,1.066170822852360034e-01
4.170844488444359899e-02,5.068011873981870252e-02,-3.207344390894990155e-02,-2.288496402361559975e-02,-4.972730985725089953e-02,-4.014428668812060341e-02,3.023191042971450082e-02,-3.949338287409189657e-02,-1.260973855604090033e-01,1.549073015887240078e-02
1.991321417832630017e-02,-4.464163650698899782e-02,4.572166603000769880e-03,-2.632783471735180084e-02,2.319819162740899970e-02,1.027261565999409987e-02,6.704828847058519337e-02,-3.949338287409189657e-02,-2.364455757213410059e-02,-4.664087356364819692e-02
-8.543040090124079389e-02,-4.464163650698899782e-02,2.073934771121430098e-02,-2.632783471735180084e-02,5.310804470794310353e-03,1.966706951368000014e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,-2.364455757213410059e-02,3.064409414368320182e-03
1.991321417832630017e-02,5.068011873981870252e-02,1.427247526792889930e-02,6.318680331979099896e-02,1.494247447820220079e-02,2.029336643725910064e-02,-4.708248345611389801e-02,3.430885887772629900e-02,4.666077235681449775e-02,9.004865462589720093e-02
2.354575262934580082e-02,-4.464163650698899782e-02,1.101977498433290015e-01,6.318680331979099896e-02,1.356652162000110060e-02,-3.294187206696139875e-02,-2.499265663159149983e-02,2.065544415363990138e-02,9.924022573398999514e-02,2.377494398854190089e-02
-3.094232413594750000e-02,5.068011873981870252e-02,1.338730381358059929e-03,-5.670610554934250001e-03,6.447677737344290061e-02,4.941617338368559792e-02,-4.708248345611389801e-02,1.081111006295440019e-01,8.379676636552239877e-02,3.064409414368320182e-03
4.897352178648269744e-02,5.068011873981870252e-02,5.846277029704580186e-02,7.007254470726349826e-02,1.356652162000110060e-02,2.060651489904859884e-02,-2.131101882750449997e-02,3.430885887772629900e-02,2.200405045615050001e-02,2.791705090337660150e-02
5.987113713954139715e-02,-4.464163650698899782e-02,-2.129532317014089932e-02,8.728689817594480205e-02,4.521343735862710239e-02,3.156671106168230240e-02,-4.708248345611389801e-02,7.120997975363539678e-02,7.912108138965789905e-02,1.356118306890790048e-01
-5.637009329308430294e-02,5.068011873981870252e-02,-1.051720243133190055e-02,2.531522568869210010e-02,2.319819162740899970e-02,4.002171952999959703e-02,-3.971920784793980114e-02,3.430885887772629900e-02,2.061233072136409855e-02,5.691179930721949887e-02
1.628067572730669890e-02,-4.464163650698899782e-02,-4.716281294328249912e-02,-2.227739861197989939e-03,-1.945634697682600139e-02,-4.296262284422640298e-02,3.391354823380159783e-02,-3.949338287409189657e-02,2.736770754260900093e-02,2.791705090337660150e-02
-4.910501639104519755e-02,-4.464163650698899782e-02,4.572166603000769880e-03,1.154374291374709975e-02,-3.734373413344069942e-02,-1.853704282464289921e-02,-1.762938102341739949e-02,-2.592261998182820038e-03,-3.980959436433750137e-02,-2.178823207463989955e-02
6.350367559056099842e-02,-4.464163650698899782e-02,1.750591148957160101e-02,2.187235499495579841e-02,8.062710187196569719e-03,2.154596028441720101e-02,-3.603757004385269719e-02,3.430885887772629900e-02,1.990842087631829876e-02,1.134862324403770016e-02
4.897352178648269744e-02,5.068011873981870252e-02,8.109682384854470516e-02,2.187235499495579841e-02,4.383748450042589812e-02,6.413415108779360607e-02,-5.444575906428809897e-02,7.120997975363539678e-02,3.243322577960189995e-02,4.862758547755009764e-02
5.383060374248070309e-03,5.068011873981870252e-02,3.475090467166599972e-02,-1.080116308095460057e-03,1.525377602983150060e-01,1.987879896572929961e-01,-6.180903467246220279e-02,1.852344432601940039e-01,1.556684454070180086e-02,7.348022696655839847e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,2.397278393285700096e-02,8.100872220010799790e-03,-3.459182841703849903e-02,-3.889169284096249957e-02,2.286863482154040048e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,-1.350401824497050006e-02
-5.514554978810590376e-03,5.068011873981870252e-02,-8.361578283570040432e-03,-2.227739861197989939e-03,-3.321587555883730170e-02,-6.363042132233559522e-02,-3.603757004385269719e-02,-2.592261998182820038e-03,8.058546423866649877e-02,7.206516329203029904e-03
-8.906293935226029801e-02,-4.464163650698899782e-02,-6.117436990373419786e-02,-2.632783471735180084e-02,-5.523112129005539744e-02,-5.454911593043910295e-02,4.127682384197570165e-02,-7.639450375000099436e-02,-9.393564550871469354e-02,-5.492508739331759815e-02
3.444336798240450054e-02,5.068011873981870252e-02,-1.894705840284650021e-03,-1.255635194240680048e-02,3.833367306762140020e-02,1.371724873967889932e-02,7.809320188284639419e-02,-3.949338287409189657e-02,4.551890466127779880e-03,-9.634615654166470144e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,-6.225218197761509670e-02,-2.632783471735180084e-02,-5.696818394814720174e-03,-5.071658967693000106e-03,3.023191042971450082e-02,-3.949338287409189657e-02,-3.075120986455629965e-02,-7.149351505265640061e-02
9.015598825267629943e-03,-4.464163650698899782e-02,1.642809941569069870e-02,4.658001526274530187e-03,9.438663045397699403e-03,1.058576412178359981e-02,-2.867429443567860031e-02,3.430885887772629900e-02,3.896836603088559697e-02,1.190434030297399942e-01
-6.363517019512339445e-02,5.068011873981870252e-02,9.618619288287730273e-02,1.045012516446259948e-01,-2.944912678412469915e-03,-4.758510505903469807e-03,-6.584467611156170040e-03,-2.592261998182820038e-03,2.269202256674450122e-02,7.348022696655839847e-02
-9.632801625429950054e-02,-4.464163650698899782e-02,-6.979686649478139548e-02,-6.764228304218700139e-02,-1.945634697682600139e-02,-1.070833127990459925e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-4.687948284421659950e-02,-7.977772888232589898e-02
1.628067572730669890e-02,5.068011873981870252e-02,-2.129532317014089932e-02,-9.113481248670509197e-03,3.420581449301800248e-02,4.785043107473799934e-02,7.788079970179680352e-04,-2.592261998182820038e-03,-1.290794225416879923e-02,2.377494398854190089e-02
-4.183993948900609910e-02,5.068011873981870252e-02,-5.362968538656789907e-02,-4.009931749229690007e-02,-8.412613131227909824e-02,-7.177228132886340206e-02,-2.902829807069099918e-03,-3.949338287409189657e-02,-7.212845460195599356e-02,-3.007244590430930078e-02
-7.453278554818210111e-02,-4.464163650698899782e-02,4.337340126271319735e-02,-3.321357610482440076e-02,1.219056876180000040e-02,2.518648827290310109e-04,6.336665066649820044e-02,-3.949338287409189657e-02,-2.712864555432650121e-02,-4.664087356364819692e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,5.630714614928399725e-02,-3.665644679856060184e-02,-4.835135699904979933e-02,-4.296262284422640298e-02,-7.285394808472339667e-02,3.799897096531720114e-02,5.078151336297320045e-02,5.691179930721949887e-02
-9.269547780327989928e-02,-4.464163650698899782e-02,-8.165279930747129655e-02,-5.731367096097819691e-02,-6.073493272285990230e-02,-6.801449978738899338e-02,4.864009945014990260e-02,-7.639450375000099436e-02,-6.648814822283539983e-02,-2.178823207463989955e-02
5.383060374248070309e-03,-4.464163650698899782e-02,4.984027370599859730e-02,9.761551025715360652e-02,-1.532848840222260020e-02,-1.634500359211620013e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,1.703713241477999851e-02,-1.350401824497050006e-02
3.444336798240450054e-02,5.068011873981870252e-02,1.112755619172099975e-01,7.695828609473599757e-02,-3.183992270063620150e-02,-3.388131745233000092e-02,-2.131101882750449997e-02,-2.592261998182820038e-03,2.801650652326400162e-02,7.348022696655839847e-02
2.354575262934580082e-02,-4.464163650698899782e-02,6.169620651868849837e-02,5.285819123858220142e-02,-3.459182841703849903e-02,-4.891244361822749687e-02,-2.867429443567860031e-02,-2.592261998182820038e-03,5.472400334817909689e-02,-5.219804415301099697e-03
4.170844488444359899e-02,5.068011873981870252e-02,1.427247526792889930e-02,4.252957915737339695e-02,-3.046396984243510131e-02,-1.313877426218630021e-03,-4.340084565202689815e-02,-2.592261998182820038e-03,-3.324878724762579674e-02,1.549073015887240078e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,4.768464955823679963e-02,-4.698505887976939938e-02,3.420581449301800248e-02,5.724488492842390308e-02,-8.021722369289760457e-02,1.302517731550900115e-01,4.506616833626150148e-02,1.314697237742440128e-01
4.170844488444359899e-02,5.068011873981870252e-02,1.211685112016709989e-02,3.908670846363720280e-02,5.484510736603499803e-02,4.440579799505309927e-02,4.460445801105040325e-03,-2.592261998182820038e-03,4.560080841412490066e-02,-1.077697500466389974e-03
-3.094232413594750000e-02,-4.464163650698899782e-02,5.649978676881649634e-03,-9.113481248670509197e-03,1.907033305280559851e-02,6.827982580309210209e-03,7.441156407875940126e-02,-3.949338287409189657e-02,-4.118038518800790082e-02,-4.249876664881350324e-02
3.081082953138499989e-02,5.068011873981870252e-02,4.660683748435590079e-02,-1.599922263614299983e-02,2.044628591100669870e-02,5.066876723084379891e-02,-5.812739686837520292e-02,7.120997975363539678e-02,6.209315616505399656e-03,7.206516329203029904e-03
-4.183993948900609910e-02,-4.464163650698899782e-02,1.285205550993039902e-01,6.318680331979099896e-02,-3.321587555883730170e-02,-3.262872360517189707e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,-5.078298047848289754e-02
-3.094232413594750000e-02,5.068011873981870252e-02,5.954058237092670069e-02,1.215130832538269907e-03,1.219056876180000040e-02,3.156671106168230240e-02,-4.340084565202689815e-02,3.430885887772629900e-02,1.482271084126630077e-02,7.206516329203029904e-03
-5.637009329308430294e-02,-4.464163650698899782e-02,9.295275666123460623e-02,-1.944209332987930153e-02,1.494247447820220079e-02,2.342485105515439842e-02,-2.867429443567860031e-02,2.545258986750810123e-02,2.605608963368469949e-02,4.034337164788070335e-02
-6.000263174410389727e-02,5.068011873981870252e-02,1.535028734180979987e-02,-1.944209332987930153e-02,3.695772020942030001e-02,4.816357953652750101e-02,1.918699701745330000e-02,-2.592261998182820038e-03,-3.075120986455629965e-02,-1.077697500466389974e-03
-4.910501639104519755e-02,5.068011873981870252e-02,-5.128142061927360405e-03,-4.698505887976939938e-02,-2.083229983502719873e-02,-2.041593359538010008e-02,-6.917231028063640375e-02,7.120997975363539678e-02,6.123790751970099866e-02,-3.835665973397880263e-02
2.354575262934580082e-02,-4.464163650698899782e-02,7.031870310973570293e-02,2.531522568869210010e-02,-3.459182841703849903e-02,-1.446611282137899926e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,-1.919704761394450121e-02,-9.361911330135799444e-03
1.750521923228520000e-03,-4.464163650698899782e-02,-4.050329988046450294e-03,-5.670610554934250001e-03,-8.448724111216979540e-03,-2.386056667506489953e-02,5.232173725423699961e-02,-3.949338287409189657e-02,-8.944018957797799166e-03,-1.350401824497050006e-02
-3.457486258696700065e-02,5.068011873981870252e-02,-8.168937664037369826e-04,7.007254470726349826e-02,3.970962592582259754e-02,6.695248724389940564e-02,-6.549067247654929980e-02,1.081111006295440019e-01,2.671425763351279944e-02,7.348022696655839847e-02
4.170844488444359899e-02,5.068011873981870252e-02,-4.392937672163980262e-02,6.318680331979099896e-02,-4.320865536613589623e-03,1.622243643399520069e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-3.452371533034950118e-02,1.134862324403770016e-02
6.713621404158050254e-02,5.068011873981870252e-02,2.073934771121430098e-02,-5.670610554934250001e-03,2.044628591100669870e-02,2.624318721126020146e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,8.640282933063080789e-03,3.064409414368320182e-03
-2.730978568492789874e-02,5.068011873981870252e-02,6.061839444480759953e-02,4.941532054484590319e-02,8.511607024645979902e-02,8.636769187485039689e-02,-2.902829807069099918e-03,3.430885887772629900e-02,3.781447882634390162e-02,4.862758547755009764e-02
-1.641217033186929963e-02,-4.464163650698899782e-02,-1.051720243133190055e-02,1.215130832538269907e-03,-3.734373413344069942e-02,-3.576020822306719832e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-2.139368094035999993e-02,-3.421455281914410201e-02
-1.882016527791040067e-03,5.068011873981870252e-02,-3.315125598283080038e-02,-1.829446977677679984e-02,3.145390877661580209e-02,4.284005568610550069e-02,-1.394774321933030074e-02,1.991742173612169944e-02,1.022564240495780000e-02,2.791705090337660150e-02
-1.277963188084970010e-02,-4.464163650698899782e-02,-6.548561819925780014e-02,-6.993753018282070077e-02,1.182945896190920002e-03,1.684873335757430118e-02,-2.902829807069099918e-03,-7.020396503291909812e-03,-3.075120986455629965e-02,-5.078298047848289754e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,4.337340126271319735e-02,8.728689817594480205e-02,1.356652162000110060e-02,7.141131042098750048e-03,-1.394774321933030074e-02,-2.592261998182820038e-03,4.234489544960749752e-02,-1.764612515980519894e-02
-9.147093429830140468e-03,-4.464163650698899782e-02,-6.225218197761509670e-02,-7.452802442965950069e-02,-2.358420555142939912e-02,-1.321351897422090062e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-3.581672810154919867e-02,-4.664087356364819692e-02
-4.547247794002570037e-02,5.068011873981870252e-02,6.385183066645029604e-02,7.007254470726349826e-02,1.332744202834990066e-01,1.314610703725430096e-01,-3.971920784793980114e-02,1.081111006295440019e-01,7.573758845754760549e-02,8.590654771106250032e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,3.043965637614240091e-02,-7.452802442965950069e-02,-2.358420555142939912e-02,-1.133462820348369975e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,-3.075120986455629965e-02,-1.077697500466389974e-03
1.628067572730669890e-02,5.068011873981870252e-02,7.247432725749750060e-02,7.695828609473599757e-02,-8.448724111216979540e-03,5.575388733151089883e-03,-6.584467611156170040e-03,-2.592261998182820038e-03,-2.364455757213410059e-02,6.105390622205419948e-02
4.534098333546320025e-02,-4.464163650698899782e-02,-1.913969902237900103e-02,2.187235499495579841e-02,2.732605020201240090e-02,-1.352666743601040056e-02,1.001830287073690040e-01,-3.949338287409189657e-02,1.776347786711730131e-02,-1.350401824497050006e-02
-4.183993948900609910e-02,-4.464163650698899782e-02,-6.656343027313869898e-02,-4.698505887976939938e-02,-3.734373413344069942e-02,-4.327577130601600180e-02,4.864009945014990260e-02,-3.949338287409189657e-02,-5.615757309500619965e-02,-1.350401824497050006e-02
-5.637009329308430294e-02,5.068011873981870252e-02,-6.009655782985329903e-02,-3.665644679856060184e-02,-8.825398988688250290e-02,-7.083283594349480683e-02,-1.394774321933030074e-02,-3.949338287409189657e-02,-7.814091066906959926e-02,-1.046303703713340055e-01
7.076875249260000666e-02,-4.464163650698899782e-02,6.924089103585480409e-02,3.793908501382069892e-02,2.182223876920789951e-02,1.504458729887179960e-03,-3.603757004385269719e-02,3.910600459159439823e-02,7.763278919555950675e-02,1.066170822852360034e-01
1.750521923228520000e-03,5.068011873981870252e-02,5.954058237092670069e-02,-2.227739861197989939e-03,6.172487165704060308e-02,6.319470570242499696e-02,-5.812739686837520292e-02,1.081111006295440019e-01,6.898221163630259556e-02,1.273276168594099922e-01
-1.882016527791040067e-03,-4.464163650698899782e-02,-2.668438353954540043e-02,4.941532054484590319e-02,5.897296594063840269e-02,-1.603185513032660131e-02,-4.708248345611389801e-02,7.120997975363539678e-02,1.335989800130079896e-01,1.963283707370720027e-02
2.354575262934580082e-02,5.068011873981870252e-02,-2.021751109626000048e-02,-3.665644679856060184e-02,-1.395253554402150001e-02,-1.509240974495799914e-02,5.968501286241110343e-02,-3.949338287409189657e-02,-9.643322289178400675e-02,-1.764612515980519894e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,-4.608500086940160029e-02,-9.862811928581330378e-02,-7.587041416307230279e-02,-5.987263978086120042e-02,-1.762938102341739949e-02,-3.949338287409189657e-02,-5.140053526058249722e-02,-4.664087356364819692e-02
4.170844488444359899e-02,5.068011873981870252e-02,7.139651518361660176e-02,8.100872220010799790e-03,3.833367306762140020e-02,1.590928797220559840e-02,-1.762938102341739949e-02,3.430885887772629900e-02,7.341007804911610368e-02,8.590654771106250032e-02
-6.363517019512339445e-02,5.068011873981870252e-02,-7.949717515970949888e-02,-5.670610554934250001e-03,-7.174255558846899528e-02,-6.644875747844139480e-02,-1.026610541524320026e-02,-3.949338287409189657e-02,-1.811826730789670159e-02,-5.492508739331759815e-02
1.628067572730669890e-02,5.068011873981870252e-02,9.961226972405269262e-03,-4.354218818603310115e-02,-9.650970703608929835e-02,-9.463211903949929338e-02,-3.971920784793980114e-02,-3.949338287409189657e-02,1.703713241477999851e-02,7.206516329203029904e-03
6.713621404158050254e-02,-4.464163650698899782e-02,-3.854031635223530150e-02,-2.632783471735180084e-02,-3.183992270063620150e-02,-2.636575436938120090e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-2.712864555432650121e-02,3.064409414368320182e-03
4.534098333546320025e-02,5.068011873981870252e-02,1.966153563733339868e-02,3.908670846363720280e-02,2.044628591100669870e-02,2.593003874947069978e-02,8.142083605192099172e-03,-2.592261998182820038e-03,-3.303712578676999863e-03,1.963283707370720027e-02
4.897352178648269744e-02,-4.464163650698899782e-02,2.720622015449970094e-02,-2.518021116424929914e-02,2.319819162740899970e-02,1.841447566652189977e-02,-6.180903467246220279e-02,8.006624876385350087e-02,7.222365081991240221e-02,3.205915781821130212e-02
4.170844488444359899e-02,-4.464163650698899782e-02,-8.361578283570040432e-03,-2.632783471735180084e-02,2.457414448561009990e-02,1.622243643399520069e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-4.836172480289190057e-02,-3.007244590430930078e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,-1.590626280073640167e-02,-1.255635194240680048e-02,2.044628591100669870e-02,4.127431337715779802e-02,-4.340084565202689815e-02,3.430885887772629900e-02,1.407245251576850001e-02,-9.361911330135799444e-03
-3.820740103798660192e-02,5.068011873981870252e-02,4.572166603000769880e-03,3.564383776990089764e-02,-1.120062982761920074e-02,5.888537194940629722e-03,-4.708248345611389801e-02,3.430885887772629900e-02,1.630495279994180133e-02,-1.077697500466389974e-03
4.897352178648269744e-02,-4.464163650698899782e-02,-4.285156464775889684e-02,-5.387080026724189868e-02,4.521343735862710239e-02,5.004247030726469841e-02,3.391354823380159783e-02,-2.592261998182820038e-03,-2.595242443518940012e-02,-6.320930122298699938e-02
4.534098333546320025e-02,5.068011873981870252e-02,5.649978676881649634e-03,5.630106193231849965e-02,6.447677737344290061e-02,8.918602803095619647e-02,-3.971920784793980114e-02,7.120997975363539678e-02,1.556684454070180086e-02,-9.361911330135799444e-03
4.534098333546320025e-02,5.068011873981870252e-02,-3.530688013059259805e-02,6.318680331979099896e-02,-4.320865536613589623e-03,-1.627025888008149911e-03,-1.026610541524320026e-02,-2.592261998182820038e-03,1.556684454070180086e-02,5.691179930721949887e-02
1.628067572730669890e-02,-4.464163650698899782e-02,2.397278393285700096e-02,-2.288496402361559975e-02,-2.496015840963049931e-02,-2.605260590759169922e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,3.723201120896890010e-02,3.205915781821130212e-02
-7.453278554818210111e-02,5.068011873981870252e-02,-1.806188694849819934e-02,8.100872220010799790e-03,-1.945634697682600139e-02,-2.480001206043359885e-02,-6.549067247654929980e-02,3.430885887772629900e-02,6.731721791468489591e-02,-1.764612515980519894e-02
-8.179786245022120650e-02,5.068011873981870252e-02,4.229558918883229851e-02,-1.944209332987930153e-02,3.970962592582259754e-02,5.755803339021339782e-02,-6.917231028063640375e-02,1.081111006295440019e-01,4.718616788601970313e-02,-3.835665973397880263e-02
-6.726770864614299572e-02,-4.464163650698899782e-02,-5.470749746044879791e-02,-2.632783471735180084e-02,-7.587041416307230279e-02,-8.210618056791800512e-02,4.864009945014990260e-02,-7.639450375000099436e-02,-8.682899321629239386e-02,-1.046303703713340055e-01
5.383060374248070309e-03,-4.464163650698899782e-02,-2.972517914165530208e-03,4.941532054484590319e-02,7.410844738085080319e-02,7.071026878537380045e-02,4.495846164606279866e-02,-2.592261998182820038e-03,-1.498586820292070049e-03,-9.361911330135799444e-03
-1.882016527791040067e-03,-4.464163650698899782e-02,-6.656343027313869898e-02,1.215130832538269907e-03,-2.944912678412469915e-03,3.070201038834840124e-03,1.182372140927919965e-02,-2.592261998182820038e-03,-2.028874775162960165e-02,-2.593033898947460017e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-1.267282657909369996e-02,2.875809638242839833e-02,-1.808039411862490120e-02,-5.071658967693000106e-03,-4.708248345611389801e-02,3.430885887772629900e-02,2.337484127982079885e-02,-5.219804415301099697e-03
-5.514554978810590376e-03,5.068011873981870252e-02,-4.177375257387799801e-02,-4.354218818603310115e-02,-7.999827273767569358e-02,-7.615635979391689736e-02,-3.235593223976569732e-02,-3.949338287409189657e-02,1.022564240495780000e-02,-9.361911330135799444e-03
5.623859868852180283e-02,5.068011873981870252e-02,-3.099563183506899924e-02,8.100872220010799790e-03,1.907033305280559851e-02,2.123281182262769934e-02,3.391354823380159783e-02,-3.949338287409189657e-02,-2.952762274177360077e-02,-5.906719430815229877e-02
9.015598825267629943e-03,5.068011873981870252e-02,-5.128142061927360405e-03,-6.419941234845069622e-02,6.998058880624739853e-02,8.386250418053420308e-02,-3.971920784793980114e-02,7.120997975363539678e-02,3.953987807202419963e-02,1.963283707370720027e-02
-6.726770864614299572e-02,-4.464163650698899782e-02,-5.901874575597240019e-02,3.220096707616459941e-02,-5.110326271545199972e-02,-4.953874054180659736e-02,-1.026610541524320026e-02,-3.949338287409189657e-02,2.007840549823790115e-03,2.377494398854190089e-02
2.717829108036539862e-02,5.068011873981870252e-02,2.505059600673789980e-02,1.498661360748330083e-02,2.595009734381130070e-02,4.847672799831700269e-02,-3.971920784793980114e-02,3.430885887772629900e-02,7.837142301823850701e-03,2.377494398854190089e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,-4.608500086940160029e-02,-3.321357610482440076e-02,3.282986163481690228e-02,3.626393798852529937e-02,3.759518603788870178e-02,-2.592261998182820038e-03,-3.324878724762579674e-02,1.134862324403770016e-02
4.897352178648269744e-02,5.068011873981870252e-02,3.494354529119849794e-03,7.007254470726349826e-02,-8.448724111216979540e-03,1.340410027788939938e-02,-5.444575906428809897e-02,3.430885887772629900e-02,1.331596790892770020e-02,3.620126473304600273e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,5.415152200152219958e-02,-2.632783471735180084e-02,-5.523112129005539744e-02,-3.388131745233000092e-02,-1.394774321933030074e-02,-3.949338287409189657e-02,-7.408887149153539631e-02,-5.906719430815229877e-02
4.170844488444359899e-02,-4.464163650698899782e-02,-4.500718879552070145e-02,3.449621432008449784e-02,4.383748450042589812e-02,-1.571870666853709964e-02,3.759518603788870178e-02,-1.440062067847370023e-02,8.989869327767099905e-02,7.206516329203029904e-03
5.623859868852180283e-02,-4.464163650698899782e-02,-5.794093368209150136e-02,-7.965857695567990157e-03,5.209320164963270050e-02,4.910302492189610318e-02,5.600337505832399948e-02,-2.141183364489639834e-02,-2.832024254799870092e-02,4.448547856271539702e-02
-3.457486258696700065e-02,5.068011873981870252e-02,-5.578530953432969675e-02,-1.599922263614299983e-02,-9.824676969418109224e-03,-7.889995123798789270e-03,3.759518603788870178e-02,-3.949338287409189657e-02,-5.295879323920039961e-02,2.791705090337660150e-02
8.166636784565869944e-02,5.068011873981870252e-02,1.338730381358059929e-03,3.564383776990089764e-02,1.263946559924939983e-01,9.106491880169340081e-02,1.918699701745330000e-02,3.430885887772629900e-02,8.449528221240310000e-02,-3.007244590430930078e-02
-1.882016527791040067e-03,5.068011873981870252e-02,3.043965637614240091e-02,5.285819123858220142e-02,3.970962592582259754e-02,5.661858800484489973e-02,-3.971920784793980114e-02,7.120997975363539678e-02,2.539313491544940155e-02,2.791705090337660150e-02
1.107266754538149961e-01,5.068011873981870252e-02,6.727790750762559745e-03,2.875809638242839833e-02,-2.771206412603280031e-02,-7.263698200219739949e-03,-4.708248345611389801e-02,3.430885887772629900e-02,2.007840549823790115e-03,7.762233388139309909e-02
-3.094232413594750000e-02,-4.464163650698899782e-02,4.660683748435590079e-02,1.498661360748330083e-02,-1.670444126042380101e-02,-4.703355284749029946e-02,7.788079970179680352e-04,-2.592261998182820038e-03,6.345592137206540473e-02,-2.593033898947460017e-02
1.750521923228520000e-03,5.068011873981870252e-02,2.612840808061879863e-02,-9.113481248670509197e-03,2.457414448561009990e-02,3.845597722105199845e-02,-2.131101882750449997e-02,3.430885887772629900e-02,9.436409146079870192e-03,3.064409414368320182e-03
9.015598825267629943e-03,-4.464163650698899782e-02,4.552902541047500196e-02,2.875809638242839833e-02,1.219056876180000040e-02,-1.383981589779990050e-02,2.655027262562750096e-02,-3.949338287409189657e-02,4.613233103941480340e-02,3.620126473304600273e-02
3.081082953138499989e-02,-4.464163650698899782e-02,4.013996504107050084e-02,7.695828609473599757e-02,1.769438019460449832e-02,3.782968029747289795e-02,-2.867429443567860031e-02,3.430885887772629900e-02,-1.498586820292070049e-03,1.190434030297399942e-01
3.807590643342410180e-02,5.068011873981870252e-02,-1.806188694849819934e-02,6.662967401352719310e-02,-5.110326271545199972e-02,-1.665815205390569834e-02,-7.653558588881050062e-02,3.430885887772629900e-02,-1.190068480150809939e-02,-1.350401824497050006e-02
9.015598825267629943e-03,-4.464163650698899782e-02,1.427247526792889930e-02,1.498661360748330083e-02,5.484510736603499803e-02,4.722413415115889884e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-3.324878724762579674e-02,-5.906719430815229877e-02
9.256398319871740610e-02,-4.464163650698899782e-02,3.690652881942779739e-02,2.187235499495579841e-02,-2.496015840963049931e-02,-1.665815205390569834e-02,7.788079970179680352e-04,-3.949338287409189657e-02,-2.251217192966049885e-02,-2.178823207463989955e-02
6.713621404158050254e-02,-4.464163650698899782e-02,3.494354529119849794e-03,3.564383776990089764e-02,4.934129593323050011e-02,3.125356259989280072e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-6.092541861022970299e-04,1.963283707370720027e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-7.087467856866229432e-02,-2.288496402361559975e-02,-1.568959820211340015e-03,-1.000728964429089965e-03,2.655027262562750096e-02,-3.949338287409189657e-02,-2.251217192966049885e-02,7.206516329203029904e-03
3.081082953138499989e-02,-4.464163650698899782e-02,-3.315125598283080038e-02,-2.288496402361559975e-02,-4.697540414084860200e-02,-8.116673518254939601e-02,1.038646665114559969e-01,-7.639450375000099436e-02,-3.980959436433750137e-02,-5.492508739331759815e-02
2.717829108036539862e-02,5.068011873981870252e-02,9.403056873511560221e-02,9.761551025715360652e-02,-3.459182841703849903e-02,-3.200242668159279658e-02,-4.340084565202689815e-02,-2.592261998182820038e-03,3.664579779339879884e-02,1.066170822852360034e-01
1.264813727628719998e-02,5.068011873981870252e-02,3.582871674554689856e-02,4.941532054484590319e-02,5.346915450783389784e-02,7.415490186505870052e-02,-6.917231028063640375e-02,1.450122215054540087e-01,4.560080841412490066e-02,4.862758547755009764e-02
7.440129094361959405e-02,-4.464163650698899782e-02,3.151746845002330322e-02,1.010583809508899950e-01,4.658939021682820258e-02,3.689023491210430272e-02,1.550535921336619952e-02,-2.592261998182820038e-03,3.365681290238470291e-02,4.448547856271539702e-02
-4.183993948900609910e-02,-4.464163650698899782e-02,-6.548561819925780014e-02,-4.009931749229690007e-02,-5.696818394814720174e-03,1.434354566325799982e-02,-4.340084565202689815e-02,3.430885887772629900e-02,7.026862549151949647e-03,-1.350401824497050006e-02
-8.906293935226029801e-02,-4.464163650698899782e-02,-4.177375257387799801e-02,-1.944209332987930153e-02,-6.623874415566440021e-02,-7.427746902317970690e-02,8.142083605192099172e-03,-3.949338287409189657e-02,1.143797379512540100e-03,-3.007244590430930078e-02
2.354575262934580082e-02,5.068011873981870252e-02,-3.961812842611620034e-02,-5.670610554934250001e-03,-4.835135699904979933e-02,-3.325502052875090042e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.016435479455120028e-01,-6.735140813782170000e-02
-4.547247794002570037e-02,-4.464163650698899782e-02,-3.854031635223530150e-02,-2.632783471735180084e-02,-1.532848840222260020e-02,8.781618063081050515e-04,-3.235593223976569732e-02,-2.592261998182820038e-03,1.143797379512540100e-03,-3.835665973397880263e-02
-2.367724723390840155e-02,5.068011873981870252e-02,-2.560657146566450160e-02,4.252957915737339695e-02,-5.385516843185429725e-02,-4.765984977106939996e-02,-2.131101882750449997e-02,-3.949338287409189657e-02,1.143797379512540100e-03,1.963283707370720027e-02
-9.996055470531900466e-02,-4.464163650698899782e-02,-2.345094731790270046e-02,-6.419941234845069622e-02,-5.798302700645770191e-02,-6.018578824265070210e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-1.811826730789670159e-02,-5.078298047848289754e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,-6.656343027313869898e-02,-1.123996020607579971e-01,-4.972730985725089953e-02,-4.139688053527879746e-02,7.788079970179680352e-04,-3.949338287409189657e-02,-3.581672810154919867e-02,-9.361911330135799444e-03
3.081082953138499989e-02,5.068011873981870252e-02,3.259528052390420205e-02,4.941532054484590319e-02,-4.009563984984299695e-02,-4.358891976780549654e-02,-6.917231028063640375e-02,3.430885887772629900e-02,6.301661511474640487e-02,3.064409414368320182e-03
-1.035930931563389945e-01,5.068011873981870252e-02,-4.608500086940160029e-02,-2.632783471735180084e-02,-2.496015840963049931e-02,-2.480001206043359885e-02,3.023191042971450082e-02,-3.949338287409189657e-02,-3.980959436433750137e-02,-5.492508739331759815e-02
6.713621404158050254e-02,5.068011873981870252e-02,-2.991781976118810041e-02,5.744868538213489945e-02,-1.930069620102049918e-04,-1.571870666853709964e-02,7.441156407875940126e-02,-5.056371913686460301e-02,-3.845911230135379971e-02,7.206516329203029904e-03
-5.273755484206479882e-02,-4.464163650698899782e-02,-1.267282657909369996e-02,-6.075654165471439799e-02,-1.930069620102049918e-04,8.080576427467340075e-03,1.182372140927919965e-02,-2.592261998182820038e-03,-2.712864555432650121e-02,-5.078298047848289754e-02
-2.730978568492789874e-02,5.068011873981870252e-02,-1.590626280073640167e-02,-2.977070541108809906e-02,3.934851612593179802e-03,-6.875805026395569565e-04,4.127682384197570165e-02,-3.949338287409189657e-02,-2.364455757213410059e-02,1.134862324403770016e-02
-3.820740103798660192e-02,5.068011873981870252e-02,7.139651518361660176e-02,-5.731367096097819691e-02,1.539137131565160022e-01,1.558866503921270130e-01,7.788079970179680352e-04,7.194800217115350505e-02,5.027649338998960160e-02,6.933812005172369786e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-3.099563183506899924e-02,2.187235499495579841e-02,8.062710187196569719e-03,8.706873351046409346e-03,4.460445801105040325e-03,-2.592261998182820038e-03,9.436409146079870192e-03,1.134862324403770016e-02
1.264813727628719998e-02,5.068011873981870252e-02,2.609183074771409820e-04,-1.140872838930430053e-02,3.970962592582259754e-02,5.724488492842390308e-02,-3.971920784793980114e-02,5.608052019451260223e-02,2.405258322689299982e-02,3.205915781821130212e-02
6.713621404158050254e-02,-4.464163650698899782e-02,3.690652881942779739e-02,-5.042792957350569760e-02,-2.358420555142939912e-02,-3.450761437590899733e-02,4.864009945014990260e-02,-3.949338287409189657e-02,-2.595242443518940012e-02,-3.835665973397880263e-02
4.534098333546320025e-02,-4.464163650698899782e-02,3.906215296718960200e-02,4.597244985110970211e-02,6.686757328995440036e-03,-2.417371513685449835e-02,8.142083605192099172e-03,-1.255556463467829946e-02,6.432823302367089713e-02,5.691179930721949887e-02
6.713621404158050254e-02,5.068011873981870252e-02,-1.482845072685549936e-02,5.859630917623830093e-02,-5.935897986465880211e-02,-3.450761437590899733e-02,-6.180903467246220279e-02,1.290620876969899959e-02,-5.145307980263110273e-03,4.862758547755009764e-02
2.717829108036539862e-02,-4.464163650698899782e-02,6.727790750762559745e-03,3.564383776990089764e-02,7.961225881365530110e-02,7.071026878537380045e-02,1.550535921336619952e-02,3.430885887772629900e-02,4.067226371449769728e-02,1.134862324403770016e-02
5.623859868852180283e-02,-4.464163650698899782e-02,-6.871905442090049665e-02,-6.878990659528949614e-02,-1.930069620102049918e-04,-1.000728964429089965e-03,4.495846164606279866e-02,-3.764832683029650101e-02,-4.836172480289190057e-02,-1.077697500466389974e-03
3.444336798240450054e-02,5.068011873981870252e-02,-9.439390357450949676e-03,5.974393262605470073e-02,-3.596778127523959923e-02,-7.576846662009279788e-03,-7.653558588881050062e-02,7.120997975363539678e-02,1.100810104587249955e-02,-2.178823207463989955e-02
2.354575262934580082e-02,-4.464163650698899782e-02,1.966153563733339868e-02,-1.255635194240680048e-02,8.374011738825870577e-02,3.876912568284150012e-02,6.336665066649820044e-02,-2.592261998182820038e-03,6.604820616309839409e-02,4.862758547755009764e-02
4.897352178648269744e-02,5.068011873981870252e-02,7.462995140525929827e-02,6.662967401352719310e-02,-9.824676969418109224e-03,-2.253322811587220049e-03,-4.340084565202689815e-02,3.430885887772629900e-02,3.365681290238470291e-02,1.963283707370720027e-02
3.081082953138499989e-02,5.068011873981870252e-02,-8.361578283570040432e-03,4.658001526274530187e-03,1.494247447820220079e-02,2.749578105841839898e-02,8.142083605192099172e-03,-8.127430129569179762e-03,-2.952762274177360077e-02,5.691179930721949887e-02
-1.035930931563389945e-01,5.068011873981870252e-02,-2.345094731790270046e-02,-2.288496402361559975e-02,-8.687803702868139577e-02,-6.770135132559949864e-02,-1.762938102341739949e-02,-3.949338287409189657e-02,-7.814091066906959926e-02,-7.149351505265640061e-02
1.628067572730669890e-02,5.068011873981870252e-02,-4.608500086940160029e-02,1.154374291374709975e-02,-3.321587555883730170e-02,-1.603185513032660131e-02,-1.026610541524320026e-02,-2.592261998182820038e-03,-4.398540256559110156e-02,-4.249876664881350324e-02
-6.000263174410389727e-02,5.068011873981870252e-02,5.415152200152219958e-02,-1.944209332987930153e-02,-4.972730985725089953e-02,-4.891244361822749687e-02,2.286863482154040048e-02,-3.949338287409189657e-02,-4.398540256559110156e-02,-5.219804415301099697e-03
-2.730978568492789874e-02,-4.464163650698899782e-02,-3.530688013059259805e-02,-2.977070541108809906e-02,-5.660707414825649764e-02,-5.862004593370299943e-02,3.023191042971450082e-02,-3.949338287409189657e-02,-4.986846773523059828e-02,-1.294830118603420011e-01
4.170844488444359899e-02,-4.464163650698899782e-02,-3.207344390894990155e-02,-6.190416520781699683e-02,7.961225881365530110e-02,5.098191569263330059e-02,5.600337505832399948e-02,-9.972486173364639508e-03,4.506616833626150148e-02,-5.906719430815229877e-02
-8.179786245022120650e-02,-4.464163650698899782e-02,-8.165279930747129655e-02,-4.009931749229690007e-02,2.558898754392050119e-03,-1.853704282464289921e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-1.090443584737709956e-02,-9.220404962683000083e-02
-4.183993948900609910e-02,-4.464163650698899782e-02,4.768464955823679963e-02,5.974393262605470073e-02,1.277706088506949944e-01,1.280164372928579986e-01,-2.499265663159149983e-02,1.081111006295440019e-01,6.389312063683939835e-02,4.034337164788070335e-02
-1.277963188084970010e-02,-4.464163650698899782e-02,6.061839444480759953e-02,5.285819123858220142e-02,4.796534307502930278e-02,2.937467182915549924e-02,-1.762938102341739949e-02,3.430885887772629900e-02,7.021129819331020649e-02,7.206516329203029904e-03
6.713621404158050254e-02,-4.464163650698899782e-02,5.630714614928399725e-02,7.351541540099980343e-02,-1.395253554402150001e-02,-3.920484130275200124e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,7.573758845754760549e-02,3.620126473304600273e-02
-5.273755484206479882e-02,5.068011873981870252e-02,9.834181703063900326e-02,8.728689817594480205e-02,6.034891879883950289e-02,4.878987646010649742e-02,-5.812739686837520292e-02,1.081111006295440019e-01,8.449528221240310000e-02,4.034337164788070335e-02
5.383060374248070309e-03,-4.464163650698899782e-02,5.954058237092670069e-02,-5.616604740787570216e-02,2.457414448561009990e-02,5.286080646337049799e-02,-4.340084565202689815e-02,5.091436327188540029e-02,-4.219859706946029777e-03,-3.007244590430930078e-02
8.166636784565869944e-02,-4.464163650698899782e-02,3.367309259778510089e-02,8.100872220010799790e-03,5.209320164963270050e-02,5.661858800484489973e-02,-1.762938102341739949e-02,3.430885887772629900e-02,3.486419309615960277e-02,6.933812005172369786e-02
3.081082953138499989e-02,5.068011873981870252e-02,5.630714614928399725e-02,7.695828609473599757e-02,4.934129593323050011e-02,-1.227407358885230018e-02,-3.603757004385269719e-02,7.120997975363539678e-02,1.200533820015380060e-01,9.004865462589720093e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-6.548561819925780014e-02,-5.670610554934250001e-03,-7.072771253015849857e-03,-1.947648821001150138e-02,4.127682384197570165e-02,-3.949338287409189657e-02,-3.303712578676999863e-03,7.206516329203029904e-03
-4.910501639104519755e-02,-4.464163650698899782e-02,1.608549173157310108e-01,-4.698505887976939938e-02,-2.908801698423390050e-02,-1.978963667180099958e-02,-4.708248345611389801e-02,3.430885887772629900e-02,2.801650652326400162e-02,1.134862324403770016e-02
-2.730978568492789874e-02,5.068011873981870252e-02,-5.578530953432969675e-02,2.531522568869210010e-02,-7.072771253015849857e-03,-2.354741821327540133e-02,5.232173725423699961e-02,-3.949338287409189657e-02,-5.145307980263110273e-03,-5.078298047848289754e-02
7.803382939463919532e-02,5.068011873981870252e-02,-2.452875939178359929e-02,-4.239456463293059946e-02,6.686757328995440036e-03,5.286080646337049799e-02,-6.917231028063640375e-02,8.080427118137170628e-02,-3.712834601047360072e-02,5.691179930721949887e-02
1.264813727628719998e-02,-4.464163650698899782e-02,-3.638469220447349689e-02,4.252957915737339695e-02,-1.395253554402150001e-02,1.293437758520510003e-02,-2.683347553363510038e-02,5.156973385758089994e-03,-4.398540256559110156e-02,7.206516329203029904e-03
4.170844488444359899e-02,-4.464163650698899782e-02,-8.361578283570040432e-03,-5.731367096097819691e-02,8.062710187196569719e-03,-3.137612975801370302e-02,1.517259579645879874e-01,-7.639450375000099436e-02,-8.023654024890179703e-02,-1.764612515980519894e-02
4.897352178648269744e-02,-4.464163650698899782e-02,-4.177375257387799801e-02,1.045012516446259948e-01,3.558176735121919981e-02,-2.573945744580210040e-02,1.774974225931970073e-01,-7.639450375000099436e-02,-1.290794225416879923e-02,1.549073015887240078e-02
-1.641217033186929963e-02,5.068011873981870252e-02,1.274427430254229943e-01,9.761551025715360652e-02,1.631842733640340160e-02,1.747503028115330106e-02,-2.131101882750449997e-02,3.430885887772629900e-02,3.486419309615960277e-02,3.064409414368320182e-03
-7.453278554818210111e-02,5.068011873981870252e-02,-7.734155101194770121e-02,-4.698505887976939938e-02,-4.697540414084860200e-02,-3.262872360517189707e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-7.212845460195599356e-02,-1.764612515980519894e-02
3.444336798240450054e-02,5.068011873981870252e-02,2.828403222838059977e-02,-3.321357610482440076e-02,-4.559945128264750180e-02,-9.768885894535990141e-03,-5.076412126020100196e-02,-2.592261998182820038e-03,-5.947269741072230137e-02,-2.178823207463989955e-02
-3.457486258696700065e-02,5.068011873981870252e-02,-2.560657146566450160e-02,-1.714684618924559867e-02,1.182945896190920002e-03,-2.879619735166290186e-03,8.142083605192099172e-03,-1.550765430475099967e-02,1.482271084126630077e-02,4.034337164788070335e-02
-5.273755484206479882e-02,5.068011873981870252e-02,-6.225218197761509670e-02,1.154374291374709975e-02,-8.448724111216979540e-03,-3.669965360843580049e-02,1.222728555318910032e-01,-7.639450375000099436e-02,-8.682899321629239386e-02,3.064409414368320182e-03
5.987113713954139715e-02,-4.464163650698899782e-02,-8.168937664037369826e-04,-8.485663651086830517e-02,7.548440023905199359e-02,7.947842571548069390e-02,4.460445801105040325e-03,3.430885887772629900e-02,2.337484127982079885e-02,2.791705090337660150e-02
6.350367559056099842e-02,5.068011873981870252e-02,8.864150836571099701e-02,7.007254470726349826e-02,2.044628591100669870e-02,3.751653183568340322e-02,-5.076412126020100196e-02,7.120997975363539678e-02,2.930041326858690010e-02,7.348022696655839847e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-3.207344390894990155e-02,-2.632783471735180084e-02,4.246153164222479792e-02,-1.039518281811509931e-02,1.590892335727620011e-01,-7.639450375000099436e-02,-1.190068480150809939e-02,-3.835665973397880263e-02
5.383060374248070309e-03,5.068011873981870252e-02,3.043965637614240091e-02,8.384402748220859403e-02,-3.734373413344069942e-02,-4.734670130927989828e-02,1.550535921336619952e-02,-3.949338287409189657e-02,8.640282933063080789e-03,1.549073015887240078e-02
3.807590643342410180e-02,5.068011873981870252e-02,8.883414898524360018e-03,4.252957915737339695e-02,-4.284754556624519733e-02,-2.104223051895920057e-02,-3.971920784793980114e-02,-2.592261998182820038e-03,-1.811826730789670159e-02,7.206516329203029904e-03
1.264813727628719998e-02,-4.464163650698899782e-02,6.727790750762559745e-03,-5.616604740787570216e-02,-7.587041416307230279e-02,-6.644875747844139480e-02,-2.131101882750449997e-02,-3.764832683029650101e-02,-1.811826730789670159e-02,-9.220404962683000083e-02
7.440129094361959405e-02,5.068011873981870252e-02,-2.021751109626000048e-02,4.597244985110970211e-02,7.410844738085080319e-02,3.281930490884039930e-02,-3.603757004385269719e-02,7.120997975363539678e-02,1.063542767417259977e-01,3.620126473304600273e-02
1.628067572730669890e-02,-4.464163650698899782e-02,-2.452875939178359929e-02,3.564383776990089764e-02,-7.072771253015849857e-03,-3.192768196955810076e-03,-1.394774321933030074e-02,-2.592261998182820038e-03,1.556684454070180086e-02,1.549073015887240078e-02
-5.514554978810590376e-03,5.068011873981870252e-02,-1.159501450521270051e-02,1.154374291374709975e-02,-2.220825269322829892e-02,-1.540555820674759969e-02,-2.131101882750449997e-02,-2.592261998182820038e-03,1.100810104587249955e-02,6.933812005172369786e-02
1.264813727628719998e-02,-4.464163650698899782e-02,2.612840808061879863e-02,6.318680331979099896e-02,1.250187031342930022e-01,9.169121572527250130e-02,6.336665066649820044e-02,-2.592261998182820038e-03,5.757285620242599822e-02,-2.178823207463989955e-02
-3.457486258696700065e-02,-4.464163650698899782e-02,-5.901874575597240019e-02,1.215130832538269907e-03,-5.385516843185429725e-02,-7.803525056465400456e-02,6.704828847058519337e-02,-7.639450375000099436e-02,-2.139368094035999993e-02,1.549073015887240078e-02
6.713621404158050254e-02,5.068011873981870252e-02,-3.638469220447349689e-02,-8.485663651086830517e-02,-7.072771253015849857e-03,1.966706951368000014e-02,-5.444575906428809897e-02,3.430885887772629900e-02,1.143797379512540100e-03,3.205915781821130212e-02
3.807590643342410180e-02,5.068011873981870252e-02,-2.452875939178359929e-02,4.658001526274530187e-03,-2.633611126783170012e-02,-2.636575436938120090e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-1.599826775813870117e-02,-2.593033898947460017e-02
9.015598825267629943e-03,5.068011873981870252e-02,1.858372356345249984e-02,3.908670846363720280e-02,1.769438019460449832e-02,1.058576412178359981e-02,1.918699701745330000e-02,-2.592261998182820038e-03,1.630495279994180133e-02,-1.764612515980519894e-02
-9.269547780327989928e-02,5.068011873981870252e-02,-9.027529589851850111e-02,-5.731367096097819691e-02,-2.496015840963049931e-02,-3.043668437264510085e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,2.405258322689299982e-02,3.064409414368320182e-03
7.076875249260000666e-02,-4.464163650698899782e-02,-5.128142061927360405e-03,-5.670610554934250001e-03,8.786797596286209655e-02,1.029645603496960049e-01,1.182372140927919965e-02,3.430885887772629900e-02,-8.944018957797799166e-03,2.791705090337660150e-02
-1.641217033186929963e-02,-4.464163650698899782e-02,-5.255187331268700024e-02,-3.321357610482440076e-02,-4.422349842444640161e-02,-3.638650514664620167e-02,1.918699701745330000e-02,-3.949338287409189657e-02,-6.832974362442149896e-02,-3.007244590430930078e-02
4.170844488444359899e-02,5.068011873981870252e-02,-2.237313524402180162e-02,2.875809638242839833e-02,-6.623874415566440021e-02,-4.515466207675319921e-02,-6.180903467246220279e-02,-2.592261998182820038e-03,2.863770518940129874e-03,-5.492508739331759815e-02
1.264813727628719998e-02,-4.464163650698899782e-02,-2.021751109626000048e-02,-1.599922263614299983e-02,1.219056876180000040e-02,2.123281182262769934e-02,-7.653558588881050062e-02,1.081111006295440019e-01,5.988072306548120061e-02,-2.178823207463989955e-02
-3.820740103798660192e-02,-4.464163650698899782e-02,-5.470749746044879791e-02,-7.797089512339580586e-02,-3.321587555883730170e-02,-8.649025903297140327e-02,1.406810445523269948e-01,-7.639450375000099436e-02,-1.919704761394450121e-02,-5.219804415301099697e-03
4.534098333546320025e-02,-4.464163650698899782e-02,-6.205954135808240159e-03,-1.599922263614299983e-02,1.250187031342930022e-01,1.251981011367520047e-01,1.918699701745330000e-02,3.430885887772629900e-02,3.243322577960189995e-02,-5.219804415301099697e-03
7.076875249260000666e-02,5.068011873981870252e-02,-1.698407487461730050e-02,2.187235499495579841e-02,4.383748450042589812e-02,5.630543954305530091e-02,3.759518603788870178e-02,-2.592261998182820038e-03,-7.020931272868760620e-02,-1.764612515980519894e-02
-7.453278554818210111e-02,5.068011873981870252e-02,5.522933407540309841e-02,-4.009931749229690007e-02,5.346915450783389784e-02,5.317395492515999966e-02,-4.340084565202689815e-02,7.120997975363539678e-02,6.123790751970099866e-02,-3.421455281914410201e-02
5.987113713954139715e-02,5.068011873981870252e-02,7.678557555302109594e-02,2.531522568869210010e-02,1.182945896190920002e-03,1.684873335757430118e-02,-5.444575906428809897e-02,3.430885887772629900e-02,2.993564839653250001e-02,4.448547856271539702e-02
7.440129094361959405e-02,-4.464163650698899782e-02,1.858372356345249984e-02,6.318680331979099896e-02,6.172487165704060308e-02,4.284005568610550069e-02,8.142083605192099172e-03,-2.592261998182820038e-03,5.803912766389510147e-02,-5.906719430815229877e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-2.237313524402180162e-02,-3.206595255172180192e-02,-4.972730985725089953e-02,-6.864079671096809387e-02,7.809320188284639419e-02,-7.085933561861459951e-02,-6.291294991625119570e-02,-3.835665973397880263e-02
-7.090024709716259699e-02,-4.464163650698899782e-02,9.295275666123460623e-02,1.269136646684959971e-02,2.044628591100669870e-02,4.252690722431590187e-02,7.788079970179680352e-04,3.598276718899090076e-04,-5.454415271109520208e-02,-1.077697500466389974e-03
2.354575262934580082e-02,5.068011873981870252e-02,-3.099563183506899924e-02,-5.670610554934250001e-03,-1.670444126042380101e-02,1.778817874294279927e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,-7.408887149153539631e-02,-3.421455281914410201e-02
-5.273755484206479882e-02,5.068011873981870252e-02,3.906215296718960200e-02,-4.009931749229690007e-02,-5.696818394814720174e-03,-1.290037051243130006e-02,1.182372140927919965e-02,-3.949338287409189657e-02,1.630495279994180133e-02,3.064409414368320182e-03
6.713621404158050254e-02,-4.464163650698899782e-02,-6.117436990373419786e-02,-4.009931749229690007e-02,-2.633611126783170012e-02,-2.448686359864400003e-02,3.391354823380159783e-02,-3.949338287409189657e-02,-5.615757309500619965e-02,-5.906719430815229877e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-8.361578283570040432e-03,-6.419941234845069622e-02,-3.871968699164179961e-02,-2.448686359864400003e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-6.468302246445030435e-02,-5.492508739331759815e-02
2.354575262934580082e-02,5.068011873981870252e-02,-3.746250427835440266e-02,-4.698505887976939938e-02,-9.100589560328480043e-02,-7.553006287033779687e-02,-3.235593223976569732e-02,-3.949338287409189657e-02,-3.075120986455629965e-02,-1.350401824497050006e-02
3.807590643342410180e-02,5.068011873981870252e-02,-1.375063865297449991e-02,-1.599922263614299983e-02,-3.596778127523959923e-02,-2.198167590432769866e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-2.595242443518940012e-02,-1.077697500466389974e-03
1.628067572730669890e-02,-4.464163650698899782e-02,7.355213933137849658e-02,-4.124694104539940176e-02,-4.320865536613589623e-03,-1.352666743601040056e-02,-1.394774321933030074e-02,-1.116217163146459961e-03,4.289568789252869857e-02,4.448547856271539702e-02
-1.882016527791040067e-03,5.068011873981870252e-02,-2.452875939178359929e-02,5.285819123858220142e-02,2.732605020201240090e-02,3.000096875273459973e-02,3.023191042971450082e-02,-2.592261998182820038e-03,-2.139368094035999993e-02,3.620126473304600273e-02
1.264813727628719998e-02,-4.464163650698899782e-02,3.367309259778510089e-02,3.334859052598110329e-02,3.007795591841460128e-02,2.718263259662880016e-02,-2.902829807069099918e-03,8.847085473348980864e-03,3.119299070280229930e-02,2.791705090337660150e-02
7.440129094361959405e-02,-4.464163650698899782e-02,3.475090467166599972e-02,9.417263956341730136e-02,5.759701308243719842e-02,2.029336643725910064e-02,2.286863482154040048e-02,-2.592261998182820038e-03,7.380214692004880006e-02,-2.178823207463989955e-02
4.170844488444359899e-02,5.068011873981870252e-02,-3.854031635223530150e-02,5.285819123858220142e-02,7.686035309725310072e-02,1.164299442066459994e-01,-3.971920784793980114e-02,7.120997975363539678e-02,-2.251217192966049885e-02,-1.350401824497050006e-02
-9.147093429830140468e-03,5.068011873981870252e-02,-3.961812842611620034e-02,-4.009931749229690007e-02,-8.448724111216979540e-03,1.622243643399520069e-02,-6.549067247654929980e-02,7.120997975363539678e-02,1.776347786711730131e-02,-6.735140813782170000e-02
9.015598825267629943e-03,5.068011873981870252e-02,-1.894705840284650021e-03,2.187235499495579841e-02,-3.871968699164179961e-02,-2.480001206043359885e-02,-6.584467611156170040e-03,-3.949338287409189657e-02,-3.980959436433750137e-02,-1.350401824497050006e-02
6.713621404158050254e-02,5.068011873981870252e-02,-3.099563183506899924e-02,4.658001526274530187e-03,2.457414448561009990e-02,3.563764106494619888e-02,-2.867429443567860031e-02,3.430885887772629900e-02,2.337484127982079885e-02,8.176444079622779970e-02
1.750521923228520000e-03,-4.464163650698899782e-02,-4.608500086940160029e-02,-3.321357610482440076e-02,-7.311850844667000526e-02,-8.147988364433890462e-02,4.495846164606279866e-02,-6.938329078357829971e-02,-6.117659509433449883e-02,-7.977772888232589898e-02
-9.147093429830140468e-03,5.068011873981870252e-02,1.338730381358059929e-03,-2.227739861197989939e-03,7.961225881365530110e-02,7.008397186179469995e-02,3.391354823380159783e-02,-2.592261998182820038e-03,2.671425763351279944e-02,8.176444079622779970e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,6.492964274033119487e-02,3.564383776990089764e-02,-1.568959820211340015e-03,1.496984258683710031e-02,-1.394774321933030074e-02,7.288388806489919797e-04,-1.811826730789670159e-02,3.205915781821130212e-02
9.619652164973699349e-02,-4.464163650698899782e-02,4.013996504107050084e-02,-5.731367096097819691e-02,4.521343735862710239e-02,6.068951800810880315e-02,-2.131101882750449997e-02,3.615391492152170150e-02,1.255315281338930007e-02,2.377494398854190089e-02
-7.453278554818210111e-02,-4.464163650698899782e-02,-2.345094731790270046e-02,-5.670610554934250001e-03,-2.083229983502719873e-02,-1.415296435958940044e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-3.845911230135379971e-02,-3.007244590430930078e-02
5.987113713954139715e-02,5.068011873981870252e-02,5.307370992764130074e-02,5.285819123858220142e-02,3.282986163481690228e-02,1.966706951368000014e-02,-1.026610541524320026e-02,3.430885887772629900e-02,5.520503808961670089e-02,-1.077697500466389974e-03
-2.367724723390840155e-02,-4.464163650698899782e-02,4.013996504107050084e-02,-1.255635194240680048e-02,-9.824676969418109224e-03,-1.000728964429089965e-03,-2.902829807069099918e-03,-2.592261998182820038e-03,-1.190068480150809939e-02,-3.835665973397880263e-02
9.015598825267629943e-03,-4.464163650698899782e-02,-2.021751109626000048e-02,-5.387080026724189868e-02,3.145390877661580209e-02,2.060651489904859884e-02,5.600337505832399948e-02,-3.949338287409189657e-02,-1.090443584737709956e-02,-1.077697500466389974e-03
1.628067572730669890e-02,5.068011873981870252e-02,1.427247526792889930e-02,1.215130832538269907e-03,1.182945896190920002e-03,-2.135537898074869878e-02,-3.235593223976569732e-02,3.430885887772629900e-02,7.496833602773420036e-02,4.034337164788070335e-02
1.991321417832630017e-02,-4.464163650698899782e-02,-3.422906805671169922e-02,5.515343848250200270e-02,6.722868308984519814e-02,7.415490186505870052e-02,-6.584467611156170040e-03,3.283281404268990206e-02,2.472532334280450050e-02,6.933812005172369786e-02
8.893144474769780483e-02,-4.464163650698899782e-02,6.727790750762559745e-03,2.531522568869210010e-02,3.007795591841460128e-02,8.706873351046409346e-03,6.336665066649820044e-02,-3.949338287409189657e-02,9.436409146079870192e-03,3.205915781821130212e-02
1.991321417832630017e-02,-4.464163650698899782e-02,4.572166603000769880e-03,4.597244985110970211e-02,-1.808039411862490120e-02,-5.454911593043910295e-02,6.336665066649820044e-02,-3.949338287409189657e-02,2.866072031380889965e-02,6.105390622205419948e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,3.043965637614240091e-02,-5.670610554934250001e-03,8.236416453005759863e-02,9.200436418706199604e-02,-1.762938102341739949e-02,7.120997975363539678e-02,3.304707235493409972e-02,3.064409414368320182e-03
9.619652164973699349e-02,-4.464163650698899782e-02,5.199589785376040191e-02,7.925353333865589600e-02,5.484510736603499803e-02,3.657708645031480105e-02,-7.653558588881050062e-02,1.413221094178629955e-01,9.864637430492799453e-02,6.105390622205419948e-02
2.354575262934580082e-02,5.068011873981870252e-02,6.169620651868849837e-02,6.203917986997459916e-02,2.457414448561009990e-02,-3.607335668485669999e-02,-9.126213710515880539e-02,1.553445353507079962e-01,1.333957338374689994e-01,8.176444079622779970e-02
7.076875249260000666e-02,5.068011873981870252e-02,-7.283766209689159811e-03,4.941532054484590319e-02,6.034891879883950289e-02,-4.445362044113949918e-03,-5.444575906428809897e-02,1.081111006295440019e-01,1.290194116001679991e-01,5.691179930721949887e-02
3.081082953138499989e-02,-4.464163650698899782e-02,5.649978676881649634e-03,1.154374291374709975e-02,7.823630595545419397e-02,7.791268340653299818e-02,-4.340084565202689815e-02,1.081111006295440019e-01,6.604820616309839409e-02,1.963283707370720027e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,5.415152200152219958e-02,-6.649465948908450663e-02,7.273249452264969606e-02,5.661858800484489973e-02,-4.340084565202689815e-02,8.486339447772170419e-02,8.449528221240310000e-02,4.862758547755009764e-02
4.534098333546320025e-02,5.068011873981870252e-02,-8.361578283570040432e-03,-3.321357610482440076e-02,-7.072771253015849857e-03,1.191310268097639903e-03,-3.971920784793980114e-02,3.430885887772629900e-02,2.993564839653250001e-02,2.791705090337660150e-02
7.440129094361959405e-02,-4.464163650698899782e-02,1.145089981388529993e-01,2.875809638242839833e-02,2.457414448561009990e-02,2.499059336410210108e-02,1.918699701745330000e-02,-2.592261998182820038e-03,-6.092541861022970299e-04,-5.219804415301099697e-03
-3.820740103798660192e-02,-4.464163650698899782e-02,6.708526688809300642e-02,-6.075654165471439799e-02,-2.908801698423390050e-02,-2.323426975148589965e-02,-1.026610541524320026e-02,-2.592261998182820038e-03,-1.498586820292070049e-03,1.963283707370720027e-02
-1.277963188084970010e-02,5.068011873981870252e-02,-5.578530953432969675e-02,-2.227739861197989939e-03,-2.771206412603280031e-02,-2.918409052548700047e-02,1.918699701745330000e-02,-3.949338287409189657e-02,-1.705210460474350029e-02,4.448547856271539702e-02
9.015598825267629943e-03,5.068011873981870252e-02,3.043965637614240091e-02,4.252957915737339695e-02,-2.944912678412469915e-03,3.689023491210430272e-02,-6.549067247654929980e-02,7.120997975363539678e-02,-2.364455757213410059e-02,1.549073015887240078e-02
8.166636784565869944e-02,5.068011873981870252e-02,-2.560657146566450160e-02,-3.665644679856060184e-02,-7.036660273026780488e-02,-4.640725592391130305e-02,-3.971920784793980114e-02,-2.592261998182820038e-03,-4.118038518800790082e-02,-5.219804415301099697e-03
3.081082953138499989e-02,-4.464163650698899782e-02,1.048086894739250069e-01,7.695828609473599757e-02,-1.120062982761920074e-02,-1.133462820348369975e-02,-5.812739686837520292e-02,3.430885887772629900e-02,5.710418744784390155e-02,3.620126473304600273e-02
2.717829108036539862e-02,5.068011873981870252e-02,-6.205954135808240159e-03,2.875809638242839833e-02,-1.670444126042380101e-02,-1.627025888008149911e-03,-5.812739686837520292e-02,3.430885887772629900e-02,2.930041326858690010e-02,3.205915781821130212e-02
-6.000263174410389727e-02,5.068011873981870252e-02,-4.716281294328249912e-02,-2.288496402361559975e-02,-7.174255558846899528e-02,-5.768060054833450134e-02,-6.584467611156170040e-03,-3.949338287409189657e-02,-6.291294991625119570e-02,-5.492508739331759815e-02
5.383060374248070309e-03,-4.464163650698899782e-02,-4.824062501716339796e-02,-1.255635194240680048e-02,1.182945896190920002e-03,-6.637401276640669812e-03,6.336665066649820044e-02,-3.949338287409189657e-02,-5.140053526058249722e-02,-5.906719430815229877e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,8.540807214406830050e-02,-3.665644679856060184e-02,9.199583453746550121e-02,8.949917649274570508e-02,-6.180903467246220279e-02,1.450122215054540087e-01,8.094791351127560153e-02,5.276969239238479825e-02
1.991321417832630017e-02,5.068011873981870252e-02,-1.267282657909369996e-02,7.007254470726349826e-02,-1.120062982761920074e-02,7.141131042098750048e-03,-3.971920784793980114e-02,3.430885887772629900e-02,5.384369968545729690e-03,3.064409414368320182e-03
-6.363517019512339445e-02,-4.464163650698899782e-02,-3.315125598283080038e-02,-3.321357610482440076e-02,1.182945896190920002e-03,2.405114797873349891e-02,-2.499265663159149983e-02,-2.592261998182820038e-03,-2.251217192966049885e-02,-5.906719430815229877e-02
2.717829108036539862e-02,-4.464163650698899782e-02,-7.283766209689159811e-03,-5.042792957350569760e-02,7.548440023905199359e-02,5.661858800484489973e-02,3.391354823380159783e-02,-2.592261998182820038e-03,4.344317225278129802e-02,1.549073015887240078e-02
-1.641217033186929963e-02,-4.464163650698899782e-02,-1.375063865297449991e-02,1.320442171945160059e-01,-9.824676969418109224e-03,-3.819065120534880214e-03,1.918699701745330000e-02,-3.949338287409189657e-02,-3.581672810154919867e-02,-3.007244590430930078e-02
3.081082953138499989e-02,5.068011873981870252e-02,5.954058237092670069e-02,5.630106193231849965e-02,-2.220825269322829892e-02,1.191310268097639903e-03,-3.235593223976569732e-02,-2.592261998182820038e-03,-2.479118743246069845e-02,-1.764612515980519894e-02
5.623859868852180283e-02,5.068011873981870252e-02,2.181715978509519982e-02,5.630106193231849965e-02,-7.072771253015849857e-03,1.810132720473240156e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,-2.364455757213410059e-02,2.377494398854190089e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,1.858372356345249984e-02,9.072976886968099619e-02,3.934851612593179802e-03,8.706873351046409346e-03,3.759518603788870178e-02,-3.949338287409189657e-02,-5.780006567561250114e-02,7.206516329203029904e-03
-1.072256316073579990e-01,-4.464163650698899782e-02,-1.159501450521270051e-02,-4.009931749229690007e-02,4.934129593323050011e-02,6.444729954958319795e-02,-1.394774321933030074e-02,3.430885887772629900e-02,7.026862549151949647e-03,-3.007244590430930078e-02
8.166636784565869944e-02,5.068011873981870252e-02,-2.972517914165530208e-03,-3.321357610482440076e-02,4.246153164222479792e-02,5.787118185200299664e-02,-1.026610541524320026e-02,3.430885887772629900e-02,-6.092541861022970299e-04,-1.077697500466389974e-03
5.383060374248070309e-03,5.068011873981870252e-02,1.750591148957160101e-02,3.220096707616459941e-02,1.277706088506949944e-01,1.273901403692790091e-01,-2.131101882750449997e-02,7.120997975363539678e-02,6.257518145805600340e-02,1.549073015887240078e-02
3.807590643342410180e-02,5.068011873981870252e-02,-2.991781976118810041e-02,-7.452802442965950069e-02,-1.257658268582039982e-02,-1.258722205064180012e-02,4.460445801105040325e-03,-2.592261998182820038e-03,3.711738233435969789e-03,-3.007244590430930078e-02
3.081082953138499989e-02,-4.464163650698899782e-02,-2.021751109626000048e-02,-5.670610554934250001e-03,-4.320865536613589623e-03,-2.949723898727649868e-02,7.809320188284639419e-02,-3.949338287409189657e-02,-1.090443584737709956e-02,-1.077697500466389974e-03
1.750521923228520000e-03,5.068011873981870252e-02,-5.794093368209150136e-02,-4.354218818603310115e-02,-9.650970703608929835e-02,-4.703355284749029946e-02,-9.862541271333299941e-02,3.430885887772629900e-02,-6.117659509433449883e-02,-7.149351505265640061e-02
-2.730978568492789874e-02,5.068011873981870252e-02,6.061839444480759953e-02,1.079441223383619947e-01,1.219056876180000040e-02,-1.759759743927430051e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,7.021129819331020649e-02,1.356118306890790048e-01
-8.543040090124079389e-02,5.068011873981870252e-02,-4.069594049999709917e-02,-3.321357610482440076e-02,-8.137422559587689785e-02,-6.958024209633670298e-02,-6.584467611156170040e-03,-3.949338287409189657e-02,-5.780006567561250114e-02,-4.249876664881350324e-02
1.264813727628719998e-02,5.068011873981870252e-02,-7.195249064254319316e-02,-4.698505887976939938e-02,-5.110326271545199972e-02,-9.713730673381550107e-02,1.185912177278039964e-01,-7.639450375000099436e-02,-2.028874775162960165e-02,-3.835665973397880263e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,-5.578530953432969675e-02,-3.665644679856060184e-02,8.924392882106320368e-02,-3.192768196955810076e-03,8.142083605192099172e-03,3.430885887772629900e-02,1.323726493386760128e-01,3.064409414368320182e-03
-2.367724723390840155e-02,5.068011873981870252e-02,4.552902541047500196e-02,2.187235499495579841e-02,1.098832216940800049e-01,8.887287956916670173e-02,7.788079970179680352e-04,3.430885887772629900e-02,7.419253669003070262e-02,6.105390622205419948e-02
-7.453278554818210111e-02,5.068011873981870252e-02,-9.439390357450949676e-03,1.498661360748330083e-02,-3.734373413344069942e-02,-2.166852744253820046e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,-3.324878724762579674e-02,1.134862324403770016e-02
-5.514554978810590376e-03,5.068011873981870252e-02,-3.315125598283080038e-02,-1.599922263614299983e-02,8.062710187196569719e-03,1.622243643399520069e-02,1.550535921336619952e-02,-2.592261998182820038e-03,-2.832024254799870092e-02,-7.563562196749110123e-02
-6.000263174410389727e-02,5.068011873981870252e-02,4.984027370599859730e-02,1.842948430121960079e-02,-1.670444126042380101e-02,-3.012353591085559917e-02,-1.762938102341739949e-02,-2.592261998182820038e-03,4.976865992074899769e-02,-5.906719430815229877e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,-8.488623552911400694e-02,-2.632783471735180084e-02,-3.596778127523959923e-02,-3.419446591411950259e-02,4.127682384197570165e-02,-5.167075276314189725e-02,-8.238148325810279449e-02,-4.664087356364819692e-02
3.807590643342410180e-02,5.068011873981870252e-02,5.649978676881649634e-03,3.220096707616459941e-02,6.686757328995440036e-03,1.747503028115330106e-02,-2.499265663159149983e-02,3.430885887772629900e-02,1.482271084126630077e-02,6.105390622205419948e-02
1.628067572730669890e-02,-4.464163650698899782e-02,2.073934771121430098e-02,2.187235499495579841e-02,-1.395253554402150001e-02,-1.321351897422090062e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,1.331596790892770020e-02,4.034337164788070335e-02
4.170844488444359899e-02,-4.464163650698899782e-02,-7.283766209689159811e-03,2.875809638242839833e-02,-4.284754556624519733e-02,-4.828614669464850045e-02,5.232173725423699961e-02,-7.639450375000099436e-02,-7.212845460195599356e-02,2.377494398854190089e-02
1.991321417832630017e-02,5.068011873981870252e-02,1.048086894739250069e-01,7.007254470726349826e-02,-3.596778127523959923e-02,-2.667890283117069911e-02,-2.499265663159149983e-02,-2.592261998182820038e-03,3.711738233435969789e-03,4.034337164788070335e-02
-4.910501639104519755e-02,5.068011873981870252e-02,-2.452875939178359929e-02,6.750727943574620551e-05,-4.697540414084860200e-02,-2.824464514011839830e-02,-6.549067247654929980e-02,2.840467953758080144e-02,1.919903307856710151e-02,1.134862324403770016e-02
1.750521923228520000e-03,5.068011873981870252e-02,-6.205954135808240159e-03,-1.944209332987930153e-02,-9.824676969418109224e-03,4.949091809572019746e-03,-3.971920784793980114e-02,3.430885887772629900e-02,1.482271084126630077e-02,9.833286845556660216e-02
3.444336798240450054e-02,-4.464163650698899782e-02,-3.854031635223530150e-02,-1.255635194240680048e-02,9.438663045397699403e-03,5.262240271361550044e-03,-6.584467611156170040e-03,-2.592261998182820038e-03,3.119299070280229930e-02,9.833286845556660216e-02
-4.547247794002570037e-02,5.068011873981870252e-02,1.371430516903520136e-01,-1.599922263614299983e-02,4.108557878402369773e-02,3.187985952347179713e-02,-4.340084565202689815e-02,7.120997975363539678e-02,7.102157794598219775e-02,4.862758547755009764e-02
-9.147093429830140468e-03,5.068011873981870252e-02,1.705552259806600024e-01,1.498661360748330083e-02,3.007795591841460128e-02,3.375875029420900147e-02,-2.131101882750449997e-02,3.430885887772629900e-02,3.365681290238470291e-02,3.205915781821130212e-02
-1.641217033186929963e-02,5.068011873981870252e-02,2.416542455238970041e-03,1.498661360748330083e-02,2.182223876920789951e-02,-1.008203435632550049e-02,-2.499265663159149983e-02,3.430885887772629900e-02,8.553312118743899850e-02,8.176444079622779970e-02
-9.147093429830140468e-03,-4.464163650698899782e-02,3.798434089330870317e-02,-4.009931749229690007e-02,-2.496015840963049931e-02,-3.819065120534880214e-03,-4.340084565202689815e-02,1.585829843977170153e-02,-5.145307980263110273e-03,2.791705090337660150e-02
1.991321417832630017e-02,-4.464163650698899782e-02,-5.794093368209150136e-02,-5.731367096097819691e-02,-1.568959820211340015e-03,-1.258722205064180012e-02,7.441156407875940126e-02,-3.949338287409189657e-02,-6.117659509433449883e-02,-7.563562196749110123e-02
5.260606023750229870e-02,5.068011873981870252e-02,-9.439390357450949676e-03,4.941532054484590319e-02,5.071724879143160031e-02,-1.916333974822199970e-02,-1.394774321933030074e-02,3.430885887772629900e-02,1.193439942037869961e-01,-1.764612515980519894e-02
-2.730978568492789874e-02,5.068011873981870252e-02,-2.345094731790270046e-02,-1.599922263614299983e-02,1.356652162000110060e-02,1.277780335431030062e-02,2.655027262562750096e-02,-2.592261998182820038e-03,-1.090443584737709956e-02,-2.178823207463989955e-02
-7.453278554818210111e-02,-4.464163650698899782e-02,-1.051720243133190055e-02,-5.670610554934250001e-03,-6.623874415566440021e-02,-5.705430362475540085e-02,-2.902829807069099918e-03,-3.949338287409189657e-02,-4.257210492279420166e-02,-1.077697500466389974e-03
-1.072256316073579990e-01,-4.464163650698899782e-02,-3.422906805671169922e-02,-6.764228304218700139e-02,-6.348683843926219983e-02,-7.051968748170529822e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-6.092541861022970299e-04,-7.977772888232589898e-02
4.534098333546320025e-02,5.068011873981870252e-02,-2.972517914165530208e-03,1.079441223383619947e-01,3.558176735121919981e-02,2.248540566978590033e-02,2.655027262562750096e-02,-2.592261998182820038e-03,2.801650652326400162e-02,1.963283707370720027e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,6.816307896197400240e-02,-5.670610554934250001e-03,1.195148917014880047e-01,1.302084765253850029e-01,-2.499265663159149983e-02,8.670845052151719690e-02,4.613233103941480340e-02,-1.077697500466389974e-03
1.991321417832630017e-02,5.068011873981870252e-02,9.961226972405269262e-03,1.842948430121960079e-02,1.494247447820220079e-02,4.471894645684260094e-02,-6.180903467246220279e-02,7.120997975363539678e-02,9.436409146079870192e-03,-6.320930122298699938e-02
1.628067572730669890e-02,5.068011873981870252e-02,2.416542455238970041e-03,-5.670610554934250001e-03,-5.696818394814720174e-03,1.089891258357309975e-02,-5.076412126020100196e-02,3.430885887772629900e-02,2.269202256674450122e-02,-3.835665973397880263e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,-3.854031635223530150e-02,2.187235499495579841e-02,-1.088932827598989989e-01,-1.156130659793979942e-01,2.286863482154040048e-02,-7.639450375000099436e-02,-4.687948284421659950e-02,2.377494398854190089e-02
1.628067572730669890e-02,-4.464163650698899782e-02,2.612840808061879863e-02,5.859630917623830093e-02,-6.073493272285990230e-02,-4.421521669138449989e-02,-1.394774321933030074e-02,-3.395821474270550172e-02,-5.140053526058249722e-02,-2.593033898947460017e-02
-7.090024709716259699e-02,5.068011873981870252e-02,-8.919748382463760228e-02,-7.452802442965950069e-02,-4.284754556624519733e-02,-2.573945744580210040e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,-1.290794225416879923e-02,-5.492508739331759815e-02
4.897352178648269744e-02,-4.464163650698899782e-02,6.061839444480759953e-02,-2.288496402361559975e-02,-2.358420555142939912e-02,-7.271172671423199729e-02,-4.340084565202689815e-02,-2.592261998182820038e-03,1.041376113589790042e-01,3.620126473304600273e-02
5.383060374248070309e-03,5.068011873981870252e-02,-2.884000768730720157e-02,-9.113481248670509197e-03,-3.183992270063620150e-02,-2.887094206369749880e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-1.811826730789670159e-02,7.206516329203029904e-03
3.444336798240450054e-02,5.068011873981870252e-02,-2.991781976118810041e-02,4.658001526274530187e-03,9.337178739566659447e-02,8.699398879842949739e-02,3.391354823380159783e-02,-2.592261998182820038e-03,2.405258322689299982e-02,-3.835665973397880263e-02
2.354575262934580082e-02,5.068011873981870252e-02,-1.913969902237900103e-02,4.941532054484590319e-02,-6.348683843926219983e-02,-6.112523362801929733e-02,4.460445801105040325e-03,-3.949338287409189657e-02,-2.595242443518940012e-02,-1.350401824497050006e-02
1.991321417832630017e-02,-4.464163650698899782e-02,-4.069594049999709917e-02,-1.599922263614299983e-02,-8.448724111216979540e-03,-1.759759743927430051e-02,5.232173725423699961e-02,-3.949338287409189657e-02,-3.075120986455629965e-02,3.064409414368320182e-03
-4.547247794002570037e-02,-4.464163650698899782e-02,1.535028734180979987e-02,-7.452802442965950069e-02,-4.972730985725089953e-02,-1.728444897748479883e-02,-2.867429443567860031e-02,-2.592261998182820038e-03,-1.043648208321659998e-01,-7.563562196749110123e-02
5.260606023750229870e-02,5.068011873981870252e-02,-2.452875939178359929e-02,5.630106193231849965e-02,-7.072771253015849857e-03,-5.071658967693000106e-03,-2.131101882750449997e-02,-2.592261998182820038e-03,2.671425763351279944e-02,-3.835665973397880263e-02
-5.514554978810590376e-03,5.068011873981870252e-02,1.338730381358059929e-03,-8.485663651086830517e-02,-1.120062982761920074e-02,-1.665815205390569834e-02,4.864009945014990260e-02,-3.949338287409189657e-02,-4.118038518800790082e-02,-8.806194271199530021e-02
9.015598825267629943e-03,5.068011873981870252e-02,6.924089103585480409e-02,5.974393262605470073e-02,1.769438019460449832e-02,-2.323426975148589965e-02,-4.708248345611389801e-02,3.430885887772629900e-02,1.032922649115240038e-01,7.348022696655839847e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,-6.979686649478139548e-02,-6.419941234845069622e-02,-5.935897986465880211e-02,-5.047818592717519953e-02,1.918699701745330000e-02,-3.949338287409189657e-02,-8.913686007934769340e-02,-5.078298047848289754e-02
-4.183993948900609910e-02,5.068011873981870252e-02,-2.991781976118810041e-02,-2.227739861197989939e-03,2.182223876920789951e-02,3.657708645031480105e-02,1.182372140927919965e-02,-2.592261998182820038e-03,-4.118038518800790082e-02,6.519601313688899724e-02
-7.453278554818210111e-02,-4.464163650698899782e-02,-4.608500086940160029e-02,-4.354218818603310115e-02,-2.908801698423390050e-02,-2.323426975148589965e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-3.980959436433750137e-02,-2.178823207463989955e-02
3.444336798240450054e-02,-4.464163650698899782e-02,1.858372356345249984e-02,5.630106193231849965e-02,1.219056876180000040e-02,-5.454911593043910295e-02,-6.917231028063640375e-02,7.120997975363539678e-02,1.300806095217529879e-01,7.206516329203029904e-03
-6.000263174410389727e-02,-4.464163650698899782e-02,1.338730381358059929e-03,-2.977070541108809906e-02,-7.072771253015849857e-03,-2.166852744253820046e-02,1.182372140927919965e-02,-2.592261998182820038e-03,3.181521750079859684e-02,-5.492508739331759815e-02
-8.543040090124079389e-02,5.068011873981870252e-02,-3.099563183506899924e-02,-2.288496402361559975e-02,-6.348683843926219983e-02,-5.423596746864960128e-02,1.918699701745330000e-02,-3.949338287409189657e-02,-9.643322289178400675e-02,-3.421455281914410201e-02
5.260606023750229870e-02,-4.464163650698899782e-02,-4.050329988046450294e-03,-3.091832896419060075e-02,-4.697540414084860200e-02,-5.830689747191349775e-02,-1.394774321933030074e-02,-2.583996815000549896e-02,3.605579008983190309e-02,2.377494398854190089e-02
1.264813727628719998e-02,-4.464163650698899782e-02,1.535028734180979987e-02,-3.321357610482440076e-02,4.108557878402369773e-02,3.219300798526129881e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,4.506616833626150148e-02,-6.735140813782170000e-02
5.987113713954139715e-02,5.068011873981870252e-02,2.289497185897609866e-02,4.941532054484590319e-02,1.631842733640340160e-02,1.183835796894170019e-02,-1.394774321933030074e-02,-2.592261998182820038e-03,3.953987807202419963e-02,1.963283707370720027e-02
-2.367724723390840155e-02,-4.464163650698899782e-02,4.552902541047500196e-02,9.072976886968099619e-02,-1.808039411862490120e-02,-3.544705976127759950e-02,7.072992627467229731e-02,-3.949338287409189657e-02,-3.452371533034950118e-02,-9.361911330135799444e-03
1.628067572730669890e-02,-4.464163650698899782e-02,-4.500718879552070145e-02,-5.731367096097819691e-02,-3.459182841703849903e-02,-5.392281900686000246e-02,7.441156407875940126e-02,-7.639450375000099436e-02,-4.257210492279420166e-02,4.034337164788070335e-02
1.107266754538149961e-01,5.068011873981870252e-02,-3.315125598283080038e-02,-2.288496402361559975e-02,-4.320865536613589623e-03,2.029336643725910064e-02,-6.180903467246220279e-02,7.120997975363539678e-02,1.556684454070180086e-02,4.448547856271539702e-02
-2.004470878288880029e-02,-4.464163650698899782e-02,9.726400495675820157e-02,-5.670610554934250001e-03,-5.696818394814720174e-03,-2.386056667506489953e-02,-2.131101882750449997e-02,-2.592261998182820038e-03,6.168584882386619894e-02,4.034337164788070335e-02
-1.641217033186929963e-02,-4.464163650698899782e-02,5.415152200152219958e-02,7.007254470726349826e-02,-3.321587555883730170e-02,-2.793149667832890010e-02,8.142083605192099172e-03,-3.949338287409189657e-02,-2.712864555432650121e-02,-9.361911330135799444e-03
4.897352178648269744e-02,5.068011873981870252e-02,1.231314947298999957e-01,8.384402748220859403e-02,-1.047654241852959967e-01,-1.008950882752900069e-01,-6.917231028063640375e-02,-2.592261998182820038e-03,3.664579779339879884e-02,-3.007244590430930078e-02
-5.637009329308430294e-02,-4.464163650698899782e-02,-8.057498723359039772e-02,-8.485663651086830517e-02,-3.734373413344069942e-02,-3.701280207022530216e-02,3.391354823380159783e-02,-3.949338287409189657e-02,-5.615757309500619965e-02,-1.377672256900120129e-01
2.717829108036539862e-02,-4.464163650698899782e-02,9.295275666123460623e-02,-5.272317671413939699e-02,8.062710187196569719e-03,3.970857106821010230e-02,-2.867429443567860031e-02,2.102445536239900062e-02,-4.836172480289190057e-02,1.963283707370720027e-02
6.350367559056099842e-02,-4.464163650698899782e-02,-5.039624916492520257e-02,1.079441223383619947e-01,3.145390877661580209e-02,1.935392105189049847e-02,-1.762938102341739949e-02,2.360753382371260159e-02,5.803912766389510147e-02,4.034337164788070335e-02
-5.273755484206479882e-02,5.068011873981870252e-02,-1.159501450521270051e-02,5.630106193231849965e-02,5.622106022423609822e-02,7.290230801790049953e-02,-3.971920784793980114e-02,7.120997975363539678e-02,3.056648739841480097e-02,-5.219804415301099697e-03
-9.147093429830140468e-03,5.068011873981870252e-02,-2.776219561342629927e-02,8.100872220010799790e-03,4.796534307502930278e-02,3.720338337389379746e-02,-2.867429443567860031e-02,3.430885887772629900e-02,6.604820616309839409e-02,-4.249876664881350324e-02
5.383060374248070309e-03,-4.464163650698899782e-02,5.846277029704580186e-02,-4.354218818603310115e-02,-7.311850844667000526e-02,-7.239857825244250256e-02,1.918699701745330000e-02,-7.639450375000099436e-02,-5.140053526058249722e-02,-2.593033898947460017e-02
7.440129094361959405e-02,-4.464163650698899782e-02,8.540807214406830050e-02,6.318680331979099896e-02,1.494247447820220079e-02,1.309095181609989944e-02,1.550535921336619952e-02,-2.592261998182820038e-03,6.209315616505399656e-03,8.590654771106250032e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,-8.168937664037369826e-04,-2.632783471735180084e-02,1.081461590359879960e-02,7.141131042098750048e-03,4.864009945014990260e-02,-3.949338287409189657e-02,-3.581672810154919867e-02,1.963283707370720027e-02
8.166636784565869944e-02,5.068011873981870252e-02,6.727790750762559745e-03,-4.522987001831730094e-03,1.098832216940800049e-01,1.170562411302250028e-01,-3.235593223976569732e-02,9.187460744414439884e-02,5.472400334817909689e-02,7.206516329203029904e-03
-5.514554978810590376e-03,-4.464163650698899782e-02,8.883414898524360018e-03,-5.042792957350569760e-02,2.595009734381130070e-02,4.722413415115889884e-02,-4.340084565202689815e-02,7.120997975363539678e-02,1.482271084126630077e-02,3.064409414368320182e-03
-2.730978568492789874e-02,-4.464163650698899782e-02,8.001901177466380632e-02,9.876313370696999938e-02,-2.944912678412469915e-03,1.810132720473240156e-02,-1.762938102341739949e-02,3.311917341962639788e-03,-2.952762274177360077e-02,3.620126473304600273e-02
-5.273755484206479882e-02,-4.464163650698899782e-02,7.139651518361660176e-02,-7.452802442965950069e-02,-1.532848840222260020e-02,-1.313877426218630021e-03,4.460445801105040325e-03,-2.141183364489639834e-02,-4.687948284421659950e-02,3.064409414368320182e-03
9.015598825267629943e-03,-4.464163650698899782e-02,-2.452875939178359929e-02,-2.632783471735180084e-02,9.887559882847110626e-02,9.419640341958869512e-02,7.072992627467229731e-02,-2.592261998182820038e-03,-2.139368094035999993e-02,7.206516329203029904e-03
-2.004470878288880029e-02,-4.464163650698899782e-02,-5.470749746044879791e-02,-5.387080026724189868e-02,-6.623874415566440021e-02,-5.736745208654490252e-02,1.182372140927919965e-02,-3.949338287409189657e-02,-7.408887149153539631e-02,-5.219804415301099697e-03
2.354575262934580082e-02,-4.464163650698899782e-02,-3.638469220447349689e-02,6.750727943574620551e-05,1.182945896190920002e-03,3.469819567957759671e-02,-4.340084565202689815e-02,3.430885887772629900e-02,-3.324878724762579674e-02,6.105390622205419948e-02
3.807590643342410180e-02,5.068011873981870252e-02,1.642809941569069870e-02,2.187235499495579841e-02,3.970962592582259754e-02,4.503209491863210262e-02,-4.340084565202689815e-02,7.120997975363539678e-02,4.976865992074899769e-02,1.549073015887240078e-02
-7.816532399920170238e-02,5.068011873981870252e-02,7.786338762690199478e-02,5.285819123858220142e-02,7.823630595545419397e-02,6.444729954958319795e-02,2.655027262562750096e-02,-2.592261998182820038e-03,4.067226371449769728e-02,-9.361911330135799444e-03
9.015598825267629943e-03,5.068011873981870252e-02,-3.961812842611620034e-02,2.875809638242839833e-02,3.833367306762140020e-02,7.352860494147960002e-02,-7.285394808472339667e-02,1.081111006295440019e-01,1.556684454070180086e-02,-4.664087356364819692e-02
1.750521923228520000e-03,5.068011873981870252e-02,1.103903904628619932e-02,-1.944209332987930153e-02,-1.670444126042380101e-02,-3.819065120534880214e-03,-4.708248345611389801e-02,3.430885887772629900e-02,2.405258322689299982e-02,2.377494398854190089e-02
-7.816532399920170238e-02,-4.464163650698899782e-02,-4.069594049999709917e-02,-8.141376581713200000e-02,-1.006375656106929944e-01,-1.127947298232920004e-01,2.286863482154040048e-02,-7.639450375000099436e-02,-2.028874775162960165e-02,-5.078298047848289754e-02
3.081082953138499989e-02,5.068011873981870252e-02,-3.422906805671169922e-02,4.367720260718979675e-02,5.759701308243719842e-02,6.883137801463659611e-02,-3.235593223976569732e-02,5.755656502954899917e-02,3.546193866076970125e-02,8.590654771106250032e-02
-3.457486258696700065e-02,5.068011873981870252e-02,5.649978676881649634e-03,-5.670610554934250001e-03,-7.311850844667000526e-02,-6.269097593696699999e-02,-6.584467611156170040e-03,-3.949338287409189657e-02,-4.542095777704099890e-02,3.205915781821130212e-02
4.897352178648269744e-02,5.068011873981870252e-02,8.864150836571099701e-02,8.728689817594480205e-02,3.558176735121919981e-02,2.154596028441720101e-02,-2.499265663159149983e-02,3.430885887772629900e-02,6.604820616309839409e-02,1.314697237742440128e-01
-4.183993948900609910e-02,-4.464163650698899782e-02,-3.315125598283080038e-02,-2.288496402361559975e-02,4.658939021682820258e-02,4.158746183894729970e-02,5.600337505832399948e-02,-2.473293452372829840e-02,-2.595242443518940012e-02,-3.835665973397880263e-02
-9.147093429830140468e-03,-4.464163650698899782e-02,-5.686312160821060252e-02,-5.042792957350569760e-02,2.182223876920789951e-02,4.534524338042170144e-02,-2.867429443567860031e-02,3.430885887772629900e-02,-9.918957363154769225e-03,-1.764612515980519894e-02
7.076875249260000666e-02,5.068011873981870252e-02,-3.099563183506899924e-02,2.187235499495579841e-02,-3.734373413344069942e-02,-4.703355284749029946e-02,3.391354823380159783e-02,-3.949338287409189657e-02,-1.495647502491130078e-02,-1.077697500466389974e-03
9.015598825267629943e-03,-4.464163650698899782e-02,5.522933407540309841e-02,-5.670610554934250001e-03,5.759701308243719842e-02,4.471894645684260094e-02,-2.902829807069099918e-03,2.323852261495349888e-02,5.568354770267369691e-02,1.066170822852360034e-01
-2.730978568492789874e-02,-4.464163650698899782e-02,-6.009655782985329903e-02,-2.977070541108809906e-02,4.658939021682820258e-02,1.998021797546959896e-02,1.222728555318910032e-01,-3.949338287409189657e-02,-5.140053526058249722e-02,-9.361911330135799444e-03
1.628067572730669890e-02,-4.464163650698899782e-02,1.338730381358059929e-03,8.100872220010799790e-03,5.310804470794310353e-03,1.089891258357309975e-02,3.023191042971450082e-02,-3.949338287409189657e-02,-4.542095777704099890e-02,3.205915781821130212e-02
-1.277963188084970010e-02,-4.464163650698899782e-02,-2.345094731790270046e-02,-4.009931749229690007e-02,-1.670444126042380101e-02,4.635943347782499856e-03,-1.762938102341739949e-02,-2.592261998182820038e-03,-3.845911230135379971e-02,-3.835665973397880263e-02
-5.637009329308430294e-02,-4.464163650698899782e-02,-7.410811479030500470e-02,-5.042792957350569760e-02,-2.496015840963049931e-02,-4.703355284749029946e-02,9.281975309919469896e-02,-7.639450375000099436e-02,-6.117659509433449883e-02,-4.664087356364819692e-02
4.170844488444359899e-02,5.068011873981870252e-02,1.966153563733339868e-02,5.974393262605470073e-02,-5.696818394814720174e-03,-2.566471273376759888e-03,-2.867429443567860031e-02,-2.592261998182820038e-03,3.119299070280229930e-02,7.206516329203029904e-03
-5.514554978810590376e-03,5.068011873981870252e-02,-1.590626280073640167e-02,-6.764228304218700139e-02,4.934129593323050011e-02,7.916527725369119917e-02,-2.867429443567860031e-02,3.430885887772629900e-02,-1.811826730789670159e-02,4.448547856271539702e-02
4.170844488444359899e-02,5.068011873981870252e-02,-1.590626280073640167e-02,1.728186074811709910e-02,-3.734373413344069942e-02,-1.383981589779990050e-02,-2.499265663159149983e-02,-1.107951979964190078e-02,-4.687948284421659950e-02,1.549073015887240078e-02
-4.547247794002570037e-02,-4.464163650698899782e-02,3.906215296718960200e-02,1.215130832538269907e-03,1.631842733640340160e-02,1.528299104862660025e-02,-2.867429443567860031e-02,2.655962349378539894e-02,4.452837402140529671e-02,-2.593033898947460017e-02
-4.547247794002570037e-02,-4.464163650698899782e-02,-7.303030271642410587e-02,-8.141376581713200000e-02,8.374011738825870577e-02,2.780892952020790065e-02,1.738157847891100005e-01,-3.949338287409189657e-02,-4.219859706946029777e-03,3.064409414368320182e-03
3.807590643342410180e-02,5.068011873981870252e-02,6.169620651868849837e-02,2.187235499495579841e-02,-4.422349842444640161e-02,-3.482076283769860309e-02,-4.340084565202689815e-02,-2.592261998182820038e-03,1.990842087631829876e-02,-1.764612515980519894e-02
-1.882016527791040067e-03,-4.464163650698899782e-02,-5.147406123880610140e-02,-2.632783471735180084e-02,-8.448724111216979540e-03,-1.916333974822199970e-02,7.441156407875940126e-02,-3.949338287409189657e-02,-6.832974362442149896e-02,-9.220404962683000083e-02
8.529890629667830071e-02,5.068011873981870252e-02,4.445121333659410312e-02,-5.670610554934250001e-03,-4.559945128264750180e-02,-3.419446591411950259e-02,-3.235593223976569732e-02,-2.592261998182820038e-03,2.863770518940129874e-03,-2.593033898947460017e-02
-8.906293935226029801e-02,-4.464163650698899782e-02,-1.159501450521270051e-02,-3.665644679856060184e-02,1.219056876180000040e-02,2.499059336410210108e-02,-3.603757004385269719e-02,3.430885887772629900e-02,2.269202256674450122e-02,-9.361911330135799444e-03
5.383060374248070309e-03,-4.464163650698899782e-02,-3.638469220447349689e-02,2.187235499495579841e-02,3.934851612593179802e-03,1.559613951041610019e-02,8.142083605192099172e-03,-2.592261998182820038e-03,-3.199144494135589684e-02,-4.664087356364819692e-02
-9.269547780327989928e-02,-4.464163650698899782e-02,-4.069594049999709917e-02,-1.944209332987930153e-02,-6.899064987206669775e-02,-7.928784441181220555e-02,4.127682384197570165e-02,-7.639450375000099436e-02,-4.118038518800790082e-02,-9.634615654166470144e-02
-4.547247794002570037e-02,5.068011873981870252e-02,-4.716281294328249912e-02,-1.599922263614299983e-02,-4.009563984984299695e-02,-2.480001206043359885e-02,7.788079970179680352e-04,-3.949338287409189657e-02,-6.291294991625119570e-02,-3.835665973397880263e-02
6.350367559056099842e-02,5.068011873981870252e-02,-1.894705840284650021e-03,6.662967401352719310e-02,9.061988167926439408e-02,1.089143811236970016e-01,2.286863482154040048e-02,1.770335448356720118e-02,-3.581672810154919867e-02,3.064409414368320182e-03
4.170844488444359899e-02,5.068011873981870252e-02,6.169620651868849837e-02,-4.009931749229690007e-02,-1.395253554402150001e-02,6.201685656730160021e-03,-2.867429443567860031e-02,-2.592261998182820038e-03,-1.495647502491130078e-02,1.134862324403770016e-02
-7.090024709716259699e-02,-4.464163650698899782e-02,3.906215296718960200e-02,-3.321357610482440076e-02,-1.257658268582039982e-02,-3.450761437590899733e-02,-2.499265663159149983e-02,-2.592261998182820038e-03,6.773632611028609918e-02,-1.350401824497050006e-02
-9.632801625429950054e-02,-4.464163650698899782e-02,-8.380842345523309422e-02,8.100872220010799790e-03,-1.033894713270950005e-01,-9.056118903623530669e-02,-1.394774321933030074e-02,-7.639450375000099436e-02,-6.291294991625119570e-02,-3.421455281914410201e-02
2.717829108036539862e-02,5.068011873981870252e-02,1.750591148957160101e-02,-3.321357610482440076e-02,-7.072771253015849857e-03,4.597154030400080194e-02,-6.549067247654929980e-02,7.120997975363539678e-02,-9.643322289178400675e-02,-5.906719430815229877e-02
1.628067572730669890e-02,-4.464163650698899782e-02,-2.884000768730720157e-02,-9.113481248670509197e-03,-4.320865536613589623e-03,-9.768885894535990141e-03,4.495846164606279866e-02,-3.949338287409189657e-02,-3.075120986455629965e-02,-4.249876664881350324e-02
5.383060374248070309e-03,5.068011873981870252e-02,-1.894705840284650021e-03,8.100872220010799790e-03,-4.320865536613589623e-03,-1.571870666853709964e-02,-2.902829807069099918e-03,-2.592261998182820038e-03,3.839324821169769891e-02,-1.350401824497050006e-02
4.534098333546320025e-02,-4.464163650698899782e-02,-2.560657146566450160e-02,-1.255635194240680048e-02,1.769438019460449832e-02,-6.128357906048329537e-05,8.177483968693349814e-02,-3.949338287409189657e-02,-3.199144494135589684e-02,-7.563562196749110123e-02
-5.273755484206479882e-02,5.068011873981870252e-02,-1.806188694849819934e-02,8.040115678847230274e-02,8.924392882106320368e-02,1.076617872765389949e-01,-3.971920784793980114e-02,1.081111006295440019e-01,3.605579008983190309e-02,-4.249876664881350324e-02
-5.514554978810590376e-03,-4.464163650698899782e-02,4.229558918883229851e-02,4.941532054484590319e-02,2.457414448561009990e-02,-2.386056667506489953e-02,7.441156407875940126e-02,-3.949338287409189657e-02,5.227999979678119719e-02,2.791705090337660150e-02
7.076875249260000666e-02,5.068011873981870252e-02,1.211685112016709989e-02,5.630106193231849965e-02,3.420581449301800248e-02,4.941617338368559792e-02,-3.971920784793980114e-02,3.430885887772629900e-02,2.736770754260900093e-02,-1.077697500466389974e-03
-3.820740103798660192e-02,-4.464163650698899782e-02,-1.051720243133190055e-02,-3.665644679856060184e-02,-3.734373413344069942e-02,-1.947648821001150138e-02,-2.867429443567860031e-02,-2.592261998182820038e-03,-1.811826730789670159e-02,-1.764612515980519894e-02
-2.730978568492789874e-02,-4.464163650698899782e-02,-1.806188694849819934e-02,-4.009931749229690007e-02,-2.944912678412469915e-03,-1.133462820348369975e-02,3.759518603788870178e-02,-3.949338287409189657e-02,-8.944018957797799166e-03,-5.492508739331759815e-02
-4.910501639104519755e-02,-4.464163650698899782e-02,-5.686312160821060252e-02,-4.354218818603310115e-02,-4.559945128264750180e-02,-4.327577130601600180e-02,7.788079970179680352e-04,-3.949338287409189657e-02,-1.190068480150809939e-02,1.549073015887240078e-02
-8.543040090124079389e-02,5.068011873981870252e-02,-2.237313524402180162e-02,1.215130832538269907e-03,-3.734373413344069942e-02,-2.636575436938120090e-02,1.550535921336619952e-02,-3.949338287409189657e-02,-7.212845460195599356e-02,-1.764612515980519894e-02
-8.543040090124079389e-02,-4.464163650698899782e-02,-4.050329988046450294e-03,-9.113481248670509197e-03,-2.944912678412469915e-03,7.767427965677820186e-03,2.286863482154040048e-02,-3.949338287409189657e-02,-6.117659509433449883e-02,-1.350401824497050006e-02
4.534098333546320025e-02,5.068011873981870252e-02,6.061839444480759953e-02,3.105334362634819961e-02,2.870200306021350109e-02,-4.734670130927989828e-02,-5.444575906428809897e-02,7.120997975363539678e-02,1.335989800130079896e-01,1.356118306890790048e-01
-6.363517019512339445e-02,-4.464163650698899782e-02,3.582871674554689856e-02,-2.288496402361559975e-02,-3.046396984243510131e-02,-1.885019128643240088e-02,-6.584467611156170040e-03,-2.592261998182820038e-03,-2.595242443518940012e-02,-5.492508739331759815e-02
-6.726770864614299572e-02,5.068011873981870252e-02,-1.267282657909369996e-02,-4.009931749229690007e-02,-1.532848840222260020e-02,4.635943347782499856e-03,-5.812739686837520292e-02,3.430885887772629900e-02,1.919903307856710151e-02,-3.421455281914410201e-02
-1.072256316073579990e-01,-4.464163650698899782e-02,-7.734155101194770121e-02,-2.632783471735180084e-02,-8.962994274508359616e-02,-9.619786134844690584e-02,2.655027262562750096e-02,-7.639450375000099436e-02,-4.257210492279420166e-02,-5.219804415301099697e-03
28 -2.367724723390840155e-02 -4.464163650698899782e-02 5.954058237092670069e-02 -4.009931749229690007e-02 -4.284754556624519733e-02 -4.358891976780549654e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 4.034337164788070335e-02
29 5.260606023750229870e-02 -4.464163650698899782e-02 -2.129532317014089932e-02 -7.452802442965950069e-02 -4.009563984984299695e-02 -3.763909899380440266e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -6.092541861022970299e-04 -5.492508739331759815e-02
30 6.713621404158050254e-02 5.068011873981870252e-02 -6.205954135808240159e-03 6.318680331979099896e-02 -4.284754556624519733e-02 -9.588471288665739722e-02 5.232173725423699961e-02 -7.639450375000099436e-02 5.942380044479410317e-02 5.276969239238479825e-02
31 -6.000263174410389727e-02 -4.464163650698899782e-02 4.445121333659410312e-02 -1.944209332987930153e-02 -9.824676969418109224e-03 -7.576846662009279788e-03 2.286863482154040048e-02 -3.949338287409189657e-02 -2.712864555432650121e-02 -9.361911330135799444e-03
32 -2.367724723390840155e-02 -4.464163650698899782e-02 -6.548561819925780014e-02 -8.141376581713200000e-02 -3.871968699164179961e-02 -5.360967054507050078e-02 5.968501286241110343e-02 -7.639450375000099436e-02 -3.712834601047360072e-02 -4.249876664881350324e-02
33 3.444336798240450054e-02 5.068011873981870252e-02 1.252871188776620015e-01 2.875809638242839833e-02 -5.385516843185429725e-02 -1.290037051243130006e-02 -1.023070505174200062e-01 1.081111006295440019e-01 2.714857279071319972e-04 2.791705090337660150e-02
34 3.081082953138499989e-02 -4.464163650698899782e-02 -5.039624916492520257e-02 -2.227739861197989939e-03 -4.422349842444640161e-02 -8.993489211265630334e-02 1.185912177278039964e-01 -7.639450375000099436e-02 -1.811826730789670159e-02 3.064409414368320182e-03
35 1.628067572730669890e-02 -4.464163650698899782e-02 -6.332999405149600247e-02 -5.731367096097819691e-02 -5.798302700645770191e-02 -4.891244361822749687e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -5.947269741072230137e-02 -6.735140813782170000e-02
36 4.897352178648269744e-02 5.068011873981870252e-02 -3.099563183506899924e-02 -4.928030602040309877e-02 4.934129593323050011e-02 -4.132213582324419619e-03 1.333177689441520097e-01 -5.351580880693729975e-02 2.131084656824479978e-02 1.963283707370720027e-02
37 1.264813727628719998e-02 -4.464163650698899782e-02 2.289497185897609866e-02 5.285819123858220142e-02 8.062710187196569719e-03 -2.855779360190789998e-02 3.759518603788870178e-02 -3.949338287409189657e-02 5.472400334817909689e-02 -2.593033898947460017e-02
38 -9.147093429830140468e-03 -4.464163650698899782e-02 1.103903904628619932e-02 -5.731367096097819691e-02 -2.496015840963049931e-02 -4.296262284422640298e-02 3.023191042971450082e-02 -3.949338287409189657e-02 1.703713241477999851e-02 -5.219804415301099697e-03
39 -1.882016527791040067e-03 5.068011873981870252e-02 7.139651518361660176e-02 9.761551025715360652e-02 8.786797596286209655e-02 7.540749571221680436e-02 -2.131101882750449997e-02 7.120997975363539678e-02 7.142403278057639360e-02 2.377494398854190089e-02
40 -1.882016527791040067e-03 5.068011873981870252e-02 1.427247526792889930e-02 -7.452802442965950069e-02 2.558898754392050119e-03 6.201685656730160021e-03 -1.394774321933030074e-02 -2.592261998182820038e-03 1.919903307856710151e-02 3.064409414368320182e-03
41 5.383060374248070309e-03 5.068011873981870252e-02 -8.361578283570040432e-03 2.187235499495579841e-02 5.484510736603499803e-02 7.321545647968999426e-02 -2.499265663159149983e-02 3.430885887772629900e-02 1.255315281338930007e-02 9.419076154073199869e-02
42 -9.996055470531900466e-02 -4.464163650698899782e-02 -6.764124234701959781e-02 -1.089567313670219972e-01 -7.449446130487119566e-02 -7.271172671423199729e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -4.986846773523059828e-02 -9.361911330135799444e-03
43 -6.000263174410389727e-02 5.068011873981870252e-02 -1.051720243133190055e-02 -1.485159908304049987e-02 -4.972730985725089953e-02 -2.354741821327540133e-02 -5.812739686837520292e-02 1.585829843977170153e-02 -9.918957363154769225e-03 -3.421455281914410201e-02
44 1.991321417832630017e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -7.108515373592319553e-02 2.044628591100669870e-02 -1.008203435632550049e-02 1.185912177278039964e-01 -7.639450375000099436e-02 -4.257210492279420166e-02 7.348022696655839847e-02
45 4.534098333546320025e-02 5.068011873981870252e-02 6.816307896197400240e-02 8.100872220010799790e-03 -1.670444126042380101e-02 4.635943347782499856e-03 -7.653558588881050062e-02 7.120997975363539678e-02 3.243322577960189995e-02 -1.764612515980519894e-02
46 2.717829108036539862e-02 5.068011873981870252e-02 -3.530688013059259805e-02 3.220096707616459941e-02 -1.120062982761920074e-02 1.504458729887179960e-03 -1.026610541524320026e-02 -2.592261998182820038e-03 -1.495647502491130078e-02 -5.078298047848289754e-02
47 -5.637009329308430294e-02 -4.464163650698899782e-02 -1.159501450521270051e-02 -3.321357610482440076e-02 -4.697540414084860200e-02 -4.765984977106939996e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -7.979397554541639223e-03 -8.806194271199530021e-02
48 -7.816532399920170238e-02 -4.464163650698899782e-02 -7.303030271642410587e-02 -5.731367096097819691e-02 -8.412613131227909824e-02 -7.427746902317970690e-02 -2.499265663159149983e-02 -3.949338287409189657e-02 -1.811826730789670159e-02 -8.391983579716059960e-02
49 6.713621404158050254e-02 5.068011873981870252e-02 -4.177375257387799801e-02 1.154374291374709975e-02 2.558898754392050119e-03 5.888537194940629722e-03 4.127682384197570165e-02 -3.949338287409189657e-02 -5.947269741072230137e-02 -2.178823207463989955e-02
50 -4.183993948900609910e-02 5.068011873981870252e-02 1.427247526792889930e-02 -5.670610554934250001e-03 -1.257658268582039982e-02 6.201685656730160021e-03 -7.285394808472339667e-02 7.120997975363539678e-02 3.546193866076970125e-02 -1.350401824497050006e-02
51 3.444336798240450054e-02 -4.464163650698899782e-02 -7.283766209689159811e-03 1.498661360748330083e-02 -4.422349842444640161e-02 -3.732595053201490098e-02 -2.902829807069099918e-03 -3.949338287409189657e-02 -2.139368094035999993e-02 7.206516329203029904e-03
52 5.987113713954139715e-02 5.068011873981870252e-02 1.642809941569069870e-02 2.875809638242839833e-02 -4.147159270804409714e-02 -2.918409052548700047e-02 -2.867429443567860031e-02 -2.592261998182820038e-03 -2.396681493414269844e-03 -2.178823207463989955e-02
53 -5.273755484206479882e-02 -4.464163650698899782e-02 -9.439390357450949676e-03 -5.670610554934250001e-03 3.970962592582259754e-02 4.471894645684260094e-02 2.655027262562750096e-02 -2.592261998182820038e-03 -1.811826730789670159e-02 -1.350401824497050006e-02
54 -9.147093429830140468e-03 -4.464163650698899782e-02 -1.590626280073640167e-02 7.007254470726349826e-02 1.219056876180000040e-02 2.217225720799630151e-02 1.550535921336619952e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 4.862758547755009764e-02
55 -4.910501639104519755e-02 -4.464163650698899782e-02 2.505059600673789980e-02 8.100872220010799790e-03 2.044628591100669870e-02 1.778817874294279927e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 7.206516329203029904e-03
56 -4.183993948900609910e-02 -4.464163650698899782e-02 -4.931843709104429679e-02 -3.665644679856060184e-02 -7.072771253015849857e-03 -2.260797282790679916e-02 8.545647749102060209e-02 -3.949338287409189657e-02 -6.648814822283539983e-02 7.206516329203029904e-03
57 -4.183993948900609910e-02 -4.464163650698899782e-02 4.121777711495139968e-02 -2.632783471735180084e-02 -3.183992270063620150e-02 -3.043668437264510085e-02 -3.603757004385269719e-02 2.942906133203560069e-03 3.365681290238470291e-02 -1.764612515980519894e-02
58 -2.730978568492789874e-02 -4.464163650698899782e-02 -6.332999405149600247e-02 -5.042792957350569760e-02 -8.962994274508359616e-02 -1.043397213549750041e-01 5.232173725423699961e-02 -7.639450375000099436e-02 -5.615757309500619965e-02 -6.735140813782170000e-02
59 4.170844488444359899e-02 -4.464163650698899782e-02 -6.440780612537699845e-02 3.564383776990089764e-02 1.219056876180000040e-02 -5.799374901012400302e-02 1.811790603972839864e-01 -7.639450375000099436e-02 -6.092541861022970299e-04 -5.078298047848289754e-02
60 6.350367559056099842e-02 5.068011873981870252e-02 -2.560657146566450160e-02 1.154374291374709975e-02 6.447677737344290061e-02 4.847672799831700269e-02 3.023191042971450082e-02 -2.592261998182820038e-03 3.839324821169769891e-02 1.963283707370720027e-02
61 -7.090024709716259699e-02 -4.464163650698899782e-02 -4.050329988046450294e-03 -4.009931749229690007e-02 -6.623874415566440021e-02 -7.866154748823310505e-02 5.232173725423699961e-02 -7.639450375000099436e-02 -5.140053526058249722e-02 -3.421455281914410201e-02
62 -4.183993948900609910e-02 5.068011873981870252e-02 4.572166603000769880e-03 -5.387080026724189868e-02 -4.422349842444640161e-02 -2.730519975474979960e-02 -8.021722369289760457e-02 7.120997975363539678e-02 3.664579779339879884e-02 1.963283707370720027e-02
63 -2.730978568492789874e-02 5.068011873981870252e-02 -7.283766209689159811e-03 -4.009931749229690007e-02 -1.120062982761920074e-02 -1.383981589779990050e-02 5.968501286241110343e-02 -3.949338287409189657e-02 -8.238148325810279449e-02 -2.593033898947460017e-02
64 -3.457486258696700065e-02 -4.464163650698899782e-02 -3.746250427835440266e-02 -6.075654165471439799e-02 2.044628591100669870e-02 4.346635260968449710e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -3.075120986455629965e-02 -7.149351505265640061e-02
65 6.713621404158050254e-02 5.068011873981870252e-02 -2.560657146566450160e-02 -4.009931749229690007e-02 -6.348683843926219983e-02 -5.987263978086120042e-02 -2.902829807069099918e-03 -3.949338287409189657e-02 -1.919704761394450121e-02 1.134862324403770016e-02
66 -4.547247794002570037e-02 5.068011873981870252e-02 -2.452875939178359929e-02 5.974393262605470073e-02 5.310804470794310353e-03 1.496984258683710031e-02 -5.444575906428809897e-02 7.120997975363539678e-02 4.234489544960749752e-02 1.549073015887240078e-02
67 -9.147093429830140468e-03 5.068011873981870252e-02 -1.806188694849819934e-02 -3.321357610482440076e-02 -2.083229983502719873e-02 1.215150643073130074e-02 -7.285394808472339667e-02 7.120997975363539678e-02 2.714857279071319972e-04 1.963283707370720027e-02
68 4.170844488444359899e-02 5.068011873981870252e-02 -1.482845072685549936e-02 -1.714684618924559867e-02 -5.696818394814720174e-03 8.393724889256879915e-03 -1.394774321933030074e-02 -1.854239580664649974e-03 -1.190068480150809939e-02 3.064409414368320182e-03
69 3.807590643342410180e-02 5.068011873981870252e-02 -2.991781976118810041e-02 -4.009931749229690007e-02 -3.321587555883730170e-02 -2.417371513685449835e-02 -1.026610541524320026e-02 -2.592261998182820038e-03 -1.290794225416879923e-02 3.064409414368320182e-03
70 1.628067572730669890e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -5.670610554934250001e-03 -7.587041416307230279e-02 -6.143838208980879900e-02 -1.394774321933030074e-02 -3.949338287409189657e-02 -5.140053526058249722e-02 1.963283707370720027e-02
71 -1.882016527791040067e-03 -4.464163650698899782e-02 -6.979686649478139548e-02 -1.255635194240680048e-02 -1.930069620102049918e-04 -9.142588970956939953e-03 7.072992627467229731e-02 -3.949338287409189657e-02 -6.291294991625119570e-02 4.034337164788070335e-02
72 -1.882016527791040067e-03 -4.464163650698899782e-02 3.367309259778510089e-02 1.251584758070440062e-01 2.457414448561009990e-02 2.624318721126020146e-02 -1.026610541524320026e-02 -2.592261998182820038e-03 2.671425763351279944e-02 6.105390622205419948e-02
73 6.350367559056099842e-02 5.068011873981870252e-02 -4.050329988046450294e-03 -1.255635194240680048e-02 1.030034574030749966e-01 4.878987646010649742e-02 5.600337505832399948e-02 -2.592261998182820038e-03 8.449528221240310000e-02 -1.764612515980519894e-02
74 1.264813727628719998e-02 5.068011873981870252e-02 -2.021751109626000048e-02 -2.227739861197989939e-03 3.833367306762140020e-02 5.317395492515999966e-02 -6.584467611156170040e-03 3.430885887772629900e-02 -5.145307980263110273e-03 -9.361911330135799444e-03
75 1.264813727628719998e-02 5.068011873981870252e-02 2.416542455238970041e-03 5.630106193231849965e-02 2.732605020201240090e-02 1.716188181936379939e-02 4.127682384197570165e-02 -3.949338287409189657e-02 3.711738233435969789e-03 7.348022696655839847e-02
76 -9.147093429830140468e-03 5.068011873981870252e-02 -3.099563183506899924e-02 -2.632783471735180084e-02 -1.120062982761920074e-02 -1.000728964429089965e-03 -2.131101882750449997e-02 -2.592261998182820038e-03 6.209315616505399656e-03 2.791705090337660150e-02
77 -3.094232413594750000e-02 5.068011873981870252e-02 2.828403222838059977e-02 7.007254470726349826e-02 -1.267806699165139883e-01 -1.068449090492910036e-01 -5.444575906428809897e-02 -4.798064067555100204e-02 -3.075120986455629965e-02 1.549073015887240078e-02
78 -9.632801625429950054e-02 -4.464163650698899782e-02 -3.638469220447349689e-02 -7.452802442965950069e-02 -3.871968699164179961e-02 -2.761834821653930128e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -7.408887149153539631e-02 -1.077697500466389974e-03
79 5.383060374248070309e-03 -4.464163650698899782e-02 -5.794093368209150136e-02 -2.288496402361559975e-02 -6.761469701386560449e-02 -6.832764824917850199e-02 -5.444575906428809897e-02 -2.592261998182820038e-03 4.289568789252869857e-02 -8.391983579716059960e-02
80 -1.035930931563389945e-01 -4.464163650698899782e-02 -3.746250427835440266e-02 -2.632783471735180084e-02 2.558898754392050119e-03 1.998021797546959896e-02 1.182372140927919965e-02 -2.592261998182820038e-03 -6.832974362442149896e-02 -2.593033898947460017e-02
81 7.076875249260000666e-02 -4.464163650698899782e-02 1.211685112016709989e-02 4.252957915737339695e-02 7.135654166444850566e-02 5.348710338694950134e-02 5.232173725423699961e-02 -2.592261998182820038e-03 2.539313491544940155e-02 -5.219804415301099697e-03
82 1.264813727628719998e-02 5.068011873981870252e-02 -2.237313524402180162e-02 -2.977070541108809906e-02 1.081461590359879960e-02 2.843522644378690054e-02 -2.131101882750449997e-02 3.430885887772629900e-02 -6.080248196314420352e-03 -1.077697500466389974e-03
83 -1.641217033186929963e-02 -4.464163650698899782e-02 -3.530688013059259805e-02 -2.632783471735180084e-02 3.282986163481690228e-02 1.716188181936379939e-02 1.001830287073690040e-01 -3.949338287409189657e-02 -7.020931272868760620e-02 -7.977772888232589898e-02
84 -3.820740103798660192e-02 -4.464163650698899782e-02 9.961226972405269262e-03 -4.698505887976939938e-02 -5.935897986465880211e-02 -5.298337362149149743e-02 -1.026610541524320026e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 -4.249876664881350324e-02
85 1.750521923228520000e-03 -4.464163650698899782e-02 -3.961812842611620034e-02 -1.009233664264470032e-01 -2.908801698423390050e-02 -3.012353591085559917e-02 4.495846164606279866e-02 -5.019470792810550031e-02 -6.832974362442149896e-02 -1.294830118603420011e-01
86 4.534098333546320025e-02 -4.464163650698899782e-02 7.139651518361660176e-02 1.215130832538269907e-03 -9.824676969418109224e-03 -1.000728964429089965e-03 1.550535921336619952e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 -7.149351505265640061e-02
87 -7.090024709716259699e-02 5.068011873981870252e-02 -7.518592686418590354e-02 -4.009931749229690007e-02 -5.110326271545199972e-02 -1.509240974495799914e-02 -3.971920784793980114e-02 -2.592261998182820038e-03 -9.643322289178400675e-02 -3.421455281914410201e-02
88 4.534098333546320025e-02 -4.464163650698899782e-02 -6.205954135808240159e-03 1.154374291374709975e-02 6.310082451524179348e-02 1.622243643399520069e-02 9.650139090328180291e-02 -3.949338287409189657e-02 4.289568789252869857e-02 -3.835665973397880263e-02
89 -5.273755484206479882e-02 5.068011873981870252e-02 -4.069594049999709917e-02 -6.764228304218700139e-02 -3.183992270063620150e-02 -3.701280207022530216e-02 3.759518603788870178e-02 -3.949338287409189657e-02 -3.452371533034950118e-02 6.933812005172369786e-02
90 -4.547247794002570037e-02 -4.464163650698899782e-02 -4.824062501716339796e-02 -1.944209332987930153e-02 -1.930069620102049918e-04 -1.603185513032660131e-02 6.704828847058519337e-02 -3.949338287409189657e-02 -2.479118743246069845e-02 1.963283707370720027e-02
91 1.264813727628719998e-02 -4.464163650698899782e-02 -2.560657146566450160e-02 -4.009931749229690007e-02 -3.046396984243510131e-02 -4.515466207675319921e-02 7.809320188284639419e-02 -7.639450375000099436e-02 -7.212845460195599356e-02 1.134862324403770016e-02
92 4.534098333546320025e-02 -4.464163650698899782e-02 5.199589785376040191e-02 -5.387080026724189868e-02 6.310082451524179348e-02 6.476044801137270657e-02 -1.026610541524320026e-02 3.430885887772629900e-02 3.723201120896890010e-02 1.963283707370720027e-02
93 -2.004470878288880029e-02 -4.464163650698899782e-02 4.572166603000769880e-03 9.761551025715360652e-02 5.310804470794310353e-03 -2.072908205716959829e-02 6.336665066649820044e-02 -3.949338287409189657e-02 1.255315281338930007e-02 1.134862324403770016e-02
94 -4.910501639104519755e-02 -4.464163650698899782e-02 -6.440780612537699845e-02 -1.020709899795499975e-01 -2.944912678412469915e-03 -1.540555820674759969e-02 6.336665066649820044e-02 -4.724261825803279663e-02 -3.324878724762579674e-02 -5.492508739331759815e-02
95 -7.816532399920170238e-02 -4.464163650698899782e-02 -1.698407487461730050e-02 -1.255635194240680048e-02 -1.930069620102049918e-04 -1.352666743601040056e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 -9.220404962683000083e-02
96 -7.090024709716259699e-02 -4.464163650698899782e-02 -5.794093368209150136e-02 -8.141376581713200000e-02 -4.559945128264750180e-02 -2.887094206369749880e-02 -4.340084565202689815e-02 -2.592261998182820038e-03 1.143797379512540100e-03 -5.219804415301099697e-03
97 5.623859868852180283e-02 5.068011873981870252e-02 9.961226972405269262e-03 4.941532054484590319e-02 -4.320865536613589623e-03 -1.227407358885230018e-02 -4.340084565202689815e-02 3.430885887772629900e-02 6.078775415074400001e-02 3.205915781821130212e-02
98 -2.730978568492789874e-02 -4.464163650698899782e-02 8.864150836571099701e-02 -2.518021116424929914e-02 2.182223876920789951e-02 4.252690722431590187e-02 -3.235593223976569732e-02 3.430885887772629900e-02 2.863770518940129874e-03 7.762233388139309909e-02
99 1.750521923228520000e-03 5.068011873981870252e-02 -5.128142061927360405e-03 -1.255635194240680048e-02 -1.532848840222260020e-02 -1.383981589779990050e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -6.080248196314420352e-03 -6.735140813782170000e-02
100 -1.882016527791040067e-03 -4.464163650698899782e-02 -6.440780612537699845e-02 1.154374291374709975e-02 2.732605020201240090e-02 3.751653183568340322e-02 -1.394774321933030074e-02 3.430885887772629900e-02 1.178390038357590014e-02 -5.492508739331759815e-02
101 1.628067572730669890e-02 -4.464163650698899782e-02 1.750591148957160101e-02 -2.288496402361559975e-02 6.034891879883950289e-02 4.440579799505309927e-02 3.023191042971450082e-02 -2.592261998182820038e-03 3.723201120896890010e-02 -1.077697500466389974e-03
102 1.628067572730669890e-02 5.068011873981870252e-02 -4.500718879552070145e-02 6.318680331979099896e-02 1.081461590359879960e-02 -3.744320408500199904e-04 6.336665066649820044e-02 -3.949338287409189657e-02 -3.075120986455629965e-02 3.620126473304600273e-02
103 -9.269547780327989928e-02 -4.464163650698899782e-02 2.828403222838059977e-02 -1.599922263614299983e-02 3.695772020942030001e-02 2.499059336410210108e-02 5.600337505832399948e-02 -3.949338287409189657e-02 -5.145307980263110273e-03 -1.077697500466389974e-03
104 5.987113713954139715e-02 5.068011873981870252e-02 4.121777711495139968e-02 1.154374291374709975e-02 4.108557878402369773e-02 7.071026878537380045e-02 -3.603757004385269719e-02 3.430885887772629900e-02 -1.090443584737709956e-02 -3.007244590430930078e-02
105 -2.730978568492789874e-02 -4.464163650698899782e-02 6.492964274033119487e-02 -2.227739861197989939e-03 -2.496015840963049931e-02 -1.728444897748479883e-02 2.286863482154040048e-02 -3.949338287409189657e-02 -6.117659509433449883e-02 -6.320930122298699938e-02
106 2.354575262934580082e-02 5.068011873981870252e-02 -3.207344390894990155e-02 -4.009931749229690007e-02 -3.183992270063620150e-02 -2.166852744253820046e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -1.090443584737709956e-02 1.963283707370720027e-02
107 -9.632801625429950054e-02 -4.464163650698899782e-02 -7.626373893806680238e-02 -4.354218818603310115e-02 -4.559945128264750180e-02 -3.482076283769860309e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -5.947269741072230137e-02 -8.391983579716059960e-02
108 2.717829108036539862e-02 -4.464163650698899782e-02 4.984027370599859730e-02 -5.501842382034440038e-02 -2.944912678412469915e-03 4.064801645357869753e-02 -5.812739686837520292e-02 5.275941931568080279e-02 -5.295879323920039961e-02 -5.219804415301099697e-03
109 1.991321417832630017e-02 5.068011873981870252e-02 4.552902541047500196e-02 2.990571983224480160e-02 -6.211088558106100249e-02 -5.580170977759729700e-02 -7.285394808472339667e-02 2.692863470254440103e-02 4.560080841412490066e-02 4.034337164788070335e-02
110 3.807590643342410180e-02 5.068011873981870252e-02 -9.439390357450949676e-03 2.362754385640800005e-03 1.182945896190920002e-03 3.751653183568340322e-02 -5.444575906428809897e-02 5.017634085436720182e-02 -2.595242443518940012e-02 1.066170822852360034e-01
111 4.170844488444359899e-02 5.068011873981870252e-02 -3.207344390894990155e-02 -2.288496402361559975e-02 -4.972730985725089953e-02 -4.014428668812060341e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -1.260973855604090033e-01 1.549073015887240078e-02
112 1.991321417832630017e-02 -4.464163650698899782e-02 4.572166603000769880e-03 -2.632783471735180084e-02 2.319819162740899970e-02 1.027261565999409987e-02 6.704828847058519337e-02 -3.949338287409189657e-02 -2.364455757213410059e-02 -4.664087356364819692e-02
113 -8.543040090124079389e-02 -4.464163650698899782e-02 2.073934771121430098e-02 -2.632783471735180084e-02 5.310804470794310353e-03 1.966706951368000014e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 -2.364455757213410059e-02 3.064409414368320182e-03
114 1.991321417832630017e-02 5.068011873981870252e-02 1.427247526792889930e-02 6.318680331979099896e-02 1.494247447820220079e-02 2.029336643725910064e-02 -4.708248345611389801e-02 3.430885887772629900e-02 4.666077235681449775e-02 9.004865462589720093e-02
115 2.354575262934580082e-02 -4.464163650698899782e-02 1.101977498433290015e-01 6.318680331979099896e-02 1.356652162000110060e-02 -3.294187206696139875e-02 -2.499265663159149983e-02 2.065544415363990138e-02 9.924022573398999514e-02 2.377494398854190089e-02
116 -3.094232413594750000e-02 5.068011873981870252e-02 1.338730381358059929e-03 -5.670610554934250001e-03 6.447677737344290061e-02 4.941617338368559792e-02 -4.708248345611389801e-02 1.081111006295440019e-01 8.379676636552239877e-02 3.064409414368320182e-03
117 4.897352178648269744e-02 5.068011873981870252e-02 5.846277029704580186e-02 7.007254470726349826e-02 1.356652162000110060e-02 2.060651489904859884e-02 -2.131101882750449997e-02 3.430885887772629900e-02 2.200405045615050001e-02 2.791705090337660150e-02
118 5.987113713954139715e-02 -4.464163650698899782e-02 -2.129532317014089932e-02 8.728689817594480205e-02 4.521343735862710239e-02 3.156671106168230240e-02 -4.708248345611389801e-02 7.120997975363539678e-02 7.912108138965789905e-02 1.356118306890790048e-01
119 -5.637009329308430294e-02 5.068011873981870252e-02 -1.051720243133190055e-02 2.531522568869210010e-02 2.319819162740899970e-02 4.002171952999959703e-02 -3.971920784793980114e-02 3.430885887772629900e-02 2.061233072136409855e-02 5.691179930721949887e-02
120 1.628067572730669890e-02 -4.464163650698899782e-02 -4.716281294328249912e-02 -2.227739861197989939e-03 -1.945634697682600139e-02 -4.296262284422640298e-02 3.391354823380159783e-02 -3.949338287409189657e-02 2.736770754260900093e-02 2.791705090337660150e-02
121 -4.910501639104519755e-02 -4.464163650698899782e-02 4.572166603000769880e-03 1.154374291374709975e-02 -3.734373413344069942e-02 -1.853704282464289921e-02 -1.762938102341739949e-02 -2.592261998182820038e-03 -3.980959436433750137e-02 -2.178823207463989955e-02
122 6.350367559056099842e-02 -4.464163650698899782e-02 1.750591148957160101e-02 2.187235499495579841e-02 8.062710187196569719e-03 2.154596028441720101e-02 -3.603757004385269719e-02 3.430885887772629900e-02 1.990842087631829876e-02 1.134862324403770016e-02
123 4.897352178648269744e-02 5.068011873981870252e-02 8.109682384854470516e-02 2.187235499495579841e-02 4.383748450042589812e-02 6.413415108779360607e-02 -5.444575906428809897e-02 7.120997975363539678e-02 3.243322577960189995e-02 4.862758547755009764e-02
124 5.383060374248070309e-03 5.068011873981870252e-02 3.475090467166599972e-02 -1.080116308095460057e-03 1.525377602983150060e-01 1.987879896572929961e-01 -6.180903467246220279e-02 1.852344432601940039e-01 1.556684454070180086e-02 7.348022696655839847e-02
125 -5.514554978810590376e-03 -4.464163650698899782e-02 2.397278393285700096e-02 8.100872220010799790e-03 -3.459182841703849903e-02 -3.889169284096249957e-02 2.286863482154040048e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 -1.350401824497050006e-02
126 -5.514554978810590376e-03 5.068011873981870252e-02 -8.361578283570040432e-03 -2.227739861197989939e-03 -3.321587555883730170e-02 -6.363042132233559522e-02 -3.603757004385269719e-02 -2.592261998182820038e-03 8.058546423866649877e-02 7.206516329203029904e-03
127 -8.906293935226029801e-02 -4.464163650698899782e-02 -6.117436990373419786e-02 -2.632783471735180084e-02 -5.523112129005539744e-02 -5.454911593043910295e-02 4.127682384197570165e-02 -7.639450375000099436e-02 -9.393564550871469354e-02 -5.492508739331759815e-02
128 3.444336798240450054e-02 5.068011873981870252e-02 -1.894705840284650021e-03 -1.255635194240680048e-02 3.833367306762140020e-02 1.371724873967889932e-02 7.809320188284639419e-02 -3.949338287409189657e-02 4.551890466127779880e-03 -9.634615654166470144e-02
129 -5.273755484206479882e-02 -4.464163650698899782e-02 -6.225218197761509670e-02 -2.632783471735180084e-02 -5.696818394814720174e-03 -5.071658967693000106e-03 3.023191042971450082e-02 -3.949338287409189657e-02 -3.075120986455629965e-02 -7.149351505265640061e-02
130 9.015598825267629943e-03 -4.464163650698899782e-02 1.642809941569069870e-02 4.658001526274530187e-03 9.438663045397699403e-03 1.058576412178359981e-02 -2.867429443567860031e-02 3.430885887772629900e-02 3.896836603088559697e-02 1.190434030297399942e-01
131 -6.363517019512339445e-02 5.068011873981870252e-02 9.618619288287730273e-02 1.045012516446259948e-01 -2.944912678412469915e-03 -4.758510505903469807e-03 -6.584467611156170040e-03 -2.592261998182820038e-03 2.269202256674450122e-02 7.348022696655839847e-02
132 -9.632801625429950054e-02 -4.464163650698899782e-02 -6.979686649478139548e-02 -6.764228304218700139e-02 -1.945634697682600139e-02 -1.070833127990459925e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -4.687948284421659950e-02 -7.977772888232589898e-02
133 1.628067572730669890e-02 5.068011873981870252e-02 -2.129532317014089932e-02 -9.113481248670509197e-03 3.420581449301800248e-02 4.785043107473799934e-02 7.788079970179680352e-04 -2.592261998182820038e-03 -1.290794225416879923e-02 2.377494398854190089e-02
134 -4.183993948900609910e-02 5.068011873981870252e-02 -5.362968538656789907e-02 -4.009931749229690007e-02 -8.412613131227909824e-02 -7.177228132886340206e-02 -2.902829807069099918e-03 -3.949338287409189657e-02 -7.212845460195599356e-02 -3.007244590430930078e-02
135 -7.453278554818210111e-02 -4.464163650698899782e-02 4.337340126271319735e-02 -3.321357610482440076e-02 1.219056876180000040e-02 2.518648827290310109e-04 6.336665066649820044e-02 -3.949338287409189657e-02 -2.712864555432650121e-02 -4.664087356364819692e-02
136 -5.514554978810590376e-03 -4.464163650698899782e-02 5.630714614928399725e-02 -3.665644679856060184e-02 -4.835135699904979933e-02 -4.296262284422640298e-02 -7.285394808472339667e-02 3.799897096531720114e-02 5.078151336297320045e-02 5.691179930721949887e-02
137 -9.269547780327989928e-02 -4.464163650698899782e-02 -8.165279930747129655e-02 -5.731367096097819691e-02 -6.073493272285990230e-02 -6.801449978738899338e-02 4.864009945014990260e-02 -7.639450375000099436e-02 -6.648814822283539983e-02 -2.178823207463989955e-02
138 5.383060374248070309e-03 -4.464163650698899782e-02 4.984027370599859730e-02 9.761551025715360652e-02 -1.532848840222260020e-02 -1.634500359211620013e-02 -6.584467611156170040e-03 -2.592261998182820038e-03 1.703713241477999851e-02 -1.350401824497050006e-02
139 3.444336798240450054e-02 5.068011873981870252e-02 1.112755619172099975e-01 7.695828609473599757e-02 -3.183992270063620150e-02 -3.388131745233000092e-02 -2.131101882750449997e-02 -2.592261998182820038e-03 2.801650652326400162e-02 7.348022696655839847e-02
140 2.354575262934580082e-02 -4.464163650698899782e-02 6.169620651868849837e-02 5.285819123858220142e-02 -3.459182841703849903e-02 -4.891244361822749687e-02 -2.867429443567860031e-02 -2.592261998182820038e-03 5.472400334817909689e-02 -5.219804415301099697e-03
141 4.170844488444359899e-02 5.068011873981870252e-02 1.427247526792889930e-02 4.252957915737339695e-02 -3.046396984243510131e-02 -1.313877426218630021e-03 -4.340084565202689815e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 1.549073015887240078e-02
142 -2.730978568492789874e-02 -4.464163650698899782e-02 4.768464955823679963e-02 -4.698505887976939938e-02 3.420581449301800248e-02 5.724488492842390308e-02 -8.021722369289760457e-02 1.302517731550900115e-01 4.506616833626150148e-02 1.314697237742440128e-01
143 4.170844488444359899e-02 5.068011873981870252e-02 1.211685112016709989e-02 3.908670846363720280e-02 5.484510736603499803e-02 4.440579799505309927e-02 4.460445801105040325e-03 -2.592261998182820038e-03 4.560080841412490066e-02 -1.077697500466389974e-03
144 -3.094232413594750000e-02 -4.464163650698899782e-02 5.649978676881649634e-03 -9.113481248670509197e-03 1.907033305280559851e-02 6.827982580309210209e-03 7.441156407875940126e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 -4.249876664881350324e-02
145 3.081082953138499989e-02 5.068011873981870252e-02 4.660683748435590079e-02 -1.599922263614299983e-02 2.044628591100669870e-02 5.066876723084379891e-02 -5.812739686837520292e-02 7.120997975363539678e-02 6.209315616505399656e-03 7.206516329203029904e-03
146 -4.183993948900609910e-02 -4.464163650698899782e-02 1.285205550993039902e-01 6.318680331979099896e-02 -3.321587555883730170e-02 -3.262872360517189707e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 -5.078298047848289754e-02
147 -3.094232413594750000e-02 5.068011873981870252e-02 5.954058237092670069e-02 1.215130832538269907e-03 1.219056876180000040e-02 3.156671106168230240e-02 -4.340084565202689815e-02 3.430885887772629900e-02 1.482271084126630077e-02 7.206516329203029904e-03
148 -5.637009329308430294e-02 -4.464163650698899782e-02 9.295275666123460623e-02 -1.944209332987930153e-02 1.494247447820220079e-02 2.342485105515439842e-02 -2.867429443567860031e-02 2.545258986750810123e-02 2.605608963368469949e-02 4.034337164788070335e-02
149 -6.000263174410389727e-02 5.068011873981870252e-02 1.535028734180979987e-02 -1.944209332987930153e-02 3.695772020942030001e-02 4.816357953652750101e-02 1.918699701745330000e-02 -2.592261998182820038e-03 -3.075120986455629965e-02 -1.077697500466389974e-03
150 -4.910501639104519755e-02 5.068011873981870252e-02 -5.128142061927360405e-03 -4.698505887976939938e-02 -2.083229983502719873e-02 -2.041593359538010008e-02 -6.917231028063640375e-02 7.120997975363539678e-02 6.123790751970099866e-02 -3.835665973397880263e-02
151 2.354575262934580082e-02 -4.464163650698899782e-02 7.031870310973570293e-02 2.531522568869210010e-02 -3.459182841703849903e-02 -1.446611282137899926e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -1.919704761394450121e-02 -9.361911330135799444e-03
152 1.750521923228520000e-03 -4.464163650698899782e-02 -4.050329988046450294e-03 -5.670610554934250001e-03 -8.448724111216979540e-03 -2.386056667506489953e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -8.944018957797799166e-03 -1.350401824497050006e-02
153 -3.457486258696700065e-02 5.068011873981870252e-02 -8.168937664037369826e-04 7.007254470726349826e-02 3.970962592582259754e-02 6.695248724389940564e-02 -6.549067247654929980e-02 1.081111006295440019e-01 2.671425763351279944e-02 7.348022696655839847e-02
154 4.170844488444359899e-02 5.068011873981870252e-02 -4.392937672163980262e-02 6.318680331979099896e-02 -4.320865536613589623e-03 1.622243643399520069e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -3.452371533034950118e-02 1.134862324403770016e-02
155 6.713621404158050254e-02 5.068011873981870252e-02 2.073934771121430098e-02 -5.670610554934250001e-03 2.044628591100669870e-02 2.624318721126020146e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 8.640282933063080789e-03 3.064409414368320182e-03
156 -2.730978568492789874e-02 5.068011873981870252e-02 6.061839444480759953e-02 4.941532054484590319e-02 8.511607024645979902e-02 8.636769187485039689e-02 -2.902829807069099918e-03 3.430885887772629900e-02 3.781447882634390162e-02 4.862758547755009764e-02
157 -1.641217033186929963e-02 -4.464163650698899782e-02 -1.051720243133190055e-02 1.215130832538269907e-03 -3.734373413344069942e-02 -3.576020822306719832e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -2.139368094035999993e-02 -3.421455281914410201e-02
158 -1.882016527791040067e-03 5.068011873981870252e-02 -3.315125598283080038e-02 -1.829446977677679984e-02 3.145390877661580209e-02 4.284005568610550069e-02 -1.394774321933030074e-02 1.991742173612169944e-02 1.022564240495780000e-02 2.791705090337660150e-02
159 -1.277963188084970010e-02 -4.464163650698899782e-02 -6.548561819925780014e-02 -6.993753018282070077e-02 1.182945896190920002e-03 1.684873335757430118e-02 -2.902829807069099918e-03 -7.020396503291909812e-03 -3.075120986455629965e-02 -5.078298047848289754e-02
160 -5.514554978810590376e-03 -4.464163650698899782e-02 4.337340126271319735e-02 8.728689817594480205e-02 1.356652162000110060e-02 7.141131042098750048e-03 -1.394774321933030074e-02 -2.592261998182820038e-03 4.234489544960749752e-02 -1.764612515980519894e-02
161 -9.147093429830140468e-03 -4.464163650698899782e-02 -6.225218197761509670e-02 -7.452802442965950069e-02 -2.358420555142939912e-02 -1.321351897422090062e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -3.581672810154919867e-02 -4.664087356364819692e-02
162 -4.547247794002570037e-02 5.068011873981870252e-02 6.385183066645029604e-02 7.007254470726349826e-02 1.332744202834990066e-01 1.314610703725430096e-01 -3.971920784793980114e-02 1.081111006295440019e-01 7.573758845754760549e-02 8.590654771106250032e-02
163 -5.273755484206479882e-02 -4.464163650698899782e-02 3.043965637614240091e-02 -7.452802442965950069e-02 -2.358420555142939912e-02 -1.133462820348369975e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 -3.075120986455629965e-02 -1.077697500466389974e-03
164 1.628067572730669890e-02 5.068011873981870252e-02 7.247432725749750060e-02 7.695828609473599757e-02 -8.448724111216979540e-03 5.575388733151089883e-03 -6.584467611156170040e-03 -2.592261998182820038e-03 -2.364455757213410059e-02 6.105390622205419948e-02
165 4.534098333546320025e-02 -4.464163650698899782e-02 -1.913969902237900103e-02 2.187235499495579841e-02 2.732605020201240090e-02 -1.352666743601040056e-02 1.001830287073690040e-01 -3.949338287409189657e-02 1.776347786711730131e-02 -1.350401824497050006e-02
166 -4.183993948900609910e-02 -4.464163650698899782e-02 -6.656343027313869898e-02 -4.698505887976939938e-02 -3.734373413344069942e-02 -4.327577130601600180e-02 4.864009945014990260e-02 -3.949338287409189657e-02 -5.615757309500619965e-02 -1.350401824497050006e-02
167 -5.637009329308430294e-02 5.068011873981870252e-02 -6.009655782985329903e-02 -3.665644679856060184e-02 -8.825398988688250290e-02 -7.083283594349480683e-02 -1.394774321933030074e-02 -3.949338287409189657e-02 -7.814091066906959926e-02 -1.046303703713340055e-01
168 7.076875249260000666e-02 -4.464163650698899782e-02 6.924089103585480409e-02 3.793908501382069892e-02 2.182223876920789951e-02 1.504458729887179960e-03 -3.603757004385269719e-02 3.910600459159439823e-02 7.763278919555950675e-02 1.066170822852360034e-01
169 1.750521923228520000e-03 5.068011873981870252e-02 5.954058237092670069e-02 -2.227739861197989939e-03 6.172487165704060308e-02 6.319470570242499696e-02 -5.812739686837520292e-02 1.081111006295440019e-01 6.898221163630259556e-02 1.273276168594099922e-01
170 -1.882016527791040067e-03 -4.464163650698899782e-02 -2.668438353954540043e-02 4.941532054484590319e-02 5.897296594063840269e-02 -1.603185513032660131e-02 -4.708248345611389801e-02 7.120997975363539678e-02 1.335989800130079896e-01 1.963283707370720027e-02
171 2.354575262934580082e-02 5.068011873981870252e-02 -2.021751109626000048e-02 -3.665644679856060184e-02 -1.395253554402150001e-02 -1.509240974495799914e-02 5.968501286241110343e-02 -3.949338287409189657e-02 -9.643322289178400675e-02 -1.764612515980519894e-02
172 -2.004470878288880029e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -9.862811928581330378e-02 -7.587041416307230279e-02 -5.987263978086120042e-02 -1.762938102341739949e-02 -3.949338287409189657e-02 -5.140053526058249722e-02 -4.664087356364819692e-02
173 4.170844488444359899e-02 5.068011873981870252e-02 7.139651518361660176e-02 8.100872220010799790e-03 3.833367306762140020e-02 1.590928797220559840e-02 -1.762938102341739949e-02 3.430885887772629900e-02 7.341007804911610368e-02 8.590654771106250032e-02
174 -6.363517019512339445e-02 5.068011873981870252e-02 -7.949717515970949888e-02 -5.670610554934250001e-03 -7.174255558846899528e-02 -6.644875747844139480e-02 -1.026610541524320026e-02 -3.949338287409189657e-02 -1.811826730789670159e-02 -5.492508739331759815e-02
175 1.628067572730669890e-02 5.068011873981870252e-02 9.961226972405269262e-03 -4.354218818603310115e-02 -9.650970703608929835e-02 -9.463211903949929338e-02 -3.971920784793980114e-02 -3.949338287409189657e-02 1.703713241477999851e-02 7.206516329203029904e-03
176 6.713621404158050254e-02 -4.464163650698899782e-02 -3.854031635223530150e-02 -2.632783471735180084e-02 -3.183992270063620150e-02 -2.636575436938120090e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -2.712864555432650121e-02 3.064409414368320182e-03
177 4.534098333546320025e-02 5.068011873981870252e-02 1.966153563733339868e-02 3.908670846363720280e-02 2.044628591100669870e-02 2.593003874947069978e-02 8.142083605192099172e-03 -2.592261998182820038e-03 -3.303712578676999863e-03 1.963283707370720027e-02
178 4.897352178648269744e-02 -4.464163650698899782e-02 2.720622015449970094e-02 -2.518021116424929914e-02 2.319819162740899970e-02 1.841447566652189977e-02 -6.180903467246220279e-02 8.006624876385350087e-02 7.222365081991240221e-02 3.205915781821130212e-02
179 4.170844488444359899e-02 -4.464163650698899782e-02 -8.361578283570040432e-03 -2.632783471735180084e-02 2.457414448561009990e-02 1.622243643399520069e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -4.836172480289190057e-02 -3.007244590430930078e-02
180 -2.367724723390840155e-02 -4.464163650698899782e-02 -1.590626280073640167e-02 -1.255635194240680048e-02 2.044628591100669870e-02 4.127431337715779802e-02 -4.340084565202689815e-02 3.430885887772629900e-02 1.407245251576850001e-02 -9.361911330135799444e-03
181 -3.820740103798660192e-02 5.068011873981870252e-02 4.572166603000769880e-03 3.564383776990089764e-02 -1.120062982761920074e-02 5.888537194940629722e-03 -4.708248345611389801e-02 3.430885887772629900e-02 1.630495279994180133e-02 -1.077697500466389974e-03
182 4.897352178648269744e-02 -4.464163650698899782e-02 -4.285156464775889684e-02 -5.387080026724189868e-02 4.521343735862710239e-02 5.004247030726469841e-02 3.391354823380159783e-02 -2.592261998182820038e-03 -2.595242443518940012e-02 -6.320930122298699938e-02
183 4.534098333546320025e-02 5.068011873981870252e-02 5.649978676881649634e-03 5.630106193231849965e-02 6.447677737344290061e-02 8.918602803095619647e-02 -3.971920784793980114e-02 7.120997975363539678e-02 1.556684454070180086e-02 -9.361911330135799444e-03
184 4.534098333546320025e-02 5.068011873981870252e-02 -3.530688013059259805e-02 6.318680331979099896e-02 -4.320865536613589623e-03 -1.627025888008149911e-03 -1.026610541524320026e-02 -2.592261998182820038e-03 1.556684454070180086e-02 5.691179930721949887e-02
185 1.628067572730669890e-02 -4.464163650698899782e-02 2.397278393285700096e-02 -2.288496402361559975e-02 -2.496015840963049931e-02 -2.605260590759169922e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 3.723201120896890010e-02 3.205915781821130212e-02
186 -7.453278554818210111e-02 5.068011873981870252e-02 -1.806188694849819934e-02 8.100872220010799790e-03 -1.945634697682600139e-02 -2.480001206043359885e-02 -6.549067247654929980e-02 3.430885887772629900e-02 6.731721791468489591e-02 -1.764612515980519894e-02
187 -8.179786245022120650e-02 5.068011873981870252e-02 4.229558918883229851e-02 -1.944209332987930153e-02 3.970962592582259754e-02 5.755803339021339782e-02 -6.917231028063640375e-02 1.081111006295440019e-01 4.718616788601970313e-02 -3.835665973397880263e-02
188 -6.726770864614299572e-02 -4.464163650698899782e-02 -5.470749746044879791e-02 -2.632783471735180084e-02 -7.587041416307230279e-02 -8.210618056791800512e-02 4.864009945014990260e-02 -7.639450375000099436e-02 -8.682899321629239386e-02 -1.046303703713340055e-01
189 5.383060374248070309e-03 -4.464163650698899782e-02 -2.972517914165530208e-03 4.941532054484590319e-02 7.410844738085080319e-02 7.071026878537380045e-02 4.495846164606279866e-02 -2.592261998182820038e-03 -1.498586820292070049e-03 -9.361911330135799444e-03
190 -1.882016527791040067e-03 -4.464163650698899782e-02 -6.656343027313869898e-02 1.215130832538269907e-03 -2.944912678412469915e-03 3.070201038834840124e-03 1.182372140927919965e-02 -2.592261998182820038e-03 -2.028874775162960165e-02 -2.593033898947460017e-02
191 9.015598825267629943e-03 -4.464163650698899782e-02 -1.267282657909369996e-02 2.875809638242839833e-02 -1.808039411862490120e-02 -5.071658967693000106e-03 -4.708248345611389801e-02 3.430885887772629900e-02 2.337484127982079885e-02 -5.219804415301099697e-03
192 -5.514554978810590376e-03 5.068011873981870252e-02 -4.177375257387799801e-02 -4.354218818603310115e-02 -7.999827273767569358e-02 -7.615635979391689736e-02 -3.235593223976569732e-02 -3.949338287409189657e-02 1.022564240495780000e-02 -9.361911330135799444e-03
193 5.623859868852180283e-02 5.068011873981870252e-02 -3.099563183506899924e-02 8.100872220010799790e-03 1.907033305280559851e-02 2.123281182262769934e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -2.952762274177360077e-02 -5.906719430815229877e-02
194 9.015598825267629943e-03 5.068011873981870252e-02 -5.128142061927360405e-03 -6.419941234845069622e-02 6.998058880624739853e-02 8.386250418053420308e-02 -3.971920784793980114e-02 7.120997975363539678e-02 3.953987807202419963e-02 1.963283707370720027e-02
195 -6.726770864614299572e-02 -4.464163650698899782e-02 -5.901874575597240019e-02 3.220096707616459941e-02 -5.110326271545199972e-02 -4.953874054180659736e-02 -1.026610541524320026e-02 -3.949338287409189657e-02 2.007840549823790115e-03 2.377494398854190089e-02
196 2.717829108036539862e-02 5.068011873981870252e-02 2.505059600673789980e-02 1.498661360748330083e-02 2.595009734381130070e-02 4.847672799831700269e-02 -3.971920784793980114e-02 3.430885887772629900e-02 7.837142301823850701e-03 2.377494398854190089e-02
197 -2.367724723390840155e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -3.321357610482440076e-02 3.282986163481690228e-02 3.626393798852529937e-02 3.759518603788870178e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 1.134862324403770016e-02
198 4.897352178648269744e-02 5.068011873981870252e-02 3.494354529119849794e-03 7.007254470726349826e-02 -8.448724111216979540e-03 1.340410027788939938e-02 -5.444575906428809897e-02 3.430885887772629900e-02 1.331596790892770020e-02 3.620126473304600273e-02
199 -5.273755484206479882e-02 -4.464163650698899782e-02 5.415152200152219958e-02 -2.632783471735180084e-02 -5.523112129005539744e-02 -3.388131745233000092e-02 -1.394774321933030074e-02 -3.949338287409189657e-02 -7.408887149153539631e-02 -5.906719430815229877e-02
200 4.170844488444359899e-02 -4.464163650698899782e-02 -4.500718879552070145e-02 3.449621432008449784e-02 4.383748450042589812e-02 -1.571870666853709964e-02 3.759518603788870178e-02 -1.440062067847370023e-02 8.989869327767099905e-02 7.206516329203029904e-03
201 5.623859868852180283e-02 -4.464163650698899782e-02 -5.794093368209150136e-02 -7.965857695567990157e-03 5.209320164963270050e-02 4.910302492189610318e-02 5.600337505832399948e-02 -2.141183364489639834e-02 -2.832024254799870092e-02 4.448547856271539702e-02
202 -3.457486258696700065e-02 5.068011873981870252e-02 -5.578530953432969675e-02 -1.599922263614299983e-02 -9.824676969418109224e-03 -7.889995123798789270e-03 3.759518603788870178e-02 -3.949338287409189657e-02 -5.295879323920039961e-02 2.791705090337660150e-02
203 8.166636784565869944e-02 5.068011873981870252e-02 1.338730381358059929e-03 3.564383776990089764e-02 1.263946559924939983e-01 9.106491880169340081e-02 1.918699701745330000e-02 3.430885887772629900e-02 8.449528221240310000e-02 -3.007244590430930078e-02
204 -1.882016527791040067e-03 5.068011873981870252e-02 3.043965637614240091e-02 5.285819123858220142e-02 3.970962592582259754e-02 5.661858800484489973e-02 -3.971920784793980114e-02 7.120997975363539678e-02 2.539313491544940155e-02 2.791705090337660150e-02
205 1.107266754538149961e-01 5.068011873981870252e-02 6.727790750762559745e-03 2.875809638242839833e-02 -2.771206412603280031e-02 -7.263698200219739949e-03 -4.708248345611389801e-02 3.430885887772629900e-02 2.007840549823790115e-03 7.762233388139309909e-02
206 -3.094232413594750000e-02 -4.464163650698899782e-02 4.660683748435590079e-02 1.498661360748330083e-02 -1.670444126042380101e-02 -4.703355284749029946e-02 7.788079970179680352e-04 -2.592261998182820038e-03 6.345592137206540473e-02 -2.593033898947460017e-02
207 1.750521923228520000e-03 5.068011873981870252e-02 2.612840808061879863e-02 -9.113481248670509197e-03 2.457414448561009990e-02 3.845597722105199845e-02 -2.131101882750449997e-02 3.430885887772629900e-02 9.436409146079870192e-03 3.064409414368320182e-03
208 9.015598825267629943e-03 -4.464163650698899782e-02 4.552902541047500196e-02 2.875809638242839833e-02 1.219056876180000040e-02 -1.383981589779990050e-02 2.655027262562750096e-02 -3.949338287409189657e-02 4.613233103941480340e-02 3.620126473304600273e-02
209 3.081082953138499989e-02 -4.464163650698899782e-02 4.013996504107050084e-02 7.695828609473599757e-02 1.769438019460449832e-02 3.782968029747289795e-02 -2.867429443567860031e-02 3.430885887772629900e-02 -1.498586820292070049e-03 1.190434030297399942e-01
210 3.807590643342410180e-02 5.068011873981870252e-02 -1.806188694849819934e-02 6.662967401352719310e-02 -5.110326271545199972e-02 -1.665815205390569834e-02 -7.653558588881050062e-02 3.430885887772629900e-02 -1.190068480150809939e-02 -1.350401824497050006e-02
211 9.015598825267629943e-03 -4.464163650698899782e-02 1.427247526792889930e-02 1.498661360748330083e-02 5.484510736603499803e-02 4.722413415115889884e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -3.324878724762579674e-02 -5.906719430815229877e-02
212 9.256398319871740610e-02 -4.464163650698899782e-02 3.690652881942779739e-02 2.187235499495579841e-02 -2.496015840963049931e-02 -1.665815205390569834e-02 7.788079970179680352e-04 -3.949338287409189657e-02 -2.251217192966049885e-02 -2.178823207463989955e-02
213 6.713621404158050254e-02 -4.464163650698899782e-02 3.494354529119849794e-03 3.564383776990089764e-02 4.934129593323050011e-02 3.125356259989280072e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -6.092541861022970299e-04 1.963283707370720027e-02
214 1.750521923228520000e-03 -4.464163650698899782e-02 -7.087467856866229432e-02 -2.288496402361559975e-02 -1.568959820211340015e-03 -1.000728964429089965e-03 2.655027262562750096e-02 -3.949338287409189657e-02 -2.251217192966049885e-02 7.206516329203029904e-03
215 3.081082953138499989e-02 -4.464163650698899782e-02 -3.315125598283080038e-02 -2.288496402361559975e-02 -4.697540414084860200e-02 -8.116673518254939601e-02 1.038646665114559969e-01 -7.639450375000099436e-02 -3.980959436433750137e-02 -5.492508739331759815e-02
216 2.717829108036539862e-02 5.068011873981870252e-02 9.403056873511560221e-02 9.761551025715360652e-02 -3.459182841703849903e-02 -3.200242668159279658e-02 -4.340084565202689815e-02 -2.592261998182820038e-03 3.664579779339879884e-02 1.066170822852360034e-01
217 1.264813727628719998e-02 5.068011873981870252e-02 3.582871674554689856e-02 4.941532054484590319e-02 5.346915450783389784e-02 7.415490186505870052e-02 -6.917231028063640375e-02 1.450122215054540087e-01 4.560080841412490066e-02 4.862758547755009764e-02
218 7.440129094361959405e-02 -4.464163650698899782e-02 3.151746845002330322e-02 1.010583809508899950e-01 4.658939021682820258e-02 3.689023491210430272e-02 1.550535921336619952e-02 -2.592261998182820038e-03 3.365681290238470291e-02 4.448547856271539702e-02
219 -4.183993948900609910e-02 -4.464163650698899782e-02 -6.548561819925780014e-02 -4.009931749229690007e-02 -5.696818394814720174e-03 1.434354566325799982e-02 -4.340084565202689815e-02 3.430885887772629900e-02 7.026862549151949647e-03 -1.350401824497050006e-02
220 -8.906293935226029801e-02 -4.464163650698899782e-02 -4.177375257387799801e-02 -1.944209332987930153e-02 -6.623874415566440021e-02 -7.427746902317970690e-02 8.142083605192099172e-03 -3.949338287409189657e-02 1.143797379512540100e-03 -3.007244590430930078e-02
221 2.354575262934580082e-02 5.068011873981870252e-02 -3.961812842611620034e-02 -5.670610554934250001e-03 -4.835135699904979933e-02 -3.325502052875090042e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.016435479455120028e-01 -6.735140813782170000e-02
222 -4.547247794002570037e-02 -4.464163650698899782e-02 -3.854031635223530150e-02 -2.632783471735180084e-02 -1.532848840222260020e-02 8.781618063081050515e-04 -3.235593223976569732e-02 -2.592261998182820038e-03 1.143797379512540100e-03 -3.835665973397880263e-02
223 -2.367724723390840155e-02 5.068011873981870252e-02 -2.560657146566450160e-02 4.252957915737339695e-02 -5.385516843185429725e-02 -4.765984977106939996e-02 -2.131101882750449997e-02 -3.949338287409189657e-02 1.143797379512540100e-03 1.963283707370720027e-02
224 -9.996055470531900466e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -6.419941234845069622e-02 -5.798302700645770191e-02 -6.018578824265070210e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -1.811826730789670159e-02 -5.078298047848289754e-02
225 -2.730978568492789874e-02 -4.464163650698899782e-02 -6.656343027313869898e-02 -1.123996020607579971e-01 -4.972730985725089953e-02 -4.139688053527879746e-02 7.788079970179680352e-04 -3.949338287409189657e-02 -3.581672810154919867e-02 -9.361911330135799444e-03
226 3.081082953138499989e-02 5.068011873981870252e-02 3.259528052390420205e-02 4.941532054484590319e-02 -4.009563984984299695e-02 -4.358891976780549654e-02 -6.917231028063640375e-02 3.430885887772629900e-02 6.301661511474640487e-02 3.064409414368320182e-03
227 -1.035930931563389945e-01 5.068011873981870252e-02 -4.608500086940160029e-02 -2.632783471735180084e-02 -2.496015840963049931e-02 -2.480001206043359885e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -3.980959436433750137e-02 -5.492508739331759815e-02
228 6.713621404158050254e-02 5.068011873981870252e-02 -2.991781976118810041e-02 5.744868538213489945e-02 -1.930069620102049918e-04 -1.571870666853709964e-02 7.441156407875940126e-02 -5.056371913686460301e-02 -3.845911230135379971e-02 7.206516329203029904e-03
229 -5.273755484206479882e-02 -4.464163650698899782e-02 -1.267282657909369996e-02 -6.075654165471439799e-02 -1.930069620102049918e-04 8.080576427467340075e-03 1.182372140927919965e-02 -2.592261998182820038e-03 -2.712864555432650121e-02 -5.078298047848289754e-02
230 -2.730978568492789874e-02 5.068011873981870252e-02 -1.590626280073640167e-02 -2.977070541108809906e-02 3.934851612593179802e-03 -6.875805026395569565e-04 4.127682384197570165e-02 -3.949338287409189657e-02 -2.364455757213410059e-02 1.134862324403770016e-02
231 -3.820740103798660192e-02 5.068011873981870252e-02 7.139651518361660176e-02 -5.731367096097819691e-02 1.539137131565160022e-01 1.558866503921270130e-01 7.788079970179680352e-04 7.194800217115350505e-02 5.027649338998960160e-02 6.933812005172369786e-02
232 9.015598825267629943e-03 -4.464163650698899782e-02 -3.099563183506899924e-02 2.187235499495579841e-02 8.062710187196569719e-03 8.706873351046409346e-03 4.460445801105040325e-03 -2.592261998182820038e-03 9.436409146079870192e-03 1.134862324403770016e-02
233 1.264813727628719998e-02 5.068011873981870252e-02 2.609183074771409820e-04 -1.140872838930430053e-02 3.970962592582259754e-02 5.724488492842390308e-02 -3.971920784793980114e-02 5.608052019451260223e-02 2.405258322689299982e-02 3.205915781821130212e-02
234 6.713621404158050254e-02 -4.464163650698899782e-02 3.690652881942779739e-02 -5.042792957350569760e-02 -2.358420555142939912e-02 -3.450761437590899733e-02 4.864009945014990260e-02 -3.949338287409189657e-02 -2.595242443518940012e-02 -3.835665973397880263e-02
235 4.534098333546320025e-02 -4.464163650698899782e-02 3.906215296718960200e-02 4.597244985110970211e-02 6.686757328995440036e-03 -2.417371513685449835e-02 8.142083605192099172e-03 -1.255556463467829946e-02 6.432823302367089713e-02 5.691179930721949887e-02
236 6.713621404158050254e-02 5.068011873981870252e-02 -1.482845072685549936e-02 5.859630917623830093e-02 -5.935897986465880211e-02 -3.450761437590899733e-02 -6.180903467246220279e-02 1.290620876969899959e-02 -5.145307980263110273e-03 4.862758547755009764e-02
237 2.717829108036539862e-02 -4.464163650698899782e-02 6.727790750762559745e-03 3.564383776990089764e-02 7.961225881365530110e-02 7.071026878537380045e-02 1.550535921336619952e-02 3.430885887772629900e-02 4.067226371449769728e-02 1.134862324403770016e-02
238 5.623859868852180283e-02 -4.464163650698899782e-02 -6.871905442090049665e-02 -6.878990659528949614e-02 -1.930069620102049918e-04 -1.000728964429089965e-03 4.495846164606279866e-02 -3.764832683029650101e-02 -4.836172480289190057e-02 -1.077697500466389974e-03
239 3.444336798240450054e-02 5.068011873981870252e-02 -9.439390357450949676e-03 5.974393262605470073e-02 -3.596778127523959923e-02 -7.576846662009279788e-03 -7.653558588881050062e-02 7.120997975363539678e-02 1.100810104587249955e-02 -2.178823207463989955e-02
240 2.354575262934580082e-02 -4.464163650698899782e-02 1.966153563733339868e-02 -1.255635194240680048e-02 8.374011738825870577e-02 3.876912568284150012e-02 6.336665066649820044e-02 -2.592261998182820038e-03 6.604820616309839409e-02 4.862758547755009764e-02
241 4.897352178648269744e-02 5.068011873981870252e-02 7.462995140525929827e-02 6.662967401352719310e-02 -9.824676969418109224e-03 -2.253322811587220049e-03 -4.340084565202689815e-02 3.430885887772629900e-02 3.365681290238470291e-02 1.963283707370720027e-02
242 3.081082953138499989e-02 5.068011873981870252e-02 -8.361578283570040432e-03 4.658001526274530187e-03 1.494247447820220079e-02 2.749578105841839898e-02 8.142083605192099172e-03 -8.127430129569179762e-03 -2.952762274177360077e-02 5.691179930721949887e-02
243 -1.035930931563389945e-01 5.068011873981870252e-02 -2.345094731790270046e-02 -2.288496402361559975e-02 -8.687803702868139577e-02 -6.770135132559949864e-02 -1.762938102341739949e-02 -3.949338287409189657e-02 -7.814091066906959926e-02 -7.149351505265640061e-02
244 1.628067572730669890e-02 5.068011873981870252e-02 -4.608500086940160029e-02 1.154374291374709975e-02 -3.321587555883730170e-02 -1.603185513032660131e-02 -1.026610541524320026e-02 -2.592261998182820038e-03 -4.398540256559110156e-02 -4.249876664881350324e-02
245 -6.000263174410389727e-02 5.068011873981870252e-02 5.415152200152219958e-02 -1.944209332987930153e-02 -4.972730985725089953e-02 -4.891244361822749687e-02 2.286863482154040048e-02 -3.949338287409189657e-02 -4.398540256559110156e-02 -5.219804415301099697e-03
246 -2.730978568492789874e-02 -4.464163650698899782e-02 -3.530688013059259805e-02 -2.977070541108809906e-02 -5.660707414825649764e-02 -5.862004593370299943e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -4.986846773523059828e-02 -1.294830118603420011e-01
247 4.170844488444359899e-02 -4.464163650698899782e-02 -3.207344390894990155e-02 -6.190416520781699683e-02 7.961225881365530110e-02 5.098191569263330059e-02 5.600337505832399948e-02 -9.972486173364639508e-03 4.506616833626150148e-02 -5.906719430815229877e-02
248 -8.179786245022120650e-02 -4.464163650698899782e-02 -8.165279930747129655e-02 -4.009931749229690007e-02 2.558898754392050119e-03 -1.853704282464289921e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -1.090443584737709956e-02 -9.220404962683000083e-02
249 -4.183993948900609910e-02 -4.464163650698899782e-02 4.768464955823679963e-02 5.974393262605470073e-02 1.277706088506949944e-01 1.280164372928579986e-01 -2.499265663159149983e-02 1.081111006295440019e-01 6.389312063683939835e-02 4.034337164788070335e-02
250 -1.277963188084970010e-02 -4.464163650698899782e-02 6.061839444480759953e-02 5.285819123858220142e-02 4.796534307502930278e-02 2.937467182915549924e-02 -1.762938102341739949e-02 3.430885887772629900e-02 7.021129819331020649e-02 7.206516329203029904e-03
251 6.713621404158050254e-02 -4.464163650698899782e-02 5.630714614928399725e-02 7.351541540099980343e-02 -1.395253554402150001e-02 -3.920484130275200124e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 7.573758845754760549e-02 3.620126473304600273e-02
252 -5.273755484206479882e-02 5.068011873981870252e-02 9.834181703063900326e-02 8.728689817594480205e-02 6.034891879883950289e-02 4.878987646010649742e-02 -5.812739686837520292e-02 1.081111006295440019e-01 8.449528221240310000e-02 4.034337164788070335e-02
253 5.383060374248070309e-03 -4.464163650698899782e-02 5.954058237092670069e-02 -5.616604740787570216e-02 2.457414448561009990e-02 5.286080646337049799e-02 -4.340084565202689815e-02 5.091436327188540029e-02 -4.219859706946029777e-03 -3.007244590430930078e-02
254 8.166636784565869944e-02 -4.464163650698899782e-02 3.367309259778510089e-02 8.100872220010799790e-03 5.209320164963270050e-02 5.661858800484489973e-02 -1.762938102341739949e-02 3.430885887772629900e-02 3.486419309615960277e-02 6.933812005172369786e-02
255 3.081082953138499989e-02 5.068011873981870252e-02 5.630714614928399725e-02 7.695828609473599757e-02 4.934129593323050011e-02 -1.227407358885230018e-02 -3.603757004385269719e-02 7.120997975363539678e-02 1.200533820015380060e-01 9.004865462589720093e-02
256 1.750521923228520000e-03 -4.464163650698899782e-02 -6.548561819925780014e-02 -5.670610554934250001e-03 -7.072771253015849857e-03 -1.947648821001150138e-02 4.127682384197570165e-02 -3.949338287409189657e-02 -3.303712578676999863e-03 7.206516329203029904e-03
257 -4.910501639104519755e-02 -4.464163650698899782e-02 1.608549173157310108e-01 -4.698505887976939938e-02 -2.908801698423390050e-02 -1.978963667180099958e-02 -4.708248345611389801e-02 3.430885887772629900e-02 2.801650652326400162e-02 1.134862324403770016e-02
258 -2.730978568492789874e-02 5.068011873981870252e-02 -5.578530953432969675e-02 2.531522568869210010e-02 -7.072771253015849857e-03 -2.354741821327540133e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -5.145307980263110273e-03 -5.078298047848289754e-02
259 7.803382939463919532e-02 5.068011873981870252e-02 -2.452875939178359929e-02 -4.239456463293059946e-02 6.686757328995440036e-03 5.286080646337049799e-02 -6.917231028063640375e-02 8.080427118137170628e-02 -3.712834601047360072e-02 5.691179930721949887e-02
260 1.264813727628719998e-02 -4.464163650698899782e-02 -3.638469220447349689e-02 4.252957915737339695e-02 -1.395253554402150001e-02 1.293437758520510003e-02 -2.683347553363510038e-02 5.156973385758089994e-03 -4.398540256559110156e-02 7.206516329203029904e-03
261 4.170844488444359899e-02 -4.464163650698899782e-02 -8.361578283570040432e-03 -5.731367096097819691e-02 8.062710187196569719e-03 -3.137612975801370302e-02 1.517259579645879874e-01 -7.639450375000099436e-02 -8.023654024890179703e-02 -1.764612515980519894e-02
262 4.897352178648269744e-02 -4.464163650698899782e-02 -4.177375257387799801e-02 1.045012516446259948e-01 3.558176735121919981e-02 -2.573945744580210040e-02 1.774974225931970073e-01 -7.639450375000099436e-02 -1.290794225416879923e-02 1.549073015887240078e-02
263 -1.641217033186929963e-02 5.068011873981870252e-02 1.274427430254229943e-01 9.761551025715360652e-02 1.631842733640340160e-02 1.747503028115330106e-02 -2.131101882750449997e-02 3.430885887772629900e-02 3.486419309615960277e-02 3.064409414368320182e-03
264 -7.453278554818210111e-02 5.068011873981870252e-02 -7.734155101194770121e-02 -4.698505887976939938e-02 -4.697540414084860200e-02 -3.262872360517189707e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -7.212845460195599356e-02 -1.764612515980519894e-02
265 3.444336798240450054e-02 5.068011873981870252e-02 2.828403222838059977e-02 -3.321357610482440076e-02 -4.559945128264750180e-02 -9.768885894535990141e-03 -5.076412126020100196e-02 -2.592261998182820038e-03 -5.947269741072230137e-02 -2.178823207463989955e-02
266 -3.457486258696700065e-02 5.068011873981870252e-02 -2.560657146566450160e-02 -1.714684618924559867e-02 1.182945896190920002e-03 -2.879619735166290186e-03 8.142083605192099172e-03 -1.550765430475099967e-02 1.482271084126630077e-02 4.034337164788070335e-02
267 -5.273755484206479882e-02 5.068011873981870252e-02 -6.225218197761509670e-02 1.154374291374709975e-02 -8.448724111216979540e-03 -3.669965360843580049e-02 1.222728555318910032e-01 -7.639450375000099436e-02 -8.682899321629239386e-02 3.064409414368320182e-03
268 5.987113713954139715e-02 -4.464163650698899782e-02 -8.168937664037369826e-04 -8.485663651086830517e-02 7.548440023905199359e-02 7.947842571548069390e-02 4.460445801105040325e-03 3.430885887772629900e-02 2.337484127982079885e-02 2.791705090337660150e-02
269 6.350367559056099842e-02 5.068011873981870252e-02 8.864150836571099701e-02 7.007254470726349826e-02 2.044628591100669870e-02 3.751653183568340322e-02 -5.076412126020100196e-02 7.120997975363539678e-02 2.930041326858690010e-02 7.348022696655839847e-02
270 9.015598825267629943e-03 -4.464163650698899782e-02 -3.207344390894990155e-02 -2.632783471735180084e-02 4.246153164222479792e-02 -1.039518281811509931e-02 1.590892335727620011e-01 -7.639450375000099436e-02 -1.190068480150809939e-02 -3.835665973397880263e-02
271 5.383060374248070309e-03 5.068011873981870252e-02 3.043965637614240091e-02 8.384402748220859403e-02 -3.734373413344069942e-02 -4.734670130927989828e-02 1.550535921336619952e-02 -3.949338287409189657e-02 8.640282933063080789e-03 1.549073015887240078e-02
272 3.807590643342410180e-02 5.068011873981870252e-02 8.883414898524360018e-03 4.252957915737339695e-02 -4.284754556624519733e-02 -2.104223051895920057e-02 -3.971920784793980114e-02 -2.592261998182820038e-03 -1.811826730789670159e-02 7.206516329203029904e-03
273 1.264813727628719998e-02 -4.464163650698899782e-02 6.727790750762559745e-03 -5.616604740787570216e-02 -7.587041416307230279e-02 -6.644875747844139480e-02 -2.131101882750449997e-02 -3.764832683029650101e-02 -1.811826730789670159e-02 -9.220404962683000083e-02
274 7.440129094361959405e-02 5.068011873981870252e-02 -2.021751109626000048e-02 4.597244985110970211e-02 7.410844738085080319e-02 3.281930490884039930e-02 -3.603757004385269719e-02 7.120997975363539678e-02 1.063542767417259977e-01 3.620126473304600273e-02
275 1.628067572730669890e-02 -4.464163650698899782e-02 -2.452875939178359929e-02 3.564383776990089764e-02 -7.072771253015849857e-03 -3.192768196955810076e-03 -1.394774321933030074e-02 -2.592261998182820038e-03 1.556684454070180086e-02 1.549073015887240078e-02
276 -5.514554978810590376e-03 5.068011873981870252e-02 -1.159501450521270051e-02 1.154374291374709975e-02 -2.220825269322829892e-02 -1.540555820674759969e-02 -2.131101882750449997e-02 -2.592261998182820038e-03 1.100810104587249955e-02 6.933812005172369786e-02
277 1.264813727628719998e-02 -4.464163650698899782e-02 2.612840808061879863e-02 6.318680331979099896e-02 1.250187031342930022e-01 9.169121572527250130e-02 6.336665066649820044e-02 -2.592261998182820038e-03 5.757285620242599822e-02 -2.178823207463989955e-02
278 -3.457486258696700065e-02 -4.464163650698899782e-02 -5.901874575597240019e-02 1.215130832538269907e-03 -5.385516843185429725e-02 -7.803525056465400456e-02 6.704828847058519337e-02 -7.639450375000099436e-02 -2.139368094035999993e-02 1.549073015887240078e-02
279 6.713621404158050254e-02 5.068011873981870252e-02 -3.638469220447349689e-02 -8.485663651086830517e-02 -7.072771253015849857e-03 1.966706951368000014e-02 -5.444575906428809897e-02 3.430885887772629900e-02 1.143797379512540100e-03 3.205915781821130212e-02
280 3.807590643342410180e-02 5.068011873981870252e-02 -2.452875939178359929e-02 4.658001526274530187e-03 -2.633611126783170012e-02 -2.636575436938120090e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -1.599826775813870117e-02 -2.593033898947460017e-02
281 9.015598825267629943e-03 5.068011873981870252e-02 1.858372356345249984e-02 3.908670846363720280e-02 1.769438019460449832e-02 1.058576412178359981e-02 1.918699701745330000e-02 -2.592261998182820038e-03 1.630495279994180133e-02 -1.764612515980519894e-02
282 -9.269547780327989928e-02 5.068011873981870252e-02 -9.027529589851850111e-02 -5.731367096097819691e-02 -2.496015840963049931e-02 -3.043668437264510085e-02 -6.584467611156170040e-03 -2.592261998182820038e-03 2.405258322689299982e-02 3.064409414368320182e-03
283 7.076875249260000666e-02 -4.464163650698899782e-02 -5.128142061927360405e-03 -5.670610554934250001e-03 8.786797596286209655e-02 1.029645603496960049e-01 1.182372140927919965e-02 3.430885887772629900e-02 -8.944018957797799166e-03 2.791705090337660150e-02
284 -1.641217033186929963e-02 -4.464163650698899782e-02 -5.255187331268700024e-02 -3.321357610482440076e-02 -4.422349842444640161e-02 -3.638650514664620167e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -6.832974362442149896e-02 -3.007244590430930078e-02
285 4.170844488444359899e-02 5.068011873981870252e-02 -2.237313524402180162e-02 2.875809638242839833e-02 -6.623874415566440021e-02 -4.515466207675319921e-02 -6.180903467246220279e-02 -2.592261998182820038e-03 2.863770518940129874e-03 -5.492508739331759815e-02
286 1.264813727628719998e-02 -4.464163650698899782e-02 -2.021751109626000048e-02 -1.599922263614299983e-02 1.219056876180000040e-02 2.123281182262769934e-02 -7.653558588881050062e-02 1.081111006295440019e-01 5.988072306548120061e-02 -2.178823207463989955e-02
287 -3.820740103798660192e-02 -4.464163650698899782e-02 -5.470749746044879791e-02 -7.797089512339580586e-02 -3.321587555883730170e-02 -8.649025903297140327e-02 1.406810445523269948e-01 -7.639450375000099436e-02 -1.919704761394450121e-02 -5.219804415301099697e-03
288 4.534098333546320025e-02 -4.464163650698899782e-02 -6.205954135808240159e-03 -1.599922263614299983e-02 1.250187031342930022e-01 1.251981011367520047e-01 1.918699701745330000e-02 3.430885887772629900e-02 3.243322577960189995e-02 -5.219804415301099697e-03
289 7.076875249260000666e-02 5.068011873981870252e-02 -1.698407487461730050e-02 2.187235499495579841e-02 4.383748450042589812e-02 5.630543954305530091e-02 3.759518603788870178e-02 -2.592261998182820038e-03 -7.020931272868760620e-02 -1.764612515980519894e-02
290 -7.453278554818210111e-02 5.068011873981870252e-02 5.522933407540309841e-02 -4.009931749229690007e-02 5.346915450783389784e-02 5.317395492515999966e-02 -4.340084565202689815e-02 7.120997975363539678e-02 6.123790751970099866e-02 -3.421455281914410201e-02
291 5.987113713954139715e-02 5.068011873981870252e-02 7.678557555302109594e-02 2.531522568869210010e-02 1.182945896190920002e-03 1.684873335757430118e-02 -5.444575906428809897e-02 3.430885887772629900e-02 2.993564839653250001e-02 4.448547856271539702e-02
292 7.440129094361959405e-02 -4.464163650698899782e-02 1.858372356345249984e-02 6.318680331979099896e-02 6.172487165704060308e-02 4.284005568610550069e-02 8.142083605192099172e-03 -2.592261998182820038e-03 5.803912766389510147e-02 -5.906719430815229877e-02
293 9.015598825267629943e-03 -4.464163650698899782e-02 -2.237313524402180162e-02 -3.206595255172180192e-02 -4.972730985725089953e-02 -6.864079671096809387e-02 7.809320188284639419e-02 -7.085933561861459951e-02 -6.291294991625119570e-02 -3.835665973397880263e-02
294 -7.090024709716259699e-02 -4.464163650698899782e-02 9.295275666123460623e-02 1.269136646684959971e-02 2.044628591100669870e-02 4.252690722431590187e-02 7.788079970179680352e-04 3.598276718899090076e-04 -5.454415271109520208e-02 -1.077697500466389974e-03
295 2.354575262934580082e-02 5.068011873981870252e-02 -3.099563183506899924e-02 -5.670610554934250001e-03 -1.670444126042380101e-02 1.778817874294279927e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -7.408887149153539631e-02 -3.421455281914410201e-02
296 -5.273755484206479882e-02 5.068011873981870252e-02 3.906215296718960200e-02 -4.009931749229690007e-02 -5.696818394814720174e-03 -1.290037051243130006e-02 1.182372140927919965e-02 -3.949338287409189657e-02 1.630495279994180133e-02 3.064409414368320182e-03
297 6.713621404158050254e-02 -4.464163650698899782e-02 -6.117436990373419786e-02 -4.009931749229690007e-02 -2.633611126783170012e-02 -2.448686359864400003e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -5.615757309500619965e-02 -5.906719430815229877e-02
298 1.750521923228520000e-03 -4.464163650698899782e-02 -8.361578283570040432e-03 -6.419941234845069622e-02 -3.871968699164179961e-02 -2.448686359864400003e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -6.468302246445030435e-02 -5.492508739331759815e-02
299 2.354575262934580082e-02 5.068011873981870252e-02 -3.746250427835440266e-02 -4.698505887976939938e-02 -9.100589560328480043e-02 -7.553006287033779687e-02 -3.235593223976569732e-02 -3.949338287409189657e-02 -3.075120986455629965e-02 -1.350401824497050006e-02
300 3.807590643342410180e-02 5.068011873981870252e-02 -1.375063865297449991e-02 -1.599922263614299983e-02 -3.596778127523959923e-02 -2.198167590432769866e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -2.595242443518940012e-02 -1.077697500466389974e-03
301 1.628067572730669890e-02 -4.464163650698899782e-02 7.355213933137849658e-02 -4.124694104539940176e-02 -4.320865536613589623e-03 -1.352666743601040056e-02 -1.394774321933030074e-02 -1.116217163146459961e-03 4.289568789252869857e-02 4.448547856271539702e-02
302 -1.882016527791040067e-03 5.068011873981870252e-02 -2.452875939178359929e-02 5.285819123858220142e-02 2.732605020201240090e-02 3.000096875273459973e-02 3.023191042971450082e-02 -2.592261998182820038e-03 -2.139368094035999993e-02 3.620126473304600273e-02
303 1.264813727628719998e-02 -4.464163650698899782e-02 3.367309259778510089e-02 3.334859052598110329e-02 3.007795591841460128e-02 2.718263259662880016e-02 -2.902829807069099918e-03 8.847085473348980864e-03 3.119299070280229930e-02 2.791705090337660150e-02
304 7.440129094361959405e-02 -4.464163650698899782e-02 3.475090467166599972e-02 9.417263956341730136e-02 5.759701308243719842e-02 2.029336643725910064e-02 2.286863482154040048e-02 -2.592261998182820038e-03 7.380214692004880006e-02 -2.178823207463989955e-02
305 4.170844488444359899e-02 5.068011873981870252e-02 -3.854031635223530150e-02 5.285819123858220142e-02 7.686035309725310072e-02 1.164299442066459994e-01 -3.971920784793980114e-02 7.120997975363539678e-02 -2.251217192966049885e-02 -1.350401824497050006e-02
306 -9.147093429830140468e-03 5.068011873981870252e-02 -3.961812842611620034e-02 -4.009931749229690007e-02 -8.448724111216979540e-03 1.622243643399520069e-02 -6.549067247654929980e-02 7.120997975363539678e-02 1.776347786711730131e-02 -6.735140813782170000e-02
307 9.015598825267629943e-03 5.068011873981870252e-02 -1.894705840284650021e-03 2.187235499495579841e-02 -3.871968699164179961e-02 -2.480001206043359885e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -3.980959436433750137e-02 -1.350401824497050006e-02
308 6.713621404158050254e-02 5.068011873981870252e-02 -3.099563183506899924e-02 4.658001526274530187e-03 2.457414448561009990e-02 3.563764106494619888e-02 -2.867429443567860031e-02 3.430885887772629900e-02 2.337484127982079885e-02 8.176444079622779970e-02
309 1.750521923228520000e-03 -4.464163650698899782e-02 -4.608500086940160029e-02 -3.321357610482440076e-02 -7.311850844667000526e-02 -8.147988364433890462e-02 4.495846164606279866e-02 -6.938329078357829971e-02 -6.117659509433449883e-02 -7.977772888232589898e-02
310 -9.147093429830140468e-03 5.068011873981870252e-02 1.338730381358059929e-03 -2.227739861197989939e-03 7.961225881365530110e-02 7.008397186179469995e-02 3.391354823380159783e-02 -2.592261998182820038e-03 2.671425763351279944e-02 8.176444079622779970e-02
311 -5.514554978810590376e-03 -4.464163650698899782e-02 6.492964274033119487e-02 3.564383776990089764e-02 -1.568959820211340015e-03 1.496984258683710031e-02 -1.394774321933030074e-02 7.288388806489919797e-04 -1.811826730789670159e-02 3.205915781821130212e-02
312 9.619652164973699349e-02 -4.464163650698899782e-02 4.013996504107050084e-02 -5.731367096097819691e-02 4.521343735862710239e-02 6.068951800810880315e-02 -2.131101882750449997e-02 3.615391492152170150e-02 1.255315281338930007e-02 2.377494398854190089e-02
313 -7.453278554818210111e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -5.670610554934250001e-03 -2.083229983502719873e-02 -1.415296435958940044e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -3.845911230135379971e-02 -3.007244590430930078e-02
314 5.987113713954139715e-02 5.068011873981870252e-02 5.307370992764130074e-02 5.285819123858220142e-02 3.282986163481690228e-02 1.966706951368000014e-02 -1.026610541524320026e-02 3.430885887772629900e-02 5.520503808961670089e-02 -1.077697500466389974e-03
315 -2.367724723390840155e-02 -4.464163650698899782e-02 4.013996504107050084e-02 -1.255635194240680048e-02 -9.824676969418109224e-03 -1.000728964429089965e-03 -2.902829807069099918e-03 -2.592261998182820038e-03 -1.190068480150809939e-02 -3.835665973397880263e-02
316 9.015598825267629943e-03 -4.464163650698899782e-02 -2.021751109626000048e-02 -5.387080026724189868e-02 3.145390877661580209e-02 2.060651489904859884e-02 5.600337505832399948e-02 -3.949338287409189657e-02 -1.090443584737709956e-02 -1.077697500466389974e-03
317 1.628067572730669890e-02 5.068011873981870252e-02 1.427247526792889930e-02 1.215130832538269907e-03 1.182945896190920002e-03 -2.135537898074869878e-02 -3.235593223976569732e-02 3.430885887772629900e-02 7.496833602773420036e-02 4.034337164788070335e-02
318 1.991321417832630017e-02 -4.464163650698899782e-02 -3.422906805671169922e-02 5.515343848250200270e-02 6.722868308984519814e-02 7.415490186505870052e-02 -6.584467611156170040e-03 3.283281404268990206e-02 2.472532334280450050e-02 6.933812005172369786e-02
319 8.893144474769780483e-02 -4.464163650698899782e-02 6.727790750762559745e-03 2.531522568869210010e-02 3.007795591841460128e-02 8.706873351046409346e-03 6.336665066649820044e-02 -3.949338287409189657e-02 9.436409146079870192e-03 3.205915781821130212e-02
320 1.991321417832630017e-02 -4.464163650698899782e-02 4.572166603000769880e-03 4.597244985110970211e-02 -1.808039411862490120e-02 -5.454911593043910295e-02 6.336665066649820044e-02 -3.949338287409189657e-02 2.866072031380889965e-02 6.105390622205419948e-02
321 -2.367724723390840155e-02 -4.464163650698899782e-02 3.043965637614240091e-02 -5.670610554934250001e-03 8.236416453005759863e-02 9.200436418706199604e-02 -1.762938102341739949e-02 7.120997975363539678e-02 3.304707235493409972e-02 3.064409414368320182e-03
322 9.619652164973699349e-02 -4.464163650698899782e-02 5.199589785376040191e-02 7.925353333865589600e-02 5.484510736603499803e-02 3.657708645031480105e-02 -7.653558588881050062e-02 1.413221094178629955e-01 9.864637430492799453e-02 6.105390622205419948e-02
323 2.354575262934580082e-02 5.068011873981870252e-02 6.169620651868849837e-02 6.203917986997459916e-02 2.457414448561009990e-02 -3.607335668485669999e-02 -9.126213710515880539e-02 1.553445353507079962e-01 1.333957338374689994e-01 8.176444079622779970e-02
324 7.076875249260000666e-02 5.068011873981870252e-02 -7.283766209689159811e-03 4.941532054484590319e-02 6.034891879883950289e-02 -4.445362044113949918e-03 -5.444575906428809897e-02 1.081111006295440019e-01 1.290194116001679991e-01 5.691179930721949887e-02
325 3.081082953138499989e-02 -4.464163650698899782e-02 5.649978676881649634e-03 1.154374291374709975e-02 7.823630595545419397e-02 7.791268340653299818e-02 -4.340084565202689815e-02 1.081111006295440019e-01 6.604820616309839409e-02 1.963283707370720027e-02
326 -1.882016527791040067e-03 -4.464163650698899782e-02 5.415152200152219958e-02 -6.649465948908450663e-02 7.273249452264969606e-02 5.661858800484489973e-02 -4.340084565202689815e-02 8.486339447772170419e-02 8.449528221240310000e-02 4.862758547755009764e-02
327 4.534098333546320025e-02 5.068011873981870252e-02 -8.361578283570040432e-03 -3.321357610482440076e-02 -7.072771253015849857e-03 1.191310268097639903e-03 -3.971920784793980114e-02 3.430885887772629900e-02 2.993564839653250001e-02 2.791705090337660150e-02
328 7.440129094361959405e-02 -4.464163650698899782e-02 1.145089981388529993e-01 2.875809638242839833e-02 2.457414448561009990e-02 2.499059336410210108e-02 1.918699701745330000e-02 -2.592261998182820038e-03 -6.092541861022970299e-04 -5.219804415301099697e-03
329 -3.820740103798660192e-02 -4.464163650698899782e-02 6.708526688809300642e-02 -6.075654165471439799e-02 -2.908801698423390050e-02 -2.323426975148589965e-02 -1.026610541524320026e-02 -2.592261998182820038e-03 -1.498586820292070049e-03 1.963283707370720027e-02
330 -1.277963188084970010e-02 5.068011873981870252e-02 -5.578530953432969675e-02 -2.227739861197989939e-03 -2.771206412603280031e-02 -2.918409052548700047e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -1.705210460474350029e-02 4.448547856271539702e-02
331 9.015598825267629943e-03 5.068011873981870252e-02 3.043965637614240091e-02 4.252957915737339695e-02 -2.944912678412469915e-03 3.689023491210430272e-02 -6.549067247654929980e-02 7.120997975363539678e-02 -2.364455757213410059e-02 1.549073015887240078e-02
332 8.166636784565869944e-02 5.068011873981870252e-02 -2.560657146566450160e-02 -3.665644679856060184e-02 -7.036660273026780488e-02 -4.640725592391130305e-02 -3.971920784793980114e-02 -2.592261998182820038e-03 -4.118038518800790082e-02 -5.219804415301099697e-03
333 3.081082953138499989e-02 -4.464163650698899782e-02 1.048086894739250069e-01 7.695828609473599757e-02 -1.120062982761920074e-02 -1.133462820348369975e-02 -5.812739686837520292e-02 3.430885887772629900e-02 5.710418744784390155e-02 3.620126473304600273e-02
334 2.717829108036539862e-02 5.068011873981870252e-02 -6.205954135808240159e-03 2.875809638242839833e-02 -1.670444126042380101e-02 -1.627025888008149911e-03 -5.812739686837520292e-02 3.430885887772629900e-02 2.930041326858690010e-02 3.205915781821130212e-02
335 -6.000263174410389727e-02 5.068011873981870252e-02 -4.716281294328249912e-02 -2.288496402361559975e-02 -7.174255558846899528e-02 -5.768060054833450134e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -6.291294991625119570e-02 -5.492508739331759815e-02
336 5.383060374248070309e-03 -4.464163650698899782e-02 -4.824062501716339796e-02 -1.255635194240680048e-02 1.182945896190920002e-03 -6.637401276640669812e-03 6.336665066649820044e-02 -3.949338287409189657e-02 -5.140053526058249722e-02 -5.906719430815229877e-02
337 -2.004470878288880029e-02 -4.464163650698899782e-02 8.540807214406830050e-02 -3.665644679856060184e-02 9.199583453746550121e-02 8.949917649274570508e-02 -6.180903467246220279e-02 1.450122215054540087e-01 8.094791351127560153e-02 5.276969239238479825e-02
338 1.991321417832630017e-02 5.068011873981870252e-02 -1.267282657909369996e-02 7.007254470726349826e-02 -1.120062982761920074e-02 7.141131042098750048e-03 -3.971920784793980114e-02 3.430885887772629900e-02 5.384369968545729690e-03 3.064409414368320182e-03
339 -6.363517019512339445e-02 -4.464163650698899782e-02 -3.315125598283080038e-02 -3.321357610482440076e-02 1.182945896190920002e-03 2.405114797873349891e-02 -2.499265663159149983e-02 -2.592261998182820038e-03 -2.251217192966049885e-02 -5.906719430815229877e-02
340 2.717829108036539862e-02 -4.464163650698899782e-02 -7.283766209689159811e-03 -5.042792957350569760e-02 7.548440023905199359e-02 5.661858800484489973e-02 3.391354823380159783e-02 -2.592261998182820038e-03 4.344317225278129802e-02 1.549073015887240078e-02
341 -1.641217033186929963e-02 -4.464163650698899782e-02 -1.375063865297449991e-02 1.320442171945160059e-01 -9.824676969418109224e-03 -3.819065120534880214e-03 1.918699701745330000e-02 -3.949338287409189657e-02 -3.581672810154919867e-02 -3.007244590430930078e-02
342 3.081082953138499989e-02 5.068011873981870252e-02 5.954058237092670069e-02 5.630106193231849965e-02 -2.220825269322829892e-02 1.191310268097639903e-03 -3.235593223976569732e-02 -2.592261998182820038e-03 -2.479118743246069845e-02 -1.764612515980519894e-02
343 5.623859868852180283e-02 5.068011873981870252e-02 2.181715978509519982e-02 5.630106193231849965e-02 -7.072771253015849857e-03 1.810132720473240156e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -2.364455757213410059e-02 2.377494398854190089e-02
344 -2.004470878288880029e-02 -4.464163650698899782e-02 1.858372356345249984e-02 9.072976886968099619e-02 3.934851612593179802e-03 8.706873351046409346e-03 3.759518603788870178e-02 -3.949338287409189657e-02 -5.780006567561250114e-02 7.206516329203029904e-03
345 -1.072256316073579990e-01 -4.464163650698899782e-02 -1.159501450521270051e-02 -4.009931749229690007e-02 4.934129593323050011e-02 6.444729954958319795e-02 -1.394774321933030074e-02 3.430885887772629900e-02 7.026862549151949647e-03 -3.007244590430930078e-02
346 8.166636784565869944e-02 5.068011873981870252e-02 -2.972517914165530208e-03 -3.321357610482440076e-02 4.246153164222479792e-02 5.787118185200299664e-02 -1.026610541524320026e-02 3.430885887772629900e-02 -6.092541861022970299e-04 -1.077697500466389974e-03
347 5.383060374248070309e-03 5.068011873981870252e-02 1.750591148957160101e-02 3.220096707616459941e-02 1.277706088506949944e-01 1.273901403692790091e-01 -2.131101882750449997e-02 7.120997975363539678e-02 6.257518145805600340e-02 1.549073015887240078e-02
348 3.807590643342410180e-02 5.068011873981870252e-02 -2.991781976118810041e-02 -7.452802442965950069e-02 -1.257658268582039982e-02 -1.258722205064180012e-02 4.460445801105040325e-03 -2.592261998182820038e-03 3.711738233435969789e-03 -3.007244590430930078e-02
349 3.081082953138499989e-02 -4.464163650698899782e-02 -2.021751109626000048e-02 -5.670610554934250001e-03 -4.320865536613589623e-03 -2.949723898727649868e-02 7.809320188284639419e-02 -3.949338287409189657e-02 -1.090443584737709956e-02 -1.077697500466389974e-03
350 1.750521923228520000e-03 5.068011873981870252e-02 -5.794093368209150136e-02 -4.354218818603310115e-02 -9.650970703608929835e-02 -4.703355284749029946e-02 -9.862541271333299941e-02 3.430885887772629900e-02 -6.117659509433449883e-02 -7.149351505265640061e-02
351 -2.730978568492789874e-02 5.068011873981870252e-02 6.061839444480759953e-02 1.079441223383619947e-01 1.219056876180000040e-02 -1.759759743927430051e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 7.021129819331020649e-02 1.356118306890790048e-01
352 -8.543040090124079389e-02 5.068011873981870252e-02 -4.069594049999709917e-02 -3.321357610482440076e-02 -8.137422559587689785e-02 -6.958024209633670298e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -5.780006567561250114e-02 -4.249876664881350324e-02
353 1.264813727628719998e-02 5.068011873981870252e-02 -7.195249064254319316e-02 -4.698505887976939938e-02 -5.110326271545199972e-02 -9.713730673381550107e-02 1.185912177278039964e-01 -7.639450375000099436e-02 -2.028874775162960165e-02 -3.835665973397880263e-02
354 -5.273755484206479882e-02 -4.464163650698899782e-02 -5.578530953432969675e-02 -3.665644679856060184e-02 8.924392882106320368e-02 -3.192768196955810076e-03 8.142083605192099172e-03 3.430885887772629900e-02 1.323726493386760128e-01 3.064409414368320182e-03
355 -2.367724723390840155e-02 5.068011873981870252e-02 4.552902541047500196e-02 2.187235499495579841e-02 1.098832216940800049e-01 8.887287956916670173e-02 7.788079970179680352e-04 3.430885887772629900e-02 7.419253669003070262e-02 6.105390622205419948e-02
356 -7.453278554818210111e-02 5.068011873981870252e-02 -9.439390357450949676e-03 1.498661360748330083e-02 -3.734373413344069942e-02 -2.166852744253820046e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 -3.324878724762579674e-02 1.134862324403770016e-02
357 -5.514554978810590376e-03 5.068011873981870252e-02 -3.315125598283080038e-02 -1.599922263614299983e-02 8.062710187196569719e-03 1.622243643399520069e-02 1.550535921336619952e-02 -2.592261998182820038e-03 -2.832024254799870092e-02 -7.563562196749110123e-02
358 -6.000263174410389727e-02 5.068011873981870252e-02 4.984027370599859730e-02 1.842948430121960079e-02 -1.670444126042380101e-02 -3.012353591085559917e-02 -1.762938102341739949e-02 -2.592261998182820038e-03 4.976865992074899769e-02 -5.906719430815229877e-02
359 -2.004470878288880029e-02 -4.464163650698899782e-02 -8.488623552911400694e-02 -2.632783471735180084e-02 -3.596778127523959923e-02 -3.419446591411950259e-02 4.127682384197570165e-02 -5.167075276314189725e-02 -8.238148325810279449e-02 -4.664087356364819692e-02
360 3.807590643342410180e-02 5.068011873981870252e-02 5.649978676881649634e-03 3.220096707616459941e-02 6.686757328995440036e-03 1.747503028115330106e-02 -2.499265663159149983e-02 3.430885887772629900e-02 1.482271084126630077e-02 6.105390622205419948e-02
361 1.628067572730669890e-02 -4.464163650698899782e-02 2.073934771121430098e-02 2.187235499495579841e-02 -1.395253554402150001e-02 -1.321351897422090062e-02 -6.584467611156170040e-03 -2.592261998182820038e-03 1.331596790892770020e-02 4.034337164788070335e-02
362 4.170844488444359899e-02 -4.464163650698899782e-02 -7.283766209689159811e-03 2.875809638242839833e-02 -4.284754556624519733e-02 -4.828614669464850045e-02 5.232173725423699961e-02 -7.639450375000099436e-02 -7.212845460195599356e-02 2.377494398854190089e-02
363 1.991321417832630017e-02 5.068011873981870252e-02 1.048086894739250069e-01 7.007254470726349826e-02 -3.596778127523959923e-02 -2.667890283117069911e-02 -2.499265663159149983e-02 -2.592261998182820038e-03 3.711738233435969789e-03 4.034337164788070335e-02
364 -4.910501639104519755e-02 5.068011873981870252e-02 -2.452875939178359929e-02 6.750727943574620551e-05 -4.697540414084860200e-02 -2.824464514011839830e-02 -6.549067247654929980e-02 2.840467953758080144e-02 1.919903307856710151e-02 1.134862324403770016e-02
365 1.750521923228520000e-03 5.068011873981870252e-02 -6.205954135808240159e-03 -1.944209332987930153e-02 -9.824676969418109224e-03 4.949091809572019746e-03 -3.971920784793980114e-02 3.430885887772629900e-02 1.482271084126630077e-02 9.833286845556660216e-02
366 3.444336798240450054e-02 -4.464163650698899782e-02 -3.854031635223530150e-02 -1.255635194240680048e-02 9.438663045397699403e-03 5.262240271361550044e-03 -6.584467611156170040e-03 -2.592261998182820038e-03 3.119299070280229930e-02 9.833286845556660216e-02
367 -4.547247794002570037e-02 5.068011873981870252e-02 1.371430516903520136e-01 -1.599922263614299983e-02 4.108557878402369773e-02 3.187985952347179713e-02 -4.340084565202689815e-02 7.120997975363539678e-02 7.102157794598219775e-02 4.862758547755009764e-02
368 -9.147093429830140468e-03 5.068011873981870252e-02 1.705552259806600024e-01 1.498661360748330083e-02 3.007795591841460128e-02 3.375875029420900147e-02 -2.131101882750449997e-02 3.430885887772629900e-02 3.365681290238470291e-02 3.205915781821130212e-02
369 -1.641217033186929963e-02 5.068011873981870252e-02 2.416542455238970041e-03 1.498661360748330083e-02 2.182223876920789951e-02 -1.008203435632550049e-02 -2.499265663159149983e-02 3.430885887772629900e-02 8.553312118743899850e-02 8.176444079622779970e-02
370 -9.147093429830140468e-03 -4.464163650698899782e-02 3.798434089330870317e-02 -4.009931749229690007e-02 -2.496015840963049931e-02 -3.819065120534880214e-03 -4.340084565202689815e-02 1.585829843977170153e-02 -5.145307980263110273e-03 2.791705090337660150e-02
371 1.991321417832630017e-02 -4.464163650698899782e-02 -5.794093368209150136e-02 -5.731367096097819691e-02 -1.568959820211340015e-03 -1.258722205064180012e-02 7.441156407875940126e-02 -3.949338287409189657e-02 -6.117659509433449883e-02 -7.563562196749110123e-02
372 5.260606023750229870e-02 5.068011873981870252e-02 -9.439390357450949676e-03 4.941532054484590319e-02 5.071724879143160031e-02 -1.916333974822199970e-02 -1.394774321933030074e-02 3.430885887772629900e-02 1.193439942037869961e-01 -1.764612515980519894e-02
373 -2.730978568492789874e-02 5.068011873981870252e-02 -2.345094731790270046e-02 -1.599922263614299983e-02 1.356652162000110060e-02 1.277780335431030062e-02 2.655027262562750096e-02 -2.592261998182820038e-03 -1.090443584737709956e-02 -2.178823207463989955e-02
374 -7.453278554818210111e-02 -4.464163650698899782e-02 -1.051720243133190055e-02 -5.670610554934250001e-03 -6.623874415566440021e-02 -5.705430362475540085e-02 -2.902829807069099918e-03 -3.949338287409189657e-02 -4.257210492279420166e-02 -1.077697500466389974e-03
375 -1.072256316073579990e-01 -4.464163650698899782e-02 -3.422906805671169922e-02 -6.764228304218700139e-02 -6.348683843926219983e-02 -7.051968748170529822e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -6.092541861022970299e-04 -7.977772888232589898e-02
376 4.534098333546320025e-02 5.068011873981870252e-02 -2.972517914165530208e-03 1.079441223383619947e-01 3.558176735121919981e-02 2.248540566978590033e-02 2.655027262562750096e-02 -2.592261998182820038e-03 2.801650652326400162e-02 1.963283707370720027e-02
377 -1.882016527791040067e-03 -4.464163650698899782e-02 6.816307896197400240e-02 -5.670610554934250001e-03 1.195148917014880047e-01 1.302084765253850029e-01 -2.499265663159149983e-02 8.670845052151719690e-02 4.613233103941480340e-02 -1.077697500466389974e-03
378 1.991321417832630017e-02 5.068011873981870252e-02 9.961226972405269262e-03 1.842948430121960079e-02 1.494247447820220079e-02 4.471894645684260094e-02 -6.180903467246220279e-02 7.120997975363539678e-02 9.436409146079870192e-03 -6.320930122298699938e-02
379 1.628067572730669890e-02 5.068011873981870252e-02 2.416542455238970041e-03 -5.670610554934250001e-03 -5.696818394814720174e-03 1.089891258357309975e-02 -5.076412126020100196e-02 3.430885887772629900e-02 2.269202256674450122e-02 -3.835665973397880263e-02
380 -1.882016527791040067e-03 -4.464163650698899782e-02 -3.854031635223530150e-02 2.187235499495579841e-02 -1.088932827598989989e-01 -1.156130659793979942e-01 2.286863482154040048e-02 -7.639450375000099436e-02 -4.687948284421659950e-02 2.377494398854190089e-02
381 1.628067572730669890e-02 -4.464163650698899782e-02 2.612840808061879863e-02 5.859630917623830093e-02 -6.073493272285990230e-02 -4.421521669138449989e-02 -1.394774321933030074e-02 -3.395821474270550172e-02 -5.140053526058249722e-02 -2.593033898947460017e-02
382 -7.090024709716259699e-02 5.068011873981870252e-02 -8.919748382463760228e-02 -7.452802442965950069e-02 -4.284754556624519733e-02 -2.573945744580210040e-02 -3.235593223976569732e-02 -2.592261998182820038e-03 -1.290794225416879923e-02 -5.492508739331759815e-02
383 4.897352178648269744e-02 -4.464163650698899782e-02 6.061839444480759953e-02 -2.288496402361559975e-02 -2.358420555142939912e-02 -7.271172671423199729e-02 -4.340084565202689815e-02 -2.592261998182820038e-03 1.041376113589790042e-01 3.620126473304600273e-02
384 5.383060374248070309e-03 5.068011873981870252e-02 -2.884000768730720157e-02 -9.113481248670509197e-03 -3.183992270063620150e-02 -2.887094206369749880e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -1.811826730789670159e-02 7.206516329203029904e-03
385 3.444336798240450054e-02 5.068011873981870252e-02 -2.991781976118810041e-02 4.658001526274530187e-03 9.337178739566659447e-02 8.699398879842949739e-02 3.391354823380159783e-02 -2.592261998182820038e-03 2.405258322689299982e-02 -3.835665973397880263e-02
386 2.354575262934580082e-02 5.068011873981870252e-02 -1.913969902237900103e-02 4.941532054484590319e-02 -6.348683843926219983e-02 -6.112523362801929733e-02 4.460445801105040325e-03 -3.949338287409189657e-02 -2.595242443518940012e-02 -1.350401824497050006e-02
387 1.991321417832630017e-02 -4.464163650698899782e-02 -4.069594049999709917e-02 -1.599922263614299983e-02 -8.448724111216979540e-03 -1.759759743927430051e-02 5.232173725423699961e-02 -3.949338287409189657e-02 -3.075120986455629965e-02 3.064409414368320182e-03
388 -4.547247794002570037e-02 -4.464163650698899782e-02 1.535028734180979987e-02 -7.452802442965950069e-02 -4.972730985725089953e-02 -1.728444897748479883e-02 -2.867429443567860031e-02 -2.592261998182820038e-03 -1.043648208321659998e-01 -7.563562196749110123e-02
389 5.260606023750229870e-02 5.068011873981870252e-02 -2.452875939178359929e-02 5.630106193231849965e-02 -7.072771253015849857e-03 -5.071658967693000106e-03 -2.131101882750449997e-02 -2.592261998182820038e-03 2.671425763351279944e-02 -3.835665973397880263e-02
390 -5.514554978810590376e-03 5.068011873981870252e-02 1.338730381358059929e-03 -8.485663651086830517e-02 -1.120062982761920074e-02 -1.665815205390569834e-02 4.864009945014990260e-02 -3.949338287409189657e-02 -4.118038518800790082e-02 -8.806194271199530021e-02
391 9.015598825267629943e-03 5.068011873981870252e-02 6.924089103585480409e-02 5.974393262605470073e-02 1.769438019460449832e-02 -2.323426975148589965e-02 -4.708248345611389801e-02 3.430885887772629900e-02 1.032922649115240038e-01 7.348022696655839847e-02
392 -2.367724723390840155e-02 -4.464163650698899782e-02 -6.979686649478139548e-02 -6.419941234845069622e-02 -5.935897986465880211e-02 -5.047818592717519953e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -8.913686007934769340e-02 -5.078298047848289754e-02
393 -4.183993948900609910e-02 5.068011873981870252e-02 -2.991781976118810041e-02 -2.227739861197989939e-03 2.182223876920789951e-02 3.657708645031480105e-02 1.182372140927919965e-02 -2.592261998182820038e-03 -4.118038518800790082e-02 6.519601313688899724e-02
394 -7.453278554818210111e-02 -4.464163650698899782e-02 -4.608500086940160029e-02 -4.354218818603310115e-02 -2.908801698423390050e-02 -2.323426975148589965e-02 1.550535921336619952e-02 -3.949338287409189657e-02 -3.980959436433750137e-02 -2.178823207463989955e-02
395 3.444336798240450054e-02 -4.464163650698899782e-02 1.858372356345249984e-02 5.630106193231849965e-02 1.219056876180000040e-02 -5.454911593043910295e-02 -6.917231028063640375e-02 7.120997975363539678e-02 1.300806095217529879e-01 7.206516329203029904e-03
396 -6.000263174410389727e-02 -4.464163650698899782e-02 1.338730381358059929e-03 -2.977070541108809906e-02 -7.072771253015849857e-03 -2.166852744253820046e-02 1.182372140927919965e-02 -2.592261998182820038e-03 3.181521750079859684e-02 -5.492508739331759815e-02
397 -8.543040090124079389e-02 5.068011873981870252e-02 -3.099563183506899924e-02 -2.288496402361559975e-02 -6.348683843926219983e-02 -5.423596746864960128e-02 1.918699701745330000e-02 -3.949338287409189657e-02 -9.643322289178400675e-02 -3.421455281914410201e-02
398 5.260606023750229870e-02 -4.464163650698899782e-02 -4.050329988046450294e-03 -3.091832896419060075e-02 -4.697540414084860200e-02 -5.830689747191349775e-02 -1.394774321933030074e-02 -2.583996815000549896e-02 3.605579008983190309e-02 2.377494398854190089e-02
399 1.264813727628719998e-02 -4.464163650698899782e-02 1.535028734180979987e-02 -3.321357610482440076e-02 4.108557878402369773e-02 3.219300798526129881e-02 -2.902829807069099918e-03 -2.592261998182820038e-03 4.506616833626150148e-02 -6.735140813782170000e-02
400 5.987113713954139715e-02 5.068011873981870252e-02 2.289497185897609866e-02 4.941532054484590319e-02 1.631842733640340160e-02 1.183835796894170019e-02 -1.394774321933030074e-02 -2.592261998182820038e-03 3.953987807202419963e-02 1.963283707370720027e-02
401 -2.367724723390840155e-02 -4.464163650698899782e-02 4.552902541047500196e-02 9.072976886968099619e-02 -1.808039411862490120e-02 -3.544705976127759950e-02 7.072992627467229731e-02 -3.949338287409189657e-02 -3.452371533034950118e-02 -9.361911330135799444e-03
402 1.628067572730669890e-02 -4.464163650698899782e-02 -4.500718879552070145e-02 -5.731367096097819691e-02 -3.459182841703849903e-02 -5.392281900686000246e-02 7.441156407875940126e-02 -7.639450375000099436e-02 -4.257210492279420166e-02 4.034337164788070335e-02
403 1.107266754538149961e-01 5.068011873981870252e-02 -3.315125598283080038e-02 -2.288496402361559975e-02 -4.320865536613589623e-03 2.029336643725910064e-02 -6.180903467246220279e-02 7.120997975363539678e-02 1.556684454070180086e-02 4.448547856271539702e-02
404 -2.004470878288880029e-02 -4.464163650698899782e-02 9.726400495675820157e-02 -5.670610554934250001e-03 -5.696818394814720174e-03 -2.386056667506489953e-02 -2.131101882750449997e-02 -2.592261998182820038e-03 6.168584882386619894e-02 4.034337164788070335e-02
405 -1.641217033186929963e-02 -4.464163650698899782e-02 5.415152200152219958e-02 7.007254470726349826e-02 -3.321587555883730170e-02 -2.793149667832890010e-02 8.142083605192099172e-03 -3.949338287409189657e-02 -2.712864555432650121e-02 -9.361911330135799444e-03
406 4.897352178648269744e-02 5.068011873981870252e-02 1.231314947298999957e-01 8.384402748220859403e-02 -1.047654241852959967e-01 -1.008950882752900069e-01 -6.917231028063640375e-02 -2.592261998182820038e-03 3.664579779339879884e-02 -3.007244590430930078e-02
407 -5.637009329308430294e-02 -4.464163650698899782e-02 -8.057498723359039772e-02 -8.485663651086830517e-02 -3.734373413344069942e-02 -3.701280207022530216e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -5.615757309500619965e-02 -1.377672256900120129e-01
408 2.717829108036539862e-02 -4.464163650698899782e-02 9.295275666123460623e-02 -5.272317671413939699e-02 8.062710187196569719e-03 3.970857106821010230e-02 -2.867429443567860031e-02 2.102445536239900062e-02 -4.836172480289190057e-02 1.963283707370720027e-02
409 6.350367559056099842e-02 -4.464163650698899782e-02 -5.039624916492520257e-02 1.079441223383619947e-01 3.145390877661580209e-02 1.935392105189049847e-02 -1.762938102341739949e-02 2.360753382371260159e-02 5.803912766389510147e-02 4.034337164788070335e-02
410 -5.273755484206479882e-02 5.068011873981870252e-02 -1.159501450521270051e-02 5.630106193231849965e-02 5.622106022423609822e-02 7.290230801790049953e-02 -3.971920784793980114e-02 7.120997975363539678e-02 3.056648739841480097e-02 -5.219804415301099697e-03
411 -9.147093429830140468e-03 5.068011873981870252e-02 -2.776219561342629927e-02 8.100872220010799790e-03 4.796534307502930278e-02 3.720338337389379746e-02 -2.867429443567860031e-02 3.430885887772629900e-02 6.604820616309839409e-02 -4.249876664881350324e-02
412 5.383060374248070309e-03 -4.464163650698899782e-02 5.846277029704580186e-02 -4.354218818603310115e-02 -7.311850844667000526e-02 -7.239857825244250256e-02 1.918699701745330000e-02 -7.639450375000099436e-02 -5.140053526058249722e-02 -2.593033898947460017e-02
413 7.440129094361959405e-02 -4.464163650698899782e-02 8.540807214406830050e-02 6.318680331979099896e-02 1.494247447820220079e-02 1.309095181609989944e-02 1.550535921336619952e-02 -2.592261998182820038e-03 6.209315616505399656e-03 8.590654771106250032e-02
414 -5.273755484206479882e-02 -4.464163650698899782e-02 -8.168937664037369826e-04 -2.632783471735180084e-02 1.081461590359879960e-02 7.141131042098750048e-03 4.864009945014990260e-02 -3.949338287409189657e-02 -3.581672810154919867e-02 1.963283707370720027e-02
415 8.166636784565869944e-02 5.068011873981870252e-02 6.727790750762559745e-03 -4.522987001831730094e-03 1.098832216940800049e-01 1.170562411302250028e-01 -3.235593223976569732e-02 9.187460744414439884e-02 5.472400334817909689e-02 7.206516329203029904e-03
416 -5.514554978810590376e-03 -4.464163650698899782e-02 8.883414898524360018e-03 -5.042792957350569760e-02 2.595009734381130070e-02 4.722413415115889884e-02 -4.340084565202689815e-02 7.120997975363539678e-02 1.482271084126630077e-02 3.064409414368320182e-03
417 -2.730978568492789874e-02 -4.464163650698899782e-02 8.001901177466380632e-02 9.876313370696999938e-02 -2.944912678412469915e-03 1.810132720473240156e-02 -1.762938102341739949e-02 3.311917341962639788e-03 -2.952762274177360077e-02 3.620126473304600273e-02
418 -5.273755484206479882e-02 -4.464163650698899782e-02 7.139651518361660176e-02 -7.452802442965950069e-02 -1.532848840222260020e-02 -1.313877426218630021e-03 4.460445801105040325e-03 -2.141183364489639834e-02 -4.687948284421659950e-02 3.064409414368320182e-03
419 9.015598825267629943e-03 -4.464163650698899782e-02 -2.452875939178359929e-02 -2.632783471735180084e-02 9.887559882847110626e-02 9.419640341958869512e-02 7.072992627467229731e-02 -2.592261998182820038e-03 -2.139368094035999993e-02 7.206516329203029904e-03
420 -2.004470878288880029e-02 -4.464163650698899782e-02 -5.470749746044879791e-02 -5.387080026724189868e-02 -6.623874415566440021e-02 -5.736745208654490252e-02 1.182372140927919965e-02 -3.949338287409189657e-02 -7.408887149153539631e-02 -5.219804415301099697e-03
421 2.354575262934580082e-02 -4.464163650698899782e-02 -3.638469220447349689e-02 6.750727943574620551e-05 1.182945896190920002e-03 3.469819567957759671e-02 -4.340084565202689815e-02 3.430885887772629900e-02 -3.324878724762579674e-02 6.105390622205419948e-02
422 3.807590643342410180e-02 5.068011873981870252e-02 1.642809941569069870e-02 2.187235499495579841e-02 3.970962592582259754e-02 4.503209491863210262e-02 -4.340084565202689815e-02 7.120997975363539678e-02 4.976865992074899769e-02 1.549073015887240078e-02
423 -7.816532399920170238e-02 5.068011873981870252e-02 7.786338762690199478e-02 5.285819123858220142e-02 7.823630595545419397e-02 6.444729954958319795e-02 2.655027262562750096e-02 -2.592261998182820038e-03 4.067226371449769728e-02 -9.361911330135799444e-03
424 9.015598825267629943e-03 5.068011873981870252e-02 -3.961812842611620034e-02 2.875809638242839833e-02 3.833367306762140020e-02 7.352860494147960002e-02 -7.285394808472339667e-02 1.081111006295440019e-01 1.556684454070180086e-02 -4.664087356364819692e-02
425 1.750521923228520000e-03 5.068011873981870252e-02 1.103903904628619932e-02 -1.944209332987930153e-02 -1.670444126042380101e-02 -3.819065120534880214e-03 -4.708248345611389801e-02 3.430885887772629900e-02 2.405258322689299982e-02 2.377494398854190089e-02
426 -7.816532399920170238e-02 -4.464163650698899782e-02 -4.069594049999709917e-02 -8.141376581713200000e-02 -1.006375656106929944e-01 -1.127947298232920004e-01 2.286863482154040048e-02 -7.639450375000099436e-02 -2.028874775162960165e-02 -5.078298047848289754e-02
427 3.081082953138499989e-02 5.068011873981870252e-02 -3.422906805671169922e-02 4.367720260718979675e-02 5.759701308243719842e-02 6.883137801463659611e-02 -3.235593223976569732e-02 5.755656502954899917e-02 3.546193866076970125e-02 8.590654771106250032e-02
428 -3.457486258696700065e-02 5.068011873981870252e-02 5.649978676881649634e-03 -5.670610554934250001e-03 -7.311850844667000526e-02 -6.269097593696699999e-02 -6.584467611156170040e-03 -3.949338287409189657e-02 -4.542095777704099890e-02 3.205915781821130212e-02
429 4.897352178648269744e-02 5.068011873981870252e-02 8.864150836571099701e-02 8.728689817594480205e-02 3.558176735121919981e-02 2.154596028441720101e-02 -2.499265663159149983e-02 3.430885887772629900e-02 6.604820616309839409e-02 1.314697237742440128e-01
430 -4.183993948900609910e-02 -4.464163650698899782e-02 -3.315125598283080038e-02 -2.288496402361559975e-02 4.658939021682820258e-02 4.158746183894729970e-02 5.600337505832399948e-02 -2.473293452372829840e-02 -2.595242443518940012e-02 -3.835665973397880263e-02
431 -9.147093429830140468e-03 -4.464163650698899782e-02 -5.686312160821060252e-02 -5.042792957350569760e-02 2.182223876920789951e-02 4.534524338042170144e-02 -2.867429443567860031e-02 3.430885887772629900e-02 -9.918957363154769225e-03 -1.764612515980519894e-02
432 7.076875249260000666e-02 5.068011873981870252e-02 -3.099563183506899924e-02 2.187235499495579841e-02 -3.734373413344069942e-02 -4.703355284749029946e-02 3.391354823380159783e-02 -3.949338287409189657e-02 -1.495647502491130078e-02 -1.077697500466389974e-03
433 9.015598825267629943e-03 -4.464163650698899782e-02 5.522933407540309841e-02 -5.670610554934250001e-03 5.759701308243719842e-02 4.471894645684260094e-02 -2.902829807069099918e-03 2.323852261495349888e-02 5.568354770267369691e-02 1.066170822852360034e-01
434 -2.730978568492789874e-02 -4.464163650698899782e-02 -6.009655782985329903e-02 -2.977070541108809906e-02 4.658939021682820258e-02 1.998021797546959896e-02 1.222728555318910032e-01 -3.949338287409189657e-02 -5.140053526058249722e-02 -9.361911330135799444e-03
435 1.628067572730669890e-02 -4.464163650698899782e-02 1.338730381358059929e-03 8.100872220010799790e-03 5.310804470794310353e-03 1.089891258357309975e-02 3.023191042971450082e-02 -3.949338287409189657e-02 -4.542095777704099890e-02 3.205915781821130212e-02
436 -1.277963188084970010e-02 -4.464163650698899782e-02 -2.345094731790270046e-02 -4.009931749229690007e-02 -1.670444126042380101e-02 4.635943347782499856e-03 -1.762938102341739949e-02 -2.592261998182820038e-03 -3.845911230135379971e-02 -3.835665973397880263e-02
437 -5.637009329308430294e-02 -4.464163650698899782e-02 -7.410811479030500470e-02 -5.042792957350569760e-02 -2.496015840963049931e-02 -4.703355284749029946e-02 9.281975309919469896e-02 -7.639450375000099436e-02 -6.117659509433449883e-02 -4.664087356364819692e-02
438 4.170844488444359899e-02 5.068011873981870252e-02 1.966153563733339868e-02 5.974393262605470073e-02 -5.696818394814720174e-03 -2.566471273376759888e-03 -2.867429443567860031e-02 -2.592261998182820038e-03 3.119299070280229930e-02 7.206516329203029904e-03
439 -5.514554978810590376e-03 5.068011873981870252e-02 -1.590626280073640167e-02 -6.764228304218700139e-02 4.934129593323050011e-02 7.916527725369119917e-02 -2.867429443567860031e-02 3.430885887772629900e-02 -1.811826730789670159e-02 4.448547856271539702e-02
440 4.170844488444359899e-02 5.068011873981870252e-02 -1.590626280073640167e-02 1.728186074811709910e-02 -3.734373413344069942e-02 -1.383981589779990050e-02 -2.499265663159149983e-02 -1.107951979964190078e-02 -4.687948284421659950e-02 1.549073015887240078e-02
441 -4.547247794002570037e-02 -4.464163650698899782e-02 3.906215296718960200e-02 1.215130832538269907e-03 1.631842733640340160e-02 1.528299104862660025e-02 -2.867429443567860031e-02 2.655962349378539894e-02 4.452837402140529671e-02 -2.593033898947460017e-02
442 -4.547247794002570037e-02 -4.464163650698899782e-02 -7.303030271642410587e-02 -8.141376581713200000e-02 8.374011738825870577e-02 2.780892952020790065e-02 1.738157847891100005e-01 -3.949338287409189657e-02 -4.219859706946029777e-03 3.064409414368320182e-03
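The 442 rows of 10 standardized feature columns above, together with the 442-line single-column target file that follows, appear to match the scikit-learn diabetes regression dataset written out with numpy's default "%.18e" text format (the leading integer on each feature row above appears to be the diff view's line number rather than part of the data file). A minimal sketch of how such files could be produced and read back, assuming that provenance; the file names used here are illustrative, not taken from this diff:

import numpy as np
from sklearn.datasets import load_diabetes

# Assumption: the diff shows the diabetes dataset's scaled features and targets.
X, y = load_diabetes(return_X_y=True)   # X: (442, 10), y: (442,)

# np.savetxt writes "%.18e" by default, matching the formatting shown above.
np.savetxt("diabetes_features.txt", X)  # hypothetical file name
np.savetxt("diabetes_labels.txt", y)    # hypothetical file name

# Round-trip check when loading the files back.
assert np.allclose(np.loadtxt("diabetes_features.txt"), X)
assert np.allclose(np.loadtxt("diabetes_labels.txt"), y)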


@@ -1,442 +0,0 @@
1.510000000000000000e+02
7.500000000000000000e+01
1.410000000000000000e+02
2.060000000000000000e+02
1.350000000000000000e+02
9.700000000000000000e+01
1.380000000000000000e+02
6.300000000000000000e+01
1.100000000000000000e+02
3.100000000000000000e+02
1.010000000000000000e+02
6.900000000000000000e+01
1.790000000000000000e+02
1.850000000000000000e+02
1.180000000000000000e+02
1.710000000000000000e+02
1.660000000000000000e+02
1.440000000000000000e+02
9.700000000000000000e+01
1.680000000000000000e+02
6.800000000000000000e+01
4.900000000000000000e+01
6.800000000000000000e+01
2.450000000000000000e+02
1.840000000000000000e+02
2.020000000000000000e+02
1.370000000000000000e+02
8.500000000000000000e+01
1.310000000000000000e+02
2.830000000000000000e+02
1.290000000000000000e+02
5.900000000000000000e+01
3.410000000000000000e+02
8.700000000000000000e+01
6.500000000000000000e+01
1.020000000000000000e+02
2.650000000000000000e+02
2.760000000000000000e+02
2.520000000000000000e+02
9.000000000000000000e+01
1.000000000000000000e+02
5.500000000000000000e+01
6.100000000000000000e+01
9.200000000000000000e+01
2.590000000000000000e+02
5.300000000000000000e+01
1.900000000000000000e+02
1.420000000000000000e+02
7.500000000000000000e+01
1.420000000000000000e+02
1.550000000000000000e+02
2.250000000000000000e+02
5.900000000000000000e+01
1.040000000000000000e+02
1.820000000000000000e+02
1.280000000000000000e+02
5.200000000000000000e+01
3.700000000000000000e+01
1.700000000000000000e+02
1.700000000000000000e+02
6.100000000000000000e+01
1.440000000000000000e+02
5.200000000000000000e+01
1.280000000000000000e+02
7.100000000000000000e+01
1.630000000000000000e+02
1.500000000000000000e+02
9.700000000000000000e+01
1.600000000000000000e+02
1.780000000000000000e+02
4.800000000000000000e+01
2.700000000000000000e+02
2.020000000000000000e+02
1.110000000000000000e+02
8.500000000000000000e+01
4.200000000000000000e+01
1.700000000000000000e+02
2.000000000000000000e+02
2.520000000000000000e+02
1.130000000000000000e+02
1.430000000000000000e+02
5.100000000000000000e+01
5.200000000000000000e+01
2.100000000000000000e+02
6.500000000000000000e+01
1.410000000000000000e+02
5.500000000000000000e+01
1.340000000000000000e+02
4.200000000000000000e+01
1.110000000000000000e+02
9.800000000000000000e+01
1.640000000000000000e+02
4.800000000000000000e+01
9.600000000000000000e+01
9.000000000000000000e+01
1.620000000000000000e+02
1.500000000000000000e+02
2.790000000000000000e+02
9.200000000000000000e+01
8.300000000000000000e+01
1.280000000000000000e+02
1.020000000000000000e+02
3.020000000000000000e+02
1.980000000000000000e+02
9.500000000000000000e+01
5.300000000000000000e+01
1.340000000000000000e+02
1.440000000000000000e+02
2.320000000000000000e+02
8.100000000000000000e+01
1.040000000000000000e+02
5.900000000000000000e+01
2.460000000000000000e+02
2.970000000000000000e+02
2.580000000000000000e+02
2.290000000000000000e+02
2.750000000000000000e+02
2.810000000000000000e+02
1.790000000000000000e+02
2.000000000000000000e+02
2.000000000000000000e+02
1.730000000000000000e+02
1.800000000000000000e+02
8.400000000000000000e+01
1.210000000000000000e+02
1.610000000000000000e+02
9.900000000000000000e+01
1.090000000000000000e+02
1.150000000000000000e+02
2.680000000000000000e+02
2.740000000000000000e+02
1.580000000000000000e+02
1.070000000000000000e+02
8.300000000000000000e+01
1.030000000000000000e+02
2.720000000000000000e+02
8.500000000000000000e+01
2.800000000000000000e+02
3.360000000000000000e+02
2.810000000000000000e+02
1.180000000000000000e+02
3.170000000000000000e+02
2.350000000000000000e+02
6.000000000000000000e+01
1.740000000000000000e+02
2.590000000000000000e+02
1.780000000000000000e+02
1.280000000000000000e+02
9.600000000000000000e+01
1.260000000000000000e+02
2.880000000000000000e+02
8.800000000000000000e+01
2.920000000000000000e+02
7.100000000000000000e+01
1.970000000000000000e+02
1.860000000000000000e+02
2.500000000000000000e+01
8.400000000000000000e+01
9.600000000000000000e+01
1.950000000000000000e+02
5.300000000000000000e+01
2.170000000000000000e+02
1.720000000000000000e+02
1.310000000000000000e+02
2.140000000000000000e+02
5.900000000000000000e+01
7.000000000000000000e+01
2.200000000000000000e+02
2.680000000000000000e+02
1.520000000000000000e+02
4.700000000000000000e+01
7.400000000000000000e+01
2.950000000000000000e+02
1.010000000000000000e+02
1.510000000000000000e+02
1.270000000000000000e+02
2.370000000000000000e+02
2.250000000000000000e+02
8.100000000000000000e+01
1.510000000000000000e+02
1.070000000000000000e+02
6.400000000000000000e+01
1.380000000000000000e+02
1.850000000000000000e+02
2.650000000000000000e+02
1.010000000000000000e+02
1.370000000000000000e+02
1.430000000000000000e+02
1.410000000000000000e+02
7.900000000000000000e+01
2.920000000000000000e+02
1.780000000000000000e+02
9.100000000000000000e+01
1.160000000000000000e+02
8.600000000000000000e+01
1.220000000000000000e+02
7.200000000000000000e+01
1.290000000000000000e+02
1.420000000000000000e+02
9.000000000000000000e+01
1.580000000000000000e+02
3.900000000000000000e+01
1.960000000000000000e+02
2.220000000000000000e+02
2.770000000000000000e+02
9.900000000000000000e+01
1.960000000000000000e+02
2.020000000000000000e+02
1.550000000000000000e+02
7.700000000000000000e+01
1.910000000000000000e+02
7.000000000000000000e+01
7.300000000000000000e+01
4.900000000000000000e+01
6.500000000000000000e+01
2.630000000000000000e+02
2.480000000000000000e+02
2.960000000000000000e+02
2.140000000000000000e+02
1.850000000000000000e+02
7.800000000000000000e+01
9.300000000000000000e+01
2.520000000000000000e+02
1.500000000000000000e+02
7.700000000000000000e+01
2.080000000000000000e+02
7.700000000000000000e+01
1.080000000000000000e+02
1.600000000000000000e+02
5.300000000000000000e+01
2.200000000000000000e+02
1.540000000000000000e+02
2.590000000000000000e+02
9.000000000000000000e+01
2.460000000000000000e+02
1.240000000000000000e+02
6.700000000000000000e+01
7.200000000000000000e+01
2.570000000000000000e+02
2.620000000000000000e+02
2.750000000000000000e+02
1.770000000000000000e+02
7.100000000000000000e+01
4.700000000000000000e+01
1.870000000000000000e+02
1.250000000000000000e+02
7.800000000000000000e+01
5.100000000000000000e+01
2.580000000000000000e+02
2.150000000000000000e+02
3.030000000000000000e+02
2.430000000000000000e+02
9.100000000000000000e+01
1.500000000000000000e+02
3.100000000000000000e+02
1.530000000000000000e+02
3.460000000000000000e+02
6.300000000000000000e+01
8.900000000000000000e+01
5.000000000000000000e+01
3.900000000000000000e+01
1.030000000000000000e+02
3.080000000000000000e+02
1.160000000000000000e+02
1.450000000000000000e+02
7.400000000000000000e+01
4.500000000000000000e+01
1.150000000000000000e+02
2.640000000000000000e+02
8.700000000000000000e+01
2.020000000000000000e+02
1.270000000000000000e+02
1.820000000000000000e+02
2.410000000000000000e+02
6.600000000000000000e+01
9.400000000000000000e+01
2.830000000000000000e+02
6.400000000000000000e+01
1.020000000000000000e+02
2.000000000000000000e+02
2.650000000000000000e+02
9.400000000000000000e+01
2.300000000000000000e+02
1.810000000000000000e+02
1.560000000000000000e+02
2.330000000000000000e+02
6.000000000000000000e+01
2.190000000000000000e+02
8.000000000000000000e+01
6.800000000000000000e+01
3.320000000000000000e+02
2.480000000000000000e+02
8.400000000000000000e+01
2.000000000000000000e+02
5.500000000000000000e+01
8.500000000000000000e+01
8.900000000000000000e+01
3.100000000000000000e+01
1.290000000000000000e+02
8.300000000000000000e+01
2.750000000000000000e+02
6.500000000000000000e+01
1.980000000000000000e+02
2.360000000000000000e+02
2.530000000000000000e+02
1.240000000000000000e+02
4.400000000000000000e+01
1.720000000000000000e+02
1.140000000000000000e+02
1.420000000000000000e+02
1.090000000000000000e+02
1.800000000000000000e+02
1.440000000000000000e+02
1.630000000000000000e+02
1.470000000000000000e+02
9.700000000000000000e+01
2.200000000000000000e+02
1.900000000000000000e+02
1.090000000000000000e+02
1.910000000000000000e+02
1.220000000000000000e+02
2.300000000000000000e+02
2.420000000000000000e+02
2.480000000000000000e+02
2.490000000000000000e+02
1.920000000000000000e+02
1.310000000000000000e+02
2.370000000000000000e+02
7.800000000000000000e+01
1.350000000000000000e+02
2.440000000000000000e+02
1.990000000000000000e+02
2.700000000000000000e+02
1.640000000000000000e+02
7.200000000000000000e+01
9.600000000000000000e+01
3.060000000000000000e+02
9.100000000000000000e+01
2.140000000000000000e+02
9.500000000000000000e+01
2.160000000000000000e+02
2.630000000000000000e+02
1.780000000000000000e+02
1.130000000000000000e+02
2.000000000000000000e+02
1.390000000000000000e+02
1.390000000000000000e+02
8.800000000000000000e+01
1.480000000000000000e+02
8.800000000000000000e+01
2.430000000000000000e+02
7.100000000000000000e+01
7.700000000000000000e+01
1.090000000000000000e+02
2.720000000000000000e+02
6.000000000000000000e+01
5.400000000000000000e+01
2.210000000000000000e+02
9.000000000000000000e+01
3.110000000000000000e+02
2.810000000000000000e+02
1.820000000000000000e+02
3.210000000000000000e+02
5.800000000000000000e+01
2.620000000000000000e+02
2.060000000000000000e+02
2.330000000000000000e+02
2.420000000000000000e+02
1.230000000000000000e+02
1.670000000000000000e+02
6.300000000000000000e+01
1.970000000000000000e+02
7.100000000000000000e+01
1.680000000000000000e+02
1.400000000000000000e+02
2.170000000000000000e+02
1.210000000000000000e+02
2.350000000000000000e+02
2.450000000000000000e+02
4.000000000000000000e+01
5.200000000000000000e+01
1.040000000000000000e+02
1.320000000000000000e+02
8.800000000000000000e+01
6.900000000000000000e+01
2.190000000000000000e+02
7.200000000000000000e+01
2.010000000000000000e+02
1.100000000000000000e+02
5.100000000000000000e+01
2.770000000000000000e+02
6.300000000000000000e+01
1.180000000000000000e+02
6.900000000000000000e+01
2.730000000000000000e+02
2.580000000000000000e+02
4.300000000000000000e+01
1.980000000000000000e+02
2.420000000000000000e+02
2.320000000000000000e+02
1.750000000000000000e+02
9.300000000000000000e+01
1.680000000000000000e+02
2.750000000000000000e+02
2.930000000000000000e+02
2.810000000000000000e+02
7.200000000000000000e+01
1.400000000000000000e+02
1.890000000000000000e+02
1.810000000000000000e+02
2.090000000000000000e+02
1.360000000000000000e+02
2.610000000000000000e+02
1.130000000000000000e+02
1.310000000000000000e+02
1.740000000000000000e+02
2.570000000000000000e+02
5.500000000000000000e+01
8.400000000000000000e+01
4.200000000000000000e+01
1.460000000000000000e+02
2.120000000000000000e+02
2.330000000000000000e+02
9.100000000000000000e+01
1.110000000000000000e+02
1.520000000000000000e+02
1.200000000000000000e+02
6.700000000000000000e+01
3.100000000000000000e+02
9.400000000000000000e+01
1.830000000000000000e+02
6.600000000000000000e+01
1.730000000000000000e+02
7.200000000000000000e+01
4.900000000000000000e+01
6.400000000000000000e+01
4.800000000000000000e+01
1.780000000000000000e+02
1.040000000000000000e+02
1.320000000000000000e+02
2.200000000000000000e+02
5.700000000000000000e+01
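
These 442 values appear to be the regression targets from scikit-learn's diabetes dataset (the notebook diffs below train on exactly that data). The static file can be deleted because the updated notebooks now write the CSVs at run time; a minimal sketch of the equivalent, using the variable names from the notebook diff below:

import numpy as np
from sklearn.datasets import load_diabetes

# Regenerate the deleted data files locally instead of shipping them with the sample.
dataset_x, dataset_y = load_diabetes(return_X_y=True)
np.savetxt('features.csv', dataset_x, delimiter=',')
np.savetxt('labels.csv', dataset_y, delimiter=',')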

View File

@@ -80,9 +80,9 @@
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},
"source": [ "source": [
"## Register input and output datasets\n", "## Create trained model\n",
"\n", "\n",
"For this example, we have provided a small model (`sklearn_regression_model.pkl` in the notebook's directory) that was trained on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). Here, you will register the data used to create this model in your workspace." "For this example, we will train a small model on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). "
] ]
}, },
{ {
@@ -91,9 +91,42 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"import joblib\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"\n",
"dataset_x, dataset_y = load_diabetes(return_X_y=True)\n",
"\n",
"model = Ridge().fit(dataset_x, dataset_y)\n",
"\n",
"joblib.dump(model, 'sklearn_regression_model.pkl')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Register input and output datasets\n",
"\n",
"Here, you will register the data used to create the model in your workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"from azureml.core import Dataset\n", "from azureml.core import Dataset\n",
"\n", "\n",
"\n", "\n",
"np.savetxt('features.csv', dataset_x, delimiter=',')\n",
"np.savetxt('labels.csv', dataset_y, delimiter=',')\n",
"\n",
"datastore = ws.get_default_datastore()\n", "datastore = ws.get_default_datastore()\n",
"datastore.upload_files(files=['./features.csv', './labels.csv'],\n", "datastore.upload_files(files=['./features.csv', './labels.csv'],\n",
" target_path='sklearn_regression/',\n", " target_path='sklearn_regression/',\n",
@@ -125,6 +158,8 @@
}, },
"outputs": [], "outputs": [],
"source": [ "source": [
"import sklearn\n",
"\n",
"from azureml.core import Model\n", "from azureml.core import Model\n",
"from azureml.core.resource_configuration import ResourceConfiguration\n", "from azureml.core.resource_configuration import ResourceConfiguration\n",
"\n", "\n",
@@ -133,7 +168,7 @@
" model_name='my-sklearn-model', # Name of the registered model in your workspace.\n", " model_name='my-sklearn-model', # Name of the registered model in your workspace.\n",
" model_path='./sklearn_regression_model.pkl', # Local file to upload and register as a model.\n", " model_path='./sklearn_regression_model.pkl', # Local file to upload and register as a model.\n",
" model_framework=Model.Framework.SCIKITLEARN, # Framework used to create the model.\n", " model_framework=Model.Framework.SCIKITLEARN, # Framework used to create the model.\n",
" model_framework_version='0.19.1', # Version of scikit-learn used to create the model.\n", " model_framework_version=sklearn.__version__, # Version of scikit-learn used to create the model.\n",
" sample_input_dataset=input_dataset,\n", " sample_input_dataset=input_dataset,\n",
" sample_output_dataset=output_dataset,\n", " sample_output_dataset=output_dataset,\n",
" resource_configuration=ResourceConfiguration(cpu=1, memory_in_gb=0.5),\n", " resource_configuration=ResourceConfiguration(cpu=1, memory_in_gb=0.5),\n",
@@ -174,19 +209,9 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.core import Webservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n",
"\n",
"service_name = 'my-sklearn-service'\n", "service_name = 'my-sklearn-service'\n",
"\n", "\n",
"# Remove any existing service under the same name.\n", "service = Model.deploy(ws, service_name, [model], overwrite=True)\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"service = Model.deploy(ws, service_name, [model])\n",
"service.wait_for_deployment(show_output=True)" "service.wait_for_deployment(show_output=True)"
] ]
}, },
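The overwrite=True flag makes Model.deploy replace an existing service of the same name, which is why the Webservice lookup-and-delete boilerplate can go. Once wait_for_deployment returns, the service can be sanity-checked; a brief aside, not part of the diff:

print(service.state)       # should be 'Healthy' for a successful ACI deployment
print(service.get_logs())  # container logs, useful if the deployment misbehaves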
@@ -207,10 +232,7 @@
"\n", "\n",
"\n", "\n",
"input_payload = json.dumps({\n", "input_payload = json.dumps({\n",
" 'data': [\n", " 'data': dataset_x[0:2].tolist(),\n",
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
" -0.03482076, -0.04340085, -0.00259226, 0.01990842, -0.01764613]\n",
" ],\n",
" 'method': 'predict' # If you have a classification model, you can get probabilities by changing this to 'predict_proba'.\n", " 'method': 'predict' # If you have a classification model, you can get probabilities by changing this to 'predict_proba'.\n",
"})\n", "})\n",
"\n", "\n",
@@ -262,7 +284,7 @@
" 'inference-schema[numpy-support]',\n", " 'inference-schema[numpy-support]',\n",
" 'joblib',\n", " 'joblib',\n",
" 'numpy',\n", " 'numpy',\n",
" 'scikit-learn'\n", " 'scikit-learn=={}'.format(sklearn.__version__)\n",
"])" "])"
] ]
}, },
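Pinning scikit-learn to the exact training-time version avoids unpickling the model under a different library release. In context, the environment definition presumably reads as below; the environment name and the azureml-defaults entry are assumptions taken from the parallel definition later in this changeset:

import sklearn

from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies

environment = Environment('my-sklearn-environment')
environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[
    'azureml-defaults',                 # hosts the model as a web service
    'inference-schema[numpy-support]',
    'joblib',
    'numpy',
    'scikit-learn=={}'.format(sklearn.__version__)  # match the training version
])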
@@ -303,20 +325,12 @@
}, },
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.core import Webservice\n",
"from azureml.core.model import InferenceConfig\n", "from azureml.core.model import InferenceConfig\n",
"from azureml.core.webservice import AciWebservice\n", "from azureml.core.webservice import AciWebservice\n",
"from azureml.exceptions import WebserviceException\n",
"\n", "\n",
"\n", "\n",
"service_name = 'my-custom-env-service'\n", "service_name = 'my-custom-env-service'\n",
"\n", "\n",
"# Remove any existing service under the same name.\n",
"try:\n",
" Webservice(ws, service_name).delete()\n",
"except WebserviceException:\n",
" pass\n",
"\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n", "inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n", "aci_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)\n",
"\n", "\n",
@@ -324,7 +338,8 @@
" name=service_name,\n", " name=service_name,\n",
" models=[model],\n", " models=[model],\n",
" inference_config=inference_config,\n", " inference_config=inference_config,\n",
" deployment_config=aci_config)\n", " deployment_config=aci_config,\n",
" overwrite=True)\n",
"service.wait_for_deployment(show_output=True)" "service.wait_for_deployment(show_output=True)"
] ]
}, },
@@ -342,10 +357,7 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"input_payload = json.dumps({\n", "input_payload = json.dumps({\n",
" 'data': [\n", " 'data': dataset_x[0:2].tolist()\n",
" [ 0.03807591, 0.05068012, 0.06169621, 0.02187235, -0.0442235,\n",
" -0.03482076, -0.04340085, -0.00259226, 0.01990842, -0.01764613]\n",
" ]\n",
"})\n", "})\n",
"\n", "\n",
"output = service.run(input_payload)\n", "output = service.run(input_payload)\n",
@@ -471,7 +483,7 @@
" 'inference-schema[numpy-support]',\n", " 'inference-schema[numpy-support]',\n",
" 'joblib',\n", " 'joblib',\n",
" 'numpy',\n", " 'numpy',\n",
" 'scikit-learn'\n", " 'scikit-learn=={}'.format(sklearn.__version__)\n",
"])\n", "])\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n", "inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"# if cpu and memory_in_gb parameters are not provided\n", "# if cpu and memory_in_gb parameters are not provided\n",

View File

@@ -2,3 +2,5 @@ name: model-register-and-deploy
dependencies: dependencies:
- pip: - pip:
- azureml-sdk - azureml-sdk
- numpy
- scikit-learn

View File

@@ -1,8 +0,0 @@
name: project_environment
dependencies:
- python=3.6.2
- pip:
- azureml-defaults
- scikit-learn==0.19.1
- numpy
- inference-schema[numpy-support]

View File

@@ -75,6 +75,33 @@
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')" "print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
] ]
}, },
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create trained model\n",
"\n",
"For this example, we will train a small model on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"dataset_x, dataset_y = load_diabetes(return_X_y=True)\n",
"\n",
"sk_model = Ridge().fit(dataset_x, dataset_y)\n",
"\n",
"joblib.dump(sk_model, \"sklearn_regression_model.pkl\")"
]
},
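A quick round-trip confirms the serialized file is loadable before registration; an aside, not part of the notebook:

import joblib

restored = joblib.load('sklearn_regression_model.pkl')
print(restored.predict(dataset_x[0:2]))  # should match sk_model.predict on the same rows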
{ {
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},
@@ -148,13 +175,10 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"%%writefile source_directory/x/y/score.py\n", "%%writefile source_directory/x/y/score.py\n",
"import os\n", "import joblib\n",
"import pickle\n",
"import json\n", "import json\n",
"import numpy as np\n", "import numpy as np\n",
"from sklearn.externals import joblib\n", "import os\n",
"from sklearn.linear_model import Ridge\n",
"from azureml.core.model import Model\n",
"\n", "\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n", "from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n", "from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
@@ -165,16 +189,17 @@
" # It holds the path to the directory that contains the deployed model (./azureml-models/$MODEL_NAME/$VERSION)\n", " # It holds the path to the directory that contains the deployed model (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # If there are multiple models, this value is the path to the directory containing all deployed models (./azureml-models)\n", " # If there are multiple models, this value is the path to the directory containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n", " model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n", " # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n", " model = joblib.load(model_path)\n",
"\n",
" global name\n", " global name\n",
" # note here, entire source directory on inference config gets added into image\n", " # Note here, the entire source directory from inference config gets added into image.\n",
" # bellow is the example how you can use any extra files in image\n", " # Below is an example of how you can use any extra files in image.\n",
" with open('./source_directory/extradata.json') as json_file:\n", " with open('./source_directory/extradata.json') as json_file:\n",
" data = json.load(json_file)\n", " data = json.load(json_file)\n",
" name = data[\"people\"][0][\"name\"]\n", " name = data[\"people\"][0][\"name\"]\n",
"\n", "\n",
"input_sample = np.array([[10,9,8,7,6,5,4,3,2,1]])\n", "input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n", "output_sample = np.array([3726.995])\n",
"\n", "\n",
"@input_schema('data', NumpyParameterType(input_sample))\n", "@input_schema('data', NumpyParameterType(input_sample))\n",
@@ -182,37 +207,13 @@
"def run(data):\n", "def run(data):\n",
" try:\n", " try:\n",
" result = model.predict(data)\n", " result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n", " # You can return any JSON-serializable object.\n",
" return \"Hello \" + name + \" here is your result = \" + str(result)\n", " return \"Hello \" + name + \" here is your result = \" + str(result)\n",
" except Exception as e:\n", " except Exception as e:\n",
" error = str(e)\n", " error = str(e)\n",
" return error" " return error"
] ]
}, },
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Please note that you must indicate azureml-defaults with verion >= 1.0.45 as a pip dependency for your environemnt. This package contains the functionality needed to host the model as a web service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile source_directory/env/myenv.yml\n",
"name: project_environment\n",
"dependencies:\n",
" - python=3.6.2\n",
" - pip:\n",
" - azureml-defaults\n",
" - scikit-learn\n",
" - numpy\n",
" - inference-schema[numpy-support]"
]
},
{ {
"cell_type": "code", "cell_type": "code",
"execution_count": null, "execution_count": null,
@@ -249,11 +250,16 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"import sklearn\n",
"\n",
"from azureml.core.environment import Environment\n", "from azureml.core.environment import Environment\n",
"from azureml.core.model import InferenceConfig\n", "from azureml.core.model import InferenceConfig\n",
"\n", "\n",
"\n", "\n",
"myenv = Environment.from_conda_specification(name='myenv', file_path='myenv.yml')\n", "myenv = Environment('myenv')\n",
"myenv.python.conda_dependencies.add_pip_package(\"inference-schema[numpy-support]\")\n",
"myenv.python.conda_dependencies.add_pip_package(\"joblib\")\n",
"myenv.python.conda_dependencies.add_pip_package(\"scikit-learn=={}\".format(sklearn.__version__))\n",
"\n", "\n",
"# explicitly set base_image to None when setting base_dockerfile\n", "# explicitly set base_image to None when setting base_dockerfile\n",
"myenv.docker.base_image = None\n", "myenv.docker.base_image = None\n",
@@ -262,7 +268,7 @@
"\n", "\n",
"inference_config = InferenceConfig(source_directory=source_directory,\n", "inference_config = InferenceConfig(source_directory=source_directory,\n",
" entry_script=\"x/y/score.py\",\n", " entry_script=\"x/y/score.py\",\n",
" environment=myenv)\n" " environment=myenv)"
] ]
}, },
{ {
@@ -352,15 +358,10 @@
"import json\n", "import json\n",
"\n", "\n",
"sample_input = json.dumps({\n", "sample_input = json.dumps({\n",
" 'data': [\n", " 'data': dataset_x[0:2].tolist()\n",
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
" ]\n",
"})\n", "})\n",
"\n", "\n",
"sample_input = bytes(sample_input, encoding='utf-8')\n", "print(local_service.run(sample_input))"
"\n",
"print(local_service.run(input_data=sample_input))"
] ]
}, },
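local_service is created earlier in this notebook from a LocalWebservice configuration; that cell is unchanged and therefore missing from the diff, but it presumably resembles the following sketch (the service name and port are illustrative):

from azureml.core.model import Model
from azureml.core.webservice import LocalWebservice

deployment_config = LocalWebservice.deploy_configuration(port=6789)
local_service = Model.deploy(ws, 'test', [model], inference_config, deployment_config)
local_service.wait_for_deployment()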
{ {
@@ -379,12 +380,10 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"%%writefile source_directory/x/y/score.py\n", "%%writefile source_directory/x/y/score.py\n",
"import os\n", "import joblib\n",
"import pickle\n",
"import json\n", "import json\n",
"import numpy as np\n", "import numpy as np\n",
"from sklearn.externals import joblib\n", "import os\n",
"from sklearn.linear_model import Ridge\n",
"\n", "\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n", "from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n", "from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
@@ -395,17 +394,18 @@
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n", " # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n", " # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n", " model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n", " # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n", " model = joblib.load(model_path)\n",
"\n",
" global name, from_location\n", " global name, from_location\n",
" # note here, entire source directory on inference config gets added into image\n", " # Note here, the entire source directory from inference config gets added into image.\n",
" # bellow is the example how you can use any extra files in image\n", " # Below is an example of how you can use any extra files in image.\n",
" with open('source_directory/extradata.json') as json_file: \n", " with open('source_directory/extradata.json') as json_file: \n",
" data = json.load(json_file)\n", " data = json.load(json_file)\n",
" name = data[\"people\"][0][\"name\"]\n", " name = data[\"people\"][0][\"name\"]\n",
" from_location = data[\"people\"][0][\"from\"]\n", " from_location = data[\"people\"][0][\"from\"]\n",
"\n", "\n",
"input_sample = np.array([[10,9,8,7,6,5,4,3,2,1]])\n", "input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n", "output_sample = np.array([3726.995])\n",
"\n", "\n",
"@input_schema('data', NumpyParameterType(input_sample))\n", "@input_schema('data', NumpyParameterType(input_sample))\n",
@@ -413,8 +413,8 @@
"def run(data):\n", "def run(data):\n",
" try:\n", " try:\n",
" result = model.predict(data)\n", " result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n", " # You can return any JSON-serializable object.\n",
" return \"Hello \" + name + \" from \" + from_location + \" here is your result = \" + str(result)\n", " return \"Hello \" + name + \" from \" + from_location + \" here is your result = \" + str(result)\n",
" except Exception as e:\n", " except Exception as e:\n",
" error = str(e)\n", " error = str(e)\n",
" return error" " return error"
@@ -430,7 +430,7 @@
"print(\"--------------------------------------------------------------\")\n", "print(\"--------------------------------------------------------------\")\n",
"\n", "\n",
"# After calling reload(), run() will return the updated message.\n", "# After calling reload(), run() will return the updated message.\n",
"local_service.run(input_data=sample_input)" "local_service.run(sample_input)"
] ]
}, },
{ {

View File

@@ -0,0 +1,5 @@
name: register-model-deploy-local-advanced
dependencies:
- pip:
- azureml-sdk
- scikit-learn

View File

@@ -71,6 +71,33 @@
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')" "print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep='\\n')"
] ]
}, },
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create trained model\n",
"\n",
"For this example, we will train a small model on scikit-learn's [diabetes dataset](https://scikit-learn.org/stable/datasets/index.html#diabetes-dataset). "
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import joblib\n",
"\n",
"from sklearn.datasets import load_diabetes\n",
"from sklearn.linear_model import Ridge\n",
"\n",
"dataset_x, dataset_y = load_diabetes(return_X_y=True)\n",
"\n",
"sk_model = Ridge().fit(dataset_x, dataset_y)\n",
"\n",
"joblib.dump(sk_model, \"sklearn_regression_model.pkl\")"
]
},
{ {
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},
@@ -82,9 +109,9 @@
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},
"source": [ "source": [
"You can add tags and descriptions to your models. we are using `sklearn_regression_model.pkl` file in the current directory as a model with the name `sklearn_regression_model` in the workspace.\n", "Here we are registering the serialized file `sklearn_regression_model.pkl` in the current directory as a model with the name `sklearn_regression_model` in the workspace.\n",
"\n", "\n",
"Using tags, you can track useful information such as the name and version of the machine learning library used to train the model, framework, category, target customer etc. Note that tags must be alphanumeric." "You can add tags and descriptions to your models. Using tags, you can track useful information such as the name and version of the machine learning library used to train the model, framework, category, target customer etc. Note that tags must be alphanumeric."
] ]
}, },
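The registration cell itself is unchanged, so it does not appear in the diff; per the markdown above it presumably resembles this sketch, with illustrative tag values:

from azureml.core.model import Model

model = Model.register(model_path='sklearn_regression_model.pkl',
                       model_name='sklearn_regression_model',
                       tags={'area': 'diabetes', 'type': 'regression'},  # tags must be alphanumeric
                       description='Ridge regression model for the diabetes dataset.',
                       workspace=ws)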
{ {
@@ -119,11 +146,62 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.core.conda_dependencies import CondaDependencies\n", "import sklearn\n",
"\n",
"from azureml.core.environment import Environment\n", "from azureml.core.environment import Environment\n",
"\n", "\n",
"environment = Environment(\"LocalDeploy\")\n", "environment = Environment(\"LocalDeploy\")\n",
"environment.python.conda_dependencies = CondaDependencies(\"myenv.yml\")" "environment.python.conda_dependencies.add_pip_package(\"inference-schema[numpy-support]\")\n",
"environment.python.conda_dependencies.add_pip_package(\"joblib\")\n",
"environment.python.conda_dependencies.add_pip_package(\"scikit-learn=={}\".format(sklearn.__version__))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Provide the Scoring Script\n",
"\n",
"This Python script handles the model execution inside the service container. The `init()` method loads the model file, and `run(data)` is called for every input to the service."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile score.py\n",
"import joblib\n",
"import json\n",
"import numpy as np\n",
"import os\n",
"\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
"\n",
"def init():\n",
" global model\n",
" # AZUREML_MODEL_DIR is an environment variable created during deployment.\n",
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n",
"\n",
"input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n",
"\n",
"@input_schema('data', NumpyParameterType(input_sample))\n",
"@output_schema(NumpyParameterType(output_sample))\n",
"def run(data):\n",
" try:\n",
" result = model.predict(data)\n",
" # You can return any JSON-serializable object.\n",
" return result.tolist()\n",
" except Exception as e:\n",
" error = str(e)\n",
" return error"
] ]
}, },
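The input_schema and output_schema decorators drive automatic schema (swagger) generation for the service, so requests must match the declared sample: a 2-D array with ten features per row. An illustrative payload:

import json

payload = json.dumps({'data': [[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]]})
# run() returns result.tolist(), so the service replies with a JSON list of predictions.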
{ {
@@ -145,114 +223,6 @@
" environment=environment)" " environment=environment)"
] ]
}, },
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Model Profiling\n",
"\n",
"Profile your model to understand how much CPU and memory the service, created as a result of its deployment, will need. Profiling returns information such as CPU usage, memory usage, and response latency. It also provides a CPU and memory recommendation based on the resource usage. You can profile your model (or more precisely the service built based on your model) on any CPU and/or memory combination where 0.1 <= CPU <= 3.5 and 0.1GB <= memory <= 15GB. If you do not provide a CPU and/or memory requirement, we will test it on the default configuration of 3.5 CPU and 15GB memory.\n",
"\n",
"In order to profile your model you will need:\n",
"- a registered model\n",
"- an entry script\n",
"- an inference configuration\n",
"- a single column tabular dataset, where each row contains a string representing sample request data sent to the service.\n",
"\n",
"Please, note that profiling is a long running operation and can take up to 25 minutes depending on the size of the dataset.\n",
"\n",
"At this point we only support profiling of services that expect their request data to be a string, for example: string serialized json, text, string serialized image, etc. The content of each row of the dataset (string) will be put into the body of the HTTP request and sent to the service encapsulating the model for scoring.\n",
"\n",
"Below is an example of how you can construct an input dataset to profile a service which expects its incoming requests to contain serialized json. In this case we created a dataset based one hundred instances of the same request data. In real world scenarios however, we suggest that you use larger datasets with various inputs, especially if your model resource usage/behavior is input dependent."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"from azureml.core import Datastore\n",
"from azureml.core.dataset import Dataset\n",
"from azureml.data import dataset_type_definitions\n",
"\n",
"\n",
"# create a string that can be put in the body of the request\n",
"serialized_input_json = json.dumps({\n",
" 'data': [\n",
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
" ]\n",
"})\n",
"dataset_content = []\n",
"for i in range(100):\n",
" dataset_content.append(serialized_input_json)\n",
"dataset_content = '\\n'.join(dataset_content)\n",
"file_name = 'sample_request_data_diabetes.txt'\n",
"f = open(file_name, 'w')\n",
"f.write(dataset_content)\n",
"f.close()\n",
"\n",
"# upload the txt file created above to the Datastore and create a dataset from it\n",
"data_store = Datastore.get_default(ws)\n",
"data_store.upload_files(['./' + file_name], target_path='sample_request_data_diabetes')\n",
"datastore_path = [(data_store, 'sample_request_data_diabetes' +'/' + file_name)]\n",
"sample_request_data_diabetes = Dataset.Tabular.from_delimited_files(\n",
" datastore_path,\n",
" separator='\\n',\n",
" infer_column_types=True,\n",
" header=dataset_type_definitions.PromoteHeadersBehavior.NO_HEADERS)\n",
"sample_request_data_diabetes = sample_request_data_diabetes.register(workspace=ws,\n",
" name='sample_request_data_diabetes',\n",
" create_new_version=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now that we have an input dataset we are ready to go ahead with profiling. In this case we are testing the previously introduced sklearn regression model on 1 CPU and 0.5 GB memory. The memory usage and recommendation presented in the result is measured in Gigabytes. The CPU usage and recommendation is measured in CPU cores."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from datetime import datetime\n",
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"from azureml.core.model import Model, InferenceConfig\n",
"\n",
"\n",
"environment = Environment('my-sklearn-environment')\n",
"environment.python.conda_dependencies = CondaDependencies.create(pip_packages=[\n",
" 'azureml-defaults',\n",
" 'inference-schema[numpy-support]',\n",
" 'joblib',\n",
" 'numpy',\n",
" 'scikit-learn==0.19.1',\n",
" 'scipy'\n",
"])\n",
"inference_config = InferenceConfig(entry_script='score.py', environment=environment)\n",
"# if cpu and memory_in_gb parameters are not provided\n",
"# the model will be profiled on default configuration of\n",
"# 3.5CPU and 15GB memory\n",
"profile = Model.profile(ws,\n",
" 'profile-%s' % datetime.now().strftime('%m%d%Y-%H%M%S'),\n",
" [model],\n",
" inference_config,\n",
" input_dataset=sample_request_data_diabetes,\n",
" cpu=1.0,\n",
" memory_in_gb=0.5)\n",
"\n",
"# profiling is a long running operation and may take up to 25 min\n",
"profile.wait_for_completion(True)\n",
"details = profile.get_details()"
]
},
{ {
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},
@@ -339,15 +309,10 @@
"import json\n", "import json\n",
"\n", "\n",
"sample_input = json.dumps({\n", "sample_input = json.dumps({\n",
" 'data': [\n", " 'data': dataset_x[0:2].tolist()\n",
" [1, 2, 3, 4, 5, 6, 7, 8, 9, 10],\n",
" [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]\n",
" ]\n",
"})\n", "})\n",
"\n", "\n",
"sample_input = bytes(sample_input, encoding='utf-8')\n", "local_service.run(sample_input)"
"\n",
"local_service.run(input_data=sample_input)"
] ]
}, },
{ {
@@ -366,12 +331,10 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"%%writefile score.py\n", "%%writefile score.py\n",
"import os\n", "import joblib\n",
"import pickle\n",
"import json\n", "import json\n",
"import numpy as np\n", "import numpy as np\n",
"from sklearn.externals import joblib\n", "import os\n",
"from sklearn.linear_model import Ridge\n",
"\n", "\n",
"from inference_schema.schema_decorators import input_schema, output_schema\n", "from inference_schema.schema_decorators import input_schema, output_schema\n",
"from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n", "from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType\n",
@@ -382,10 +345,10 @@
" # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n", " # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)\n",
" # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n", " # For multiple models, it points to the folder containing all deployed models (./azureml-models)\n",
" model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n", " model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')\n",
" # deserialize the model file back into a sklearn model\n", " # Deserialize the model file back into a sklearn model.\n",
" model = joblib.load(model_path)\n", " model = joblib.load(model_path)\n",
"\n", "\n",
"input_sample = np.array([[10,9,8,7,6,5,4,3,2,1]])\n", "input_sample = np.array([[10.0, 9.0, 8.0, 7.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.0]])\n",
"output_sample = np.array([3726.995])\n", "output_sample = np.array([3726.995])\n",
"\n", "\n",
"@input_schema('data', NumpyParameterType(input_sample))\n", "@input_schema('data', NumpyParameterType(input_sample))\n",
@@ -393,8 +356,8 @@
"def run(data):\n", "def run(data):\n",
" try:\n", " try:\n",
" result = model.predict(data)\n", " result = model.predict(data)\n",
" # you can return any datatype as long as it is JSON-serializable\n", " # You can return any JSON-serializable object.\n",
" return 'hello from updated score.py'\n", " return 'Hello from the updated score.py: ' + str(result.tolist())\n",
" except Exception as e:\n", " except Exception as e:\n",
" error = str(e)\n", " error = str(e)\n",
" return error" " return error"
@@ -410,7 +373,7 @@
"print(\"--------------------------------------------------------------\")\n", "print(\"--------------------------------------------------------------\")\n",
"\n", "\n",
"# After calling reload(), run() will return the updated message.\n", "# After calling reload(), run() will return the updated message.\n",
"local_service.run(input_data=sample_input)" "local_service.run(sample_input)"
] ]
}, },
{ {

View File

@@ -0,0 +1,5 @@
name: register-model-deploy-local
dependencies:
- pip:
- azureml-sdk
- scikit-learn

View File

@@ -1,35 +0,0 @@
import os
import pickle
import json
import numpy as np
from sklearn.externals import joblib
from sklearn.linear_model import Ridge
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType
def init():
global model
# AZUREML_MODEL_DIR is an environment variable created during deployment.
# It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)
# For multiple models, it points to the folder containing all deployed models (./azureml-models)
model_path = os.path.join(os.getenv('AZUREML_MODEL_DIR'), 'sklearn_regression_model.pkl')
# deserialize the model file back into a sklearn model
model = joblib.load(model_path)
input_sample = np.array([[10, 9, 8, 7, 6, 5, 4, 3, 2, 1]])
output_sample = np.array([3726.995])
@input_schema('data', NumpyParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
try:
result = model.predict(data)
# you can return any datatype as long as it is JSON-serializable
return result.tolist()
except Exception as e:
error = str(e)
return error
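This deleted script imported joblib through sklearn.externals, a shim that scikit-learn deprecated in 0.21 and removed in 0.23; the replacement scripts throughout this changeset load the model via the standalone package instead:

# Old (removed): from sklearn.externals import joblib
import joblib

model = joblib.load('sklearn_regression_model.pkl')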

View File

@@ -716,7 +716,6 @@
"\n", "\n",
"trainWithAutomlStep = AutoMLStep(name='AutoML_Regression',\n", "trainWithAutomlStep = AutoMLStep(name='AutoML_Regression',\n",
" automl_config=automl_config,\n", " automl_config=automl_config,\n",
" passthru_automl_config=False,\n",
" allow_reuse=True)\n", " allow_reuse=True)\n",
"print(\"trainWithAutomlStep created.\")" "print(\"trainWithAutomlStep created.\")"
] ]

View File

@@ -13,7 +13,7 @@ def init():
global g_tf_sess global g_tf_sess
# pull down model from workspace # pull down model from workspace
model_path = Model.get_model_path("mnist") model_path = Model.get_model_path("mnist-prs")
# construct graph to execute # construct graph to execute
tf.reset_default_graph() tf.reset_default_graph()

View File

@@ -120,6 +120,6 @@ pipeline_run.wait_for_completion(show_output=True)
- [file-dataset-image-inference-mnist.ipynb](./file-dataset-image-inference-mnist.ipynb) demonstrates how to run batch inference on an MNIST dataset using FileDataset. - [file-dataset-image-inference-mnist.ipynb](./file-dataset-image-inference-mnist.ipynb) demonstrates how to run batch inference on an MNIST dataset using FileDataset.
- [tabular-dataset-inference-iris.ipynb](./tabular-dataset-inference-iris.ipynb) demonstrates how to run batch inference on an IRIS dataset using TabularDataset. - [tabular-dataset-inference-iris.ipynb](./tabular-dataset-inference-iris.ipynb) demonstrates how to run batch inference on an IRIS dataset using TabularDataset.
- [pipeline-style-transfer.ipynb](../pipeline-style-transfer/pipeline-style-transfer.ipynb) demonstrates using ParallelRunStep in multi-step pipeline and using output from one step as input to ParallelRunStep. - [pipeline-style-transfer.ipynb](../pipeline-style-transfer/pipeline-style-transfer-parallel-run.ipynb) demonstrates using ParallelRunStep in multi-step pipeline and using output from one step as input to ParallelRunStep.
![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/parallel-run/README.png) ![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/parallel-run/README.png)

View File

@@ -274,7 +274,7 @@
"\n", "\n",
"# register downloaded model \n", "# register downloaded model \n",
"model = Model.register(model_path = \"models/\",\n", "model = Model.register(model_path = \"models/\",\n",
" model_name = \"mnist\", # this is the name the model is registered as\n", " model_name = \"mnist-prs\", # this is the name the model is registered as\n",
" tags = {'pretrained': \"mnist\"},\n", " tags = {'pretrained': \"mnist\"},\n",
" description = \"Mnist trained tensorflow model\",\n", " description = \"Mnist trained tensorflow model\",\n",
" workspace = ws)" " workspace = ws)"
@@ -474,8 +474,7 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"path_on_datastore = mnist_data.path('mnist/0.png')\n", "path_on_datastore = mnist_data.path('mnist/0.png')\n",
"single_image_ds = Dataset.File.from_files(path=path_on_datastore, validate=False)\n", "single_image_ds = Dataset.File.from_files(path=path_on_datastore, validate=False)"
"single_image_ds._ensure_saved(ws)"
] ]
}, },
{ {

View File

@@ -227,7 +227,7 @@
"\n", "\n",
"# register downloaded model\n", "# register downloaded model\n",
"model = Model.register(model_path = \"iris_model.pkl/iris_model.pkl\",\n", "model = Model.register(model_path = \"iris_model.pkl/iris_model.pkl\",\n",
" model_name = \"iris\", # this is the name the model is registered as\n", " model_name = \"iris-prs\", # this is the name the model is registered as\n",
" tags = {'pretrained': \"iris\"},\n", " tags = {'pretrained': \"iris\"},\n",
" workspace = ws)" " workspace = ws)"
] ]
@@ -332,7 +332,7 @@
" append_row_file_name=\"iris_outputs.txt\",\n", " append_row_file_name=\"iris_outputs.txt\",\n",
" environment=predict_env,\n", " environment=predict_env,\n",
" compute_target=compute_target, \n", " compute_target=compute_target, \n",
" node_count=3,\n", " node_count=2,\n",
" run_invocation_timeout=600\n", " run_invocation_timeout=600\n",
")" ")"
] ]
@@ -356,7 +356,7 @@
" inputs=[named_iris_ds],\n", " inputs=[named_iris_ds],\n",
" output=output_folder,\n", " output=output_folder,\n",
" parallel_run_config=parallel_run_config,\n", " parallel_run_config=parallel_run_config,\n",
" arguments=['--model_name', 'iris'],\n", " arguments=['--model_name', 'iris-prs'],\n",
" allow_reuse=True\n", " allow_reuse=True\n",
")" ")"
] ]
@@ -380,7 +380,7 @@
"\n", "\n",
"pipeline = Pipeline(workspace=ws, steps=[distributed_csv_iris_step])\n", "pipeline = Pipeline(workspace=ws, steps=[distributed_csv_iris_step])\n",
"\n", "\n",
"pipeline_run = Experiment(ws, 'iris').submit(pipeline)" "pipeline_run = Experiment(ws, 'iris-prs').submit(pipeline)"
] ]
}, },
{ {
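The `--model_name` value passed via `arguments` in the step above is read by the scoring script at startup. A minimal sketch of the receiving side (the argument-parsing pattern shown is conventional, not taken verbatim from the repo's script):

```python
import argparse
import joblib
from azureml.core.model import Model

def init():
    global model
    parser = argparse.ArgumentParser()
    parser.add_argument('--model_name', dest='model_name', required=True)
    args, _ = parser.parse_known_args()  # ignore the extra arguments ParallelRunStep injects
    # resolve the registered model ("iris-prs" after this change) to a local file path
    model_path = Model.get_model_path(args.model_name)
    model = joblib.load(model_path)  # assumes the registered model is a pickled sklearn estimator
```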

View File

@@ -0,0 +1,185 @@
# Original source: https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/neural_style.py
import argparse
import os
import sys
import re

from PIL import Image
import torch
from torchvision import transforms


def load_image(filename, size=None, scale=None):
    img = Image.open(filename)
    if size is not None:
        img = img.resize((size, size), Image.ANTIALIAS)
    elif scale is not None:
        img = img.resize((int(img.size[0] / scale), int(img.size[1] / scale)), Image.ANTIALIAS)
    return img


def save_image(filename, data):
    img = data.clone().clamp(0, 255).numpy()
    img = img.transpose(1, 2, 0).astype("uint8")
    img = Image.fromarray(img)
    img.save(filename)


class TransformerNet(torch.nn.Module):
    def __init__(self):
        super(TransformerNet, self).__init__()
        # Initial convolution layers
        self.conv1 = ConvLayer(3, 32, kernel_size=9, stride=1)
        self.in1 = torch.nn.InstanceNorm2d(32, affine=True)
        self.conv2 = ConvLayer(32, 64, kernel_size=3, stride=2)
        self.in2 = torch.nn.InstanceNorm2d(64, affine=True)
        self.conv3 = ConvLayer(64, 128, kernel_size=3, stride=2)
        self.in3 = torch.nn.InstanceNorm2d(128, affine=True)
        # Residual layers
        self.res1 = ResidualBlock(128)
        self.res2 = ResidualBlock(128)
        self.res3 = ResidualBlock(128)
        self.res4 = ResidualBlock(128)
        self.res5 = ResidualBlock(128)
        # Upsampling Layers
        self.deconv1 = UpsampleConvLayer(128, 64, kernel_size=3, stride=1, upsample=2)
        self.in4 = torch.nn.InstanceNorm2d(64, affine=True)
        self.deconv2 = UpsampleConvLayer(64, 32, kernel_size=3, stride=1, upsample=2)
        self.in5 = torch.nn.InstanceNorm2d(32, affine=True)
        self.deconv3 = ConvLayer(32, 3, kernel_size=9, stride=1)
        # Non-linearities
        self.relu = torch.nn.ReLU()

    def forward(self, X):
        y = self.relu(self.in1(self.conv1(X)))
        y = self.relu(self.in2(self.conv2(y)))
        y = self.relu(self.in3(self.conv3(y)))
        y = self.res1(y)
        y = self.res2(y)
        y = self.res3(y)
        y = self.res4(y)
        y = self.res5(y)
        y = self.relu(self.in4(self.deconv1(y)))
        y = self.relu(self.in5(self.deconv2(y)))
        y = self.deconv3(y)
        return y


class ConvLayer(torch.nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, stride):
        super(ConvLayer, self).__init__()
        reflection_padding = kernel_size // 2
        self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
        self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)

    def forward(self, x):
        out = self.reflection_pad(x)
        out = self.conv2d(out)
        return out


class ResidualBlock(torch.nn.Module):
    """ResidualBlock
    introduced in: https://arxiv.org/abs/1512.03385
    recommended architecture: http://torch.ch/blog/2016/02/04/resnets.html
    """

    def __init__(self, channels):
        super(ResidualBlock, self).__init__()
        self.conv1 = ConvLayer(channels, channels, kernel_size=3, stride=1)
        self.in1 = torch.nn.InstanceNorm2d(channels, affine=True)
        self.conv2 = ConvLayer(channels, channels, kernel_size=3, stride=1)
        self.in2 = torch.nn.InstanceNorm2d(channels, affine=True)
        self.relu = torch.nn.ReLU()

    def forward(self, x):
        residual = x
        out = self.relu(self.in1(self.conv1(x)))
        out = self.in2(self.conv2(out))
        out = out + residual
        return out


class UpsampleConvLayer(torch.nn.Module):
    """UpsampleConvLayer
    Upsamples the input and then does a convolution. This method gives better results
    compared to ConvTranspose2d.
    ref: http://distill.pub/2016/deconv-checkerboard/
    """

    def __init__(self, in_channels, out_channels, kernel_size, stride, upsample=None):
        super(UpsampleConvLayer, self).__init__()
        self.upsample = upsample
        if upsample:
            self.upsample_layer = torch.nn.Upsample(mode='nearest', scale_factor=upsample)
        reflection_padding = kernel_size // 2
        self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
        self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)

    def forward(self, x):
        x_in = x
        if self.upsample:
            x_in = self.upsample_layer(x_in)
        out = self.reflection_pad(x_in)
        out = self.conv2d(out)
        return out


def stylize(args):
    device = torch.device("cuda" if args.cuda else "cpu")
    with torch.no_grad():
        style_model = TransformerNet()
        state_dict = torch.load(os.path.join(args.model_dir, args.style + ".pth"))
        # remove saved deprecated running_* keys in InstanceNorm from the checkpoint
        for k in list(state_dict.keys()):
            if re.search(r'in\d+\.running_(mean|var)$', k):
                del state_dict[k]
        style_model.load_state_dict(state_dict)
        style_model.to(device)

        filenames = os.listdir(args.content_dir)
        for filename in filenames:
            print("Processing {}".format(filename))
            full_path = os.path.join(args.content_dir, filename)
            content_image = load_image(full_path, scale=args.content_scale)
            content_transform = transforms.Compose([
                transforms.ToTensor(),
                transforms.Lambda(lambda x: x.mul(255))
            ])
            content_image = content_transform(content_image)
            content_image = content_image.unsqueeze(0).to(device)
            output = style_model(content_image).cpu()
            output_path = os.path.join(args.output_dir, filename)
            save_image(output_path, output[0])


def main():
    arg_parser = argparse.ArgumentParser(description="parser for fast-neural-style")
    arg_parser.add_argument("--content-scale", type=float, default=None,
                            help="factor for scaling down the content image")
    arg_parser.add_argument("--model-dir", type=str, required=True,
                            help="saved model to be used for stylizing the image.")
    arg_parser.add_argument("--cuda", type=int, required=True,
                            help="set it to 1 for running on GPU, 0 for CPU")
    arg_parser.add_argument("--style", type=str,
                            help="style name")
    arg_parser.add_argument("--content-dir", type=str, required=True,
                            help="directory holding the images")
    arg_parser.add_argument("--output-dir", type=str, required=True,
                            help="directory holding the output images")
    args = arg_parser.parse_args()

    if args.cuda and not torch.cuda.is_available():
        print("ERROR: cuda is not available, try running on CPU")
        sys.exit(1)
    os.makedirs(args.output_dir, exist_ok=True)
    stylize(args)


if __name__ == "__main__":
    main()
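For reference, the script's argparse interface makes it callable like this from a driver process (a sketch; the directory names are illustrative and the script is assumed saved as `neural_style.py`):

```python
import subprocess

subprocess.run([
    "python", "neural_style.py",
    "--model-dir", "saved_models",   # folder containing <style>.pth checkpoints
    "--style", "mosaic",
    "--cuda", "0",                   # 1 to run on GPU, 0 for CPU
    "--content-dir", "input_images",
    "--output-dir", "output_images",
], check=True)
```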

View File

@@ -0,0 +1,207 @@
# Original source: https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/neural_style.py
import argparse
import os
import sys
import re

from PIL import Image
import torch
from torchvision import transforms
from mpi4py import MPI


def load_image(filename, size=None, scale=None):
    img = Image.open(filename)
    if size is not None:
        img = img.resize((size, size), Image.ANTIALIAS)
    elif scale is not None:
        img = img.resize((int(img.size[0] / scale), int(img.size[1] / scale)), Image.ANTIALIAS)
    return img


def save_image(filename, data):
    img = data.clone().clamp(0, 255).numpy()
    img = img.transpose(1, 2, 0).astype("uint8")
    img = Image.fromarray(img)
    img.save(filename)


class TransformerNet(torch.nn.Module):
    def __init__(self):
        super(TransformerNet, self).__init__()
        # Initial convolution layers
        self.conv1 = ConvLayer(3, 32, kernel_size=9, stride=1)
        self.in1 = torch.nn.InstanceNorm2d(32, affine=True)
        self.conv2 = ConvLayer(32, 64, kernel_size=3, stride=2)
        self.in2 = torch.nn.InstanceNorm2d(64, affine=True)
        self.conv3 = ConvLayer(64, 128, kernel_size=3, stride=2)
        self.in3 = torch.nn.InstanceNorm2d(128, affine=True)
        # Residual layers
        self.res1 = ResidualBlock(128)
        self.res2 = ResidualBlock(128)
        self.res3 = ResidualBlock(128)
        self.res4 = ResidualBlock(128)
        self.res5 = ResidualBlock(128)
        # Upsampling Layers
        self.deconv1 = UpsampleConvLayer(128, 64, kernel_size=3, stride=1, upsample=2)
        self.in4 = torch.nn.InstanceNorm2d(64, affine=True)
        self.deconv2 = UpsampleConvLayer(64, 32, kernel_size=3, stride=1, upsample=2)
        self.in5 = torch.nn.InstanceNorm2d(32, affine=True)
        self.deconv3 = ConvLayer(32, 3, kernel_size=9, stride=1)
        # Non-linearities
        self.relu = torch.nn.ReLU()

    def forward(self, X):
        y = self.relu(self.in1(self.conv1(X)))
        y = self.relu(self.in2(self.conv2(y)))
        y = self.relu(self.in3(self.conv3(y)))
        y = self.res1(y)
        y = self.res2(y)
        y = self.res3(y)
        y = self.res4(y)
        y = self.res5(y)
        y = self.relu(self.in4(self.deconv1(y)))
        y = self.relu(self.in5(self.deconv2(y)))
        y = self.deconv3(y)
        return y


class ConvLayer(torch.nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, stride):
        super(ConvLayer, self).__init__()
        reflection_padding = kernel_size // 2
        self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
        self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)

    def forward(self, x):
        out = self.reflection_pad(x)
        out = self.conv2d(out)
        return out


class ResidualBlock(torch.nn.Module):
    """ResidualBlock
    introduced in: https://arxiv.org/abs/1512.03385
    recommended architecture: http://torch.ch/blog/2016/02/04/resnets.html
    """

    def __init__(self, channels):
        super(ResidualBlock, self).__init__()
        self.conv1 = ConvLayer(channels, channels, kernel_size=3, stride=1)
        self.in1 = torch.nn.InstanceNorm2d(channels, affine=True)
        self.conv2 = ConvLayer(channels, channels, kernel_size=3, stride=1)
        self.in2 = torch.nn.InstanceNorm2d(channels, affine=True)
        self.relu = torch.nn.ReLU()

    def forward(self, x):
        residual = x
        out = self.relu(self.in1(self.conv1(x)))
        out = self.in2(self.conv2(out))
        out = out + residual
        return out


class UpsampleConvLayer(torch.nn.Module):
    """UpsampleConvLayer
    Upsamples the input and then does a convolution. This method gives better results
    compared to ConvTranspose2d.
    ref: http://distill.pub/2016/deconv-checkerboard/
    """

    def __init__(self, in_channels, out_channels, kernel_size, stride, upsample=None):
        super(UpsampleConvLayer, self).__init__()
        self.upsample = upsample
        if upsample:
            self.upsample_layer = torch.nn.Upsample(mode='nearest', scale_factor=upsample)
        reflection_padding = kernel_size // 2
        self.reflection_pad = torch.nn.ReflectionPad2d(reflection_padding)
        self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride)

    def forward(self, x):
        x_in = x
        if self.upsample:
            x_in = self.upsample_layer(x_in)
        out = self.reflection_pad(x_in)
        out = self.conv2d(out)
        return out


def stylize(args, comm):
    rank = comm.Get_rank()
    size = comm.Get_size()
    device = torch.device("cuda" if args.cuda else "cpu")
    with torch.no_grad():
        style_model = TransformerNet()
        state_dict = torch.load(os.path.join(args.model_dir, args.style + ".pth"))
        # remove saved deprecated running_* keys in InstanceNorm from the checkpoint
        for k in list(state_dict.keys()):
            if re.search(r'in\d+\.running_(mean|var)$', k):
                del state_dict[k]
        style_model.load_state_dict(state_dict)
        style_model.to(device)

        filenames = os.listdir(args.content_dir)
        filenames = sorted(filenames)
        # stride the sorted list so every file is assigned to exactly one rank,
        # including the remainder when the file count is not divisible by the world size
        partitioned_filenames = filenames[rank::size]
        print("RANK {} - is processing {} images out of the total {}".format(rank, len(partitioned_filenames),
                                                                             len(filenames)))

        output_paths = []
        for filename in partitioned_filenames:
            # print("Processing {}".format(filename))
            full_path = os.path.join(args.content_dir, filename)
            content_image = load_image(full_path, scale=args.content_scale)
            content_transform = transforms.Compose([
                transforms.ToTensor(),
                transforms.Lambda(lambda x: x.mul(255))
            ])
            content_image = content_transform(content_image)
            content_image = content_image.unsqueeze(0).to(device)
            output = style_model(content_image).cpu()
            output_path = os.path.join(args.output_dir, filename)
            save_image(output_path, output[0])
            output_paths.append(output_path)

        print("RANK {} - number of pre-aggregated output files {}".format(rank, len(output_paths)))

        output_paths_list = comm.gather(output_paths, root=0)
        if rank == 0:
            # flatten the per-rank lists gathered at root before counting
            aggregated_output_paths = [p for paths in output_paths_list for p in paths]
            print("RANK {} - number of aggregated output files {}".format(rank, len(aggregated_output_paths)))
        print("RANK {} - end".format(rank))


def main():
    arg_parser = argparse.ArgumentParser(description="parser for fast-neural-style")
    arg_parser.add_argument("--content-scale", type=float, default=None,
                            help="factor for scaling down the content image")
    arg_parser.add_argument("--model-dir", type=str, required=True,
                            help="saved model to be used for stylizing the image.")
    arg_parser.add_argument("--cuda", type=int, required=True,
                            help="set it to 1 for running on GPU, 0 for CPU")
    arg_parser.add_argument("--style", type=str, help="style name")
    arg_parser.add_argument("--content-dir", type=str, required=True,
                            help="directory holding the images")
    arg_parser.add_argument("--output-dir", type=str, required=True,
                            help="directory holding the output images")
    args = arg_parser.parse_args()

    comm = MPI.COMM_WORLD
    if args.cuda and not torch.cuda.is_available():
        print("ERROR: cuda is not available, try running on CPU")
        sys.exit(1)
    os.makedirs(args.output_dir, exist_ok=True)
    stylize(args, comm)


if __name__ == "__main__":
    main()
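Under MPI, each rank runs the same script and takes a strided share of the sorted file list, so launching it standalone is a matter of starting N copies (a sketch; the process count and directory names are illustrative):

```python
import subprocess

# mpirun starts 4 ranks; MPI.COMM_WORLD inside the script gives each rank its index
subprocess.run([
    "mpirun", "-np", "4",
    "python", "neural_style_mpi.py",
    "--model-dir", "saved_models",
    "--style", "mosaic",
    "--cuda", "0",
    "--content-dir", "input_images",
    "--output-dir", "output_images",
], check=True)
```

In the pipeline itself this launch is handled by `MpiStep`, as shown in the notebook below.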

View File

@@ -0,0 +1,728 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer-mpi.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Neural style transfer on video\n",
"Using modified code from `pytorch`'s neural style [example](https://pytorch.org/tutorials/advanced/neural_style_tutorial.html), we show how to setup a pipeline for doing style transfer on video. The pipeline has following steps:\n",
"1. Split a video into images\n",
"2. Run neural style on each image using one of the provided models (from `pytorch` pretrained models for this example).\n",
"3. Stitch the image back into a video."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"If you are using an Azure Machine Learning Notebook VM, you are all set. Otherwise, make sure you go through the configuration Notebook located at https://github.com/Azure/MachineLearningNotebooks first if you haven't. This sets you up with a working config file that has information on your workspace, subscription id, etc. "
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object from persisted configuration."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"from azureml.core import Workspace, Experiment\n",
"\n",
"ws = Workspace.from_config()\n",
"print('Workspace name: ' + ws.name, \n",
" 'Azure region: ' + ws.location, \n",
" 'Subscription id: ' + ws.subscription_id, \n",
" 'Resource group: ' + ws.resource_group, sep = '\\n')\n",
"\n",
"scripts_folder = \"mpi_scripts\"\n",
"\n",
"if not os.path.isdir(scripts_folder):\n",
" os.mkdir(scripts_folder)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.compute import AmlCompute, ComputeTarget\n",
"from azureml.core.datastore import Datastore\n",
"from azureml.data.data_reference import DataReference\n",
"from azureml.pipeline.core import Pipeline, PipelineData\n",
"from azureml.pipeline.steps import PythonScriptStep, MpiStep\n",
"from azureml.core.runconfig import CondaDependencies, RunConfiguration\n",
"from azureml.core.compute_target import ComputeTargetException"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Create or use existing compute"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# AmlCompute\n",
"cpu_cluster_name = \"cpu-cluster\"\n",
"try:\n",
" cpu_cluster = AmlCompute(ws, cpu_cluster_name)\n",
" print(\"found existing cluster.\")\n",
"except ComputeTargetException:\n",
" print(\"creating new cluster\")\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_D2_v2\",\n",
" max_nodes = 1)\n",
"\n",
" # create the cluster\n",
" cpu_cluster = ComputeTarget.create(ws, cpu_cluster_name, provisioning_config)\n",
" cpu_cluster.wait_for_completion(show_output=True)\n",
" \n",
"# AmlCompute\n",
"gpu_cluster_name = \"gpu-cluster\"\n",
"try:\n",
" gpu_cluster = AmlCompute(ws, gpu_cluster_name)\n",
" print(\"found existing cluster.\")\n",
"except ComputeTargetException:\n",
" print(\"creating new cluster\")\n",
" provisioning_config = AmlCompute.provisioning_configuration(vm_size = \"STANDARD_NC6\",\n",
" max_nodes = 3)\n",
"\n",
" # create the cluster\n",
" gpu_cluster = ComputeTarget.create(ws, gpu_cluster_name, provisioning_config)\n",
" gpu_cluster.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Python Scripts\n",
"We use an edited version of `neural_style_mpi.py` (original is [here](https://github.com/pytorch/examples/blob/master/fast_neural_style/neural_style/neural_style.py)). Scripts to split and stitch the video are thin wrappers to calls to `ffmpeg`. These scripts are also located in the \"scripts_folder\".\n",
"\n",
"We install `ffmpeg` through conda dependencies."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile $scripts_folder/process_video.py\n",
"import argparse\n",
"import glob\n",
"import os\n",
"import subprocess\n",
"\n",
"parser = argparse.ArgumentParser(description=\"Process input video\")\n",
"parser.add_argument('--input_video', required=True)\n",
"parser.add_argument('--output_audio', required=True)\n",
"parser.add_argument('--output_images', required=True)\n",
"\n",
"args = parser.parse_args()\n",
"\n",
"os.makedirs(args.output_audio, exist_ok=True)\n",
"os.makedirs(args.output_images, exist_ok=True)\n",
"\n",
"subprocess.run(\"ffmpeg -i {} {}/video.aac\"\n",
" .format(args.input_video, args.output_audio),\n",
" shell=True, check=True\n",
" )\n",
"\n",
"subprocess.run(\"ffmpeg -i {} {}/%05d_video.jpg -hide_banner\"\n",
" .format(args.input_video, args.output_images),\n",
" shell=True, check=True\n",
" )"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%writefile $scripts_folder/stitch_video.py\n",
"import argparse\n",
"import os\n",
"import subprocess\n",
"\n",
"parser = argparse.ArgumentParser(description=\"Process input video\")\n",
"parser.add_argument('--images_dir', required=True)\n",
"parser.add_argument('--input_audio', required=True)\n",
"parser.add_argument('--output_dir', required=True)\n",
"\n",
"args = parser.parse_args()\n",
"\n",
"os.makedirs(args.output_dir, exist_ok=True)\n",
"\n",
"subprocess.run(\"ffmpeg -framerate 30 -i {}/%05d_video.jpg -c:v libx264 -profile:v high -crf 20 -pix_fmt yuv420p \"\n",
" \"-y {}/video_without_audio.mp4\"\n",
" .format(args.images_dir, args.output_dir),\n",
" shell=True, check=True\n",
" )\n",
"\n",
"subprocess.run(\"ffmpeg -i {}/video_without_audio.mp4 -i {}/video.aac -map 0:0 -map 1:0 -vcodec \"\n",
" \"copy -acodec copy -y {}/video_with_audio.mp4\"\n",
" .format(args.output_dir, args.input_audio, args.output_dir),\n",
" shell=True, check=True\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The sample video **organutan.mp4** is stored at a publicly shared datastore. We are registering the datastore below. If you want to take a look at the original video, click here. (https://pipelinedata.blob.core.windows.net/sample-videos/orangutan.mp4)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# datastore for input video\n",
"account_name = \"pipelinedata\"\n",
"video_ds = Datastore.register_azure_blob_container(ws, \"videos\", \"sample-videos\",\n",
" account_name=account_name, overwrite=True)\n",
"\n",
"# datastore for models\n",
"models_ds = Datastore.register_azure_blob_container(ws, \"models\", \"styletransfer\", \n",
" account_name=\"pipelinedata\", \n",
" overwrite=True)\n",
" \n",
"# downloaded models from https://pytorch.org/tutorials/advanced/neural_style_tutorial.html are kept here\n",
"models_dir = DataReference(data_reference_name=\"models\", datastore=models_ds, \n",
" path_on_datastore=\"saved_models\", mode=\"download\")\n",
"\n",
"# the default blob store attached to a workspace\n",
"default_datastore = ws.get_default_datastore()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Sample video"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"video_name=os.getenv(\"STYLE_TRANSFER_VIDEO_NAME\", \"orangutan.mp4\") \n",
"orangutan_video = DataReference(datastore=video_ds,\n",
" data_reference_name=\"video\",\n",
" path_on_datastore=video_name, mode=\"download\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"cd = CondaDependencies()\n",
"\n",
"cd.add_channel(\"conda-forge\")\n",
"cd.add_conda_package(\"ffmpeg\")\n",
"\n",
"cd.add_channel(\"pytorch\")\n",
"cd.add_conda_package(\"pytorch\")\n",
"cd.add_conda_package(\"torchvision\")\n",
"\n",
"# Runconfig\n",
"amlcompute_run_config = RunConfiguration(conda_dependencies=cd)\n",
"amlcompute_run_config.environment.docker.enabled = True\n",
"amlcompute_run_config.environment.docker.base_image = \"pytorch/pytorch\"\n",
"amlcompute_run_config.environment.spark.precache_packages = False"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"ffmpeg_audio = PipelineData(name=\"ffmpeg_audio\", datastore=default_datastore)\n",
"ffmpeg_images = PipelineData(name=\"ffmpeg_images\", datastore=default_datastore)\n",
"processed_images = PipelineData(name=\"processed_images\", datastore=default_datastore)\n",
"output_video = PipelineData(name=\"output_video\", datastore=default_datastore)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Define tweakable parameters to pipeline\n",
"These parameters can be changed when the pipeline is published and rerun from a REST call"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core.graph import PipelineParameter\n",
"# create a parameter for style (one of \"candy\", \"mosaic\", \"rain_princess\", \"udnie\") to transfer the images to\n",
"style_param = PipelineParameter(name=\"style\", default_value=\"mosaic\")\n",
"# create a parameter for the number of nodes to use in step no. 2 (style transfer)\n",
"nodecount_param = PipelineParameter(name=\"nodecount\", default_value=1)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"split_video_step = PythonScriptStep(\n",
" name=\"split video\",\n",
" script_name=\"process_video.py\",\n",
" arguments=[\"--input_video\", orangutan_video,\n",
" \"--output_audio\", ffmpeg_audio,\n",
" \"--output_images\", ffmpeg_images,\n",
" ],\n",
" compute_target=cpu_cluster,\n",
" inputs=[orangutan_video],\n",
" outputs=[ffmpeg_images, ffmpeg_audio],\n",
" runconfig=amlcompute_run_config,\n",
" source_directory=scripts_folder\n",
")\n",
"\n",
"# create a MPI step for distributing style transfer step across multiple nodes in AmlCompute \n",
"# using 'nodecount_param' PipelineParameter\n",
"distributed_style_transfer_step = MpiStep(\n",
" name=\"mpi style transfer\",\n",
" script_name=\"neural_style_mpi.py\",\n",
" arguments=[\"--content-dir\", ffmpeg_images,\n",
" \"--output-dir\", processed_images,\n",
" \"--model-dir\", models_dir,\n",
" \"--style\", style_param,\n",
" \"--cuda\", 1\n",
" ],\n",
" compute_target=gpu_cluster,\n",
" node_count=nodecount_param, \n",
" process_count_per_node=1,\n",
" inputs=[models_dir, ffmpeg_images],\n",
" outputs=[processed_images],\n",
" pip_packages=[\"mpi4py\", \"torch\", \"torchvision\"],\n",
" use_gpu=True,\n",
" source_directory=scripts_folder\n",
")\n",
"\n",
"stitch_video_step = PythonScriptStep(\n",
" name=\"stitch\",\n",
" script_name=\"stitch_video.py\",\n",
" arguments=[\"--images_dir\", processed_images, \n",
" \"--input_audio\", ffmpeg_audio, \n",
" \"--output_dir\", output_video],\n",
" compute_target=cpu_cluster,\n",
" inputs=[processed_images, ffmpeg_audio],\n",
" outputs=[output_video],\n",
" runconfig=amlcompute_run_config,\n",
" source_directory=scripts_folder\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Run the pipeline"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline = Pipeline(workspace=ws, steps=[stitch_video_step])\n",
"# submit the pipeline and provide values for the PipelineParameters used in the pipeline\n",
"pipeline_run = Experiment(ws, 'style_transfer').submit(pipeline, pipeline_parameters={\"style\": \"mosaic\", \"nodecount\": 3})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Monitor using widget"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.widgets import RunDetails\n",
"RunDetails(pipeline_run).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Downloads the video in `output_video` folder"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Download output video"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"def download_video(run, target_dir=None):\n",
" stitch_run = run.find_step_run(\"stitch\")[0]\n",
" port_data = stitch_run.get_output_data(\"output_video\")\n",
" port_data.download(target_dir, show_progress=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"pipeline_run.wait_for_completion()\n",
"download_video(pipeline_run, \"output_video_mosaic\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Publish pipeline"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline = pipeline_run.publish_pipeline(\n",
" name=\"batch score style transfer\", description=\"style transfer\", version=\"1.0\")\n",
"\n",
"published_pipeline"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get published pipeline\n",
"\n",
"You can get the published pipeline using **pipeline id**.\n",
"\n",
"To get all the published pipelines for a given workspace(ws): \n",
"```css\n",
"all_pub_pipelines = PublishedPipeline.get_all(ws)\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core import PublishedPipeline\n",
"\n",
"pipeline_id = published_pipeline.id # use your published pipeline id\n",
"published_pipeline = PublishedPipeline.get(ws, pipeline_id)\n",
"\n",
"published_pipeline"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Re-run pipeline through REST calls for other styles"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get AAD token\n",
"[This notebook](https://aka.ms/pl-restep-auth) shows how to authenticate to AML workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core.authentication import InteractiveLoginAuthentication\n",
"import requests\n",
"\n",
"auth = InteractiveLoginAuthentication()\n",
"aad_token = auth.get_authentication_header()\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Get endpoint URL"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"rest_endpoint = published_pipeline.endpoint"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Send request and monitor"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Run the pipeline using PipelineParameter values style='candy' and nodecount=2"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": \"style_transfer\",\n",
" \"ParameterAssignments\": {\"style\": \"candy\", \"nodecount\": 2}})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.pipeline.core.run import PipelineRun\n",
"published_pipeline_run_candy = PipelineRun(ws.experiments[\"style_transfer\"], run_id)\n",
"RunDetails(published_pipeline_run_candy).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Run the pipeline using PipelineParameter values style='rain_princess' and nodecount=3"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": \"style_transfer\",\n",
" \"ParameterAssignments\": {\"style\": \"rain_princess\", \"nodecount\": 3}})"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_rain = PipelineRun(ws.experiments[\"style_transfer\"], run_id)\n",
"RunDetails(published_pipeline_run_rain).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Run the pipeline using PipelineParameter values style='udnie' and nodecount=4"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"response = requests.post(rest_endpoint, \n",
" headers=aad_token,\n",
" json={\"ExperimentName\": \"style_transfer\",\n",
" \"ParameterAssignments\": {\"style\": \"udnie\", \"nodecount\": 3}})\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"try:\n",
" response.raise_for_status()\n",
"except Exception: \n",
" raise Exception('Received bad response from the endpoint: {}\\n'\n",
" 'Response Code: {}\\n'\n",
" 'Headers: {}\\n'\n",
" 'Content: {}'.format(rest_endpoint, response.status_code, response.headers, response.content))\n",
"\n",
"run_id = response.json().get('Id')\n",
"print('Submitted pipeline run: ', run_id)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_udnie = PipelineRun(ws.experiments[\"style_transfer\"], run_id)\n",
"RunDetails(published_pipeline_run_udnie).show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Download output from re-run"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"published_pipeline_run_candy.wait_for_completion()\n",
"published_pipeline_run_rain.wait_for_completion()\n",
"published_pipeline_run_udnie.wait_for_completion()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"download_video(published_pipeline_run_candy, target_dir=\"output_video_candy\")\n",
"download_video(published_pipeline_run_rain, target_dir=\"output_video_rain_princess\")\n",
"download_video(published_pipeline_run_udnie, target_dir=\"output_video_udnie\")"
]
}
],
"metadata": {
"authors": [
{
"name": "balapv mabables"
}
],
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.7"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -1,4 +1,4 @@
name: pipeline-style-transfer name: pipeline-style-transfer-mpi
dependencies: dependencies:
- pip: - pip:
- azureml-sdk - azureml-sdk

View File

@@ -13,7 +13,7 @@
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},
"source": [ "source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer.png)" "![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer-parallel-run.png)"
] ]
}, },
{ {

View File

@@ -0,0 +1,7 @@
name: pipeline-style-transfer-parallel-run
dependencies:
- pip:
  - azureml-sdk
  - azureml-pipeline-steps
  - azureml-widgets
  - requests

View File

@@ -456,6 +456,24 @@
"monitor.enable_schedule()" "monitor.enable_schedule()"
] ]
}, },
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Delete the DataDriftDetector\n",
"\n",
"Invoking the `delete()` method on the object deletes the the drift monitor permanently and cannot be undone. You will no longer be able to find it in the UI and the `list()` or `get()` methods. The object on which delete() was called will have its state set to deleted and name suffixed with deleted. The baseline and target datasets and model data that was collected, if any, are not deleted. The compute is not deleted. The DataDrift schedule pipeline is disabled and archived."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"monitor.delete()"
]
},
{ {
"cell_type": "markdown", "cell_type": "markdown",
"metadata": {}, "metadata": {},

View File

@@ -86,7 +86,7 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"import azureml.core\n", "import azureml.core\n",
"print(\"Azure Machine Learning SDK Version: \", azureml.core.VERSION)" "print(\"Azure Machine Learning SDK Version:\", azureml.core.VERSION)"
] ]
}, },
{ {
@@ -128,24 +128,12 @@
"source": [ "source": [
"import os.path\n", "import os.path\n",
"\n", "\n",
"\n",
"# Get information about the currently running compute instance (notebook VM), like its name and prefix.\n", "# Get information about the currently running compute instance (notebook VM), like its name and prefix.\n",
"def load_nbvm():\n", "def load_nbvm():\n",
" if not os.path.isfile(\"/mnt/azmnt/.nbvm\"):\n", " if not os.path.isfile(\"/mnt/azmnt/.nbvm\"):\n",
" return None\n", " return None\n",
" with open(\"/mnt/azmnt/.nbvm\", 'r') as file:\n", " with open(\"/mnt/azmnt/.nbvm\", 'r') as nbvm_file:\n",
" return {key:value for (key, value) in [line.strip().split('=') for line in file]}\n", " return {key:value for (key, value) in line.strip().split('=') for line in nbvm_file}\n"
"\n",
"\n",
"# Get information about the capabilities of an azureml.core.compute.AmlCompute target\n",
"# In particular how much RAM + GPU + HDD it has.\n",
"def get_compute_size(self, workspace):\n",
" for size in self.supported_vmsizes(workspace):\n",
" if(size['name'].upper() == self.vm_size):\n",
" return size\n",
"\n",
"azureml.core.compute.ComputeTarget.size = get_compute_size\n",
"del(get_compute_size)"
] ]
}, },
{ {
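Putting the corrected right-hand side together, the helper reads the compute instance's `key=value` metadata file (a sketch; the fields present in `/mnt/azmnt/.nbvm` vary by instance):

```python
import os.path

def load_nbvm():
    """Return the current compute instance's metadata as a dict, or None off-instance."""
    if not os.path.isfile("/mnt/azmnt/.nbvm"):
        return None
    with open("/mnt/azmnt/.nbvm", 'r') as nbvm_file:
        # each line is "key=value"; split on the first '=' to be safe
        return dict(line.strip().split('=', 1) for line in nbvm_file if line.strip())
```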
@@ -161,7 +149,7 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.core.compute import ComputeTarget, ComputeInstance\n", "from azureml.core.compute import ComputeInstance\n",
"from azureml.core.compute_target import ComputeTargetException\n", "from azureml.core.compute_target import ComputeTargetException\n",
"\n", "\n",
"# Load current compute instance info\n", "# Load current compute instance info\n",
@@ -188,9 +176,7 @@
"compute_target = ws.compute_targets[instance_name]\n", "compute_target = ws.compute_targets[instance_name]\n",
"\n", "\n",
"print(\"Compute target status:\")\n", "print(\"Compute target status:\")\n",
"print(compute_target.get_status().serialize())\n", "print(compute_target.get_status().serialize())\n"
"print(\"Compute target size:\")\n",
"print(compute_target.size(ws))"
] ]
}, },
{ {
@@ -525,7 +511,6 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"# Find checkpoints and last checkpoint number\n", "# Find checkpoints and last checkpoint number\n",
"from os import path\n",
"checkpoint_files = [\n", "checkpoint_files = [\n",
" os.path.basename(file) for file in training_artifacts_ds.to_path() \\\n", " os.path.basename(file) for file in training_artifacts_ds.to_path() \\\n",
" if os.path.basename(file).startswith('checkpoint-') and \\\n", " if os.path.basename(file).startswith('checkpoint-') and \\\n",
@@ -629,8 +614,6 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.widgets import RunDetails\n",
"\n",
"RunDetails(rollout_run).show()" "RunDetails(rollout_run).show()"
] ]
}, },

View File

@@ -4,4 +4,3 @@ dependencies:
- azureml-sdk - azureml-sdk
- azureml-contrib-reinforcementlearning - azureml-contrib-reinforcementlearning
- azureml-widgets - azureml-widgets
- azureml-dataprep

View File

@@ -87,7 +87,7 @@
"source": [ "source": [
"import azureml.core\n", "import azureml.core\n",
"\n", "\n",
"print(\"Azure Machine Learning SDK Version: \", azureml.core.VERSION)" "print(\"Azure Machine Learning SDK Version:\", azureml.core.VERSION)"
] ]
}, },
{ {
@@ -248,9 +248,8 @@
" # Ray's video capture support requires to run everything under a headless display driver called (xvfb).\n", " # Ray's video capture support requires to run everything under a headless display driver called (xvfb).\n",
" # There are two parts to this:\n", " # There are two parts to this:\n",
" # 1. Use a custom docker file with proper instructions to install xvfb, ffmpeg, python-opengl\n", " # 1. Use a custom docker file with proper instructions to install xvfb, ffmpeg, python-opengl\n",
" # and other dependencies. \n", " # and other dependencies.\n",
" # TODO: Add these instructions to default reinforcement learning base image and drop this docker file.\n", " \n",
" \n",
" with open(\"files/docker/Dockerfile\", \"r\") as f:\n", " with open(\"files/docker/Dockerfile\", \"r\") as f:\n",
" dockerfile=f.read()\n", " dockerfile=f.read()\n",
"\n", "\n",
@@ -546,11 +545,7 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"import os\n",
"from os import path\n",
"from distutils import dir_util\n",
"import shutil\n", "import shutil\n",
"from files.utils import misc\n",
"\n", "\n",
"# A helper function to download (copy) movies from a dataset to local directory\n", "# A helper function to download (copy) movies from a dataset to local directory\n",
"def download_movies(artifacts_ds, movies, destination):\n", "def download_movies(artifacts_ds, movies, destination):\n",
@@ -560,7 +555,7 @@
" dir_util.mkpath(destination)\n", " dir_util.mkpath(destination)\n",
" \n", " \n",
" try:\n", " try:\n",
" pirnt(\"Trying mounting dataset and copying movies.\")\n", " print(\"Trying mounting dataset and copying movies.\")\n",
" # Note: We assume movie paths start with '\\'\n", " # Note: We assume movie paths start with '\\'\n",
" mount_context = artifacts_ds.mount()\n", " mount_context = artifacts_ds.mount()\n",
" mount_context.start()\n", " mount_context.start()\n",
@@ -568,11 +563,11 @@
" print('Copying {} ...'.format(movie))\n", " print('Copying {} ...'.format(movie))\n",
" shutil.copy2(path.join(mount_context.mount_point, movie[1:]), destination)\n", " shutil.copy2(path.join(mount_context.mount_point, movie[1:]), destination)\n",
" mount_context.stop()\n", " mount_context.stop()\n",
" except:\n", " except OSError as e:\n",
" print(\"Mounting failed! Going with dataset download.\")\n", " print(\"Mounting failed with error '{0}'. Going with dataset download.\".format(e))\n",
" for i, file in enumerate(artifacts_ds.to_path()):\n", " for i, artifact in enumerate(artifacts_ds.to_path()):\n",
" if file in movies:\n", " if artifact in movies:\n",
" print('Downloading {} ...'.format(file))\n", " print('Downloading {} ...'.format(artifact))\n",
" artifacts_ds.skip(i).take(1).download(target_path=destination, overwrite=True)\n", " artifacts_ds.skip(i).take(1).download(target_path=destination, overwrite=True)\n",
" \n", " \n",
" print('Downloading movies completed!')\n", " print('Downloading movies completed!')\n",
@@ -581,14 +576,14 @@
"# A helper function to find movies in a directory\n", "# A helper function to find movies in a directory\n",
"def find_movies(movie_path):\n", "def find_movies(movie_path):\n",
" print(\"Looking in path:\", movie_path)\n", " print(\"Looking in path:\", movie_path)\n",
" mp4_files = []\n", " mp4_movies = []\n",
" for root, _, files in os.walk(movie_path):\n", " for root, _, files in os.walk(movie_path):\n",
" for file in files:\n", " for name in files:\n",
" if file.endswith('.mp4'):\n", " if name.endswith('.mp4'):\n",
" mp4_files.append(path.join(root, file))\n", " mp4_movies.append(path.join(root, name))\n",
" print('Found {} movies'.format(len(mp4_files)))\n", " print('Found {} movies'.format(len(mp4_movies)))\n",
"\n", "\n",
" return mp4_files\n", " return mp4_movies\n",
"\n", "\n",
"\n", "\n",
"# A helper function to display a movie\n", "# A helper function to display a movie\n",
@@ -718,7 +713,6 @@
"outputs": [], "outputs": [],
"source": [ "source": [
"# Find checkpoints and last checkpoint number\n", "# Find checkpoints and last checkpoint number\n",
"from os import path\n",
"checkpoint_files = [\n", "checkpoint_files = [\n",
" os.path.basename(file) for file in training_artifacts_ds.to_path() \\\n", " os.path.basename(file) for file in training_artifacts_ds.to_path() \\\n",
" if os.path.basename(file).startswith('checkpoint-') and \\\n", " if os.path.basename(file).startswith('checkpoint-') and \\\n",
@@ -783,7 +777,6 @@
"# 1. Use a custom docker file with proper instructions to install xvfb, ffmpeg, python-opengl\n", "# 1. Use a custom docker file with proper instructions to install xvfb, ffmpeg, python-opengl\n",
"# and other dependencies.\n", "# and other dependencies.\n",
"# Note: Even when the rendering is off pyhton-opengl is needed.\n", "# Note: Even when the rendering is off pyhton-opengl is needed.\n",
"# TODO: Add these instructions to default reinforcement learning base image and drop this docker file.\n",
"\n", "\n",
"with open(\"files/docker/Dockerfile\", \"r\") as f:\n", "with open(\"files/docker/Dockerfile\", \"r\") as f:\n",
" dockerfile=f.read()\n", " dockerfile=f.read()\n",
@@ -852,8 +845,6 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.widgets import RunDetails\n",
"\n",
"RunDetails(rollout_run).show()" "RunDetails(rollout_run).show()"
] ]
}, },
@@ -890,8 +881,6 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from azureml.core import Dataset\n",
"\n",
"# Get a handle to child run\n", "# Get a handle to child run\n",
"child_runs = list(rollout_run.get_children())\n", "child_runs = list(rollout_run.get_children())\n",
"print('Number of child runs:', len(child_runs))\n", "print('Number of child runs:', len(child_runs))\n",
@@ -971,9 +960,6 @@
"metadata": {}, "metadata": {},
"outputs": [], "outputs": [],
"source": [ "source": [
"from os import path\n",
"from distutils import dir_util\n",
"\n",
"# To archive the created experiment:\n", "# To archive the created experiment:\n",
"#exp.archive()\n", "#exp.archive()\n",
"\n", "\n",

View File

@@ -4,4 +4,3 @@ dependencies:
- azureml-sdk - azureml-sdk
- azureml-contrib-reinforcementlearning - azureml-contrib-reinforcementlearning
- azureml-widgets - azureml-widgets
- azureml-dataprep

View File

@@ -15,3 +15,9 @@ def on_train_result(info):
run.log( run.log(
name='episodes_total', name='episodes_total',
value=info["result"]["episodes_total"]) value=info["result"]["episodes_total"])
    run.log(
        name='perf_cpu_percent',
        value=info["result"]["perf"]["cpu_util_percent"])
    run.log(
        name='perf_memory_percent',
        value=info["result"]["perf"]["ram_util_percent"])
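These `run.log()` calls forward Ray's per-iteration metrics to the AzureML run. In the Ray 0.8-era callbacks API that these RL notebooks target, a function like `on_train_result` is registered through the trainer config (a sketch; the surrounding config keys are illustrative and the exact wiring depends on the Ray version in use):

```python
# excerpt of an RLlib trainer config that wires up the callback above;
# Ray calls on_train_result(info) once per training iteration
config = {
    "env": "CartPole-v0",  # illustrative environment name
    "callbacks": {
        "on_train_result": on_train_result,
    },
}
```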

View File

@@ -100,7 +100,7 @@
"\n", "\n",
"# Check core SDK version number\n", "# Check core SDK version number\n",
"\n", "\n",
"print(\"This notebook was created using SDK version 1.7.0, you are currently running version\", azureml.core.VERSION)" "print(\"This notebook was created using SDK version 1.9.0, you are currently running version\", azureml.core.VERSION)"
] ]
}, },
{ {

View File

@@ -92,7 +92,7 @@
" # Specify the configuration for the new cluster\n", " # Specify the configuration for the new cluster\n",
" compute_config = AmlCompute.provisioning_configuration(vm_size=\"STANDARD_D2_V2\",\n", " compute_config = AmlCompute.provisioning_configuration(vm_size=\"STANDARD_D2_V2\",\n",
" min_nodes=0,\n", " min_nodes=0,\n",
" max_nodes=1)\n", " max_nodes=2)\n",
"\n", "\n",
" # Create the cluster with the specified name and configuration\n", " # Create the cluster with the specified name and configuration\n",
" cpu_cluster = ComputeTarget.create(ws, cpu_cluster_name, compute_config)\n", " cpu_cluster = ComputeTarget.create(ws, cpu_cluster_name, compute_config)\n",

View File

@@ -6,9 +6,14 @@ from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split from sklearn.model_selection import train_test_split
from azureml.core.run import Run from azureml.core.run import Run
from sklearn.externals import joblib
import os import os
import numpy as np import numpy as np
from sklearn import __version__ as sklearnver
from packaging.version import Version
if Version(sklearnver) < Version("0.23.0"):
    from sklearn.externals import joblib
else:
    import joblib
os.makedirs('./outputs', exist_ok=True) os.makedirs('./outputs', exist_ok=True)

View File

@@ -0,0 +1,374 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Copyright (c) Microsoft Corporation. All rights reserved.\n",
"\n",
"Licensed under the MIT License."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"![Impressions](https://PixelServer20190423114238.azurewebsites.net/api/impressions/MachineLearningNotebooks/how-to-use-azureml/training/train-on-amlcompute/train-on-computeinstance.png)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Train using Azure Machine Learning Compute Instance\n",
"\n",
"* Initialize Workspace\n",
"* Introduction to ComputeInstance\n",
"* Create an Experiment\n",
"* Submit ComputeInstance run\n",
"* Additional operations to perform on ComputeInstance"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Prerequisites\n",
"If you are using an Azure Machine Learning ComputeInstance, you are all set. Otherwise, go through the [configuration](../../../configuration.ipynb) notebook first if you haven't already to establish your connection to the AzureML Workspace."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Check core SDK version number\n",
"import azureml.core\n",
"\n",
"print(\"SDK version:\", azureml.core.VERSION)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Initialize Workspace\n",
"\n",
"Initialize a workspace object"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"create workspace"
]
},
"outputs": [],
"source": [
"from azureml.core import Workspace\n",
"\n",
"ws = Workspace.from_config()\n",
"print(ws.name, ws.resource_group, ws.location, ws.subscription_id, sep = '\\n')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Introduction to ComputeInstance\n",
"\n",
"\n",
"Azure Machine Learning compute instance is a fully-managed cloud-based workstation optimized for your machine learning development environment. It is created **within your workspace region**.\n",
"\n",
"For more information on ComputeInstance, please read [this article](https://docs.microsoft.com/en-us/azure/machine-learning/concept-compute-instance)\n",
"\n",
"**Note**: As with other Azure services, there are limits on certain resources (for eg. AmlCompute quota) associated with the Azure Machine Learning service. Please read [this article](https://docs.microsoft.com/azure/machine-learning/service/how-to-manage-quotas) on the default limits and how to request more quota."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create ComputeInstance\n",
"First lets check which VM families are available in your region. Azure is a regional service and some specialized SKUs (especially GPUs) are only available in certain regions. Since ComputeInstance is created in the region of your workspace, we will use the supported_vms () function to see if the VM family we want to use ('STANDARD_D3_V2') is supported.\n",
"\n",
"You can also pass a different region to check availability and then re-create your workspace in that region through the [configuration notebook](../../../configuration.ipynb)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"msdoc": "how-to-auto-train-remote.md",
"name": "check_region"
},
"outputs": [],
"source": [
"from azureml.core.compute import ComputeTarget, ComputeInstance\n",
"\n",
"ComputeInstance.supported_vmsizes(workspace = ws)\n",
"# ComputeInstance.supported_vmsizes(workspace = ws, location='eastus')"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"msdoc": "how-to-auto-train-remote.md",
"name": "create_instance"
},
"outputs": [],
"source": [
"import datetime\n",
"import time\n",
"\n",
"from azureml.core.compute import ComputeTarget, ComputeInstance\n",
"from azureml.core.compute_target import ComputeTargetException\n",
"\n",
"# Choose a name for your instance\n",
"compute_name = \"compute-instance\"\n",
"\n",
"# Verify that instance does not exist already\n",
"try:\n",
" instance = ComputeInstance(workspace=ws, name=compute_name)\n",
" print('Found existing instance, use it.')\n",
"except ComputeTargetException:\n",
" compute_config = ComputeInstance.provisioning_configuration(\n",
" vm_size='STANDARD_D3_V2',\n",
" ssh_public_access=False,\n",
" # vnet_resourcegroup_name='<my-resource-group>',\n",
" # vnet_name='<my-vnet-name>',\n",
" # subnet_name='default',\n",
" # admin_user_ssh_public_key='<my-sshkey>'\n",
" )\n",
" instance = ComputeInstance.create(ws, compute_name, compute_config)\n",
" instance.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Create An Experiment\n",
"\n",
"**Experiment** is a logical container in an Azure ML Workspace. It hosts run records which can include run metrics and output artifacts from your experiments."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Experiment\n",
"experiment_name = 'train-on-computeinstance'\n",
"experiment = Experiment(workspace = ws, name = experiment_name)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Submit ComputeInstance run\n",
"The training script `train.py` is already created for you"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Create environment\n",
"\n",
"Create Docker based environment with scikit-learn installed."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import Environment\n",
"from azureml.core.conda_dependencies import CondaDependencies\n",
"\n",
"myenv = Environment(\"myenv\")\n",
"\n",
"myenv.docker.enabled = True\n",
"myenv.python.conda_dependencies = CondaDependencies.create(conda_packages=['scikit-learn'])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Configure & Run"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from azureml.core import ScriptRunConfig\n",
"from azureml.core.runconfig import DEFAULT_CPU_IMAGE\n",
"\n",
"src = ScriptRunConfig(source_directory='', script='train.py')\n",
"\n",
"# Set compute target to the one created in previous step\n",
"src.run_config.target = instance\n",
"\n",
"# Set environment\n",
"src.run_config.environment = myenv\n",
" \n",
"run = experiment.submit(config=src)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note: if you need to cancel a run, you can follow [these instructions](https://aka.ms/aml-docs-cancel-run)."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"%%time\n",
"# Shows output of the run on stdout.\n",
"run.wait_for_completion(show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(run.get_metrics())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Additional operations to perform on ComputeInstance\n",
"\n",
"You can perform more operations on ComputeInstance such as get status, change the state or deleting the compute."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"msdoc": "how-to-auto-train-remote.md",
"name": "get_status"
},
"outputs": [],
"source": [
"# get_status() gets the latest status of the ComputeInstance target\n",
"instance.get_status()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"msdoc": "how-to-auto-train-remote.md",
"name": "stop"
},
"outputs": [],
"source": [
"# stop() is used to stop the ComputeInstance\n",
"# Stopping ComputeInstance will stop the billing meter and persist the state on the disk.\n",
"# Available Quota will not be changed with this operation.\n",
"instance.stop(wait_for_completion=True, show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"msdoc": "how-to-auto-train-remote.md",
"name": "start"
},
"outputs": [],
"source": [
"# start() is used to start the ComputeInstance if it is in stopped state\n",
"instance.start(wait_for_completion=True, show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# restart() is used to restart the ComputeInstance\n",
"instance.restart(wait_for_completion=True, show_output=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# delete() is used to delete the ComputeInstance target. Useful if you want to re-use the compute name \n",
"# instance.delete()"
]
}
],
"metadata": {
"authors": [
{
"name": "ramagott"
}
],
"category": "training",
"compute": [
"Compute Instance"
],
"datasets": [
"Diabetes"
],
"deployment": [
"None"
],
"exclude_from_index": false,
"framework": [
"None"
],
"friendly_name": "Train on Azure Machine Learning Compute Instance",
"index_order": 1,
"kernelspec": {
"display_name": "Python 3.6",
"language": "python",
"name": "python36"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.7"
},
"tags": [
"None"
],
"task": "Submit a run on Azure Machine Learning Compute Instance."
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@@ -0,0 +1,6 @@
name: train-on-computeinstance
dependencies:
- scikit-learn
- pip:
  - azureml-sdk
  - azureml-widgets

View File

@@ -0,0 +1,48 @@
# Copyright (c) Microsoft. All rights reserved.
# Licensed under the MIT license.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from azureml.core.run import Run
import os
import numpy as np
# sklearn.externals.joblib is removed in 0.23
try:
    from sklearn.externals import joblib
except ImportError:
    import joblib

os.makedirs('./outputs', exist_ok=True)

X, y = load_diabetes(return_X_y=True)

run = Run.get_context()

X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                    test_size=0.2,
                                                    random_state=0)
data = {"train": {"X": X_train, "y": y_train},
        "test": {"X": X_test, "y": y_test}}

# candidate alphas: 0.0 to 0.95 in steps of 0.05 (np.arange excludes the stop value)
alphas = np.arange(0.0, 1.0, 0.05)

for alpha in alphas:
    # Use the Ridge algorithm to create a regression model
    reg = Ridge(alpha=alpha)
    reg.fit(data["train"]["X"], data["train"]["y"])

    preds = reg.predict(data["test"]["X"])
    mse = mean_squared_error(data["test"]["y"], preds)
    run.log('alpha', alpha)
    run.log('mse', mse)

    # save the model in the outputs folder so it automatically gets uploaded
    model_file_name = 'ridge_{0:.2f}.pkl'.format(alpha)
    joblib.dump(value=reg, filename=os.path.join('./outputs/',
                                                 model_file_name))

    print('alpha is {0:.2f}, and mse is {1:0.2f}'.format(alpha, mse))
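
A hedged sketch of how a script like this is typically submitted to the compute instance created in the notebook above; the names `ws`, `instance`, and the experiment name are assumptions:

from azureml.core import Experiment, ScriptRunConfig

# Point the run at the training script and target the compute instance
src = ScriptRunConfig(source_directory='.', script='train.py')
src.run_config.target = instance  # assumed ComputeInstance object from the notebook

run = Experiment(workspace=ws, name='train-on-computeinstance').submit(src)
run.wait_for_completion(show_output=True)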

View File

@@ -6,10 +6,14 @@ from sklearn.linear_model import Ridge
 from sklearn.metrics import mean_squared_error
 from sklearn.model_selection import train_test_split
 from azureml.core.run import Run
-from sklearn.externals import joblib
 import os
 import numpy as np
 import mylib
+# sklearn.externals.joblib is removed in 0.23
+try:
+    from sklearn.externals import joblib
+except ImportError:
+    import joblib
 os.makedirs('./outputs', exist_ok=True)

View File

@@ -8,10 +8,15 @@ from sklearn.linear_model import Ridge
 from sklearn.metrics import mean_squared_error
 from sklearn.model_selection import train_test_split
 from azureml.core.run import Run
-from sklearn.externals import joblib
 import numpy as np
+# sklearn.externals.joblib is removed in 0.23
+try:
+    from sklearn.externals import joblib
+except ImportError:
+    import joblib
 os.makedirs('./outputs', exist_ok=True)
 parser = argparse.ArgumentParser()
 parser.add_argument('--data-folder', type=str,

View File

@@ -1,9 +1,13 @@
 import pickle
 import json
 import numpy as np
-from sklearn.externals import joblib
 from sklearn.linear_model import Ridge
 from azureml.core.model import Model
+# sklearn.externals.joblib is removed in 0.23
+try:
+    from sklearn.externals import joblib
+except ImportError:
+    import joblib
 def init():

View File

@@ -380,6 +380,24 @@
     "#compute_target.delete()"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Delete the DataDriftDetector\n",
+    "\n",
+    "Invoking the `delete()` method permanently deletes the drift monitor; this cannot be undone. The monitor will no longer appear in the UI or be returned by the `list()` or `get()` methods. The object on which `delete()` was called has its state set to deleted and its name suffixed with 'deleted'. The baseline and target datasets, and any model data that was collected, are not deleted; the compute is not deleted either. The DataDrift schedule pipeline is disabled and archived."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "monitor.delete()"
+   ]
+  },
   {
    "cell_type": "markdown",
    "metadata": {},

View File

@@ -65,6 +65,7 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
 | [Resuming a model](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/ml-frameworks/tensorflow/training/train-tensorflow-resume-training/train-tensorflow-resume-training.ipynb) | Resume a model in TensorFlow from a previously submitted run | MNIST | AML Compute | None | TensorFlow | None |
 | [Training in Spark](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/train-in-spark/train-in-spark.ipynb) | Submitting a run on a Spark cluster | None | HDI cluster | None | PySpark | None |
 | [Train on Azure Machine Learning Compute](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/train-on-amlcompute/train-on-amlcompute.ipynb) | Submit a run on Azure Machine Learning Compute. | Diabetes | AML Compute | None | None | None |
+| [Train on Azure Machine Learning Compute Instance](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/train-on-computeinstance/train-on-computeinstance.ipynb) | Submit a run on Azure Machine Learning Compute Instance. | Diabetes | Compute Instance | None | None | None |
 | [Train on local compute](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/train-on-local/train-on-local.ipynb) | Train a model locally | Diabetes | Local | None | None | None |
 | [Train in a remote Linux virtual machine](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training/train-on-remote-vm/train-on-remote-vm.ipynb) | Configure and execute a run | Diabetes | Data Science Virtual Machine | None | None | None |
 | [Using Tensorboard](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/training-with-deep-learning/export-run-history-to-tensorboard/export-run-history-to-tensorboard.ipynb) | Export the run history as Tensorboard logs | None | None | None | TensorFlow | None |
@@ -95,6 +96,8 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
 |:----|:-----|:-------:|:----------------:|:-----------------:|:------------:|:------------:|
 | [DNN Text Featurization](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/classification-text-dnn/auto-ml-classification-text-dnn.ipynb) | Text featurization using DNNs for classification | None | AML Compute | None | None | None |
 | [configuration](https://github.com/Azure/MachineLearningNotebooks/blob/master/configuration.ipynb) | | | | | | |
+| [fairlearn-azureml-mitigation](https://github.com/Azure/MachineLearningNotebooks/blob/master//contrib/fairness/fairlearn-azureml-mitigation.ipynb) | | | | | | |
+| [upload-fairness-dashboard](https://github.com/Azure/MachineLearningNotebooks/blob/master//contrib/fairness/upload-fairness-dashboard.ipynb) | | | | | | |
 | [lightgbm-example](https://github.com/Azure/MachineLearningNotebooks/blob/master//contrib/gbdt/lightgbm/lightgbm-example.ipynb) | | | | | | |
 | [azure-ml-with-nvidia-rapids](https://github.com/Azure/MachineLearningNotebooks/blob/master//contrib/RAPIDS/azure-ml-with-nvidia-rapids.ipynb) | | | | | | |
 | [auto-ml-continuous-retraining](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/automated-machine-learning/continuous-retraining/auto-ml-continuous-retraining.ipynb) | | | | | | |
@@ -121,6 +124,7 @@ Machine Learning notebook samples and encourage efficient retrieval of topics an
 | [train-explain-model-on-amlcompute-and-deploy](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/explain-model/azure-integration/scoring-time/train-explain-model-on-amlcompute-and-deploy.ipynb) | | | | | | |
 | [training_notebook](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/intro-to-pipelines/notebook_runner/training_notebook.ipynb) | | | | | | |
 | [nyc-taxi-data-regression-model-building](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/nyc-taxi-data-regression-model-building/nyc-taxi-data-regression-model-building.ipynb) | | | | | | |
+| [pipeline-style-transfer-mpi](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/machine-learning-pipelines/pipeline-style-transfer/pipeline-style-transfer-mpi.ipynb) | | | | | | |
 | [authentication-in-azureml](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/manage-azureml-service/authentication-in-azureml/authentication-in-azureml.ipynb) | | | | | | |
 | [pong_rllib](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/reinforcement-learning/atari-on-distributed-compute/pong_rllib.ipynb) | | | | | | |
 | [cartpole_ci](https://github.com/Azure/MachineLearningNotebooks/blob/master//how-to-use-azureml/reinforcement-learning/cartpole-on-compute-instance/cartpole_ci.ipynb) | | | | | | |

View File

@@ -102,7 +102,7 @@
    "source": [
     "import azureml.core\n",
     "\n",
-    "print(\"This notebook was created using version 1.7.0 of the Azure ML SDK\")\n",
+    "print(\"This notebook was created using version 1.9.0 of the Azure ML SDK\")\n",
     "print(\"You are currently using version\", azureml.core.VERSION, \"of the Azure ML SDK\")"
    ]
   },

View File

@@ -217,6 +217,7 @@
    "outputs": [],
    "source": [
     "%%time\n",
+    "import uuid\n",
     "from azureml.core.webservice import Webservice\n",
     "from azureml.core.model import InferenceConfig\n",
     "from azureml.core.environment import Environment\n",
@@ -230,8 +231,9 @@
    "myenv = Environment.get(workspace=ws, name=\"tutorial-env\", version=\"1\")\n",
    "inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)\n",
    "\n",
+    "service_name = 'sklearn-mnist-svc-' + str(uuid.uuid4())[:4]\n",
     "service = Model.deploy(workspace=ws, \n",
-    "                       name='sklearn-mnist-svc', \n",
+    "                       name=service_name, \n",
     "                       models=[model], \n",
     "                       inference_config=inference_config, \n",
     "                       deployment_config=aciconfig)\n",
View File

@@ -272,6 +272,7 @@
    "outputs": [],
    "source": [
     "%%time\n",
+    "import uuid\n",
     "from azureml.core.webservice import Webservice\n",
     "from azureml.core.model import InferenceConfig\n",
     "from azureml.core.environment import Environment\n",
@@ -284,8 +285,9 @@
    "myenv = Environment.get(workspace=ws, name=\"tutorial-env\")\n",
    "inference_config = InferenceConfig(entry_script=\"score.py\", environment=myenv)\n",
    "\n",
+    "service_name = 'sklearn-mnist-svc-' + str(uuid.uuid4())[:4]\n",
     "service = Model.deploy(workspace=ws, \n",
-    "                       name='sklearn-mnist-svc', \n",
+    "                       name=service_name, \n",
     "                       models=[model], \n",
     "                       inference_config=inference_config, \n",
     "                       deployment_config=aciconfig)\n",
@@ -489,7 +491,7 @@
    "import json\n",
    "from azureml.core import Webservice\n",
    "\n",
-    "service = Webservice(ws, 'sklearn-mnist-svc')\n",
+    "service = Webservice(ws, service_name)\n",
    "\n",
    "#pass the connection string for blob storage to give the server access to the uploaded public keys \n",
    "conn_str_template = 'DefaultEndpointsProtocol={};AccountName={};AccountKey={};EndpointSuffix=core.windows.net'\n",

View File

@@ -340,7 +340,7 @@
    "* input and output data, and any custom parameters\n",
    "* reference to a script or SDK-logic to run during the step\n",
    "\n",
-    "There are multiple classes that inherit from the parent class [`PipelineStep`](https://docs.microsoft.com/python/api/azureml-pipeline-core/azureml.pipeline.core.builder.pipelinestep?view=azure-ml-py) to assist with building a step using certain frameworks and stacks. In this example, you use the [`ParallelRunStep`](https://docs.microsoft.com/en-us/python/api/azureml-contrib-pipeline-steps/azureml.contrib.pipeline.steps.parallelrunstep?view=azure-ml-py) class to define your step logic using a scoring script. \n",
+    "There are multiple classes that inherit from the parent class [`PipelineStep`](https://docs.microsoft.com/python/api/azureml-pipeline-steps/azureml.pipeline.steps.parallelrunstep?view=azure-ml-py) to assist with building a step using certain frameworks and stacks. In this example, you use the [`ParallelRunStep`](https://docs.microsoft.com/en-us/python/api/azureml-contrib-pipeline-steps/azureml.contrib.pipeline.steps.parallelrunstep?view=azure-ml-py) class to define your step logic using a scoring script. \n",
    "\n",
    "An object reference in the `outputs` array becomes available as an **input** for a subsequent pipeline step, for scenarios where there is more than one step."
   ]
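
For orientation, a simplified ParallelRunStep sketch matching the contrib namespace linked above; the entry script, environment, compute target, dataset, and output names are all assumptions, and exact parameters vary across SDK versions (some versions also expect a models=[...] argument):

from azureml.contrib.pipeline.steps import ParallelRunConfig, ParallelRunStep

parallel_run_config = ParallelRunConfig(
    source_directory='.',
    entry_script='batch_scoring.py',  # assumed scoring script
    mini_batch_size='5',              # files per mini-batch for a FileDataset
    error_threshold=10,
    output_action='append_row',
    environment=batch_env,            # assumed Environment
    compute_target=compute_target,    # assumed AmlCompute target
    node_count=2)

batch_step = ParallelRunStep(
    name='batch-scoring',
    parallel_run_config=parallel_run_config,
    inputs=[input_dataset.as_named_input('input_ds')],  # assumed FileDataset
    output=output_dir,                # assumed PipelineData
    allow_reuse=True)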

View File

@@ -2,6 +2,4 @@ name: regression-automated-ml
 dependencies:
 - pip:
   - azureml-sdk
-  - azureml-train-automl
-  - azureml-widgets
   - azureml-opendatasets