diff --git a/Final Exam/FinalExam-Test.zip b/Final Exam/FinalExam-Test.zip new file mode 100644 index 0000000..14f0b17 Binary files /dev/null and b/Final Exam/FinalExam-Test.zip differ diff --git a/Final Exam/FinalExam.zip b/Final Exam/FinalExam.zip new file mode 100644 index 0000000..4f2790b Binary files /dev/null and b/Final Exam/FinalExam.zip differ diff --git a/Module1/IntroductionToMachineLearning.ipynb b/Module1/IntroductionToMachineLearning.ipynb new file mode 100644 index 0000000..5365361 --- /dev/null +++ b/Module1/IntroductionToMachineLearning.ipynb @@ -0,0 +1,390 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction to Machine Learning\n", + "\n", + "This lab introduces some basic concepts of machine learning with Python. In this lab you will use the K-Nearest Neighbor (KNN) algorithm to classify the species of iris flowers, given measurements of flower characteristics. By completing this lab you will have an overview of an end-to-end machine learning modeling process. \n", + "\n", + "By the completion of this lab, you will:\n", + "1. Follow and understand a complete end-to-end machine learning process including data exploration, data preparation, modeling, and model evaluation. \n", + "2. Develop a basic understanding of the principles of machine learning and associated terminology. \n", + "3. Understand the basic process for evaluating machine learning models. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Overview of KNN classification\n", + "\n", + "Before discussing a specific algorithm, it helps to know a bit of machine learning terminology. In supervised machine learning a set of ***cases*** is used to ***train***, ***test*** and ***evaluate*** the model. Each case consists of the values of one or more ***features*** and a ***label*** value. The features are variables used by the model to ***predict*** the value of the label. 
Minimizing the ***errors*** between the true value of the label and the prediction supervises the training of this model. Once the model is trained and tested, it can be evaluated based on the accuracy in predicting the label of a new set of cases. \n", + "\n", + "In this lab you will use randomly selected cases to first train and then evaluate a k-nearest-neighbor (KNN) machine learning model. The goal is to predict the type or class of the label, which makes the machine learning model a ***classification*** model. \n", + "\n", + "The k-nearest-neighbor algorithm is conceptually simple. In fact, there is no formal training step. Given a known set of cases, a new case is classified by majority vote of the $K$ (where $K = 1, 2, 3$, etc.) points nearest to the values of the new case; that is, the nearest neighbors of the new case. \n", + "\n", + "The schematic figure below illustrates the basic concepts of a KNN classifier. In this case there are two features, the values of one shown on the horizontal axis and the values of the other shown on the vertical axis. The cases are shown on the diagram as one of two classes, red triangles and blue circles. To summarize, each case has a value for the two features, and a class. The goal of the KNN algorithm is to classify cases with unknown labels. \n", + "\n", + "Continuing with the example, on the left side of the diagram the $K = 1$ case is illustrated. The nearest neighbor is a red triangle. Therefore, this KNN algorithm will classify the unknown case, '?', as a red triangle. On the right side of the diagram, the $K = 3$ case is illustrated. There are three nearest neighbors within the circle. The majority of nearest neighbors for $K = 3$ are the blue circles, so the algorithm classifies the unknown case, '?', as a blue circle. Notice that the class predicted for the unknown case changes as $K$ changes. This behavior is inherent in the KNN method. \n", + "\n", + "![](img/KNN.jpg)\n", + "
**KNN for k = 1 and k = 3**
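The voting rule just described can be written in a few lines of plain numpy. This is an illustrative sketch only, assuming straight Euclidean distance and no tie handling; it is not the scikit-learn implementation used later in the lab, and the function name `knn_predict` is ours:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_query, k=3):
    # Euclidean distance from the query point to every known case
    dists = np.sqrt(((X_train - x_query) ** 2).sum(axis=1))
    # Labels of the k nearest neighbors
    nearest_labels = y_train[np.argsort(dists)[:k]]
    # Majority vote among the neighbors decides the predicted class
    return Counter(nearest_labels.tolist()).most_common(1)[0][0]
```

Exactly as in the schematic, the prediction for a borderline query point can change as k changes.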
\n", + "\n", + "There are some additional considerations in creating a robust KNN algorithm. These will be addressed later in this course. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Examine the data set\n", + "\n", + "In this lab you will work with the Iris data set. This data set is famous in the history of statistics. The first publication using these data in statistics by the pioneering statistician Ronald A Fisher was in 1936. Fisher proposed an algorithm to classify the species of iris flowers from physical measurements of their characteristics. The data set has been used as a teaching example ever since. \n", + "\n", + "Now, you will load and examine these data which are in the statsmodels.api package. Execute the code in the cell below and examine the first few rows of the data frame. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "#from statsmodels.api import datasets\n", + "from sklearn import datasets ## Get dataset from sklearn\n", + "\n", + "## Import the dataset from sklearn.datasets\n", + "iris = datasets.load_iris()\n", + "\n", + "## Create a data frame from the dictionary\n", + "species = [iris.target_names[x] for x in iris.target]\n", + "iris = pd.DataFrame(iris['data'], columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "iris['Species'] = species\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are four features, containing the dimensions of parts of the iris flower structures. The label column is the Species of the flower. The goal is to create and test a KNN algorithm to correctly classify the species. \n", + "\n", + "Next, you will execute the code in the cell below to show the data types of each column. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "iris.dtypes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The features are all numeric, and the label is a categorical string variable.\n", + "\n", + "Next, you will determine the number of unique categories, and number of cases for each category, for the label variable, Species. Execute the code in the cell below and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "iris['count'] = 1\n", + "iris[['Species', 'count']].groupby('Species').count()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see there are three species of iris, each with 50 cases. \n", + "\n", + "Next, you will create some plots to see how the classes might, or might not, be well separated by the value of the features. In an ideal case, the label classes will be perfectly separated by one or more of the feature pairs. In the real-world this ideal situation will rarely, if ever, be the case.\n", + " \n", + "There are six possible pair-wise scatter plots of these four features. For now, we will just create scatter plots of two variable pairs. 
Execute the code in the cell below and examine the resulting plots.\n", + "***\n", + "**Note:** Data visualization and the Seaborn package are covered in another lesson.\n", + "***" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "%matplotlib inline\n", + "def plot_iris(iris, col1, col2):\n", + " import seaborn as sns\n", + " import matplotlib.pyplot as plt\n", + " sns.lmplot(x = col1, y = col2, \n", + " data = iris, \n", + " hue = \"Species\", \n", + " fit_reg = False)\n", + " plt.xlabel(col1)\n", + " plt.ylabel(col2)\n", + " plt.title('Iris species shown by color')\n", + " plt.show()\n", + "plot_iris(iris, 'Petal_Width', 'Sepal_Length')\n", + "plot_iris(iris, 'Sepal_Width', 'Sepal_Length')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results, noticing the separation, or overlap, of the label values. \n", + "\n", + "Then, answer **Question 1** on the course page. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prepare the data set\n", + "\n", + "Data preparation is an important step before training any machine learning model. These data require only two preparation steps:\n", + "- Scale the numeric values of the features. It is important that numeric features used to train machine learning models have a similar range of values. Otherwise, features which happen to have large numeric values may dominate model training, even if other features with smaller numeric values are more informative. In this case Z-score normalization is used. This normalization process scales each feature so that the mean is 0 and the variance is 1.0. \n", + "- Split the dataset into randomly sampled training and evaluation data sets. 
The random selection of cases seeks to limit the leakage of information between the training and evaluation cases.\n", + "\n", + "The code in the cell below normalizes the features by these steps:\n", + "- The scale function from sklearn.preprocessing is used to normalize the features.\n", + "- Column names are assigned to the resulting data frame. \n", + "- A statistical summary of the data frame is then printed. \n", + "\n", + "***\n", + "**Note:** Data preparation with scikit-learn is covered in another lesson. \n", + "***\n", + "\n", + "Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sklearn.preprocessing import scale\n", + "import pandas as pd\n", + "num_cols = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']\n", + "iris_scaled = scale(iris[num_cols])\n", + "iris_scaled = pd.DataFrame(iris_scaled, columns = num_cols)\n", + "print(iris_scaled.describe().round(3))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. The mean of each column is zero and the standard deviation is approximately 1.0.\n", + "\n", + "The methods in the scikit-learn package require numeric numpy arrays as arguments. Therefore, the strings indicating species must be re-coded as numbers. The code in the cell below does this using a dictionary lookup. Execute this code and examine the head of the data frame. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n", + "iris_scaled['Species'] = [levels[x] for x in iris['Species']]\n", + "iris_scaled.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you will split the dataset into training and test subsets. The code in the cell below randomly splits the dataset into training and testing subsets. 
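Conceptually, such a random split amounts to shuffling the row indices and slicing off a fixed-size test set. A minimal numpy sketch of the idea (illustrative only; the lab itself relies on scikit-learn's train_test_split, and the seed and sizes here simply mirror the lab's values):

```python
import numpy as np

# Shuffle the 150 row indices with a fixed seed for reproducibility,
# then slice off 75 rows for the test set; the rest form the training set.
rng = np.random.default_rng(3456)
idx = rng.permutation(150)
test_idx, train_idx = idx[:75], idx[75:]
```

Because every index lands in exactly one slice, no case can appear in both subsets.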
The features and labels are then separated into numpy arrays. The dimension of each array is printed as a check. Execute this code to create these subsets. \n", + "\n", + "***\n", + "**Note:** Splitting data sets for machine learning with scikit-learn is discussed in another lesson.\n", + "***" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly split the data into training and test sets\n", + "from sklearn.model_selection import train_test_split\n", + "import numpy as np\n", + "np.random.seed(3456)\n", + "iris_split = train_test_split(np.asarray(iris_scaled), test_size = 75)\n", + "iris_train_features = iris_split[0][:, :4]\n", + "iris_train_labels = np.ravel(iris_split[0][:, 4])\n", + "iris_test_features = iris_split[1][:, :4]\n", + "iris_test_labels = np.ravel(iris_split[1][:, 4])\n", + "print(iris_train_features.shape)\n", + "print(iris_train_labels.shape)\n", + "print(iris_test_features.shape)\n", + "print(iris_test_labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Train and evaluate the KNN model\n", + "\n", + "With some understanding of the relationships between the features and the label, and with data preparation complete, you will now train and evaluate a $K = 3$ model. The code in the cell below does the following:\n", + "- The KNN model is defined as having $K = 3$.\n", + "- The model is trained using the fit method with the feature and label numpy arrays as arguments.\n", + "- A summary of the model is displayed. 
\n", + "\n", + "Execute this code and examine the summary of these results.\n", + "\n", + "\n", + "***\n", + "**Note:** Constructing machine learning models with scikit-learn is covered in another lesson.\n", + "***" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Define and train the KNN model\n", + "from sklearn.neighbors import KNeighborsClassifier\n", + "KNN_mod = KNeighborsClassifier(n_neighbors = 3)\n", + "KNN_mod.fit(iris_train_features, iris_train_labels)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, you will evaluate this model using the accuracy statistic and a set of plots. The following steps create model predictions and compute accuracy:\n", + "- The predict method is used to compute KNN predictions from the model using the test features as an argument. \n", + "- The predictions are scored as correct or not using a list comprehension. \n", + "- Accuracy is computed as the percentage of the test cases correctly classified. \n", + "\n", + "Execute this code, examine the results, and answer **Question 2** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "iris_test = pd.DataFrame(iris_test_features, columns = num_cols)\n", + "iris_test['predicted'] = KNN_mod.predict(iris_test_features)\n", + "iris_test['correct'] = [1 if x == z else 0 for x, z in zip(iris_test['predicted'], iris_test_labels)]\n", + "accuracy = 100.0 * float(sum(iris_test['correct'])) / float(iris_test.shape[0])\n", + "print(accuracy)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The accuracy is pretty good.\n", + "\n", + "Now, execute the code in the cell below and examine plots of the classifications of the iris species. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n", + "iris_test['Species'] = [levels[x] for x in iris_test['predicted']]\n", + "markers = {1:'^', 0:'o'}\n", + "colors = {'setosa':'blue', 'versicolor':'green', 'virginica':'red'}\n", + "def plot_shapes(df, col1,col2, markers, colors):\n", + " import matplotlib.pyplot as plt\n", + " import seaborn as sns\n", + " ax = plt.figure(figsize=(6, 6)).gca() # define plot axis\n", + " for m in markers: # iterate over marker dictioary keys\n", + " for c in colors: # iterate over color dictionary keys\n", + " df_temp = df[(df['correct'] == m) & (df['Species'] == c)]\n", + " sns.regplot(x = col1, y = col2, \n", + " data = df_temp, \n", + " fit_reg = False, \n", + " scatter_kws={'color': colors[c]},\n", + " marker = markers[m],\n", + " ax = ax)\n", + " plt.xlabel(col1)\n", + " plt.ylabel(col2)\n", + " plt.title('Iris species by color')\n", + " return 'Done'\n", + "plot_shapes(iris_test, 'Petal_Width', 'Sepal_Length', markers, colors)\n", + "plot_shapes(iris_test, 'Sepal_Width', 'Sepal_Length', markers, colors)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "In the plots above color is used to show the predicted class. Correctly classified cases are shown by triangles and incorrectly classified cases are shown by circles. \n", + "\n", + "Examine the plot and answer **Question 3** on the course page." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have created and evaluated a KNN machine learning classification model. Specifically you have:\n", + "1. Loaded and explored the data using visualization to determine if the features separate the classes.\n", + "2. Prepared the data by normalizing the numeric features and randomly sampling into training and testing subsets. \n", + "3. 
Constructed and evaluated the machine learning model. Evaluation was performed statistically, with the accuracy metric, and with visualization. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/Module1/Lab1-IntroToML-Python.ipynb b/Module1/Lab1-IntroToML-Python.ipynb deleted file mode 100644 index 75a569e..0000000 --- a/Module1/Lab1-IntroToML-Python.ipynb +++ /dev/null @@ -1,740 +0,0 @@ -{ - "cells": [ - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "# Lab 1: Introduction to Machine Learning\n", - "\n", - "This lab introduces some basic concepts of machine learning with R. In this lab you will use the K-Nearest Neighbor (KNN) algorithm to classify the species of iris flowers, given meaurements of flower characteristics. By completing this lab you will have an overview of an end-to-end machine learning modeling process. \n", - "\n", - "By the completion of this lab, you will:\n", - "1. Follow and understand a complete end-to-end machine learning process including data exploration, data preparation, modeling, and model evaluation. \n", - "2. Develop a basic understanding of the principles of machine learning and associated terminology. \n", - "3. Understand the basic process for evaluating machine learning models. 
" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Overview of KNN classification\n", - "\n", - "Before discussing a specific algorithm, it helps to know a bit of machine learning terminology. In supervised machine learning a set of ***cases*** are used to ***train***, ***test*** and ***evaluate*** the model. Each case is comprised of the values of one or more ***features*** and a ***label*** value. The features are variables used by the model to ***predict** the value of the label. Minimizing the ***errors*** between the true value of the label and the prediction supervises the training of this model. Once the model is trained and tested, it can be evaluated based on the accuracy in predicting the label of a new set of cases. \n", - "\n", - "In this lab you will use randomly selected cases to first train and then evaluate a k-nearest-neighbor (KNN) machine learning model. The goal is to predict the type or class of the label, which makes the machine learning model a ***classification*** model. \n", - "\n", - "The k-nearest-neighbor algorithm is conceptually simple. In fact, there is no formal traning step. Given a known set of cases, a new case is classified by majority vote of the K (where $k = 1, 2, 3$, etc.) points nearest to the values of the new case; that is, the nearest neighbors of the new case. \n", - "\n", - "The schematic figure below illustrates the basic concepts of a KNN classifier. In this case there are two features, the values of one shown on the horrizontal axis and the values of the other shown on the vertical axis. The cases are shown on the diagram as one of two classes, red triangles and blue circles. To summarize, each case has a value for the two features, and a class. The goal of thee KNN algorithm is to classify cases with unknown labels. \n", - "\n", - "Continuing with the example, on the left side of the diagram the $K = 1$ case is illustrated. The nearest neighbor is a red triangle. 
Therefore, this KNN algorithm will classify the unknown case, '?', as a red triangle. On the right side of the diagram, the $K = 3$ case is illustrated. There are three near neighbors within the circle. The majority of nearest neighbors for $K = 3$ are the blue circles, so the algorithm classifies the unknown case, '?', as a blue circle. Notice that class predicted for the unknown case changes as K changes. This behavior is inherent in the KNN method. \n", - "\n", - "![](img/KNN.jpg)\n", - "
**KNN for k = 1 and k = 3**
\n", - "\n", - "There are some additional considerations in creating a robust KNN algorithm. These will be addressed later in this course. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Examine the data set\n", - "\n", - "In this lab you will work with the Iris data set. This data set is famous in the history of statistics. The first publication using these data in statistics by the pioneering statistician Ronald A Fisher was in his 1936. Fisher proposed an algorthm to classify the species of iris flowers from physical measurements of their characteristics. The data set has been used as a teaching example ever since. \n", - "\n", - "Now, you will load and examine these data which are in the statsmodels.api package. Execute the code in the cell below and examine the first few rows of the data frame. " - ] - }, - { - "cell_type": "code", - "execution_count": 1, - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "C:\\Users\\StevePC2\\Anaconda3\\lib\\site-packages\\statsmodels\\compat\\pandas.py:56: FutureWarning: The pandas.core.datetools module is deprecated and will be removed in a future version. Please use the pandas.tseries module instead.\n", - " from pandas.core import datetools\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
Sepal_LengthSepal_WidthPetal_LengthPetal_WidthSpecies
05.13.51.40.2setosa
14.93.01.40.2setosa
24.73.21.30.2setosa
34.63.11.50.2setosa
45.03.61.40.2setosa
\n", - "
" - ], - "text/plain": [ - " Sepal_Length Sepal_Width Petal_Length Petal_Width Species\n", - "0 5.1 3.5 1.4 0.2 setosa\n", - "1 4.9 3.0 1.4 0.2 setosa\n", - "2 4.7 3.2 1.3 0.2 setosa\n", - "3 4.6 3.1 1.5 0.2 setosa\n", - "4 5.0 3.6 1.4 0.2 setosa" - ] - }, - "execution_count": 1, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "from statsmodels.api import datasets\n", - "iris = datasets.get_rdataset(\"iris\")\n", - "iris.data.columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Species']\n", - "iris.data.head()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "There are four features, containing the dimensions of parts of the iris flower structures. The label column is the Species of the flower. The goal is to create and test a KNN algorithm to correctly classify the species. \n", - "\n", - "Next, you will execute the code in the cell below to show the data types of each column. " - ] - }, - { - "cell_type": "code", - "execution_count": 2, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "Sepal_Length float64\n", - "Sepal_Width float64\n", - "Petal_Length float64\n", - "Petal_Width float64\n", - "Species object\n", - "dtype: object" - ] - }, - "execution_count": 2, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "iris.data.dtypes" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The features are all numeric, and the label is a categorical string variable.\n", - "\n", - "Next, you will determine the number of unique categories, and number of cases for each category, for the lable variable, Species. Execute the code in the cell below annd examine the results. " - ] - }, - { - "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
count
Species
setosa50
versicolor50
virginica50
\n", - "
" - ], - "text/plain": [ - " count\n", - "Species \n", - "setosa 50\n", - "versicolor 50\n", - "virginica 50" - ] - }, - "execution_count": 3, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "iris.data['count'] = 1\n", - "iris.data[['Species', 'count']].groupby('Species').count()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "You can see there are three species of iris, each with 50 cases. \n", - "\n", - "Next, you will create some plots to see how the classes might, or might not, be well seperated by the value of the features. In an idea case, the lable classes will be perfectly separated by one or more of the feature pairs. In the real-world this ideal situation will rarely, if ever, be the case.\n", - " \n", - "There are five possible pair-wise scatter plots of these four features. For now, we will just create scatter plots of two variable pairs. Execute the code in the cell below and examine the resulting plots.\n", - "***\n", - "**Note:** Data visualization and the Seaborn package are covered in another lesson.\n", - "***" - ] - }, - { - "cell_type": "code", - "execution_count": 4, - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAa8AAAFtCAYAAACwS+W+AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl8VPW5P/DPM5PJRghr2MLmFgOxgJAKWqOotRdbpdervdIqEKsFcUV//qyXttZLvd6L9WqlKoL2FsEFq14sbkitPyAuYCNiWAKogIQQIGwJkIXJzPP745zgZJgkc5I5M3NmPu/Xa14zc+Y7Z75nBvN4znme84iqgoiIyElcsZ4AERGRVQxeRETkOAxeRETkOAxeRETkOAxeRETkOAxeRETkOAxeSUBErheRFbGeR3tEZJOIjI/SZz0oIi9E47PCISLjRWR3jOewU0S+H8s5EIWLwSsBtPdHR1VfVNUfRHNOHaGqBaq6MtbzIKL4x+CV4EQkJdZzoMTGf2MUCwxeCUZEikXkIxF5XEQOAXjQXPah+bqYr+0XkRoRKRORc9pY13YROSoiO0Tk+qDP+KO5ji0iclnA+7qJyJ9EpEpEKkXkIRFxB7z+CxEpN9e7WURGm8tP7kGKiEtE7heRr0XkoIj8RUR6mq+li8gL5vIjIvIPEenbyjb80pzDURHZGjhPAKkissh8bZOIFAa8b5iIrDTXv0lEJprLTzOXucznz4nI/oD3vSAiM83HK0Xkd+Z3dVREVohI73Z+v1kicsD8Lpq/7++KyL7AICEi14jI+lbWkSEi/y0i35i/z4cikmG+NtHcniPm/Ia1so40EfmDiOwxb38QkTTztfEistv8bvcC+HNb20RkBwavxDQWwHYAfQD8R9BrPwBwEYA8AN0BXAfgYPAKRKQLgLkArlDVrgAuABD4x7L5M3oD+C2A/20OLgCeB9AE4EwA55qfebO53p8AeBDAFADZACaG+nwAdwL4ZwAXAxgA4DCAp8zXpgLoBmAQgF4AbgFQH2IbzgZwO4DvmtvwTwB2BgyZCGCJ+T0sA/Ck+T4PgDcBrIDxHd4B4EUROVtVdwCoNbcLAIoAHAsIAhcBWBXwGT8DcKO5nlQA94bY1mb9YHyfueY2LjA/8x/md3R5wNgbACxuZT2PAhgD4zfrCeA+AH4RyQPwMoCZAHIAvAPgTRFJDbGOXwEYB2AUgJEAzgPw66C59gQwBMC0NraJyB6qypvDbzD+IH/ffFwMYFfQ68UAPjQfXwpgG4w/TK421tkFwBEA1wDICLG+PQAkYNmnACYD6AugMfA9AH4K4P+Zj98DcFcY21EO4LKA1/oD8AJIAfBzAB8DGNHO93ImgP0Avg/AE/TagwDeD3g+HEC9+bgIwN7A7wfGH/0HzceLAdwD4w/4VgCPwAigp5nfmcsctxLArwPWcSuA5a3MdTyMgN8lYNlfAPzGfPxLAC+aj3sCqAPQP8R6XDAC+cgQr/0GwF+CxlYCGB/i+/8awA8Dxv4TgJ0Bcz0BID3W//Z5S94b97wSU0VrL6jqBzD2MJ4CsE9EFohIdohxx2Hsld0CoEpE3haR/IAhlaoaeFXnb2DsIQ0B4DHfc0REjgCYD2PPAzD2lr4OYxuGAFgasI5yAD4YwXExjCC4xDyk9Yi5txS8DV/B2Mt4EMB+EVkiIgMChuwNeFwHIN08NDcAQIWq+oO2L9d8vArGH/CLAKyGEaQuNm8lQe8L/oysNrb5sPm9B35m83xfAHCViGQB+Ffzc6pCrKM3gHSE/o4HmOsEAJjzrAjYrlbHBs0FAKpVtaGNbSGyFYNXYmqzVYCqzlXVMQAKYBw+/L+tjHtPVS+HsdezBcCzAS/niogEPB8MY2+sAsaeV29V7W7eslW1wBxXAeCMMLahAsYhy+4Bt3RVrVRVr6r+u6oOh3Fo7EoYhyFDbcNLqnohjGCoAOaE8dl7AAxqPq8VsH2V5uNVMPb
OxpuPPwTwPRjBK/CQoVU9zMO1gZ+5BwBUtRLAJwCuhrGH29ohwwMAGhD6O94D43sAYJz/hPE/E5XtjQ2ci4ntKCimGLySjHnyf6y5p3Icxh86X4hxfc2T+11gBKNjQeP6ALhTRDzmeaxhAN4x9wZWAPhvEck2Ey/OEJGLzfc9B+BeERkjhjNFJPCPZLNnAPxH82sikiMiPzYfXyIi3xEjCaQWxuHEUNtwtohcaiYaNMA4nHbKuBDWmt/Nfeb2jQdwFYzzY1DVL8113QBgtarWAtgH4xBrZ4IXAPy7iKSKSBGMoPxqwGuLYJy/+g6ApaHebO5N/Q+Ax0RkgIi4ReR88zv4C4Afichl5u//f2D8th+HWNXLAH5tfu+9ATwAY++PKC4weCWfbBh7UIdhHAo6COMEfzAXjD9uewAcgrFXcWvA62sBnAXj//T/A8C1qtqceDEFRnLCZvNzXoOx9wZVfdUc/xKAowDegHEOJ9gTMJIoVojIUQBrYCSJAMa5ptdgBK5yGAEj1B/WNAD/Zc5xL4yAOyvUlxJIVU/ASOa4wnzv0wCmqOqWgGGrABxU1V0BzwXA5+2tvw17YXxfewC8COCWoM9cCvNwatDhxWD3AtgA4B8wfrs5MM7DbYURcP9obtdVAK4ytzfYQwBKAZSZ61pnLiOKC9LytAVR+0SkGMDN5uE4iiIR+RrAdFV9P9ZzIYol7nkROYSIXAPjXNMHsZ4LUayxMp7IAURkJYx0/slB2YxESYmHDYmIyHF42JCIiByHwYuIiBzHkee8JkyYoMuXL4/1NIiIgkn7QygSHLnndeDAgVhPgYiIYsiRwYuIiJIbgxcRETkOgxcRETmO7cFLRO42O7duFJGXRSQ96PU0EXlFRL4SkbUiMtTuORERkbPZGrxEJBdGR9xCVT0HgBvApKBhN8HoY3QmgMcRXssKIiJKYtE4bJgCIMNs8peJlj2BAODHMNrGA8aVwi8L6hNFRETUgq3By2yg9yiAXQCqANSo6oqgYbkwO/+qahOAGgC9gtclItNEpFRESqurq+2cNhERxTm7Dxv2gLFndRqMFuJdROSG4GEh3nrKBRdVdYGqFqpqYU5OTuQnS0REjmH3YcPvA9ihqtWq6gXwvzDatgfaDaMVOcxDi91gNNAjIiIKye7gtQvAOBHJNM9jXQaj822gZQCmmo+vBfCB8lL3RETUBrvPea2FkYSxDkYrcReABSIyW0QmmsP+BKCXiHwF4B4A99s5JyIicj5H9vMqLCzU0tLSWE+DKKmU7C7Bwk0LUXmsErlZuSguKEbRwKJYTyveMFM6SniFDSJqV8nuEjy89mFU11cjOzUb1fXVeHjtwyjZXRLrqVGSYvAionYt3LQQHrcHGSkZEBFkpGTA4/Zg4aaFsZ4aJSkGLyJqV+WxSqS7W1zZDenudFQeq4zRjCjZMXgRUbtys3LR4GtosazB14DcrNwYzYiSHYMXEbWruKAYXp8X9U31UFXUN9XD6/OiuKA41lOjJMXgRUTtKhpYhFljZyEnIwe1J2qRk5GDWWNnMduQYiYl1hMgImcoGljEYEVxg3teRETkOAxeRETkOAxeRETkOAxeRETkOAxeRETkOAxeRETkOAxeRETkOKzzIqKw2NUSxcp6562fh8Xli1HnrUOmJxOTh03GjFEzOj0Hch7ueRFRu+xqiWJlvfPWz8P8svmob6pHiqSgvqke88vmY976eZ2aAzkTgxcRtcuulihW1ru4fDFEBCmS0uJ+cfniTs2BnInBi4jaZVdLFCvrrfPWwQ13i2VuuFHnrevUHMiZGLyIqF12tUSxst5MTyZ88LVY5oMPmZ7MTs2BnInBi4jaZVdLFCvrnTxsMlQVTdrU4n7ysMmdmgM5k6hqrOdgWWFhoZaWlsZ6GkRJhdmGYZFYTyBZMHgREUUOg1eUsM6LiCLOrr00omY850VEEWVXTRhRIAYvIooou2rCiAIxeBFRRNlVE0YUiMGLiCLKrpowokA
MXkQUUXbVhBEFsjV4icjZIrI+4FYrIjODxowXkZqAMQ/YOScislfRwCLMGjsLORk5qD1Ri5yMHMwaO4vZhhRRtqbKq+pWAKMAQETcACoBLA0xtERVr7RzLkQUPUUDixisyFbRPGx4GYCvVfWbKH4mEREloGgGr0kAXm7ltfNF5AsReVdECqI4JyIicqCoBC8RSQUwEcCrIV5eB2CIqo4E8EcAb7SyjmkiUioipdXV1fZNloiI4l609ryuALBOVfcFv6Cqtap6zHz8DgCPiPQOMW6BqhaqamFOTo79MyYiorgVreD1U7RyyFBE+omImI/PM+d0MErzIiIiB7L9wrwikgngcgDTA5bdAgCq+gyAawHMEJEmAPUAJqkTL3VPRERRw5YoRESRw5YoUcKWKERJjK1LyKl4eSiiJMXWJeRkDF5ESYqtS8jJGLyIkhRbl5CTMXgRJSm2LiEnY/AiSlJsXUJOxuBFlKTYuoScjKnyRAko3BR4ti4hp+KeF1GCYQo8JQMGL6IEwxR4SgYMXkQJhinwlAwYvIgSDFPgKRkweBElGKbAUzJg8CJKMEyBp2TAVHmiBMQUeEp0DF5EMWRXSxK2OrGG35fz8LAhUYzYVY/FOi9r+H05E4MXUYzYVY/FOi9r+H05E4MXUYzYVY/FOi9r+H05E4MXUYzYVY/FOi9r+H05E4MXUYzYVY/FOi9r+H05k6hqrOdgWWFhoZaWlsZ6GkSdxmzD+BDB70siPTcKjcGLiChyGLyihHVeRBRx3PMju/GcFxFFFOumKBoYvIgoolg3RdHA4EVEEcW6KYoGBi8iiijWTVE0MHgRUUSxboqiwdbgJSJni8j6gFutiMwMGiMiMldEvhKRMhEZbeeciMhe7CdG0WBrqryqbgUwCgBExA2gEsDSoGFXADjLvI0FMM+8J4of2/4GfPwEcOQboPsQ4IK7gLzLYz2ruMV+YmS3aB42vAzA16r6TdDyHwNYpIY1ALqLSP8ozouobdv+Brx7L3B0H5Dew7h/915jORHFRDSD1yQAL4dYngugIuD5bnMZUXz4+AnAlQqkZgIixr0r1VhORDERleAlIqkAJgJ4NdTLIZadcs0qEZkmIqUiUlpdXR3pKRK17sg3gCej5TJPBnBkV2zmQ0RR2/O6AsA6Vd0X4rXdAAYFPB8IYE/wIFVdoKqFqlqYk5Nj0zSJQug+BPDWt1zmrQe6D47NfIgoasHrpwh9yBAAlgGYYmYdjgNQo6pVUZoXUfsuuAvwnwBO1AGqxr3/hLGciGLC9uAlIpkALgfwvwHLbhGRW8yn7wDYDuArAM8CuNXuORFZknc5cMWjQNe+QMMR4/6KR5ltSBRDbIlCRBQ5bIkSJWyJQhRprAmzrSXKvPXzsLh8Meq8dcj0ZGLysMmYMWpGBGZMTsPLQxFFEmvCbGuJMm/9PMwvm4/6pnqkSArqm+oxv2w+5q2fF6GZk5MweBFFEmvCbGuJsrh8MUQEKZLS4n5x+eLITJwchcGLKJJYE2ZbS5Q6bx3ccLdY5oYbdd66Tq2XnInBiyiSWBNmW0uUTE8mfPC1WOaDD5mezE6tl5yJwYsoklgTZltLlMnDJkNV0aRNLe4nD5scmYmTozB4EUUSa8Jsa4kyY9QMTB8xHRkpGWjSJmSkZGD6iOnMNkxSTJUnCoeV9Pe8y5MqWEXTOb3PwbCew06m4J/T+5xYT4lihHteRO1h+rsldqXK27VeciYGL6L2MP3dErtS5e1aLzkTgxdRe5j+boldqfJ2rZecicGLqD1Mf7fErlR5u9ZLzsTgRdQepr9bYleqvF3rJWdi8CJqD9PfLbErVd6u9ZIzsSUKEVHksCVKlLDOiygcVuq87GqJYmG9drUkCZddn29lvXaNpfjAPS+i9jTXeblSjSxDb71xzivUoUMrY22aQ3M9lMftQbo7HQ2+Bnh93qgdYrPr862s166xYeCeV5TwnBdRe6zUedlVE2ZhvbGuh4qHOi+7xlL8YPA
+      "[... base64-encoded PNG data and output metadata for the notebook's matplotlib figure outputs omitted ...]"
9B+bk5AzIzc0d8OGHH4a79V6TXHLJJX0PHjzYvDsB2MxK8MpT1cX1L1R1CYARqroKAP+5moDyu+dj2rBpyErLQsWJCmSlZYVNfgCAyUMmY9KgSUhLSjt5rStssgZgW5sTK+PNHzYF0wbegixPCio0gCxPCpM1yDU++OCDVosXL267fv36TcXFxZv+9a9/Fffp0+dEtNa/bNmybR07dmzePdhsZiVV/rCI/ALAy+brGwAcEREvjGwQioRdKeIOSPs+t+O56N++/8k09XM7nnvaZVekp2Ju184obV2H7IzOKEhPRTTKg/O750dcaJw/bErE+8jK/mWrFbJbaWmpr3379nVpaWkKAF27dq0DgOzs7G+NHj368EcffZQJAC+99NL2c889t2bPnj1JEyZM6FVaWpoMAI8++uiu7373u1+Xl5d7Jk6c2LOoqCgdAKZNm7anoKDgaHZ29rcKCws3d+3ate6pp55qP2vWrM61tbVy/vnnfz1v3ryvAOCGG27oXVRU1EpEdMyYMQd/+9vfHgg/WntYOfL6KYDuMG6c+w8APc1pXgD/Gf2hJSCb2obUp32XBWqQKR6UBWrw4MZnsGL1zOat10LbELuWdQIr+9dt20bu9B//8R8Ve/bsSe7du/e5N910U8933nkno35eZmamf/369ZsnTZp04K677uoBAJMmTepxzz337N+wYcPmhQsXfnnbbbf1BoD777+/a2Zmpr+4uHhTcXHxpquvvvpY8OesWbMm9bXXXmtfWFi4ZcuWLZs8Ho8+/fTTHT755JP0vXv3+r744ouNxcXFm+64446Y39Iv4uClqgdV9S5VPU9Vh6jqnapaZvbo2mbnIBOGTW1D5m6eBx+ANPFCIEgTL3zm9Gat10LbELuWdQIr+9dt20bu1KZNm8CGDRs2PfHEE19lZWXVjR8//uyZM2d2AIDx48cfBoBbbrnl8Oeff54BAB9//HHm3Xff3TM3N3fAtdde2/f48ePeI0eOeJYvX57585///OQRU1ZWVoNThYsWLWq9YcOG9MGDB/fPzc0d8NFHH2Vu3749JTc3t6akpCRl/PjxPV577bXMdu3axfwUY8SnDUUkB8C9AHoHv09VL4v+sBKUTW1DSs0jgmDRSPu20jbErmWdwMr+ddu2kXslJSXhmmuuOXbNNdccGzRoUNX8+fM7AIDH883vqogoAKgqCgsLN2dkZDTIgFRVmPdFD0tV5cc//vGhJ5988pRf4A0bNmxauHBh5lNPPdXplVdeaf/qq6/ujNKmRcTKacNXAXwO4FcA/ivohyJlV4q4TWnfllLPbVrWCazsX7dtG7nTunXrUtavX3/yF/Dzzz9P6969+wkAmDdvXnsAeO6559qdd955XwPAxRdfXDFjxoxO9cuvXLkyDQBGjhxZ8eijj56cXlZW1iDDcNSoURVvv/12u9LS0iQA2L9/v7e4uDh57969SX6/HwUFBUcfeOCB0vXr14epibGXleBVp6qzVPVTVf2s/se2kSUiu1LEbUr7tpR6btOyTmBl/7pt28idKioqvOPGjTvr7LPPHpiTkzNgy5YtaTNmzNgDADU1NTJo0KDcp556qvPMmTNLAGDOnDkla9asaZWTkzPg7LPPHvjEE09kAcAf//jHvUePHvWec845A/v16zfg3XffbR38OUOHDq3+1a9+VXr55Zfn5OTkDLjssstySkpKfDt37vRdfPHF/XJzcwf87Gc/O2v69Om7Y70PrNxV/ncADgBYCODk+ZJ43CLK1beHOpltGL22IYB92YaWbnRr07JOwGxDilD97aF2Dh48+GCsPzw4SzDWn223devWdRw8eHDv+tdWgteOMJNVVftEaWwRc3XwIqJExuBlk9DgFXHChqqeZcuIEoFNtVtuY+mII4H3GY+8KF5KS0vXx3sMsRLxNS8RSReRX4nIHPP1OSJyjX1DcwmbarfcxlJ9UwLvM9Z5EcWGlYSNvwI4AWC4+Xo3gAeiPiK3sal2y20s1Tc
l8D5jnRdRbFgJXmer6kMAagFAVavAlte2tfdwGyvtSBJ5n1naD0TUZFaC1wkRSQOgACAiZyMo67DFsql2y20s1Tcl8D5jnRdRbFgJXr+F0Ralh4gsAPBPAPfZMio3sal2y20s1Tcl8D5jnRe50cyZMzvs3LnTVc2Frdzb8H0APwJQAOAlAHkwGlO2bDa193AbK+1IEnmfWdoPRA7xwgsvdNy1a5ergpeVlihQ1UMA3ql/LSK7YNxdvmXLudJVf3jtSuXO37YS+Z+/D9QcB1IygJQc4IzrPXOdoRPavVhhpS0LEQC8U7Qn85kVO7rsLa9K6domreaW/LP2XT2oW0Vz1llRUeEZPXp0n7179yYHAgG577779uTm5tbcc889PSorKz3t2rWrW7Bgwc4PP/wwY8OGDenjxo3rk5qaGigsLNz8z3/+M+P+++/v4ff7MXjw4Mp58+Z9lZaWprfffnv24sWL23q9Xh05cmTFnDlzdr/44ott/vSnP3Wtra31tGvXru6VV17Z3qNHD9vrzCIuUg77ZpESVe0RxfFEhEXKTVefyu3z+pDqTUW1vxq1/trmHx0snQEsfwiAAOIF1A9AgRH3ASN/0XDZ+lR5T7KRqFFbZZw2DHP0Vd+OxAfjZrjVGkAtwMaR5FSWi5TfKdqT+ft3Nvf0eURTkjyBmrqApzag8uur++9qTgCbO3du20WLFrV5+eWXvwKAQ4cOea+44opz3nnnnW3dunWre+aZZ9otWbKkzauvvrrzggsu6PfII4+UjBgxorKyslL69OnzrSVLlmwdNGhQzQ9/+MPe5513XuWkSZMODRs2rP/27ds3eDweHDx40NuxY0d/WVmZt0OHDn6Px4NHH3204+bNm1OfeeaZqN8uKrRI2co1r3CaHvkoLmxL5V71JAABvEmAx3yEmNNDWEiVt6vdC5FTPLNiRxefRzTV5w2ICFJ93oDPI/rMih1dmrPe888/v2rFihWZkydPzl60aFHG9u3bfV988UXaZZddlpObmzvg4Ycf7rpnz55TThWuW7cutXv37jWDBg2qAYCCgoJDH330Uev27dv7U1JSAjfeeGOv559/vm1GRkYAAHbs2JGcn59/Tk5OzoCZM2dMDk+hAAAgAElEQVR22bJlS1roOu1wxtOGIvIXhA9SAqBt1EdEtrKtZUfNccAT8v+BeI3poSy0hrGr3QuRU+wtr0ppnZLU4DRbSpInsLe8qlltIQYNGlSzZs2aTa+//nqbX/7yl9kjR46s6Nu3b9XatWu3NPa+052N8/l8WLt27eY333wz8+WXX243a9asTqtWrSq+8847e9599937xowZU/7222+3nj59erfmjDtSkRx5FQL4LMxPIYC77Bsa2cG2VO6UDPNUYRD1G9NDWUiVt6vdC5FTdG2TVlNTF2jwt7imLuDp2iatWf9C27lzp69169aB22+//fDUqVP3FxYWtjp8+HDSBx980Aow7j5fWFiYCgAZGRn+8vJyLwAMGTKkurS0NHnDhg0pADBv3rwO+fn5x8rLyz2HDx/23nDDDeVPP/10yebNm9MB4NixY96ePXvWAsDcuXM7NGfMVpwxeKnq84391C9nHqGRw9mWyn3hHQAU8NcBAfMRak4PYSFV3q52L0ROcUv+WftqAyrVtX6PqqK61u+pDajckn/Wvuas97PPPksbMmRI/9zc3AEzZszo+vvf/37Pyy+//OX999/fvV+/fgMGDhw4YNmyZRkAMG7cuIN33XVXr9zc3AGBQABPP/30zh//+Mdn5+TkDPB4PLj33nvLjh496h01atQ5OTk5A/Lz8/s98MADJQDwy1/+cs9PfvKTs4cOHdqvQ4cOMbshcLMSNhqsSGSNqp4flZWdARM2mse2G8cunWFc46rPNrzwjlOTNepZaA3jtmxDatGadFd5O7INE02TW6KcyemCl4i0BfAsgHNhXDv7map+EjRfADwO4PsAKgEUqOqaxj6LwYuIHCquLVESWZNbojT
D4wAWqer1IpIMILRd9FUAzjF/hgGYZT66RkIfGdjVusTCel3XYiSB270QOUVzU+WDnXKTXhHJBDACwHMAoKonVPVoyGI/ADBPDasAtBWRrlEcl63q65DKzKy4skANHtz4DFasnhnvoTWfXa1LLKzXdS1GErjdC5GTRDN4hetn0QdAGYC/isjnIvKsiLQKWSYbQEnQ693mNFdI6Doku1qXWKnzcluLkQRu90LkJJHUeb2FRoqRVXW0+Tj3NOs/H8BdqrpaRB4HcD+AXwd/RLjVhhnHrQBuBYCePZ1zR6qErkOyUI9l13ptq0uzi137jIgaiOSa1yPNWP9uALtVdbX5+jUYwSt0meBbTHUHsCd0Rao6B8AcwEjYaMaYoirbk4KyQA3SxHtyWsLUIbXtZZz2Sg66TBmN1iUW1pudkY2yqjKkJX1TtO/oFiN27TMiaiCSOq9ljf2c4b37AJSISD9z0uUANoUs9iaAcWK4EEC5qu5tysbEQ0LXIdnVusRKnZfbWowkcLsXatmmTp3a7Y033mht9X1vv/1260svvbRvtMcTcbahiJwD4I8ABgA42SpWVfuc4a13AVhgZhpuBzBBRG4z3/s0gHdhpMlvg5EqP8HKBsRb/rApmAYkZrZhzpUAHom4HsuO9eZ3z8c0THNPtqFd+4woBgKBAFQVXq/3lHmPPfbYKWfE7FBbWwuf78zdWaykyv8VRkPKPwO4FEaQCXe9qgFVXQuj91ewp4PmK4Awt2Fwj/xhUxIjWIVjV7sXC+t1XYsRl7XIIQfYuDATK5/ogorSFGRm12D4nfsw8IdNLlKePHlydq9evU7cf//9ZQBwzz33dGvdurU/EAhg4cKF7U+cOCFXX3310T//+c97tm7dmnzVVVedM3z48GOfffZZxj/+8Y9t//3f/92tqKiolYjomDFjDv72t789cN111/W+5ppryidMmHBk2bJl6VOnTu1ZWVnpSU5O1uXLl29NSUnRcePG9SoqKkr3er146KGHSq699tpjwePav3+/d8yYMb137dqVkpaWFpgzZ85Xw4YNq7rnnnu67d2717dr167k9u3b17311ls7zrSNVrIN01T1nzAKm79S1d8BuMzKDiVqihWrZ2Li3DyM+r9vYeLcvOiVIRS/D8y9BnjsW8Yj09kpHjYuzMSiaT3xdZkPKZl1+LrMh0XTemLjwswzvzm8m2666fDrr7/evv71P/7xj3ZZWVl127ZtSy0qKtq8efPmTWvXrk1/7733MgBg586dqRMmTDi0efPmTfv370/au3ev74svvthYXFy86Y477jgUvO7q6moZM2bM2Y899tiurVu3blq2bNnWjIyMwIwZMzoBQHFx8aYXX3xx+6233tq7srKywQHOfffd123w4MGVxcXFm37/+9+Xjh8//qz6eUVFRemLFy/eFkngAqwFr2oR8QD4QkTuFJEfAuhk4f1EltlWR8d6LHKKlU90gden8KUFIAL40gLw+hQrn2hyS5TvfOc7VYcOHUrauXOn75NPPklr06aNv6ioKG358uWZAwYMGDBw4MABX375ZeqWLVtSAaBr164nLr/88q8BIDc3t6akpCRl/PjxPV577bXMdu3aNbjjdlFRUWqnTp1qL7nkkkoAaN++fcDn82HlypUZ48aNOwQA5513XnW3bt1OrF+/PjX4vZ9++mnriRMnHgKA0aNHHzt69GjSoUOHvAAwatSooxkZGREn41kJXlNh3B1jCoChAMYCGG/h/USW2VZHx3oscoqK0hQkpTZsnZCUGkBFabNSlq+99tojL7zwQrsFCxa0v+666w6rKqZOnbp3y5Ytm7Zs2bJp165dG37+858fBID09PSTn5+VleXfsGHDpksvvfTYU0891enGG2/sHbxeVYWInBJkIrnVYLhl6tfVqlWrwCkzGxFx8FLVf6vqcQAVAKao6o/MO2IQ2aY0UINUO+rojn5l1F8FYz0WxUNmdg3qqhv+ktdVe5CZ3axf8rFjxx5
+/fXX27/99tvtbrrppiNXXXVVxfz58zuWl5d7AGDHjh2+0tLSU/Ie9u7dm+T3+1FQUHD0gQceKF2/fn2DW/oNHjy4ev/+/cnLli1LB4AjR454amtrcfHFFx9/4YUX2gNAUVFRyt69e5MHDRrUoP/ShRdeeOyvf/1rB8DIQmzXrl1d+/btLQWtelayDfNgJG20Nl+Xw7jJ7mdN+WCiSNhWR8d6LHKK4Xfuw6JpPQF4kJQaQF21B/5awfA7m9USJS8vr/rrr7/2dO7c+USvXr1qe/XqVbtx48bUb3/727mAcbS1YMGCHUlJSQ0Oh3bu3OmbOHFi70AgIAAwffr03cHzU1NTdcGCBV9OmTKlZ3V1tSc1NTWwfPny4vvuu+/A2LFje+Xk5Azwer2YPXv2zrS0tAbrnjFjxp6f/vSnvXNycgakpaUF5s6dG9H1rXAivqu8iBQBuENVV5ivLwbwlKoOauqHNxXvKt9y1F/z8sE44qrWAGoBTBt4S/MyPOuveXmSjSOu2iqjHuuqR5gpSM3RtLvKRznbMBE1567yx+oDFwCo6kcicqyxNxA1l211dKzHIicZ+MMKBitrrASvT0VkNoCXYNx78AYAS0XkfAA4Uw8uchALLTtmLRyD+UfWotIjSA8oxrYbgsk/XBB2Wbtal9hWR8d6LNss3XIAs5dvR8mRSvRol45JI/pgZC6Tkyl6rGQbDgGQA6NQ+XcA+gMYDuB/0bz7H1IsWUgRn7VwDGaXr0OVAEmqqBJgdvk6zFo45pRlXde6hGyzdMsB/ObNjThwrBpt03w4cKwav3lzI5ZuORDvoVECsZJteGkjPyxWdgsLKeLzj6yFqCIJxon8JACiivlH1p6yrOtal5BtZi/fDp9XkJ6cBBHj0ecVzF6+Pd5DowQScfASkc4i8pyIvGe+HiAiE+0bGtnCQop4pUcQeoczrzk9VOnxUqR6G9QjOrt1Cdmm5Egl0nwNf3PSfF7sPlIZpxFRIrJy2nAugMUAupmvi2EULpObtO1lZNYFO02KeHpA4Q+Z5jenh8rOyEa1v0FJh7Nbl5BterRLR1Vtw9+cqlo/urdLP807iKyzErw6qurfAAQAQFXrgFP+tpHTWWjZMbbdEKgI6mBk6NQBUBGMbTfklGVd17qEbDNpRB/U+hWVJ+qgajzW+hWTRpypAQXZaefOnb5Ro0ZZ/hJuuOGGXp999llqY8s89NBDWU888USHpo/OOit1XksBXAfgfVU93+y9NUNVL7FxfGGxzquZTmYbnjlF3AnZhuQ+9dmGu49UonvLyjZsWp1XHEXagiTeQuu8rASv8wH8BcC5ADYAyAJwvaoW2TDORjF4EZFDNSl4Ld65OPP5jc932V+5P6Vzeuea8QPH7/te7+9FvSXKiy++2PGLL77YOHPmzA7vvfdem5qaGk9lZaXn448/Lh4/fnzPVatWte7Ro0dNIBBAQUHBoQkTJhy54IIL+j3yyCMlI0aMqExPTz9v4sSJB5YsWdImNTU18Pbbb2/r0aNH3T333NMtIyPDP3369P0bNmxIufXWW3sdOnQoyev16quvvrq9e/futaNGjepbXl7uraurk9/85jd7brrppqNWtik0eJ3xtKGIfFtEuph1XJcAmAagBsASALsbfTPFzIrdKzBx8USMen0UJi6e2HiKul2tQJbOAP7UE/if9sbj0hnRGQNbl9hq6ZYD+MmcVbh4xof4yZxVLTKlPZ77YPHOxZkP//vhnoerD/syfBl1h6sP+x7+98M9F+9cHNWWKBdeeOHXwcusWbMm46WXXtqxatWq4nnz5rUrKSlJ3rp168bnn39+5+eff54Rbr1VVVWeiy666PjWrVs3XXTRRcf/8pe/ZIUu89Of/vSs22677cDWrVs3FRYWbunZs2dtenp64J133tm2adOmzcuWLSueNm1a90CgSbc0PCmSa16zAZwwnw8H8EsATwI4AmBOsz6dosJSjZVdrUCWzgCWP2RcQ/P4jMflD4U
[base64-encoded PNG plot output omitted]\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - 
"%matplotlib inline\n", - "def plot_iris(iris, col1, col2):\n", - " import seaborn as sns\n", - " import matplotlib.pyplot as plt\n", - " sns.lmplot(x = col1, y = col2, \n", - " data = iris, \n", - " hue = \"Species\", \n", - " fit_reg = False)\n", - " plt.xlabel(col1)\n", - " plt.ylabel(col2)\n", - " plt.title('Iris species shown by color')\n", - "plot_iris(iris.data, 'Petal_Width', 'Sepal_Length')\n", - "plot_iris(iris.data, 'Sepal_Width', 'Sepal_Length')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Examine these results, noticing the separation, or overlap, of the label values. Which pairs of classes are well separated and which pairs show overlap?\n", - "\n", - "ANS: Versicolor and Virginica show overlap. All other pairs are well separated. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Prepare the data set\n", - "\n", - "Data preparation is an important step before training any machine learning model. These data require only two preparation steps:\n", - "- Scale the numeric values of the features. It is important that numeric features used to train machine learning models have a similar range of values. Otherwise, features which happen to have large numeric values may dominate model training, even if other features with smaller numeric values are more informative. In this case Z-score normalization is used. This normalization process scales each feature so that the mean is 0 and the variance is 1.0. \n", - "- Split the dataset into randomly sampled training and evaluation data sets. The random selection of cases seeks to limit the leakage of information between the training and evaluation cases.\n", - "\n", - "The code in the cell below normalizes the features by these steps:\n", - "- The scale function from sklearn.preprocessing is used to normalize the features.\n", - "- Column names are assigned to the resulting data frame. \n", - "- A statistical summary of the data frame is then printed. 
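The Z-score scaling described above can be checked in isolation on a tiny made-up array (a minimal sketch for illustration only; the `toy` matrix below is invented and is not part of the lab data):

```python
import numpy as np
from sklearn.preprocessing import scale

# Invented toy feature matrix: two features with very different ranges.
toy = np.array([[1.0, 100.0],
                [2.0, 200.0],
                [3.0, 300.0],
                [4.0, 400.0]])

# scale() centers each column to mean 0 and rescales it to unit variance,
# so neither feature dominates a distance-based model such as KNN.
toy_scaled = scale(toy)

print(toy_scaled.mean(axis=0))  # approximately zero for each column
print(toy_scaled.std(axis=0))   # approximately one for each column
```

With real features the same call is applied column-wise, which is why `scale` is given the whole numeric data frame in the cell below.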
\n", - "\n", - "***\n", - "**Note:** Data preparation with scikit-learn is covered in another lesson. \n", - "***\n", - "\n", - "Execute this code and examine the results. " - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - " Sepal_Length Sepal_Width Petal_Length Petal_Width\n", - "count 1.500000e+02 1.500000e+02 1.500000e+02 1.500000e+02\n", - "mean -2.775558e-16 -9.695948e-16 -8.652338e-16 -4.662937e-16\n", - "std 1.003350e+00 1.003350e+00 1.003350e+00 1.003350e+00\n", - "min -1.870024e+00 -2.433947e+00 -1.567576e+00 -1.447076e+00\n", - "25% -9.006812e-01 -5.923730e-01 -1.226552e+00 -1.183812e+00\n", - "50% -5.250608e-02 -1.319795e-01 3.364776e-01 1.325097e-01\n", - "75% 6.745011e-01 5.586108e-01 7.627583e-01 7.906707e-01\n", - "max 2.492019e+00 3.090775e+00 1.785832e+00 1.712096e+00\n" - ] - } - ], - "source": [ - "from sklearn.preprocessing import scale\n", - "import pandas as pd\n", - "num_cols = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']\n", - "iris_scaled = scale(iris.data[num_cols])\n", - "iris_scaled = pd.DataFrame(iris_scaled, columns = num_cols)\n", - "print(iris_scaled.describe())" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Examine these results. You can see the mean and variance of each column in the summary printed. The mean is zero and the variance is approximately 1.0.\n", - "\n", - "The methods in the scikit-learn package require numeric numpy arrays as arguments. Therefore, the strings indicating species must be re-coded as numbers. The code in the cell below does just this using a dictionary lookup. Execute this code and examine the head of the data frame. " - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
[HTML rendering of the data frame head omitted; the same rows appear in the text/plain output below]\n",
" - ], - "text/plain": [ - " Sepal_Length Sepal_Width Petal_Length Petal_Width Species\n", - "0 -0.900681 1.019004 -1.340227 -1.315444 0\n", - "1 -1.143017 -0.131979 -1.340227 -1.315444 0\n", - "2 -1.385353 0.328414 -1.397064 -1.315444 0\n", - "3 -1.506521 0.098217 -1.283389 -1.315444 0\n", - "4 -1.021849 1.249201 -1.340227 -1.315444 0" - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n", - "iris_scaled['Species'] = [levels[x] for x in iris.data['Species']]\n", - "iris_scaled.head()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Now, you will split the dataset into a test and evaluation sub-sets. The code in the cell below randomly splits the dataset into training and testing subsets. The features and lables are then seperated into numpy arrays. The dimension of each array is printed as a check. Execute this code to create these subsets. \n", - "\n", - "***\n", - "**Note:** Spliting data sets for machine learning with scikit-learn is discussed in another lesson.\n", - "***" - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "(75, 4)\n", - "(75,)\n", - "(75, 4)\n", - "(75,)\n" - ] - } - ], - "source": [ - "## Split the data into a training and test set by Bernoulli sampling\n", - "from sklearn.model_selection import train_test_split\n", - "import numpy as np\n", - "np.random.seed(3456)\n", - "iris_split = train_test_split(np.asmatrix(iris_scaled), test_size = 75)\n", - "iris_train_features = iris_split[0][:, :4]\n", - "iris_train_labels = np.ravel(iris_split[0][:, 4])\n", - "iris_test_features = iris_split[1][:, :4]\n", - "iris_test_labels = np.ravel(iris_split[1][:, 4])\n", - "print(iris_train_features.shape)\n", - "print(iris_train_labels.shape)\n", - "print(iris_test_features.shape)\n", - 
"print(iris_test_labels.shape)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Train and evaluate the KNN model\n", - "\n", - "With some understanding of the relationships between the features and the label, and with data preparation completed, you will now train and evaluate a $K = 3$ model. The code in the cell below does the following:\n", - "- The KNN model is defined as having $K = 3$.\n", - "- The model is trained using the fit method with the feature and label numpy arrays as arguments.\n", - "- A summary of the model is displayed. \n", - "\n", - "Execute this code and examine the summary of these results.\n", - "\n", - "\n", - "***\n", - "**Note:** Constructing machine learning models with scikit-learn is covered in another lesson.\n", - "***" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "KNeighborsClassifier(algorithm='auto', leaf_size=30, metric='minkowski',\n", - " metric_params=None, n_jobs=1, n_neighbors=3, p=2,\n", - " weights='uniform')" - ] - }, - "execution_count": 8, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "## Define and train the KNN model\n", - "from sklearn.neighbors import KNeighborsClassifier\n", - "KNN_mod = KNeighborsClassifier(n_neighbors = 3)\n", - "KNN_mod.fit(iris_train_features, iris_train_labels)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Next, you will evaluate this model using the accuracy statistic and a set of plots. The following steps create model predictions and compute accuracy:\n", - "- The predict method is used to compute KNN predictions from the model using the test features as an argument. \n", - "- The predictions are scored as correct or not using a list comprehension. \n", - "- Accuracy is computed as the percentage of the test cases correctly classified. \n", - "\n", - "Execute this code and examine the results. 
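The define/fit/predict/accuracy sequence described above can be sketched end-to-end on a tiny synthetic problem (the cluster points below are invented for illustration; the lab's own cells use the iris arrays instead):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Invented 2-D points forming two well-separated clusters.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [2.0, 2.0], [2.1, 1.9], [1.9, 2.1]])
y_train = np.array([0, 0, 0, 1, 1, 1])

# Define a K = 3 model and "train" it; KNN simply stores the known cases.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, y_train)

# Score new cases and compute accuracy as the percentage predicted correctly.
X_test = np.array([[0.1, 0.1], [2.0, 1.9]])
y_test = np.array([0, 1])
predicted = knn.predict(X_test)
accuracy = 100.0 * np.mean(predicted == y_test)
print(accuracy)  # 100.0 for these well-separated clusters
```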
" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "96.0\n" - ] - } - ], - "source": [ - "iris_test = pd.DataFrame(iris_test_features, columns = num_cols)\n", - "iris_test['predicted'] = KNN_mod.predict(iris_test_features)\n", - "iris_test['correct'] = [1 if x == z else 0 for x, z in zip(iris_test['predicted'], iris_test_labels)]\n", - "accuracy = 100.0 * float(sum(iris_test['correct'])) / float(iris_test.shape[0])\n", - "print(accuracy)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "What is the accuracy of the $K = 3$ model? ANS: 96%\n", - "\n", - "Now, execute the code in the cell below and examine plots of the classifications of the iris species. " - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "'Done'" - ] - }, - "execution_count": 10, - "metadata": {}, - "output_type": "execute_result" - }, - { - "data": { - "image/png": 
"[base64-encoded PNG data for the first classification plot omitted]\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - }, - { - "data": { - "image/png": 
"[base64-encoded PNG data for the second classification plot omitted]
oRMTEXLfW6/IWBZBcmb4GRzLv+u3rtWb9Gm3buU37p/dr285tWrN+DcGASip7TKHdy7v4XsARKdPX4EjnvfT6S7XIi/T4/se1ZHSJHt//uBZ5kS69/tIj/RhAMt0MBSC5Mn0NjnTeex69R3sO7FEoNOIRhUJ7DuzRvY/ee4SfAkiHUMDQaG3pN5Y0JEmNJY0Fb/GXmfeUxima3DupRV4kSVrkRZrcO6mTGyeX+DRAGt0MBfosIKkyfQ3KzHvGD56hUGgmZhSR3yt0xg+eUe4DAQl0MxQ+2cX3AjpWpq9BmXnve/w+LRvLzs4+8MQBSdKysWW67/H7yn0gIIF5L3Nh+x80Rye1iHhdt4uaC5e5AIDOdfMyF5+Q9Adz3LAAXKe/A+vXS+efL51+ena/vn+ndpb6nhhbAENgIf0Uvj7XrR9FDgOu079A69dLa9ZIO3ZIxx6b3a9Z07dgKPU9MbYAhkAnYzSfKen3JJ0laaw1PSKe3ZvSZlfF3UetK3C2rr7JlTfncP75WRAcddTBaXv2SCed1PNLK5T6nlpXDW1dMZSrhWLA9GqM5j+TNC3pxyX9laTPH1l59cJ1+jtwzz3S0qVPnrZ0qXTvvT1fdKnvibEFMCQ6CYWnRcS/KGtdfC8iPirp/N6UNTzKnP9eS6efLu3d++Rpe/dKK1b0dLGlvqfWCGSNbF41GoxEhsrqJBT22R6RtN32GttvkHRij+oaGlynv0MXXywdOJDtMorI7g8cyKb3UKnvibEFMEQ6CYX3S1oqaa2kcyW9TdJFvShqmHCd/g5deKF02WXZMYRdu7L7yy7LpvdQqe+JsQUwRDoejtP2MZIiInb3pqS5VfFAMwCk1vUDzbZX2r5V0i2SbrV9s+1zyxQJABgsnew+ulzSeyJiRUSskPReZWckAZC0875t+vbZJ+qR+7f3dbl0jEQ3dRIKuyPiX1sPIuI6SUl2IQGD6NZL1urZ25u65XfW9HW5dIxEN3USCjfY/qztH7P9Ktt/Kula2+fYPqdXBQJVsPO+bTrtq9fokacvzu771FooM6gQMJtOQuFsSc+V9BFJH5X0AknnKbv+0Se6XhlQIbdeslYjT4SmFo9q5InoW2uBjpHoto7PPup6AfYFyi67vUjSn0fE/5rr9Zx9hEGz875teuwlP6QDixdpZtGIRp6Y0ZKpJ7Rsy+069tQze7bc1mU5xkbHiktz7Jvex2VUMKtenH30DNufs70+f3yW7XeVLHKRpE9LulDZNZXeYvusMu8J9FurlTCzKPvvlAVD71sLdIxEL3Sy++gvJV0l6Zn54+8q69BWxksl3RURd0fEAUl/K+n1Jd8T6KtjrrtBozOhY3bvL26jM6Fjrruhp8ulYyR6YbSD1x4fEV+y/SFJiohp20+UXP7Jku5ve/yApJeVfE+gr865Y9es00/p8XI3XtTbq8ainjppKeyxfZzyUdhs/4ikx0ouf7ZxnZ9ykMP2u21vsb1lcnKy5CJRBaXO+WewGwyZfv5JdxIKvyrpSkln2N6k7NLZv1Jy+Q9IOrXt8SmSHjr0RRGxLiJWRsTKE044oeQiUQWlzvlnsBsMmX7+Sc8bCrZfYnt5RNwo6VWSflPSfklXK/tRL+Pbks60fbrtJZJ+TlnwoMZKnfPfuoz18uVcvhpDod9/0gtpKXxW0oH83+dJ+rCyM4Z2SSqVWxExLWmNsgPYd0r6UkTcXuY9UX2lzvlnsBsMmX7/Sc/bT8H2zRHxn/J/f1rSZD7AjmzfFBFn97bEJ6OfwnArdc5/a0jMsbGDw2Lu28fQmKisbv5Jd7OfwiLbrbOUXi2p/ZSHTs5eAuZV6px/BrvBkEnxJ72QUPiCpK/b/qqk70v6V0my/RyVP/sIeJJS5/wz2A2GTIo/6QVd5iI//fQkSVdHxJ582nMlHZ0fgO4bdh8BQOcWuvtoQbt/IuJbs0z77pEUBgAYXJ30UwAqocygMwxYg7ojFDB0ygw6w4A1qDtCAUOlzKAzDFg
DEAoYMmUGnWHAGoBQwBBpbek3ljQkSY0ljQVv8ZeZFxgmhAKGRplBZxiwBsgQChgaZQadYcAaIJN8jOZO0XkNADrX9TGaUU+pztsvtdxEg+zQxwHDgFDAnFKdt19quYkG2aGPA4YBoYDDSnXefqnlJhpkhz4OGBaEAg4r1Xn7pZabaJAd+jhgWBAKmFWq8/ZLLbfVSmhk86rR6EtrgT4OGCaEAmaV6rz9UstNNMgOfRwwTAgFzCrVefullptokB36OGCY0E8BAGqAfgqQlPbcec7bB6qHUBhyKc+d57x9oHoIhSGW8tx5ztsHqolQGGIpz53nvH2gmgiFIZXy3HnO2weqi1AYUinPnee8faC6CIUhlfLcec7bB6qLfgoAUAP0UwAAdIxQALqkjp31Eo1nVEnbtkknniht3566krkRCkCX1LGzXqLxjCpp7dosRNesSV3J3AgFoAvq2Fkv0XhGlbRtm3TNNdLixdn9ILcWCAWgC+rYWS/ReEaVtHatFJFdzT1isFsLhAJQUh076yUaz6iSWq2E9mE+Brm1QCgAJdWxs16i8YwqqdVKGMl/bUdGBru1QCgAJdWxs16i8Ywq6YYbshDYv//gLSKbPohGUxcAVN3GizamLqHvNtbvIx+xXbtSV9AZWgoAaoW+FXMjFADUCn0r5kYoAKgN+lbMj1AAUBv0rZgfoQCgFuhbsTCEAoBaoG/FwhAKAGqBvhULQz8FALVA34qFoaUAACgQCgCSoBPZYCIUACRBJ7LBRCgA6Ds6kQ0uQgFA39GJbHARCgD6ik5kg41QANBXdCIbbIQCgL6iE9lgo/MagL6iE9lgo6WAOTX3NrX6itVDPQg90kjVT6HMcuvQt4JQwJzGt45r84Obh3oQeqSRqp9CmeXWoW+FIyJ1DR1ZuXJlbNmyJXUZtdDc29Sqy1dpdGRU0zPT2vQLm3Tc0uNSl4Uh0GxKq1ZlB5mnp6VNm6Tj+vCnVWa5qWruFttbI2LlfK+jpYDDGt86rqmZKY2NjmlqZorWAromVT+FMsutS98KWgqYVauVMDY6VrQU9k3vo7WA0lpb3GNjB7e69+3r/ZZ3meWmqrmbaCmglFYrYXQkO0FtdGSU1gK6IlU/hTLLrVPfimShYPtNtm+3PWN73vRCf224e4OmZ6Y1uXeyuE3PTGvD3ZxMjnJS9VMos9w69a1ItvvI9gskzUj6rKQPRMSC9gmx+wgAOrfQ3UfJOq9FxJ2SZDtVCQCAQ9TumAKdsdArVe3YVMXOXFVd11XQ01Cw/c+2b5vl9voO3+fdtrfY3jI5OVmqJjpjoVeq2rGpip25qrquqyD5Kam2r1WfjinQGQu9UtWOTVXszFXVdZ0ap6TOgs5Y6JWqdmyqYmeuqq7rqkh59tEbJH1K0gmSHpV0U0T89HzzHWlLgc5Y6JWqdmyqYmeuqq7rQTDwLYWI+EpEnBIRPxARz1hIIJRBZyz0SlU7NlWxM1dV13WV1Gb3EZ2x0CtV7dhUxc5cVV3XVZL8QHOn6LwGAJ0b+N1H6I8y/TLWb1+v8yfO1+mfPF3nT5yv9dvX96BCSNK2bdKJJ0rbt6euBHVHKAy5I+2XsX77eq1Zv0Y7du/QsWPHasfuHVqzfg3B0CNr12YHUdesSV0J6o5QGGLNvU1N3Dyh5Ucv18TNEx21Fi69/lItGVmio5YcJds6aslRWjKyRJdef2kPK66nbduka66RFi/O7mktICVCYYiV6Zdxz6P3aOnipU+atnTxUt376L1drhJr10oR2Zk0EbQWkBahMKRarYTGkoYkqbGk0VFr4fRlp2vv1N4nTds7tVcrlq3odqm11moltJ9iSWsBKREKQ6psv4yLz7tYB2YOaM+BPYoI7TmwRwdmDuji8y7uZdm102oljOT/E0dGaC0gLUJhSJXtl3HhmRfqsgsv00mNk7Rr3y6d1DhJl114mS4888IeV14vN9yQhcD+/QdvEdl0IAX6KQBADdBPAeizOl7jv46fedg
RCkCX1PEa/3X8zMOOUAC6oNmUJiak5cuz+zpsOdfxM9cBoQB0QR2v8V/Hz1wHhAJQUmuLuZF1CVGjMfxbznX8zHVBKAAl1fEa/3X8zHVBKAAl1fEa/3X8zHUxmroAoOo2bkxdQf/V8TPXBS0FAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFAECBUAAAFAgFoE2zKa1eLe3cmbqShatizRhchALQZnxc2rxZWrcudSULV8WaMbgIBSDXbEoTE9Ly5dl9Fba8q1gzBhuhAOTGx6WpKWlsLLuvwpZ3FWvGYCMUAB3c4m40sseNxuBveVexZgw+QgHQwS3u0dHs8ejo4G95V7FmDD5CAZC0YYM0PS1NTh68TU9n0wdVFWvG4BtNXQAwCDZuTF1B56pYMwYfLQUAQIFQAAAUCAUAQIFQAAAUCAUAQIFQAAAUCAUAQIFQAAAUCAUAQIFQAAAUHBGpa+iI7UlJ30tdR4eOl9RMXcQAYD1kWA8Z1kN/18GzIuKE+V5UuVCoIttbImJl6jpSYz1kWA8Z1sNgrgN2HwEACoQCAKBAKPQHw55kWA8Z1kOG9TCA64BjCgCAAi0FAECBUOgT25fa/o7tW2x/xfay1DWlYPtNtm+3PWN7oM666DXbF9jeZvsu27+Rup4UbF9u+2Hbt6WuJSXbp9q+xvad+f+H96WuqYVQ6J8Nkl4UET8s6buSPpS4nlRuk/Szkr6RupB+sr1I0qclXSjpLElvsX1W2qqS+EtJF6QuYgBMS/q1iHiBpB+R9N5B+XsgFPokIq6OiOn84bcknZKynlQi4s6I2Ja6jgReKumuiLg7Ig5I+ltJr09cU99FxDckPZK6jtQiYkdE3Jj/e7ekOyWdnLaqDKGQxi9IWp+6CPTVyZLub3v8gAbkRwBp2V4h6cWSNqetJDOauoBhYvufJS2f5akPR8RX89d8WFnT8Yp+1tZPC1kPNeRZpnHqX83ZPlrS30l6f0Q8nroeiVDoqoj4ibmet32RpJ+CrSVSAAADzklEQVSR9OoY4nOB51sPNfWApFPbHp8i6aFEtWAA2F6sLBCuiIgvp66nhd1HfWL7AkkflPS6iNibuh703bclnWn7dNtLJP2cpCsT14REbFvS5yTdGRF/mLqedoRC/1wmqSFpg+2bbH8mdUEp2H6D7Qck/aikr9m+KnVN/ZCfZLBG0lXKDip+KSJuT1tV/9n+gqRvSnqe7Qdsvyt1TYm8XNLbJJ2f/x7cZPs1qYuS6NEMAGhDSwEAUCAUAAAFQgEAUCAUAAAFQgEAUCAUAAAFQgFDyfaH80sS35KfA/6yLr73j9n+x8M8Z9tN2z+YPz7Jdthe1faaSdvH2f5l22+f5T1WtC4tbfvs9vPXbX/U9ge69VmAQ3GZCwwd2z+q7HIi50TEftvHS1rSj2VHRNjerKxz3j9JOk/Sv+X319l+nqRmROyUtJAOjGdLWpm/F9BztBQwjE5S9sO7X5IiohkRD9k+1/bXbW+1fZXtkyTJ9rW2/9j29bZvs/3SfPpL82n/lt8/b4HL36QsBJTf/6GykGg9vj5//2KrP6/tZtvflPTefNoSSf9T0pvz1s6b8/c4K6/5bttrj3w1AU9FKGAYXS3pVNvftf2ntl+VX3zsU5LeGBHnSrpc0sfa5jkqIs6T9J78OUn6jqRXRsSLJf2WpN9d4PKv18FQeKmkv9fBi+Gdpyw0DvUXktZGRCs8lI+78FuSvhgRZ0fEF/Onni/pp/P3/kj+2YCuYPcRhk5E/IftcyW9QtKPS/qipEskvUjZtackaZGkHW2zfSGf9xu2j8mHS21ImrB9prLLXC/0x/cGSS+2fZSkxXk9d9t+jrJQ+IP2F9t+uqRlEfH1fNLnlY3Qdjhfy1tB+20/LOk
Zyq7CCpRGKGAoRcQTkq6VdK3tW5Xtkrm9fUv80Flmefw7kq6JiDfkA6Fcu8Bl77V9l7LBlG7MJ39L0msknSjp0JHnPMvy57K/7d9PiP/H6CJ2H2Ho2H5evnXfcrayK5OekB+Elu3Ftl/Y9po359NXSXosIh6T9HRJD+bPv6PDMjZJer+yK4Iqv3+fpG8dOpZGRDwq6bG2M5Te2vb0bmUtFqAvCAUMo6OV7fa5w/Ytks5Stm/+jZJ+3/bNkm7Swf3+krTL9vXKzghqXc7545J+z/YmZbubOrFJ0rN1MBRuVDawzvWHef07JX06P9D8/bbp1yg7sNx+oBnoGS6djdqzfa2kD0TEltS1AKnRUgAAFGgpAEfI9juVHSdotyki3puiHqAbCAUAQIHdRwCAAqEAACgQCgCAAqEAACgQCgCAwv8HVFJnlLCA4rUAAAAASUVORK5CYII=\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n", - "iris_test['Species'] = [levels[x] for x in iris_test['predicted']]\n", - "markers = {1:'^', 0:'o'}\n", - "colors = {'setosa':'blue', 'versicolor':'green', 'virginica':'red'}\n", - "def plot_shapes(df, col1,col2, markers, colors):\n", - " import matplotlib.pyplot as plt\n", - " import seaborn as sns\n", - " ax = plt.figure(figsize=(6, 6)).gca() # define plot axis\n", - " for m in markers: # iterate over marker dictioary keys\n", - " for c in colors: # iterate over color dictionary keys\n", - " df_temp = df[(df['correct'] == m) & (df['Species'] == c)]\n", - " sns.regplot(x = col1, y = col2, \n", - " data = df_temp, \n", - " fit_reg = False, \n", - " scatter_kws={'color': colors[c]},\n", - " marker = markers[m],\n", - " ax = ax)\n", - " plt.xlabel(col1)\n", - " plt.ylabel(col2)\n", - " plt.title('Iris species by color')\n", - " return 'Done'\n", - "plot_shapes(iris_test, 'Petal_Width', 'Sepal_Length', markers, colors)\n", - "plot_shapes(iris_test, 'Sepal_Width', 'Sepal_Length', markers, colors)" - ] - }, - { - "cell_type": "markdown", - "metadata": { - "collapsed": true - }, - "source": [ - "In the plots above color is used to show the predicted class. Correctly classified cases are shown by triangles and incorrectly classified cases are shown by circles. \n", - "\n", - "Answer the following questions:\n", - "1. 
How many misclassified cases are there? ANS: 3\n",
- "2. Do the misclassified cases appear to be on the boundary between classes? ANS: Yes"
- ]
- },
- {
- "cell_type": "markdown",
- "metadata": {},
- "source": [
- "## Summary\n",
- "\n",
- "In this lab you have created and evaluated a KNN machine learning classification model. Specifically, you have:\n",
- "1. Loaded and explored the data using visualization to determine if the features separate the classes.\n",
- "2. Prepared the data by normalizing the numeric features and randomly sampling into training and testing subsets. \n",
- "3. Constructed and evaluated the machine learning model. Evaluation was performed statistically, with the accuracy metric, and with visualization. "
- ]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "metadata": {},
- "outputs": [],
- "source": []
- }
- ],
- "metadata": {
- "anaconda-cloud": {},
- "kernelspec": {
- "display_name": "Python 3",
- "language": "python",
- "name": "python3"
- },
- "language_info": {
- "codemirror_mode": {
- "name": "ipython",
- "version": 3
- },
- "file_extension": ".py",
- "mimetype": "text/x-python",
- "name": "python",
- "nbconvert_exporter": "python",
- "pygments_lexer": "ipython3",
- "version": "3.6.4"
- }
- },
- "nbformat": 4,
- "nbformat_minor": 1
-}
diff --git a/Module4/Automobile price data _Raw_.csv b/Module2/Automobile price data _Raw_.csv
similarity index 100%
rename from Module4/Automobile price data _Raw_.csv
rename to Module2/Automobile price data _Raw_.csv
diff --git a/Module2/German_Credit.csv b/Module2/German_Credit.csv
new file mode 100644
index 0000000..0f6b915
--- /dev/null
+++ b/Module2/German_Credit.csv
@@ -0,0 +1,1012 @@
+1122334,A11,6,A34,A43,1169,A65,A75,4,A93,A101,4,A121,67,A143,A152,2,A173,1,A192,A201,1
+6156361,A12,48,A32,A43,5951,A61,A73,2,A92,A101,2,A121,22,A143,A152,1,A173,1,A191,A201,2
+2051359,A14,12,A34,A46,2096,A61,A74,2,A93,A101,3,A121,49,A143,A152,1,A172,2,A191,A201,1
+8740590,A11,42,A32,A42,7882,A61,A74,2,A93,A103,4,A122,45,A143,A153,1,A173,2,A191,A201,1 +3924540,A11,24,A33,A40,4870,A61,A73,3,A93,A101,4,A124,53,A143,A153,2,A173,2,A191,A201,2 +3115687,A14,36,A32,A46,9055,A65,A73,2,A93,A101,4,A124,35,A143,A153,1,A172,2,A192,A201,1 +8251714,A14,24,A32,A42,2835,A63,A75,3,A93,A101,4,A122,53,A143,A152,1,A173,1,A191,A201,1 +2272783,A12,36,A32,A41,6948,A61,A73,2,A93,A101,2,A123,35,A143,A151,1,A174,1,A192,A201,1 +1865292,A14,12,A32,A43,3059,A64,A74,2,A91,A101,4,A121,61,A143,A152,1,A172,1,A191,A201,1 +8369450,A12,30,A34,A40,5234,A61,A71,4,A94,A101,2,A123,28,A143,A152,2,A174,1,A191,A201,2 +2859658,A12,12,A32,A40,1295,A61,A72,3,A92,A101,1,A123,25,A143,A151,1,A173,1,A191,A201,2 +7677202,A11,48,A32,A49,4308,A61,A72,3,A92,A101,4,A122,24,A143,A151,1,A173,1,A191,A201,2 +6368245,A12,12,A32,A43,1567,A61,A73,1,A92,A101,1,A123,22,A143,A152,1,A173,1,A192,A201,1 +8363768,A11,24,A34,A40,1199,A61,A75,4,A93,A101,4,A123,60,A143,A152,2,A172,1,A191,A201,2 +6846974,A11,15,A32,A40,1403,A61,A73,2,A92,A101,4,A123,28,A143,A151,1,A173,1,A191,A201,1 +6293067,A11,24,A32,A43,1282,A62,A73,4,A92,A101,2,A123,32,A143,A152,1,A172,1,A191,A201,2 +2807551,A14,24,A34,A43,2424,A65,A75,4,A93,A101,4,A122,53,A143,A152,2,A173,1,A191,A201,1 +8394907,A11,30,A30,A49,8072,A65,A72,2,A93,A101,3,A123,25,A141,A152,3,A173,1,A191,A201,1 +9516779,A12,24,A32,A41,12579,A61,A75,4,A92,A101,2,A124,44,A143,A153,1,A174,1,A192,A201,2 +1750839,A14,24,A32,A43,3430,A63,A75,3,A93,A101,2,A123,31,A143,A152,1,A173,2,A192,A201,1 +9007770,A14,9,A34,A40,2134,A61,A73,4,A93,A101,4,A123,48,A143,A152,3,A173,1,A192,A201,1 +5611578,A11,6,A32,A43,2647,A63,A73,2,A93,A101,3,A121,44,A143,A151,1,A173,2,A191,A201,1 +3677496,A11,10,A34,A40,2241,A61,A72,1,A93,A101,3,A121,48,A143,A151,2,A172,2,A191,A202,1 +3265162,A12,12,A34,A41,1804,A62,A72,3,A93,A101,4,A122,44,A143,A152,1,A173,1,A191,A201,1 +7795200,A14,10,A34,A42,2069,A65,A73,2,A94,A101,1,A123,26,A143,A152,2,A173,1,A191,A202,1 
+4362293,A11,6,A32,A42,1374,A61,A73,1,A93,A101,2,A121,36,A141,A152,1,A172,1,A192,A201,1 +4506902,A14,6,A30,A43,426,A61,A75,4,A94,A101,4,A123,39,A143,A152,1,A172,1,A191,A201,1 +8881521,A13,12,A31,A43,409,A64,A73,3,A92,A101,3,A121,42,A143,A151,2,A173,1,A191,A201,1 +1398341,A12,7,A32,A43,2415,A61,A73,3,A93,A103,2,A121,34,A143,A152,1,A173,1,A191,A201,1 +7333893,A11,60,A33,A49,6836,A61,A75,3,A93,A101,4,A124,63,A143,A152,2,A173,1,A192,A201,2 +7890417,A12,18,A32,A49,1913,A64,A72,3,A94,A101,3,A121,36,A141,A152,1,A173,1,A192,A201,1 +2017286,A11,24,A32,A42,4020,A61,A73,2,A93,A101,2,A123,27,A142,A152,1,A173,1,A191,A201,1 +7081292,A12,18,A32,A40,5866,A62,A73,2,A93,A101,2,A123,30,A143,A152,2,A173,1,A192,A201,1 +8807838,A14,12,A34,A49,1264,A65,A75,4,A93,A101,4,A124,57,A143,A151,1,A172,1,A191,A201,1 +1221741,A13,12,A32,A42,1474,A61,A72,4,A92,A101,1,A122,33,A141,A152,1,A174,1,A192,A201,1 +3977061,A12,45,A34,A43,4746,A61,A72,4,A93,A101,2,A122,25,A143,A152,2,A172,1,A191,A201,2 +6028350,A14,48,A34,A46,6110,A61,A73,1,A93,A101,3,A124,31,A141,A153,1,A173,1,A192,A201,1 +2903516,A13,18,A32,A43,2100,A61,A73,4,A93,A102,2,A121,37,A142,A152,1,A173,1,A191,A201,2 +8335289,A13,10,A32,A44,1225,A61,A73,2,A93,A101,2,A123,37,A143,A152,1,A173,1,A192,A201,1 +8596814,A12,9,A32,A43,458,A61,A73,4,A93,A101,3,A121,24,A143,A152,1,A173,1,A191,A201,1 +1473485,A14,30,A32,A43,2333,A63,A75,4,A93,A101,2,A123,30,A141,A152,1,A174,1,A191,A201,1 +5043986,A12,12,A32,A43,1158,A63,A73,3,A91,A101,1,A123,26,A143,A152,1,A173,1,A192,A201,1 +7015188,A12,18,A33,A45,6204,A61,A73,2,A93,A101,4,A121,44,A143,A152,1,A172,2,A192,A201,1 +5609809,A11,30,A34,A41,6187,A62,A74,1,A94,A101,4,A123,24,A143,A151,2,A173,1,A191,A201,1 +6574474,A11,48,A34,A41,6143,A61,A75,4,A92,A101,4,A124,58,A142,A153,2,A172,1,A191,A201,2 +6668437,A14,11,A34,A40,1393,A61,A72,4,A92,A101,4,A123,35,A143,A152,2,A174,1,A191,A201,1 +7468489,A14,36,A32,A43,2299,A63,A75,4,A93,A101,4,A123,39,A143,A152,1,A173,1,A191,A201,1 
+6045059,A11,6,A32,A41,1352,A63,A71,1,A92,A101,2,A122,23,A143,A151,1,A171,1,A192,A201,1 +3678840,A14,11,A34,A40,7228,A61,A73,1,A93,A101,4,A122,39,A143,A152,2,A172,1,A191,A201,1 +9514325,A14,12,A32,A43,2073,A62,A73,4,A92,A102,2,A121,28,A143,A152,1,A173,1,A191,A201,1 +7764002,A12,24,A33,A42,2333,A65,A72,4,A93,A101,2,A122,29,A141,A152,1,A172,1,A191,A201,1 +4029031,A12,27,A33,A41,5965,A61,A75,1,A93,A101,2,A123,30,A143,A152,2,A174,1,A192,A201,1 +1318782,A14,12,A32,A43,1262,A61,A73,3,A93,A101,2,A123,25,A143,A152,1,A173,1,A191,A201,1 +1442097,A14,18,A32,A41,3378,A65,A73,2,A93,A101,1,A122,31,A143,A152,1,A173,1,A192,A201,1 +7784864,A12,36,A33,A40,2225,A61,A75,4,A93,A101,4,A124,57,A141,A153,2,A173,1,A192,A201,2 +4572543,A14,6,A31,A40,783,A65,A73,1,A93,A103,2,A121,26,A142,A152,1,A172,2,A191,A201,1 +4365555,A12,12,A32,A43,6468,A65,A71,2,A93,A101,1,A124,52,A143,A152,1,A174,1,A192,A201,2 +6134562,A14,36,A34,A43,9566,A61,A73,2,A92,A101,2,A123,31,A142,A152,2,A173,1,A191,A201,1 +3409265,A13,18,A32,A40,1961,A61,A75,3,A92,A101,2,A123,23,A143,A152,1,A174,1,A191,A201,1 +1504574,A11,36,A34,A42,6229,A61,A72,4,A92,A102,4,A124,23,A143,A151,2,A172,1,A192,A201,2 +3256655,A12,9,A32,A49,1391,A61,A73,2,A94,A101,1,A121,27,A141,A152,1,A173,1,A192,A201,1 +8806796,A12,15,A34,A43,1537,A65,A75,4,A93,A103,4,A121,50,A143,A152,2,A173,1,A192,A201,1 +1364372,A12,36,A30,A49,1953,A61,A75,4,A93,A101,4,A124,61,A143,A153,1,A174,1,A192,A201,2 +8688712,A12,48,A30,A49,14421,A61,A73,2,A93,A101,2,A123,25,A143,A152,1,A173,1,A192,A201,2 +3747039,A14,24,A32,A43,3181,A61,A72,4,A92,A101,4,A122,26,A143,A152,1,A173,1,A192,A201,1 +3453642,A14,27,A32,A45,5190,A65,A75,4,A93,A101,4,A122,48,A143,A152,4,A173,2,A192,A201,1 +5520489,A14,12,A32,A43,2171,A61,A72,2,A92,A101,2,A123,29,A141,A152,1,A173,1,A191,A201,1 +1647906,A12,12,A32,A40,1007,A64,A73,4,A94,A101,1,A121,22,A143,A152,1,A173,1,A191,A201,1 +2987381,A14,36,A32,A46,1819,A61,A73,4,A93,A101,4,A124,37,A142,A153,1,A173,1,A192,A201,2 
+3683782,A14,36,A32,A43,2394,A65,A73,4,A92,A101,4,A123,25,A143,A152,1,A173,1,A191,A201,1 +6065675,A14,36,A32,A41,8133,A61,A73,1,A92,A101,2,A122,30,A141,A152,1,A173,1,A191,A201,1 +1510672,A14,7,A34,A43,730,A65,A75,4,A93,A101,2,A122,46,A143,A151,2,A172,1,A192,A201,1 +6390550,A11,8,A34,A410,1164,A61,A75,3,A93,A101,4,A124,51,A141,A153,2,A174,2,A192,A201,1 +6771352,A12,42,A34,A49,5954,A61,A74,2,A92,A101,1,A121,41,A141,A152,2,A172,1,A191,A201,1 +4802045,A11,36,A32,A46,1977,A65,A75,4,A93,A101,4,A124,40,A143,A152,1,A174,1,A192,A201,2 +8911390,A11,12,A34,A41,1526,A61,A75,4,A93,A101,4,A124,66,A143,A153,2,A174,1,A191,A201,1 +1139059,A11,42,A32,A43,3965,A61,A72,4,A93,A101,3,A123,34,A143,A152,1,A173,1,A191,A201,2 +7399701,A12,11,A33,A43,4771,A61,A74,2,A93,A101,4,A122,51,A143,A152,1,A173,1,A191,A201,1 +5604915,A14,54,A30,A41,9436,A65,A73,2,A93,A101,2,A122,39,A143,A152,1,A172,2,A191,A201,1 +7683335,A12,30,A32,A42,3832,A61,A72,2,A94,A101,1,A122,22,A143,A152,1,A173,1,A191,A201,1 +5503995,A14,24,A32,A43,5943,A65,A72,1,A92,A101,1,A123,44,A143,A152,2,A173,1,A192,A201,2 +8801062,A14,15,A32,A43,1213,A63,A75,4,A93,A101,3,A122,47,A142,A152,1,A173,1,A192,A201,1 +1817735,A14,18,A32,A49,1568,A62,A73,3,A92,A101,4,A122,24,A143,A151,1,A172,1,A191,A201,1 +1806409,A11,24,A32,A410,1755,A61,A75,4,A92,A103,4,A121,58,A143,A152,1,A172,1,A192,A201,1 +3028735,A11,10,A32,A43,2315,A61,A75,3,A93,A101,4,A121,52,A143,A152,1,A172,1,A191,A201,1 +3707969,A14,12,A34,A49,1412,A61,A73,4,A92,A103,2,A121,29,A143,A152,2,A174,1,A192,A201,1 +7327747,A12,18,A34,A42,1295,A61,A72,4,A92,A101,1,A122,27,A143,A152,2,A173,1,A191,A201,1 +5790827,A12,36,A32,A46,12612,A62,A73,1,A93,A101,4,A124,47,A143,A153,1,A173,2,A192,A201,2 +1812102,A11,18,A32,A40,2249,A62,A74,4,A93,A101,3,A123,30,A143,A152,1,A174,2,A192,A201,1 +4049219,A11,12,A30,A45,1108,A61,A74,4,A93,A101,3,A121,28,A143,A152,2,A173,1,A191,A201,2 +6177564,A14,12,A34,A43,618,A61,A75,4,A93,A101,4,A121,56,A143,A152,1,A173,1,A191,A201,1 
+9127163,A11,12,A34,A41,1409,A61,A75,4,A93,A101,3,A121,54,A143,A152,1,A173,1,A191,A201,1 +2628060,A14,12,A34,A43,797,A65,A75,4,A92,A101,3,A122,33,A141,A152,1,A172,2,A191,A201,2 +2031456,A13,24,A34,A42,3617,A65,A75,4,A93,A102,4,A124,20,A143,A151,2,A173,1,A191,A201,1 +3725303,A12,12,A32,A40,1318,A64,A75,4,A93,A101,4,A121,54,A143,A152,1,A173,1,A192,A201,1 +8326409,A12,54,A30,A49,15945,A61,A72,3,A93,A101,4,A124,58,A143,A151,1,A173,1,A192,A201,2 +1957928,A14,12,A34,A46,2012,A65,A74,4,A92,A101,2,A123,61,A143,A152,1,A173,1,A191,A201,1 +6428170,A12,18,A32,A49,2622,A62,A73,4,A93,A101,4,A123,34,A143,A152,1,A173,1,A191,A201,1 +4371483,A12,36,A34,A43,2337,A61,A75,4,A93,A101,4,A121,36,A143,A152,1,A173,1,A191,A201,1 +7476603,A12,20,A33,A41,7057,A65,A74,3,A93,A101,4,A122,36,A141,A151,2,A174,2,A192,A201,1 +2083425,A14,24,A32,A40,1469,A62,A75,4,A94,A101,4,A121,41,A143,A151,1,A172,1,A191,A201,1 +7663021,A12,36,A32,A43,2323,A61,A74,4,A93,A101,4,A123,24,A143,A151,1,A173,1,A191,A201,1 +8330126,A14,6,A33,A43,932,A61,A73,3,A92,A101,2,A121,24,A143,A152,1,A173,1,A191,A201,1 +4762252,A12,9,A34,A42,1919,A61,A74,4,A93,A101,3,A123,35,A143,A151,1,A173,1,A192,A201,1 +8744199,A14,12,A32,A41,2445,A65,A72,2,A94,A101,4,A123,26,A143,A151,1,A173,1,A192,A201,1 +7423252,A12,24,A34,A410,11938,A61,A73,2,A93,A102,3,A123,39,A143,A152,2,A174,2,A192,A201,2 +7539327,A14,18,A31,A40,6458,A61,A75,2,A93,A101,4,A124,39,A141,A152,2,A174,2,A192,A201,2 +9510423,A12,12,A32,A40,6078,A61,A74,2,A93,A101,2,A123,32,A143,A152,1,A173,1,A191,A201,1 +7532368,A11,24,A32,A42,7721,A65,A72,1,A92,A101,2,A122,30,A143,A152,1,A173,1,A192,A202,1 +1929784,A12,14,A32,A49,1410,A63,A75,1,A94,A101,2,A121,35,A143,A152,1,A173,1,A192,A201,1 +9517102,A13,6,A34,A40,1299,A61,A73,1,A93,A101,1,A121,74,A143,A152,3,A171,2,A191,A202,1 +2801710,A12,9,A32,A42,918,A61,A73,4,A92,A101,1,A122,30,A143,A152,1,A173,1,A191,A201,2 +3066581,A12,6,A33,A49,1449,A62,A75,1,A91,A101,2,A123,31,A141,A152,2,A173,2,A191,A201,1 
+2924368,A13,15,A32,A46,392,A61,A72,4,A92,A101,4,A122,23,A143,A151,1,A173,1,A192,A201,1 +5814710,A12,18,A32,A40,6260,A61,A74,3,A93,A101,3,A121,28,A143,A151,1,A172,1,A191,A201,1 +5186756,A14,36,A34,A40,7855,A61,A73,4,A92,A101,2,A121,25,A142,A152,2,A173,1,A192,A201,2 +3473006,A11,12,A32,A43,1680,A63,A75,3,A94,A101,1,A121,35,A143,A152,1,A173,1,A191,A201,1 +7491936,A14,48,A34,A43,3578,A65,A75,4,A93,A101,1,A121,47,A143,A152,1,A173,1,A192,A201,1 +8010233,A11,42,A32,A43,7174,A65,A74,4,A92,A101,3,A123,30,A143,A152,1,A174,1,A192,A201,2 +4298578,A11,10,A34,A42,2132,A65,A72,2,A92,A102,3,A121,27,A143,A151,2,A173,1,A191,A202,1 +2515323,A11,33,A34,A42,4281,A63,A73,1,A92,A101,4,A123,23,A143,A152,2,A173,1,A191,A201,2 +2317941,A12,12,A34,A40,2366,A63,A74,3,A91,A101,3,A123,36,A143,A152,1,A174,1,A192,A201,1 +3906586,A11,21,A32,A43,1835,A61,A73,3,A92,A101,2,A121,25,A143,A152,2,A173,1,A192,A201,2 +4863589,A14,24,A34,A41,3868,A61,A75,4,A92,A101,2,A123,41,A143,A151,2,A174,1,A192,A201,1 +4625102,A14,12,A32,A42,1768,A61,A73,3,A93,A101,2,A121,24,A143,A151,1,A172,1,A191,A201,1 +8954004,A13,10,A34,A40,781,A61,A75,4,A93,A101,4,A124,63,A143,A153,2,A173,1,A192,A201,1 +1498295,A12,18,A32,A42,1924,A65,A72,4,A92,A101,3,A121,27,A143,A151,1,A173,1,A191,A201,2 +4851363,A11,12,A34,A40,2121,A61,A73,4,A93,A101,2,A122,30,A143,A152,2,A173,1,A191,A201,1 +1018706,A11,12,A32,A43,701,A61,A73,4,A94,A101,2,A121,40,A143,A152,1,A172,1,A191,A201,1 +9208119,A12,12,A32,A45,639,A61,A73,4,A93,A101,2,A123,30,A143,A152,1,A173,1,A191,A201,2 +4300293,A12,12,A34,A41,1860,A61,A71,4,A93,A101,2,A123,34,A143,A152,2,A174,1,A192,A201,1 +7728014,A11,12,A34,A40,3499,A61,A73,3,A92,A102,2,A121,29,A143,A152,2,A173,1,A191,A201,2 +7990216,A12,48,A32,A40,8487,A65,A74,1,A92,A101,2,A123,24,A143,A152,1,A173,1,A191,A201,1 +1318029,A11,36,A33,A46,6887,A61,A73,4,A93,A101,3,A122,29,A142,A152,1,A173,1,A192,A201,2 +9949768,A14,15,A32,A42,2708,A61,A72,2,A93,A101,3,A122,27,A141,A152,2,A172,1,A191,A201,1 
+7906171,A14,18,A32,A42,1984,A61,A73,4,A93,A101,4,A124,47,A141,A153,2,A173,1,A191,A201,1 +4495501,A14,60,A32,A43,10144,A62,A74,2,A92,A101,4,A121,21,A143,A152,1,A173,1,A192,A201,1 +4891536,A14,12,A34,A43,1240,A65,A75,4,A92,A101,2,A121,38,A143,A152,2,A173,1,A192,A201,1 +6332313,A14,27,A33,A41,8613,A64,A73,2,A93,A101,2,A123,27,A143,A152,2,A173,1,A191,A201,1 +2114106,A12,12,A32,A43,766,A63,A73,4,A93,A101,3,A121,66,A143,A152,1,A172,1,A191,A201,2 +8221756,A12,15,A34,A43,2728,A65,A74,4,A93,A103,2,A121,35,A141,A152,3,A173,1,A192,A201,1 +7308446,A13,12,A32,A43,1881,A61,A73,2,A92,A101,2,A123,44,A143,A151,1,A172,1,A192,A201,1 +5759358,A13,6,A32,A40,709,A64,A72,2,A94,A101,2,A121,27,A143,A152,1,A171,1,A191,A202,1 +1432196,A12,36,A32,A43,4795,A61,A72,4,A92,A101,1,A124,30,A143,A152,1,A174,1,A192,A201,1 +2429343,A11,27,A32,A43,3416,A61,A73,3,A93,A101,2,A123,27,A143,A152,1,A174,1,A191,A201,1 +1581866,A11,18,A32,A42,2462,A61,A73,2,A93,A101,2,A123,22,A143,A152,1,A173,1,A191,A201,2 +6989622,A14,21,A34,A42,2288,A61,A72,4,A92,A101,4,A122,23,A143,A152,1,A173,1,A192,A201,1 +2025676,A12,48,A31,A49,3566,A62,A74,4,A93,A101,2,A123,30,A143,A152,1,A173,1,A191,A201,1 +3000727,A11,6,A34,A40,860,A61,A75,1,A92,A101,4,A124,39,A143,A152,2,A173,1,A192,A201,1 +3953814,A14,12,A34,A40,682,A62,A74,4,A92,A101,3,A123,51,A143,A152,2,A173,1,A192,A201,1 +7408825,A11,36,A34,A42,5371,A61,A73,3,A93,A103,2,A122,28,A143,A152,2,A173,1,A191,A201,1 +6226321,A14,18,A34,A43,1582,A64,A75,4,A93,A101,4,A123,46,A143,A152,2,A173,1,A191,A201,1 +6083792,A14,6,A32,A43,1346,A62,A75,2,A93,A101,4,A124,42,A141,A153,1,A173,2,A192,A201,1 +5122662,A14,10,A32,A43,1924,A61,A73,1,A93,A101,4,A122,38,A143,A152,1,A173,1,A192,A202,1 +9746441,A13,36,A32,A43,5848,A61,A73,4,A93,A101,1,A123,24,A143,A152,1,A173,1,A191,A201,1 +7353198,A12,24,A34,A41,7758,A64,A75,2,A92,A101,4,A124,29,A143,A151,1,A173,1,A191,A201,1 +7766762,A12,24,A33,A49,6967,A62,A74,4,A93,A101,4,A123,36,A143,A151,1,A174,1,A192,A201,1 
+5175459,A11,12,A32,A42,1282,A61,A73,2,A92,A101,4,A123,20,A143,A151,1,A173,1,A191,A201,2 +5520884,A11,9,A34,A45,1288,A62,A75,3,A93,A103,4,A121,48,A143,A152,2,A173,2,A191,A202,1 +1137438,A11,12,A31,A48,339,A61,A75,4,A94,A101,1,A123,45,A141,A152,1,A172,1,A191,A201,1 +3892302,A12,24,A32,A40,3512,A62,A74,2,A93,A101,3,A123,38,A141,A152,2,A173,1,A192,A201,1 +9957275,A14,6,A34,A43,1898,A65,A73,1,A93,A101,2,A121,34,A143,A152,2,A172,2,A191,A201,1 +4695507,A14,24,A34,A43,2872,A62,A75,3,A93,A101,4,A121,36,A143,A152,1,A173,2,A192,A201,1 +7775118,A14,18,A34,A40,1055,A61,A72,4,A92,A101,1,A122,30,A143,A152,2,A173,1,A191,A201,1 +4717846,A14,15,A32,A44,1262,A63,A74,4,A93,A101,3,A122,36,A143,A152,2,A173,1,A192,A201,1 +3539651,A12,10,A32,A40,7308,A61,A71,2,A93,A101,4,A124,70,A141,A153,1,A174,1,A192,A201,1 +6446799,A14,36,A32,A40,909,A63,A75,4,A93,A101,4,A122,36,A143,A152,1,A173,1,A191,A201,1 +5749718,A14,6,A32,A42,2978,A63,A73,1,A93,A101,2,A123,32,A143,A152,1,A173,1,A192,A201,1 +8930405,A11,18,A32,A42,1131,A61,A71,4,A92,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,2 +4030908,A12,11,A32,A42,1577,A64,A72,4,A92,A101,1,A121,20,A143,A152,1,A173,1,A191,A201,1 +1327333,A14,24,A32,A42,3972,A61,A74,2,A92,A101,4,A122,25,A143,A151,1,A173,1,A192,A201,1 +5329060,A12,24,A34,A49,1935,A61,A75,4,A91,A101,4,A121,31,A143,A152,2,A173,1,A192,A201,2 +7922005,A11,15,A30,A40,950,A61,A75,4,A93,A101,3,A123,33,A143,A151,2,A173,2,A191,A201,2 +6450905,A14,12,A32,A42,763,A61,A73,4,A92,A101,1,A121,26,A143,A152,1,A173,1,A192,A201,1 +1909580,A12,24,A33,A42,2064,A61,A71,3,A92,A101,2,A122,34,A143,A152,1,A174,1,A192,A201,2 +2714562,A12,8,A32,A43,1414,A61,A73,4,A93,A103,2,A121,33,A143,A152,1,A173,1,A191,A202,1 +6826470,A11,21,A33,A46,3414,A61,A72,2,A93,A101,1,A122,26,A143,A152,2,A173,1,A191,A201,2 +4340005,A14,30,A31,A41,7485,A65,A71,4,A92,A101,1,A121,53,A141,A152,1,A174,1,A192,A201,2 +7800786,A11,12,A32,A42,2577,A61,A73,2,A91,A101,1,A123,42,A143,A152,1,A173,1,A191,A201,1 
+9266615,A11,6,A34,A43,338,A63,A75,4,A93,A101,4,A123,52,A143,A152,2,A173,1,A191,A201,1 +6212932,A14,12,A32,A43,1963,A61,A74,4,A93,A101,2,A123,31,A143,A151,2,A174,2,A192,A201,1 +5419741,A11,21,A34,A40,571,A61,A75,4,A93,A101,4,A121,65,A143,A152,2,A173,1,A191,A201,1 +9177065,A14,36,A33,A49,9572,A61,A72,1,A91,A101,1,A123,28,A143,A152,2,A173,1,A191,A201,2 +9804802,A12,36,A33,A49,4455,A61,A73,2,A91,A101,2,A121,30,A142,A152,2,A174,1,A192,A201,2 +8560790,A11,21,A31,A40,1647,A65,A73,4,A93,A101,2,A122,40,A143,A152,2,A172,2,A191,A201,2 +2450622,A14,24,A34,A42,3777,A64,A73,4,A93,A101,4,A121,50,A143,A152,1,A173,1,A192,A201,1 +6376471,A12,18,A34,A40,884,A61,A75,4,A93,A101,4,A123,36,A141,A152,1,A173,2,A192,A201,2 +2549308,A14,15,A34,A43,1360,A61,A73,4,A93,A101,2,A122,31,A143,A152,2,A173,1,A191,A201,1 +5285284,A12,9,A31,A41,5129,A61,A75,2,A92,A101,4,A124,74,A141,A153,1,A174,2,A192,A201,2 +3020208,A12,16,A34,A40,1175,A61,A71,2,A93,A101,3,A123,68,A143,A153,3,A171,1,A192,A201,1 +4186149,A11,12,A32,A43,674,A62,A74,4,A94,A101,1,A122,20,A143,A152,1,A173,1,A191,A201,2 +4599122,A12,18,A30,A42,3244,A61,A73,1,A92,A101,4,A123,33,A141,A152,2,A173,1,A192,A201,1 +1730668,A14,24,A32,A49,4591,A64,A73,2,A93,A101,3,A122,54,A143,A152,3,A174,1,A192,A201,2 +3082071,A12,48,A30,A49,3844,A62,A74,4,A93,A101,4,A124,34,A143,A153,1,A172,2,A191,A201,2 +5960666,A12,27,A32,A49,3915,A61,A73,4,A93,A101,2,A123,36,A143,A152,1,A173,2,A192,A201,2 +7033466,A14,6,A32,A43,2108,A61,A74,2,A94,A101,2,A121,29,A143,A151,1,A173,1,A191,A201,1 +4027322,A12,45,A32,A43,3031,A62,A73,4,A93,A103,4,A122,21,A143,A151,1,A173,1,A191,A201,2 +4067442,A12,9,A34,A46,1501,A61,A75,2,A92,A101,3,A123,34,A143,A152,2,A174,1,A192,A201,2 +2816577,A14,6,A34,A43,1382,A61,A73,1,A92,A101,1,A123,28,A143,A152,2,A173,1,A192,A201,1 +2964220,A12,12,A32,A42,951,A62,A72,4,A92,A101,4,A123,27,A141,A151,4,A173,1,A191,A201,2 +4483104,A12,24,A32,A41,2760,A65,A75,4,A93,A101,4,A124,36,A141,A153,1,A173,1,A192,A201,1 
+6604417,A12,18,A33,A42,4297,A61,A75,4,A91,A101,3,A124,40,A143,A152,1,A174,1,A192,A201,2 +7854022,A14,9,A34,A46,936,A63,A75,4,A93,A101,2,A123,52,A143,A152,2,A173,1,A192,A201,1 +5232363,A11,12,A32,A40,1168,A61,A73,4,A94,A101,3,A121,27,A143,A152,1,A172,1,A191,A201,1 +7627208,A14,27,A33,A49,5117,A61,A74,3,A93,A101,4,A123,26,A143,A152,2,A173,1,A191,A201,1 +8257351,A11,12,A32,A48,902,A61,A74,4,A94,A101,4,A122,21,A143,A151,1,A173,1,A191,A201,2 +4920965,A14,12,A34,A40,1495,A61,A75,4,A93,A101,1,A121,38,A143,A152,2,A172,2,A191,A201,1 +7198693,A11,30,A34,A41,10623,A61,A75,3,A93,A101,4,A124,38,A143,A153,3,A174,2,A192,A201,1 +6463715,A14,12,A34,A42,1935,A61,A75,4,A93,A101,4,A121,43,A143,A152,3,A173,1,A192,A201,1 +3336919,A12,12,A34,A44,1424,A61,A74,4,A93,A101,3,A122,26,A143,A152,1,A173,1,A191,A201,1 +1727035,A11,24,A32,A49,6568,A61,A73,2,A94,A101,2,A123,21,A142,A152,1,A172,1,A191,A201,1 +1425487,A14,12,A32,A41,1413,A64,A74,3,A93,A101,2,A122,55,A143,A152,1,A173,1,A191,A202,1 +3808400,A14,9,A34,A43,3074,A65,A73,1,A93,A101,2,A121,33,A143,A152,2,A173,2,A191,A201,1 +9925070,A14,36,A32,A43,3835,A65,A75,2,A92,A101,4,A121,45,A143,A152,1,A172,1,A192,A201,1 +3589132,A11,27,A30,A49,5293,A61,A71,2,A93,A101,4,A122,50,A142,A152,2,A173,1,A192,A201,2 +1102227,A13,30,A33,A49,1908,A61,A75,4,A93,A101,4,A121,66,A143,A152,1,A174,1,A192,A201,2 +3188138,A14,36,A34,A43,3342,A65,A75,4,A93,A101,2,A123,51,A143,A152,1,A173,1,A192,A201,1 +2870117,A12,6,A34,A48,932,A65,A74,1,A92,A101,3,A122,39,A143,A152,2,A172,1,A191,A201,1 +3212489,A11,18,A30,A49,3104,A61,A74,3,A93,A101,1,A122,31,A141,A152,1,A173,1,A192,A201,1 +8004570,A13,36,A32,A43,3913,A61,A73,2,A93,A101,2,A121,23,A143,A152,1,A173,1,A192,A201,1 +1878782,A11,24,A32,A42,3021,A61,A73,2,A91,A101,2,A121,24,A143,A151,1,A172,1,A191,A201,1 +8474591,A14,10,A32,A40,1364,A61,A73,2,A92,A101,4,A123,64,A143,A152,1,A173,1,A192,A201,1 +7696088,A12,12,A32,A43,625,A61,A72,4,A94,A103,1,A121,26,A141,A152,1,A172,1,A191,A201,1 
+6328300,A11,12,A32,A46,1200,A65,A73,4,A92,A101,4,A122,23,A141,A151,1,A173,1,A192,A201,1 +2321864,A14,12,A32,A43,707,A61,A73,4,A93,A101,2,A121,30,A141,A152,2,A173,1,A191,A201,1 +4113839,A14,24,A33,A49,2978,A65,A73,4,A93,A101,4,A121,32,A143,A152,2,A173,2,A192,A201,1 +6328203,A14,15,A32,A41,4657,A61,A73,3,A93,A101,2,A123,30,A143,A152,1,A173,1,A192,A201,1 +4449813,A14,36,A30,A45,2613,A61,A73,4,A93,A101,2,A123,27,A143,A152,2,A173,1,A191,A201,1 +2935635,A12,48,A32,A43,10961,A64,A74,1,A93,A102,2,A124,27,A141,A152,2,A173,1,A192,A201,2 +4544060,A11,12,A32,A42,7865,A61,A75,4,A93,A101,4,A124,53,A143,A153,1,A174,1,A192,A201,2 +8787497,A14,9,A32,A43,1478,A61,A74,4,A93,A101,2,A123,22,A143,A152,1,A173,1,A191,A201,2 +5985557,A11,24,A32,A42,3149,A61,A72,4,A93,A101,1,A124,22,A141,A153,1,A173,1,A191,A201,1 +7271945,A13,36,A32,A43,4210,A61,A73,4,A93,A101,2,A123,26,A143,A152,1,A173,1,A191,A201,2 +1813988,A14,9,A32,A40,2507,A63,A75,2,A93,A101,4,A124,51,A143,A153,1,A172,1,A191,A201,1 +7860808,A14,12,A32,A43,2141,A62,A74,3,A93,A101,1,A124,35,A143,A152,1,A173,1,A191,A201,1 +2709717,A12,18,A32,A43,866,A61,A73,4,A94,A103,2,A121,25,A143,A152,1,A172,1,A191,A201,1 +8437780,A14,4,A34,A43,1544,A61,A74,2,A93,A101,1,A121,42,A143,A152,3,A172,2,A191,A201,1 +9635061,A11,24,A32,A43,1823,A61,A71,4,A93,A101,2,A123,30,A142,A152,1,A174,2,A191,A201,2 +7532911,A12,6,A32,A40,14555,A65,A71,1,A93,A101,2,A122,23,A143,A152,1,A171,1,A192,A201,2 +7932675,A12,21,A32,A49,2767,A62,A75,4,A91,A101,2,A123,61,A141,A151,2,A172,1,A191,A201,2 +7028331,A14,12,A34,A43,1291,A61,A73,4,A92,A101,2,A122,35,A143,A152,2,A173,1,A191,A201,1 +5361518,A11,30,A32,A43,2522,A61,A75,1,A93,A103,3,A122,39,A143,A152,1,A173,2,A191,A201,1 +2444643,A11,24,A32,A40,915,A65,A75,4,A92,A101,2,A123,29,A141,A152,1,A173,1,A191,A201,2 +2566697,A14,6,A32,A43,1595,A61,A74,3,A93,A101,2,A122,51,A143,A152,1,A173,2,A191,A201,1 +4776602,A12,12,A32,A43,1155,A61,A75,3,A94,A103,3,A121,40,A141,A152,2,A172,1,A191,A201,1 
+7860808,A14,12,A32,A43,2141,A62,A74,3,A93,A101,1,A124,35,A143,A152,1,A173,1,A191,A201,1 +5490556,A11,48,A30,A41,4605,A61,A75,3,A93,A101,4,A124,24,A143,A153,2,A173,2,A191,A201,2 +9532249,A14,12,A34,A49,1185,A61,A73,3,A92,A101,2,A121,27,A143,A152,2,A173,1,A191,A201,1 +8266385,A14,12,A31,A48,3447,A63,A73,4,A92,A101,3,A121,35,A143,A152,1,A172,2,A191,A201,1 +2445116,A14,24,A32,A49,1258,A61,A74,4,A93,A101,1,A121,25,A143,A152,1,A173,1,A192,A201,1 +4852106,A14,12,A34,A43,717,A61,A75,4,A93,A101,4,A121,52,A143,A152,3,A173,1,A191,A201,1 +9087286,A14,6,A30,A40,1204,A62,A73,4,A93,A101,1,A124,35,A141,A151,1,A173,1,A191,A202,1 +7735589,A13,24,A32,A42,1925,A61,A73,2,A93,A101,2,A121,26,A143,A152,1,A173,1,A191,A201,1 +5075795,A14,18,A32,A43,433,A61,A71,3,A92,A102,4,A121,22,A143,A151,1,A173,1,A191,A201,2 +9404354,A11,6,A34,A40,666,A64,A74,3,A92,A101,4,A121,39,A143,A152,2,A172,1,A192,A201,1 +7374672,A13,12,A32,A42,2251,A61,A73,1,A92,A101,2,A123,46,A143,A152,1,A172,1,A191,A201,1 +5537602,A12,30,A32,A40,2150,A61,A73,4,A92,A103,2,A124,24,A141,A152,1,A173,1,A191,A201,2 +2779845,A14,24,A33,A42,4151,A62,A73,2,A93,A101,3,A122,35,A143,A152,2,A173,1,A191,A201,1 +9096946,A12,9,A32,A42,2030,A65,A74,2,A93,A101,1,A123,24,A143,A152,1,A173,1,A192,A201,1 +2649014,A12,60,A33,A43,7418,A65,A73,1,A93,A101,1,A121,27,A143,A152,1,A172,1,A191,A201,1 +8542371,A14,24,A34,A43,2684,A61,A73,4,A93,A101,2,A121,35,A143,A152,2,A172,1,A191,A201,1 +3719734,A11,12,A31,A43,2149,A61,A73,4,A91,A101,1,A124,29,A143,A153,1,A173,1,A191,A201,2 +5769297,A14,15,A32,A41,3812,A62,A72,1,A92,A101,4,A123,23,A143,A152,1,A173,1,A192,A201,1 +8002907,A14,11,A34,A43,1154,A62,A71,4,A92,A101,4,A121,57,A143,A152,3,A172,1,A191,A201,1 +4171983,A11,12,A32,A42,1657,A61,A73,2,A93,A101,2,A121,27,A143,A152,1,A173,1,A191,A201,1 +7094119,A11,24,A32,A43,1603,A61,A75,4,A92,A101,4,A123,55,A143,A152,1,A173,1,A191,A201,1 +5079269,A11,18,A34,A40,5302,A61,A75,2,A93,A101,4,A124,36,A143,A153,3,A174,1,A192,A201,1 
+8926902,A14,12,A34,A46,2748,A61,A75,2,A92,A101,4,A124,57,A141,A153,3,A172,1,A191,A201,1 +5810771,A14,10,A34,A40,1231,A61,A75,3,A93,A101,4,A121,32,A143,A152,2,A172,2,A191,A202,1 +7823319,A12,15,A32,A43,802,A61,A75,4,A93,A101,3,A123,37,A143,A152,1,A173,2,A191,A201,2 +5303576,A14,36,A34,A49,6304,A65,A75,4,A93,A101,4,A121,36,A143,A152,2,A173,1,A191,A201,1 +3572402,A14,24,A32,A43,1533,A61,A72,4,A92,A101,3,A123,38,A142,A152,1,A173,1,A192,A201,1 +3892416,A11,14,A32,A40,8978,A61,A75,1,A91,A101,4,A122,45,A143,A152,1,A174,1,A192,A202,2 +4517508,A14,24,A32,A43,999,A65,A75,4,A93,A101,2,A123,25,A143,A152,2,A173,1,A191,A201,1 +8060870,A14,18,A32,A40,2662,A65,A74,4,A93,A101,3,A122,32,A143,A152,1,A173,1,A191,A202,1 +2081683,A14,12,A34,A42,1402,A63,A74,3,A92,A101,4,A123,37,A143,A151,1,A173,1,A192,A201,1 +6752061,A12,48,A31,A40,12169,A65,A71,4,A93,A102,4,A124,36,A143,A153,1,A174,1,A192,A201,1 +7081559,A12,48,A32,A43,3060,A61,A74,4,A93,A101,4,A121,28,A143,A152,2,A173,1,A191,A201,2 +9948056,A11,30,A32,A45,11998,A61,A72,1,A91,A101,1,A124,34,A143,A152,1,A172,1,A192,A201,2 +4447932,A14,9,A32,A43,2697,A61,A73,1,A93,A101,2,A121,32,A143,A152,1,A173,2,A191,A201,1 +5069388,A14,18,A34,A43,2404,A61,A73,2,A92,A101,2,A123,26,A143,A152,2,A173,1,A191,A201,1 +5910940,A11,12,A32,A42,1262,A65,A75,2,A91,A101,4,A122,49,A143,A152,1,A172,1,A192,A201,1 +3733764,A14,6,A32,A42,4611,A61,A72,1,A92,A101,4,A122,32,A143,A152,1,A173,1,A191,A201,2 +5012199,A14,24,A32,A43,1901,A62,A73,4,A93,A101,4,A123,29,A143,A151,1,A174,1,A192,A201,1 +6009011,A14,15,A34,A41,3368,A64,A75,3,A93,A101,4,A124,23,A143,A151,2,A173,1,A192,A201,1 +5953648,A14,12,A32,A42,1574,A61,A73,4,A93,A101,2,A121,50,A143,A152,1,A173,1,A191,A201,1 +1511501,A13,18,A31,A43,1445,A65,A74,4,A93,A101,4,A123,49,A141,A152,1,A172,1,A191,A201,1 +5113857,A14,15,A34,A42,1520,A65,A75,4,A93,A101,4,A122,63,A143,A152,1,A173,1,A191,A201,1 +3627592,A12,24,A34,A40,3878,A62,A72,4,A91,A101,2,A123,37,A143,A152,1,A173,1,A192,A201,1 
+8803488,A11,47,A32,A40,10722,A61,A72,1,A92,A101,1,A121,35,A143,A152,1,A172,1,A192,A201,1 +9505849,A11,48,A32,A41,4788,A61,A74,4,A93,A101,3,A122,26,A143,A152,1,A173,2,A191,A201,1 +3200360,A12,48,A33,A410,7582,A62,A71,2,A93,A101,4,A124,31,A143,A153,1,A174,1,A192,A201,1 +5155553,A12,12,A32,A43,1092,A61,A73,4,A92,A103,4,A121,49,A143,A152,2,A173,1,A192,A201,1 +1380636,A11,24,A33,A43,1024,A61,A72,4,A94,A101,4,A121,48,A142,A152,1,A173,1,A191,A201,2 +3468954,A14,12,A32,A49,1076,A61,A73,2,A94,A101,2,A121,26,A143,A152,1,A173,1,A192,A202,1 +1526208,A12,36,A32,A41,9398,A61,A72,1,A94,A101,4,A123,28,A143,A151,1,A174,1,A192,A201,2 +2689247,A11,24,A34,A41,6419,A61,A75,2,A92,A101,4,A124,44,A143,A153,2,A174,2,A192,A201,1 +2359358,A13,42,A34,A41,4796,A61,A75,4,A93,A101,4,A124,56,A143,A153,1,A173,1,A191,A201,1 +1099632,A14,48,A34,A49,7629,A65,A75,4,A91,A101,2,A123,46,A141,A152,2,A174,2,A191,A201,1 +3753189,A12,48,A32,A42,9960,A61,A72,1,A92,A101,2,A123,26,A143,A152,1,A173,1,A192,A201,2 +9668162,A14,12,A32,A41,4675,A65,A72,1,A92,A101,4,A123,20,A143,A151,1,A173,1,A191,A201,1 +4302438,A14,10,A32,A40,1287,A65,A75,4,A93,A102,2,A122,45,A143,A152,1,A172,1,A191,A202,1 +1985883,A14,18,A32,A42,2515,A61,A73,3,A93,A101,4,A121,43,A143,A152,1,A173,1,A192,A201,1 +7530685,A12,21,A34,A42,2745,A64,A74,3,A93,A101,2,A123,32,A143,A152,2,A173,1,A192,A201,1 +3965440,A14,6,A32,A40,672,A61,A71,1,A92,A101,4,A121,54,A143,A152,1,A171,1,A192,A201,1 +9545127,A12,36,A30,A43,3804,A61,A73,4,A92,A101,1,A123,42,A143,A152,1,A173,1,A192,A201,2 +2663117,A13,24,A34,A40,1344,A65,A74,4,A93,A101,2,A121,37,A141,A152,2,A172,2,A191,A201,2 +8785566,A11,10,A34,A40,1038,A61,A74,4,A93,A102,3,A122,49,A143,A152,2,A173,1,A192,A201,1 +8697852,A14,48,A34,A40,10127,A63,A73,2,A93,A101,2,A124,44,A141,A153,1,A173,1,A191,A201,2 +9910337,A14,6,A32,A42,1543,A64,A73,4,A91,A101,2,A121,33,A143,A152,1,A173,1,A191,A201,1 +7200952,A14,30,A32,A41,4811,A65,A74,2,A92,A101,4,A122,24,A142,A151,1,A172,1,A191,A201,1 
+2482842,A11,12,A32,A43,727,A62,A72,4,A94,A101,3,A124,33,A143,A152,1,A172,1,A192,A201,2 +5873455,A12,8,A32,A42,1237,A61,A73,3,A92,A101,4,A121,24,A143,A152,1,A173,1,A191,A201,2 +9224996,A12,9,A32,A40,276,A61,A73,4,A94,A101,4,A121,22,A143,A151,1,A172,1,A191,A201,1 +9804740,A12,48,A32,A410,5381,A65,A71,3,A93,A101,4,A124,40,A141,A153,1,A171,1,A192,A201,1 +5724920,A14,24,A32,A42,5511,A62,A73,4,A93,A101,1,A123,25,A142,A152,1,A173,1,A191,A201,1 +5401207,A13,24,A32,A42,3749,A61,A72,2,A92,A101,4,A123,26,A143,A152,1,A173,1,A191,A201,1 +6883393,A12,12,A32,A40,685,A61,A74,2,A94,A101,3,A123,25,A141,A152,1,A172,1,A191,A201,2 +9941505,A13,4,A32,A40,1494,A65,A72,1,A93,A101,2,A121,29,A143,A152,1,A172,2,A191,A202,1 +1310810,A11,36,A31,A42,2746,A61,A75,4,A93,A101,4,A123,31,A141,A152,1,A173,1,A191,A201,2 +6083949,A11,12,A32,A42,708,A61,A73,2,A93,A103,3,A122,38,A143,A152,1,A172,2,A191,A201,1 +7464596,A12,24,A32,A42,4351,A65,A73,1,A92,A101,4,A122,48,A143,A152,1,A172,1,A192,A201,1 +2298828,A14,12,A34,A46,701,A61,A73,4,A93,A101,2,A123,32,A143,A152,2,A173,1,A191,A201,1 +2953405,A11,15,A33,A42,3643,A61,A75,1,A92,A101,4,A122,27,A143,A152,2,A172,1,A191,A201,1 +5680012,A12,30,A34,A40,4249,A61,A71,4,A94,A101,2,A123,28,A143,A152,2,A174,1,A191,A201,2 +3243360,A11,24,A32,A43,1938,A61,A72,4,A91,A101,3,A122,32,A143,A152,1,A173,1,A191,A201,2 +6960053,A11,24,A32,A41,2910,A61,A74,2,A93,A101,1,A124,34,A143,A153,1,A174,1,A192,A201,1 +8870244,A11,18,A32,A42,2659,A64,A73,4,A93,A101,2,A123,28,A143,A152,1,A173,1,A191,A201,1 +9717422,A14,18,A34,A40,1028,A61,A73,4,A92,A101,3,A121,36,A143,A152,2,A173,1,A191,A201,1 +9554635,A11,8,A34,A40,3398,A61,A74,1,A93,A101,4,A121,39,A143,A152,2,A172,1,A191,A202,1 +3536846,A14,12,A34,A42,5801,A65,A75,2,A93,A101,4,A122,49,A143,A151,1,A173,1,A192,A201,1 +7951981,A14,24,A32,A40,1525,A64,A74,4,A92,A101,3,A123,34,A143,A152,1,A173,2,A192,A201,1 +5678097,A13,36,A32,A43,4473,A61,A75,4,A93,A101,2,A123,31,A143,A152,1,A173,1,A191,A201,1 
+1771168,A12,6,A32,A43,1068,A61,A75,4,A93,A101,4,A123,28,A143,A152,1,A173,2,A191,A201,1 +7161704,A11,24,A34,A41,6615,A61,A71,2,A93,A101,4,A124,75,A143,A153,2,A174,1,A192,A201,1 +8866806,A14,18,A34,A46,1864,A62,A73,4,A92,A101,2,A121,30,A143,A152,2,A173,1,A191,A201,2 +3955252,A12,60,A32,A40,7408,A62,A72,4,A92,A101,2,A122,24,A143,A152,1,A174,1,A191,A201,2 +2239899,A14,48,A34,A41,11590,A62,A73,2,A92,A101,4,A123,24,A141,A151,2,A172,1,A191,A201,2 +6209500,A11,24,A30,A42,4110,A61,A75,3,A93,A101,4,A124,23,A141,A151,2,A173,2,A191,A201,2 +7504124,A11,6,A34,A42,3384,A61,A73,1,A91,A101,4,A121,44,A143,A151,1,A174,1,A192,A201,2 +1114455,A12,13,A32,A43,2101,A61,A72,2,A92,A103,4,A122,23,A143,A152,1,A172,1,A191,A201,1 +1803583,A11,15,A32,A44,1275,A65,A73,4,A92,A101,2,A123,24,A143,A151,1,A173,1,A191,A201,2 +8565471,A11,24,A32,A42,4169,A61,A73,4,A93,A101,4,A122,28,A143,A152,1,A173,1,A191,A201,1 +5974627,A12,10,A32,A42,1521,A61,A73,4,A91,A101,2,A123,31,A143,A152,1,A172,1,A191,A201,1 +6416552,A12,24,A34,A46,5743,A61,A72,2,A92,A101,4,A124,24,A143,A153,2,A173,1,A192,A201,1 +5807460,A11,21,A32,A42,3599,A61,A74,1,A92,A101,4,A123,26,A143,A151,1,A172,1,A191,A201,1 +4314639,A12,18,A32,A43,3213,A63,A72,1,A94,A101,3,A121,25,A143,A151,1,A173,1,A191,A201,1 +3529638,A12,18,A32,A49,4439,A61,A75,1,A93,A102,1,A121,33,A141,A152,1,A174,1,A192,A201,1 +1222158,A13,10,A32,A40,3949,A61,A72,1,A93,A103,1,A122,37,A143,A152,1,A172,2,A191,A201,1 +1440915,A14,15,A34,A43,1459,A61,A73,4,A92,A101,2,A123,43,A143,A152,1,A172,1,A191,A201,1 +3930437,A12,13,A34,A43,882,A61,A72,4,A93,A103,4,A121,23,A143,A152,2,A173,1,A191,A201,1 +6612328,A12,24,A32,A43,3758,A63,A71,1,A92,A101,4,A124,23,A143,A151,1,A171,1,A191,A201,1 +9945625,A14,6,A33,A49,1743,A62,A73,1,A93,A101,2,A121,34,A143,A152,2,A172,1,A191,A201,1 +5001052,A12,9,A34,A46,1136,A64,A75,4,A93,A101,3,A124,32,A143,A153,2,A173,2,A191,A201,2 +4733785,A14,9,A32,A44,1236,A61,A72,1,A92,A101,4,A121,23,A143,A151,1,A173,1,A192,A201,1 
+1525448,A12,9,A32,A42,959,A61,A73,1,A92,A101,2,A123,29,A143,A152,1,A173,1,A191,A202,2 +6875226,A14,18,A34,A41,3229,A65,A71,2,A93,A101,4,A124,38,A143,A152,1,A174,1,A192,A201,1 +6189987,A11,12,A30,A43,6199,A61,A73,4,A93,A101,2,A122,28,A143,A151,2,A173,1,A192,A201,2 +1690342,A14,10,A32,A46,727,A63,A75,4,A93,A101,4,A124,46,A143,A153,1,A173,1,A192,A201,1 +5643344,A12,24,A32,A40,1246,A61,A72,4,A93,A101,2,A121,23,A142,A152,1,A172,1,A191,A201,2 +9430378,A14,12,A34,A43,2331,A65,A75,1,A93,A102,4,A121,49,A143,A152,1,A173,1,A192,A201,1 +9982432,A14,36,A33,A43,4463,A61,A73,4,A93,A101,2,A123,26,A143,A152,2,A174,1,A192,A201,2 +4344490,A14,12,A32,A43,776,A61,A73,4,A94,A101,2,A121,28,A143,A152,1,A173,1,A191,A201,1 +4256428,A11,30,A32,A42,2406,A61,A74,4,A92,A101,4,A121,23,A143,A151,1,A173,1,A191,A201,2 +9723360,A12,18,A32,A46,1239,A65,A73,4,A93,A101,4,A124,61,A143,A153,1,A173,1,A191,A201,1 +6762528,A13,12,A32,A43,3399,A65,A75,2,A93,A101,3,A123,37,A143,A152,1,A174,1,A191,A201,1 +5280442,A13,12,A33,A40,2247,A61,A73,2,A92,A101,2,A123,36,A142,A152,2,A173,1,A192,A201,1 +7631660,A14,6,A32,A42,1766,A61,A73,1,A94,A101,2,A122,21,A143,A151,1,A173,1,A191,A201,1 +9842159,A11,18,A32,A42,2473,A61,A71,4,A93,A101,1,A123,25,A143,A152,1,A171,1,A191,A201,2 +3343196,A14,12,A32,A49,1542,A61,A74,2,A93,A101,4,A123,36,A143,A152,1,A173,1,A192,A201,1 +1342256,A14,18,A34,A41,3850,A61,A74,3,A93,A101,1,A123,27,A143,A152,2,A173,1,A191,A201,1 +7966598,A11,18,A32,A42,3650,A61,A72,1,A92,A101,4,A123,22,A143,A151,1,A173,1,A191,A201,1 +9974040,A11,36,A32,A42,3446,A61,A75,4,A93,A101,2,A123,42,A143,A152,1,A173,2,A191,A201,2 +9518729,A12,18,A32,A42,3001,A61,A74,2,A92,A101,4,A121,40,A143,A151,1,A173,1,A191,A201,1 +6655507,A14,36,A32,A40,3079,A65,A73,4,A93,A101,4,A121,36,A143,A152,1,A173,1,A191,A201,1 +1296391,A14,18,A34,A43,6070,A61,A75,3,A93,A101,4,A123,33,A143,A152,2,A173,1,A192,A201,1 +2366810,A14,10,A34,A42,2146,A61,A72,1,A92,A101,3,A121,23,A143,A151,2,A173,1,A191,A201,1 
+1470952,A14,60,A34,A40,13756,A65,A75,2,A93,A101,4,A124,63,A141,A153,1,A174,1,A192,A201,1 +4427304,A12,60,A31,A410,14782,A62,A75,3,A92,A101,4,A124,60,A141,A153,2,A174,1,A192,A201,2 +3112559,A11,48,A31,A49,7685,A61,A74,2,A92,A103,4,A123,37,A143,A151,1,A173,1,A191,A201,2 +4850866,A14,18,A33,A43,2320,A61,A71,2,A94,A101,3,A121,34,A143,A152,2,A173,1,A191,A201,1 +1019796,A14,7,A33,A43,846,A65,A75,3,A93,A101,4,A124,36,A143,A153,1,A173,1,A191,A201,1 +1707851,A12,36,A32,A40,14318,A61,A75,4,A93,A101,2,A124,57,A143,A153,1,A174,1,A192,A201,2 +4197078,A14,6,A34,A40,362,A62,A73,4,A92,A101,4,A123,52,A143,A152,2,A172,1,A191,A201,1 +6008260,A11,20,A32,A42,2212,A65,A74,4,A93,A101,4,A123,39,A143,A152,1,A173,1,A192,A201,1 +4238613,A12,18,A32,A41,12976,A61,A71,3,A92,A101,4,A124,38,A143,A153,1,A174,1,A192,A201,2 +8881848,A14,22,A32,A40,1283,A65,A74,4,A92,A101,4,A122,25,A143,A151,1,A173,1,A191,A201,1 +1176152,A13,12,A32,A40,1330,A61,A72,4,A93,A101,1,A121,26,A143,A152,1,A173,1,A191,A201,1 +5328999,A14,30,A33,A49,4272,A62,A73,2,A93,A101,2,A122,26,A143,A152,2,A172,1,A191,A201,1 +2017205,A14,18,A34,A43,2238,A61,A73,2,A92,A101,1,A123,25,A143,A152,2,A173,1,A191,A201,1 +9326362,A14,18,A32,A43,1126,A65,A72,4,A92,A101,2,A121,21,A143,A151,1,A173,1,A192,A201,1 +6571397,A12,18,A34,A42,7374,A61,A71,4,A93,A101,4,A122,40,A142,A152,2,A174,1,A192,A201,1 +8705378,A12,15,A34,A49,2326,A63,A73,2,A93,A101,4,A123,27,A141,A152,1,A173,1,A191,A201,1 +3969916,A14,9,A32,A49,1449,A61,A74,3,A92,A101,2,A123,27,A143,A152,2,A173,1,A191,A201,1 +1797600,A14,18,A32,A40,1820,A61,A73,2,A94,A101,2,A122,30,A143,A152,1,A174,1,A192,A201,1 +8978303,A12,12,A32,A42,983,A64,A72,1,A92,A101,4,A121,19,A143,A151,1,A172,1,A191,A201,1 +9199505,A11,36,A32,A40,3249,A61,A74,2,A93,A101,4,A124,39,A141,A153,1,A174,2,A192,A201,1 +5334896,A11,6,A34,A43,1957,A61,A74,1,A92,A101,4,A123,31,A143,A152,1,A173,1,A191,A201,1 +5018277,A14,9,A34,A42,2406,A61,A71,2,A93,A101,3,A123,31,A143,A152,1,A174,1,A191,A201,1 
+9994482,A12,39,A33,A46,11760,A62,A74,2,A93,A101,3,A124,32,A143,A151,1,A173,1,A192,A201,1 +2520877,A11,12,A32,A42,2578,A61,A71,3,A92,A101,4,A124,55,A143,A153,1,A174,1,A191,A201,1 +9506380,A11,36,A34,A42,2348,A61,A73,3,A94,A101,2,A122,46,A143,A152,2,A173,1,A192,A201,1 +7043482,A12,12,A32,A40,1223,A61,A75,1,A91,A101,1,A121,46,A143,A151,2,A173,1,A191,A201,2 +1638879,A14,24,A34,A43,1516,A64,A73,4,A92,A101,1,A121,43,A143,A152,2,A172,1,A191,A201,1 +5296104,A14,18,A32,A43,1473,A61,A72,3,A94,A101,4,A121,39,A143,A152,1,A173,1,A192,A201,1 +7379526,A12,18,A34,A49,1887,A65,A73,4,A94,A101,4,A121,28,A141,A152,2,A173,1,A191,A201,1 +3276838,A14,24,A33,A49,8648,A61,A72,2,A93,A101,2,A123,27,A141,A152,2,A173,1,A192,A201,2 +8629007,A14,14,A33,A40,802,A61,A73,4,A93,A101,2,A123,27,A143,A152,2,A172,1,A191,A201,1 +6989363,A12,18,A33,A40,2899,A65,A75,4,A93,A101,4,A123,43,A143,A152,1,A173,2,A191,A201,1 +3486169,A12,24,A32,A43,2039,A61,A72,1,A94,A101,1,A122,22,A143,A152,1,A173,1,A192,A201,2 +5914287,A14,24,A34,A41,2197,A65,A74,4,A93,A101,4,A123,43,A143,A152,2,A173,2,A192,A201,1 +2842414,A11,15,A32,A43,1053,A61,A72,4,A94,A101,2,A121,27,A143,A152,1,A173,1,A191,A202,1 +6610185,A14,24,A32,A43,3235,A63,A75,3,A91,A101,2,A123,26,A143,A152,1,A174,1,A192,A201,1 +7974174,A13,12,A34,A40,939,A63,A74,4,A94,A101,2,A121,28,A143,A152,3,A173,1,A192,A201,2 +6076228,A12,24,A32,A43,1967,A61,A75,4,A92,A101,4,A123,20,A143,A152,1,A173,1,A192,A201,1 +2636238,A14,33,A34,A41,7253,A61,A74,3,A93,A101,2,A123,35,A143,A152,2,A174,1,A192,A201,1 +3698559,A14,12,A34,A49,2292,A61,A71,4,A93,A101,2,A123,42,A142,A152,2,A174,1,A192,A201,2 +1480210,A14,10,A32,A40,1597,A63,A73,3,A93,A101,2,A124,40,A143,A151,1,A172,2,A191,A202,1 +8545813,A11,24,A32,A40,1381,A65,A73,4,A92,A101,2,A122,35,A143,A152,1,A173,1,A191,A201,2 +8778324,A14,36,A34,A41,5842,A61,A75,2,A93,A101,2,A122,35,A143,A152,2,A173,2,A192,A201,1 +4645046,A11,12,A32,A40,2579,A61,A72,4,A93,A101,1,A121,33,A143,A152,1,A172,2,A191,A201,2 
+8672645,A11,18,A33,A46,8471,A65,A73,1,A92,A101,2,A123,23,A143,A151,2,A173,1,A192,A201,1 +6875256,A14,21,A32,A40,2782,A63,A74,1,A92,A101,2,A123,31,A141,A152,1,A174,1,A191,A201,1 +9774597,A12,18,A32,A40,1042,A65,A73,4,A92,A101,2,A122,33,A143,A152,1,A173,1,A191,A201,2 +4501534,A14,15,A32,A40,3186,A64,A74,2,A92,A101,3,A123,20,A143,A151,1,A173,1,A191,A201,1 +2225902,A12,12,A32,A41,2028,A65,A73,4,A93,A101,2,A123,30,A143,A152,1,A173,1,A191,A201,1 +5678751,A12,12,A34,A40,958,A61,A74,2,A93,A101,3,A121,47,A143,A152,2,A172,2,A191,A201,1 +5549074,A14,21,A33,A42,1591,A62,A74,4,A93,A101,3,A121,34,A143,A152,2,A174,1,A191,A201,1 +4247210,A12,12,A32,A42,2762,A65,A75,1,A92,A101,2,A122,25,A141,A152,1,A173,1,A192,A201,2 +2450182,A12,18,A32,A41,2779,A61,A73,1,A94,A101,3,A123,21,A143,A151,1,A173,1,A192,A201,1 +9966032,A14,28,A34,A43,2743,A61,A75,4,A93,A101,2,A123,29,A143,A152,2,A173,1,A191,A201,1 +3797662,A14,18,A34,A43,1149,A64,A73,4,A93,A101,3,A121,46,A143,A152,2,A173,1,A191,A201,1 +7318192,A14,9,A32,A42,1313,A61,A75,1,A93,A101,4,A123,20,A143,A152,1,A173,1,A191,A201,1 +8969170,A11,18,A34,A45,1190,A61,A71,2,A92,A101,4,A124,55,A143,A153,3,A171,2,A191,A201,2 +6376369,A14,5,A32,A49,3448,A61,A74,1,A93,A101,4,A121,74,A143,A152,1,A172,1,A191,A201,1 +7195551,A12,24,A32,A410,11328,A61,A73,2,A93,A102,3,A123,29,A141,A152,2,A174,1,A192,A201,2 +9907330,A11,6,A34,A42,1872,A61,A71,4,A93,A101,4,A124,36,A143,A153,3,A174,1,A192,A201,1 +3653212,A14,24,A34,A45,2058,A61,A73,4,A91,A101,2,A121,33,A143,A152,2,A173,1,A192,A201,1 +1329531,A11,9,A32,A42,2136,A61,A73,3,A93,A101,2,A121,25,A143,A152,1,A173,1,A191,A201,1 +1320796,A12,12,A32,A43,1484,A65,A73,2,A94,A101,1,A121,25,A143,A152,1,A173,1,A192,A201,2 +5337783,A14,6,A32,A45,660,A63,A74,2,A94,A101,4,A121,23,A143,A151,1,A172,1,A191,A201,1 +8339835,A14,24,A34,A40,1287,A64,A75,4,A92,A101,4,A121,37,A143,A152,2,A173,1,A192,A201,1 +9464784,A11,42,A34,A45,3394,A61,A71,4,A93,A102,4,A123,65,A143,A152,2,A171,1,A191,A201,1 
+3252011,A13,12,A31,A49,609,A61,A72,4,A92,A101,1,A121,26,A143,A152,1,A171,1,A191,A201,2 +4946613,A14,12,A32,A40,1884,A61,A75,4,A93,A101,4,A123,39,A143,A152,1,A174,1,A192,A201,1 +8418012,A11,12,A32,A42,1620,A61,A73,2,A92,A102,3,A122,30,A143,A152,1,A173,1,A191,A201,1 +7819002,A12,20,A33,A410,2629,A61,A73,2,A93,A101,3,A123,29,A141,A152,2,A173,1,A192,A201,1 +8650326,A14,12,A32,A46,719,A61,A75,4,A93,A101,4,A123,41,A141,A152,1,A172,2,A191,A201,2 +6493165,A12,48,A34,A42,5096,A61,A73,2,A92,A101,3,A123,30,A143,A152,1,A174,1,A192,A201,2 +8221917,A14,9,A34,A46,1244,A65,A75,4,A92,A101,4,A122,41,A143,A151,2,A172,1,A191,A201,1 +6793340,A11,36,A32,A40,1842,A61,A72,4,A92,A101,4,A123,34,A143,A152,1,A173,1,A192,A201,2 +5800337,A12,7,A32,A43,2576,A61,A73,2,A93,A103,2,A121,35,A143,A152,1,A173,1,A191,A201,1 +3033374,A13,12,A32,A42,1424,A65,A75,3,A92,A101,4,A121,55,A143,A152,1,A174,1,A192,A201,1 +6260138,A12,15,A33,A45,1512,A64,A73,3,A94,A101,3,A122,61,A142,A152,2,A173,1,A191,A201,2 +5464982,A14,36,A34,A41,11054,A65,A73,4,A93,A101,2,A123,30,A143,A152,1,A174,1,A192,A201,1 +1123635,A14,6,A32,A43,518,A61,A73,3,A92,A101,1,A121,29,A143,A152,1,A173,1,A191,A201,1 +7267566,A14,12,A30,A42,2759,A61,A75,2,A93,A101,4,A122,34,A143,A152,2,A173,1,A191,A201,1 +7836298,A14,24,A32,A41,2670,A61,A75,4,A93,A101,4,A123,35,A143,A152,1,A174,1,A192,A201,1 +5804062,A11,24,A32,A40,4817,A61,A74,2,A93,A102,3,A122,31,A143,A152,1,A173,1,A192,A201,2 +9314182,A14,24,A32,A41,2679,A61,A72,4,A92,A101,1,A124,29,A143,A152,1,A174,1,A192,A201,1 +1557803,A11,11,A34,A40,3905,A61,A73,2,A93,A101,2,A121,36,A143,A151,2,A173,2,A191,A201,1 +3608529,A11,12,A32,A41,3386,A61,A75,3,A93,A101,4,A124,35,A143,A153,1,A173,1,A192,A201,2 +8714298,A11,6,A32,A44,343,A61,A72,4,A92,A101,1,A121,27,A143,A152,1,A173,1,A191,A201,1 +2245784,A14,18,A32,A43,4594,A61,A72,3,A93,A101,2,A123,32,A143,A152,1,A173,1,A192,A201,1 +6049064,A11,36,A32,A42,3620,A61,A73,1,A93,A103,2,A122,37,A143,A152,1,A173,2,A191,A201,1 
+6641808,A11,15,A32,A40,1721,A61,A72,2,A93,A101,3,A121,36,A143,A152,1,A173,1,A191,A201,1 +5057956,A12,12,A32,A42,3017,A61,A72,3,A92,A101,1,A121,34,A143,A151,1,A174,1,A191,A201,1 +4441263,A12,12,A32,A48,754,A65,A75,4,A93,A101,4,A122,38,A143,A152,2,A173,1,A191,A201,1 +3291069,A14,18,A32,A49,1950,A61,A74,4,A93,A101,1,A123,34,A142,A152,2,A173,1,A192,A201,1 +7518282,A11,24,A32,A41,2924,A61,A73,3,A93,A103,4,A124,63,A141,A152,1,A173,2,A192,A201,1 +1425368,A11,24,A33,A43,1659,A61,A72,4,A92,A101,2,A123,29,A143,A151,1,A172,1,A192,A201,2 +7122922,A14,48,A33,A43,7238,A65,A75,3,A93,A101,3,A123,32,A141,A152,2,A173,2,A191,A201,1 +6752438,A14,33,A33,A49,2764,A61,A73,2,A92,A101,2,A123,26,A143,A152,2,A173,1,A192,A201,1 +6177001,A14,24,A33,A41,4679,A61,A74,3,A93,A101,3,A123,35,A143,A152,2,A172,1,A192,A201,1 +6983743,A12,24,A32,A43,3092,A62,A72,3,A94,A101,2,A123,22,A143,A151,1,A173,1,A192,A201,2 +6691152,A11,6,A32,A46,448,A61,A72,4,A92,A101,4,A122,23,A143,A152,1,A173,1,A191,A201,2 +8808032,A11,9,A32,A40,654,A61,A73,4,A93,A101,3,A123,28,A143,A152,1,A172,1,A191,A201,2 +9230494,A14,6,A32,A48,1238,A65,A71,4,A93,A101,4,A122,36,A143,A152,1,A174,2,A192,A201,1 +7173285,A12,18,A34,A43,1245,A61,A73,4,A94,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,2 +5890956,A11,18,A30,A42,3114,A61,A72,1,A92,A101,4,A122,26,A143,A151,1,A173,1,A191,A201,2 +9796998,A14,39,A32,A41,2569,A63,A73,4,A93,A101,4,A123,24,A143,A152,1,A173,1,A191,A201,1 +9197850,A13,24,A32,A43,5152,A61,A74,4,A93,A101,2,A123,25,A141,A152,1,A173,1,A191,A201,1 +2693563,A12,12,A32,A49,1037,A62,A74,3,A93,A101,4,A121,39,A143,A152,1,A172,1,A191,A201,1 +4912289,A11,15,A34,A42,1478,A61,A75,4,A93,A101,4,A123,44,A143,A152,2,A173,2,A192,A201,1 +6383336,A12,12,A34,A43,3573,A61,A73,1,A92,A101,1,A121,23,A143,A152,1,A172,1,A191,A201,1 +7709637,A12,24,A32,A40,1201,A61,A72,4,A93,A101,1,A122,26,A143,A152,1,A173,1,A191,A201,1 +4072676,A11,30,A32,A42,3622,A64,A75,4,A92,A101,4,A122,57,A143,A151,2,A173,1,A192,A201,1 
+9216806,A14,15,A33,A42,960,A64,A74,3,A92,A101,2,A122,30,A143,A152,2,A173,1,A191,A201,1 +2718715,A14,12,A34,A40,1163,A63,A73,4,A93,A101,4,A121,44,A143,A152,1,A173,1,A192,A201,1 +3865105,A12,6,A33,A40,1209,A61,A71,4,A93,A101,4,A122,47,A143,A152,1,A174,1,A192,A201,2 +9640833,A14,12,A32,A43,3077,A61,A73,2,A93,A101,4,A123,52,A143,A152,1,A173,1,A192,A201,1 +2667404,A14,24,A32,A40,3757,A61,A75,4,A92,A102,4,A124,62,A143,A153,1,A173,1,A192,A201,1 +4167143,A14,10,A32,A40,1418,A62,A73,3,A93,A101,2,A121,35,A143,A151,1,A172,1,A191,A202,1 +7957595,A12,18,A32,A43,1113,A61,A73,4,A92,A103,4,A121,26,A143,A152,1,A172,2,A191,A201,1 +6793340,A11,36,A32,A40,1842,A61,A72,4,A92,A101,4,A123,34,A143,A152,1,A173,1,A192,A201,2 +5378110,A14,6,A32,A40,3518,A61,A73,2,A93,A103,3,A122,26,A143,A151,1,A173,1,A191,A201,1 +7398663,A14,12,A34,A43,1934,A61,A75,2,A93,A101,2,A124,26,A143,A152,2,A173,1,A191,A201,1 +2689827,A12,27,A30,A49,8318,A61,A75,2,A92,A101,4,A124,42,A143,A153,2,A174,1,A192,A201,2 +4156618,A14,6,A34,A43,1237,A62,A73,1,A92,A101,1,A122,27,A143,A152,2,A173,1,A191,A201,1 +6319109,A12,6,A32,A43,368,A65,A75,4,A93,A101,4,A122,38,A143,A152,1,A173,1,A191,A201,1 +7120920,A11,12,A34,A40,2122,A61,A73,3,A93,A101,2,A121,39,A143,A151,2,A172,2,A191,A202,1 +1078929,A11,24,A32,A42,2996,A65,A73,2,A94,A101,4,A123,20,A143,A152,1,A173,1,A191,A201,2 +1023769,A12,36,A32,A42,9034,A62,A72,4,A93,A102,1,A124,29,A143,A151,1,A174,1,A192,A201,2 +5288061,A14,24,A34,A42,1585,A61,A74,4,A93,A101,3,A122,40,A143,A152,2,A173,1,A191,A201,1 +6898416,A12,18,A32,A43,1301,A61,A75,4,A94,A103,2,A121,32,A143,A152,1,A172,1,A191,A201,1 +3933279,A13,6,A34,A40,1323,A62,A75,2,A91,A101,4,A123,28,A143,A152,2,A173,2,A192,A201,1 +2929379,A11,24,A32,A40,3123,A61,A72,4,A92,A101,1,A122,27,A143,A152,1,A173,1,A191,A201,2 +7338113,A11,36,A32,A41,5493,A61,A75,2,A93,A101,4,A124,42,A143,A153,1,A173,2,A191,A201,1 +1383056,A13,9,A32,A43,1126,A62,A75,2,A91,A101,4,A121,49,A143,A152,1,A173,1,A191,A201,1 
+9894792,A12,24,A34,A43,1216,A62,A72,4,A93,A101,4,A124,38,A141,A152,2,A173,2,A191,A201,2 +4821533,A11,24,A32,A40,1207,A61,A72,4,A92,A101,4,A122,24,A143,A151,1,A173,1,A191,A201,2 +2599724,A14,10,A32,A40,1309,A65,A73,4,A93,A103,4,A122,27,A143,A152,1,A172,1,A191,A201,2 +7122065,A13,15,A34,A41,2360,A63,A73,2,A93,A101,2,A123,36,A143,A152,1,A173,1,A192,A201,1 +4771670,A12,15,A31,A40,6850,A62,A71,1,A93,A101,2,A122,34,A143,A152,1,A174,2,A192,A201,2 +7524556,A14,24,A32,A43,1413,A61,A73,4,A94,A101,2,A122,28,A143,A152,1,A173,1,A191,A201,1 +8295830,A14,39,A32,A41,8588,A62,A75,4,A93,A101,2,A123,45,A143,A152,1,A174,1,A192,A201,1 +5815745,A11,12,A32,A40,759,A61,A74,4,A93,A101,2,A121,26,A143,A152,1,A173,1,A191,A201,2 +2954111,A14,36,A32,A41,4686,A61,A73,2,A93,A101,2,A124,32,A143,A153,1,A174,1,A192,A201,1 +3356855,A13,15,A32,A49,2687,A61,A74,2,A93,A101,4,A122,26,A143,A151,1,A173,1,A192,A201,1 +2676379,A12,12,A33,A43,585,A61,A73,4,A94,A102,4,A121,20,A143,A151,2,A173,1,A191,A201,1 +8727753,A14,24,A32,A40,2255,A65,A72,4,A93,A101,1,A122,54,A143,A152,1,A173,1,A191,A201,1 +6823953,A11,6,A34,A40,609,A61,A74,4,A92,A101,3,A122,37,A143,A152,2,A173,1,A191,A202,1 +7281914,A11,6,A34,A40,1361,A61,A72,2,A93,A101,4,A121,40,A143,A152,1,A172,2,A191,A202,1 +7911515,A14,36,A34,A42,7127,A61,A72,2,A92,A101,4,A122,23,A143,A151,2,A173,1,A192,A201,2 +6306871,A11,6,A32,A40,1203,A62,A75,3,A93,A101,2,A122,43,A143,A152,1,A173,1,A192,A201,1 +6961576,A14,6,A34,A43,700,A65,A75,4,A93,A101,4,A124,36,A143,A153,2,A173,1,A191,A201,1 +3215162,A14,24,A34,A45,5507,A61,A75,3,A93,A101,4,A124,44,A143,A153,2,A173,1,A191,A201,1 +3199662,A11,18,A32,A43,3190,A61,A73,2,A92,A101,2,A121,24,A143,A152,1,A173,1,A191,A201,2 +3873021,A11,48,A30,A42,7119,A61,A73,3,A93,A101,4,A124,53,A143,A153,2,A173,2,A191,A201,2 +9751634,A14,24,A32,A41,3488,A62,A74,3,A92,A101,4,A123,23,A143,A152,1,A173,1,A191,A201,1 +7957595,A12,18,A32,A43,1113,A61,A73,4,A92,A103,4,A121,26,A143,A152,1,A172,2,A191,A201,1 
+5857366,A12,26,A32,A41,7966,A61,A72,2,A93,A101,3,A123,30,A143,A152,2,A173,1,A191,A201,1 +2509169,A14,15,A34,A46,1532,A62,A73,4,A92,A101,3,A123,31,A143,A152,1,A173,1,A191,A201,1 +2621457,A14,4,A34,A43,1503,A61,A74,2,A93,A101,1,A121,42,A143,A152,2,A172,2,A191,A201,1 +8637579,A11,36,A32,A43,2302,A61,A73,4,A91,A101,4,A123,31,A143,A151,1,A173,1,A191,A201,2 +4032468,A11,6,A32,A40,662,A61,A72,3,A93,A101,4,A121,41,A143,A152,1,A172,2,A192,A201,1 +9481590,A12,36,A32,A46,2273,A61,A74,3,A93,A101,1,A123,32,A143,A152,2,A173,2,A191,A201,1 +7828502,A12,15,A32,A40,2631,A62,A73,2,A92,A101,4,A123,28,A143,A151,2,A173,1,A192,A201,2 +6996047,A14,12,A33,A41,1503,A61,A73,4,A94,A101,4,A121,41,A143,A151,1,A173,1,A191,A201,1 +8785106,A14,24,A32,A43,1311,A62,A74,4,A94,A101,3,A122,26,A143,A152,1,A173,1,A192,A201,1 +3836745,A14,24,A32,A43,3105,A65,A72,4,A93,A101,2,A123,25,A143,A152,2,A173,1,A191,A201,1 +8659616,A13,21,A34,A46,2319,A61,A72,2,A91,A101,1,A123,33,A143,A151,1,A173,1,A191,A201,2 +5027921,A11,6,A32,A40,1374,A65,A71,4,A92,A101,3,A122,75,A143,A152,1,A174,1,A192,A201,1 +9292167,A12,18,A34,A42,3612,A61,A75,3,A92,A101,4,A122,37,A143,A152,1,A173,1,A192,A201,1 +9619216,A11,48,A32,A40,7763,A61,A75,4,A93,A101,4,A124,42,A141,A153,1,A174,1,A191,A201,2 +5342884,A13,18,A32,A42,3049,A61,A72,1,A92,A101,1,A122,45,A142,A152,1,A172,1,A191,A201,1 +3841990,A12,12,A32,A43,1534,A61,A72,1,A94,A101,1,A121,23,A143,A151,1,A173,1,A191,A201,2 +9862078,A14,24,A33,A40,2032,A61,A75,4,A93,A101,4,A124,60,A143,A153,2,A173,1,A192,A201,1 +5779045,A11,30,A32,A42,6350,A65,A75,4,A93,A101,4,A122,31,A143,A152,1,A173,1,A191,A201,2 +4285035,A13,18,A32,A42,2864,A61,A73,2,A93,A101,1,A121,34,A143,A152,1,A172,2,A191,A201,2 +9708453,A14,12,A34,A40,1255,A61,A75,4,A93,A101,4,A121,61,A143,A152,2,A172,1,A191,A201,1 +2572735,A11,24,A33,A40,1333,A61,A71,4,A93,A101,2,A121,43,A143,A153,2,A173,2,A191,A201,2 +9506182,A14,24,A34,A40,2022,A61,A73,4,A92,A101,4,A123,37,A143,A152,1,A173,1,A192,A201,1 
+3835775,A14,24,A32,A43,1552,A61,A74,3,A93,A101,1,A123,32,A141,A152,1,A173,2,A191,A201,1 +8995384,A11,12,A31,A43,626,A61,A73,4,A92,A101,4,A121,24,A141,A152,1,A172,1,A191,A201,2 +1831870,A14,48,A34,A41,8858,A65,A74,2,A93,A101,1,A124,35,A143,A153,2,A173,1,A192,A201,1 +6930739,A14,12,A34,A45,996,A65,A74,4,A92,A101,4,A121,23,A143,A152,2,A173,1,A191,A201,1 +2082338,A14,6,A31,A43,1750,A63,A75,2,A93,A101,4,A122,45,A141,A152,1,A172,2,A191,A201,1 +4367265,A11,48,A32,A43,6999,A61,A74,1,A94,A103,1,A121,34,A143,A152,2,A173,1,A192,A201,2 +6993548,A12,12,A34,A40,1995,A62,A72,4,A93,A101,1,A123,27,A143,A152,1,A173,1,A191,A201,1 +5774450,A12,9,A32,A46,1199,A61,A74,4,A92,A101,4,A122,67,A143,A152,2,A174,1,A192,A201,1 +1859841,A12,12,A32,A43,1331,A61,A72,2,A93,A101,1,A123,22,A142,A152,1,A173,1,A191,A201,2 +2420058,A12,18,A30,A40,2278,A62,A72,3,A92,A101,3,A123,28,A143,A152,2,A173,1,A191,A201,2 +5136657,A14,21,A30,A40,5003,A65,A73,1,A92,A101,4,A122,29,A141,A152,2,A173,1,A192,A201,2 +4921496,A11,24,A31,A42,3552,A61,A74,3,A93,A101,4,A123,27,A141,A152,1,A173,1,A191,A201,2 +1259242,A12,18,A34,A42,1928,A61,A72,2,A93,A101,2,A121,31,A143,A152,2,A172,1,A191,A201,2 +4695993,A11,24,A32,A41,2964,A65,A75,4,A93,A101,4,A124,49,A141,A153,1,A173,2,A192,A201,1 +3427239,A11,24,A31,A43,1546,A61,A74,4,A93,A103,4,A123,24,A141,A151,1,A172,1,A191,A201,2 +4030339,A13,6,A33,A43,683,A61,A72,2,A92,A101,1,A122,29,A141,A152,1,A173,1,A191,A201,1 +6457386,A12,36,A32,A40,12389,A65,A73,1,A93,A101,4,A124,37,A143,A153,1,A173,1,A192,A201,2 +2492415,A12,24,A33,A49,4712,A65,A73,4,A93,A101,2,A122,37,A141,A152,2,A174,1,A192,A201,1 +3819769,A12,24,A33,A43,1553,A62,A74,3,A92,A101,2,A122,23,A143,A151,2,A173,1,A192,A201,1 +3248150,A11,12,A32,A40,1372,A61,A74,2,A91,A101,3,A123,36,A143,A152,1,A173,1,A191,A201,2 +2140412,A14,24,A34,A43,2578,A64,A75,2,A93,A101,2,A123,34,A143,A152,1,A173,1,A191,A201,1 +8918532,A12,48,A32,A43,3979,A65,A74,4,A93,A101,1,A123,41,A143,A152,2,A173,2,A192,A201,1 
+9205934,A11,48,A32,A43,6758,A61,A73,3,A92,A101,2,A123,31,A143,A152,1,A173,1,A192,A201,2 +6546357,A11,24,A32,A42,3234,A61,A72,4,A92,A101,4,A121,23,A143,A151,1,A172,1,A192,A201,2 +4352053,A14,30,A34,A43,5954,A61,A74,3,A93,A102,2,A123,38,A143,A152,1,A173,1,A191,A201,1 +6620263,A14,24,A32,A41,5433,A65,A71,2,A92,A101,4,A122,26,A143,A151,1,A174,1,A192,A201,1 +1039684,A11,15,A32,A49,806,A61,A73,4,A92,A101,4,A122,22,A143,A152,1,A172,1,A191,A201,1 +1999133,A12,9,A32,A43,1082,A61,A75,4,A93,A101,4,A123,27,A143,A152,2,A172,1,A191,A201,1 +6641350,A14,15,A34,A42,2788,A61,A74,2,A92,A102,3,A123,24,A141,A152,2,A173,1,A191,A201,1 +9791462,A12,12,A32,A43,2930,A61,A74,2,A92,A101,1,A121,27,A143,A152,1,A173,1,A191,A201,1 +2964444,A14,24,A34,A46,1927,A65,A73,3,A92,A101,2,A123,33,A143,A152,2,A173,1,A192,A201,1 +4612811,A12,36,A34,A40,2820,A61,A72,4,A91,A101,4,A123,27,A143,A152,2,A173,1,A191,A201,2 +8885321,A14,24,A32,A48,937,A61,A72,4,A94,A101,3,A123,27,A143,A152,2,A172,1,A191,A201,1 +8198782,A12,18,A34,A40,1056,A61,A75,3,A93,A103,3,A121,30,A141,A152,2,A173,1,A191,A201,2 +2454414,A12,12,A34,A40,3124,A61,A72,1,A93,A101,3,A121,49,A141,A152,2,A172,2,A191,A201,1 +5678348,A14,9,A32,A42,1388,A61,A73,4,A92,A101,2,A121,26,A143,A151,1,A173,1,A191,A201,1 +5182819,A12,36,A32,A45,2384,A61,A72,4,A93,A101,1,A124,33,A143,A151,1,A172,1,A191,A201,2 +2727231,A14,12,A32,A40,2133,A65,A75,4,A92,A101,4,A124,52,A143,A153,1,A174,1,A192,A201,1 +1623348,A11,18,A32,A42,2039,A61,A73,1,A92,A101,4,A121,20,A141,A151,1,A173,1,A191,A201,2 +2307018,A11,9,A34,A40,2799,A61,A73,2,A93,A101,2,A121,36,A143,A151,2,A173,2,A191,A201,1 +1856079,A11,12,A32,A42,1289,A61,A73,4,A93,A103,1,A122,21,A143,A152,1,A172,1,A191,A201,1 +6274836,A11,18,A32,A44,1217,A61,A73,4,A94,A101,3,A121,47,A143,A152,1,A172,1,A192,A201,2 +6514251,A11,12,A34,A42,2246,A61,A75,3,A93,A101,3,A122,60,A143,A152,2,A173,1,A191,A201,2 +4867066,A11,12,A34,A43,385,A61,A74,4,A92,A101,3,A121,58,A143,A152,4,A172,1,A192,A201,1 
+4883891,A12,24,A33,A40,1965,A65,A73,4,A92,A101,4,A123,42,A143,A151,2,A173,1,A192,A201,1 +9366782,A14,21,A32,A49,1572,A64,A75,4,A92,A101,4,A121,36,A141,A152,1,A172,1,A191,A201,1 +8194462,A12,24,A32,A40,2718,A61,A73,3,A92,A101,4,A122,20,A143,A151,1,A172,1,A192,A201,2 +3570555,A11,24,A31,A410,1358,A65,A75,4,A93,A101,3,A123,40,A142,A152,1,A174,1,A192,A201,2 +1491305,A12,6,A31,A40,931,A62,A72,1,A92,A101,1,A122,32,A142,A152,1,A172,1,A191,A201,2 +3303554,A11,24,A32,A40,1442,A61,A74,4,A92,A101,4,A123,23,A143,A151,2,A173,1,A191,A201,2 +3657362,A12,24,A30,A49,4241,A61,A73,1,A93,A101,4,A121,36,A143,A152,3,A172,1,A192,A201,2 +5220673,A14,18,A34,A40,2775,A61,A74,2,A93,A101,2,A122,31,A141,A152,2,A173,1,A191,A201,2 +5869534,A14,24,A33,A49,3863,A61,A73,1,A93,A101,2,A124,32,A143,A153,1,A173,1,A191,A201,1 +2213067,A12,7,A32,A43,2329,A61,A72,1,A92,A103,1,A121,45,A143,A152,1,A173,1,A191,A201,1 +2801710,A12,9,A32,A42,918,A61,A73,4,A92,A101,1,A122,30,A143,A152,1,A173,1,A191,A201,2 +1024391,A12,24,A31,A46,1837,A61,A74,4,A92,A101,4,A124,34,A141,A153,1,A172,1,A191,A201,2 +5699583,A14,36,A32,A42,3349,A61,A73,4,A92,A101,2,A123,28,A143,A152,1,A174,1,A192,A201,2 +2566539,A13,10,A32,A42,1275,A61,A72,4,A92,A101,2,A122,23,A143,A152,1,A173,1,A191,A201,1 +6680952,A11,24,A31,A42,2828,A63,A73,4,A93,A101,4,A121,22,A142,A152,1,A173,1,A192,A201,1 +7697857,A14,24,A34,A49,4526,A61,A73,3,A93,A101,2,A121,74,A143,A152,1,A174,1,A192,A201,1 +1147017,A12,36,A32,A43,2671,A62,A73,4,A92,A102,4,A124,50,A143,A153,1,A173,1,A191,A201,2 +1877439,A14,18,A32,A43,2051,A61,A72,4,A93,A101,1,A121,33,A143,A152,1,A173,1,A191,A201,1 +8278164,A14,15,A32,A41,1300,A65,A75,4,A93,A101,4,A124,45,A141,A153,1,A173,2,A191,A201,1 +5491895,A11,12,A32,A44,741,A62,A71,4,A92,A101,3,A122,22,A143,A152,1,A173,1,A191,A201,2 +1885075,A13,10,A32,A40,1240,A62,A75,1,A92,A101,4,A124,48,A143,A153,1,A172,2,A191,A201,2 +6750986,A11,21,A32,A43,3357,A64,A72,4,A92,A101,2,A123,29,A141,A152,1,A173,1,A191,A201,1 
+8900389,A11,24,A31,A41,3632,A61,A73,1,A92,A103,4,A123,22,A141,A151,1,A173,1,A191,A202,1 +3583117,A14,18,A33,A42,1808,A61,A74,4,A92,A101,1,A121,22,A143,A152,1,A173,1,A191,A201,2 +6196790,A12,48,A30,A49,12204,A65,A73,2,A93,A101,2,A123,48,A141,A152,1,A174,1,A192,A201,1 +9626587,A12,60,A33,A43,9157,A65,A73,2,A93,A101,2,A124,27,A143,A153,1,A174,1,A191,A201,1 +1451248,A11,6,A34,A40,3676,A61,A73,1,A93,A101,3,A121,37,A143,A151,3,A173,2,A191,A201,1 +2753919,A12,30,A32,A42,3441,A62,A73,2,A92,A102,4,A123,21,A143,A151,1,A173,1,A191,A201,2 +5526862,A14,12,A32,A40,640,A61,A73,4,A91,A101,2,A121,49,A143,A152,1,A172,1,A191,A201,1 +4872213,A12,21,A34,A49,3652,A61,A74,2,A93,A101,3,A122,27,A143,A152,2,A173,1,A191,A201,1 +2393150,A14,18,A34,A40,1530,A61,A73,3,A93,A101,2,A122,32,A141,A152,2,A173,1,A191,A201,2 +5713333,A14,48,A32,A49,3914,A65,A73,4,A91,A101,2,A121,38,A141,A152,1,A173,1,A191,A201,2 +1937588,A11,12,A32,A42,1858,A61,A72,4,A92,A101,1,A123,22,A143,A151,1,A173,1,A191,A201,1 +9355241,A11,18,A32,A43,2600,A61,A73,4,A93,A101,4,A124,65,A143,A153,2,A173,1,A191,A201,2 +5508301,A14,15,A32,A43,1979,A65,A75,4,A93,A101,2,A123,35,A143,A152,1,A173,1,A191,A201,1 +3586335,A13,6,A32,A42,2116,A61,A73,2,A93,A101,2,A121,41,A143,A152,1,A173,1,A192,A201,1 +1838843,A12,9,A31,A40,1437,A62,A74,2,A93,A101,3,A124,29,A143,A152,1,A173,1,A191,A201,2 +4763001,A14,42,A34,A42,4042,A63,A73,4,A93,A101,4,A121,36,A143,A152,2,A173,1,A192,A201,1 +4124475,A14,9,A32,A46,3832,A65,A75,1,A93,A101,4,A121,64,A143,A152,1,A172,1,A191,A201,1 +1746129,A11,24,A32,A43,3660,A61,A73,2,A92,A101,4,A123,28,A143,A152,1,A173,1,A191,A201,1 +8578734,A11,18,A31,A42,1553,A61,A73,4,A93,A101,3,A123,44,A141,A152,1,A173,1,A191,A201,2 +3544577,A12,15,A32,A43,1444,A65,A72,4,A93,A101,1,A122,23,A143,A152,1,A173,1,A191,A201,1 +5640616,A14,9,A32,A42,1980,A61,A72,2,A92,A102,2,A123,19,A143,A151,2,A173,1,A191,A201,2 +3695680,A12,24,A32,A40,1355,A61,A72,3,A92,A101,4,A123,25,A143,A152,1,A172,1,A192,A201,2 
+1665899,A14,12,A32,A46,1393,A61,A75,4,A93,A101,4,A122,47,A141,A152,3,A173,2,A192,A201,1 +3336776,A14,24,A32,A43,1376,A63,A74,4,A92,A101,1,A123,28,A143,A152,1,A173,1,A191,A201,1 +3175919,A14,60,A33,A43,15653,A61,A74,2,A93,A101,4,A123,21,A143,A152,2,A173,1,A192,A201,1 +6193893,A14,12,A32,A43,1493,A61,A72,4,A92,A101,3,A123,34,A143,A152,1,A173,2,A191,A201,1 +9308059,A11,42,A33,A43,4370,A61,A74,3,A93,A101,2,A122,26,A141,A152,2,A173,2,A192,A201,2 +7240888,A11,18,A32,A46,750,A61,A71,4,A92,A101,1,A121,27,A143,A152,1,A171,1,A191,A201,2 +7390720,A12,15,A32,A45,1308,A61,A75,4,A93,A101,4,A123,38,A143,A152,2,A172,1,A191,A201,1 +2244055,A14,15,A32,A46,4623,A62,A73,3,A93,A101,2,A122,40,A143,A152,1,A174,1,A192,A201,2 +9290774,A14,24,A34,A43,1851,A61,A74,4,A94,A103,2,A123,33,A143,A152,2,A173,1,A192,A201,1 +9895994,A11,18,A34,A43,1880,A61,A74,4,A94,A101,1,A122,32,A143,A152,2,A174,1,A192,A201,1 +5985690,A14,36,A33,A49,7980,A65,A72,4,A93,A101,4,A123,27,A143,A151,2,A173,1,A192,A201,2 +1445890,A11,30,A30,A42,4583,A61,A73,2,A91,A103,2,A121,32,A143,A152,2,A173,1,A191,A201,1 +9446992,A14,12,A32,A40,1386,A63,A73,2,A92,A101,2,A122,26,A143,A152,1,A173,1,A191,A201,2 +7946983,A13,24,A32,A40,947,A61,A74,4,A93,A101,3,A124,38,A141,A153,1,A173,2,A191,A201,2 +3808692,A11,12,A32,A46,684,A61,A73,4,A93,A101,4,A123,40,A143,A151,1,A172,2,A191,A201,2 +6960835,A11,48,A32,A46,7476,A61,A74,4,A93,A101,1,A124,50,A143,A153,1,A174,1,A192,A201,1 +6472767,A12,12,A32,A42,1922,A61,A73,4,A93,A101,2,A122,37,A143,A152,1,A172,1,A191,A201,2 +9116367,A11,24,A32,A40,2303,A61,A75,4,A93,A102,1,A121,45,A143,A152,1,A173,1,A191,A201,2 +2210355,A12,36,A33,A40,8086,A62,A75,2,A93,A101,4,A123,42,A143,A152,4,A174,1,A192,A201,2 +2597995,A14,24,A34,A41,2346,A61,A74,4,A93,A101,3,A123,35,A143,A152,2,A173,1,A192,A201,1 +9519867,A11,14,A32,A40,3973,A61,A71,1,A93,A101,4,A124,22,A143,A153,1,A173,1,A191,A201,1 +3698602,A12,12,A32,A40,888,A61,A75,4,A93,A101,4,A123,41,A141,A152,1,A172,2,A191,A201,2 
+7748894,A14,48,A32,A43,10222,A65,A74,4,A93,A101,3,A123,37,A142,A152,1,A173,1,A192,A201,1 +5309238,A12,30,A30,A49,4221,A61,A73,2,A92,A101,1,A123,28,A143,A152,2,A173,1,A191,A201,1 +4970061,A12,18,A34,A42,6361,A61,A75,2,A93,A101,1,A124,41,A143,A152,1,A173,1,A192,A201,1 +2854826,A13,12,A32,A43,1297,A61,A73,3,A94,A101,4,A121,23,A143,A151,1,A173,1,A191,A201,1 +3463374,A11,12,A32,A40,900,A65,A73,4,A94,A101,2,A123,23,A143,A152,1,A173,1,A191,A201,2 +3481971,A14,21,A32,A42,2241,A61,A75,4,A93,A101,2,A121,50,A143,A152,2,A173,1,A191,A201,1 +1153712,A12,6,A33,A42,1050,A61,A71,4,A93,A101,1,A122,35,A142,A152,2,A174,1,A192,A201,1 +3077979,A13,6,A34,A46,1047,A61,A73,2,A92,A101,4,A122,50,A143,A152,1,A172,1,A191,A201,1 +6586335,A14,24,A34,A410,6314,A61,A71,4,A93,A102,2,A124,27,A141,A152,2,A174,1,A192,A201,1 +2300255,A12,30,A31,A42,3496,A64,A73,4,A93,A101,2,A123,34,A142,A152,1,A173,2,A192,A201,1 +6107764,A14,48,A31,A49,3609,A61,A73,1,A92,A101,1,A121,27,A142,A152,1,A173,1,A191,A201,1 +9756743,A11,12,A34,A40,4843,A61,A75,3,A93,A102,4,A122,43,A143,A151,2,A173,1,A192,A201,2 +2412093,A13,30,A34,A43,3017,A61,A75,4,A93,A101,4,A122,47,A143,A152,1,A173,1,A191,A201,1 +2967207,A14,24,A34,A49,4139,A62,A73,3,A93,A101,3,A122,27,A143,A152,2,A172,1,A192,A201,1 +5294255,A14,36,A32,A49,5742,A62,A74,2,A93,A101,2,A123,31,A143,A152,2,A173,1,A192,A201,1 +3960084,A14,60,A32,A40,10366,A61,A75,2,A93,A101,4,A122,42,A143,A152,1,A174,1,A192,A201,1 +5926774,A14,15,A32,A41,3029,A61,A74,2,A93,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,1 +5969658,A14,24,A32,A40,1393,A61,A73,2,A93,A103,2,A121,31,A143,A152,1,A173,1,A192,A201,1 +4016637,A14,6,A34,A40,2080,A63,A73,1,A94,A101,2,A123,24,A143,A152,1,A173,1,A191,A201,1 +5319265,A14,21,A33,A49,2580,A63,A72,4,A93,A101,2,A121,41,A141,A152,1,A172,2,A191,A201,2 +8616661,A14,30,A34,A43,4530,A61,A74,4,A92,A101,4,A123,26,A143,A151,1,A174,1,A192,A201,1 +9306673,A14,24,A34,A42,5150,A61,A75,4,A93,A101,4,A123,33,A143,A152,1,A173,1,A192,A201,1 
+7790498,A12,72,A32,A43,5595,A62,A73,2,A94,A101,2,A123,24,A143,A152,1,A173,1,A191,A201,2 +3314262,A11,24,A32,A43,2384,A61,A75,4,A93,A101,4,A121,64,A141,A151,1,A172,1,A191,A201,1 +8563405,A14,18,A32,A43,1453,A61,A72,3,A92,A101,1,A121,26,A143,A152,1,A173,1,A191,A201,1 +8021407,A14,6,A32,A46,1538,A61,A72,1,A92,A101,2,A124,56,A143,A152,1,A173,1,A191,A201,1 +1620760,A14,12,A32,A43,2279,A65,A73,4,A93,A101,4,A124,37,A143,A153,1,A173,1,A192,A201,1 +7861677,A14,15,A33,A43,1478,A61,A73,4,A94,A101,3,A121,33,A141,A152,2,A173,1,A191,A201,1 +5532106,A14,24,A34,A43,5103,A61,A72,3,A94,A101,3,A124,47,A143,A153,3,A173,1,A192,A201,1 +6577647,A12,36,A33,A49,9857,A62,A74,1,A93,A101,3,A122,31,A143,A152,2,A172,2,A192,A201,1 +2592971,A14,60,A32,A40,6527,A65,A73,4,A93,A101,4,A124,34,A143,A153,1,A173,2,A192,A201,1 +7186212,A13,10,A34,A43,1347,A65,A74,4,A93,A101,2,A122,27,A143,A152,2,A173,1,A192,A201,1 +5547078,A12,36,A33,A40,2862,A62,A75,4,A93,A101,3,A124,30,A143,A153,1,A173,1,A191,A201,1 +8856174,A14,9,A32,A43,2753,A62,A75,3,A93,A102,4,A123,35,A143,A152,1,A173,1,A192,A201,1 +3625778,A11,12,A32,A40,3651,A64,A73,1,A93,A101,3,A122,31,A143,A152,1,A173,2,A191,A201,1 +6075200,A11,15,A34,A42,975,A61,A73,2,A91,A101,3,A122,25,A143,A152,2,A173,1,A191,A201,1 +2626457,A12,15,A32,A45,2631,A62,A73,3,A92,A101,2,A121,25,A143,A152,1,A172,1,A191,A201,1 +7031172,A12,24,A32,A43,2896,A62,A72,2,A93,A101,1,A123,29,A143,A152,1,A173,1,A191,A201,1 +7612370,A11,6,A34,A40,4716,A65,A72,1,A93,A101,3,A121,44,A143,A152,2,A172,2,A191,A201,1 +3634470,A14,24,A32,A43,2284,A61,A74,4,A93,A101,2,A123,28,A143,A152,1,A173,1,A192,A201,1 +1891464,A14,6,A32,A41,1236,A63,A73,2,A93,A101,4,A122,50,A143,A151,1,A173,1,A191,A201,1 +4385444,A12,12,A32,A43,1103,A61,A74,4,A93,A103,3,A121,29,A143,A152,2,A173,1,A191,A202,1 +4245378,A14,12,A34,A40,926,A61,A71,1,A92,A101,2,A122,38,A143,A152,1,A171,1,A191,A201,1 +8748357,A14,18,A34,A43,1800,A61,A73,4,A93,A101,2,A123,24,A143,A152,2,A173,1,A191,A201,1 
+6154067,A13,15,A32,A46,1905,A61,A75,4,A93,A101,4,A123,40,A143,A151,1,A174,1,A192,A201,1 +8475240,A14,12,A32,A42,1123,A63,A73,4,A92,A101,4,A123,29,A143,A151,1,A172,1,A191,A201,2 +8497474,A11,48,A34,A41,6331,A61,A75,4,A93,A101,4,A124,46,A143,A153,2,A173,1,A192,A201,2 +4216405,A13,24,A32,A43,1377,A62,A75,4,A92,A101,2,A124,47,A143,A153,1,A173,1,A192,A201,1 +2028520,A12,30,A33,A49,2503,A62,A75,4,A93,A101,2,A122,41,A142,A152,2,A173,1,A191,A201,1 +6062683,A12,27,A32,A49,2528,A61,A72,4,A92,A101,1,A122,32,A143,A152,1,A173,2,A192,A201,1 +3591516,A14,15,A32,A40,5324,A63,A75,1,A92,A101,4,A124,35,A143,A153,1,A173,1,A191,A201,1 +8852736,A12,48,A32,A40,6560,A62,A74,3,A93,A101,2,A122,24,A143,A152,1,A173,1,A191,A201,2 +2188239,A12,12,A30,A42,2969,A61,A72,4,A92,A101,3,A122,25,A143,A151,2,A173,1,A191,A201,2 +8356737,A12,9,A32,A43,1206,A61,A75,4,A92,A101,4,A121,25,A143,A152,1,A173,1,A191,A201,1 +7311456,A12,9,A32,A43,2118,A61,A73,2,A93,A101,2,A121,37,A143,A152,1,A172,2,A191,A201,1 +2717625,A14,18,A34,A43,629,A63,A75,4,A93,A101,3,A122,32,A141,A152,2,A174,1,A192,A201,1 +1257668,A11,6,A31,A46,1198,A61,A75,4,A92,A101,4,A124,35,A143,A153,1,A173,1,A191,A201,2 +9770090,A14,21,A32,A41,2476,A65,A75,4,A93,A101,4,A121,46,A143,A152,1,A174,1,A192,A201,1 +5376189,A11,9,A34,A43,1138,A61,A73,4,A93,A101,4,A121,25,A143,A152,2,A172,1,A191,A201,1 +7684771,A12,60,A32,A40,14027,A61,A74,4,A93,A101,2,A124,27,A143,A152,1,A174,1,A192,A201,2 +5147905,A14,30,A34,A41,7596,A65,A75,1,A93,A101,4,A123,63,A143,A152,2,A173,1,A191,A201,1 +9729748,A14,30,A34,A43,3077,A65,A75,3,A93,A101,2,A123,40,A143,A152,2,A173,2,A192,A201,1 +3354393,A14,18,A32,A43,1505,A61,A73,4,A93,A101,2,A124,32,A143,A153,1,A174,1,A192,A201,1 +2738606,A13,24,A34,A43,3148,A65,A73,3,A93,A101,2,A123,31,A143,A152,2,A173,1,A192,A201,1 +7723603,A12,20,A30,A41,6148,A62,A75,3,A94,A101,4,A123,31,A141,A152,2,A173,1,A192,A201,1 +4275545,A13,9,A30,A43,1337,A61,A72,4,A93,A101,2,A123,34,A143,A152,2,A174,1,A192,A201,2 
+8516067,A12,6,A31,A46,433,A64,A72,4,A92,A101,2,A122,24,A141,A151,1,A173,2,A191,A201,2 +8269904,A11,12,A32,A40,1228,A61,A73,4,A92,A101,2,A121,24,A143,A152,1,A172,1,A191,A201,2 +3797846,A12,9,A32,A43,790,A63,A73,4,A92,A101,3,A121,66,A143,A152,1,A172,1,A191,A201,1 +1759390,A14,27,A32,A40,2570,A61,A73,3,A92,A101,3,A121,21,A143,A151,1,A173,1,A191,A201,2 +8406970,A14,6,A34,A40,250,A64,A73,2,A92,A101,2,A121,41,A141,A152,2,A172,1,A191,A201,1 +4726527,A14,15,A34,A43,1316,A63,A73,2,A94,A101,2,A122,47,A143,A152,2,A172,1,A191,A201,1 +7293796,A11,18,A32,A43,1882,A61,A73,4,A92,A101,4,A123,25,A141,A151,2,A173,1,A191,A201,2 +8019704,A12,48,A31,A49,6416,A61,A75,4,A92,A101,3,A124,59,A143,A151,1,A173,1,A191,A201,2 +1119926,A13,24,A34,A49,1275,A64,A73,2,A91,A101,4,A121,36,A143,A152,2,A173,1,A192,A201,1 +8976091,A12,24,A33,A43,6403,A61,A72,1,A93,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,1 +2467002,A11,24,A32,A43,1987,A61,A73,2,A93,A101,4,A121,21,A143,A151,1,A172,2,A191,A201,2 +1767395,A12,8,A32,A43,760,A61,A74,4,A92,A103,2,A121,44,A143,A152,1,A172,1,A191,A201,1 +1412724,A14,24,A32,A41,2603,A64,A73,2,A92,A101,4,A123,28,A143,A151,1,A173,1,A192,A201,1 +7497034,A14,4,A34,A40,3380,A61,A74,1,A92,A101,1,A121,37,A143,A152,1,A173,2,A191,A201,1 +1760449,A12,36,A31,A44,3990,A65,A72,3,A92,A101,2,A124,29,A141,A152,1,A171,1,A191,A201,1 +3573461,A12,24,A32,A41,11560,A61,A73,1,A92,A101,4,A123,23,A143,A151,2,A174,1,A191,A201,2 +5869439,A11,18,A32,A40,4380,A62,A73,3,A93,A101,4,A123,35,A143,A152,1,A172,2,A192,A201,1 +5359049,A14,6,A34,A40,6761,A61,A74,1,A93,A101,3,A124,45,A143,A152,2,A174,2,A192,A201,1 +7610754,A12,30,A30,A49,4280,A62,A73,4,A92,A101,4,A123,26,A143,A151,2,A172,1,A191,A201,2 +7876212,A11,24,A31,A40,2325,A62,A74,2,A93,A101,3,A123,32,A141,A152,1,A173,1,A191,A201,1 +7058762,A12,10,A31,A43,1048,A61,A73,4,A93,A101,4,A121,23,A142,A152,1,A172,1,A191,A201,1 +4452376,A14,21,A32,A43,3160,A65,A75,4,A93,A101,3,A122,41,A143,A152,1,A173,1,A192,A201,1 
+3813837,A11,24,A31,A42,2483,A63,A73,4,A93,A101,4,A121,22,A142,A152,1,A173,1,A192,A201,1 +4808834,A11,39,A34,A42,14179,A65,A74,4,A93,A101,4,A122,30,A143,A152,2,A174,1,A192,A201,1 +6902723,A11,13,A34,A49,1797,A61,A72,3,A93,A101,1,A122,28,A141,A152,2,A172,1,A191,A201,1 +8866025,A11,15,A32,A40,2511,A61,A71,1,A92,A101,4,A123,23,A143,A151,1,A173,1,A191,A201,1 +5625748,A11,12,A32,A40,1274,A61,A72,3,A92,A101,1,A121,37,A143,A152,1,A172,1,A191,A201,2 +3897156,A14,21,A32,A41,5248,A65,A73,1,A93,A101,3,A123,26,A143,A152,1,A173,1,A191,A201,1 +5926774,A14,15,A32,A41,3029,A61,A74,2,A93,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,1 +1337550,A11,6,A32,A42,428,A61,A75,2,A92,A101,1,A122,49,A141,A152,1,A173,1,A192,A201,1 +4131044,A11,18,A32,A40,976,A61,A72,1,A92,A101,2,A123,23,A143,A152,1,A172,1,A191,A201,2 +3163950,A12,12,A32,A49,841,A62,A74,2,A92,A101,4,A121,23,A143,A151,1,A172,1,A191,A201,1 +1657747,A14,30,A34,A43,5771,A61,A74,4,A92,A101,2,A123,25,A143,A152,2,A173,1,A191,A201,1 +9250778,A14,12,A33,A45,1555,A64,A75,4,A93,A101,4,A124,55,A143,A153,2,A173,2,A191,A201,2 +7923489,A11,24,A32,A40,1285,A65,A74,4,A92,A101,4,A124,32,A143,A151,1,A173,1,A191,A201,2 +9517102,A13,6,A34,A40,1299,A61,A73,1,A93,A101,1,A121,74,A143,A152,3,A171,2,A191,A202,1 +1700698,A13,15,A34,A43,1271,A65,A73,3,A93,A101,4,A124,39,A143,A153,2,A173,1,A192,A201,2 +5969658,A14,24,A32,A40,1393,A61,A73,2,A93,A103,2,A121,31,A143,A152,1,A173,1,A192,A201,1 +8022152,A11,12,A34,A40,691,A61,A75,4,A93,A101,3,A122,35,A143,A152,2,A173,1,A191,A201,2 +1605780,A14,15,A34,A40,5045,A65,A75,1,A92,A101,4,A123,59,A143,A152,1,A173,1,A192,A201,1 +8975629,A11,18,A34,A42,2124,A61,A73,4,A92,A101,4,A121,24,A143,A151,2,A173,1,A191,A201,2 +4062254,A11,12,A32,A43,2214,A61,A73,4,A93,A101,3,A122,24,A143,A152,1,A172,1,A191,A201,1 +9324382,A14,21,A34,A40,12680,A65,A75,4,A93,A101,4,A124,30,A143,A153,1,A174,1,A192,A201,2 +1025006,A14,24,A34,A40,2463,A62,A74,4,A94,A101,3,A122,27,A143,A152,2,A173,1,A192,A201,1 
+4776602,A12,12,A32,A43,1155,A61,A75,3,A94,A103,3,A121,40,A141,A152,2,A172,1,A191,A201,1 +1846586,A11,30,A32,A42,3108,A61,A72,2,A91,A101,4,A122,31,A143,A152,1,A172,1,A191,A201,2 +8968659,A14,10,A32,A41,2901,A65,A72,1,A92,A101,4,A121,31,A143,A151,1,A173,1,A191,A201,1 +2233391,A12,12,A34,A42,3617,A61,A75,1,A93,A101,4,A123,28,A143,A151,3,A173,1,A192,A201,1 +1491565,A14,12,A34,A43,1655,A61,A75,2,A93,A101,4,A121,63,A143,A152,2,A172,1,A192,A201,1 +4567772,A11,24,A32,A41,2812,A65,A75,2,A92,A101,4,A121,26,A143,A151,1,A173,1,A191,A201,1 +7000454,A11,36,A34,A46,8065,A61,A73,3,A92,A101,2,A124,25,A143,A152,2,A174,1,A192,A201,2 +8450534,A14,21,A34,A41,3275,A61,A75,1,A93,A101,4,A123,36,A143,A152,1,A174,1,A192,A201,1 +9182898,A14,24,A34,A43,2223,A62,A75,4,A93,A101,4,A122,52,A141,A152,2,A173,1,A191,A201,1 +4237007,A13,12,A34,A40,1480,A63,A71,2,A93,A101,4,A124,66,A141,A153,3,A171,1,A191,A201,1 +2480667,A11,24,A32,A40,1371,A65,A73,4,A92,A101,4,A121,25,A143,A151,1,A173,1,A191,A201,2 +2564769,A14,36,A34,A40,3535,A61,A74,4,A93,A101,4,A123,37,A143,A152,2,A173,1,A192,A201,1 +6154723,A11,18,A32,A43,3509,A61,A74,4,A92,A103,1,A121,25,A143,A152,1,A173,1,A191,A201,1 +4299505,A14,36,A34,A41,5711,A64,A75,4,A93,A101,2,A123,38,A143,A152,2,A174,1,A192,A201,1 +2287593,A12,18,A32,A45,3872,A61,A71,2,A92,A101,4,A123,67,A143,A152,1,A173,1,A192,A201,1 +3094366,A12,39,A34,A43,4933,A61,A74,2,A93,A103,2,A121,25,A143,A152,2,A173,1,A191,A201,2 +7536220,A14,24,A34,A40,1940,A64,A75,4,A93,A101,4,A121,60,A143,A152,1,A173,1,A192,A201,1 +9783545,A12,12,A30,A48,1410,A61,A73,2,A93,A101,2,A121,31,A143,A152,1,A172,1,A192,A201,1 +6146218,A12,12,A32,A40,836,A62,A72,4,A92,A101,2,A122,23,A141,A152,1,A172,1,A191,A201,2 +7934311,A12,20,A32,A41,6468,A65,A71,1,A91,A101,4,A121,60,A143,A152,1,A174,1,A192,A201,1 +7300802,A12,18,A32,A49,1941,A64,A73,4,A93,A101,2,A122,35,A143,A152,1,A172,1,A192,A201,1 +7658137,A14,22,A32,A43,2675,A63,A75,3,A93,A101,4,A123,40,A143,A152,1,A173,1,A191,A201,1 
+8404631,A14,48,A34,A41,2751,A65,A75,4,A93,A101,3,A123,38,A143,A152,2,A173,2,A192,A201,1 +6944710,A12,48,A33,A46,6224,A61,A75,4,A93,A101,4,A124,50,A143,A153,1,A173,1,A191,A201,2 +1514807,A11,40,A34,A46,5998,A61,A73,4,A93,A101,3,A124,27,A141,A152,1,A173,1,A192,A201,2 +2003384,A12,21,A32,A49,1188,A61,A75,2,A92,A101,4,A122,39,A143,A152,1,A173,2,A191,A201,2 +5659086,A14,24,A32,A41,6313,A65,A75,3,A93,A101,4,A123,41,A143,A152,1,A174,2,A192,A201,1 +4628471,A14,6,A34,A42,1221,A65,A73,1,A94,A101,2,A122,27,A143,A152,2,A173,1,A191,A201,1 +5403673,A13,24,A32,A42,2892,A61,A75,3,A91,A101,4,A124,51,A143,A153,1,A173,1,A191,A201,1 +8614255,A14,24,A32,A42,3062,A63,A75,4,A93,A101,3,A124,32,A143,A151,1,A173,1,A192,A201,1 +7923482,A14,9,A32,A42,2301,A62,A72,2,A92,A101,4,A122,22,A143,A151,1,A173,1,A191,A201,1 +2383650,A11,18,A32,A41,7511,A65,A75,1,A93,A101,4,A122,51,A143,A153,1,A173,2,A192,A201,2 +6958612,A14,12,A34,A42,1258,A61,A72,2,A92,A101,4,A122,22,A143,A151,2,A172,1,A191,A201,1 +1883328,A14,24,A33,A40,717,A65,A75,4,A94,A101,4,A123,54,A143,A152,2,A173,1,A192,A201,1 +8386168,A12,9,A32,A40,1549,A65,A72,4,A93,A101,2,A121,35,A143,A152,1,A171,1,A191,A201,1 +9660695,A14,24,A34,A46,1597,A61,A75,4,A93,A101,4,A124,54,A143,A153,2,A173,2,A191,A201,1 +5786462,A12,18,A34,A43,1795,A61,A75,3,A92,A103,4,A121,48,A141,A151,2,A172,1,A192,A201,1 +1678929,A11,20,A34,A42,4272,A61,A75,1,A92,A101,4,A122,24,A143,A152,2,A173,1,A191,A201,1 +4809900,A14,12,A34,A43,976,A65,A75,4,A93,A101,4,A123,35,A143,A152,2,A173,1,A191,A201,1 +6983851,A12,12,A32,A40,7472,A65,A71,1,A92,A101,2,A121,24,A143,A151,1,A171,1,A191,A201,1 +9335369,A11,36,A32,A40,9271,A61,A74,2,A93,A101,1,A123,24,A143,A152,1,A173,1,A192,A201,2 +7140312,A12,6,A32,A43,590,A61,A72,3,A94,A101,3,A121,26,A143,A152,1,A172,1,A191,A202,1 +4639968,A14,12,A34,A43,930,A65,A75,4,A93,A101,4,A121,65,A143,A152,4,A173,1,A191,A201,1 +9646710,A12,42,A31,A41,9283,A61,A71,1,A93,A101,2,A124,55,A141,A153,1,A174,1,A192,A201,1 
+1322037,A12,15,A30,A40,1778,A61,A72,2,A92,A101,1,A121,26,A143,A151,2,A171,1,A191,A201,2 +3439631,A12,8,A32,A49,907,A61,A72,3,A94,A101,2,A121,26,A143,A152,1,A173,1,A192,A201,1 +7403964,A12,6,A32,A43,484,A61,A74,3,A94,A103,3,A121,28,A141,A152,1,A172,1,A191,A201,1 +6675884,A11,36,A34,A41,9629,A61,A74,4,A93,A101,4,A123,24,A143,A152,2,A173,1,A192,A201,2 +5903457,A11,48,A32,A44,3051,A61,A73,3,A93,A101,4,A123,54,A143,A152,1,A173,1,A191,A201,2 +6892095,A11,48,A32,A40,3931,A61,A74,4,A93,A101,4,A124,46,A143,A153,1,A173,2,A191,A201,2 +2535665,A12,36,A33,A40,7432,A61,A73,2,A92,A101,2,A122,54,A143,A151,1,A173,1,A191,A201,1 +2113858,A14,6,A32,A44,1338,A63,A73,1,A91,A101,4,A121,62,A143,A152,1,A173,1,A191,A201,1 +6940571,A14,6,A34,A43,1554,A61,A74,1,A92,A101,2,A123,24,A143,A151,2,A173,1,A192,A201,1 +1818305,A11,36,A32,A410,15857,A61,A71,2,A91,A102,3,A123,43,A143,A152,1,A174,1,A191,A201,1 +1136050,A11,18,A32,A43,1345,A61,A73,4,A94,A101,3,A121,26,A141,A152,1,A173,1,A191,A201,2 +5408293,A14,12,A32,A40,1101,A61,A73,3,A94,A101,2,A121,27,A143,A152,2,A173,1,A192,A201,1 +6973245,A13,12,A32,A43,3016,A61,A73,3,A94,A101,1,A123,24,A143,A152,1,A173,1,A191,A201,1 +3458174,A11,36,A32,A42,2712,A61,A75,2,A93,A101,2,A122,41,A141,A152,1,A173,2,A191,A201,2 +1641060,A11,8,A34,A40,731,A61,A75,4,A93,A101,4,A121,47,A143,A152,2,A172,1,A191,A201,1 +3831409,A14,18,A34,A42,3780,A61,A72,3,A91,A101,2,A123,35,A143,A152,2,A174,1,A192,A201,1 +7489128,A11,21,A34,A40,1602,A61,A75,4,A94,A101,3,A123,30,A143,A152,2,A173,1,A192,A201,1 +2737942,A11,18,A34,A40,3966,A61,A75,1,A92,A101,4,A121,33,A141,A151,3,A173,1,A192,A201,2 +3155576,A14,18,A30,A49,4165,A61,A73,2,A93,A101,2,A123,36,A142,A152,2,A173,2,A191,A201,2 +6658732,A11,36,A32,A41,8335,A65,A75,3,A93,A101,4,A124,47,A143,A153,1,A173,1,A191,A201,2 +6286813,A12,48,A33,A49,6681,A65,A73,4,A93,A101,4,A124,38,A143,A153,1,A173,2,A192,A201,1 +7203909,A14,24,A33,A49,2375,A63,A73,4,A93,A101,2,A123,44,A143,A152,2,A173,2,A192,A201,1 
+8071605,A11,18,A32,A40,1216,A61,A72,4,A92,A101,3,A123,23,A143,A151,1,A173,1,A192,A201,2 +2576628,A11,45,A30,A49,11816,A61,A75,2,A93,A101,4,A123,29,A143,A151,2,A173,1,A191,A201,2 +3544441,A12,24,A32,A43,5084,A65,A75,2,A92,A101,4,A123,42,A143,A152,1,A173,1,A192,A201,1 +6494860,A13,15,A32,A43,2327,A61,A72,2,A92,A101,3,A121,25,A143,A152,1,A172,1,A191,A201,2 +5247218,A11,12,A30,A40,1082,A61,A73,4,A93,A101,4,A123,48,A141,A152,2,A173,1,A191,A201,2 +7024848,A14,12,A32,A43,886,A65,A73,4,A92,A101,2,A123,21,A143,A152,1,A173,1,A191,A201,1 +7182646,A14,4,A32,A42,601,A61,A72,1,A92,A101,3,A121,23,A143,A151,1,A172,2,A191,A201,1 +6839088,A11,24,A34,A41,2957,A61,A75,4,A93,A101,4,A122,63,A143,A152,2,A173,1,A192,A201,1 +9873458,A14,24,A34,A43,2611,A61,A75,4,A94,A102,3,A121,46,A143,A152,2,A173,1,A191,A201,1 +1896222,A11,36,A32,A42,5179,A61,A74,4,A93,A101,2,A122,29,A143,A152,1,A173,1,A191,A201,2 +3865701,A14,21,A33,A41,2993,A61,A73,3,A93,A101,2,A121,28,A142,A152,2,A172,1,A191,A201,1 +1157229,A14,18,A32,A45,1943,A61,A72,4,A92,A101,4,A121,23,A143,A152,1,A173,1,A191,A201,2 +3448928,A14,24,A31,A49,1559,A61,A74,4,A93,A101,4,A123,50,A141,A152,1,A173,1,A192,A201,1 +4318645,A14,18,A32,A42,3422,A61,A75,4,A93,A101,4,A122,47,A141,A152,3,A173,2,A192,A201,1 +7624820,A12,21,A32,A42,3976,A65,A74,2,A93,A101,3,A123,35,A143,A152,1,A173,1,A192,A201,1 +8556957,A14,18,A32,A40,6761,A65,A73,2,A93,A101,4,A123,68,A143,A151,2,A173,1,A191,A201,2 +9294026,A14,24,A32,A40,1249,A61,A72,4,A94,A101,2,A121,28,A143,A152,1,A173,1,A191,A201,1 +8814062,A11,9,A32,A43,1364,A61,A74,3,A93,A101,4,A121,59,A143,A152,1,A173,1,A191,A201,1 +7578622,A11,12,A32,A43,709,A61,A75,4,A93,A101,4,A121,57,A142,A152,1,A172,1,A191,A201,2 +1129973,A11,20,A34,A40,2235,A61,A73,4,A94,A103,2,A122,33,A141,A151,2,A173,1,A191,A202,2 +3785461,A14,24,A34,A41,4042,A65,A74,3,A93,A101,4,A122,43,A143,A152,2,A173,1,A192,A201,1 +6563566,A14,15,A34,A43,1471,A61,A73,4,A93,A101,4,A124,35,A143,A153,2,A173,1,A192,A201,1 
+8205907,A11,18,A31,A40,1442,A61,A74,4,A93,A101,4,A124,32,A143,A153,2,A172,2,A191,A201,2 +4983715,A14,36,A33,A40,10875,A61,A75,2,A93,A101,2,A123,45,A143,A152,2,A173,2,A192,A201,1 +1607673,A14,24,A32,A40,1474,A62,A72,4,A94,A101,3,A121,33,A143,A152,1,A173,1,A192,A201,1 +7984360,A14,10,A32,A48,894,A65,A74,4,A92,A101,3,A122,40,A143,A152,1,A173,1,A192,A201,1 +8489771,A14,15,A34,A42,3343,A61,A73,4,A93,A101,2,A124,28,A143,A153,1,A173,1,A192,A201,1 +1769803,A11,15,A32,A40,3959,A61,A73,3,A92,A101,2,A122,29,A143,A152,1,A173,1,A192,A201,2 +8600642,A14,9,A32,A40,3577,A62,A73,1,A93,A103,2,A121,26,A143,A151,1,A173,2,A191,A202,1 +2919320,A14,24,A34,A41,5804,A64,A73,4,A93,A101,2,A121,27,A143,A152,2,A173,1,A191,A201,1 +2941810,A14,18,A33,A49,2169,A61,A73,4,A94,A101,2,A123,28,A143,A152,1,A173,1,A192,A201,2 +5203823,A11,24,A32,A43,2439,A61,A72,4,A92,A101,4,A121,35,A143,A152,1,A173,1,A192,A201,2 +4987253,A14,27,A34,A42,4526,A64,A72,4,A93,A101,2,A121,32,A142,A152,2,A172,2,A192,A201,1 +6273169,A14,10,A32,A42,2210,A61,A73,2,A93,A101,2,A121,25,A141,A151,1,A172,1,A191,A201,2 +6364124,A14,15,A32,A42,2221,A63,A73,2,A92,A101,4,A123,20,A143,A151,1,A173,1,A191,A201,1 +1218238,A11,18,A32,A43,2389,A61,A72,4,A92,A101,1,A123,27,A142,A152,1,A173,1,A191,A201,1 +8334248,A14,12,A34,A42,3331,A61,A75,2,A93,A101,4,A122,42,A142,A152,1,A173,1,A191,A201,1 +7214897,A14,36,A32,A49,7409,A65,A75,3,A93,A101,2,A122,37,A143,A152,2,A173,1,A191,A201,1 +7299602,A11,12,A32,A42,652,A61,A75,4,A92,A101,4,A122,24,A143,A151,1,A173,1,A191,A201,1 +8110158,A14,36,A33,A42,7678,A63,A74,2,A92,A101,4,A123,40,A143,A152,2,A173,1,A192,A201,1 +3654815,A13,6,A34,A40,1343,A61,A75,1,A93,A101,4,A121,46,A143,A152,2,A173,2,A191,A202,1 +6572384,A11,24,A34,A49,1382,A62,A74,4,A93,A101,1,A121,26,A143,A152,2,A173,1,A192,A201,1 +4718363,A14,15,A32,A44,874,A65,A72,4,A92,A101,1,A121,24,A143,A152,1,A173,1,A191,A201,1 +4110964,A11,12,A32,A42,3590,A61,A73,2,A93,A102,2,A122,29,A143,A152,1,A172,2,A191,A201,1 
+5456397,A12,11,A34,A40,1322,A64,A73,4,A92,A101,4,A123,40,A143,A152,2,A173,1,A191,A201,1 +7560842,A11,18,A31,A43,1940,A61,A72,3,A93,A102,4,A124,36,A141,A153,1,A174,1,A192,A201,1 +2867079,A14,36,A32,A43,3595,A61,A75,4,A93,A101,2,A123,28,A143,A152,1,A173,1,A191,A201,1 +9165349,A11,9,A32,A40,1422,A61,A72,3,A93,A101,2,A124,27,A143,A153,1,A174,1,A192,A201,2 +5074170,A14,30,A34,A43,6742,A65,A74,2,A93,A101,3,A122,36,A143,A152,2,A173,1,A191,A201,1 +1674264,A14,24,A32,A41,7814,A61,A74,3,A93,A101,3,A123,38,A143,A152,1,A174,1,A192,A201,1 +7549449,A14,24,A32,A41,9277,A65,A73,2,A91,A101,4,A124,48,A143,A153,1,A173,1,A192,A201,1 +9453883,A12,30,A34,A40,2181,A65,A75,4,A93,A101,4,A121,36,A143,A152,2,A173,1,A191,A201,1 +7539149,A14,18,A34,A43,1098,A61,A71,4,A92,A101,4,A123,65,A143,A152,2,A171,1,A191,A201,1 +2758767,A12,24,A32,A42,4057,A61,A74,3,A91,A101,3,A123,43,A143,A152,1,A173,1,A192,A201,2 +5771164,A11,12,A32,A46,795,A61,A72,4,A92,A101,4,A122,53,A143,A152,1,A173,1,A191,A201,2 +7250365,A12,24,A34,A49,2825,A65,A74,4,A93,A101,3,A124,34,A143,A152,2,A173,2,A192,A201,1 +6463462,A12,48,A32,A49,15672,A61,A73,2,A93,A101,2,A123,23,A143,A152,1,A173,1,A192,A201,2 +3678150,A14,36,A34,A40,6614,A61,A75,4,A93,A101,4,A123,34,A143,A152,2,A174,1,A192,A201,1 +7159187,A14,28,A31,A41,7824,A65,A72,3,A93,A103,4,A121,40,A141,A151,2,A173,2,A192,A201,1 +2996743,A11,27,A34,A49,2442,A61,A75,4,A93,A101,4,A123,43,A142,A152,4,A174,2,A192,A201,1 +3755936,A14,15,A34,A43,1829,A61,A75,4,A93,A101,4,A123,46,A143,A152,2,A173,1,A192,A201,1 +9858169,A11,12,A34,A40,2171,A61,A73,4,A93,A101,4,A122,38,A141,A152,2,A172,1,A191,A202,1 +6011271,A12,36,A34,A41,5800,A61,A73,3,A93,A101,4,A123,34,A143,A152,2,A173,1,A192,A201,1 +7464596,A12,24,A32,A42,4351,A65,A73,1,A92,A101,4,A122,48,A143,A152,1,A172,1,A192,A201,1 +9907330,A11,6,A34,A42,1872,A61,A71,4,A93,A101,4,A124,36,A143,A153,3,A174,1,A192,A201,1 +2676319,A14,18,A34,A43,1169,A65,A73,4,A93,A101,3,A122,29,A143,A152,2,A173,1,A192,A201,1 
+9543202,A14,36,A33,A41,8947,A65,A74,3,A93,A101,2,A123,31,A142,A152,1,A174,2,A192,A201,1 +4682068,A11,21,A32,A43,2606,A61,A72,4,A92,A101,4,A122,28,A143,A151,1,A174,1,A192,A201,1 +5609033,A14,12,A34,A42,1592,A64,A74,3,A92,A101,2,A122,35,A143,A152,1,A173,1,A191,A202,1 +1863246,A14,15,A32,A42,2186,A65,A74,1,A92,A101,4,A121,33,A141,A151,1,A172,1,A191,A201,1 +3453367,A11,18,A32,A42,4153,A61,A73,2,A93,A102,3,A123,42,A143,A152,1,A173,1,A191,A201,2 +7549449,A14,24,A32,A41,9277,A65,A73,2,A91,A101,4,A124,48,A143,A153,1,A173,1,A192,A201,1 +6328203,A14,15,A32,A41,4657,A61,A73,3,A93,A101,2,A123,30,A143,A152,1,A173,1,A192,A201,1 +5078589,A11,16,A34,A40,2625,A61,A75,2,A93,A103,4,A122,43,A141,A151,1,A173,1,A192,A201,2 +7409626,A14,20,A34,A40,3485,A65,A72,2,A91,A101,4,A121,44,A143,A152,2,A173,1,A192,A201,1 +3861438,A14,36,A34,A41,10477,A65,A75,2,A93,A101,4,A124,42,A143,A153,2,A173,1,A191,A201,1 +6653362,A14,15,A32,A43,1386,A65,A73,4,A94,A101,2,A121,40,A143,A151,1,A173,1,A192,A201,1 +2279281,A14,24,A32,A43,1278,A61,A75,4,A93,A101,1,A121,36,A143,A152,1,A174,1,A192,A201,1 +9374296,A11,12,A32,A43,1107,A61,A73,2,A93,A101,2,A121,20,A143,A151,1,A174,2,A192,A201,1 +2769534,A11,21,A32,A40,3763,A65,A74,2,A93,A102,2,A121,24,A143,A152,1,A172,1,A191,A202,1 +2117451,A12,36,A32,A46,3711,A65,A73,2,A94,A101,2,A123,27,A143,A152,1,A173,1,A191,A201,1 +5910784,A14,15,A33,A41,3594,A61,A72,1,A92,A101,2,A122,46,A143,A152,2,A172,1,A191,A201,1 +6446222,A12,9,A32,A40,3195,A65,A73,1,A92,A101,2,A121,33,A143,A152,1,A172,1,A191,A201,1 +1562036,A14,36,A33,A43,4454,A61,A73,4,A92,A101,4,A121,34,A143,A152,2,A173,1,A191,A201,1 +4535110,A12,24,A34,A42,4736,A61,A72,2,A92,A101,4,A123,25,A141,A152,1,A172,1,A191,A201,2 +4788744,A12,30,A32,A43,2991,A65,A75,2,A92,A101,4,A123,25,A143,A152,1,A173,1,A191,A201,1 +6790659,A14,11,A32,A49,2142,A64,A75,1,A91,A101,2,A121,28,A143,A152,1,A173,1,A192,A201,1 +6687237,A11,24,A31,A49,3161,A61,A73,4,A93,A101,2,A122,31,A143,A151,1,A173,1,A192,A201,2 
+3700907,A12,48,A30,A410,18424,A61,A73,1,A92,A101,2,A122,32,A141,A152,1,A174,1,A192,A202,2 +2284183,A14,10,A32,A41,2848,A62,A73,1,A93,A102,2,A121,32,A143,A152,1,A173,2,A191,A201,1 +9968219,A11,6,A32,A40,14896,A61,A75,1,A93,A101,4,A124,68,A141,A152,1,A174,1,A192,A201,2 +6987335,A11,24,A32,A42,2359,A62,A71,1,A91,A101,1,A122,33,A143,A152,1,A173,1,A191,A201,2 +8752925,A11,24,A32,A42,3345,A61,A75,4,A93,A101,2,A122,39,A143,A151,1,A174,1,A192,A201,2 +5367701,A14,18,A34,A42,1817,A61,A73,4,A92,A101,2,A124,28,A143,A152,2,A173,1,A191,A201,1 +5393100,A14,48,A33,A43,12749,A63,A74,4,A93,A101,1,A123,37,A143,A152,1,A174,1,A192,A201,1 +4459246,A11,9,A32,A43,1366,A61,A72,3,A92,A101,4,A122,22,A143,A151,1,A173,1,A191,A201,2 +1342448,A12,12,A32,A40,2002,A61,A74,3,A93,A101,4,A122,30,A143,A151,1,A173,2,A192,A201,1 +1646042,A11,24,A31,A42,6872,A61,A72,2,A91,A101,1,A122,55,A141,A152,1,A173,1,A192,A201,2 +8308521,A11,12,A31,A40,697,A61,A72,4,A93,A101,2,A123,46,A141,A152,2,A173,1,A192,A201,2 +6172317,A11,18,A34,A42,1049,A61,A72,4,A92,A101,4,A122,21,A143,A151,1,A173,1,A191,A201,1 +5333717,A11,48,A32,A41,10297,A61,A74,4,A93,A101,4,A124,39,A142,A153,3,A173,2,A192,A201,2 +3861659,A14,30,A32,A43,1867,A65,A75,4,A93,A101,4,A123,58,A143,A152,1,A173,1,A192,A201,1 +6910543,A11,12,A33,A40,1344,A61,A73,4,A93,A101,2,A121,43,A143,A152,2,A172,2,A191,A201,1 +1061602,A11,24,A32,A42,1747,A61,A72,4,A93,A102,1,A122,24,A143,A152,1,A172,1,A191,A202,1 +4970330,A12,9,A32,A43,1670,A61,A72,4,A92,A101,2,A123,22,A143,A152,1,A173,1,A192,A201,2 +9608683,A14,9,A34,A40,1224,A61,A73,3,A93,A101,1,A121,30,A143,A152,2,A173,1,A191,A201,1 +8847915,A14,12,A34,A43,522,A63,A75,4,A93,A101,4,A122,42,A143,A152,2,A173,2,A192,A201,1 +4860879,A11,12,A32,A43,1498,A61,A73,4,A92,A101,1,A123,23,A141,A152,1,A173,1,A191,A201,1 +1242372,A12,30,A33,A43,1919,A62,A72,4,A93,A101,3,A124,30,A142,A152,2,A174,1,A191,A201,2 +5171329,A13,9,A32,A43,745,A61,A73,3,A92,A101,2,A121,28,A143,A152,1,A172,1,A191,A201,2 
+8086940,A12,6,A32,A43,2063,A61,A72,4,A94,A101,3,A123,30,A143,A151,1,A174,1,A192,A201,1 +5621508,A12,60,A32,A46,6288,A61,A73,4,A93,A101,4,A124,42,A143,A153,1,A173,1,A191,A201,2 +5701557,A14,24,A34,A41,6842,A65,A73,2,A93,A101,4,A122,46,A143,A152,2,A174,2,A192,A201,1 +1490791,A14,12,A32,A40,3527,A65,A72,2,A93,A101,3,A122,45,A143,A152,1,A174,2,A192,A201,1 +7984042,A14,10,A32,A40,1546,A61,A73,3,A93,A101,2,A121,31,A143,A152,1,A172,2,A191,A202,1 +4745533,A14,24,A32,A42,929,A65,A74,4,A93,A101,2,A123,31,A142,A152,1,A173,1,A192,A201,1 +1360133,A14,4,A34,A40,1455,A61,A74,2,A93,A101,1,A121,42,A143,A152,3,A172,2,A191,A201,1 +9152373,A11,15,A32,A42,1845,A61,A72,4,A92,A103,1,A122,46,A143,A151,1,A173,1,A191,A201,1 +8743438,A12,48,A30,A40,8358,A63,A72,1,A92,A101,1,A123,30,A143,A152,2,A173,1,A191,A201,1 +4419761,A11,24,A31,A42,3349,A63,A72,4,A93,A101,4,A124,30,A143,A153,1,A173,2,A192,A201,2 +5135310,A14,12,A32,A40,2859,A65,A71,4,A93,A101,4,A124,38,A143,A152,1,A174,1,A192,A201,1 +6359252,A14,18,A32,A42,1533,A61,A72,4,A94,A102,1,A122,43,A143,A152,1,A172,2,A191,A201,2 +2579500,A14,24,A32,A43,3621,A62,A75,2,A93,A101,4,A123,31,A143,A152,2,A173,1,A191,A201,2 +5856859,A12,18,A34,A49,3590,A61,A71,3,A94,A101,3,A123,40,A143,A152,3,A171,2,A192,A201,1 +4531283,A11,36,A33,A49,2145,A61,A74,2,A93,A101,1,A123,24,A143,A152,2,A173,1,A192,A201,2 +5825152,A12,24,A32,A41,4113,A63,A72,3,A92,A101,4,A123,28,A143,A151,1,A173,1,A191,A201,2 +7225727,A14,36,A32,A42,10974,A61,A71,4,A92,A101,2,A123,26,A143,A152,2,A174,1,A192,A201,2 +7714243,A11,12,A32,A40,1893,A61,A73,4,A92,A103,4,A122,29,A143,A152,1,A173,1,A192,A201,1 +4837212,A11,24,A34,A43,1231,A64,A75,4,A92,A101,4,A122,57,A143,A151,2,A174,1,A192,A201,1 +6919828,A13,30,A34,A43,3656,A65,A75,4,A93,A101,4,A122,49,A142,A152,2,A172,1,A191,A201,1 +2791847,A12,9,A34,A43,1154,A61,A75,2,A93,A101,4,A121,37,A143,A152,3,A172,1,A191,A201,1 +2849845,A11,28,A32,A40,4006,A61,A73,3,A93,A101,2,A123,45,A143,A152,1,A172,1,A191,A201,2 
+3667502,A12,24,A32,A42,3069,A62,A75,4,A93,A101,4,A124,30,A143,A153,1,A173,1,A191,A201,1 +3226277,A14,6,A34,A43,1740,A61,A75,2,A94,A101,2,A121,30,A143,A151,2,A173,1,A191,A201,1 +6097436,A12,21,A33,A40,2353,A61,A73,1,A91,A101,4,A122,47,A143,A152,2,A173,1,A191,A201,1 +1545817,A14,15,A32,A40,3556,A65,A73,3,A93,A101,2,A124,29,A143,A152,1,A173,1,A191,A201,1 +7606017,A14,24,A32,A43,2397,A63,A75,3,A93,A101,2,A123,35,A141,A152,2,A173,1,A192,A201,2 +6567094,A12,6,A32,A45,454,A61,A72,3,A94,A101,1,A122,22,A143,A152,1,A172,1,A191,A201,1 +1721068,A12,30,A32,A43,1715,A65,A73,4,A92,A101,1,A123,26,A143,A152,1,A173,1,A191,A201,1 +5114820,A12,27,A34,A43,2520,A63,A73,4,A93,A101,2,A122,23,A143,A152,2,A172,1,A191,A201,2 +7437809,A14,15,A32,A43,3568,A61,A75,4,A92,A101,2,A123,54,A141,A151,1,A174,1,A192,A201,1 +9159315,A14,42,A32,A43,7166,A65,A74,2,A94,A101,4,A122,29,A143,A151,1,A173,1,A192,A201,1 +5764003,A11,11,A34,A40,3939,A61,A73,1,A93,A101,2,A121,40,A143,A152,2,A172,2,A191,A201,1 +1419235,A12,15,A32,A45,1514,A62,A73,4,A93,A103,2,A121,22,A143,A152,1,A173,1,A191,A201,1 +5842641,A14,24,A32,A40,7393,A61,A73,1,A93,A101,4,A122,43,A143,A152,1,A172,2,A191,A201,1 +1259519,A11,24,A31,A40,1193,A61,A71,1,A92,A102,4,A124,29,A143,A151,2,A171,1,A191,A201,2 +3793320,A11,60,A32,A49,7297,A61,A75,4,A93,A102,4,A124,36,A143,A151,1,A173,1,A191,A201,2 +7276981,A14,30,A34,A43,2831,A61,A73,4,A92,A101,2,A123,33,A143,A152,1,A173,1,A192,A201,1 +3160914,A13,24,A32,A43,1258,A63,A73,3,A92,A101,3,A123,57,A143,A152,1,A172,1,A191,A201,1 +4116332,A12,6,A32,A43,753,A61,A73,2,A92,A103,3,A121,64,A143,A152,1,A173,1,A191,A201,1 +2401016,A12,18,A33,A49,2427,A65,A75,4,A93,A101,2,A122,42,A143,A152,2,A173,1,A191,A201,1 +6097153,A14,24,A33,A40,2538,A61,A75,4,A93,A101,4,A123,47,A143,A152,2,A172,2,A191,A201,2 +3797948,A12,15,A31,A40,1264,A62,A73,2,A94,A101,2,A122,25,A143,A151,1,A173,1,A191,A201,2 +6006685,A12,30,A34,A42,8386,A61,A74,2,A93,A101,2,A122,49,A143,A152,1,A173,1,A191,A201,2 
+6720199,A14,48,A32,A49,4844,A61,A71,3,A93,A101,2,A123,33,A141,A151,1,A174,1,A192,A201,2 +2846538,A13,21,A32,A40,2923,A62,A73,1,A92,A101,1,A123,28,A141,A152,1,A174,1,A192,A201,1 +1852011,A11,36,A32,A41,8229,A61,A73,2,A93,A101,2,A122,26,A143,A152,1,A173,2,A191,A201,2 +6947593,A14,24,A34,A42,2028,A61,A74,2,A93,A101,2,A122,30,A143,A152,2,A172,1,A191,A201,1 +5518158,A11,15,A34,A42,1433,A61,A73,4,A92,A101,3,A122,25,A143,A151,2,A173,1,A191,A201,1 +8456653,A13,42,A30,A49,6289,A61,A72,2,A91,A101,1,A122,33,A143,A152,2,A173,1,A191,A201,1 +7323240,A14,13,A32,A43,1409,A62,A71,2,A92,A101,4,A121,64,A143,A152,1,A173,1,A191,A201,1 +5716073,A11,24,A32,A41,6579,A61,A71,4,A93,A101,2,A124,29,A143,A153,1,A174,1,A192,A201,1 +4639718,A12,24,A34,A43,1743,A61,A75,4,A93,A101,2,A122,48,A143,A152,2,A172,1,A191,A201,1 +2797680,A14,12,A34,A46,3565,A65,A72,2,A93,A101,1,A122,37,A143,A152,2,A172,2,A191,A201,1 +3956429,A14,15,A31,A43,1569,A62,A75,4,A93,A101,4,A123,34,A141,A152,1,A172,2,A191,A201,1 +3257050,A11,18,A32,A43,1936,A65,A74,2,A94,A101,4,A123,23,A143,A151,2,A172,1,A191,A201,1 +3566008,A11,36,A32,A42,3959,A61,A71,4,A93,A101,3,A122,30,A143,A152,1,A174,1,A192,A201,1 +5331918,A14,12,A32,A40,2390,A65,A75,4,A93,A101,3,A123,50,A143,A152,1,A173,1,A192,A201,1 +9671059,A14,12,A32,A42,1736,A61,A74,3,A92,A101,4,A121,31,A143,A152,1,A172,1,A191,A201,1 +2180183,A11,30,A32,A41,3857,A61,A73,4,A91,A101,4,A122,40,A143,A152,1,A174,1,A192,A201,1 +3130615,A14,12,A32,A43,804,A61,A75,4,A93,A101,4,A123,38,A143,A152,1,A173,1,A191,A201,1 +6267789,A11,45,A32,A43,1845,A61,A73,4,A93,A101,4,A124,23,A143,A153,1,A173,1,A192,A201,2 +6959896,A12,45,A34,A41,4576,A62,A71,3,A93,A101,4,A123,27,A143,A152,1,A173,1,A191,A201,1 diff --git a/Module2/VisualizingDataForClassification.ipynb b/Module2/VisualizingDataForClassification.ipynb new file mode 100644 index 0000000..80bd3b8 --- /dev/null +++ b/Module2/VisualizingDataForClassification.ipynb @@ -0,0 +1,391 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + 
"source": [ + "# Visualizing Data for Classification\n", + "\n", + "In the previous lab, you explored the automotive price dataset to understand the relationships for a regression problem. In this lab you will explore the German bank credit dataset to understand the relationships for a **classification** problem. The difference is that in classification problems the label is a categorical variable. \n", + "\n", + "In other labs you will use what you learn through visualization to create a solution that predicts the customers with bad credit. For now, the focus of this lab is on visually exploring the data to determine which features may be useful in predicting customers' bad credit.\n", + "\n", + "Visualization for classification problems shares much in common with visualization for regression problems. Collinear features should be identified so they can be eliminated or otherwise dealt with. However, for classification problems you are looking for features that help **separate the label categories**. Separation is achieved when there are distinctive feature values for each label category. Good separation results in a low classification error rate." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load and prepare the data set\n", + "\n", + "As a first step you must load the dataset. \n", + "\n", + "Execute the code in the cell below to load the packages required for the rest of this notebook." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below loads the dataset and assigns human-readable names to the columns. The shape and head of the data frame are then printed. 
Execute this code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit = pd.read_csv('German_Credit.csv', header=None)\n", + "credit.columns = ['customer_id',\n", + " 'checking_account_status', 'loan_duration_mo', 'credit_history', \n", + " 'purpose', 'loan_amount', 'savings_account_balance', \n", + " 'time_employed_yrs', 'payment_pcnt_income','gender_status', \n", + " 'other_signators', 'time_in_residence', 'property', 'age_yrs',\n", + " 'other_credit_outstanding', 'home_ownership', 'number_loans', \n", + " 'job_category', 'dependents', 'telephone', 'foreign_worker', \n", + " 'bad_credit']\n", + "print(credit.shape)\n", + "credit.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are 1011 rows and 22 columns in the dataset. The first column is customer_id, which is an identifier. We will drop this column, since it is not a feature." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit.drop(['customer_id'], axis=1, inplace=True)\n", + "print(credit.shape)\n", + "credit.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, there are 21 columns left. Of the 21 columns, there are 20 features plus a label column. These features represent information a bank might have on its customers. There are both numeric and categorical features. However, the categorical features are coded in a way that makes them hard to understand. Further, the label is coded as $\\{ 1,2 \\}$, which is a bit awkward. \n", + "\n", + "The code in the cell below uses a list of dictionaries to recode the categorical features with human-readable text. The final dictionary in the list recodes good and bad credit as a binary variable, $\\{ 0,1 \\}$. The `for` loop iterates over the columns and maps codes to human-readable category names. 
Having human-readable coding of data greatly improves people's ability to understand the relationships in the data.\n", + "\n", + "Execute this code and examine the result: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "code_list = [['checking_account_status', \n", + " {'A11' : '< 0 DM', \n", + " 'A12' : '0 - 200 DM', \n", + " 'A13' : '> 200 DM or salary assignment', \n", + " 'A14' : 'none'}],\n", + " ['credit_history',\n", + " {'A30' : 'no credit - paid', \n", + " 'A31' : 'all loans at bank paid', \n", + " 'A32' : 'current loans paid', \n", + " 'A33' : 'past payment delays', \n", + " 'A34' : 'critical account - other non-bank loans'}],\n", + " ['purpose',\n", + " {'A40' : 'car (new)', \n", + " 'A41' : 'car (used)',\n", + " 'A42' : 'furniture/equipment',\n", + " 'A43' : 'radio/television', \n", + " 'A44' : 'domestic appliances', \n", + " 'A45' : 'repairs', \n", + " 'A46' : 'education', \n", + " 'A47' : 'vacation',\n", + " 'A48' : 'retraining',\n", + " 'A49' : 'business', \n", + " 'A410' : 'other' }],\n", + " ['savings_account_balance',\n", + " {'A61' : '< 100 DM', \n", + " 'A62' : '100 - 500 DM', \n", + " 'A63' : '500 - 1000 DM', \n", + " 'A64' : '>= 1000 DM',\n", + " 'A65' : 'unknown/none' }],\n", + " ['time_employed_yrs',\n", + " {'A71' : 'unemployed',\n", + " 'A72' : '< 1 year', \n", + " 'A73' : '1 - 4 years', \n", + " 'A74' : '4 - 7 years', \n", + " 'A75' : '>= 7 years'}],\n", + " ['gender_status',\n", + " {'A91' : 'male-divorced/separated', \n", + " 'A92' : 'female-divorced/separated/married',\n", + " 'A93' : 'male-single', \n", + " 'A94' : 'male-married/widowed', \n", + " 'A95' : 'female-single'}],\n", + " ['other_signators',\n", + " {'A101' : 'none', \n", + " 'A102' : 'co-applicant', \n", + " 'A103' : 'guarantor'}],\n", + " ['property',\n", + " {'A121' : 'real estate',\n", + " 'A122' : 'building society savings/life insurance', \n", + " 'A123' : 'car or other',\n", + 
" 'A124' : 'unknown-none' }],\n", + " ['other_credit_outstanding',\n", + " {'A141' : 'bank', \n", + " 'A142' : 'stores', \n", + " 'A143' : 'none'}],\n", + " ['home_ownership',\n", + " {'A151' : 'rent', \n", + " 'A152' : 'own', \n", + " 'A153' : 'for free'}],\n", + " ['job_category',\n", + " {'A171' : 'unemployed-unskilled-non-resident', \n", + " 'A172' : 'unskilled-resident', \n", + " 'A173' : 'skilled',\n", + " 'A174' : 'highly skilled'}],\n", + " ['telephone', \n", + " {'A191' : 'none', \n", + " 'A192' : 'yes'}],\n", + " ['foreign_worker',\n", + " {'A201' : 'yes', \n", + " 'A202' : 'no'}],\n", + " ['bad_credit',\n", + " {2 : 1,\n", + " 1 : 0}]]\n", + "\n", + "for col_dic in code_list:\n", + " col = col_dic[0]\n", + " dic = col_dic[1]\n", + " credit[col] = [dic[x] for x in credit[col]]\n", + " \n", + "credit.head() " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The categorical features now have meaningful coding. Additionally, the label is now coded as a binary variable. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Examine classes and class imbalance\n", + "\n", + "In this case, the label has significant **class imbalance**. Class imbalance means that there are unequal numbers of cases for the categories of the label. Class imbalance can seriously bias the training of classifier algorithms. In many cases, the imbalance leads to a higher error rate for the minority class. Most real-world classification problems have class imbalance, sometimes severe class imbalance, so it is important to test for this before training any model. \n", + "\n", + "Fortunately, it is easy to test for class imbalance using a frequency table. 
Execute the code in the cell below to display a frequency table of the classes: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit_counts = credit['bad_credit'].value_counts()\n", + "print(credit_counts)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that only 30% of the cases have bad credit. This is not surprising, since a bank would typically retain customers with good credit. While this is not a case of severe imbalance, it is enough to bias the training of any model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualize class separation by numeric features\n", + "\n", + "As stated previously, the primary goal of visualization for classification problems is to understand which features are useful for class separation. In this section, you will start by visualizing the separation quality of numeric features. \n", + "\n", + "Execute the code, examine the results, and answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_box(credit, cols, col_x = 'bad_credit'):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.boxplot(col_x, col, data=credit)\n", + " plt.xlabel(col_x) # Set text for the x axis\n", + " plt.ylabel(col)# Set text for y axis\n", + " plt.show()\n", + "\n", + "num_cols = ['loan_duration_mo', 'loan_amount', 'payment_pcnt_income',\n", + " 'age_yrs', 'number_loans', 'dependents']\n", + "plot_box(credit, num_cols)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "How can you interpret these results? Box plots are useful, since by their very construction you are forced to focus on the overlap (or not) of the quartiles of the distribution. 
In this case, the question is whether there are sufficient differences in the quartiles for the feature to be useful in separating the label classes. The following cases are displayed in the above plots:\n", + "1. For loan_duration_mo, loan_amount, and payment as a percent of income (payment_pcnt_income), there is useful separation between good and bad credit customers. As one might expect, bad credit customers have longer loan duration on larger loans and with payments being a greater percentage of their income. \n", + "2. On the other hand, age in years, number_loans and dependents do not seem to matter. In the latter two cases, this situation seems to result from the median value being zero. There are just not enough non-zero cases to make these useful features. \n", + "\n", + "As an alternative to box plots, you can use violin plots to examine the separation of label cases by numeric features. Execute the code in the cell below and examine the results:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_violin(credit, cols, col_x = 'bad_credit'):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.violinplot(col_x, col, data=credit)\n", + " plt.xlabel(col_x) # Set text for the x axis\n", + " plt.ylabel(col)# Set text for y axis\n", + " plt.show()\n", + "\n", + "plot_violin(credit, num_cols)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The interpretation of these plots is largely the same as the box plots. However, there is one detail worth noting. The differences between loan_duration_mo and loan_amount for good and bad credit customers are only for the more extreme values. It may be that these features are less useful than the box plots indicate. 
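The quartile comparison that the box and violin plots show visually can also be computed numerically. Below is a minimal, self-contained sketch using a synthetic stand-in for the data; in the lab you would substitute the `credit` data frame and a column such as `loan_duration_mo`:

```python
import pandas as pd

# Synthetic stand-in for the credit data: a numeric feature and a binary label
df = pd.DataFrame({
    'bad_credit':       [0, 0, 0, 0, 1, 1, 1, 1],
    'loan_duration_mo': [6, 12, 12, 18, 24, 36, 48, 60],
})

# Per-class quartiles; widely separated quartiles suggest a feature
# that is useful for separating the label classes
quartiles = df.groupby('bad_credit')['loan_duration_mo'].quantile([0.25, 0.5, 0.75])
print(quartiles)
# Here the medians are 12 months (class 0) vs. 42 months (class 1)
```

Non-overlapping quartile ranges in this numeric summary correspond to the non-overlapping boxes you look for in the plots.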
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualize class separation by categorical features\n", + "\n", + "Now you will turn to the problem of visualizing the ability of categorical features to separate classes of the label. Ideally, a categorical feature will have very different counts of the categories for each of the label values. A good way to visualize these relationships is with bar plots.\n", + "\n", + "The code in the cell below creates side by side plots of the categorical variables for each of the labels categories. \n", + "\n", + "Execute this code, examine the results, and answer **Question 2** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "import numpy as np\n", + "cat_cols = ['checking_account_status', 'credit_history', 'purpose', 'savings_account_balance', \n", + " 'time_employed_yrs', 'gender_status', 'other_signators', 'property', \n", + " 'other_credit_outstanding', 'home_ownership', 'job_category', 'telephone', \n", + " 'foreign_worker']\n", + "\n", + "credit['dummy'] = np.ones(shape = credit.shape[0])\n", + "for col in cat_cols:\n", + " print(col)\n", + " counts = credit[['dummy', 'bad_credit', col]].groupby(['bad_credit', col], as_index = False).count()\n", + " temp = counts[counts['bad_credit'] == 0][[col, 'dummy']]\n", + " _ = plt.figure(figsize = (10,4))\n", + " plt.subplot(1, 2, 1)\n", + " temp = counts[counts['bad_credit'] == 0][[col, 'dummy']]\n", + " plt.bar(temp[col], temp.dummy)\n", + " plt.xticks(rotation=90)\n", + " plt.title('Counts for ' + col + '\\n Bad credit')\n", + " plt.ylabel('count')\n", + " plt.subplot(1, 2, 2)\n", + " temp = counts[counts['bad_credit'] == 1][[col, 'dummy']]\n", + " plt.bar(temp[col], temp.dummy)\n", + " plt.xticks(rotation=90)\n", + " plt.title('Counts for ' + col + '\\n Good credit')\n", + " plt.ylabel('count')\n", + " plt.show()" + ] + }, + { + 
"cell_type": "markdown", + "metadata": {}, + "source": [ + "There is a lot of information in these plots. The key to interpreting these plots is comparing the proportion of the categories for each of the label values. If these proportions are distinctly different for each label category, the feature is likely to be useful in separating the label. \n", + "\n", + "There are several cases evident in these plots:\n", + "1. Some features such as checking_account_status and credit_history have significantly different distributions of categories between the label categories. \n", + "2. Other features such as gender_status and telephone show small differences, but these differences are unlikely to be significant. \n", + "3. Other features like other_signators, foreign_worker, home_ownership, and job_category have a dominant category with very few cases of other categories. These features will likely have very little power to separate the cases. \n", + "\n", + "Notice that only a few of these categorical features will be useful in separating the cases. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have performed exploration and visualization to understand the relationships in a classification dataset. Specifically, you have:\n", + "1. Examined the imbalance in the label cases using a frequency table. \n", + "2. Found numeric and categorical features that separate the cases using visualization."
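The proportion comparison used to read the bar plots above can also be computed directly with a normalized crosstab. A minimal sketch with synthetic data; in the lab, you would substitute columns such as `credit['checking_account_status']` and `credit['bad_credit']`:

```python
import pandas as pd

# Synthetic stand-in: a categorical feature and a binary label
df = pd.DataFrame({
    'status':     ['none', 'none', '< 0 DM', '< 0 DM', '< 0 DM', 'none'],
    'bad_credit': [0, 0, 1, 1, 0, 0],
})

# normalize='index' gives, within each label class, the fraction of
# cases in each category; distinctly different rows suggest a useful feature
props = pd.crosstab(df['bad_credit'], df['status'], normalize='index')
print(props)
```

Rows that differ sharply between the two label values correspond to the visually distinct bar-plot pairs described above.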
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module2/VisualizingDataForRegression.ipynb b/Module2/VisualizingDataForRegression.ipynb new file mode 100644 index 0000000..c43c21d --- /dev/null +++ b/Module2/VisualizingDataForRegression.ipynb @@ -0,0 +1,1015 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Visualizing Data for Regression\n", + "\n", + "In this lab, you will learn how to use Python to **visualize and explore data**. This process is also known as **exploratory data analysis**. \n", + "\n", + "Before creating analytical models, a data scientist must develop an understanding of the properties and relationships in a dataset. There are two goals for data exploration and visualization. The first is to understand the relationships between the data columns. The second is to identify features that may be useful for predicting labels in machine learning projects. Additionally, redundant, collinear features can be identified. Thus, visualization for data exploration is an essential data science skill.\n", + "\n", + "In this module you will explore two datasets. Specifically, in this lab, your first goal is to explore a dataset that includes information about automobile pricing. In other labs you will use what you learn through visualization to create a solution that predicts the price of an automobile based on its characteristics. 
This type of predictive modeling, in which you attempt to predict a real numeric value, is known as **regression**, and it will be discussed in more detail later in the course. For now, the focus of this lab is on visually exploring the data to determine which features may be useful in predicting automobile prices." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This lesson is divided into several parts. In each part you will learn how to use the visualization tools available in Python to explore complex data. Specifically you will learn:\n", + "\n", + "- **Summarizing and manipulating data**:\n", + " * How large is it?\n", + " * What columns are of interest?\n", + " * What are the characteristics of the data derived from summary statistics and counts?\n", + "- **Developing multiple views of complex data** using multiple chart types. Exploring complex data requires multiple views to understand the many relationships. It is impossible to develop a complete understanding from just a few plots.\n", + "- **Overview of Matplotlib, Pandas plotting and Seaborn**, which are commonly used Python plotting packages. \n", + "- **Overview of univariate plot types** is a review of creating these basic plots using three Python packages. These plot types allow you to study the distributional properties of the variables in your data set. \n", + "- **Overview of two dimensional plot types** is a review of creating basic plot types used to construct visualizations. These plots naturally display the relationship between two variables on the 2-d computer graphics display. \n", + "- **Using Aesthetics** is an overview of how to project additional plot dimensions using plot aesthetics. Using aesthetics provides a method for projecting additional dimensions onto the 2-d computer graphics display. \n", + "- **Facetted plotting**, also known as conditioned plotting or lattice plotting, introduces a powerful method for visualizing higher dimensional data. 
Arrays of plots of subsets of the data are arranged on the 2-d computer graphics display. \n", + "- **Using Matplotlib methods** to add attributes to plots, such as titles and axis labels. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Loading the dataset\n", + "\n", + "Before you can start visualization, you must load the dataset. The code in the cell below loads the data and performs some data cleaning. You will work through data cleaning and preparation methods in other labs. \n", + "\n", + "As a first step, execute the code in the cell below to import the Python packages you will need for the rest of this notebook. Notice the IPython \"magic\" command `%matplotlib inline`. The `%` tells the Python interpreter that this is the start of a magic command, a command which configures the execution environment. The `matplotlib inline` magic indicates that you want to display graphics inline within the notebook. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The function `clean_auto_data()` below prepares the dataset. Data preparation is explained in more detail in a subsequent module.\n", + "\n", + "For now, execute the code in the cell below to load and prepare the automotive price dataset."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices = pd.read_csv('Automobile price data _Raw_.csv')\n", + "\n", + "def clean_auto_data(auto_prices):\n", + " 'Function to clean the auto price data set loaded from a .csv file' \n", + " import pandas as pd\n", + " import numpy as np\n", + " \n", + " ## Recode names\n", + " ## fix column names so the '-' character becomes '_'\n", + " cols = auto_prices.columns\n", + " auto_prices.columns = [col.replace('-', '_') for col in cols]\n", + " \n", + " ## Treat missing values\n", + " ## Remove rows with missing values, accounting for missing values coded as '?'\n", + " cols = ['price', 'bore', 'stroke', \n", + " 'horsepower', 'peak_rpm']\n", + " for column in cols:\n", + " auto_prices.loc[auto_prices[column] == '?', column] = np.nan\n", + " auto_prices.dropna(axis = 0, inplace = True)\n", + "\n", + " ## Transform column data type\n", + " ## Convert some columns to numeric values\n", + " for column in cols:\n", + " auto_prices[column] = pd.to_numeric(auto_prices[column])\n", + "\n", + " return auto_prices\n", + "auto_prices = clean_auto_data(auto_prices)\n", + "\n", + "print(auto_prices.columns)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice the column names. Most of these names are human interpretable and give you an idea of the information in this dataset. The `price` column is the **label**, the variable you are trying to predict. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Exploring the data\n", + "\n", + "With the dataset loaded, you will now explore some basic properties using summary methods. 
\n", + "\n", + "First, you will examine the head (first few rows) of the Pandas data frame to gain an idea of the contents by executing the code in the cell below." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Scroll across and examine the contents of each column in this dataset. Some columns have **numeric values** and other columns contain **string variables**. Some of the numeric columns appear to contain **integer values** and others have **floating point numbers**. In machine learning we treat the string columns as **categorical variables**. \n", + "\n", + "To better understand the data types in this dataset, execute the code in the cell below to print the `dtypes` attribute of each column." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices.dtypes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results confirm the earlier observations. A column of type `object` contains a text string. \n", + "\n", + "Pandas provides a simple way to compute and display summary statistics for numeric columns using the `describe` method. Execute the code in the cell below and examine the results for each numeric column." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices.describe()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There is a lot of information here. For each numeric column the following is printed:\n", + "- The count of the cases in the column. In this case, all counts are the same. \n", + "- The mean and standard deviation of the values in the column. Notice that there is a wide range of mean and scale (standard deviation values) across these columns. 
\n", + "- The minimum and maximum of the values in the column. Again, the extreme range of these columns varies quite a lot. \n", + "- The quartiles are displayed, 25%, 50% or median value, and 75%. For many of these columns, such as curb_weight and the label value, price, there is a significant difference between the mean and the median values. When the median value is less than the mean, this indicates that the distribution is right-skewed, that is, with a tail stretching toward the right. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you will consider how you can understand the distributions of categorical variables. Using a single line of Pandas code allows you to compute and display a **frequency table** using the `value_counts` method. A frequency table shows the frequency of each unique category of a categorical variable. \n", + "\n", + "The code in the cell below prints a frequency table for each column in a list of categorical columns. Execute this code and examine the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def count_unique(auto_prices, cols):\n", + " for col in cols:\n", + " print('\\n' + 'For column ' + col)\n", + " print(auto_prices[col].value_counts())\n", + "\n", + "cat_cols = ['make', 'fuel_type', 'aspiration', 'num_of_doors', 'body_style', \n", + " 'drive_wheels', 'engine_location', 'engine_type', 'num_of_cylinders', \n", + " 'fuel_system']\n", + "count_unique(auto_prices, cat_cols)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are some basic facts you can derive from these frequency tables. \n", + "1. Some of these variables have a large number of categories. When performing machine learning with a limited size training dataset, having a large number of categories is problematic, since there will be few samples per category. For example, notice how many auto makes are represented. 
There is only 1 Mercury and 2 Isuzus. Thus, any statistical property for these categories will be poorly determined. \n", + "2. There are significant imbalances in the counts of some categories. You have already seen that there are significant differences in the counts of autos by make. As another example, there are only 3 autos with rear engines. Again, any statistical property of rear engine cars will be poorly determined.\n", + "3. Some categorical variables could reasonably be converted to numeric variables. For example, the number of cylinders is currently a categorical variable, but could be transformed to a numeric variable. \n", + "\n", + "***\n", + "**Note:** There are two other cases to consider with the transformations between numeric and categorical variables.\n", + "1. Some categorical variables indicate rank, for example large, medium and small. In these cases, it may be better to transform these values to numeric levels.\n", + "2. Just as it might be useful to transform a categorical variable to numeric, it may be advantageous to convert a numeric variable to a categorical variable. This is particularly the case if the numeric values are simply coding for a category with no particular meaning. \n", + "***" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualizing Automobile Data for Regression\n", + "Python supports the matplotlib library, which provides extensive graphical capabilities. Additionally, the Python Pandas library and the Seaborn library add higher-level graphics capabilities. Pandas and Seaborn abstract a lot of the low level details. As you will see, since these libraries are based on Matplotlib, you can always add needed details. These features make Python a useful language to create visualizations of your data when exploring relationships between the data features. Further, you can identify features that may be useful for predicting labels in machine learning projects."
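Before moving to the plots, the mean-versus-median skew check described with the summary statistics above can be computed directly. A minimal sketch with synthetic values; in the lab, you would substitute a column such as `auto_prices['price']`:

```python
import pandas as pd

# Synthetic right-skewed values: one large value forms a tail to the right
values = pd.Series([5000, 6000, 7000, 8000, 40000])

print(values.mean())    # 13200.0
print(values.median())  # 7000.0

# median < mean is the signature of a right-skewed distribution
assert values.median() < values.mean()
```

The single large value pulls the mean well above the median, which is exactly the pattern seen in `describe()` for columns like price and curb_weight.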
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Visualizing distributions\n", + "\n", + "With some basic understanding of the data set in mind, it is time to dig a bit deeper. In this section you will apply methods to explore the distributions of categorical and numeric data.\n", + "\n", + "### Bar charts\n", + "\n", + "As a first step, you will use **bar charts** to examine the frequency distributions of categorical variables. A bar chart displays frequencies of each category. In most cases, the categories should be **ordered by frequency**, ascending or descending. Ordering categories by frequency aids in viewer interpretation. \n", + "\n", + "The function in the cell below performs the following processing:\n", + "1. Iterates over the list of columns.\n", + "2. The figure and axes are defined using Matplotlib methods. \n", + "3. The counts or frequencies of the categories are computed.\n", + "4. The bar plot is created using the Pandas `plot.bar` method. Notice that the color argument is set to blue. The default of using **multiple colors for no reason whatsoever is distracting and does not add to interpretation**.\n", + "5. Annotations are added to the plot using Matplotlib methods.\n", + "\n", + "Since Pandas plotting is built on Matplotlib, it is always possible to add additional plot attributes using methods in this package.\n", + "\n", + "Execute this code and examine the results."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_bars(auto_prices, cols):\n", + " for col in cols:\n", + " fig = plt.figure(figsize=(6,6)) # define plot area\n", + " ax = fig.gca() # define axis \n", + " counts = auto_prices[col].value_counts() # find the counts for each unique category\n", + " counts.plot.bar(ax = ax, color = 'blue') # Use the plot.bar method on the counts data frame\n", + " ax.set_title('Number of autos by ' + col) # Give the plot a main title\n", + " ax.set_xlabel(col) # Set text for the x axis\n", + " ax.set_ylabel('Number of autos')# Set text for y axis\n", + " plt.show()\n", + "\n", + "plot_cols = ['make', 'body_style', 'num_of_cylinders']\n", + "plot_bars(auto_prices, plot_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These plots show the wide range of frequencies for the categorical variables plotted. This will be a problem with modeling, as there are so few members of some classes. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Histograms\n", + "\n", + "**Histograms** are related to bar plots. Whereas a bar plot shows the counts of unique categories, a histogram shows the **number of data values within a bin** for a **numeric variable**. The bins divide the values of the variable into equal segments. The vertical axis of the histogram shows the count of data values within each bin. \n", + "\n", + "The code below follows the same basic recipe used for the bar plot to create a histogram. In this case, the Pandas `plot.hist` method is used. \n", + "\n", + "Execute this code, examine the results, and answer **Question 1** on the course page."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_histogram(auto_prices, cols, bins = 10):\n", + " for col in cols:\n", + " fig = plt.figure(figsize=(6,6)) # define plot area\n", + " ax = fig.gca() # define axis \n", + " auto_prices[col].plot.hist(ax = ax, bins = bins) # Use the plot.hist method on subset of the data frame\n", + " ax.set_title('Histogram of ' + col) # Give the plot a main title\n", + " ax.set_xlabel(col) # Set text for the x axis\n", + " ax.set_ylabel('Number of autos')# Set text for y axis\n", + " plt.show()\n", + " \n", + "num_cols = ['curb_weight', 'engine_size', 'city_mpg', 'price'] \n", + "plot_histogram(auto_prices, num_cols)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some of these variables have distributions that are right-skewed, or skewed to the right side. This skewed distribution will affect the statistics of any machine learning model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Kernel density plots and introduction to Seaborn\n", + "\n", + "Up until now you have been working exclusively with the plotting methods in Pandas. Now, you will use the Seaborn package. Seaborn is a newer Python package which abstracts lower level matplotlib charts. Seaborn also implements some additional cutting-edge chart types.\n", + "\n", + "**Kernel density estimation** or **kde** plots are similar in concept to a histogram. A kernel density plot displays the values of a smoothed density curve of the data values. In other words, the kernel density plot is a smoothed version of a histogram.\n", + " \n", + "The code in the cell below creates a kernel density plot following the recipe used before. Using Seaborn adds the following to the recipe:\n", + "1. Set a style for the plot grid.\n", + "2. Define the plot type with `distplot` using the engine-size column as the argument. In this case, no histogram is plotted. 
A 'rug' showing the locations of the data points on the axis is displayed along the horizontal axis.\n", + "3. Once again, Matplotlib methods are used to add the annotations to the plot. Seaborn is built on Matplotlib, so it is always possible to mix Matplotlib methods. \n", + " \n", + "****\n", + "**Note:** Depending on your platform and versions of numpy, you may see a deprecation warning. You can safely ignore this warning. \n", + "****" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_density_hist(auto_prices, cols, bins = 10, hist = False):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.distplot(auto_prices[col], bins = bins, rug=True, hist = hist)\n", + " plt.title('Histogram of ' + col) # Give the plot a main title\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel('Number of autos')# Set text for y axis\n", + " plt.show()\n", + " \n", + "plot_density_hist(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The kde plots show the same skewness properties of the histogram. The rug shows a different view of the density of the data points on the axis. Some details are more evident in this view. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Combine histograms and kdes\n", + "\n", + "Combining a histogram and a kde can highlight different aspects of a distribution. This is easy to do with Seaborn, as the code below demonstrates. In this case, the number of bins for the histogram has been increased from 10 to 20. \n", + "\n", + "Execute this code, examine the results, and answer **Question 2** on the course page." 
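The smoothing that the kernel density estimate performs can be made concrete with a hand-rolled Gaussian KDE (a sketch on synthetic data; the notebook itself relies on Seaborn to do this, and the bandwidth value here is an arbitrary choice for illustration):

```python
import numpy as np

def gaussian_kde_1d(data, grid, bandwidth):
    # Average a Gaussian bump centered on each data point --
    # the smoothed analog of stacking histogram counts.
    data = np.asarray(data)[:, None]           # shape (n, 1)
    z = (grid[None, :] - data) / bandwidth     # shape (n, m)
    kernels = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
    return kernels.mean(axis=0) / bandwidth

data = [1.0, 1.2, 3.0]
grid = np.linspace(0, 4, 81)
density = gaussian_kde_1d(data, grid, bandwidth=0.5)

# The curve integrates to roughly 1, like any probability density
# (a little mass falls outside the plotted grid).
area = density.sum() * (grid[1] - grid[0])
print(round(area, 2))
```

The peak of the estimated curve sits where the data cluster, which is exactly what the smoothed hump in the Seaborn plot conveys.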
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "plot_density_hist(auto_prices, num_cols, bins = 20, hist = True) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This view highlights the fact that some features have multimodal distributions. This fact will have implications for the statistics of any machine learning model trained with these data." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Two dimensional plots\n", + "\n", + "Having used summary statistics and several one dimensional plot methods to explore data, you will continue this exploration using **two dimensional plots**. Two dimensional plots help you develop an understanding of the **relationship between two variables**. For machine learning, the relationship of greatest interest is between the **features** and the **label**. It can also be useful to examine the relationships between features to determine whether the features covary. Such a procedure can prove more reliable than simply computing correlation when the relationship is nonlinear. \n", + "\n", + "### Create Scatter Plots\n", + "\n", + "Scatter plots are widely used to examine the relationship between two variables. In this case, the plots created are of some features vs. the label, price of the auto. \n", + "\n", + "The code in the cell below follows the previously used recipe for using Pandas plotting methods, using the `plot.scatter` method. This method has two required arguments for the x and y axes. Execute this code and examine the results."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_scatter(auto_prices, cols, col_y = 'price'):\n", + " for col in cols:\n", + " fig = plt.figure(figsize=(7,6)) # define plot area\n", + " ax = fig.gca() # define axis \n", + " auto_prices.plot.scatter(x = col, y = col_y, ax = ax)\n", + " ax.set_title('Scatter plot of ' + col_y + ' vs. ' + col) # Give the plot a main title\n", + " ax.set_xlabel(col) # Set text for the x axis\n", + " ax.set_ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + "\n", + "num_cols = ['curb_weight', 'engine_size', 'horsepower', 'city_mpg']\n", + "plot_scatter(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These plots show a strong relationship between these features and the label. It is likely these features will be useful in predicting the price of autos. engine_size and horsepower have fairly linear relationships with price, whereas curb_weight and especially city_mpg do not. \n", + "\n", + "It seems likely that horsepower and engine_size are collinear. To test this hypothesis execute the code in the cell below and examine the result." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plot_scatter(auto_prices, ['horsepower'], 'engine_size') " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indeed these features do appear linearly dependent. Therefore, you will not want to use them in the same machine learning model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Deal with overplotting\n", + "\n", + "Examine the engine_size or city_mpg vs price above. Notice that for certain engine sizes and city_mpg ratings there are numerous autos in a narrow price range. Apparently, auto manufacturers target these characteristics. 
The result is that many points are plotted one over the other on the scatter plots, resulting in **over plotting**. Over plotting is a serious problem when scatter plots are applied to large datasets. Serious over plotting can render a plot meaningless or uninterpretable. \n", + "\n", + "Fortunately, there are several good ways to deal with over plotting:\n", + "1. Use **transparency** of the points to allow the viewer to see through points. With mild over plotting this approach can be quite effective.\n", + "2. **Contour plots** or **2d density plots** show the density of points, much as a topographic map shows elevation. Generating the contours has high computational complexity, making this method unsuitable for massive datasets.\n", + "3. **Hexbin plots** are the two-dimensional analog of a histogram. The density of the shading in the hexagonal cells indicates the density of points. Generating hexbins is computationally efficient and can be applied to massive datasets.\n", + "\n", + "The code in the cell below modifies the scatter plot function used previously to add a transparency argument. In statistical graphics, alpha, the inverse of transparency, is specified; alpha = 1.0 is opaque, alpha = 0.0 is perfectly transparent. The code in the cell below uses a low alpha of 0.2 (high transparency). Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_scatter_t(auto_prices, cols, col_y = 'price', alpha = 1.0):\n", + " for col in cols:\n", + " fig = plt.figure(figsize=(7,6)) # define plot area\n", + " ax = fig.gca() # define axis \n", + " auto_prices.plot.scatter(x = col, y = col_y, ax = ax, alpha = alpha)\n", + " ax.set_title('Scatter plot of ' + col_y + ' vs. 
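Why transparency reveals dense regions can be made concrete with a little arithmetic: when n points of opacity alpha are stacked, the combined opacity is 1 - (1 - alpha)^n, so denser clusters render visibly darker. A quick sketch:

```python
def stacked_opacity(alpha, n):
    """Combined opacity of n overlapping markers, each drawn with the given alpha."""
    return 1.0 - (1.0 - alpha) ** n

# With alpha = 0.2, a single point is faint, but a stack of ten
# is nearly opaque -- which is how sparse and dense regions become
# distinguishable on the scatter plot.
print(round(stacked_opacity(0.2, 1), 3))    # 0.2
print(round(stacked_opacity(0.2, 10), 3))   # 0.893
```

This also shows the limit of the technique: once enough points overlap, the stack saturates near full opacity and further density differences are lost, which is why contour and hexbin plots are needed for very large datasets.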
' + col) # Give the plot a main title\n", + " ax.set_xlabel(col) # Set text for the x axis\n", + " ax.set_ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + "\n", + "plot_scatter_t(auto_prices, num_cols, alpha = 0.2) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the high transparency it is now possible to see through points in dense areas to get a better view of the data. \n", + "\n", + "Using transparency for overplotting is useful, but limited. With a large number of points, you will need other methods. Using contour or 2d density plots is one such solution. The code in the cell below uses the `jointplot` function from Seaborn. This plot displays 1d KDE plots along with a contour plot showing 2d density. Execute this code and examine the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_desity_2d(auto_prices, cols, col_y = 'price', kind ='kde'):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.jointplot(col, col_y, data=auto_prices, kind=kind)\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + "\n", + "plot_desity_2d(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These density contour plots show quite a different view of the relationship between these features and the label. In particular, 2d multi-modal behavior is visible for curb_weight, horsepower and particularly city_mpg. Notice also that a correlation coefficient is displayed. \n", + "\n", + "The code in the cell below displays the 2d hexbin plots and 1d histograms for the same variables. Execute this code and examine the results."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plot_desity_2d(auto_prices, num_cols, kind = 'hex') " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The overall impression from the hexbin plots is approximately the same as for the contour plots. A bit more detail is visible since cells with as few as 1 point are displayed. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Relation between categorical and numeric variables\n", + "\n", + "You have created 2d plots of numeric variables. But what can you do if some of the features are categorical variables? There are two plot types specifically intended for this situation:\n", + "1. **Box plots** which highlight the quartiles of a distribution. Not surprisingly, the box plot contains a box. The range of the **inner two quartiles** is contained within the box. The length of the box shows the **interquartile range**. A line within the box shows the median. **Whiskers** extend up to 1.5 times the interquartile range, or to the most extreme data value, whichever is closer. Outliers beyond the whiskers are shown as symbols. \n", + "2. **Violin plots** which are a variation on the 1d KDE plot. Two back-to-back KDE curves are used to show the density estimate. \n", + "\n", + "Box plots or violin plots can be arranged side by side with data of the numerical variable grouped by the categories of the categorical variable. In this way each box or violin represents the values of the numeric variable for cases of one category of the categorical variable.\n", + "\n", + "Execute the code in the cell below to display box plots for the list of categorical variables, and examine the results. 
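The box-plot anatomy just described (quartiles, interquartile range, whiskers at 1.5 times the IQR) can be computed directly; a sketch on synthetic data:

```python
import numpy as np

values = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 30.0])

q1, median, q3 = np.percentile(values, [25, 50, 75])
iqr = q3 - q1                    # length of the box
upper_fence = q3 + 1.5 * iqr     # whiskers extend at most this far
lower_fence = q1 - 1.5 * iqr
outliers = values[(values < lower_fence) | (values > upper_fence)]

print(q1, median, q3, iqr)
print(outliers)                  # the value 30.0 would be drawn as a symbol
```

The whisker itself stops at the most extreme data value inside the fence, so the fences are limits, not drawn positions.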
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_box(auto_prices, cols, col_y = 'price'):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.boxplot(col, col_y, data=auto_prices)\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + " \n", + "cat_cols = ['fuel_type', 'aspiration', 'num_of_doors', 'body_style', \n", + " 'drive_wheels', 'engine_location', 'engine_type', 'num_of_cylinders']\n", + "plot_box(auto_prices, cat_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For each categorical variable, you can see that a box plot is created for each unique category. Notice that for some of these cases, there are some noticeable differences between the price of autos by category. For example, for fuel_type or aspiration there are noticeable differences. In other cases, such as num_of_doors, the differences do not appear significant. For num_of_cylinders there are significant differences, but there are two categories with only one case, which is problematic. \n", + "\n", + "The code in the cell below creates a similar display as above using violin plots. Execute the code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_violin(auto_prices, cols, col_y = 'price'):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.violinplot(col, col_y, data=auto_prices)\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + " \n", + "plot_violin(auto_prices, cat_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The interpretation of the violin plots is similar to the box plots. 
However, a bit more detail on distributions is visible. The area of each violin is the same on each plot display. Notice also that a type of box plot is visible inside each violin plot. \n", + "\n", + "As you examine the above plots notice that some relationships are more obvious. For example, it is quite clear that the number of doors does not affect the price of the car and the body style has marginal influence at best, whereas engine_location and num_of_cylinders do affect price. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Use aesthetics to project additional dimensions\n", + "\n", + "Up until now, you have worked with one or two variables on a single plot. But, with complex datasets it is useful to view multiple dimensions on each plot. The question is, how can this be done when graphics displays are limited to two dimensions? \n", + "\n", + "In this section, plot aesthetics are used to project additional dimensions. Some aesthetics are useful only for categorical variables, while others are useful for numeric variables. Keep in mind that not all plot aesthetics are equally effective. Tests of human perception have shown that people are very good at noticing small differences in position. This fact explains why scatter plots are so effective. In rough order of effectiveness these aesthetics are:\n", + "1. **Marker shape** is an effective indicator of variable category. It is critical to select shapes which are easily distinguished by the viewer. \n", + "2. **Marker size** shows values of a numeric variable. Be careful: in Matplotlib, size specifies the area of the marker, not the span across it. \n", + "3. **Marker color** is useful as an indicator of variable category. Color is the least effective of these three aesthetics in terms of human perception. Colors should be chosen to appear distinct. Additionally, keep in mind that many people, particularly men, are red-green color blind. 
\n", + "\n", + "Categorical aesthetics, such as marker shape and color, are only effective if the differences in markers are perceptable. Using too many shapes or color creates a situation where the viewer cannot tell the differences between the categories. Typically a limit of about five to seven categories should be observed. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Marker shape\n", + "\n", + "The code in the cell below uses marker shape to show the fuel type of the auto on a scatter plot. This is done by subsetting the data frame by each unique value of the categorical column. Shapes are defined in a list, which is referenced on each iteration of this inner loop. \n", + "\n", + "There is one tricky aspect to this code. The transparency parameter, alpha, must be passed to Matplotlib though a dictionary. The key is the argument, alpha, and the value is the argument value. \n", + "\n", + "Execute this code and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_scatter_shape(auto_prices, cols, shape_col = 'fuel_type', col_y = 'price', alpha = 0.2):\n", + " shapes = ['+', 'o', 's', 'x', '^'] # pick distinctive shapes\n", + " unique_cats = auto_prices[shape_col].unique()\n", + " for col in cols: # loop over the columns to plot\n", + " sns.set_style(\"whitegrid\")\n", + " for i, cat in enumerate(unique_cats): # loop over the unique categories\n", + " temp = auto_prices[auto_prices[shape_col] == cat]\n", + " sns.regplot(col, col_y, data=temp, marker = shapes[i], label = cat,\n", + " scatter_kws={\"alpha\":alpha}, fit_reg = False, color = 'blue')\n", + " plt.title('Scatter plot of ' + col_y + ' vs. 
' + col) # Give the plot a main title\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.legend()\n", + " plt.show()\n", + " \n", + "num_cols = ['curb_weight', 'engine_size', 'horsepower', 'city_mpg']\n", + "plot_scatter_shape(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "While there is some overlap, the differences between gas and diesel autos are now apparent in these plots. This new view of the data helps to confirm that fuel_type is a significant feature for determining auto price. \n", + "\n", + "Notice that rather distinctive shapes have been chosen for this display. In summary, by adding shape by category a third dimension is projected onto these plots." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Marker size\n", + "\n", + "The code in the cell below uses marker size to display curb weight. Since Matplotlib uses area to compute marker size, the values of curb weight are squared and then scaled by a convenient multiplier. Notice that since size, `s`, is a Matplotlib argument, it is passed in a dictionary along with alpha. Execute this code and examine the results. 
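The squaring of curb weight follows from Matplotlib's convention that the scatter size argument `s` is the marker's area (in points squared), not its width: doubling a marker's width requires quadrupling `s`. A quick check of the arithmetic:

```python
# Matplotlib's scatter size s is proportional to marker area,
# so width scales with the square root of s.
def s_for_width(width_pts):
    return width_pts ** 2

print(s_for_width(4))   # marker 4 points across -> s = 16
print(s_for_width(8))   # doubling the width quadruples s -> 64
```

This is also why scaling `s` by the square of curb weight makes the marker width, which is what the eye reads, grow linearly with curb weight.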
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_scatter_size(auto_prices, cols, shape_col = 'fuel_type', size_col = 'curb_weight',\n", + " size_mul = 0.000025, col_y = 'price', alpha = 0.2):\n", + " shapes = ['+', 'o', 's', 'x', '^'] # pick distinctive shapes\n", + " unique_cats = auto_prices[shape_col].unique()\n", + " for col in cols: # loop over the columns to plot\n", + " sns.set_style(\"whitegrid\")\n", + " for i, cat in enumerate(unique_cats): # loop over the unique categories\n", + " temp = auto_prices[auto_prices[shape_col] == cat]\n", + " sns.regplot(col, col_y, data=temp, marker = shapes[i], label = cat,\n", + " scatter_kws={\"alpha\":alpha, \"s\":size_mul*auto_prices[size_col]**2}, \n", + " fit_reg = False, color = 'blue')\n", + " plt.title('Scatter plot of ' + col_y + ' vs. ' + col) # Give the plot a main title\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.legend()\n", + " plt.show()\n", + "\n", + "num_cols = ['engine_size', 'horsepower', 'city_mpg']\n", + "plot_scatter_size(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are several interesting aspects of these plots, each of which is useful in predicting the price of autos For diesel autos the relationship between curb_weight, price, engine_size, horsepower and city_mpg is complex with no clear trend. On the other hand, it appears that high price, large engine, high horsepower, and low city_mpg cars have large gas engines. \n", + "\n", + "The above plots are now projecting four dimensions on the 2d plot surface. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Color\n", + "\n", + "As was already discussed, changes in color are hard for many people to perceive. None the less, color is useful for projecting a limited number of categories of a variable. 
Choice of distinctive color helps this situation. \n", + "\n", + "The code in the cell below uses color to display the aspiration category of the auto. The two inner loops create subsets of the data which are plotted with a specific shape and color for the markers. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_scatter_shape_size_col(auto_prices, cols, shape_col = 'fuel_type', size_col = 'curb_weight',\n", + " size_mul = 0.000025, color_col = 'aspiration', col_y = 'price', alpha = 0.2):\n", + " shapes = ['+', 'o', 's', 'x', '^'] # pick distinctive shapes\n", + " colors = ['green', 'blue', 'orange', 'magenta', 'gray'] # specify distinctive colors\n", + " unique_cats = auto_prices[shape_col].unique()\n", + " unique_colors = auto_prices[color_col].unique()\n", + " for col in cols: # loop over the columns to plot\n", + " sns.set_style(\"whitegrid\")\n", + " for i, cat in enumerate(unique_cats): # loop over the unique categories\n", + " for j, color in enumerate(unique_colors):\n", + " temp = auto_prices[(auto_prices[shape_col] == cat) & (auto_prices[color_col] == color)]\n", + " sns.regplot(col, col_y, data=temp, marker = shapes[i],\n", + " scatter_kws={\"alpha\":alpha, \"s\":size_mul*temp[size_col]**2}, \n", + " label = (cat + ' and ' + color), fit_reg = False, color = colors[j])\n", + " plt.title('Scatter plot of ' + col_y + ' vs. ' + col) # Give the plot a main title\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.legend()\n", + " plt.show()\n", + "\n", + "num_cols = ['engine_size', 'horsepower', 'city_mpg'] \n", + "plot_scatter_shape_size_col(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Each of these plots projects five dimensions of data onto the 2d display. 
Several relationships are now apparent in these data.\n", + "\n", + "In summary, aspiration along with fuel_type should be useful predictors of price. \n", + "\n", + "Now, answer **Question 3** on the course page." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Color (or hue) can be used in other types of plots. For example, the code in the cell below displays split violin plots. The violins are split by type of aspiration and shown in different colors. Execute this code and examine the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_violin_hue(auto_prices, cols, col_y = 'price', hue_col = 'aspiration'):\n", + " for col in cols:\n", + " sns.set_style(\"whitegrid\")\n", + " sns.violinplot(col, col_y, data=auto_prices, hue = hue_col, split = True)\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + " \n", + "plot_violin_hue(auto_prices, cat_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These plots show that autos with turbo aspiration are generally more expensive than the comparable standard car. Thus, aspiration should be a useful predictor of price. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Multi-axis views of data\n", + "\n", + "Up to now, you have been working with plots with a single pair of axes. However, it is quite possible to create powerful data visualizations with multiple axes. These methods allow you to examine the relationships between many variables in one view. These multiple views aid in understanding the many relationships in complex datasets. There are a number of powerful multi-axes plot methods. In this lab you will work with two commonly applied methods:\n", + "1. 
**Pair-wise scatter plots** or **scatter plot matrices** are an array of scatter plots with common axes along the rows and columns of the array. The diagonal of the array can be used to display distribution plots. The cells above or below the diagonal can be used for other plot types like contour density plots.\n", + "2. **Conditioned plots**, **facetted plots** or **small multiple plots** use **group-by** operations to create and display subsets of the dataset. The display can be a one or two dimensional array organized by the groupings of the dataset. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Pair-wise scatter plot\n", + "\n", + "You will now apply a scatter plot matrix to the auto.price dataset. The code in the cell below uses the `pairplot` function from the Seaborn package. This function creates a basic scatter plot matrix below the diagonal. Kernel density estimates of each variable are displayed on the diagonal. Using the `map_upper` method 2d density plots are displayed above the diagonal. Run the cell below to create a scatter plot matrix of the numeric features in the dataset." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "num_cols = [\"curb_weight\", \"engine_size\", \"horsepower\", \"city_mpg\", \"price\", \"fuel_type\"] \n", + "sns.pairplot(auto_prices[num_cols], hue='fuel_type', palette=\"Set2\", diag_kind=\"kde\", size=2).map_upper(sns.kdeplot, cmap=\"Blues_d\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Review the scatter plot matrix (if the plot is too large for the cell, you can expand the cell by clicking its left margin).\n", + "\n", + "Note that this plot is comprised of a number of scatter plots. For each variable there is both a row and a column. The variable is plotted on the vertical axis in the row, and on the horizontal axis in the column. 
In this way, every combination of cross plots for all variables is displayed in both possible orientations. KDE plots for each variable are on the diagonal. Above the diagonal you can see contour plots of 2d density estimates. There is a lot of detail here. \n", + "\n", + "Examine the above scatter plot matrix, which shows plots of each numeric column versus every other numeric column, and note the following: \n", + "- Many features show significant collinearity, such as horsepower, engine_size and curb_weight. This suggests that not all of these features should be used together when training a machine learning model.\n", + "- All of the features show a strong relationship with the label, price, such as city_mpg, engine_size, horsepower and curb_weight.\n", + "- Several of these relationships are nonlinear, particularly the relationships with the city_mpg feature.\n", + "- There is distinctly different behavior for the diesel vs. gas cars. \n", + "- Most of the variables have asymmetric distributions.\n", + "\n", + "Many of these relationships have been noted earlier. Having all this information on one plot can be useful. However, you may notice that some details are hard to see in such a display. \n", + "\n", + "**** \n", + "Note: The number of scatter plots and the memory required to compute and display them can be a bit daunting. You may wish to make a scatter plot matrix with fewer columns. For example, you can eliminate columns which are collinear with other columns. \n", + "****" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Conditioned plots\n", + "\n", + "Now you will explore the use of conditioned plots. The code in the cell below does the following:\n", + "1. The Seaborn `FacetGrid` function defines the grid object over which the plots are displayed. The arguments of this function are the Pandas data frame and the grouping variables for the rows and the columns. \n", + "2. 
The `map` method displays (maps) the plot function over the plot grid. In this case, the histogram function, `hist`, from Matplotlib is used. \n", + "\n", + "Execute the code and examine the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Function to plot conditioned histograms\n", + "def cond_hists(df, plot_cols, grid_col):\n", + " import matplotlib.pyplot as plt\n", + " import seaborn as sns\n", + " ## Loop over the list of columns\n", + " for col in plot_cols:\n", + " grid1 = sns.FacetGrid(df, col=grid_col)\n", + " grid1.map(plt.hist, col, alpha=.7)\n", + " return grid_col\n", + "\n", + "## Define columns for making a conditioned histogram\n", + "plot_cols2 = [\"length\",\n", + " \"curb_weight\",\n", + " \"engine_size\",\n", + " \"city_mpg\",\n", + " \"price\"]\n", + "\n", + "cond_hists(auto_prices, plot_cols2, 'drive_wheels')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine this series of conditioned plots. There is a consistent difference in the distributions of the numeric features conditioned on the categories of drive_wheels. \n", + "\n", + "Now, answer **Question 4** on the course page." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next you will create and examine conditioned scatter plots. By conditioning multiple dimensions, you can project several additional dimensions onto the two-dimensional plot. The conditioning can be thought of as a group-by operation.\n", + "\n", + "You will not use point shape as a differentiator in this exercise, but keep in mind that shape can be as useful as color. Additionally, shape may be easier for the significant fraction of the population who are color blind. \n", + "\n", + "Note: Be careful when combining methods for projecting multiple dimensions. 
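The faceting that `FacetGrid` performs is conceptually a group-by: each panel shows just the rows with one combination of the conditioning values. A sketch of the same partitioning in plain Pandas (a few synthetic rows standing in for the auto data):

```python
import pandas as pd

df = pd.DataFrame({
    'drive_wheels': ['fwd', 'rwd', 'fwd', '4wd', 'rwd'],
    'price':        [9000, 16000, 11000, 14000, 22000],
})

# Each group below corresponds to one facet (column) of the grid:
# the same subset of rows FacetGrid would hand to the mapped plot function.
for wheels, subset in df.groupby('drive_wheels'):
    print(wheels, len(subset), subset['price'].mean())
```

Seeing faceting as a group-by also explains why some facets can be empty: a combination of conditioning values with no rows produces a blank panel.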
You can easily end up with a plot that is not only hard to interpret, but even harder for you to communicate your observations to your colleagues. For example, if you use three conditioning variables, plus color and shape, you are projecting seven dimensions of your dataset. While this approach might reveal important relationships, it may just create a complex plot. \n", + "\n", + "The code in the cell below uses the Python seaborn package to create a two dimensional array of scatter plots in the following way:\n", + "1. As before, the grid object is defined. A color aesthetic is used with the palette specified.\n", + "2. The scatter plots are mapped to the grid. \n", + "\n", + "Execute this code and examine the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def cond_plot(cols):\n", + " import seaborn as sns\n", + " for col in cols:\n", + " g = sns.FacetGrid(auto_prices, col=\"drive_wheels\", row = 'body_style', \n", + " hue=\"fuel_type\", palette=\"Set2\", margin_titles=True)\n", + " g.map(sns.regplot, col, \"price\", fit_reg = False)\n", + "\n", + "num_cols = [\"curb_weight\", \"engine_size\", \"city_mpg\"]\n", + "cond_plot(num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Carefully examine the plots you have created. These plots show a total of five dimensions of the data and there is a lot of detail to understand. Generally, you should be able to conclude the following:\n", + "- There are no cars at all for some combinations of conditioning variables. For example, there are no 4 wheel drive convertibles or hardtops. As a result these plots are blank. \n", + "- There are no diesel cars for some of the conditioning combinations, such as convertibles and 4 wheel drive cars. \n", + "- There are a number of distinct groupings. 
For example, front wheel drive and rear wheel drive sedans or wagons have distinctly different behaviors. \n", + "\n", + "Again, many of these observations can be made with other types of plots. However, there is an advantage to laying these plots out by sub-groups of the dataset. Different sub-groupings of the data highlight different relationships. \n", + "\n", + "Finally, answer **Question 5** on the course page." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "By now, you should realize that exploring a dataset in detail is an open-ended and complex task. You may wish to try other combinations of conditioning and color variables to find other interesting relationships in these data. Only by using multiple views with different plot types can you truly develop an understanding of relationships in complex data.\n", + "\n", + "A constant challenge in visualizing complex datasets is the limitation of 2d projections. Aesthetics and multiple axis methods allow projection of higher dimensions onto the 2d plot surface. \n", + "\n", + "Specifically in this lab you:\n", + "\n", + "1. Used summary statistics to understand the basics of a data set.\n", + "2. Used several types of plots to display distributions.\n", + "3. Created scatter plots with different transparency. \n", + "4. Used density plots and hex bin plots to overcome overplotting. \n", + "5. Applied aesthetics to project additional dimensions of categorical and numeric variables onto a 2d plot surface. \n", + "6. Used pair-wise scatter plots and conditioned plots to create displays with multiple axes.
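The hex bin approach from item 4 of the summary can be sketched in a few lines. The sketch below uses synthetic data rather than the automotive dataset, so the variable names, the gridsize, and the output file name are illustrative assumptions only:

```python
## Minimal sketch of a hex bin plot to overcome overplotting.
## Synthetic data stands in for the auto price data here; with the real
## dataset you would pass columns such as curb_weight and price instead.
import matplotlib
matplotlib.use('Agg')  ## render off-screen; in a notebook use %matplotlib inline
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=10000)                       ## 10,000 points would badly overplot
y = 0.5 * x + rng.normal(scale=0.5, size=10000)  ## a noisy linear relationship

fig, ax = plt.subplots()
hb = ax.hexbin(x, y, gridsize=30, cmap='Blues')  ## bin points into hexagonal cells
fig.colorbar(hb, ax=ax, label='count per cell')  ## color encodes point density
ax.set_xlabel('feature')
ax.set_ylabel('label')
fig.savefig('hexbin_demo.png')
```

Because color now encodes the count of points per cell, dense regions remain readable where an ordinary scatter plot would collapse into a solid blob.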
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3.6", + "language": "python", + "name": "python36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.3" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/Module3/Automobile price data _Raw_.csv b/Module3/Automobile price data _Raw_.csv new file mode 100644 index 0000000..9b36ab2 --- /dev/null +++ b/Module3/Automobile price data _Raw_.csv @@ -0,0 +1,206 @@ +symboling,normalized-losses,make,fuel-type,aspiration,num-of-doors,body-style,drive-wheels,engine-location,wheel-base,length,width,height,curb-weight,engine-type,num-of-cylinders,engine-size,fuel-system,bore,stroke,compression-ratio,horsepower,peak-rpm,city-mpg,highway-mpg,price +3,?,alfa-romero,gas,std,two,convertible,rwd,front,88.60,168.80,64.10,48.80,2548,dohc,four,130,mpfi,3.47,2.68,9.00,111,5000,21,27,13495 +3,?,alfa-romero,gas,std,two,convertible,rwd,front,88.60,168.80,64.10,48.80,2548,dohc,four,130,mpfi,3.47,2.68,9.00,111,5000,21,27,16500 +1,?,alfa-romero,gas,std,two,hatchback,rwd,front,94.50,171.20,65.50,52.40,2823,ohcv,six,152,mpfi,2.68,3.47,9.00,154,5000,19,26,16500 +2,164,audi,gas,std,four,sedan,fwd,front,99.80,176.60,66.20,54.30,2337,ohc,four,109,mpfi,3.19,3.40,10.00,102,5500,24,30,13950 +2,164,audi,gas,std,four,sedan,4wd,front,99.40,176.60,66.40,54.30,2824,ohc,five,136,mpfi,3.19,3.40,8.00,115,5500,18,22,17450 +2,?,audi,gas,std,two,sedan,fwd,front,99.80,177.30,66.30,53.10,2507,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,25,15250 +1,158,audi,gas,std,four,sedan,fwd,front,105.80,192.70,71.40,55.70,2844,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,25,17710 
+1,?,audi,gas,std,four,wagon,fwd,front,105.80,192.70,71.40,55.70,2954,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,25,18920 +1,158,audi,gas,turbo,four,sedan,fwd,front,105.80,192.70,71.40,55.90,3086,ohc,five,131,mpfi,3.13,3.40,8.30,140,5500,17,20,23875 +0,?,audi,gas,turbo,two,hatchback,4wd,front,99.50,178.20,67.90,52.00,3053,ohc,five,131,mpfi,3.13,3.40,7.00,160,5500,16,22,? +2,192,bmw,gas,std,two,sedan,rwd,front,101.20,176.80,64.80,54.30,2395,ohc,four,108,mpfi,3.50,2.80,8.80,101,5800,23,29,16430 +0,192,bmw,gas,std,four,sedan,rwd,front,101.20,176.80,64.80,54.30,2395,ohc,four,108,mpfi,3.50,2.80,8.80,101,5800,23,29,16925 +0,188,bmw,gas,std,two,sedan,rwd,front,101.20,176.80,64.80,54.30,2710,ohc,six,164,mpfi,3.31,3.19,9.00,121,4250,21,28,20970 +0,188,bmw,gas,std,four,sedan,rwd,front,101.20,176.80,64.80,54.30,2765,ohc,six,164,mpfi,3.31,3.19,9.00,121,4250,21,28,21105 +1,?,bmw,gas,std,four,sedan,rwd,front,103.50,189.00,66.90,55.70,3055,ohc,six,164,mpfi,3.31,3.19,9.00,121,4250,20,25,24565 +0,?,bmw,gas,std,four,sedan,rwd,front,103.50,189.00,66.90,55.70,3230,ohc,six,209,mpfi,3.62,3.39,8.00,182,5400,16,22,30760 +0,?,bmw,gas,std,two,sedan,rwd,front,103.50,193.80,67.90,53.70,3380,ohc,six,209,mpfi,3.62,3.39,8.00,182,5400,16,22,41315 +0,?,bmw,gas,std,four,sedan,rwd,front,110.00,197.00,70.90,56.30,3505,ohc,six,209,mpfi,3.62,3.39,8.00,182,5400,15,20,36880 +2,121,chevrolet,gas,std,two,hatchback,fwd,front,88.40,141.10,60.30,53.20,1488,l,three,61,2bbl,2.91,3.03,9.50,48,5100,47,53,5151 +1,98,chevrolet,gas,std,two,hatchback,fwd,front,94.50,155.90,63.60,52.00,1874,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,6295 +0,81,chevrolet,gas,std,four,sedan,fwd,front,94.50,158.80,63.60,52.00,1909,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,6575 +1,118,dodge,gas,std,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,1876,ohc,four,90,2bbl,2.97,3.23,9.41,68,5500,37,41,5572 +1,118,dodge,gas,std,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,1876,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6377 
+1,118,dodge,gas,turbo,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,2128,ohc,four,98,mpfi,3.03,3.39,7.60,102,5500,24,30,7957 +1,148,dodge,gas,std,four,hatchback,fwd,front,93.70,157.30,63.80,50.60,1967,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6229 +1,148,dodge,gas,std,four,sedan,fwd,front,93.70,157.30,63.80,50.60,1989,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6692 +1,148,dodge,gas,std,four,sedan,fwd,front,93.70,157.30,63.80,50.60,1989,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,7609 +1,148,dodge,gas,turbo,?,sedan,fwd,front,93.70,157.30,63.80,50.60,2191,ohc,four,98,mpfi,3.03,3.39,7.60,102,5500,24,30,8558 +-1,110,dodge,gas,std,four,wagon,fwd,front,103.30,174.60,64.60,59.80,2535,ohc,four,122,2bbl,3.34,3.46,8.50,88,5000,24,30,8921 +3,145,dodge,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2811,ohc,four,156,mfi,3.60,3.90,7.00,145,5000,19,24,12964 +2,137,honda,gas,std,two,hatchback,fwd,front,86.60,144.60,63.90,50.80,1713,ohc,four,92,1bbl,2.91,3.41,9.60,58,4800,49,54,6479 +2,137,honda,gas,std,two,hatchback,fwd,front,86.60,144.60,63.90,50.80,1819,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,31,38,6855 +1,101,honda,gas,std,two,hatchback,fwd,front,93.70,150.00,64.00,52.60,1837,ohc,four,79,1bbl,2.91,3.07,10.10,60,5500,38,42,5399 +1,101,honda,gas,std,two,hatchback,fwd,front,93.70,150.00,64.00,52.60,1940,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,30,34,6529 +1,101,honda,gas,std,two,hatchback,fwd,front,93.70,150.00,64.00,52.60,1956,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,30,34,7129 +0,110,honda,gas,std,four,sedan,fwd,front,96.50,163.40,64.00,54.50,2010,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,30,34,7295 +0,78,honda,gas,std,four,wagon,fwd,front,96.50,157.10,63.90,58.30,2024,ohc,four,92,1bbl,2.92,3.41,9.20,76,6000,30,34,7295 +0,106,honda,gas,std,two,hatchback,fwd,front,96.50,167.50,65.20,53.30,2236,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,7895 
+0,106,honda,gas,std,two,hatchback,fwd,front,96.50,167.50,65.20,53.30,2289,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,9095 +0,85,honda,gas,std,four,sedan,fwd,front,96.50,175.40,65.20,54.10,2304,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,8845 +0,85,honda,gas,std,four,sedan,fwd,front,96.50,175.40,62.50,54.10,2372,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,10295 +0,85,honda,gas,std,four,sedan,fwd,front,96.50,175.40,65.20,54.10,2465,ohc,four,110,mpfi,3.15,3.58,9.00,101,5800,24,28,12945 +1,107,honda,gas,std,two,sedan,fwd,front,96.50,169.10,66.00,51.00,2293,ohc,four,110,2bbl,3.15,3.58,9.10,100,5500,25,31,10345 +0,?,isuzu,gas,std,four,sedan,rwd,front,94.30,170.70,61.80,53.50,2337,ohc,four,111,2bbl,3.31,3.23,8.50,78,4800,24,29,6785 +1,?,isuzu,gas,std,two,sedan,fwd,front,94.50,155.90,63.60,52.00,1874,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,? +0,?,isuzu,gas,std,four,sedan,fwd,front,94.50,155.90,63.60,52.00,1909,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,? +2,?,isuzu,gas,std,two,hatchback,rwd,front,96.00,172.60,65.20,51.40,2734,ohc,four,119,spfi,3.43,3.23,9.20,90,5000,24,29,11048 +0,145,jaguar,gas,std,four,sedan,rwd,front,113.00,199.60,69.60,52.80,4066,dohc,six,258,mpfi,3.63,4.17,8.10,176,4750,15,19,32250 +0,?,jaguar,gas,std,four,sedan,rwd,front,113.00,199.60,69.60,52.80,4066,dohc,six,258,mpfi,3.63,4.17,8.10,176,4750,15,19,35550 +0,?,jaguar,gas,std,two,sedan,rwd,front,102.00,191.70,70.60,47.80,3950,ohcv,twelve,326,mpfi,3.54,2.76,11.50,262,5000,13,17,36000 +1,104,mazda,gas,std,two,hatchback,fwd,front,93.10,159.10,64.20,54.10,1890,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,30,31,5195 +1,104,mazda,gas,std,two,hatchback,fwd,front,93.10,159.10,64.20,54.10,1900,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,31,38,6095 +1,104,mazda,gas,std,two,hatchback,fwd,front,93.10,159.10,64.20,54.10,1905,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,31,38,6795 +1,113,mazda,gas,std,four,sedan,fwd,front,93.10,166.80,64.20,54.10,1945,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,31,38,6695 
+1,113,mazda,gas,std,four,sedan,fwd,front,93.10,166.80,64.20,54.10,1950,ohc,four,91,2bbl,3.08,3.15,9.00,68,5000,31,38,7395 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2380,rotor,two,70,4bbl,?,?,9.40,101,6000,17,23,10945 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2380,rotor,two,70,4bbl,?,?,9.40,101,6000,17,23,11845 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2385,rotor,two,70,4bbl,?,?,9.40,101,6000,17,23,13645 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2500,rotor,two,80,mpfi,?,?,9.40,135,6000,16,23,15645 +1,129,mazda,gas,std,two,hatchback,fwd,front,98.80,177.80,66.50,53.70,2385,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,8845 +0,115,mazda,gas,std,four,sedan,fwd,front,98.80,177.80,66.50,55.50,2410,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,8495 +1,129,mazda,gas,std,two,hatchback,fwd,front,98.80,177.80,66.50,53.70,2385,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,10595 +0,115,mazda,gas,std,four,sedan,fwd,front,98.80,177.80,66.50,55.50,2410,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,10245 +0,?,mazda,diesel,std,?,sedan,fwd,front,98.80,177.80,66.50,55.50,2443,ohc,four,122,idi,3.39,3.39,22.70,64,4650,36,42,10795 +0,115,mazda,gas,std,four,hatchback,fwd,front,98.80,177.80,66.50,55.50,2425,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,11245 +0,118,mazda,gas,std,four,sedan,rwd,front,104.90,175.00,66.10,54.40,2670,ohc,four,140,mpfi,3.76,3.16,8.00,120,5000,19,27,18280 +0,?,mazda,diesel,std,four,sedan,rwd,front,104.90,175.00,66.10,54.40,2700,ohc,four,134,idi,3.43,3.64,22.00,72,4200,31,39,18344 +-1,93,mercedes-benz,diesel,turbo,four,sedan,rwd,front,110.00,190.90,70.30,56.50,3515,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,25552 +-1,93,mercedes-benz,diesel,turbo,four,wagon,rwd,front,110.00,190.90,70.30,58.70,3750,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,28248 
+0,93,mercedes-benz,diesel,turbo,two,hardtop,rwd,front,106.70,187.50,70.30,54.90,3495,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,28176 +-1,93,mercedes-benz,diesel,turbo,four,sedan,rwd,front,115.60,202.60,71.70,56.30,3770,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,31600 +-1,?,mercedes-benz,gas,std,four,sedan,rwd,front,115.60,202.60,71.70,56.50,3740,ohcv,eight,234,mpfi,3.46,3.10,8.30,155,4750,16,18,34184 +3,142,mercedes-benz,gas,std,two,convertible,rwd,front,96.60,180.30,70.50,50.80,3685,ohcv,eight,234,mpfi,3.46,3.10,8.30,155,4750,16,18,35056 +0,?,mercedes-benz,gas,std,four,sedan,rwd,front,120.90,208.10,71.70,56.70,3900,ohcv,eight,308,mpfi,3.80,3.35,8.00,184,4500,14,16,40960 +1,?,mercedes-benz,gas,std,two,hardtop,rwd,front,112.00,199.20,72.00,55.40,3715,ohcv,eight,304,mpfi,3.80,3.35,8.00,184,4500,14,16,45400 +1,?,mercury,gas,turbo,two,hatchback,rwd,front,102.70,178.40,68.00,54.80,2910,ohc,four,140,mpfi,3.78,3.12,8.00,175,5000,19,24,16503 +2,161,mitsubishi,gas,std,two,hatchback,fwd,front,93.70,157.30,64.40,50.80,1918,ohc,four,92,2bbl,2.97,3.23,9.40,68,5500,37,41,5389 +2,161,mitsubishi,gas,std,two,hatchback,fwd,front,93.70,157.30,64.40,50.80,1944,ohc,four,92,2bbl,2.97,3.23,9.40,68,5500,31,38,6189 +2,161,mitsubishi,gas,std,two,hatchback,fwd,front,93.70,157.30,64.40,50.80,2004,ohc,four,92,2bbl,2.97,3.23,9.40,68,5500,31,38,6669 +1,161,mitsubishi,gas,turbo,two,hatchback,fwd,front,93,157.30,63.80,50.80,2145,ohc,four,98,spdi,3.03,3.39,7.60,102,5500,24,30,7689 +3,153,mitsubishi,gas,turbo,two,hatchback,fwd,front,96.30,173.00,65.40,49.40,2370,ohc,four,110,spdi,3.17,3.46,7.50,116,5500,23,30,9959 +3,153,mitsubishi,gas,std,two,hatchback,fwd,front,96.30,173.00,65.40,49.40,2328,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,25,32,8499 +3,?,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2833,ohc,four,156,spdi,3.58,3.86,7.00,145,5000,19,24,12629 
+3,?,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2921,ohc,four,156,spdi,3.59,3.86,7.00,145,5000,19,24,14869 +3,?,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2926,ohc,four,156,spdi,3.59,3.86,7.00,145,5000,19,24,14489 +1,125,mitsubishi,gas,std,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2365,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,25,32,6989 +1,125,mitsubishi,gas,std,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2405,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,25,32,8189 +1,125,mitsubishi,gas,turbo,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2403,ohc,four,110,spdi,3.17,3.46,7.50,116,5500,23,30,9279 +-1,137,mitsubishi,gas,std,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2403,ohc,four,110,spdi,3.17,3.46,7.50,116,5500,23,30,9279 +1,128,nissan,gas,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,1889,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,5499 +1,128,nissan,diesel,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,2017,ohc,four,103,idi,2.99,3.47,21.90,55,4800,45,50,7099 +1,128,nissan,gas,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,1918,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,6649 +1,122,nissan,gas,std,four,sedan,fwd,front,94.50,165.30,63.80,54.50,1938,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,6849 +1,103,nissan,gas,std,four,wagon,fwd,front,94.50,170.20,63.80,53.50,2024,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7349 +1,128,nissan,gas,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,1951,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7299 +1,128,nissan,gas,std,two,hatchback,fwd,front,94.50,165.60,63.80,53.30,2028,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7799 +1,122,nissan,gas,std,four,sedan,fwd,front,94.50,165.30,63.80,54.50,1971,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7499 +1,103,nissan,gas,std,four,wagon,fwd,front,94.50,170.20,63.80,53.50,2037,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7999 
+2,168,nissan,gas,std,two,hardtop,fwd,front,95.10,162.40,63.80,53.30,2008,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,8249 +0,106,nissan,gas,std,four,hatchback,fwd,front,97.20,173.40,65.20,54.70,2324,ohc,four,120,2bbl,3.33,3.47,8.50,97,5200,27,34,8949 +0,106,nissan,gas,std,four,sedan,fwd,front,97.20,173.40,65.20,54.70,2302,ohc,four,120,2bbl,3.33,3.47,8.50,97,5200,27,34,9549 +0,128,nissan,gas,std,four,sedan,fwd,front,100.40,181.70,66.50,55.10,3095,ohcv,six,181,mpfi,3.43,3.27,9.00,152,5200,17,22,13499 +0,108,nissan,gas,std,four,wagon,fwd,front,100.40,184.60,66.50,56.10,3296,ohcv,six,181,mpfi,3.43,3.27,9.00,152,5200,17,22,14399 +0,108,nissan,gas,std,four,sedan,fwd,front,100.40,184.60,66.50,55.10,3060,ohcv,six,181,mpfi,3.43,3.27,9.00,152,5200,19,25,13499 +3,194,nissan,gas,std,two,hatchback,rwd,front,91.30,170.70,67.90,49.70,3071,ohcv,six,181,mpfi,3.43,3.27,9.00,160,5200,19,25,17199 +3,194,nissan,gas,turbo,two,hatchback,rwd,front,91.30,170.70,67.90,49.70,3139,ohcv,six,181,mpfi,3.43,3.27,7.80,200,5200,17,23,19699 +1,231,nissan,gas,std,two,hatchback,rwd,front,99.20,178.50,67.90,49.70,3139,ohcv,six,181,mpfi,3.43,3.27,9.00,160,5200,19,25,18399 +0,161,peugot,gas,std,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3020,l,four,120,mpfi,3.46,3.19,8.40,97,5000,19,24,11900 +0,161,peugot,diesel,turbo,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3197,l,four,152,idi,3.70,3.52,21.00,95,4150,28,33,13200 +0,?,peugot,gas,std,four,wagon,rwd,front,114.20,198.90,68.40,58.70,3230,l,four,120,mpfi,3.46,3.19,8.40,97,5000,19,24,12440 +0,?,peugot,diesel,turbo,four,wagon,rwd,front,114.20,198.90,68.40,58.70,3430,l,four,152,idi,3.70,3.52,21.00,95,4150,25,25,13860 +0,161,peugot,gas,std,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3075,l,four,120,mpfi,3.46,2.19,8.40,95,5000,19,24,15580 +0,161,peugot,diesel,turbo,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3252,l,four,152,idi,3.70,3.52,21.00,95,4150,28,33,16900 
+0,?,peugot,gas,std,four,wagon,rwd,front,114.20,198.90,68.40,56.70,3285,l,four,120,mpfi,3.46,2.19,8.40,95,5000,19,24,16695 +0,?,peugot,diesel,turbo,four,wagon,rwd,front,114.20,198.90,68.40,58.70,3485,l,four,152,idi,3.70,3.52,21.00,95,4150,25,25,17075 +0,161,peugot,gas,std,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3075,l,four,120,mpfi,3.46,3.19,8.40,97,5000,19,24,16630 +0,161,peugot,diesel,turbo,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3252,l,four,152,idi,3.70,3.52,21.00,95,4150,28,33,17950 +0,161,peugot,gas,turbo,four,sedan,rwd,front,108.00,186.70,68.30,56.00,3130,l,four,134,mpfi,3.61,3.21,7.00,142,5600,18,24,18150 +1,119,plymouth,gas,std,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,1918,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,37,41,5572 +1,119,plymouth,gas,turbo,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,2128,ohc,four,98,spdi,3.03,3.39,7.60,102,5500,24,30,7957 +1,154,plymouth,gas,std,four,hatchback,fwd,front,93.70,157.30,63.80,50.60,1967,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6229 +1,154,plymouth,gas,std,four,sedan,fwd,front,93.70,167.30,63.80,50.80,1989,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6692 +1,154,plymouth,gas,std,four,sedan,fwd,front,93.70,167.30,63.80,50.80,2191,ohc,four,98,2bbl,2.97,3.23,9.40,68,5500,31,38,7609 +-1,74,plymouth,gas,std,four,wagon,fwd,front,103.30,174.60,64.60,59.80,2535,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,24,30,8921 +3,?,plymouth,gas,turbo,two,hatchback,rwd,front,95.90,173.20,66.30,50.20,2818,ohc,four,156,spdi,3.59,3.86,7.00,145,5000,19,24,12764 +3,186,porsche,gas,std,two,hatchback,rwd,front,94.50,168.90,68.30,50.20,2778,ohc,four,151,mpfi,3.94,3.11,9.50,143,5500,19,27,22018 +3,?,porsche,gas,std,two,hardtop,rwd,rear,89.50,168.90,65.00,51.60,2756,ohcf,six,194,mpfi,3.74,2.90,9.50,207,5900,17,25,32528 +3,?,porsche,gas,std,two,hardtop,rwd,rear,89.50,168.90,65.00,51.60,2756,ohcf,six,194,mpfi,3.74,2.90,9.50,207,5900,17,25,34028 
+3,?,porsche,gas,std,two,convertible,rwd,rear,89.50,168.90,65.00,51.60,2800,ohcf,six,194,mpfi,3.74,2.90,9.50,207,5900,17,25,37028 +1,?,porsche,gas,std,two,hatchback,rwd,front,98.40,175.70,72.30,50.50,3366,dohcv,eight,203,mpfi,3.94,3.11,10.00,288,5750,17,28,? +0,?,renault,gas,std,four,wagon,fwd,front,96.10,181.50,66.50,55.20,2579,ohc,four,132,mpfi,3.46,3.90,8.70,?,?,23,31,9295 +2,?,renault,gas,std,two,hatchback,fwd,front,96.10,176.80,66.60,50.50,2460,ohc,four,132,mpfi,3.46,3.90,8.70,?,?,23,31,9895 +3,150,saab,gas,std,two,hatchback,fwd,front,99.10,186.60,66.50,56.10,2658,ohc,four,121,mpfi,3.54,3.07,9.31,110,5250,21,28,11850 +2,104,saab,gas,std,four,sedan,fwd,front,99.10,186.60,66.50,56.10,2695,ohc,four,121,mpfi,3.54,3.07,9.30,110,5250,21,28,12170 +3,150,saab,gas,std,two,hatchback,fwd,front,99.10,186.60,66.50,56.10,2707,ohc,four,121,mpfi,2.54,2.07,9.30,110,5250,21,28,15040 +2,104,saab,gas,std,four,sedan,fwd,front,99.10,186.60,66.50,56.10,2758,ohc,four,121,mpfi,3.54,3.07,9.30,110,5250,21,28,15510 +3,150,saab,gas,turbo,two,hatchback,fwd,front,99.10,186.60,66.50,56.10,2808,dohc,four,121,mpfi,3.54,3.07,9.00,160,5500,19,26,18150 +2,104,saab,gas,turbo,four,sedan,fwd,front,99.10,186.60,66.50,56.10,2847,dohc,four,121,mpfi,3.54,3.07,9.00,160,5500,19,26,18620 +2,83,subaru,gas,std,two,hatchback,fwd,front,93.70,156.90,63.40,53.70,2050,ohcf,four,97,2bbl,3.62,2.36,9.00,69,4900,31,36,5118 +2,83,subaru,gas,std,two,hatchback,fwd,front,93.70,157.90,63.60,53.70,2120,ohcf,four,108,2bbl,3.62,2.64,8.70,73,4400,26,31,7053 +2,83,subaru,gas,std,two,hatchback,4wd,front,93.30,157.30,63.80,55.70,2240,ohcf,four,108,2bbl,3.62,2.64,8.70,73,4400,26,31,7603 +0,102,subaru,gas,std,four,sedan,fwd,front,97.20,172.00,65.40,52.50,2145,ohcf,four,108,2bbl,3.62,2.64,9.50,82,4800,32,37,7126 +0,102,subaru,gas,std,four,sedan,fwd,front,97.20,172.00,65.40,52.50,2190,ohcf,four,108,2bbl,3.62,2.64,9.50,82,4400,28,33,7775 
+0,102,subaru,gas,std,four,sedan,fwd,front,97.20,172.00,65.40,52.50,2340,ohcf,four,108,mpfi,3.62,2.64,9.00,94,5200,26,32,9960 +0,102,subaru,gas,std,four,sedan,4wd,front,97.00,172.00,65.40,54.30,2385,ohcf,four,108,2bbl,3.62,2.64,9.00,82,4800,24,25,9233 +0,102,subaru,gas,turbo,four,sedan,4wd,front,97.00,172.00,65.40,54.30,2510,ohcf,four,108,mpfi,3.62,2.64,7.70,111,4800,24,29,11259 +0,89,subaru,gas,std,four,wagon,fwd,front,97.00,173.50,65.40,53.00,2290,ohcf,four,108,2bbl,3.62,2.64,9.00,82,4800,28,32,7463 +0,89,subaru,gas,std,four,wagon,fwd,front,97.00,173.50,65.40,53.00,2455,ohcf,four,108,mpfi,3.62,2.64,9.00,94,5200,25,31,10198 +0,85,subaru,gas,std,four,wagon,4wd,front,96.90,173.60,65.40,54.90,2420,ohcf,four,108,2bbl,3.62,2.64,9.00,82,4800,23,29,8013 +0,85,subaru,gas,turbo,four,wagon,4wd,front,96.90,173.60,65.40,54.90,2650,ohcf,four,108,mpfi,3.62,2.64,7.70,111,4800,23,23,11694 +1,87,toyota,gas,std,two,hatchback,fwd,front,95.70,158.70,63.60,54.50,1985,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,35,39,5348 +1,87,toyota,gas,std,two,hatchback,fwd,front,95.70,158.70,63.60,54.50,2040,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,31,38,6338 +1,74,toyota,gas,std,four,hatchback,fwd,front,95.70,158.70,63.60,54.50,2015,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,31,38,6488 +0,77,toyota,gas,std,four,wagon,fwd,front,95.70,169.70,63.60,59.10,2280,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,31,37,6918 +0,81,toyota,gas,std,four,wagon,4wd,front,95.70,169.70,63.60,59.10,2290,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,27,32,7898 +0,91,toyota,gas,std,four,wagon,4wd,front,95.70,169.70,63.60,59.10,3110,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,27,32,8778 +0,91,toyota,gas,std,four,sedan,fwd,front,95.70,166.30,64.40,53.00,2081,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,30,37,6938 +0,91,toyota,gas,std,four,hatchback,fwd,front,95.70,166.30,64.40,52.80,2109,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,30,37,7198 
+0,91,toyota,diesel,std,four,sedan,fwd,front,95.70,166.30,64.40,53.00,2275,ohc,four,110,idi,3.27,3.35,22.50,56,4500,34,36,7898 +0,91,toyota,diesel,std,four,hatchback,fwd,front,95.70,166.30,64.40,52.80,2275,ohc,four,110,idi,3.27,3.35,22.50,56,4500,38,47,7788 +0,91,toyota,gas,std,four,sedan,fwd,front,95.70,166.30,64.40,53.00,2094,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,38,47,7738 +0,91,toyota,gas,std,four,hatchback,fwd,front,95.70,166.30,64.40,52.80,2122,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,28,34,8358 +0,91,toyota,gas,std,four,sedan,fwd,front,95.70,166.30,64.40,52.80,2140,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,28,34,9258 +1,168,toyota,gas,std,two,sedan,rwd,front,94.50,168.70,64.00,52.60,2169,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,29,34,8058 +1,168,toyota,gas,std,two,hatchback,rwd,front,94.50,168.70,64.00,52.60,2204,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,29,34,8238 +1,168,toyota,gas,std,two,sedan,rwd,front,94.50,168.70,64.00,52.60,2265,dohc,four,98,mpfi,3.24,3.08,9.40,112,6600,26,29,9298 +1,168,toyota,gas,std,two,hatchback,rwd,front,94.50,168.70,64.00,52.60,2300,dohc,four,98,mpfi,3.24,3.08,9.40,112,6600,26,29,9538 +2,134,toyota,gas,std,two,hardtop,rwd,front,98.40,176.20,65.60,52.00,2540,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,8449 +2,134,toyota,gas,std,two,hardtop,rwd,front,98.40,176.20,65.60,52.00,2536,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,9639 +2,134,toyota,gas,std,two,hatchback,rwd,front,98.40,176.20,65.60,52.00,2551,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,9989 +2,134,toyota,gas,std,two,hardtop,rwd,front,98.40,176.20,65.60,52.00,2679,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,11199 +2,134,toyota,gas,std,two,hatchback,rwd,front,98.40,176.20,65.60,52.00,2714,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,11549 +2,134,toyota,gas,std,two,convertible,rwd,front,98.40,176.20,65.60,53.00,2975,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,17669 
+-1,65,toyota,gas,std,four,sedan,fwd,front,102.40,175.60,66.50,54.90,2326,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,29,34,8948 +-1,65,toyota,diesel,turbo,four,sedan,fwd,front,102.40,175.60,66.50,54.90,2480,ohc,four,110,idi,3.27,3.35,22.50,73,4500,30,33,10698 +-1,65,toyota,gas,std,four,hatchback,fwd,front,102.40,175.60,66.50,53.90,2414,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,27,32,9988 +-1,65,toyota,gas,std,four,sedan,fwd,front,102.40,175.60,66.50,54.90,2414,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,27,32,10898 +-1,65,toyota,gas,std,four,hatchback,fwd,front,102.40,175.60,66.50,53.90,2458,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,27,32,11248 +3,197,toyota,gas,std,two,hatchback,rwd,front,102.90,183.50,67.70,52.00,2976,dohc,six,171,mpfi,3.27,3.35,9.30,161,5200,20,24,16558 +3,197,toyota,gas,std,two,hatchback,rwd,front,102.90,183.50,67.70,52.00,3016,dohc,six,171,mpfi,3.27,3.35,9.30,161,5200,19,24,15998 +-1,90,toyota,gas,std,four,sedan,rwd,front,104.50,187.80,66.50,54.10,3131,dohc,six,171,mpfi,3.27,3.35,9.20,156,5200,20,24,15690 +-1,?,toyota,gas,std,four,wagon,rwd,front,104.50,187.80,66.50,54.10,3151,dohc,six,161,mpfi,3.27,3.35,9.20,156,5200,19,24,15750 +2,122,volkswagen,diesel,std,two,sedan,fwd,front,97.30,171.70,65.50,55.70,2261,ohc,four,97,idi,3.01,3.40,23.00,52,4800,37,46,7775 +2,122,volkswagen,gas,std,two,sedan,fwd,front,97.30,171.70,65.50,55.70,2209,ohc,four,109,mpfi,3.19,3.40,9.00,85,5250,27,34,7975 +2,94,volkswagen,diesel,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2264,ohc,four,97,idi,3.01,3.40,23.00,52,4800,37,46,7995 +2,94,volkswagen,gas,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2212,ohc,four,109,mpfi,3.19,3.40,9.00,85,5250,27,34,8195 +2,94,volkswagen,gas,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2275,ohc,four,109,mpfi,3.19,3.40,9.00,85,5250,27,34,8495 +2,94,volkswagen,diesel,turbo,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2319,ohc,four,97,idi,3.01,3.40,23.00,68,4500,37,42,9495 
+2,94,volkswagen,gas,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2300,ohc,four,109,mpfi,3.19,3.40,10.00,100,5500,26,32,9995 +3,?,volkswagen,gas,std,two,convertible,fwd,front,94.50,159.30,64.20,55.60,2254,ohc,four,109,mpfi,3.19,3.40,8.50,90,5500,24,29,11595 +3,256,volkswagen,gas,std,two,hatchback,fwd,front,94.50,165.70,64.00,51.40,2221,ohc,four,109,mpfi,3.19,3.40,8.50,90,5500,24,29,9980 +0,?,volkswagen,gas,std,four,sedan,fwd,front,100.40,180.20,66.90,55.10,2661,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,24,13295 +0,?,volkswagen,diesel,turbo,four,sedan,fwd,front,100.40,180.20,66.90,55.10,2579,ohc,four,97,idi,3.01,3.40,23.00,68,4500,33,38,13845 +0,?,volkswagen,gas,std,four,wagon,fwd,front,100.40,183.10,66.90,55.10,2563,ohc,four,109,mpfi,3.19,3.40,9.00,88,5500,25,31,12290 +-2,103,volvo,gas,std,four,sedan,rwd,front,104.30,188.80,67.20,56.20,2912,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,23,28,12940 +-1,74,volvo,gas,std,four,wagon,rwd,front,104.30,188.80,67.20,57.50,3034,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,23,28,13415 +-2,103,volvo,gas,std,four,sedan,rwd,front,104.30,188.80,67.20,56.20,2935,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,24,28,15985 +-1,74,volvo,gas,std,four,wagon,rwd,front,104.30,188.80,67.20,57.50,3042,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,24,28,16515 +-2,103,volvo,gas,turbo,four,sedan,rwd,front,104.30,188.80,67.20,56.20,3045,ohc,four,130,mpfi,3.62,3.15,7.50,162,5100,17,22,18420 +-1,74,volvo,gas,turbo,four,wagon,rwd,front,104.30,188.80,67.20,57.50,3157,ohc,four,130,mpfi,3.62,3.15,7.50,162,5100,17,22,18950 +-1,95,volvo,gas,std,four,sedan,rwd,front,109.10,188.80,68.90,55.50,2952,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,23,28,16845 +-1,95,volvo,gas,turbo,four,sedan,rwd,front,109.10,188.80,68.80,55.50,3049,ohc,four,141,mpfi,3.78,3.15,8.70,160,5300,19,25,19045 +-1,95,volvo,gas,std,four,sedan,rwd,front,109.10,188.80,68.90,55.50,3012,ohcv,six,173,mpfi,3.58,2.87,8.80,134,5500,18,23,21485 
+-1,95,volvo,diesel,turbo,four,sedan,rwd,front,109.10,188.80,68.90,55.50,3217,ohc,six,145,idi,3.01,3.40,23.00,106,4800,26,27,22470 +-1,95,volvo,gas,turbo,four,sedan,rwd,front,109.10,188.80,68.90,55.50,3062,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,19,25,22625 \ No newline at end of file diff --git a/Module3/DataPreparation.ipynb b/Module3/DataPreparation.ipynb new file mode 100644 index 0000000..cd27ca9 --- /dev/null +++ b/Module3/DataPreparation.ipynb @@ -0,0 +1,806 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Data Preparation for Machine Learning\n", + "\n", + "**Data preparation** is a vital step in the machine learning pipeline. Just as visualization is necessary to understand the relationships in data, proper preparation or **data munging** is required to ensure machine learning models work optimally. \n", + "\n", + "The process of data preparation is highly interactive and iterative. A typical process includes at least the following steps:\n", + "1. **Visualization** of the dataset to understand the relationships and identify possible problems with the data.\n", + "2. **Data cleaning and transformation** to address the problems identified. In many cases, step 1 is then repeated to verify that the cleaning and transformation had the desired effect. \n", + "3. **Construction and evaluation of machine learning models**. Visualization of the results will often reveal further data preparation that is required, taking you back to step 1. \n", + "\n", + "In this lab you will learn the following: \n", + "- Recode character strings to eliminate characters that will not be processed correctly.\n", + "- Find and treat missing values. \n", + "- Set the correct data type for each column. \n", + "- Transform categorical features to create categories with more cases and coding likely to be useful in predicting the label.
\n", + "- Apply transformations to numeric features and the label to improve the distribution properties. \n", + "- Locate and treat duplicate cases. \n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## An example\n", + "\n", + "As a first example you will prepare the automotive dataset. Careful preparation of this dataset, or any dataset, is required before attempting to train any machine learning model. This dataset has a number of problems which must be addressed. Further, some feature engineering will be applied. \n", + "\n", + "### Load the dataset\n", + "\n", + "As a first step you must load the dataset. \n", + "\n", + "Execute the code in the cell below to load the packages required to run this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to load the dataset and display the first 20 rows of the data frame." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices = pd.read_csv('Automobile price data _Raw_.csv')\n", + "auto_prices.head(20)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You will now perform some data preparation steps. \n", + "\n", + "Recall the function clean_auto_data() from the previous lab. The next few sections perform the same tasks as the clean_auto_data() function, with detailed explanations. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Recode names\n", + "\n", + "Notice that several of the column names contain the '-' character.
Python will not correctly interpret names containing '-'. Rather, such a name will be parsed as two names separated by a minus sign. The same problem will occur with column values containing special characters such as '-', ',', '*', '/', '|', '>', '<', '@', '!', etc. If such characters appear in column names or values, they must be replaced with another character. \n", + "\n", + "Execute the code in the cell below to replace the '-' characters by '_':" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Replace '-' in each column name (using col as the loop variable avoids shadowing the built-in str)\n", + "auto_prices.columns = [col.replace('-', '_') for col in auto_prices.columns]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Treat missing values\n", + "\n", + "**Missing values** are a common problem in data sets. Failure to deal with missing values before training a machine learning model will lead to biased training at best, and in many cases actual failure. The Python scikit-learn package will not process arrays with missing values. \n", + "\n", + "There are two problems that must be dealt with when treating missing values:\n", + "1. First you must find the missing values. This can be difficult as there is no standard way missing values are coded. Some common possibilities for missing values are:\n", + " - Coded by some particular character string, or a numeric value like -999. \n", + " - A NULL value or numeric missing value such as a NaN. \n", + "2. You must determine how to treat the missing values:\n", + " - Remove features with substantial numbers of missing values. In many cases, such features are likely to have little information value. \n", + " - Remove rows with missing values. If there are only a few rows with missing values it might be easier and more certain to simply remove them. \n", + " - Impute values. Imputation can be done with simple algorithms such as replacing the missing values with the mean or median value. 
There are also more complex statistical methods, such as the expectation maximization (EM) algorithm or multiple imputation. \n", + " - Use nearest neighbor values. Alternatives for nearest neighbor values include averaging, forward filling, or backward filling. \n", + " \n", + "Carefully observe the first few cases from the data frame and notice that missing values are coded with a '?' character. Execute the code in the cell below to identify the columns with missing values." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Cast every column to string so the '?' code can be compared frame-wide\n", + "(auto_prices.astype(str) == '?').any()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to display the data types of each column." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices.dtypes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the columns with missing values to their data types. In all cases, the columns with missing values have an `object` (character) type as a result of using the '?' code. As a result, some columns that should be numeric (bore, stroke, horsepower, peak_rpm, and price) are coded as `object`.\n", + "\n", + "The next question is how many missing values are in each of these `object` type columns? Execute the code in the cell below to display the counts of missing values. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for col in auto_prices.columns:\n", + " if auto_prices[col].dtype == object:\n", + " ## Count the '?' codes in this column\n", + " count = sum(1 for x in auto_prices[col] if x == '?')\n", + " print(col + ' ' + str(count))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `normalized_losses` column has a significant number of missing values and will be removed. 
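The per-column count of '?' codes can also be computed without an explicit loop, by comparing the whole frame to '?' and summing the resulting Boolean frame. A minimal sketch on a toy frame (the values here are made up, not the real auto data):

```python
import pandas as pd

# Toy frame mimicking the '?' missing-value coding (hypothetical values)
df = pd.DataFrame({'bore': ['3.47', '?', '2.68'],
                   'price': ['13495', '16500', '?'],
                   'make': ['alfa-romero', 'audi', 'bmw']})

# Comparing the whole frame to '?' yields a Boolean frame;
# sum() then counts the True entries per column.
missing_counts = (df == '?').sum()
print(missing_counts)
```

This vectorized form produces the same counts as the loop, one number per column.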
Columns that should be numeric, but contain missing values, are processed in the following manner:\n", + "1. The '?' values are converted to the numpy missing value `nan`.\n", + "2. Rows containing `nan` values are removed. \n", + "\n", + "Execute this code, noticing the resulting shape of the data frame. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Drop column with too many missing values\n", + "auto_prices.drop('normalized_losses', axis = 1, inplace = True)\n", + "## Remove rows with missing values, accounting for missing values coded as '?'\n", + "cols = ['price', 'bore', 'stroke',\n", + " 'horsepower', 'peak_rpm']\n", + "for column in cols:\n", + " auto_prices.loc[auto_prices[column] == '?', column] = np.nan\n", + "auto_prices.dropna(axis = 0, inplace = True)\n", + "auto_prices.shape " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The data set now contains 195 cases and 25 columns. 10 rows have been dropped by removing missing values. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Transform column data type\n", + "\n", + "As has been previously noted, there are five columns in this dataset which do not have the correct type as a result of missing values. This is a common situation, as the methods used to automatically determine data type when loading files can fail when missing values are present. \n", + "\n", + "The code in the cell below iterates over a list of columns, converting each to a numeric type. Execute this code and observe the resulting types."
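As an aside, `pd.to_numeric` accepts `errors='coerce'`, which maps any unparseable entry (such as the '?' code) to `NaN` directly, folding the recoding and conversion steps into one. A minimal sketch on made-up values:

```python
import pandas as pd

# Hypothetical string column containing a '?' missing-value code
bore = pd.Series(['3.47', '?', '2.68'])

# errors='coerce' converts parseable strings and maps the rest to NaN
bore_numeric = pd.to_numeric(bore, errors='coerce')
print(bore_numeric.isna().sum())
```

The rows flagged as `NaN` can then be dropped with `dropna` as above.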
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for column in cols:\n", + " auto_prices[column] = pd.to_numeric(auto_prices[column])\n", + "auto_prices[cols].dtypes" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Feature engineering and transforming variables\n", + "\n", + "In most cases, machine learning is not done with the raw features. Features are transformed or combined to form new features that are more predictive. This process is known as **feature engineering**. In many cases, good feature engineering is more important than the details of the machine learning model used. It is often the case that good features can make even poor machine learning models work well, whereas, given poor features, even the best machine learning model will produce poor results. As the famous saying goes, \"garbage in, garbage out\".\n", + "\n", + "Some common approaches to feature engineering include:\n", + "- **Aggregating categories** of categorical variables to reduce the number. Categorical features or labels with too many unique categories will limit the predictive power of a machine learning model. Aggregating categories can improve this situation, sometimes greatly. However, one must be careful. It only makes sense to aggregate categories that are similar in the domain of the problem. Thus, domain expertise must be applied. \n", + "- **Transforming numeric variables** to improve their distribution properties and to make their relationships with other variables more linear. This process can be applied not only to features, but also to labels for regression problems. Some common transformations include **logarithmic** and **power** transformations, including squares and square roots. \n", + "- **Computing new features** from two or more existing features. These new features are often referred to as **interaction terms**. 
An interaction occurs when, say, the product of the values of two features is significantly more predictive than the two features by themselves. Consider the probability of purchase for a luxury men's shoe. This probability depends on the interaction of the buyer being a man and the buyer being wealthy. As another example, consider the number of expected riders on a bus route. This value will depend on the interaction between the time of day and whether it is a holiday. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Aggregating categorical variables\n", + "\n", + "When a dataset contains categorical variables, these need to be investigated to ensure that each category has sufficient samples. It is commonly the case that some categories have very few samples, or that there are so many similar categories as to be meaningless. \n", + "\n", + "As a specific case, you will examine the number of cylinders in the cars. Execute the code in the cell below to print a frequency table for this variable and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices['num_of_cylinders'].value_counts()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that there is only one car each with three cylinders and with twelve cylinders. There are only four cars with eight cylinders, and 10 cars with five cylinders. It is unlikely that these sparse categories will show statistically significant differences in predicting auto price. It is clear that these categories need to be aggregated. \n", + "\n", + "The code in the cell below uses a Python dictionary to recode the number of cylinder categories into a smaller number of categories. Execute this code and examine the resulting frequency table."
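Returning briefly to interaction terms: the bus-ridership example above can be sketched as a new product feature. Toy data with hypothetical values, not part of the lab's datasets:

```python
import pandas as pd

# Hypothetical bus-route data: hour of day and a holiday flag
rides = pd.DataFrame({'hour': [8, 8, 17, 17],
                      'is_holiday': [0, 1, 0, 1]})

# The product of the two features is an interaction term; a model can
# now weight the time of day differently on holidays vs. workdays.
rides['hour_x_holiday'] = rides['hour'] * rides['is_holiday']
print(rides['hour_x_holiday'].tolist())
```

The interaction column is zero on workdays and equals the hour on holidays, giving a model a separate handle on holiday traffic.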
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "cylinder_categories = {'three':'three_four', 'four':'three_four', \n", + " 'five':'five_six', 'six':'five_six',\n", + " 'eight':'eight_twelve', 'twelve':'eight_twelve'}\n", + "auto_prices['num_of_cylinders'] = [cylinder_categories[x] for x in auto_prices['num_of_cylinders']]\n", + "auto_prices['num_of_cylinders'].value_counts()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are now three categories. One of these categories only has five members. However, it is likely that these autos will have different pricing from others.\n", + "\n", + "Next, execute the code in the cell below to make box plots of the new cylinder categories." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_box(auto_prices, col, col_y = 'price'):\n", + " sns.set_style(\"whitegrid\")\n", + " sns.boxplot(col, col_y, data=auto_prices)\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + " \n", + "plot_box(auto_prices, 'num_of_cylinders') " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Indeed, the price range of these categories is distinctive. It is likely that these new categories will be useful in predicting the price of autos. \n", + "\n", + "Now, execute the code in the cell below and examine the frequency table for the `body_style` feature." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices['body_style'].value_counts()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Two of these categories have a limited number of cases. These categories can be aggregated to increase the number of cases using the same approach as for the number of cylinders. 
Execute the code in the cell below to aggregate these categories." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "body_cats = {'sedan':'sedan', 'hatchback':'hatchback', 'wagon':'wagon', \n", + " 'hardtop':'hardtop_convert', 'convertible':'hardtop_convert'}\n", + "auto_prices['body_style'] = [body_cats[x] for x in auto_prices['body_style']]\n", + "auto_prices['body_style'].value_counts()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To investigate whether this aggregation of categories was a good idea, execute the code in the cell below to display a box plot. \n", + "\n", + "Then, answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_box(auto_prices, col, col_y = 'price'):\n", + " sns.set_style(\"whitegrid\")\n", + " sns.boxplot(col, col_y, data=auto_prices)\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.show()\n", + " \n", + "plot_box(auto_prices, 'body_style') " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `hardtop_convert` category does appear to have values distinct from the other body styles. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Transforming numeric variables\n", + "\n", + "To improve the performance of machine learning models, transformations of the values are often applied. Typically, transformations are used to make the relationships between variables more linear. In other cases, transformations are performed to make distributions closer to Normal, or at least more symmetric. These transformations can include taking logarithms, exponential transformations and power transformations. \n", + "\n", + "In this case, you will transform the label, the price of the car. 
Execute the code in the cell below to display and examine a histogram of the label. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def hist_plot(vals, lab):\n", + " ## Distribution plot of values\n", + " sns.distplot(vals)\n", + " plt.title('Histogram of ' + lab)\n", + " plt.xlabel('Value')\n", + " plt.ylabel('Density')\n", + " \n", + "hist_plot(auto_prices['price'], 'prices')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The distribution of auto price is both quite skewed to the right and multimodal. Given the skew and the fact that there are no values less than or equal to zero, a log transformation might be appropriate.\n", + "\n", + "The code in the cell below displays a histogram of the logarithm of prices. Execute this code and examine the result." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices['log_price'] = np.log(auto_prices['price'])\n", + "hist_plot(auto_prices['log_price'], 'log prices')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The distribution of the logarithm of price is more symmetric, but still shows some multimodal tendency and skew. Nonetheless, this is an improvement, so we will use these values as our label.\n", + "\n", + "The next question is, how does this transformation change the relationship between the label and some of the features? To find out, execute the code in the cell below. 
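One practical point when a model is trained on `log_price`: its predictions come back on the log scale, and `np.exp` is needed to report them as prices. A minimal sketch with made-up values:

```python
import numpy as np

# Hypothetical prices (not the real dataset values)
prices = np.array([5118.0, 13495.0, 45400.0])

# Forward transform used for the label ...
log_prices = np.log(prices)

# ... and the inverse applied to (here, perfect) predictions
recovered = np.exp(log_prices)
print(recovered)
```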
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_scatter_shape(auto_prices, cols, shape_col = 'fuel_type', col_y = 'log_price', alpha = 0.2):\n", + " shapes = ['+', 'o', 's', 'x', '^'] # pick distinctive shapes\n", + " unique_cats = auto_prices[shape_col].unique()\n", + " for col in cols: # loop over the columns to plot\n", + " sns.set_style(\"whitegrid\")\n", + " for i, cat in enumerate(unique_cats): # loop over the unique categories\n", + " temp = auto_prices[auto_prices[shape_col] == cat]\n", + " sns.regplot(col, col_y, data=temp, marker = shapes[i], label = cat,\n", + " scatter_kws={\"alpha\":alpha}, fit_reg = False, color = 'blue')\n", + " plt.title('Scatter plot of ' + col_y + ' vs. ' + col) # Give the plot a main title\n", + " plt.xlabel(col) # Set text for the x axis\n", + " plt.ylabel(col_y)# Set text for y axis\n", + " plt.legend()\n", + " plt.show()\n", + " \n", + "num_cols = ['curb_weight', 'engine_size', 'horsepower', 'city_mpg']\n", + "plot_scatter_shape(auto_prices, num_cols) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Comparing the results to those obtained in the visualization lab, it does appear that the relationships of curb_weight and city_mpg with log_price are more linear than their relationships with the untransformed price.\n", + "\n", + "The relationship between log_price and the categorical variables should likely also be investigated. It is also possible that some type of power transformation should be applied to, say, horsepower or engine_size. In the interest of brevity, these ideas are not pursued here. \n", + "\n", + "Before proceeding, answer **Question 2** on the course page." 
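The power transformations suggested for horsepower or engine_size can be checked with a simple skewness measure; the sketch below uses hypothetical right-skewed values standing in for horsepower (numpy only, not part of the lab):

```python
import numpy as np

def skewness(x):
    # Sample skewness: mean of the cubed standardized values
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

# Hypothetical right-skewed values standing in for horsepower
hp = np.array([48.0, 55.0, 62.0, 70.0, 88.0, 95.0, 116.0, 160.0, 262.0])

# A square-root (power 0.5) transform pulls in the long right tail
print(skewness(hp), skewness(np.sqrt(hp)))
```

Comparing the two numbers shows the square root reducing the skew, which is the kind of check one would run before adopting such a transform.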
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Let's save the dataframe to a csv file \n", + "# We will use this in the next module so that we don't have to re-do the steps above\n", + "# You don't have to run this code as the csv file has been saved under the next module's folder\n", + "#auto_prices.to_csv('Auto_Data_Preped.csv', index = False, header = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Another example\n", + "\n", + "Next, you will prepare the German credit data. Execute the code in the cell below to load the dataset and print the head (first 5 rows) of the dataframe." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit = pd.read_csv('German_Credit.csv', header=None)\n", + "credit.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This dataset is a bit hard to understand. For a start, the column names are not human readable. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Recode character strings \n", + "\n", + "You have likely noticed that the column names are not human readable. This can be changed as was done for the previous dataset. Execute the code in the cell below to add human-readable column names to the data frame. 
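As an aside, pattern-based renaming (like the '-' replacement on the auto data) can use pandas' vectorized string methods on the column index, which avoids an explicit list comprehension. A sketch on a toy frame with made-up column names:

```python
import pandas as pd

# Toy frame with '-' in the column names (hypothetical columns)
df = pd.DataFrame({'curb-weight': [2548], 'engine-size': [130]})

# Index.str.replace applies the substitution to every column name at once
df.columns = df.columns.str.replace('-', '_')
print(list(df.columns))
```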
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit.columns = ['customer_id', 'checking_account_status', 'loan_duration_mo', 'credit_history', \n", + " 'purpose', 'loan_amount', 'savings_account_balance', \n", + " 'time_employed_yrs', 'payment_pcnt_income','gender_status', \n", + " 'other_signators', 'time_in_residence', 'property', 'age_yrs',\n", + " 'other_credit_outstanding', 'home_ownership', 'number_loans', \n", + " 'job_category', 'dependents', 'telephone', 'foreign_worker', \n", + " 'bad_credit']\n", + "credit.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, there is a trickier problem to deal with. The current coding of the categorical variables is opaque, which makes interpreting these variables nearly impossible. \n", + "\n", + "The code in the cell below uses a list of dictionaries to recode the categorical features with human-readable text. The final dictionary in the list recodes good and bad credit as a binary variable, $\{ 0,1 \}$. Two iterators are used to apply the dictionaries:\n", + "1. The `for` loop iterates over the columns and indexes the dictionary for the column. \n", + "2. A list comprehension iterates over the values in the column and uses the dictionary to map the codes to human-readable category names. 
\n", + "\n", + "Execute this code and examine the result: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "code_list = [['checking_account_status', \n", + " {'A11' : '< 0 DM', \n", + " 'A12' : '0 - 200 DM', \n", + " 'A13' : '> 200 DM or salary assignment', \n", + " 'A14' : 'none'}],\n", + " ['credit_history',\n", + " {'A30' : 'no credit - paid', \n", + " 'A31' : 'all loans at bank paid', \n", + " 'A32' : 'current loans paid', \n", + " 'A33' : 'past payment delays', \n", + " 'A34' : 'critical account - other non-bank loans'}],\n", + " ['purpose',\n", + " {'A40' : 'car (new)', \n", + " 'A41' : 'car (used)',\n", + " 'A42' : 'furniture/equipment',\n", + " 'A43' : 'radio/television', \n", + " 'A44' : 'domestic appliances', \n", + " 'A45' : 'repairs', \n", + " 'A46' : 'education', \n", + " 'A47' : 'vacation',\n", + " 'A48' : 'retraining',\n", + " 'A49' : 'business', \n", + " 'A410' : 'other' }],\n", + " ['savings_account_balance',\n", + " {'A61' : '< 100 DM', \n", + " 'A62' : '100 - 500 DM', \n", + " 'A63' : '500 - 1000 DM', \n", + " 'A64' : '>= 1000 DM',\n", + " 'A65' : 'unknown/none' }],\n", + " ['time_employed_yrs',\n", + " {'A71' : 'unemployed',\n", + " 'A72' : '< 1 year', \n", + " 'A73' : '1 - 4 years', \n", + " 'A74' : '4 - 7 years', \n", + " 'A75' : '>= 7 years'}],\n", + " ['gender_status',\n", + " {'A91' : 'male-divorced/separated', \n", + " 'A92' : 'female-divorced/separated/married',\n", + " 'A93' : 'male-single', \n", + " 'A94' : 'male-married/widowed', \n", + " 'A95' : 'female-single'}],\n", + " ['other_signators',\n", + " {'A101' : 'none', \n", + " 'A102' : 'co-applicant', \n", + " 'A103' : 'guarantor'}],\n", + " ['property',\n", + " {'A121' : 'real estate',\n", + " 'A122' : 'building society savings/life insurance', \n", + " 'A123' : 'car or other',\n", + " 'A124' : 'unknown-none' }],\n", + " ['other_credit_outstanding',\n", + " {'A141' : 'bank', \n", + " 'A142' : 
'stores', \n", + " 'A143' : 'none'}],\n", + " ['home_ownership',\n", + " {'A151' : 'rent', \n", + " 'A152' : 'own', \n", + " 'A153' : 'for free'}],\n", + " ['job_category',\n", + " {'A171' : 'unemployed-unskilled-non-resident', \n", + " 'A172' : 'unskilled-resident', \n", + " 'A173' : 'skilled',\n", + " 'A174' : 'highly skilled'}],\n", + " ['telephone', \n", + " {'A191' : 'none', \n", + " 'A192' : 'yes'}],\n", + " ['foreign_worker',\n", + " {'A201' : 'yes', \n", + " 'A202' : 'no'}],\n", + " ['bad_credit',\n", + " {2 : 1,\n", + " 1 : 0}]]\n", + "\n", + "for col_dic in code_list:\n", + " col = col_dic[0]\n", + " dic = col_dic[1]\n", + " credit[col] = [dic[x] for x in credit[col]]\n", + " \n", + "credit.head() " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The categorical values are now coded in a human readable manner. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Remove duplicate rows\n", + "\n", + "Duplicate cases can seriously bias the training of machine learning models. In simple terms, cases which are duplicates add undue weight to that case when training a machine learning model. Therefore, it is necessary to ensure there are no duplicates in the dataset before training a model. \n", + "\n", + "One must be careful when determining if a case is a duplicate or not. It is possible that some cases have identical values, particularly if most or all features are categorical. On the other hand, if there are columns with values guaranteed to be unique these can be used to detect and remove duplicates.\n", + "\n", + "Another consideration when removing duplicate cases is determining which case to remove. If the duplicates have different dates of creation, the newest date is often selected. In the absence of such a criteria, the choice is often arbitrary. You may chose to keep the first case or the last case. \n", + "\n", + "The German credit data has a customer_id column which should be unique. 
In the previous lab, we simply removed the customer_id column. It turns out that this identifier column is useful for determining duplicate rows. The presence of duplicates can be determined by comparing the number of rows to the number of unique values of the identifier column, in this case the customer_id column. The code in the cell below prints the shape of the data frame and the number of unique customer_id values. \n", + "\n", + "Execute this code, examine the results, and answer **Question 3** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(credit.shape)\n", + "print(credit.customer_id.unique().shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are 12 duplicate cases. These need to be located and the duplicates removed. In this case, the first instance will be kept. \n", + "\n", + "The code in the cell below removes these duplicates from the data frame in place, and the number of remaining rows and unique customer_ids is printed. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "credit.drop_duplicates(subset = 'customer_id', keep = 'first', inplace = True)\n", + "print(credit.shape)\n", + "print(credit.customer_id.unique().shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The duplicate rows have been successfully removed. 
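The same duplicate check can be expressed with pandas' `duplicated` method; a sketch on a toy frame with one repeated id (all values made up):

```python
import pandas as pd

# Toy frame with a duplicated customer_id (hypothetical ids)
df = pd.DataFrame({'customer_id': [101, 102, 102, 103],
                   'loan_amount': [1000, 2500, 2500, 400]})

# duplicated() flags every occurrence after the first within the subset
n_dups = df.duplicated(subset='customer_id').sum()

# keep='first' retains the first occurrence, matching the choice above
deduped = df.drop_duplicates(subset='customer_id', keep='first')
print(n_dups, deduped.shape)
```

`duplicated` gives the count directly, without comparing the frame shape to the number of unique ids.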
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Let's save the dataframe to a csv file \n", + "# We will use this in the next module so that we don't have to re-do the steps above\n", + "# You don't have to run this code as the csv file has been saved under the next module's folder\n", + "#credit.to_csv('German_Credit_Preped.csv', index = False, header = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Feature engineering\n", + "\n", + "Some feature engineering needs to be investigated to determine if any improvement in predictive power can be expected. From the previous data exploration, it is apparent that several of the numeric features had a strong right skew. A log transformation may help in a case like this. \n", + "\n", + "The code in the cell below uses the Pandas `applymap` method to apply the `log` function to each element of several columns in the data frame. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit[['log_loan_duration_mo', 'log_loan_amount', 'log_age_yrs']] = credit[['loan_duration_mo', 'loan_amount', 'age_yrs']].applymap(math.log)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to visualize the differences in the distributions of the untransformed and transformed variables for the two label values. 
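The elementwise `applymap(math.log)` call above can also be written with the vectorized `np.log`, which operates on the whole frame at once and is typically faster. A sketch on made-up values:

```python
import numpy as np
import pandas as pd

# Hypothetical positive-valued columns (not the real credit data)
df = pd.DataFrame({'loan_amount': [1169.0, 5951.0],
                   'age_yrs': [67.0, 22.0]})

# np.log broadcasts over every element of the DataFrame
logs = np.log(df)
print(logs.round(3))
```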
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "num_cols = ['log_loan_duration_mo', 'log_loan_amount', 'log_age_yrs',\n", + " 'loan_duration_mo', 'loan_amount', 'age_yrs']\n", + "\n", + "for col in num_cols:\n", + " print(col)\n", + " _ = plt.figure(figsize = (10,4))\n", + " sns.violinplot(x= 'bad_credit', y = col, hue = 'bad_credit', \n", + " data = credit)\n", + " plt.ylabel('value')\n", + " plt.xlabel(col)\n", + " plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The log transformed features have more symmetric distributions. However, it does not appear that the separation of the label cases is improved. These features will not be used further.\n", + "\n", + "****\n", + "**Note:** Recalling the visualization of the categorical features, there are quite a few categories with few cases. However, it is not clear how these categories can be reasonably combined. It may be the case that some of these categorical features are not terribly predictive.\n", + "****" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "Good data preparation is the key to good machine learning performance. Data preparation or data munging is a time-consuming, interactive, and iterative process. Continue to visualize the results as you test ideas. Expect to try many approaches, reject the ones that do not help, and keep the ones that do. In summary, test a lot of ideas, fail fast, keep what works. The reward is that well prepared data can improve the performance of almost any machine learning algorithm." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module3/German_Credit.csv b/Module3/German_Credit.csv new file mode 100644 index 0000000..0f6b915 --- /dev/null +++ b/Module3/German_Credit.csv @@ -0,0 +1,1012 @@ +1122334,A11,6,A34,A43,1169,A65,A75,4,A93,A101,4,A121,67,A143,A152,2,A173,1,A192,A201,1 +6156361,A12,48,A32,A43,5951,A61,A73,2,A92,A101,2,A121,22,A143,A152,1,A173,1,A191,A201,2 +2051359,A14,12,A34,A46,2096,A61,A74,2,A93,A101,3,A121,49,A143,A152,1,A172,2,A191,A201,1 +8740590,A11,42,A32,A42,7882,A61,A74,2,A93,A103,4,A122,45,A143,A153,1,A173,2,A191,A201,1 +3924540,A11,24,A33,A40,4870,A61,A73,3,A93,A101,4,A124,53,A143,A153,2,A173,2,A191,A201,2 +3115687,A14,36,A32,A46,9055,A65,A73,2,A93,A101,4,A124,35,A143,A153,1,A172,2,A192,A201,1 +8251714,A14,24,A32,A42,2835,A63,A75,3,A93,A101,4,A122,53,A143,A152,1,A173,1,A191,A201,1 +2272783,A12,36,A32,A41,6948,A61,A73,2,A93,A101,2,A123,35,A143,A151,1,A174,1,A192,A201,1 +1865292,A14,12,A32,A43,3059,A64,A74,2,A91,A101,4,A121,61,A143,A152,1,A172,1,A191,A201,1 +8369450,A12,30,A34,A40,5234,A61,A71,4,A94,A101,2,A123,28,A143,A152,2,A174,1,A191,A201,2 +2859658,A12,12,A32,A40,1295,A61,A72,3,A92,A101,1,A123,25,A143,A151,1,A173,1,A191,A201,2 +7677202,A11,48,A32,A49,4308,A61,A72,3,A92,A101,4,A122,24,A143,A151,1,A173,1,A191,A201,2 +6368245,A12,12,A32,A43,1567,A61,A73,1,A92,A101,1,A123,22,A143,A152,1,A173,1,A192,A201,1 +8363768,A11,24,A34,A40,1199,A61,A75,4,A93,A101,4,A123,60,A143,A152,2,A172,1,A191,A201,2 
+6846974,A11,15,A32,A40,1403,A61,A73,2,A92,A101,4,A123,28,A143,A151,1,A173,1,A191,A201,1 +6293067,A11,24,A32,A43,1282,A62,A73,4,A92,A101,2,A123,32,A143,A152,1,A172,1,A191,A201,2 +2807551,A14,24,A34,A43,2424,A65,A75,4,A93,A101,4,A122,53,A143,A152,2,A173,1,A191,A201,1 +8394907,A11,30,A30,A49,8072,A65,A72,2,A93,A101,3,A123,25,A141,A152,3,A173,1,A191,A201,1 +9516779,A12,24,A32,A41,12579,A61,A75,4,A92,A101,2,A124,44,A143,A153,1,A174,1,A192,A201,2 +1750839,A14,24,A32,A43,3430,A63,A75,3,A93,A101,2,A123,31,A143,A152,1,A173,2,A192,A201,1 +9007770,A14,9,A34,A40,2134,A61,A73,4,A93,A101,4,A123,48,A143,A152,3,A173,1,A192,A201,1 +5611578,A11,6,A32,A43,2647,A63,A73,2,A93,A101,3,A121,44,A143,A151,1,A173,2,A191,A201,1 +3677496,A11,10,A34,A40,2241,A61,A72,1,A93,A101,3,A121,48,A143,A151,2,A172,2,A191,A202,1 +3265162,A12,12,A34,A41,1804,A62,A72,3,A93,A101,4,A122,44,A143,A152,1,A173,1,A191,A201,1 +7795200,A14,10,A34,A42,2069,A65,A73,2,A94,A101,1,A123,26,A143,A152,2,A173,1,A191,A202,1 +4362293,A11,6,A32,A42,1374,A61,A73,1,A93,A101,2,A121,36,A141,A152,1,A172,1,A192,A201,1 +4506902,A14,6,A30,A43,426,A61,A75,4,A94,A101,4,A123,39,A143,A152,1,A172,1,A191,A201,1 +8881521,A13,12,A31,A43,409,A64,A73,3,A92,A101,3,A121,42,A143,A151,2,A173,1,A191,A201,1 +1398341,A12,7,A32,A43,2415,A61,A73,3,A93,A103,2,A121,34,A143,A152,1,A173,1,A191,A201,1 +7333893,A11,60,A33,A49,6836,A61,A75,3,A93,A101,4,A124,63,A143,A152,2,A173,1,A192,A201,2 +7890417,A12,18,A32,A49,1913,A64,A72,3,A94,A101,3,A121,36,A141,A152,1,A173,1,A192,A201,1 +2017286,A11,24,A32,A42,4020,A61,A73,2,A93,A101,2,A123,27,A142,A152,1,A173,1,A191,A201,1 +7081292,A12,18,A32,A40,5866,A62,A73,2,A93,A101,2,A123,30,A143,A152,2,A173,1,A192,A201,1 +8807838,A14,12,A34,A49,1264,A65,A75,4,A93,A101,4,A124,57,A143,A151,1,A172,1,A191,A201,1 +1221741,A13,12,A32,A42,1474,A61,A72,4,A92,A101,1,A122,33,A141,A152,1,A174,1,A192,A201,1 +3977061,A12,45,A34,A43,4746,A61,A72,4,A93,A101,2,A122,25,A143,A152,2,A172,1,A191,A201,2 
+6028350,A14,48,A34,A46,6110,A61,A73,1,A93,A101,3,A124,31,A141,A153,1,A173,1,A192,A201,1 +2903516,A13,18,A32,A43,2100,A61,A73,4,A93,A102,2,A121,37,A142,A152,1,A173,1,A191,A201,2 +8335289,A13,10,A32,A44,1225,A61,A73,2,A93,A101,2,A123,37,A143,A152,1,A173,1,A192,A201,1 +8596814,A12,9,A32,A43,458,A61,A73,4,A93,A101,3,A121,24,A143,A152,1,A173,1,A191,A201,1 +1473485,A14,30,A32,A43,2333,A63,A75,4,A93,A101,2,A123,30,A141,A152,1,A174,1,A191,A201,1 +5043986,A12,12,A32,A43,1158,A63,A73,3,A91,A101,1,A123,26,A143,A152,1,A173,1,A192,A201,1 +7015188,A12,18,A33,A45,6204,A61,A73,2,A93,A101,4,A121,44,A143,A152,1,A172,2,A192,A201,1 +5609809,A11,30,A34,A41,6187,A62,A74,1,A94,A101,4,A123,24,A143,A151,2,A173,1,A191,A201,1 +6574474,A11,48,A34,A41,6143,A61,A75,4,A92,A101,4,A124,58,A142,A153,2,A172,1,A191,A201,2 +6668437,A14,11,A34,A40,1393,A61,A72,4,A92,A101,4,A123,35,A143,A152,2,A174,1,A191,A201,1 +7468489,A14,36,A32,A43,2299,A63,A75,4,A93,A101,4,A123,39,A143,A152,1,A173,1,A191,A201,1 +6045059,A11,6,A32,A41,1352,A63,A71,1,A92,A101,2,A122,23,A143,A151,1,A171,1,A192,A201,1 +3678840,A14,11,A34,A40,7228,A61,A73,1,A93,A101,4,A122,39,A143,A152,2,A172,1,A191,A201,1 +9514325,A14,12,A32,A43,2073,A62,A73,4,A92,A102,2,A121,28,A143,A152,1,A173,1,A191,A201,1 +7764002,A12,24,A33,A42,2333,A65,A72,4,A93,A101,2,A122,29,A141,A152,1,A172,1,A191,A201,1 +4029031,A12,27,A33,A41,5965,A61,A75,1,A93,A101,2,A123,30,A143,A152,2,A174,1,A192,A201,1 +1318782,A14,12,A32,A43,1262,A61,A73,3,A93,A101,2,A123,25,A143,A152,1,A173,1,A191,A201,1 +1442097,A14,18,A32,A41,3378,A65,A73,2,A93,A101,1,A122,31,A143,A152,1,A173,1,A192,A201,1 +7784864,A12,36,A33,A40,2225,A61,A75,4,A93,A101,4,A124,57,A141,A153,2,A173,1,A192,A201,2 +4572543,A14,6,A31,A40,783,A65,A73,1,A93,A103,2,A121,26,A142,A152,1,A172,2,A191,A201,1 +4365555,A12,12,A32,A43,6468,A65,A71,2,A93,A101,1,A124,52,A143,A152,1,A174,1,A192,A201,2 +6134562,A14,36,A34,A43,9566,A61,A73,2,A92,A101,2,A123,31,A142,A152,2,A173,1,A191,A201,1 
+3409265,A13,18,A32,A40,1961,A61,A75,3,A92,A101,2,A123,23,A143,A152,1,A174,1,A191,A201,1 +1504574,A11,36,A34,A42,6229,A61,A72,4,A92,A102,4,A124,23,A143,A151,2,A172,1,A192,A201,2 +3256655,A12,9,A32,A49,1391,A61,A73,2,A94,A101,1,A121,27,A141,A152,1,A173,1,A192,A201,1 +8806796,A12,15,A34,A43,1537,A65,A75,4,A93,A103,4,A121,50,A143,A152,2,A173,1,A192,A201,1 +1364372,A12,36,A30,A49,1953,A61,A75,4,A93,A101,4,A124,61,A143,A153,1,A174,1,A192,A201,2 +8688712,A12,48,A30,A49,14421,A61,A73,2,A93,A101,2,A123,25,A143,A152,1,A173,1,A192,A201,2 +3747039,A14,24,A32,A43,3181,A61,A72,4,A92,A101,4,A122,26,A143,A152,1,A173,1,A192,A201,1 +3453642,A14,27,A32,A45,5190,A65,A75,4,A93,A101,4,A122,48,A143,A152,4,A173,2,A192,A201,1 +5520489,A14,12,A32,A43,2171,A61,A72,2,A92,A101,2,A123,29,A141,A152,1,A173,1,A191,A201,1 +1647906,A12,12,A32,A40,1007,A64,A73,4,A94,A101,1,A121,22,A143,A152,1,A173,1,A191,A201,1 +2987381,A14,36,A32,A46,1819,A61,A73,4,A93,A101,4,A124,37,A142,A153,1,A173,1,A192,A201,2 +3683782,A14,36,A32,A43,2394,A65,A73,4,A92,A101,4,A123,25,A143,A152,1,A173,1,A191,A201,1 +6065675,A14,36,A32,A41,8133,A61,A73,1,A92,A101,2,A122,30,A141,A152,1,A173,1,A191,A201,1 +1510672,A14,7,A34,A43,730,A65,A75,4,A93,A101,2,A122,46,A143,A151,2,A172,1,A192,A201,1 +6390550,A11,8,A34,A410,1164,A61,A75,3,A93,A101,4,A124,51,A141,A153,2,A174,2,A192,A201,1 +6771352,A12,42,A34,A49,5954,A61,A74,2,A92,A101,1,A121,41,A141,A152,2,A172,1,A191,A201,1 +4802045,A11,36,A32,A46,1977,A65,A75,4,A93,A101,4,A124,40,A143,A152,1,A174,1,A192,A201,2 +8911390,A11,12,A34,A41,1526,A61,A75,4,A93,A101,4,A124,66,A143,A153,2,A174,1,A191,A201,1 +1139059,A11,42,A32,A43,3965,A61,A72,4,A93,A101,3,A123,34,A143,A152,1,A173,1,A191,A201,2 +7399701,A12,11,A33,A43,4771,A61,A74,2,A93,A101,4,A122,51,A143,A152,1,A173,1,A191,A201,1 +5604915,A14,54,A30,A41,9436,A65,A73,2,A93,A101,2,A122,39,A143,A152,1,A172,2,A191,A201,1 +7683335,A12,30,A32,A42,3832,A61,A72,2,A94,A101,1,A122,22,A143,A152,1,A173,1,A191,A201,1 
+5503995,A14,24,A32,A43,5943,A65,A72,1,A92,A101,1,A123,44,A143,A152,2,A173,1,A192,A201,2 +8801062,A14,15,A32,A43,1213,A63,A75,4,A93,A101,3,A122,47,A142,A152,1,A173,1,A192,A201,1 +1817735,A14,18,A32,A49,1568,A62,A73,3,A92,A101,4,A122,24,A143,A151,1,A172,1,A191,A201,1 +1806409,A11,24,A32,A410,1755,A61,A75,4,A92,A103,4,A121,58,A143,A152,1,A172,1,A192,A201,1 +3028735,A11,10,A32,A43,2315,A61,A75,3,A93,A101,4,A121,52,A143,A152,1,A172,1,A191,A201,1 +3707969,A14,12,A34,A49,1412,A61,A73,4,A92,A103,2,A121,29,A143,A152,2,A174,1,A192,A201,1 +7327747,A12,18,A34,A42,1295,A61,A72,4,A92,A101,1,A122,27,A143,A152,2,A173,1,A191,A201,1 +5790827,A12,36,A32,A46,12612,A62,A73,1,A93,A101,4,A124,47,A143,A153,1,A173,2,A192,A201,2 +1812102,A11,18,A32,A40,2249,A62,A74,4,A93,A101,3,A123,30,A143,A152,1,A174,2,A192,A201,1 +4049219,A11,12,A30,A45,1108,A61,A74,4,A93,A101,3,A121,28,A143,A152,2,A173,1,A191,A201,2 +6177564,A14,12,A34,A43,618,A61,A75,4,A93,A101,4,A121,56,A143,A152,1,A173,1,A191,A201,1 +9127163,A11,12,A34,A41,1409,A61,A75,4,A93,A101,3,A121,54,A143,A152,1,A173,1,A191,A201,1 +2628060,A14,12,A34,A43,797,A65,A75,4,A92,A101,3,A122,33,A141,A152,1,A172,2,A191,A201,2 +2031456,A13,24,A34,A42,3617,A65,A75,4,A93,A102,4,A124,20,A143,A151,2,A173,1,A191,A201,1 +3725303,A12,12,A32,A40,1318,A64,A75,4,A93,A101,4,A121,54,A143,A152,1,A173,1,A192,A201,1 +8326409,A12,54,A30,A49,15945,A61,A72,3,A93,A101,4,A124,58,A143,A151,1,A173,1,A192,A201,2 +1957928,A14,12,A34,A46,2012,A65,A74,4,A92,A101,2,A123,61,A143,A152,1,A173,1,A191,A201,1 +6428170,A12,18,A32,A49,2622,A62,A73,4,A93,A101,4,A123,34,A143,A152,1,A173,1,A191,A201,1 +4371483,A12,36,A34,A43,2337,A61,A75,4,A93,A101,4,A121,36,A143,A152,1,A173,1,A191,A201,1 +7476603,A12,20,A33,A41,7057,A65,A74,3,A93,A101,4,A122,36,A141,A151,2,A174,2,A192,A201,1 +2083425,A14,24,A32,A40,1469,A62,A75,4,A94,A101,4,A121,41,A143,A151,1,A172,1,A191,A201,1 +7663021,A12,36,A32,A43,2323,A61,A74,4,A93,A101,4,A123,24,A143,A151,1,A173,1,A191,A201,1 
+8330126,A14,6,A33,A43,932,A61,A73,3,A92,A101,2,A121,24,A143,A152,1,A173,1,A191,A201,1 +4762252,A12,9,A34,A42,1919,A61,A74,4,A93,A101,3,A123,35,A143,A151,1,A173,1,A192,A201,1 +8744199,A14,12,A32,A41,2445,A65,A72,2,A94,A101,4,A123,26,A143,A151,1,A173,1,A192,A201,1 +7423252,A12,24,A34,A410,11938,A61,A73,2,A93,A102,3,A123,39,A143,A152,2,A174,2,A192,A201,2 +7539327,A14,18,A31,A40,6458,A61,A75,2,A93,A101,4,A124,39,A141,A152,2,A174,2,A192,A201,2 +9510423,A12,12,A32,A40,6078,A61,A74,2,A93,A101,2,A123,32,A143,A152,1,A173,1,A191,A201,1 +7532368,A11,24,A32,A42,7721,A65,A72,1,A92,A101,2,A122,30,A143,A152,1,A173,1,A192,A202,1 +1929784,A12,14,A32,A49,1410,A63,A75,1,A94,A101,2,A121,35,A143,A152,1,A173,1,A192,A201,1 +9517102,A13,6,A34,A40,1299,A61,A73,1,A93,A101,1,A121,74,A143,A152,3,A171,2,A191,A202,1 +2801710,A12,9,A32,A42,918,A61,A73,4,A92,A101,1,A122,30,A143,A152,1,A173,1,A191,A201,2 +3066581,A12,6,A33,A49,1449,A62,A75,1,A91,A101,2,A123,31,A141,A152,2,A173,2,A191,A201,1 +2924368,A13,15,A32,A46,392,A61,A72,4,A92,A101,4,A122,23,A143,A151,1,A173,1,A192,A201,1 +5814710,A12,18,A32,A40,6260,A61,A74,3,A93,A101,3,A121,28,A143,A151,1,A172,1,A191,A201,1 +5186756,A14,36,A34,A40,7855,A61,A73,4,A92,A101,2,A121,25,A142,A152,2,A173,1,A192,A201,2 +3473006,A11,12,A32,A43,1680,A63,A75,3,A94,A101,1,A121,35,A143,A152,1,A173,1,A191,A201,1 +7491936,A14,48,A34,A43,3578,A65,A75,4,A93,A101,1,A121,47,A143,A152,1,A173,1,A192,A201,1 +8010233,A11,42,A32,A43,7174,A65,A74,4,A92,A101,3,A123,30,A143,A152,1,A174,1,A192,A201,2 +4298578,A11,10,A34,A42,2132,A65,A72,2,A92,A102,3,A121,27,A143,A151,2,A173,1,A191,A202,1 +2515323,A11,33,A34,A42,4281,A63,A73,1,A92,A101,4,A123,23,A143,A152,2,A173,1,A191,A201,2 +2317941,A12,12,A34,A40,2366,A63,A74,3,A91,A101,3,A123,36,A143,A152,1,A174,1,A192,A201,1 +3906586,A11,21,A32,A43,1835,A61,A73,3,A92,A101,2,A121,25,A143,A152,2,A173,1,A192,A201,2 +4863589,A14,24,A34,A41,3868,A61,A75,4,A92,A101,2,A123,41,A143,A151,2,A174,1,A192,A201,1 
+4625102,A14,12,A32,A42,1768,A61,A73,3,A93,A101,2,A121,24,A143,A151,1,A172,1,A191,A201,1 +8954004,A13,10,A34,A40,781,A61,A75,4,A93,A101,4,A124,63,A143,A153,2,A173,1,A192,A201,1 +1498295,A12,18,A32,A42,1924,A65,A72,4,A92,A101,3,A121,27,A143,A151,1,A173,1,A191,A201,2 +4851363,A11,12,A34,A40,2121,A61,A73,4,A93,A101,2,A122,30,A143,A152,2,A173,1,A191,A201,1 +1018706,A11,12,A32,A43,701,A61,A73,4,A94,A101,2,A121,40,A143,A152,1,A172,1,A191,A201,1 +9208119,A12,12,A32,A45,639,A61,A73,4,A93,A101,2,A123,30,A143,A152,1,A173,1,A191,A201,2 +4300293,A12,12,A34,A41,1860,A61,A71,4,A93,A101,2,A123,34,A143,A152,2,A174,1,A192,A201,1 +7728014,A11,12,A34,A40,3499,A61,A73,3,A92,A102,2,A121,29,A143,A152,2,A173,1,A191,A201,2 +7990216,A12,48,A32,A40,8487,A65,A74,1,A92,A101,2,A123,24,A143,A152,1,A173,1,A191,A201,1 +1318029,A11,36,A33,A46,6887,A61,A73,4,A93,A101,3,A122,29,A142,A152,1,A173,1,A192,A201,2 +9949768,A14,15,A32,A42,2708,A61,A72,2,A93,A101,3,A122,27,A141,A152,2,A172,1,A191,A201,1 +7906171,A14,18,A32,A42,1984,A61,A73,4,A93,A101,4,A124,47,A141,A153,2,A173,1,A191,A201,1 +4495501,A14,60,A32,A43,10144,A62,A74,2,A92,A101,4,A121,21,A143,A152,1,A173,1,A192,A201,1 +4891536,A14,12,A34,A43,1240,A65,A75,4,A92,A101,2,A121,38,A143,A152,2,A173,1,A192,A201,1 +6332313,A14,27,A33,A41,8613,A64,A73,2,A93,A101,2,A123,27,A143,A152,2,A173,1,A191,A201,1 +2114106,A12,12,A32,A43,766,A63,A73,4,A93,A101,3,A121,66,A143,A152,1,A172,1,A191,A201,2 +8221756,A12,15,A34,A43,2728,A65,A74,4,A93,A103,2,A121,35,A141,A152,3,A173,1,A192,A201,1 +7308446,A13,12,A32,A43,1881,A61,A73,2,A92,A101,2,A123,44,A143,A151,1,A172,1,A192,A201,1 +5759358,A13,6,A32,A40,709,A64,A72,2,A94,A101,2,A121,27,A143,A152,1,A171,1,A191,A202,1 +1432196,A12,36,A32,A43,4795,A61,A72,4,A92,A101,1,A124,30,A143,A152,1,A174,1,A192,A201,1 +2429343,A11,27,A32,A43,3416,A61,A73,3,A93,A101,2,A123,27,A143,A152,1,A174,1,A191,A201,1 +1581866,A11,18,A32,A42,2462,A61,A73,2,A93,A101,2,A123,22,A143,A152,1,A173,1,A191,A201,2 
+6989622,A14,21,A34,A42,2288,A61,A72,4,A92,A101,4,A122,23,A143,A152,1,A173,1,A192,A201,1 +2025676,A12,48,A31,A49,3566,A62,A74,4,A93,A101,2,A123,30,A143,A152,1,A173,1,A191,A201,1 +3000727,A11,6,A34,A40,860,A61,A75,1,A92,A101,4,A124,39,A143,A152,2,A173,1,A192,A201,1 +3953814,A14,12,A34,A40,682,A62,A74,4,A92,A101,3,A123,51,A143,A152,2,A173,1,A192,A201,1 +7408825,A11,36,A34,A42,5371,A61,A73,3,A93,A103,2,A122,28,A143,A152,2,A173,1,A191,A201,1 +6226321,A14,18,A34,A43,1582,A64,A75,4,A93,A101,4,A123,46,A143,A152,2,A173,1,A191,A201,1 +6083792,A14,6,A32,A43,1346,A62,A75,2,A93,A101,4,A124,42,A141,A153,1,A173,2,A192,A201,1 +5122662,A14,10,A32,A43,1924,A61,A73,1,A93,A101,4,A122,38,A143,A152,1,A173,1,A192,A202,1 +9746441,A13,36,A32,A43,5848,A61,A73,4,A93,A101,1,A123,24,A143,A152,1,A173,1,A191,A201,1 +7353198,A12,24,A34,A41,7758,A64,A75,2,A92,A101,4,A124,29,A143,A151,1,A173,1,A191,A201,1 +7766762,A12,24,A33,A49,6967,A62,A74,4,A93,A101,4,A123,36,A143,A151,1,A174,1,A192,A201,1 +5175459,A11,12,A32,A42,1282,A61,A73,2,A92,A101,4,A123,20,A143,A151,1,A173,1,A191,A201,2 +5520884,A11,9,A34,A45,1288,A62,A75,3,A93,A103,4,A121,48,A143,A152,2,A173,2,A191,A202,1 +1137438,A11,12,A31,A48,339,A61,A75,4,A94,A101,1,A123,45,A141,A152,1,A172,1,A191,A201,1 +3892302,A12,24,A32,A40,3512,A62,A74,2,A93,A101,3,A123,38,A141,A152,2,A173,1,A192,A201,1 +9957275,A14,6,A34,A43,1898,A65,A73,1,A93,A101,2,A121,34,A143,A152,2,A172,2,A191,A201,1 +4695507,A14,24,A34,A43,2872,A62,A75,3,A93,A101,4,A121,36,A143,A152,1,A173,2,A192,A201,1 +7775118,A14,18,A34,A40,1055,A61,A72,4,A92,A101,1,A122,30,A143,A152,2,A173,1,A191,A201,1 +4717846,A14,15,A32,A44,1262,A63,A74,4,A93,A101,3,A122,36,A143,A152,2,A173,1,A192,A201,1 +3539651,A12,10,A32,A40,7308,A61,A71,2,A93,A101,4,A124,70,A141,A153,1,A174,1,A192,A201,1 +6446799,A14,36,A32,A40,909,A63,A75,4,A93,A101,4,A122,36,A143,A152,1,A173,1,A191,A201,1 +5749718,A14,6,A32,A42,2978,A63,A73,1,A93,A101,2,A123,32,A143,A152,1,A173,1,A192,A201,1 
+8930405,A11,18,A32,A42,1131,A61,A71,4,A92,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,2 +4030908,A12,11,A32,A42,1577,A64,A72,4,A92,A101,1,A121,20,A143,A152,1,A173,1,A191,A201,1 +1327333,A14,24,A32,A42,3972,A61,A74,2,A92,A101,4,A122,25,A143,A151,1,A173,1,A192,A201,1 +5329060,A12,24,A34,A49,1935,A61,A75,4,A91,A101,4,A121,31,A143,A152,2,A173,1,A192,A201,2 +7922005,A11,15,A30,A40,950,A61,A75,4,A93,A101,3,A123,33,A143,A151,2,A173,2,A191,A201,2 +6450905,A14,12,A32,A42,763,A61,A73,4,A92,A101,1,A121,26,A143,A152,1,A173,1,A192,A201,1 +1909580,A12,24,A33,A42,2064,A61,A71,3,A92,A101,2,A122,34,A143,A152,1,A174,1,A192,A201,2 +2714562,A12,8,A32,A43,1414,A61,A73,4,A93,A103,2,A121,33,A143,A152,1,A173,1,A191,A202,1 +6826470,A11,21,A33,A46,3414,A61,A72,2,A93,A101,1,A122,26,A143,A152,2,A173,1,A191,A201,2 +4340005,A14,30,A31,A41,7485,A65,A71,4,A92,A101,1,A121,53,A141,A152,1,A174,1,A192,A201,2 +7800786,A11,12,A32,A42,2577,A61,A73,2,A91,A101,1,A123,42,A143,A152,1,A173,1,A191,A201,1 +9266615,A11,6,A34,A43,338,A63,A75,4,A93,A101,4,A123,52,A143,A152,2,A173,1,A191,A201,1 +6212932,A14,12,A32,A43,1963,A61,A74,4,A93,A101,2,A123,31,A143,A151,2,A174,2,A192,A201,1 +5419741,A11,21,A34,A40,571,A61,A75,4,A93,A101,4,A121,65,A143,A152,2,A173,1,A191,A201,1 +9177065,A14,36,A33,A49,9572,A61,A72,1,A91,A101,1,A123,28,A143,A152,2,A173,1,A191,A201,2 +9804802,A12,36,A33,A49,4455,A61,A73,2,A91,A101,2,A121,30,A142,A152,2,A174,1,A192,A201,2 +8560790,A11,21,A31,A40,1647,A65,A73,4,A93,A101,2,A122,40,A143,A152,2,A172,2,A191,A201,2 +2450622,A14,24,A34,A42,3777,A64,A73,4,A93,A101,4,A121,50,A143,A152,1,A173,1,A192,A201,1 +6376471,A12,18,A34,A40,884,A61,A75,4,A93,A101,4,A123,36,A141,A152,1,A173,2,A192,A201,2 +2549308,A14,15,A34,A43,1360,A61,A73,4,A93,A101,2,A122,31,A143,A152,2,A173,1,A191,A201,1 +5285284,A12,9,A31,A41,5129,A61,A75,2,A92,A101,4,A124,74,A141,A153,1,A174,2,A192,A201,2 +3020208,A12,16,A34,A40,1175,A61,A71,2,A93,A101,3,A123,68,A143,A153,3,A171,1,A192,A201,1 
+4186149,A11,12,A32,A43,674,A62,A74,4,A94,A101,1,A122,20,A143,A152,1,A173,1,A191,A201,2 +4599122,A12,18,A30,A42,3244,A61,A73,1,A92,A101,4,A123,33,A141,A152,2,A173,1,A192,A201,1 +1730668,A14,24,A32,A49,4591,A64,A73,2,A93,A101,3,A122,54,A143,A152,3,A174,1,A192,A201,2 +3082071,A12,48,A30,A49,3844,A62,A74,4,A93,A101,4,A124,34,A143,A153,1,A172,2,A191,A201,2 +5960666,A12,27,A32,A49,3915,A61,A73,4,A93,A101,2,A123,36,A143,A152,1,A173,2,A192,A201,2 +7033466,A14,6,A32,A43,2108,A61,A74,2,A94,A101,2,A121,29,A143,A151,1,A173,1,A191,A201,1 +4027322,A12,45,A32,A43,3031,A62,A73,4,A93,A103,4,A122,21,A143,A151,1,A173,1,A191,A201,2 +4067442,A12,9,A34,A46,1501,A61,A75,2,A92,A101,3,A123,34,A143,A152,2,A174,1,A192,A201,2 +2816577,A14,6,A34,A43,1382,A61,A73,1,A92,A101,1,A123,28,A143,A152,2,A173,1,A192,A201,1 +2964220,A12,12,A32,A42,951,A62,A72,4,A92,A101,4,A123,27,A141,A151,4,A173,1,A191,A201,2 +4483104,A12,24,A32,A41,2760,A65,A75,4,A93,A101,4,A124,36,A141,A153,1,A173,1,A192,A201,1 +6604417,A12,18,A33,A42,4297,A61,A75,4,A91,A101,3,A124,40,A143,A152,1,A174,1,A192,A201,2 +7854022,A14,9,A34,A46,936,A63,A75,4,A93,A101,2,A123,52,A143,A152,2,A173,1,A192,A201,1 +5232363,A11,12,A32,A40,1168,A61,A73,4,A94,A101,3,A121,27,A143,A152,1,A172,1,A191,A201,1 +7627208,A14,27,A33,A49,5117,A61,A74,3,A93,A101,4,A123,26,A143,A152,2,A173,1,A191,A201,1 +8257351,A11,12,A32,A48,902,A61,A74,4,A94,A101,4,A122,21,A143,A151,1,A173,1,A191,A201,2 +4920965,A14,12,A34,A40,1495,A61,A75,4,A93,A101,1,A121,38,A143,A152,2,A172,2,A191,A201,1 +7198693,A11,30,A34,A41,10623,A61,A75,3,A93,A101,4,A124,38,A143,A153,3,A174,2,A192,A201,1 +6463715,A14,12,A34,A42,1935,A61,A75,4,A93,A101,4,A121,43,A143,A152,3,A173,1,A192,A201,1 +3336919,A12,12,A34,A44,1424,A61,A74,4,A93,A101,3,A122,26,A143,A152,1,A173,1,A191,A201,1 +1727035,A11,24,A32,A49,6568,A61,A73,2,A94,A101,2,A123,21,A142,A152,1,A172,1,A191,A201,1 +1425487,A14,12,A32,A41,1413,A64,A74,3,A93,A101,2,A122,55,A143,A152,1,A173,1,A191,A202,1 
+3808400,A14,9,A34,A43,3074,A65,A73,1,A93,A101,2,A121,33,A143,A152,2,A173,2,A191,A201,1 +9925070,A14,36,A32,A43,3835,A65,A75,2,A92,A101,4,A121,45,A143,A152,1,A172,1,A192,A201,1 +3589132,A11,27,A30,A49,5293,A61,A71,2,A93,A101,4,A122,50,A142,A152,2,A173,1,A192,A201,2 +1102227,A13,30,A33,A49,1908,A61,A75,4,A93,A101,4,A121,66,A143,A152,1,A174,1,A192,A201,2 +3188138,A14,36,A34,A43,3342,A65,A75,4,A93,A101,2,A123,51,A143,A152,1,A173,1,A192,A201,1 +2870117,A12,6,A34,A48,932,A65,A74,1,A92,A101,3,A122,39,A143,A152,2,A172,1,A191,A201,1 +3212489,A11,18,A30,A49,3104,A61,A74,3,A93,A101,1,A122,31,A141,A152,1,A173,1,A192,A201,1 +8004570,A13,36,A32,A43,3913,A61,A73,2,A93,A101,2,A121,23,A143,A152,1,A173,1,A192,A201,1 +1878782,A11,24,A32,A42,3021,A61,A73,2,A91,A101,2,A121,24,A143,A151,1,A172,1,A191,A201,1 +8474591,A14,10,A32,A40,1364,A61,A73,2,A92,A101,4,A123,64,A143,A152,1,A173,1,A192,A201,1 +7696088,A12,12,A32,A43,625,A61,A72,4,A94,A103,1,A121,26,A141,A152,1,A172,1,A191,A201,1 +6328300,A11,12,A32,A46,1200,A65,A73,4,A92,A101,4,A122,23,A141,A151,1,A173,1,A192,A201,1 +2321864,A14,12,A32,A43,707,A61,A73,4,A93,A101,2,A121,30,A141,A152,2,A173,1,A191,A201,1 +4113839,A14,24,A33,A49,2978,A65,A73,4,A93,A101,4,A121,32,A143,A152,2,A173,2,A192,A201,1 +6328203,A14,15,A32,A41,4657,A61,A73,3,A93,A101,2,A123,30,A143,A152,1,A173,1,A192,A201,1 +4449813,A14,36,A30,A45,2613,A61,A73,4,A93,A101,2,A123,27,A143,A152,2,A173,1,A191,A201,1 +2935635,A12,48,A32,A43,10961,A64,A74,1,A93,A102,2,A124,27,A141,A152,2,A173,1,A192,A201,2 +4544060,A11,12,A32,A42,7865,A61,A75,4,A93,A101,4,A124,53,A143,A153,1,A174,1,A192,A201,2 +8787497,A14,9,A32,A43,1478,A61,A74,4,A93,A101,2,A123,22,A143,A152,1,A173,1,A191,A201,2 +5985557,A11,24,A32,A42,3149,A61,A72,4,A93,A101,1,A124,22,A141,A153,1,A173,1,A191,A201,1 +7271945,A13,36,A32,A43,4210,A61,A73,4,A93,A101,2,A123,26,A143,A152,1,A173,1,A191,A201,2 +1813988,A14,9,A32,A40,2507,A63,A75,2,A93,A101,4,A124,51,A143,A153,1,A172,1,A191,A201,1 
+7860808,A14,12,A32,A43,2141,A62,A74,3,A93,A101,1,A124,35,A143,A152,1,A173,1,A191,A201,1 +2709717,A12,18,A32,A43,866,A61,A73,4,A94,A103,2,A121,25,A143,A152,1,A172,1,A191,A201,1 +8437780,A14,4,A34,A43,1544,A61,A74,2,A93,A101,1,A121,42,A143,A152,3,A172,2,A191,A201,1 +9635061,A11,24,A32,A43,1823,A61,A71,4,A93,A101,2,A123,30,A142,A152,1,A174,2,A191,A201,2 +7532911,A12,6,A32,A40,14555,A65,A71,1,A93,A101,2,A122,23,A143,A152,1,A171,1,A192,A201,2 +7932675,A12,21,A32,A49,2767,A62,A75,4,A91,A101,2,A123,61,A141,A151,2,A172,1,A191,A201,2 +7028331,A14,12,A34,A43,1291,A61,A73,4,A92,A101,2,A122,35,A143,A152,2,A173,1,A191,A201,1 +5361518,A11,30,A32,A43,2522,A61,A75,1,A93,A103,3,A122,39,A143,A152,1,A173,2,A191,A201,1 +2444643,A11,24,A32,A40,915,A65,A75,4,A92,A101,2,A123,29,A141,A152,1,A173,1,A191,A201,2 +2566697,A14,6,A32,A43,1595,A61,A74,3,A93,A101,2,A122,51,A143,A152,1,A173,2,A191,A201,1 +4776602,A12,12,A32,A43,1155,A61,A75,3,A94,A103,3,A121,40,A141,A152,2,A172,1,A191,A201,1 +7860808,A14,12,A32,A43,2141,A62,A74,3,A93,A101,1,A124,35,A143,A152,1,A173,1,A191,A201,1 +5490556,A11,48,A30,A41,4605,A61,A75,3,A93,A101,4,A124,24,A143,A153,2,A173,2,A191,A201,2 +9532249,A14,12,A34,A49,1185,A61,A73,3,A92,A101,2,A121,27,A143,A152,2,A173,1,A191,A201,1 +8266385,A14,12,A31,A48,3447,A63,A73,4,A92,A101,3,A121,35,A143,A152,1,A172,2,A191,A201,1 +2445116,A14,24,A32,A49,1258,A61,A74,4,A93,A101,1,A121,25,A143,A152,1,A173,1,A192,A201,1 +4852106,A14,12,A34,A43,717,A61,A75,4,A93,A101,4,A121,52,A143,A152,3,A173,1,A191,A201,1 +9087286,A14,6,A30,A40,1204,A62,A73,4,A93,A101,1,A124,35,A141,A151,1,A173,1,A191,A202,1 +7735589,A13,24,A32,A42,1925,A61,A73,2,A93,A101,2,A121,26,A143,A152,1,A173,1,A191,A201,1 +5075795,A14,18,A32,A43,433,A61,A71,3,A92,A102,4,A121,22,A143,A151,1,A173,1,A191,A201,2 +9404354,A11,6,A34,A40,666,A64,A74,3,A92,A101,4,A121,39,A143,A152,2,A172,1,A192,A201,1 +7374672,A13,12,A32,A42,2251,A61,A73,1,A92,A101,2,A123,46,A143,A152,1,A172,1,A191,A201,1 
+5537602,A12,30,A32,A40,2150,A61,A73,4,A92,A103,2,A124,24,A141,A152,1,A173,1,A191,A201,2 +2779845,A14,24,A33,A42,4151,A62,A73,2,A93,A101,3,A122,35,A143,A152,2,A173,1,A191,A201,1 +9096946,A12,9,A32,A42,2030,A65,A74,2,A93,A101,1,A123,24,A143,A152,1,A173,1,A192,A201,1 +2649014,A12,60,A33,A43,7418,A65,A73,1,A93,A101,1,A121,27,A143,A152,1,A172,1,A191,A201,1 +8542371,A14,24,A34,A43,2684,A61,A73,4,A93,A101,2,A121,35,A143,A152,2,A172,1,A191,A201,1 +3719734,A11,12,A31,A43,2149,A61,A73,4,A91,A101,1,A124,29,A143,A153,1,A173,1,A191,A201,2 +5769297,A14,15,A32,A41,3812,A62,A72,1,A92,A101,4,A123,23,A143,A152,1,A173,1,A192,A201,1 +8002907,A14,11,A34,A43,1154,A62,A71,4,A92,A101,4,A121,57,A143,A152,3,A172,1,A191,A201,1 +4171983,A11,12,A32,A42,1657,A61,A73,2,A93,A101,2,A121,27,A143,A152,1,A173,1,A191,A201,1 +7094119,A11,24,A32,A43,1603,A61,A75,4,A92,A101,4,A123,55,A143,A152,1,A173,1,A191,A201,1 +5079269,A11,18,A34,A40,5302,A61,A75,2,A93,A101,4,A124,36,A143,A153,3,A174,1,A192,A201,1 +8926902,A14,12,A34,A46,2748,A61,A75,2,A92,A101,4,A124,57,A141,A153,3,A172,1,A191,A201,1 +5810771,A14,10,A34,A40,1231,A61,A75,3,A93,A101,4,A121,32,A143,A152,2,A172,2,A191,A202,1 +7823319,A12,15,A32,A43,802,A61,A75,4,A93,A101,3,A123,37,A143,A152,1,A173,2,A191,A201,2 +5303576,A14,36,A34,A49,6304,A65,A75,4,A93,A101,4,A121,36,A143,A152,2,A173,1,A191,A201,1 +3572402,A14,24,A32,A43,1533,A61,A72,4,A92,A101,3,A123,38,A142,A152,1,A173,1,A192,A201,1 +3892416,A11,14,A32,A40,8978,A61,A75,1,A91,A101,4,A122,45,A143,A152,1,A174,1,A192,A202,2 +4517508,A14,24,A32,A43,999,A65,A75,4,A93,A101,2,A123,25,A143,A152,2,A173,1,A191,A201,1 +8060870,A14,18,A32,A40,2662,A65,A74,4,A93,A101,3,A122,32,A143,A152,1,A173,1,A191,A202,1 +2081683,A14,12,A34,A42,1402,A63,A74,3,A92,A101,4,A123,37,A143,A151,1,A173,1,A192,A201,1 +6752061,A12,48,A31,A40,12169,A65,A71,4,A93,A102,4,A124,36,A143,A153,1,A174,1,A192,A201,1 +7081559,A12,48,A32,A43,3060,A61,A74,4,A93,A101,4,A121,28,A143,A152,2,A173,1,A191,A201,2 
+9948056,A11,30,A32,A45,11998,A61,A72,1,A91,A101,1,A124,34,A143,A152,1,A172,1,A192,A201,2 +4447932,A14,9,A32,A43,2697,A61,A73,1,A93,A101,2,A121,32,A143,A152,1,A173,2,A191,A201,1 +5069388,A14,18,A34,A43,2404,A61,A73,2,A92,A101,2,A123,26,A143,A152,2,A173,1,A191,A201,1 +5910940,A11,12,A32,A42,1262,A65,A75,2,A91,A101,4,A122,49,A143,A152,1,A172,1,A192,A201,1 +3733764,A14,6,A32,A42,4611,A61,A72,1,A92,A101,4,A122,32,A143,A152,1,A173,1,A191,A201,2 +5012199,A14,24,A32,A43,1901,A62,A73,4,A93,A101,4,A123,29,A143,A151,1,A174,1,A192,A201,1 +6009011,A14,15,A34,A41,3368,A64,A75,3,A93,A101,4,A124,23,A143,A151,2,A173,1,A192,A201,1 +5953648,A14,12,A32,A42,1574,A61,A73,4,A93,A101,2,A121,50,A143,A152,1,A173,1,A191,A201,1 +1511501,A13,18,A31,A43,1445,A65,A74,4,A93,A101,4,A123,49,A141,A152,1,A172,1,A191,A201,1 +5113857,A14,15,A34,A42,1520,A65,A75,4,A93,A101,4,A122,63,A143,A152,1,A173,1,A191,A201,1 +3627592,A12,24,A34,A40,3878,A62,A72,4,A91,A101,2,A123,37,A143,A152,1,A173,1,A192,A201,1 +8803488,A11,47,A32,A40,10722,A61,A72,1,A92,A101,1,A121,35,A143,A152,1,A172,1,A192,A201,1 +9505849,A11,48,A32,A41,4788,A61,A74,4,A93,A101,3,A122,26,A143,A152,1,A173,2,A191,A201,1 +3200360,A12,48,A33,A410,7582,A62,A71,2,A93,A101,4,A124,31,A143,A153,1,A174,1,A192,A201,1 +5155553,A12,12,A32,A43,1092,A61,A73,4,A92,A103,4,A121,49,A143,A152,2,A173,1,A192,A201,1 +1380636,A11,24,A33,A43,1024,A61,A72,4,A94,A101,4,A121,48,A142,A152,1,A173,1,A191,A201,2 +3468954,A14,12,A32,A49,1076,A61,A73,2,A94,A101,2,A121,26,A143,A152,1,A173,1,A192,A202,1 +1526208,A12,36,A32,A41,9398,A61,A72,1,A94,A101,4,A123,28,A143,A151,1,A174,1,A192,A201,2 +2689247,A11,24,A34,A41,6419,A61,A75,2,A92,A101,4,A124,44,A143,A153,2,A174,2,A192,A201,1 +2359358,A13,42,A34,A41,4796,A61,A75,4,A93,A101,4,A124,56,A143,A153,1,A173,1,A191,A201,1 +1099632,A14,48,A34,A49,7629,A65,A75,4,A91,A101,2,A123,46,A141,A152,2,A174,2,A191,A201,1 +3753189,A12,48,A32,A42,9960,A61,A72,1,A92,A101,2,A123,26,A143,A152,1,A173,1,A192,A201,2 
+9668162,A14,12,A32,A41,4675,A65,A72,1,A92,A101,4,A123,20,A143,A151,1,A173,1,A191,A201,1 +4302438,A14,10,A32,A40,1287,A65,A75,4,A93,A102,2,A122,45,A143,A152,1,A172,1,A191,A202,1 +1985883,A14,18,A32,A42,2515,A61,A73,3,A93,A101,4,A121,43,A143,A152,1,A173,1,A192,A201,1 +7530685,A12,21,A34,A42,2745,A64,A74,3,A93,A101,2,A123,32,A143,A152,2,A173,1,A192,A201,1 +3965440,A14,6,A32,A40,672,A61,A71,1,A92,A101,4,A121,54,A143,A152,1,A171,1,A192,A201,1 +9545127,A12,36,A30,A43,3804,A61,A73,4,A92,A101,1,A123,42,A143,A152,1,A173,1,A192,A201,2 +2663117,A13,24,A34,A40,1344,A65,A74,4,A93,A101,2,A121,37,A141,A152,2,A172,2,A191,A201,2 +8785566,A11,10,A34,A40,1038,A61,A74,4,A93,A102,3,A122,49,A143,A152,2,A173,1,A192,A201,1 +8697852,A14,48,A34,A40,10127,A63,A73,2,A93,A101,2,A124,44,A141,A153,1,A173,1,A191,A201,2 +9910337,A14,6,A32,A42,1543,A64,A73,4,A91,A101,2,A121,33,A143,A152,1,A173,1,A191,A201,1 +7200952,A14,30,A32,A41,4811,A65,A74,2,A92,A101,4,A122,24,A142,A151,1,A172,1,A191,A201,1 +2482842,A11,12,A32,A43,727,A62,A72,4,A94,A101,3,A124,33,A143,A152,1,A172,1,A192,A201,2 +5873455,A12,8,A32,A42,1237,A61,A73,3,A92,A101,4,A121,24,A143,A152,1,A173,1,A191,A201,2 +9224996,A12,9,A32,A40,276,A61,A73,4,A94,A101,4,A121,22,A143,A151,1,A172,1,A191,A201,1 +9804740,A12,48,A32,A410,5381,A65,A71,3,A93,A101,4,A124,40,A141,A153,1,A171,1,A192,A201,1 +5724920,A14,24,A32,A42,5511,A62,A73,4,A93,A101,1,A123,25,A142,A152,1,A173,1,A191,A201,1 +5401207,A13,24,A32,A42,3749,A61,A72,2,A92,A101,4,A123,26,A143,A152,1,A173,1,A191,A201,1 +6883393,A12,12,A32,A40,685,A61,A74,2,A94,A101,3,A123,25,A141,A152,1,A172,1,A191,A201,2 +9941505,A13,4,A32,A40,1494,A65,A72,1,A93,A101,2,A121,29,A143,A152,1,A172,2,A191,A202,1 +1310810,A11,36,A31,A42,2746,A61,A75,4,A93,A101,4,A123,31,A141,A152,1,A173,1,A191,A201,2 +6083949,A11,12,A32,A42,708,A61,A73,2,A93,A103,3,A122,38,A143,A152,1,A172,2,A191,A201,1 +7464596,A12,24,A32,A42,4351,A65,A73,1,A92,A101,4,A122,48,A143,A152,1,A172,1,A192,A201,1 
+2298828,A14,12,A34,A46,701,A61,A73,4,A93,A101,2,A123,32,A143,A152,2,A173,1,A191,A201,1 +2953405,A11,15,A33,A42,3643,A61,A75,1,A92,A101,4,A122,27,A143,A152,2,A172,1,A191,A201,1 +5680012,A12,30,A34,A40,4249,A61,A71,4,A94,A101,2,A123,28,A143,A152,2,A174,1,A191,A201,2 +3243360,A11,24,A32,A43,1938,A61,A72,4,A91,A101,3,A122,32,A143,A152,1,A173,1,A191,A201,2 +6960053,A11,24,A32,A41,2910,A61,A74,2,A93,A101,1,A124,34,A143,A153,1,A174,1,A192,A201,1 +8870244,A11,18,A32,A42,2659,A64,A73,4,A93,A101,2,A123,28,A143,A152,1,A173,1,A191,A201,1 +9717422,A14,18,A34,A40,1028,A61,A73,4,A92,A101,3,A121,36,A143,A152,2,A173,1,A191,A201,1 +9554635,A11,8,A34,A40,3398,A61,A74,1,A93,A101,4,A121,39,A143,A152,2,A172,1,A191,A202,1 +3536846,A14,12,A34,A42,5801,A65,A75,2,A93,A101,4,A122,49,A143,A151,1,A173,1,A192,A201,1 +7951981,A14,24,A32,A40,1525,A64,A74,4,A92,A101,3,A123,34,A143,A152,1,A173,2,A192,A201,1 +5678097,A13,36,A32,A43,4473,A61,A75,4,A93,A101,2,A123,31,A143,A152,1,A173,1,A191,A201,1 +1771168,A12,6,A32,A43,1068,A61,A75,4,A93,A101,4,A123,28,A143,A152,1,A173,2,A191,A201,1 +7161704,A11,24,A34,A41,6615,A61,A71,2,A93,A101,4,A124,75,A143,A153,2,A174,1,A192,A201,1 +8866806,A14,18,A34,A46,1864,A62,A73,4,A92,A101,2,A121,30,A143,A152,2,A173,1,A191,A201,2 +3955252,A12,60,A32,A40,7408,A62,A72,4,A92,A101,2,A122,24,A143,A152,1,A174,1,A191,A201,2 +2239899,A14,48,A34,A41,11590,A62,A73,2,A92,A101,4,A123,24,A141,A151,2,A172,1,A191,A201,2 +6209500,A11,24,A30,A42,4110,A61,A75,3,A93,A101,4,A124,23,A141,A151,2,A173,2,A191,A201,2 +7504124,A11,6,A34,A42,3384,A61,A73,1,A91,A101,4,A121,44,A143,A151,1,A174,1,A192,A201,2 +1114455,A12,13,A32,A43,2101,A61,A72,2,A92,A103,4,A122,23,A143,A152,1,A172,1,A191,A201,1 +1803583,A11,15,A32,A44,1275,A65,A73,4,A92,A101,2,A123,24,A143,A151,1,A173,1,A191,A201,2 +8565471,A11,24,A32,A42,4169,A61,A73,4,A93,A101,4,A122,28,A143,A152,1,A173,1,A191,A201,1 +5974627,A12,10,A32,A42,1521,A61,A73,4,A91,A101,2,A123,31,A143,A152,1,A172,1,A191,A201,1 
+6416552,A12,24,A34,A46,5743,A61,A72,2,A92,A101,4,A124,24,A143,A153,2,A173,1,A192,A201,1 +5807460,A11,21,A32,A42,3599,A61,A74,1,A92,A101,4,A123,26,A143,A151,1,A172,1,A191,A201,1 +4314639,A12,18,A32,A43,3213,A63,A72,1,A94,A101,3,A121,25,A143,A151,1,A173,1,A191,A201,1 +3529638,A12,18,A32,A49,4439,A61,A75,1,A93,A102,1,A121,33,A141,A152,1,A174,1,A192,A201,1 +1222158,A13,10,A32,A40,3949,A61,A72,1,A93,A103,1,A122,37,A143,A152,1,A172,2,A191,A201,1 +1440915,A14,15,A34,A43,1459,A61,A73,4,A92,A101,2,A123,43,A143,A152,1,A172,1,A191,A201,1 +3930437,A12,13,A34,A43,882,A61,A72,4,A93,A103,4,A121,23,A143,A152,2,A173,1,A191,A201,1 +6612328,A12,24,A32,A43,3758,A63,A71,1,A92,A101,4,A124,23,A143,A151,1,A171,1,A191,A201,1 +9945625,A14,6,A33,A49,1743,A62,A73,1,A93,A101,2,A121,34,A143,A152,2,A172,1,A191,A201,1 +5001052,A12,9,A34,A46,1136,A64,A75,4,A93,A101,3,A124,32,A143,A153,2,A173,2,A191,A201,2 +4733785,A14,9,A32,A44,1236,A61,A72,1,A92,A101,4,A121,23,A143,A151,1,A173,1,A192,A201,1 +1525448,A12,9,A32,A42,959,A61,A73,1,A92,A101,2,A123,29,A143,A152,1,A173,1,A191,A202,2 +6875226,A14,18,A34,A41,3229,A65,A71,2,A93,A101,4,A124,38,A143,A152,1,A174,1,A192,A201,1 +6189987,A11,12,A30,A43,6199,A61,A73,4,A93,A101,2,A122,28,A143,A151,2,A173,1,A192,A201,2 +1690342,A14,10,A32,A46,727,A63,A75,4,A93,A101,4,A124,46,A143,A153,1,A173,1,A192,A201,1 +5643344,A12,24,A32,A40,1246,A61,A72,4,A93,A101,2,A121,23,A142,A152,1,A172,1,A191,A201,2 +9430378,A14,12,A34,A43,2331,A65,A75,1,A93,A102,4,A121,49,A143,A152,1,A173,1,A192,A201,1 +9982432,A14,36,A33,A43,4463,A61,A73,4,A93,A101,2,A123,26,A143,A152,2,A174,1,A192,A201,2 +4344490,A14,12,A32,A43,776,A61,A73,4,A94,A101,2,A121,28,A143,A152,1,A173,1,A191,A201,1 +4256428,A11,30,A32,A42,2406,A61,A74,4,A92,A101,4,A121,23,A143,A151,1,A173,1,A191,A201,2 +9723360,A12,18,A32,A46,1239,A65,A73,4,A93,A101,4,A124,61,A143,A153,1,A173,1,A191,A201,1 +6762528,A13,12,A32,A43,3399,A65,A75,2,A93,A101,3,A123,37,A143,A152,1,A174,1,A191,A201,1 
+5280442,A13,12,A33,A40,2247,A61,A73,2,A92,A101,2,A123,36,A142,A152,2,A173,1,A192,A201,1 +7631660,A14,6,A32,A42,1766,A61,A73,1,A94,A101,2,A122,21,A143,A151,1,A173,1,A191,A201,1 +9842159,A11,18,A32,A42,2473,A61,A71,4,A93,A101,1,A123,25,A143,A152,1,A171,1,A191,A201,2 +3343196,A14,12,A32,A49,1542,A61,A74,2,A93,A101,4,A123,36,A143,A152,1,A173,1,A192,A201,1 +1342256,A14,18,A34,A41,3850,A61,A74,3,A93,A101,1,A123,27,A143,A152,2,A173,1,A191,A201,1 +7966598,A11,18,A32,A42,3650,A61,A72,1,A92,A101,4,A123,22,A143,A151,1,A173,1,A191,A201,1 +9974040,A11,36,A32,A42,3446,A61,A75,4,A93,A101,2,A123,42,A143,A152,1,A173,2,A191,A201,2 +9518729,A12,18,A32,A42,3001,A61,A74,2,A92,A101,4,A121,40,A143,A151,1,A173,1,A191,A201,1 +6655507,A14,36,A32,A40,3079,A65,A73,4,A93,A101,4,A121,36,A143,A152,1,A173,1,A191,A201,1 +1296391,A14,18,A34,A43,6070,A61,A75,3,A93,A101,4,A123,33,A143,A152,2,A173,1,A192,A201,1 +2366810,A14,10,A34,A42,2146,A61,A72,1,A92,A101,3,A121,23,A143,A151,2,A173,1,A191,A201,1 +1470952,A14,60,A34,A40,13756,A65,A75,2,A93,A101,4,A124,63,A141,A153,1,A174,1,A192,A201,1 +4427304,A12,60,A31,A410,14782,A62,A75,3,A92,A101,4,A124,60,A141,A153,2,A174,1,A192,A201,2 +3112559,A11,48,A31,A49,7685,A61,A74,2,A92,A103,4,A123,37,A143,A151,1,A173,1,A191,A201,2 +4850866,A14,18,A33,A43,2320,A61,A71,2,A94,A101,3,A121,34,A143,A152,2,A173,1,A191,A201,1 +1019796,A14,7,A33,A43,846,A65,A75,3,A93,A101,4,A124,36,A143,A153,1,A173,1,A191,A201,1 +1707851,A12,36,A32,A40,14318,A61,A75,4,A93,A101,2,A124,57,A143,A153,1,A174,1,A192,A201,2 +4197078,A14,6,A34,A40,362,A62,A73,4,A92,A101,4,A123,52,A143,A152,2,A172,1,A191,A201,1 +6008260,A11,20,A32,A42,2212,A65,A74,4,A93,A101,4,A123,39,A143,A152,1,A173,1,A192,A201,1 +4238613,A12,18,A32,A41,12976,A61,A71,3,A92,A101,4,A124,38,A143,A153,1,A174,1,A192,A201,2 +8881848,A14,22,A32,A40,1283,A65,A74,4,A92,A101,4,A122,25,A143,A151,1,A173,1,A191,A201,1 +1176152,A13,12,A32,A40,1330,A61,A72,4,A93,A101,1,A121,26,A143,A152,1,A173,1,A191,A201,1 
+5328999,A14,30,A33,A49,4272,A62,A73,2,A93,A101,2,A122,26,A143,A152,2,A172,1,A191,A201,1 +2017205,A14,18,A34,A43,2238,A61,A73,2,A92,A101,1,A123,25,A143,A152,2,A173,1,A191,A201,1 +9326362,A14,18,A32,A43,1126,A65,A72,4,A92,A101,2,A121,21,A143,A151,1,A173,1,A192,A201,1 +6571397,A12,18,A34,A42,7374,A61,A71,4,A93,A101,4,A122,40,A142,A152,2,A174,1,A192,A201,1 +8705378,A12,15,A34,A49,2326,A63,A73,2,A93,A101,4,A123,27,A141,A152,1,A173,1,A191,A201,1 +3969916,A14,9,A32,A49,1449,A61,A74,3,A92,A101,2,A123,27,A143,A152,2,A173,1,A191,A201,1 +1797600,A14,18,A32,A40,1820,A61,A73,2,A94,A101,2,A122,30,A143,A152,1,A174,1,A192,A201,1 +8978303,A12,12,A32,A42,983,A64,A72,1,A92,A101,4,A121,19,A143,A151,1,A172,1,A191,A201,1 +9199505,A11,36,A32,A40,3249,A61,A74,2,A93,A101,4,A124,39,A141,A153,1,A174,2,A192,A201,1 +5334896,A11,6,A34,A43,1957,A61,A74,1,A92,A101,4,A123,31,A143,A152,1,A173,1,A191,A201,1 +5018277,A14,9,A34,A42,2406,A61,A71,2,A93,A101,3,A123,31,A143,A152,1,A174,1,A191,A201,1 +9994482,A12,39,A33,A46,11760,A62,A74,2,A93,A101,3,A124,32,A143,A151,1,A173,1,A192,A201,1 +2520877,A11,12,A32,A42,2578,A61,A71,3,A92,A101,4,A124,55,A143,A153,1,A174,1,A191,A201,1 +9506380,A11,36,A34,A42,2348,A61,A73,3,A94,A101,2,A122,46,A143,A152,2,A173,1,A192,A201,1 +7043482,A12,12,A32,A40,1223,A61,A75,1,A91,A101,1,A121,46,A143,A151,2,A173,1,A191,A201,2 +1638879,A14,24,A34,A43,1516,A64,A73,4,A92,A101,1,A121,43,A143,A152,2,A172,1,A191,A201,1 +5296104,A14,18,A32,A43,1473,A61,A72,3,A94,A101,4,A121,39,A143,A152,1,A173,1,A192,A201,1 +7379526,A12,18,A34,A49,1887,A65,A73,4,A94,A101,4,A121,28,A141,A152,2,A173,1,A191,A201,1 +3276838,A14,24,A33,A49,8648,A61,A72,2,A93,A101,2,A123,27,A141,A152,2,A173,1,A192,A201,2 +8629007,A14,14,A33,A40,802,A61,A73,4,A93,A101,2,A123,27,A143,A152,2,A172,1,A191,A201,1 +6989363,A12,18,A33,A40,2899,A65,A75,4,A93,A101,4,A123,43,A143,A152,1,A173,2,A191,A201,1 +3486169,A12,24,A32,A43,2039,A61,A72,1,A94,A101,1,A122,22,A143,A152,1,A173,1,A192,A201,2 
+5914287,A14,24,A34,A41,2197,A65,A74,4,A93,A101,4,A123,43,A143,A152,2,A173,2,A192,A201,1 +2842414,A11,15,A32,A43,1053,A61,A72,4,A94,A101,2,A121,27,A143,A152,1,A173,1,A191,A202,1 +6610185,A14,24,A32,A43,3235,A63,A75,3,A91,A101,2,A123,26,A143,A152,1,A174,1,A192,A201,1 +7974174,A13,12,A34,A40,939,A63,A74,4,A94,A101,2,A121,28,A143,A152,3,A173,1,A192,A201,2 +6076228,A12,24,A32,A43,1967,A61,A75,4,A92,A101,4,A123,20,A143,A152,1,A173,1,A192,A201,1 +2636238,A14,33,A34,A41,7253,A61,A74,3,A93,A101,2,A123,35,A143,A152,2,A174,1,A192,A201,1 +3698559,A14,12,A34,A49,2292,A61,A71,4,A93,A101,2,A123,42,A142,A152,2,A174,1,A192,A201,2 +1480210,A14,10,A32,A40,1597,A63,A73,3,A93,A101,2,A124,40,A143,A151,1,A172,2,A191,A202,1 +8545813,A11,24,A32,A40,1381,A65,A73,4,A92,A101,2,A122,35,A143,A152,1,A173,1,A191,A201,2 +8778324,A14,36,A34,A41,5842,A61,A75,2,A93,A101,2,A122,35,A143,A152,2,A173,2,A192,A201,1 +4645046,A11,12,A32,A40,2579,A61,A72,4,A93,A101,1,A121,33,A143,A152,1,A172,2,A191,A201,2 +8672645,A11,18,A33,A46,8471,A65,A73,1,A92,A101,2,A123,23,A143,A151,2,A173,1,A192,A201,1 +6875256,A14,21,A32,A40,2782,A63,A74,1,A92,A101,2,A123,31,A141,A152,1,A174,1,A191,A201,1 +9774597,A12,18,A32,A40,1042,A65,A73,4,A92,A101,2,A122,33,A143,A152,1,A173,1,A191,A201,2 +4501534,A14,15,A32,A40,3186,A64,A74,2,A92,A101,3,A123,20,A143,A151,1,A173,1,A191,A201,1 +2225902,A12,12,A32,A41,2028,A65,A73,4,A93,A101,2,A123,30,A143,A152,1,A173,1,A191,A201,1 +5678751,A12,12,A34,A40,958,A61,A74,2,A93,A101,3,A121,47,A143,A152,2,A172,2,A191,A201,1 +5549074,A14,21,A33,A42,1591,A62,A74,4,A93,A101,3,A121,34,A143,A152,2,A174,1,A191,A201,1 +4247210,A12,12,A32,A42,2762,A65,A75,1,A92,A101,2,A122,25,A141,A152,1,A173,1,A192,A201,2 +2450182,A12,18,A32,A41,2779,A61,A73,1,A94,A101,3,A123,21,A143,A151,1,A173,1,A192,A201,1 +9966032,A14,28,A34,A43,2743,A61,A75,4,A93,A101,2,A123,29,A143,A152,2,A173,1,A191,A201,1 +3797662,A14,18,A34,A43,1149,A64,A73,4,A93,A101,3,A121,46,A143,A152,2,A173,1,A191,A201,1 
+7318192,A14,9,A32,A42,1313,A61,A75,1,A93,A101,4,A123,20,A143,A152,1,A173,1,A191,A201,1 +8969170,A11,18,A34,A45,1190,A61,A71,2,A92,A101,4,A124,55,A143,A153,3,A171,2,A191,A201,2 +6376369,A14,5,A32,A49,3448,A61,A74,1,A93,A101,4,A121,74,A143,A152,1,A172,1,A191,A201,1 +7195551,A12,24,A32,A410,11328,A61,A73,2,A93,A102,3,A123,29,A141,A152,2,A174,1,A192,A201,2 +9907330,A11,6,A34,A42,1872,A61,A71,4,A93,A101,4,A124,36,A143,A153,3,A174,1,A192,A201,1 +3653212,A14,24,A34,A45,2058,A61,A73,4,A91,A101,2,A121,33,A143,A152,2,A173,1,A192,A201,1 +1329531,A11,9,A32,A42,2136,A61,A73,3,A93,A101,2,A121,25,A143,A152,1,A173,1,A191,A201,1 +1320796,A12,12,A32,A43,1484,A65,A73,2,A94,A101,1,A121,25,A143,A152,1,A173,1,A192,A201,2 +5337783,A14,6,A32,A45,660,A63,A74,2,A94,A101,4,A121,23,A143,A151,1,A172,1,A191,A201,1 +8339835,A14,24,A34,A40,1287,A64,A75,4,A92,A101,4,A121,37,A143,A152,2,A173,1,A192,A201,1 +9464784,A11,42,A34,A45,3394,A61,A71,4,A93,A102,4,A123,65,A143,A152,2,A171,1,A191,A201,1 +3252011,A13,12,A31,A49,609,A61,A72,4,A92,A101,1,A121,26,A143,A152,1,A171,1,A191,A201,2 +4946613,A14,12,A32,A40,1884,A61,A75,4,A93,A101,4,A123,39,A143,A152,1,A174,1,A192,A201,1 +8418012,A11,12,A32,A42,1620,A61,A73,2,A92,A102,3,A122,30,A143,A152,1,A173,1,A191,A201,1 +7819002,A12,20,A33,A410,2629,A61,A73,2,A93,A101,3,A123,29,A141,A152,2,A173,1,A192,A201,1 +8650326,A14,12,A32,A46,719,A61,A75,4,A93,A101,4,A123,41,A141,A152,1,A172,2,A191,A201,2 +6493165,A12,48,A34,A42,5096,A61,A73,2,A92,A101,3,A123,30,A143,A152,1,A174,1,A192,A201,2 +8221917,A14,9,A34,A46,1244,A65,A75,4,A92,A101,4,A122,41,A143,A151,2,A172,1,A191,A201,1 +6793340,A11,36,A32,A40,1842,A61,A72,4,A92,A101,4,A123,34,A143,A152,1,A173,1,A192,A201,2 +5800337,A12,7,A32,A43,2576,A61,A73,2,A93,A103,2,A121,35,A143,A152,1,A173,1,A191,A201,1 +3033374,A13,12,A32,A42,1424,A65,A75,3,A92,A101,4,A121,55,A143,A152,1,A174,1,A192,A201,1 +6260138,A12,15,A33,A45,1512,A64,A73,3,A94,A101,3,A122,61,A142,A152,2,A173,1,A191,A201,2 
+5464982,A14,36,A34,A41,11054,A65,A73,4,A93,A101,2,A123,30,A143,A152,1,A174,1,A192,A201,1 +1123635,A14,6,A32,A43,518,A61,A73,3,A92,A101,1,A121,29,A143,A152,1,A173,1,A191,A201,1 +7267566,A14,12,A30,A42,2759,A61,A75,2,A93,A101,4,A122,34,A143,A152,2,A173,1,A191,A201,1 +7836298,A14,24,A32,A41,2670,A61,A75,4,A93,A101,4,A123,35,A143,A152,1,A174,1,A192,A201,1 +5804062,A11,24,A32,A40,4817,A61,A74,2,A93,A102,3,A122,31,A143,A152,1,A173,1,A192,A201,2 +9314182,A14,24,A32,A41,2679,A61,A72,4,A92,A101,1,A124,29,A143,A152,1,A174,1,A192,A201,1 +1557803,A11,11,A34,A40,3905,A61,A73,2,A93,A101,2,A121,36,A143,A151,2,A173,2,A191,A201,1 +3608529,A11,12,A32,A41,3386,A61,A75,3,A93,A101,4,A124,35,A143,A153,1,A173,1,A192,A201,2 +8714298,A11,6,A32,A44,343,A61,A72,4,A92,A101,1,A121,27,A143,A152,1,A173,1,A191,A201,1 +2245784,A14,18,A32,A43,4594,A61,A72,3,A93,A101,2,A123,32,A143,A152,1,A173,1,A192,A201,1 +6049064,A11,36,A32,A42,3620,A61,A73,1,A93,A103,2,A122,37,A143,A152,1,A173,2,A191,A201,1 +6641808,A11,15,A32,A40,1721,A61,A72,2,A93,A101,3,A121,36,A143,A152,1,A173,1,A191,A201,1 +5057956,A12,12,A32,A42,3017,A61,A72,3,A92,A101,1,A121,34,A143,A151,1,A174,1,A191,A201,1 +4441263,A12,12,A32,A48,754,A65,A75,4,A93,A101,4,A122,38,A143,A152,2,A173,1,A191,A201,1 +3291069,A14,18,A32,A49,1950,A61,A74,4,A93,A101,1,A123,34,A142,A152,2,A173,1,A192,A201,1 +7518282,A11,24,A32,A41,2924,A61,A73,3,A93,A103,4,A124,63,A141,A152,1,A173,2,A192,A201,1 +1425368,A11,24,A33,A43,1659,A61,A72,4,A92,A101,2,A123,29,A143,A151,1,A172,1,A192,A201,2 +7122922,A14,48,A33,A43,7238,A65,A75,3,A93,A101,3,A123,32,A141,A152,2,A173,2,A191,A201,1 +6752438,A14,33,A33,A49,2764,A61,A73,2,A92,A101,2,A123,26,A143,A152,2,A173,1,A192,A201,1 +6177001,A14,24,A33,A41,4679,A61,A74,3,A93,A101,3,A123,35,A143,A152,2,A172,1,A192,A201,1 +6983743,A12,24,A32,A43,3092,A62,A72,3,A94,A101,2,A123,22,A143,A151,1,A173,1,A192,A201,2 +6691152,A11,6,A32,A46,448,A61,A72,4,A92,A101,4,A122,23,A143,A152,1,A173,1,A191,A201,2 
+8808032,A11,9,A32,A40,654,A61,A73,4,A93,A101,3,A123,28,A143,A152,1,A172,1,A191,A201,2 +9230494,A14,6,A32,A48,1238,A65,A71,4,A93,A101,4,A122,36,A143,A152,1,A174,2,A192,A201,1 +7173285,A12,18,A34,A43,1245,A61,A73,4,A94,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,2 +5890956,A11,18,A30,A42,3114,A61,A72,1,A92,A101,4,A122,26,A143,A151,1,A173,1,A191,A201,2 +9796998,A14,39,A32,A41,2569,A63,A73,4,A93,A101,4,A123,24,A143,A152,1,A173,1,A191,A201,1 +9197850,A13,24,A32,A43,5152,A61,A74,4,A93,A101,2,A123,25,A141,A152,1,A173,1,A191,A201,1 +2693563,A12,12,A32,A49,1037,A62,A74,3,A93,A101,4,A121,39,A143,A152,1,A172,1,A191,A201,1 +4912289,A11,15,A34,A42,1478,A61,A75,4,A93,A101,4,A123,44,A143,A152,2,A173,2,A192,A201,1 +6383336,A12,12,A34,A43,3573,A61,A73,1,A92,A101,1,A121,23,A143,A152,1,A172,1,A191,A201,1 +7709637,A12,24,A32,A40,1201,A61,A72,4,A93,A101,1,A122,26,A143,A152,1,A173,1,A191,A201,1 +4072676,A11,30,A32,A42,3622,A64,A75,4,A92,A101,4,A122,57,A143,A151,2,A173,1,A192,A201,1 +9216806,A14,15,A33,A42,960,A64,A74,3,A92,A101,2,A122,30,A143,A152,2,A173,1,A191,A201,1 +2718715,A14,12,A34,A40,1163,A63,A73,4,A93,A101,4,A121,44,A143,A152,1,A173,1,A192,A201,1 +3865105,A12,6,A33,A40,1209,A61,A71,4,A93,A101,4,A122,47,A143,A152,1,A174,1,A192,A201,2 +9640833,A14,12,A32,A43,3077,A61,A73,2,A93,A101,4,A123,52,A143,A152,1,A173,1,A192,A201,1 +2667404,A14,24,A32,A40,3757,A61,A75,4,A92,A102,4,A124,62,A143,A153,1,A173,1,A192,A201,1 +4167143,A14,10,A32,A40,1418,A62,A73,3,A93,A101,2,A121,35,A143,A151,1,A172,1,A191,A202,1 +7957595,A12,18,A32,A43,1113,A61,A73,4,A92,A103,4,A121,26,A143,A152,1,A172,2,A191,A201,1 +6793340,A11,36,A32,A40,1842,A61,A72,4,A92,A101,4,A123,34,A143,A152,1,A173,1,A192,A201,2 +5378110,A14,6,A32,A40,3518,A61,A73,2,A93,A103,3,A122,26,A143,A151,1,A173,1,A191,A201,1 +7398663,A14,12,A34,A43,1934,A61,A75,2,A93,A101,2,A124,26,A143,A152,2,A173,1,A191,A201,1 +2689827,A12,27,A30,A49,8318,A61,A75,2,A92,A101,4,A124,42,A143,A153,2,A174,1,A192,A201,2 
+4156618,A14,6,A34,A43,1237,A62,A73,1,A92,A101,1,A122,27,A143,A152,2,A173,1,A191,A201,1 +6319109,A12,6,A32,A43,368,A65,A75,4,A93,A101,4,A122,38,A143,A152,1,A173,1,A191,A201,1 +7120920,A11,12,A34,A40,2122,A61,A73,3,A93,A101,2,A121,39,A143,A151,2,A172,2,A191,A202,1 +1078929,A11,24,A32,A42,2996,A65,A73,2,A94,A101,4,A123,20,A143,A152,1,A173,1,A191,A201,2 +1023769,A12,36,A32,A42,9034,A62,A72,4,A93,A102,1,A124,29,A143,A151,1,A174,1,A192,A201,2 +5288061,A14,24,A34,A42,1585,A61,A74,4,A93,A101,3,A122,40,A143,A152,2,A173,1,A191,A201,1 +6898416,A12,18,A32,A43,1301,A61,A75,4,A94,A103,2,A121,32,A143,A152,1,A172,1,A191,A201,1 +3933279,A13,6,A34,A40,1323,A62,A75,2,A91,A101,4,A123,28,A143,A152,2,A173,2,A192,A201,1 +2929379,A11,24,A32,A40,3123,A61,A72,4,A92,A101,1,A122,27,A143,A152,1,A173,1,A191,A201,2 +7338113,A11,36,A32,A41,5493,A61,A75,2,A93,A101,4,A124,42,A143,A153,1,A173,2,A191,A201,1 +1383056,A13,9,A32,A43,1126,A62,A75,2,A91,A101,4,A121,49,A143,A152,1,A173,1,A191,A201,1 +9894792,A12,24,A34,A43,1216,A62,A72,4,A93,A101,4,A124,38,A141,A152,2,A173,2,A191,A201,2 +4821533,A11,24,A32,A40,1207,A61,A72,4,A92,A101,4,A122,24,A143,A151,1,A173,1,A191,A201,2 +2599724,A14,10,A32,A40,1309,A65,A73,4,A93,A103,4,A122,27,A143,A152,1,A172,1,A191,A201,2 +7122065,A13,15,A34,A41,2360,A63,A73,2,A93,A101,2,A123,36,A143,A152,1,A173,1,A192,A201,1 +4771670,A12,15,A31,A40,6850,A62,A71,1,A93,A101,2,A122,34,A143,A152,1,A174,2,A192,A201,2 +7524556,A14,24,A32,A43,1413,A61,A73,4,A94,A101,2,A122,28,A143,A152,1,A173,1,A191,A201,1 +8295830,A14,39,A32,A41,8588,A62,A75,4,A93,A101,2,A123,45,A143,A152,1,A174,1,A192,A201,1 +5815745,A11,12,A32,A40,759,A61,A74,4,A93,A101,2,A121,26,A143,A152,1,A173,1,A191,A201,2 +2954111,A14,36,A32,A41,4686,A61,A73,2,A93,A101,2,A124,32,A143,A153,1,A174,1,A192,A201,1 +3356855,A13,15,A32,A49,2687,A61,A74,2,A93,A101,4,A122,26,A143,A151,1,A173,1,A192,A201,1 +2676379,A12,12,A33,A43,585,A61,A73,4,A94,A102,4,A121,20,A143,A151,2,A173,1,A191,A201,1 
+8727753,A14,24,A32,A40,2255,A65,A72,4,A93,A101,1,A122,54,A143,A152,1,A173,1,A191,A201,1 +6823953,A11,6,A34,A40,609,A61,A74,4,A92,A101,3,A122,37,A143,A152,2,A173,1,A191,A202,1 +7281914,A11,6,A34,A40,1361,A61,A72,2,A93,A101,4,A121,40,A143,A152,1,A172,2,A191,A202,1 +7911515,A14,36,A34,A42,7127,A61,A72,2,A92,A101,4,A122,23,A143,A151,2,A173,1,A192,A201,2 +6306871,A11,6,A32,A40,1203,A62,A75,3,A93,A101,2,A122,43,A143,A152,1,A173,1,A192,A201,1 +6961576,A14,6,A34,A43,700,A65,A75,4,A93,A101,4,A124,36,A143,A153,2,A173,1,A191,A201,1 +3215162,A14,24,A34,A45,5507,A61,A75,3,A93,A101,4,A124,44,A143,A153,2,A173,1,A191,A201,1 +3199662,A11,18,A32,A43,3190,A61,A73,2,A92,A101,2,A121,24,A143,A152,1,A173,1,A191,A201,2 +3873021,A11,48,A30,A42,7119,A61,A73,3,A93,A101,4,A124,53,A143,A153,2,A173,2,A191,A201,2 +9751634,A14,24,A32,A41,3488,A62,A74,3,A92,A101,4,A123,23,A143,A152,1,A173,1,A191,A201,1 +7957595,A12,18,A32,A43,1113,A61,A73,4,A92,A103,4,A121,26,A143,A152,1,A172,2,A191,A201,1 +5857366,A12,26,A32,A41,7966,A61,A72,2,A93,A101,3,A123,30,A143,A152,2,A173,1,A191,A201,1 +2509169,A14,15,A34,A46,1532,A62,A73,4,A92,A101,3,A123,31,A143,A152,1,A173,1,A191,A201,1 +2621457,A14,4,A34,A43,1503,A61,A74,2,A93,A101,1,A121,42,A143,A152,2,A172,2,A191,A201,1 +8637579,A11,36,A32,A43,2302,A61,A73,4,A91,A101,4,A123,31,A143,A151,1,A173,1,A191,A201,2 +4032468,A11,6,A32,A40,662,A61,A72,3,A93,A101,4,A121,41,A143,A152,1,A172,2,A192,A201,1 +9481590,A12,36,A32,A46,2273,A61,A74,3,A93,A101,1,A123,32,A143,A152,2,A173,2,A191,A201,1 +7828502,A12,15,A32,A40,2631,A62,A73,2,A92,A101,4,A123,28,A143,A151,2,A173,1,A192,A201,2 +6996047,A14,12,A33,A41,1503,A61,A73,4,A94,A101,4,A121,41,A143,A151,1,A173,1,A191,A201,1 +8785106,A14,24,A32,A43,1311,A62,A74,4,A94,A101,3,A122,26,A143,A152,1,A173,1,A192,A201,1 +3836745,A14,24,A32,A43,3105,A65,A72,4,A93,A101,2,A123,25,A143,A152,2,A173,1,A191,A201,1 +8659616,A13,21,A34,A46,2319,A61,A72,2,A91,A101,1,A123,33,A143,A151,1,A173,1,A191,A201,2 
+5027921,A11,6,A32,A40,1374,A65,A71,4,A92,A101,3,A122,75,A143,A152,1,A174,1,A192,A201,1 +9292167,A12,18,A34,A42,3612,A61,A75,3,A92,A101,4,A122,37,A143,A152,1,A173,1,A192,A201,1 +9619216,A11,48,A32,A40,7763,A61,A75,4,A93,A101,4,A124,42,A141,A153,1,A174,1,A191,A201,2 +5342884,A13,18,A32,A42,3049,A61,A72,1,A92,A101,1,A122,45,A142,A152,1,A172,1,A191,A201,1 +3841990,A12,12,A32,A43,1534,A61,A72,1,A94,A101,1,A121,23,A143,A151,1,A173,1,A191,A201,2 +9862078,A14,24,A33,A40,2032,A61,A75,4,A93,A101,4,A124,60,A143,A153,2,A173,1,A192,A201,1 +5779045,A11,30,A32,A42,6350,A65,A75,4,A93,A101,4,A122,31,A143,A152,1,A173,1,A191,A201,2 +4285035,A13,18,A32,A42,2864,A61,A73,2,A93,A101,1,A121,34,A143,A152,1,A172,2,A191,A201,2 +9708453,A14,12,A34,A40,1255,A61,A75,4,A93,A101,4,A121,61,A143,A152,2,A172,1,A191,A201,1 +2572735,A11,24,A33,A40,1333,A61,A71,4,A93,A101,2,A121,43,A143,A153,2,A173,2,A191,A201,2 +9506182,A14,24,A34,A40,2022,A61,A73,4,A92,A101,4,A123,37,A143,A152,1,A173,1,A192,A201,1 +3835775,A14,24,A32,A43,1552,A61,A74,3,A93,A101,1,A123,32,A141,A152,1,A173,2,A191,A201,1 +8995384,A11,12,A31,A43,626,A61,A73,4,A92,A101,4,A121,24,A141,A152,1,A172,1,A191,A201,2 +1831870,A14,48,A34,A41,8858,A65,A74,2,A93,A101,1,A124,35,A143,A153,2,A173,1,A192,A201,1 +6930739,A14,12,A34,A45,996,A65,A74,4,A92,A101,4,A121,23,A143,A152,2,A173,1,A191,A201,1 +2082338,A14,6,A31,A43,1750,A63,A75,2,A93,A101,4,A122,45,A141,A152,1,A172,2,A191,A201,1 +4367265,A11,48,A32,A43,6999,A61,A74,1,A94,A103,1,A121,34,A143,A152,2,A173,1,A192,A201,2 +6993548,A12,12,A34,A40,1995,A62,A72,4,A93,A101,1,A123,27,A143,A152,1,A173,1,A191,A201,1 +5774450,A12,9,A32,A46,1199,A61,A74,4,A92,A101,4,A122,67,A143,A152,2,A174,1,A192,A201,1 +1859841,A12,12,A32,A43,1331,A61,A72,2,A93,A101,1,A123,22,A142,A152,1,A173,1,A191,A201,2 +2420058,A12,18,A30,A40,2278,A62,A72,3,A92,A101,3,A123,28,A143,A152,2,A173,1,A191,A201,2 +5136657,A14,21,A30,A40,5003,A65,A73,1,A92,A101,4,A122,29,A141,A152,2,A173,1,A192,A201,2 
+4921496,A11,24,A31,A42,3552,A61,A74,3,A93,A101,4,A123,27,A141,A152,1,A173,1,A191,A201,2 +1259242,A12,18,A34,A42,1928,A61,A72,2,A93,A101,2,A121,31,A143,A152,2,A172,1,A191,A201,2 +4695993,A11,24,A32,A41,2964,A65,A75,4,A93,A101,4,A124,49,A141,A153,1,A173,2,A192,A201,1 +3427239,A11,24,A31,A43,1546,A61,A74,4,A93,A103,4,A123,24,A141,A151,1,A172,1,A191,A201,2 +4030339,A13,6,A33,A43,683,A61,A72,2,A92,A101,1,A122,29,A141,A152,1,A173,1,A191,A201,1 +6457386,A12,36,A32,A40,12389,A65,A73,1,A93,A101,4,A124,37,A143,A153,1,A173,1,A192,A201,2 +2492415,A12,24,A33,A49,4712,A65,A73,4,A93,A101,2,A122,37,A141,A152,2,A174,1,A192,A201,1 +3819769,A12,24,A33,A43,1553,A62,A74,3,A92,A101,2,A122,23,A143,A151,2,A173,1,A192,A201,1 +3248150,A11,12,A32,A40,1372,A61,A74,2,A91,A101,3,A123,36,A143,A152,1,A173,1,A191,A201,2 +2140412,A14,24,A34,A43,2578,A64,A75,2,A93,A101,2,A123,34,A143,A152,1,A173,1,A191,A201,1 +8918532,A12,48,A32,A43,3979,A65,A74,4,A93,A101,1,A123,41,A143,A152,2,A173,2,A192,A201,1 +9205934,A11,48,A32,A43,6758,A61,A73,3,A92,A101,2,A123,31,A143,A152,1,A173,1,A192,A201,2 +6546357,A11,24,A32,A42,3234,A61,A72,4,A92,A101,4,A121,23,A143,A151,1,A172,1,A192,A201,2 +4352053,A14,30,A34,A43,5954,A61,A74,3,A93,A102,2,A123,38,A143,A152,1,A173,1,A191,A201,1 +6620263,A14,24,A32,A41,5433,A65,A71,2,A92,A101,4,A122,26,A143,A151,1,A174,1,A192,A201,1 +1039684,A11,15,A32,A49,806,A61,A73,4,A92,A101,4,A122,22,A143,A152,1,A172,1,A191,A201,1 +1999133,A12,9,A32,A43,1082,A61,A75,4,A93,A101,4,A123,27,A143,A152,2,A172,1,A191,A201,1 +6641350,A14,15,A34,A42,2788,A61,A74,2,A92,A102,3,A123,24,A141,A152,2,A173,1,A191,A201,1 +9791462,A12,12,A32,A43,2930,A61,A74,2,A92,A101,1,A121,27,A143,A152,1,A173,1,A191,A201,1 +2964444,A14,24,A34,A46,1927,A65,A73,3,A92,A101,2,A123,33,A143,A152,2,A173,1,A192,A201,1 +4612811,A12,36,A34,A40,2820,A61,A72,4,A91,A101,4,A123,27,A143,A152,2,A173,1,A191,A201,2 +8885321,A14,24,A32,A48,937,A61,A72,4,A94,A101,3,A123,27,A143,A152,2,A172,1,A191,A201,1 
+8198782,A12,18,A34,A40,1056,A61,A75,3,A93,A103,3,A121,30,A141,A152,2,A173,1,A191,A201,2 +2454414,A12,12,A34,A40,3124,A61,A72,1,A93,A101,3,A121,49,A141,A152,2,A172,2,A191,A201,1 +5678348,A14,9,A32,A42,1388,A61,A73,4,A92,A101,2,A121,26,A143,A151,1,A173,1,A191,A201,1 +5182819,A12,36,A32,A45,2384,A61,A72,4,A93,A101,1,A124,33,A143,A151,1,A172,1,A191,A201,2 +2727231,A14,12,A32,A40,2133,A65,A75,4,A92,A101,4,A124,52,A143,A153,1,A174,1,A192,A201,1 +1623348,A11,18,A32,A42,2039,A61,A73,1,A92,A101,4,A121,20,A141,A151,1,A173,1,A191,A201,2 +2307018,A11,9,A34,A40,2799,A61,A73,2,A93,A101,2,A121,36,A143,A151,2,A173,2,A191,A201,1 +1856079,A11,12,A32,A42,1289,A61,A73,4,A93,A103,1,A122,21,A143,A152,1,A172,1,A191,A201,1 +6274836,A11,18,A32,A44,1217,A61,A73,4,A94,A101,3,A121,47,A143,A152,1,A172,1,A192,A201,2 +6514251,A11,12,A34,A42,2246,A61,A75,3,A93,A101,3,A122,60,A143,A152,2,A173,1,A191,A201,2 +4867066,A11,12,A34,A43,385,A61,A74,4,A92,A101,3,A121,58,A143,A152,4,A172,1,A192,A201,1 +4883891,A12,24,A33,A40,1965,A65,A73,4,A92,A101,4,A123,42,A143,A151,2,A173,1,A192,A201,1 +9366782,A14,21,A32,A49,1572,A64,A75,4,A92,A101,4,A121,36,A141,A152,1,A172,1,A191,A201,1 +8194462,A12,24,A32,A40,2718,A61,A73,3,A92,A101,4,A122,20,A143,A151,1,A172,1,A192,A201,2 +3570555,A11,24,A31,A410,1358,A65,A75,4,A93,A101,3,A123,40,A142,A152,1,A174,1,A192,A201,2 +1491305,A12,6,A31,A40,931,A62,A72,1,A92,A101,1,A122,32,A142,A152,1,A172,1,A191,A201,2 +3303554,A11,24,A32,A40,1442,A61,A74,4,A92,A101,4,A123,23,A143,A151,2,A173,1,A191,A201,2 +3657362,A12,24,A30,A49,4241,A61,A73,1,A93,A101,4,A121,36,A143,A152,3,A172,1,A192,A201,2 +5220673,A14,18,A34,A40,2775,A61,A74,2,A93,A101,2,A122,31,A141,A152,2,A173,1,A191,A201,2 +5869534,A14,24,A33,A49,3863,A61,A73,1,A93,A101,2,A124,32,A143,A153,1,A173,1,A191,A201,1 +2213067,A12,7,A32,A43,2329,A61,A72,1,A92,A103,1,A121,45,A143,A152,1,A173,1,A191,A201,1 +2801710,A12,9,A32,A42,918,A61,A73,4,A92,A101,1,A122,30,A143,A152,1,A173,1,A191,A201,2 
+1024391,A12,24,A31,A46,1837,A61,A74,4,A92,A101,4,A124,34,A141,A153,1,A172,1,A191,A201,2 +5699583,A14,36,A32,A42,3349,A61,A73,4,A92,A101,2,A123,28,A143,A152,1,A174,1,A192,A201,2 +2566539,A13,10,A32,A42,1275,A61,A72,4,A92,A101,2,A122,23,A143,A152,1,A173,1,A191,A201,1 +6680952,A11,24,A31,A42,2828,A63,A73,4,A93,A101,4,A121,22,A142,A152,1,A173,1,A192,A201,1 +7697857,A14,24,A34,A49,4526,A61,A73,3,A93,A101,2,A121,74,A143,A152,1,A174,1,A192,A201,1 +1147017,A12,36,A32,A43,2671,A62,A73,4,A92,A102,4,A124,50,A143,A153,1,A173,1,A191,A201,2 +1877439,A14,18,A32,A43,2051,A61,A72,4,A93,A101,1,A121,33,A143,A152,1,A173,1,A191,A201,1 +8278164,A14,15,A32,A41,1300,A65,A75,4,A93,A101,4,A124,45,A141,A153,1,A173,2,A191,A201,1 +5491895,A11,12,A32,A44,741,A62,A71,4,A92,A101,3,A122,22,A143,A152,1,A173,1,A191,A201,2 +1885075,A13,10,A32,A40,1240,A62,A75,1,A92,A101,4,A124,48,A143,A153,1,A172,2,A191,A201,2 +6750986,A11,21,A32,A43,3357,A64,A72,4,A92,A101,2,A123,29,A141,A152,1,A173,1,A191,A201,1 +8900389,A11,24,A31,A41,3632,A61,A73,1,A92,A103,4,A123,22,A141,A151,1,A173,1,A191,A202,1 +3583117,A14,18,A33,A42,1808,A61,A74,4,A92,A101,1,A121,22,A143,A152,1,A173,1,A191,A201,2 +6196790,A12,48,A30,A49,12204,A65,A73,2,A93,A101,2,A123,48,A141,A152,1,A174,1,A192,A201,1 +9626587,A12,60,A33,A43,9157,A65,A73,2,A93,A101,2,A124,27,A143,A153,1,A174,1,A191,A201,1 +1451248,A11,6,A34,A40,3676,A61,A73,1,A93,A101,3,A121,37,A143,A151,3,A173,2,A191,A201,1 +2753919,A12,30,A32,A42,3441,A62,A73,2,A92,A102,4,A123,21,A143,A151,1,A173,1,A191,A201,2 +5526862,A14,12,A32,A40,640,A61,A73,4,A91,A101,2,A121,49,A143,A152,1,A172,1,A191,A201,1 +4872213,A12,21,A34,A49,3652,A61,A74,2,A93,A101,3,A122,27,A143,A152,2,A173,1,A191,A201,1 +2393150,A14,18,A34,A40,1530,A61,A73,3,A93,A101,2,A122,32,A141,A152,2,A173,1,A191,A201,2 +5713333,A14,48,A32,A49,3914,A65,A73,4,A91,A101,2,A121,38,A141,A152,1,A173,1,A191,A201,2 +1937588,A11,12,A32,A42,1858,A61,A72,4,A92,A101,1,A123,22,A143,A151,1,A173,1,A191,A201,1 
+9355241,A11,18,A32,A43,2600,A61,A73,4,A93,A101,4,A124,65,A143,A153,2,A173,1,A191,A201,2 +5508301,A14,15,A32,A43,1979,A65,A75,4,A93,A101,2,A123,35,A143,A152,1,A173,1,A191,A201,1 +3586335,A13,6,A32,A42,2116,A61,A73,2,A93,A101,2,A121,41,A143,A152,1,A173,1,A192,A201,1 +1838843,A12,9,A31,A40,1437,A62,A74,2,A93,A101,3,A124,29,A143,A152,1,A173,1,A191,A201,2 +4763001,A14,42,A34,A42,4042,A63,A73,4,A93,A101,4,A121,36,A143,A152,2,A173,1,A192,A201,1 +4124475,A14,9,A32,A46,3832,A65,A75,1,A93,A101,4,A121,64,A143,A152,1,A172,1,A191,A201,1 +1746129,A11,24,A32,A43,3660,A61,A73,2,A92,A101,4,A123,28,A143,A152,1,A173,1,A191,A201,1 +8578734,A11,18,A31,A42,1553,A61,A73,4,A93,A101,3,A123,44,A141,A152,1,A173,1,A191,A201,2 +3544577,A12,15,A32,A43,1444,A65,A72,4,A93,A101,1,A122,23,A143,A152,1,A173,1,A191,A201,1 +5640616,A14,9,A32,A42,1980,A61,A72,2,A92,A102,2,A123,19,A143,A151,2,A173,1,A191,A201,2 +3695680,A12,24,A32,A40,1355,A61,A72,3,A92,A101,4,A123,25,A143,A152,1,A172,1,A192,A201,2 +1665899,A14,12,A32,A46,1393,A61,A75,4,A93,A101,4,A122,47,A141,A152,3,A173,2,A192,A201,1 +3336776,A14,24,A32,A43,1376,A63,A74,4,A92,A101,1,A123,28,A143,A152,1,A173,1,A191,A201,1 +3175919,A14,60,A33,A43,15653,A61,A74,2,A93,A101,4,A123,21,A143,A152,2,A173,1,A192,A201,1 +6193893,A14,12,A32,A43,1493,A61,A72,4,A92,A101,3,A123,34,A143,A152,1,A173,2,A191,A201,1 +9308059,A11,42,A33,A43,4370,A61,A74,3,A93,A101,2,A122,26,A141,A152,2,A173,2,A192,A201,2 +7240888,A11,18,A32,A46,750,A61,A71,4,A92,A101,1,A121,27,A143,A152,1,A171,1,A191,A201,2 +7390720,A12,15,A32,A45,1308,A61,A75,4,A93,A101,4,A123,38,A143,A152,2,A172,1,A191,A201,1 +2244055,A14,15,A32,A46,4623,A62,A73,3,A93,A101,2,A122,40,A143,A152,1,A174,1,A192,A201,2 +9290774,A14,24,A34,A43,1851,A61,A74,4,A94,A103,2,A123,33,A143,A152,2,A173,1,A192,A201,1 +9895994,A11,18,A34,A43,1880,A61,A74,4,A94,A101,1,A122,32,A143,A152,2,A174,1,A192,A201,1 +5985690,A14,36,A33,A49,7980,A65,A72,4,A93,A101,4,A123,27,A143,A151,2,A173,1,A192,A201,2 
+1445890,A11,30,A30,A42,4583,A61,A73,2,A91,A103,2,A121,32,A143,A152,2,A173,1,A191,A201,1 +9446992,A14,12,A32,A40,1386,A63,A73,2,A92,A101,2,A122,26,A143,A152,1,A173,1,A191,A201,2 +7946983,A13,24,A32,A40,947,A61,A74,4,A93,A101,3,A124,38,A141,A153,1,A173,2,A191,A201,2 +3808692,A11,12,A32,A46,684,A61,A73,4,A93,A101,4,A123,40,A143,A151,1,A172,2,A191,A201,2 +6960835,A11,48,A32,A46,7476,A61,A74,4,A93,A101,1,A124,50,A143,A153,1,A174,1,A192,A201,1 +6472767,A12,12,A32,A42,1922,A61,A73,4,A93,A101,2,A122,37,A143,A152,1,A172,1,A191,A201,2 +9116367,A11,24,A32,A40,2303,A61,A75,4,A93,A102,1,A121,45,A143,A152,1,A173,1,A191,A201,2 +2210355,A12,36,A33,A40,8086,A62,A75,2,A93,A101,4,A123,42,A143,A152,4,A174,1,A192,A201,2 +2597995,A14,24,A34,A41,2346,A61,A74,4,A93,A101,3,A123,35,A143,A152,2,A173,1,A192,A201,1 +9519867,A11,14,A32,A40,3973,A61,A71,1,A93,A101,4,A124,22,A143,A153,1,A173,1,A191,A201,1 +3698602,A12,12,A32,A40,888,A61,A75,4,A93,A101,4,A123,41,A141,A152,1,A172,2,A191,A201,2 +7748894,A14,48,A32,A43,10222,A65,A74,4,A93,A101,3,A123,37,A142,A152,1,A173,1,A192,A201,1 +5309238,A12,30,A30,A49,4221,A61,A73,2,A92,A101,1,A123,28,A143,A152,2,A173,1,A191,A201,1 +4970061,A12,18,A34,A42,6361,A61,A75,2,A93,A101,1,A124,41,A143,A152,1,A173,1,A192,A201,1 +2854826,A13,12,A32,A43,1297,A61,A73,3,A94,A101,4,A121,23,A143,A151,1,A173,1,A191,A201,1 +3463374,A11,12,A32,A40,900,A65,A73,4,A94,A101,2,A123,23,A143,A152,1,A173,1,A191,A201,2 +3481971,A14,21,A32,A42,2241,A61,A75,4,A93,A101,2,A121,50,A143,A152,2,A173,1,A191,A201,1 +1153712,A12,6,A33,A42,1050,A61,A71,4,A93,A101,1,A122,35,A142,A152,2,A174,1,A192,A201,1 +3077979,A13,6,A34,A46,1047,A61,A73,2,A92,A101,4,A122,50,A143,A152,1,A172,1,A191,A201,1 +6586335,A14,24,A34,A410,6314,A61,A71,4,A93,A102,2,A124,27,A141,A152,2,A174,1,A192,A201,1 +2300255,A12,30,A31,A42,3496,A64,A73,4,A93,A101,2,A123,34,A142,A152,1,A173,2,A192,A201,1 +6107764,A14,48,A31,A49,3609,A61,A73,1,A92,A101,1,A121,27,A142,A152,1,A173,1,A191,A201,1 
+9756743,A11,12,A34,A40,4843,A61,A75,3,A93,A102,4,A122,43,A143,A151,2,A173,1,A192,A201,2 +2412093,A13,30,A34,A43,3017,A61,A75,4,A93,A101,4,A122,47,A143,A152,1,A173,1,A191,A201,1 +2967207,A14,24,A34,A49,4139,A62,A73,3,A93,A101,3,A122,27,A143,A152,2,A172,1,A192,A201,1 +5294255,A14,36,A32,A49,5742,A62,A74,2,A93,A101,2,A123,31,A143,A152,2,A173,1,A192,A201,1 +3960084,A14,60,A32,A40,10366,A61,A75,2,A93,A101,4,A122,42,A143,A152,1,A174,1,A192,A201,1 +5926774,A14,15,A32,A41,3029,A61,A74,2,A93,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,1 +5969658,A14,24,A32,A40,1393,A61,A73,2,A93,A103,2,A121,31,A143,A152,1,A173,1,A192,A201,1 +4016637,A14,6,A34,A40,2080,A63,A73,1,A94,A101,2,A123,24,A143,A152,1,A173,1,A191,A201,1 +5319265,A14,21,A33,A49,2580,A63,A72,4,A93,A101,2,A121,41,A141,A152,1,A172,2,A191,A201,2 +8616661,A14,30,A34,A43,4530,A61,A74,4,A92,A101,4,A123,26,A143,A151,1,A174,1,A192,A201,1 +9306673,A14,24,A34,A42,5150,A61,A75,4,A93,A101,4,A123,33,A143,A152,1,A173,1,A192,A201,1 +7790498,A12,72,A32,A43,5595,A62,A73,2,A94,A101,2,A123,24,A143,A152,1,A173,1,A191,A201,2 +3314262,A11,24,A32,A43,2384,A61,A75,4,A93,A101,4,A121,64,A141,A151,1,A172,1,A191,A201,1 +8563405,A14,18,A32,A43,1453,A61,A72,3,A92,A101,1,A121,26,A143,A152,1,A173,1,A191,A201,1 +8021407,A14,6,A32,A46,1538,A61,A72,1,A92,A101,2,A124,56,A143,A152,1,A173,1,A191,A201,1 +1620760,A14,12,A32,A43,2279,A65,A73,4,A93,A101,4,A124,37,A143,A153,1,A173,1,A192,A201,1 +7861677,A14,15,A33,A43,1478,A61,A73,4,A94,A101,3,A121,33,A141,A152,2,A173,1,A191,A201,1 +5532106,A14,24,A34,A43,5103,A61,A72,3,A94,A101,3,A124,47,A143,A153,3,A173,1,A192,A201,1 +6577647,A12,36,A33,A49,9857,A62,A74,1,A93,A101,3,A122,31,A143,A152,2,A172,2,A192,A201,1 +2592971,A14,60,A32,A40,6527,A65,A73,4,A93,A101,4,A124,34,A143,A153,1,A173,2,A192,A201,1 +7186212,A13,10,A34,A43,1347,A65,A74,4,A93,A101,2,A122,27,A143,A152,2,A173,1,A192,A201,1 +5547078,A12,36,A33,A40,2862,A62,A75,4,A93,A101,3,A124,30,A143,A153,1,A173,1,A191,A201,1 
+8856174,A14,9,A32,A43,2753,A62,A75,3,A93,A102,4,A123,35,A143,A152,1,A173,1,A192,A201,1 +3625778,A11,12,A32,A40,3651,A64,A73,1,A93,A101,3,A122,31,A143,A152,1,A173,2,A191,A201,1 +6075200,A11,15,A34,A42,975,A61,A73,2,A91,A101,3,A122,25,A143,A152,2,A173,1,A191,A201,1 +2626457,A12,15,A32,A45,2631,A62,A73,3,A92,A101,2,A121,25,A143,A152,1,A172,1,A191,A201,1 +7031172,A12,24,A32,A43,2896,A62,A72,2,A93,A101,1,A123,29,A143,A152,1,A173,1,A191,A201,1 +7612370,A11,6,A34,A40,4716,A65,A72,1,A93,A101,3,A121,44,A143,A152,2,A172,2,A191,A201,1 +3634470,A14,24,A32,A43,2284,A61,A74,4,A93,A101,2,A123,28,A143,A152,1,A173,1,A192,A201,1 +1891464,A14,6,A32,A41,1236,A63,A73,2,A93,A101,4,A122,50,A143,A151,1,A173,1,A191,A201,1 +4385444,A12,12,A32,A43,1103,A61,A74,4,A93,A103,3,A121,29,A143,A152,2,A173,1,A191,A202,1 +4245378,A14,12,A34,A40,926,A61,A71,1,A92,A101,2,A122,38,A143,A152,1,A171,1,A191,A201,1 +8748357,A14,18,A34,A43,1800,A61,A73,4,A93,A101,2,A123,24,A143,A152,2,A173,1,A191,A201,1 +6154067,A13,15,A32,A46,1905,A61,A75,4,A93,A101,4,A123,40,A143,A151,1,A174,1,A192,A201,1 +8475240,A14,12,A32,A42,1123,A63,A73,4,A92,A101,4,A123,29,A143,A151,1,A172,1,A191,A201,2 +8497474,A11,48,A34,A41,6331,A61,A75,4,A93,A101,4,A124,46,A143,A153,2,A173,1,A192,A201,2 +4216405,A13,24,A32,A43,1377,A62,A75,4,A92,A101,2,A124,47,A143,A153,1,A173,1,A192,A201,1 +2028520,A12,30,A33,A49,2503,A62,A75,4,A93,A101,2,A122,41,A142,A152,2,A173,1,A191,A201,1 +6062683,A12,27,A32,A49,2528,A61,A72,4,A92,A101,1,A122,32,A143,A152,1,A173,2,A192,A201,1 +3591516,A14,15,A32,A40,5324,A63,A75,1,A92,A101,4,A124,35,A143,A153,1,A173,1,A191,A201,1 +8852736,A12,48,A32,A40,6560,A62,A74,3,A93,A101,2,A122,24,A143,A152,1,A173,1,A191,A201,2 +2188239,A12,12,A30,A42,2969,A61,A72,4,A92,A101,3,A122,25,A143,A151,2,A173,1,A191,A201,2 +8356737,A12,9,A32,A43,1206,A61,A75,4,A92,A101,4,A121,25,A143,A152,1,A173,1,A191,A201,1 +7311456,A12,9,A32,A43,2118,A61,A73,2,A93,A101,2,A121,37,A143,A152,1,A172,2,A191,A201,1 
+2717625,A14,18,A34,A43,629,A63,A75,4,A93,A101,3,A122,32,A141,A152,2,A174,1,A192,A201,1 +1257668,A11,6,A31,A46,1198,A61,A75,4,A92,A101,4,A124,35,A143,A153,1,A173,1,A191,A201,2 +9770090,A14,21,A32,A41,2476,A65,A75,4,A93,A101,4,A121,46,A143,A152,1,A174,1,A192,A201,1 +5376189,A11,9,A34,A43,1138,A61,A73,4,A93,A101,4,A121,25,A143,A152,2,A172,1,A191,A201,1 +7684771,A12,60,A32,A40,14027,A61,A74,4,A93,A101,2,A124,27,A143,A152,1,A174,1,A192,A201,2 +5147905,A14,30,A34,A41,7596,A65,A75,1,A93,A101,4,A123,63,A143,A152,2,A173,1,A191,A201,1 +9729748,A14,30,A34,A43,3077,A65,A75,3,A93,A101,2,A123,40,A143,A152,2,A173,2,A192,A201,1 +3354393,A14,18,A32,A43,1505,A61,A73,4,A93,A101,2,A124,32,A143,A153,1,A174,1,A192,A201,1 +2738606,A13,24,A34,A43,3148,A65,A73,3,A93,A101,2,A123,31,A143,A152,2,A173,1,A192,A201,1 +7723603,A12,20,A30,A41,6148,A62,A75,3,A94,A101,4,A123,31,A141,A152,2,A173,1,A192,A201,1 +4275545,A13,9,A30,A43,1337,A61,A72,4,A93,A101,2,A123,34,A143,A152,2,A174,1,A192,A201,2 +8516067,A12,6,A31,A46,433,A64,A72,4,A92,A101,2,A122,24,A141,A151,1,A173,2,A191,A201,2 +8269904,A11,12,A32,A40,1228,A61,A73,4,A92,A101,2,A121,24,A143,A152,1,A172,1,A191,A201,2 +3797846,A12,9,A32,A43,790,A63,A73,4,A92,A101,3,A121,66,A143,A152,1,A172,1,A191,A201,1 +1759390,A14,27,A32,A40,2570,A61,A73,3,A92,A101,3,A121,21,A143,A151,1,A173,1,A191,A201,2 +8406970,A14,6,A34,A40,250,A64,A73,2,A92,A101,2,A121,41,A141,A152,2,A172,1,A191,A201,1 +4726527,A14,15,A34,A43,1316,A63,A73,2,A94,A101,2,A122,47,A143,A152,2,A172,1,A191,A201,1 +7293796,A11,18,A32,A43,1882,A61,A73,4,A92,A101,4,A123,25,A141,A151,2,A173,1,A191,A201,2 +8019704,A12,48,A31,A49,6416,A61,A75,4,A92,A101,3,A124,59,A143,A151,1,A173,1,A191,A201,2 +1119926,A13,24,A34,A49,1275,A64,A73,2,A91,A101,4,A121,36,A143,A152,2,A173,1,A192,A201,1 +8976091,A12,24,A33,A43,6403,A61,A72,1,A93,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,1 +2467002,A11,24,A32,A43,1987,A61,A73,2,A93,A101,4,A121,21,A143,A151,1,A172,2,A191,A201,2 
+1767395,A12,8,A32,A43,760,A61,A74,4,A92,A103,2,A121,44,A143,A152,1,A172,1,A191,A201,1 +1412724,A14,24,A32,A41,2603,A64,A73,2,A92,A101,4,A123,28,A143,A151,1,A173,1,A192,A201,1 +7497034,A14,4,A34,A40,3380,A61,A74,1,A92,A101,1,A121,37,A143,A152,1,A173,2,A191,A201,1 +1760449,A12,36,A31,A44,3990,A65,A72,3,A92,A101,2,A124,29,A141,A152,1,A171,1,A191,A201,1 +3573461,A12,24,A32,A41,11560,A61,A73,1,A92,A101,4,A123,23,A143,A151,2,A174,1,A191,A201,2 +5869439,A11,18,A32,A40,4380,A62,A73,3,A93,A101,4,A123,35,A143,A152,1,A172,2,A192,A201,1 +5359049,A14,6,A34,A40,6761,A61,A74,1,A93,A101,3,A124,45,A143,A152,2,A174,2,A192,A201,1 +7610754,A12,30,A30,A49,4280,A62,A73,4,A92,A101,4,A123,26,A143,A151,2,A172,1,A191,A201,2 +7876212,A11,24,A31,A40,2325,A62,A74,2,A93,A101,3,A123,32,A141,A152,1,A173,1,A191,A201,1 +7058762,A12,10,A31,A43,1048,A61,A73,4,A93,A101,4,A121,23,A142,A152,1,A172,1,A191,A201,1 +4452376,A14,21,A32,A43,3160,A65,A75,4,A93,A101,3,A122,41,A143,A152,1,A173,1,A192,A201,1 +3813837,A11,24,A31,A42,2483,A63,A73,4,A93,A101,4,A121,22,A142,A152,1,A173,1,A192,A201,1 +4808834,A11,39,A34,A42,14179,A65,A74,4,A93,A101,4,A122,30,A143,A152,2,A174,1,A192,A201,1 +6902723,A11,13,A34,A49,1797,A61,A72,3,A93,A101,1,A122,28,A141,A152,2,A172,1,A191,A201,1 +8866025,A11,15,A32,A40,2511,A61,A71,1,A92,A101,4,A123,23,A143,A151,1,A173,1,A191,A201,1 +5625748,A11,12,A32,A40,1274,A61,A72,3,A92,A101,1,A121,37,A143,A152,1,A172,1,A191,A201,2 +3897156,A14,21,A32,A41,5248,A65,A73,1,A93,A101,3,A123,26,A143,A152,1,A173,1,A191,A201,1 +5926774,A14,15,A32,A41,3029,A61,A74,2,A93,A101,2,A123,33,A143,A152,1,A173,1,A191,A201,1 +1337550,A11,6,A32,A42,428,A61,A75,2,A92,A101,1,A122,49,A141,A152,1,A173,1,A192,A201,1 +4131044,A11,18,A32,A40,976,A61,A72,1,A92,A101,2,A123,23,A143,A152,1,A172,1,A191,A201,2 +3163950,A12,12,A32,A49,841,A62,A74,2,A92,A101,4,A121,23,A143,A151,1,A172,1,A191,A201,1 +1657747,A14,30,A34,A43,5771,A61,A74,4,A92,A101,2,A123,25,A143,A152,2,A173,1,A191,A201,1 
+9250778,A14,12,A33,A45,1555,A64,A75,4,A93,A101,4,A124,55,A143,A153,2,A173,2,A191,A201,2 +7923489,A11,24,A32,A40,1285,A65,A74,4,A92,A101,4,A124,32,A143,A151,1,A173,1,A191,A201,2 +9517102,A13,6,A34,A40,1299,A61,A73,1,A93,A101,1,A121,74,A143,A152,3,A171,2,A191,A202,1 +1700698,A13,15,A34,A43,1271,A65,A73,3,A93,A101,4,A124,39,A143,A153,2,A173,1,A192,A201,2 +5969658,A14,24,A32,A40,1393,A61,A73,2,A93,A103,2,A121,31,A143,A152,1,A173,1,A192,A201,1 +8022152,A11,12,A34,A40,691,A61,A75,4,A93,A101,3,A122,35,A143,A152,2,A173,1,A191,A201,2 +1605780,A14,15,A34,A40,5045,A65,A75,1,A92,A101,4,A123,59,A143,A152,1,A173,1,A192,A201,1 +8975629,A11,18,A34,A42,2124,A61,A73,4,A92,A101,4,A121,24,A143,A151,2,A173,1,A191,A201,2 +4062254,A11,12,A32,A43,2214,A61,A73,4,A93,A101,3,A122,24,A143,A152,1,A172,1,A191,A201,1 +9324382,A14,21,A34,A40,12680,A65,A75,4,A93,A101,4,A124,30,A143,A153,1,A174,1,A192,A201,2 +1025006,A14,24,A34,A40,2463,A62,A74,4,A94,A101,3,A122,27,A143,A152,2,A173,1,A192,A201,1 +4776602,A12,12,A32,A43,1155,A61,A75,3,A94,A103,3,A121,40,A141,A152,2,A172,1,A191,A201,1 +1846586,A11,30,A32,A42,3108,A61,A72,2,A91,A101,4,A122,31,A143,A152,1,A172,1,A191,A201,2 +8968659,A14,10,A32,A41,2901,A65,A72,1,A92,A101,4,A121,31,A143,A151,1,A173,1,A191,A201,1 +2233391,A12,12,A34,A42,3617,A61,A75,1,A93,A101,4,A123,28,A143,A151,3,A173,1,A192,A201,1 +1491565,A14,12,A34,A43,1655,A61,A75,2,A93,A101,4,A121,63,A143,A152,2,A172,1,A192,A201,1 +4567772,A11,24,A32,A41,2812,A65,A75,2,A92,A101,4,A121,26,A143,A151,1,A173,1,A191,A201,1 +7000454,A11,36,A34,A46,8065,A61,A73,3,A92,A101,2,A124,25,A143,A152,2,A174,1,A192,A201,2 +8450534,A14,21,A34,A41,3275,A61,A75,1,A93,A101,4,A123,36,A143,A152,1,A174,1,A192,A201,1 +9182898,A14,24,A34,A43,2223,A62,A75,4,A93,A101,4,A122,52,A141,A152,2,A173,1,A191,A201,1 +4237007,A13,12,A34,A40,1480,A63,A71,2,A93,A101,4,A124,66,A141,A153,3,A171,1,A191,A201,1 +2480667,A11,24,A32,A40,1371,A65,A73,4,A92,A101,4,A121,25,A143,A151,1,A173,1,A191,A201,2 
+2564769,A14,36,A34,A40,3535,A61,A74,4,A93,A101,4,A123,37,A143,A152,2,A173,1,A192,A201,1 +6154723,A11,18,A32,A43,3509,A61,A74,4,A92,A103,1,A121,25,A143,A152,1,A173,1,A191,A201,1 +4299505,A14,36,A34,A41,5711,A64,A75,4,A93,A101,2,A123,38,A143,A152,2,A174,1,A192,A201,1 +2287593,A12,18,A32,A45,3872,A61,A71,2,A92,A101,4,A123,67,A143,A152,1,A173,1,A192,A201,1 +3094366,A12,39,A34,A43,4933,A61,A74,2,A93,A103,2,A121,25,A143,A152,2,A173,1,A191,A201,2 +7536220,A14,24,A34,A40,1940,A64,A75,4,A93,A101,4,A121,60,A143,A152,1,A173,1,A192,A201,1 +9783545,A12,12,A30,A48,1410,A61,A73,2,A93,A101,2,A121,31,A143,A152,1,A172,1,A192,A201,1 +6146218,A12,12,A32,A40,836,A62,A72,4,A92,A101,2,A122,23,A141,A152,1,A172,1,A191,A201,2 +7934311,A12,20,A32,A41,6468,A65,A71,1,A91,A101,4,A121,60,A143,A152,1,A174,1,A192,A201,1 +7300802,A12,18,A32,A49,1941,A64,A73,4,A93,A101,2,A122,35,A143,A152,1,A172,1,A192,A201,1 +7658137,A14,22,A32,A43,2675,A63,A75,3,A93,A101,4,A123,40,A143,A152,1,A173,1,A191,A201,1 +8404631,A14,48,A34,A41,2751,A65,A75,4,A93,A101,3,A123,38,A143,A152,2,A173,2,A192,A201,1 +6944710,A12,48,A33,A46,6224,A61,A75,4,A93,A101,4,A124,50,A143,A153,1,A173,1,A191,A201,2 +1514807,A11,40,A34,A46,5998,A61,A73,4,A93,A101,3,A124,27,A141,A152,1,A173,1,A192,A201,2 +2003384,A12,21,A32,A49,1188,A61,A75,2,A92,A101,4,A122,39,A143,A152,1,A173,2,A191,A201,2 +5659086,A14,24,A32,A41,6313,A65,A75,3,A93,A101,4,A123,41,A143,A152,1,A174,2,A192,A201,1 +4628471,A14,6,A34,A42,1221,A65,A73,1,A94,A101,2,A122,27,A143,A152,2,A173,1,A191,A201,1 +5403673,A13,24,A32,A42,2892,A61,A75,3,A91,A101,4,A124,51,A143,A153,1,A173,1,A191,A201,1 +8614255,A14,24,A32,A42,3062,A63,A75,4,A93,A101,3,A124,32,A143,A151,1,A173,1,A192,A201,1 +7923482,A14,9,A32,A42,2301,A62,A72,2,A92,A101,4,A122,22,A143,A151,1,A173,1,A191,A201,1 +2383650,A11,18,A32,A41,7511,A65,A75,1,A93,A101,4,A122,51,A143,A153,1,A173,2,A192,A201,2 +6958612,A14,12,A34,A42,1258,A61,A72,2,A92,A101,4,A122,22,A143,A151,2,A172,1,A191,A201,1 
+1883328,A14,24,A33,A40,717,A65,A75,4,A94,A101,4,A123,54,A143,A152,2,A173,1,A192,A201,1 +8386168,A12,9,A32,A40,1549,A65,A72,4,A93,A101,2,A121,35,A143,A152,1,A171,1,A191,A201,1 +9660695,A14,24,A34,A46,1597,A61,A75,4,A93,A101,4,A124,54,A143,A153,2,A173,2,A191,A201,1 +5786462,A12,18,A34,A43,1795,A61,A75,3,A92,A103,4,A121,48,A141,A151,2,A172,1,A192,A201,1 +1678929,A11,20,A34,A42,4272,A61,A75,1,A92,A101,4,A122,24,A143,A152,2,A173,1,A191,A201,1 +4809900,A14,12,A34,A43,976,A65,A75,4,A93,A101,4,A123,35,A143,A152,2,A173,1,A191,A201,1 +6983851,A12,12,A32,A40,7472,A65,A71,1,A92,A101,2,A121,24,A143,A151,1,A171,1,A191,A201,1 +9335369,A11,36,A32,A40,9271,A61,A74,2,A93,A101,1,A123,24,A143,A152,1,A173,1,A192,A201,2 +7140312,A12,6,A32,A43,590,A61,A72,3,A94,A101,3,A121,26,A143,A152,1,A172,1,A191,A202,1 +4639968,A14,12,A34,A43,930,A65,A75,4,A93,A101,4,A121,65,A143,A152,4,A173,1,A191,A201,1 +9646710,A12,42,A31,A41,9283,A61,A71,1,A93,A101,2,A124,55,A141,A153,1,A174,1,A192,A201,1 +1322037,A12,15,A30,A40,1778,A61,A72,2,A92,A101,1,A121,26,A143,A151,2,A171,1,A191,A201,2 +3439631,A12,8,A32,A49,907,A61,A72,3,A94,A101,2,A121,26,A143,A152,1,A173,1,A192,A201,1 +7403964,A12,6,A32,A43,484,A61,A74,3,A94,A103,3,A121,28,A141,A152,1,A172,1,A191,A201,1 +6675884,A11,36,A34,A41,9629,A61,A74,4,A93,A101,4,A123,24,A143,A152,2,A173,1,A192,A201,2 +5903457,A11,48,A32,A44,3051,A61,A73,3,A93,A101,4,A123,54,A143,A152,1,A173,1,A191,A201,2 +6892095,A11,48,A32,A40,3931,A61,A74,4,A93,A101,4,A124,46,A143,A153,1,A173,2,A191,A201,2 +2535665,A12,36,A33,A40,7432,A61,A73,2,A92,A101,2,A122,54,A143,A151,1,A173,1,A191,A201,1 +2113858,A14,6,A32,A44,1338,A63,A73,1,A91,A101,4,A121,62,A143,A152,1,A173,1,A191,A201,1 +6940571,A14,6,A34,A43,1554,A61,A74,1,A92,A101,2,A123,24,A143,A151,2,A173,1,A192,A201,1 +1818305,A11,36,A32,A410,15857,A61,A71,2,A91,A102,3,A123,43,A143,A152,1,A174,1,A191,A201,1 +1136050,A11,18,A32,A43,1345,A61,A73,4,A94,A101,3,A121,26,A141,A152,1,A173,1,A191,A201,2 
+5408293,A14,12,A32,A40,1101,A61,A73,3,A94,A101,2,A121,27,A143,A152,2,A173,1,A192,A201,1 +6973245,A13,12,A32,A43,3016,A61,A73,3,A94,A101,1,A123,24,A143,A152,1,A173,1,A191,A201,1 +3458174,A11,36,A32,A42,2712,A61,A75,2,A93,A101,2,A122,41,A141,A152,1,A173,2,A191,A201,2 +1641060,A11,8,A34,A40,731,A61,A75,4,A93,A101,4,A121,47,A143,A152,2,A172,1,A191,A201,1 +3831409,A14,18,A34,A42,3780,A61,A72,3,A91,A101,2,A123,35,A143,A152,2,A174,1,A192,A201,1 +7489128,A11,21,A34,A40,1602,A61,A75,4,A94,A101,3,A123,30,A143,A152,2,A173,1,A192,A201,1 +2737942,A11,18,A34,A40,3966,A61,A75,1,A92,A101,4,A121,33,A141,A151,3,A173,1,A192,A201,2 +3155576,A14,18,A30,A49,4165,A61,A73,2,A93,A101,2,A123,36,A142,A152,2,A173,2,A191,A201,2 +6658732,A11,36,A32,A41,8335,A65,A75,3,A93,A101,4,A124,47,A143,A153,1,A173,1,A191,A201,2 +6286813,A12,48,A33,A49,6681,A65,A73,4,A93,A101,4,A124,38,A143,A153,1,A173,2,A192,A201,1 +7203909,A14,24,A33,A49,2375,A63,A73,4,A93,A101,2,A123,44,A143,A152,2,A173,2,A192,A201,1 +8071605,A11,18,A32,A40,1216,A61,A72,4,A92,A101,3,A123,23,A143,A151,1,A173,1,A192,A201,2 +2576628,A11,45,A30,A49,11816,A61,A75,2,A93,A101,4,A123,29,A143,A151,2,A173,1,A191,A201,2 +3544441,A12,24,A32,A43,5084,A65,A75,2,A92,A101,4,A123,42,A143,A152,1,A173,1,A192,A201,1 +6494860,A13,15,A32,A43,2327,A61,A72,2,A92,A101,3,A121,25,A143,A152,1,A172,1,A191,A201,2 +5247218,A11,12,A30,A40,1082,A61,A73,4,A93,A101,4,A123,48,A141,A152,2,A173,1,A191,A201,2 +7024848,A14,12,A32,A43,886,A65,A73,4,A92,A101,2,A123,21,A143,A152,1,A173,1,A191,A201,1 +7182646,A14,4,A32,A42,601,A61,A72,1,A92,A101,3,A121,23,A143,A151,1,A172,2,A191,A201,1 +6839088,A11,24,A34,A41,2957,A61,A75,4,A93,A101,4,A122,63,A143,A152,2,A173,1,A192,A201,1 +9873458,A14,24,A34,A43,2611,A61,A75,4,A94,A102,3,A121,46,A143,A152,2,A173,1,A191,A201,1 +1896222,A11,36,A32,A42,5179,A61,A74,4,A93,A101,2,A122,29,A143,A152,1,A173,1,A191,A201,2 +3865701,A14,21,A33,A41,2993,A61,A73,3,A93,A101,2,A121,28,A142,A152,2,A172,1,A191,A201,1 
+1157229,A14,18,A32,A45,1943,A61,A72,4,A92,A101,4,A121,23,A143,A152,1,A173,1,A191,A201,2 +3448928,A14,24,A31,A49,1559,A61,A74,4,A93,A101,4,A123,50,A141,A152,1,A173,1,A192,A201,1 +4318645,A14,18,A32,A42,3422,A61,A75,4,A93,A101,4,A122,47,A141,A152,3,A173,2,A192,A201,1 +7624820,A12,21,A32,A42,3976,A65,A74,2,A93,A101,3,A123,35,A143,A152,1,A173,1,A192,A201,1 +8556957,A14,18,A32,A40,6761,A65,A73,2,A93,A101,4,A123,68,A143,A151,2,A173,1,A191,A201,2 +9294026,A14,24,A32,A40,1249,A61,A72,4,A94,A101,2,A121,28,A143,A152,1,A173,1,A191,A201,1 +8814062,A11,9,A32,A43,1364,A61,A74,3,A93,A101,4,A121,59,A143,A152,1,A173,1,A191,A201,1 +7578622,A11,12,A32,A43,709,A61,A75,4,A93,A101,4,A121,57,A142,A152,1,A172,1,A191,A201,2 +1129973,A11,20,A34,A40,2235,A61,A73,4,A94,A103,2,A122,33,A141,A151,2,A173,1,A191,A202,2 +3785461,A14,24,A34,A41,4042,A65,A74,3,A93,A101,4,A122,43,A143,A152,2,A173,1,A192,A201,1 +6563566,A14,15,A34,A43,1471,A61,A73,4,A93,A101,4,A124,35,A143,A153,2,A173,1,A192,A201,1 +8205907,A11,18,A31,A40,1442,A61,A74,4,A93,A101,4,A124,32,A143,A153,2,A172,2,A191,A201,2 +4983715,A14,36,A33,A40,10875,A61,A75,2,A93,A101,2,A123,45,A143,A152,2,A173,2,A192,A201,1 +1607673,A14,24,A32,A40,1474,A62,A72,4,A94,A101,3,A121,33,A143,A152,1,A173,1,A192,A201,1 +7984360,A14,10,A32,A48,894,A65,A74,4,A92,A101,3,A122,40,A143,A152,1,A173,1,A192,A201,1 +8489771,A14,15,A34,A42,3343,A61,A73,4,A93,A101,2,A124,28,A143,A153,1,A173,1,A192,A201,1 +1769803,A11,15,A32,A40,3959,A61,A73,3,A92,A101,2,A122,29,A143,A152,1,A173,1,A192,A201,2 +8600642,A14,9,A32,A40,3577,A62,A73,1,A93,A103,2,A121,26,A143,A151,1,A173,2,A191,A202,1 +2919320,A14,24,A34,A41,5804,A64,A73,4,A93,A101,2,A121,27,A143,A152,2,A173,1,A191,A201,1 +2941810,A14,18,A33,A49,2169,A61,A73,4,A94,A101,2,A123,28,A143,A152,1,A173,1,A192,A201,2 +5203823,A11,24,A32,A43,2439,A61,A72,4,A92,A101,4,A121,35,A143,A152,1,A173,1,A192,A201,2 +4987253,A14,27,A34,A42,4526,A64,A72,4,A93,A101,2,A121,32,A142,A152,2,A172,2,A192,A201,1 
+6273169,A14,10,A32,A42,2210,A61,A73,2,A93,A101,2,A121,25,A141,A151,1,A172,1,A191,A201,2 +6364124,A14,15,A32,A42,2221,A63,A73,2,A92,A101,4,A123,20,A143,A151,1,A173,1,A191,A201,1 +1218238,A11,18,A32,A43,2389,A61,A72,4,A92,A101,1,A123,27,A142,A152,1,A173,1,A191,A201,1 +8334248,A14,12,A34,A42,3331,A61,A75,2,A93,A101,4,A122,42,A142,A152,1,A173,1,A191,A201,1 +7214897,A14,36,A32,A49,7409,A65,A75,3,A93,A101,2,A122,37,A143,A152,2,A173,1,A191,A201,1 +7299602,A11,12,A32,A42,652,A61,A75,4,A92,A101,4,A122,24,A143,A151,1,A173,1,A191,A201,1 +8110158,A14,36,A33,A42,7678,A63,A74,2,A92,A101,4,A123,40,A143,A152,2,A173,1,A192,A201,1 +3654815,A13,6,A34,A40,1343,A61,A75,1,A93,A101,4,A121,46,A143,A152,2,A173,2,A191,A202,1 +6572384,A11,24,A34,A49,1382,A62,A74,4,A93,A101,1,A121,26,A143,A152,2,A173,1,A192,A201,1 +4718363,A14,15,A32,A44,874,A65,A72,4,A92,A101,1,A121,24,A143,A152,1,A173,1,A191,A201,1 +4110964,A11,12,A32,A42,3590,A61,A73,2,A93,A102,2,A122,29,A143,A152,1,A172,2,A191,A201,1 +5456397,A12,11,A34,A40,1322,A64,A73,4,A92,A101,4,A123,40,A143,A152,2,A173,1,A191,A201,1 +7560842,A11,18,A31,A43,1940,A61,A72,3,A93,A102,4,A124,36,A141,A153,1,A174,1,A192,A201,1 +2867079,A14,36,A32,A43,3595,A61,A75,4,A93,A101,2,A123,28,A143,A152,1,A173,1,A191,A201,1 +9165349,A11,9,A32,A40,1422,A61,A72,3,A93,A101,2,A124,27,A143,A153,1,A174,1,A192,A201,2 +5074170,A14,30,A34,A43,6742,A65,A74,2,A93,A101,3,A122,36,A143,A152,2,A173,1,A191,A201,1 +1674264,A14,24,A32,A41,7814,A61,A74,3,A93,A101,3,A123,38,A143,A152,1,A174,1,A192,A201,1 +7549449,A14,24,A32,A41,9277,A65,A73,2,A91,A101,4,A124,48,A143,A153,1,A173,1,A192,A201,1 +9453883,A12,30,A34,A40,2181,A65,A75,4,A93,A101,4,A121,36,A143,A152,2,A173,1,A191,A201,1 +7539149,A14,18,A34,A43,1098,A61,A71,4,A92,A101,4,A123,65,A143,A152,2,A171,1,A191,A201,1 +2758767,A12,24,A32,A42,4057,A61,A74,3,A91,A101,3,A123,43,A143,A152,1,A173,1,A192,A201,2 +5771164,A11,12,A32,A46,795,A61,A72,4,A92,A101,4,A122,53,A143,A152,1,A173,1,A191,A201,2 
+7250365,A12,24,A34,A49,2825,A65,A74,4,A93,A101,3,A124,34,A143,A152,2,A173,2,A192,A201,1 +6463462,A12,48,A32,A49,15672,A61,A73,2,A93,A101,2,A123,23,A143,A152,1,A173,1,A192,A201,2 +3678150,A14,36,A34,A40,6614,A61,A75,4,A93,A101,4,A123,34,A143,A152,2,A174,1,A192,A201,1 +7159187,A14,28,A31,A41,7824,A65,A72,3,A93,A103,4,A121,40,A141,A151,2,A173,2,A192,A201,1 +2996743,A11,27,A34,A49,2442,A61,A75,4,A93,A101,4,A123,43,A142,A152,4,A174,2,A192,A201,1 +3755936,A14,15,A34,A43,1829,A61,A75,4,A93,A101,4,A123,46,A143,A152,2,A173,1,A192,A201,1 +9858169,A11,12,A34,A40,2171,A61,A73,4,A93,A101,4,A122,38,A141,A152,2,A172,1,A191,A202,1 +6011271,A12,36,A34,A41,5800,A61,A73,3,A93,A101,4,A123,34,A143,A152,2,A173,1,A192,A201,1 +7464596,A12,24,A32,A42,4351,A65,A73,1,A92,A101,4,A122,48,A143,A152,1,A172,1,A192,A201,1 +9907330,A11,6,A34,A42,1872,A61,A71,4,A93,A101,4,A124,36,A143,A153,3,A174,1,A192,A201,1 +2676319,A14,18,A34,A43,1169,A65,A73,4,A93,A101,3,A122,29,A143,A152,2,A173,1,A192,A201,1 +9543202,A14,36,A33,A41,8947,A65,A74,3,A93,A101,2,A123,31,A142,A152,1,A174,2,A192,A201,1 +4682068,A11,21,A32,A43,2606,A61,A72,4,A92,A101,4,A122,28,A143,A151,1,A174,1,A192,A201,1 +5609033,A14,12,A34,A42,1592,A64,A74,3,A92,A101,2,A122,35,A143,A152,1,A173,1,A191,A202,1 +1863246,A14,15,A32,A42,2186,A65,A74,1,A92,A101,4,A121,33,A141,A151,1,A172,1,A191,A201,1 +3453367,A11,18,A32,A42,4153,A61,A73,2,A93,A102,3,A123,42,A143,A152,1,A173,1,A191,A201,2 +7549449,A14,24,A32,A41,9277,A65,A73,2,A91,A101,4,A124,48,A143,A153,1,A173,1,A192,A201,1 +6328203,A14,15,A32,A41,4657,A61,A73,3,A93,A101,2,A123,30,A143,A152,1,A173,1,A192,A201,1 +5078589,A11,16,A34,A40,2625,A61,A75,2,A93,A103,4,A122,43,A141,A151,1,A173,1,A192,A201,2 +7409626,A14,20,A34,A40,3485,A65,A72,2,A91,A101,4,A121,44,A143,A152,2,A173,1,A192,A201,1 +3861438,A14,36,A34,A41,10477,A65,A75,2,A93,A101,4,A124,42,A143,A153,2,A173,1,A191,A201,1 +6653362,A14,15,A32,A43,1386,A65,A73,4,A94,A101,2,A121,40,A143,A151,1,A173,1,A192,A201,1 
+2279281,A14,24,A32,A43,1278,A61,A75,4,A93,A101,1,A121,36,A143,A152,1,A174,1,A192,A201,1 +9374296,A11,12,A32,A43,1107,A61,A73,2,A93,A101,2,A121,20,A143,A151,1,A174,2,A192,A201,1 +2769534,A11,21,A32,A40,3763,A65,A74,2,A93,A102,2,A121,24,A143,A152,1,A172,1,A191,A202,1 +2117451,A12,36,A32,A46,3711,A65,A73,2,A94,A101,2,A123,27,A143,A152,1,A173,1,A191,A201,1 +5910784,A14,15,A33,A41,3594,A61,A72,1,A92,A101,2,A122,46,A143,A152,2,A172,1,A191,A201,1 +6446222,A12,9,A32,A40,3195,A65,A73,1,A92,A101,2,A121,33,A143,A152,1,A172,1,A191,A201,1 +1562036,A14,36,A33,A43,4454,A61,A73,4,A92,A101,4,A121,34,A143,A152,2,A173,1,A191,A201,1 +4535110,A12,24,A34,A42,4736,A61,A72,2,A92,A101,4,A123,25,A141,A152,1,A172,1,A191,A201,2 +4788744,A12,30,A32,A43,2991,A65,A75,2,A92,A101,4,A123,25,A143,A152,1,A173,1,A191,A201,1 +6790659,A14,11,A32,A49,2142,A64,A75,1,A91,A101,2,A121,28,A143,A152,1,A173,1,A192,A201,1 +6687237,A11,24,A31,A49,3161,A61,A73,4,A93,A101,2,A122,31,A143,A151,1,A173,1,A192,A201,2 +3700907,A12,48,A30,A410,18424,A61,A73,1,A92,A101,2,A122,32,A141,A152,1,A174,1,A192,A202,2 +2284183,A14,10,A32,A41,2848,A62,A73,1,A93,A102,2,A121,32,A143,A152,1,A173,2,A191,A201,1 +9968219,A11,6,A32,A40,14896,A61,A75,1,A93,A101,4,A124,68,A141,A152,1,A174,1,A192,A201,2 +6987335,A11,24,A32,A42,2359,A62,A71,1,A91,A101,1,A122,33,A143,A152,1,A173,1,A191,A201,2 +8752925,A11,24,A32,A42,3345,A61,A75,4,A93,A101,2,A122,39,A143,A151,1,A174,1,A192,A201,2 +5367701,A14,18,A34,A42,1817,A61,A73,4,A92,A101,2,A124,28,A143,A152,2,A173,1,A191,A201,1 +5393100,A14,48,A33,A43,12749,A63,A74,4,A93,A101,1,A123,37,A143,A152,1,A174,1,A192,A201,1 +4459246,A11,9,A32,A43,1366,A61,A72,3,A92,A101,4,A122,22,A143,A151,1,A173,1,A191,A201,2 +1342448,A12,12,A32,A40,2002,A61,A74,3,A93,A101,4,A122,30,A143,A151,1,A173,2,A192,A201,1 +1646042,A11,24,A31,A42,6872,A61,A72,2,A91,A101,1,A122,55,A141,A152,1,A173,1,A192,A201,2 +8308521,A11,12,A31,A40,697,A61,A72,4,A93,A101,2,A123,46,A141,A152,2,A173,1,A192,A201,2 
+6172317,A11,18,A34,A42,1049,A61,A72,4,A92,A101,4,A122,21,A143,A151,1,A173,1,A191,A201,1 +5333717,A11,48,A32,A41,10297,A61,A74,4,A93,A101,4,A124,39,A142,A153,3,A173,2,A192,A201,2 +3861659,A14,30,A32,A43,1867,A65,A75,4,A93,A101,4,A123,58,A143,A152,1,A173,1,A192,A201,1 +6910543,A11,12,A33,A40,1344,A61,A73,4,A93,A101,2,A121,43,A143,A152,2,A172,2,A191,A201,1 +1061602,A11,24,A32,A42,1747,A61,A72,4,A93,A102,1,A122,24,A143,A152,1,A172,1,A191,A202,1 +4970330,A12,9,A32,A43,1670,A61,A72,4,A92,A101,2,A123,22,A143,A152,1,A173,1,A192,A201,2 +9608683,A14,9,A34,A40,1224,A61,A73,3,A93,A101,1,A121,30,A143,A152,2,A173,1,A191,A201,1 +8847915,A14,12,A34,A43,522,A63,A75,4,A93,A101,4,A122,42,A143,A152,2,A173,2,A192,A201,1 +4860879,A11,12,A32,A43,1498,A61,A73,4,A92,A101,1,A123,23,A141,A152,1,A173,1,A191,A201,1 +1242372,A12,30,A33,A43,1919,A62,A72,4,A93,A101,3,A124,30,A142,A152,2,A174,1,A191,A201,2 +5171329,A13,9,A32,A43,745,A61,A73,3,A92,A101,2,A121,28,A143,A152,1,A172,1,A191,A201,2 +8086940,A12,6,A32,A43,2063,A61,A72,4,A94,A101,3,A123,30,A143,A151,1,A174,1,A192,A201,1 +5621508,A12,60,A32,A46,6288,A61,A73,4,A93,A101,4,A124,42,A143,A153,1,A173,1,A191,A201,2 +5701557,A14,24,A34,A41,6842,A65,A73,2,A93,A101,4,A122,46,A143,A152,2,A174,2,A192,A201,1 +1490791,A14,12,A32,A40,3527,A65,A72,2,A93,A101,3,A122,45,A143,A152,1,A174,2,A192,A201,1 +7984042,A14,10,A32,A40,1546,A61,A73,3,A93,A101,2,A121,31,A143,A152,1,A172,2,A191,A202,1 +4745533,A14,24,A32,A42,929,A65,A74,4,A93,A101,2,A123,31,A142,A152,1,A173,1,A192,A201,1 +1360133,A14,4,A34,A40,1455,A61,A74,2,A93,A101,1,A121,42,A143,A152,3,A172,2,A191,A201,1 +9152373,A11,15,A32,A42,1845,A61,A72,4,A92,A103,1,A122,46,A143,A151,1,A173,1,A191,A201,1 +8743438,A12,48,A30,A40,8358,A63,A72,1,A92,A101,1,A123,30,A143,A152,2,A173,1,A191,A201,1 +4419761,A11,24,A31,A42,3349,A63,A72,4,A93,A101,4,A124,30,A143,A153,1,A173,2,A192,A201,2 +5135310,A14,12,A32,A40,2859,A65,A71,4,A93,A101,4,A124,38,A143,A152,1,A174,1,A192,A201,1 
+6359252,A14,18,A32,A42,1533,A61,A72,4,A94,A102,1,A122,43,A143,A152,1,A172,2,A191,A201,2 +2579500,A14,24,A32,A43,3621,A62,A75,2,A93,A101,4,A123,31,A143,A152,2,A173,1,A191,A201,2 +5856859,A12,18,A34,A49,3590,A61,A71,3,A94,A101,3,A123,40,A143,A152,3,A171,2,A192,A201,1 +4531283,A11,36,A33,A49,2145,A61,A74,2,A93,A101,1,A123,24,A143,A152,2,A173,1,A192,A201,2 +5825152,A12,24,A32,A41,4113,A63,A72,3,A92,A101,4,A123,28,A143,A151,1,A173,1,A191,A201,2 +7225727,A14,36,A32,A42,10974,A61,A71,4,A92,A101,2,A123,26,A143,A152,2,A174,1,A192,A201,2 +7714243,A11,12,A32,A40,1893,A61,A73,4,A92,A103,4,A122,29,A143,A152,1,A173,1,A192,A201,1 +4837212,A11,24,A34,A43,1231,A64,A75,4,A92,A101,4,A122,57,A143,A151,2,A174,1,A192,A201,1 +6919828,A13,30,A34,A43,3656,A65,A75,4,A93,A101,4,A122,49,A142,A152,2,A172,1,A191,A201,1 +2791847,A12,9,A34,A43,1154,A61,A75,2,A93,A101,4,A121,37,A143,A152,3,A172,1,A191,A201,1 +2849845,A11,28,A32,A40,4006,A61,A73,3,A93,A101,2,A123,45,A143,A152,1,A172,1,A191,A201,2 +3667502,A12,24,A32,A42,3069,A62,A75,4,A93,A101,4,A124,30,A143,A153,1,A173,1,A191,A201,1 +3226277,A14,6,A34,A43,1740,A61,A75,2,A94,A101,2,A121,30,A143,A151,2,A173,1,A191,A201,1 +6097436,A12,21,A33,A40,2353,A61,A73,1,A91,A101,4,A122,47,A143,A152,2,A173,1,A191,A201,1 +1545817,A14,15,A32,A40,3556,A65,A73,3,A93,A101,2,A124,29,A143,A152,1,A173,1,A191,A201,1 +7606017,A14,24,A32,A43,2397,A63,A75,3,A93,A101,2,A123,35,A141,A152,2,A173,1,A192,A201,2 +6567094,A12,6,A32,A45,454,A61,A72,3,A94,A101,1,A122,22,A143,A152,1,A172,1,A191,A201,1 +1721068,A12,30,A32,A43,1715,A65,A73,4,A92,A101,1,A123,26,A143,A152,1,A173,1,A191,A201,1 +5114820,A12,27,A34,A43,2520,A63,A73,4,A93,A101,2,A122,23,A143,A152,2,A172,1,A191,A201,2 +7437809,A14,15,A32,A43,3568,A61,A75,4,A92,A101,2,A123,54,A141,A151,1,A174,1,A192,A201,1 +9159315,A14,42,A32,A43,7166,A65,A74,2,A94,A101,4,A122,29,A143,A151,1,A173,1,A192,A201,1 +5764003,A11,11,A34,A40,3939,A61,A73,1,A93,A101,2,A121,40,A143,A152,2,A172,2,A191,A201,1 
+1419235,A12,15,A32,A45,1514,A62,A73,4,A93,A103,2,A121,22,A143,A152,1,A173,1,A191,A201,1 +5842641,A14,24,A32,A40,7393,A61,A73,1,A93,A101,4,A122,43,A143,A152,1,A172,2,A191,A201,1 +1259519,A11,24,A31,A40,1193,A61,A71,1,A92,A102,4,A124,29,A143,A151,2,A171,1,A191,A201,2 +3793320,A11,60,A32,A49,7297,A61,A75,4,A93,A102,4,A124,36,A143,A151,1,A173,1,A191,A201,2 +7276981,A14,30,A34,A43,2831,A61,A73,4,A92,A101,2,A123,33,A143,A152,1,A173,1,A192,A201,1 +3160914,A13,24,A32,A43,1258,A63,A73,3,A92,A101,3,A123,57,A143,A152,1,A172,1,A191,A201,1 +4116332,A12,6,A32,A43,753,A61,A73,2,A92,A103,3,A121,64,A143,A152,1,A173,1,A191,A201,1 +2401016,A12,18,A33,A49,2427,A65,A75,4,A93,A101,2,A122,42,A143,A152,2,A173,1,A191,A201,1 +6097153,A14,24,A33,A40,2538,A61,A75,4,A93,A101,4,A123,47,A143,A152,2,A172,2,A191,A201,2 +3797948,A12,15,A31,A40,1264,A62,A73,2,A94,A101,2,A122,25,A143,A151,1,A173,1,A191,A201,2 +6006685,A12,30,A34,A42,8386,A61,A74,2,A93,A101,2,A122,49,A143,A152,1,A173,1,A191,A201,2 +6720199,A14,48,A32,A49,4844,A61,A71,3,A93,A101,2,A123,33,A141,A151,1,A174,1,A192,A201,2 +2846538,A13,21,A32,A40,2923,A62,A73,1,A92,A101,1,A123,28,A141,A152,1,A174,1,A192,A201,1 +1852011,A11,36,A32,A41,8229,A61,A73,2,A93,A101,2,A122,26,A143,A152,1,A173,2,A191,A201,2 +6947593,A14,24,A34,A42,2028,A61,A74,2,A93,A101,2,A122,30,A143,A152,2,A172,1,A191,A201,1 +5518158,A11,15,A34,A42,1433,A61,A73,4,A92,A101,3,A122,25,A143,A151,2,A173,1,A191,A201,1 +8456653,A13,42,A30,A49,6289,A61,A72,2,A91,A101,1,A122,33,A143,A152,2,A173,1,A191,A201,1 +7323240,A14,13,A32,A43,1409,A62,A71,2,A92,A101,4,A121,64,A143,A152,1,A173,1,A191,A201,1 +5716073,A11,24,A32,A41,6579,A61,A71,4,A93,A101,2,A124,29,A143,A153,1,A174,1,A192,A201,1 +4639718,A12,24,A34,A43,1743,A61,A75,4,A93,A101,2,A122,48,A143,A152,2,A172,1,A191,A201,1 +2797680,A14,12,A34,A46,3565,A65,A72,2,A93,A101,1,A122,37,A143,A152,2,A172,2,A191,A201,1 +3956429,A14,15,A31,A43,1569,A62,A75,4,A93,A101,4,A123,34,A141,A152,1,A172,2,A191,A201,1 
+3257050,A11,18,A32,A43,1936,A65,A74,2,A94,A101,4,A123,23,A143,A151,2,A172,1,A191,A201,1 +3566008,A11,36,A32,A42,3959,A61,A71,4,A93,A101,3,A122,30,A143,A152,1,A174,1,A192,A201,1 +5331918,A14,12,A32,A40,2390,A65,A75,4,A93,A101,3,A123,50,A143,A152,1,A173,1,A192,A201,1 +9671059,A14,12,A32,A42,1736,A61,A74,3,A92,A101,4,A121,31,A143,A152,1,A172,1,A191,A201,1 +2180183,A11,30,A32,A41,3857,A61,A73,4,A91,A101,4,A122,40,A143,A152,1,A174,1,A192,A201,1 +3130615,A14,12,A32,A43,804,A61,A75,4,A93,A101,4,A123,38,A143,A152,1,A173,1,A191,A201,1 +6267789,A11,45,A32,A43,1845,A61,A73,4,A93,A101,4,A124,23,A143,A153,1,A173,1,A192,A201,2 +6959896,A12,45,A34,A41,4576,A62,A71,3,A93,A101,4,A123,27,A143,A152,1,A173,1,A191,A201,1 diff --git a/Module4/.ipynb_checkpoints/IntroductionToRegression-checkpoint.ipynb b/Module4/.ipynb_checkpoints/IntroductionToRegression-checkpoint.ipynb new file mode 100644 index 0000000..15f13b6 --- /dev/null +++ b/Module4/.ipynb_checkpoints/IntroductionToRegression-checkpoint.ipynb @@ -0,0 +1,563 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction to Regression\n", + "\n", + "## Introduction\n", + "\n", + "In this lab you will learn to apply linear regression models using the Python scikit-learn package. In particular, you will:\n", + "\n", + "1. Understand the basics of applying regression models for prediction. \n", + "2. Evaluate the performance of regression models. \n", + "3. Apply a recipe for using scikit-learn to define, train and test machine learning models. " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "## Overview of regression\n", + "\n", + "The method of regression is one of the oldest and most widely used analytics methods. The goal of regression is to produce a model that represents the ‘best fit’ to some observed data. Typically the model is a function describing some type of curve (lines, parabolas, etc.) 
that is determined by a set of parameters (e.g., slope and intercept). “Best fit” means that there is an optimal set of parameters according to an evaluation criterion we choose.\n", + "\n", + "A regression model attempts to predict the value of one variable, known as the **dependent variable**, **response variable** or **label**, using the values of other variables, known as **independent variables**, **explanatory variables** or **features**. Single regression uses one feature to predict the label. Multiple regression uses two or more feature variables. \n", + "\n", + "In mathematical form the goal of regression is to find a function of some features $X$ which predicts the label value $y$. This function can be written as follows:\n", + "\n", + "$$\hat{y} = f(X)$$\n", + "\n", + "The challenge in regression is to **learn** the function $f(X)$ so that the predictions of $\hat{y}$ are accurate. In other words, we train the model to minimize the difference between our predicted $\hat{y}$ and the known label values $y$. In fact, the entire field of **supervised learning** has this goal. \n", + "\n", + "Many machine learning models, including some of the latest deep learning methods, are a form of regression. These methods often suffer from the same problems, including overfitting and mathematically unstable fitting methods. \n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Overview of linear regression\n", + "\n", + "In this lab, you will work with linear regression models. Linear regression is a foundational form of regression. Once you understand a bit about linear regression you will know quite a lot about machine learning in general. \n", + "\n", + "The simplest case of linear regression is known as **single regression**, since there is a single feature. The function $f(X)$ is **linear in the model coefficients**. 
For a single vector of features $x$ the linear regression equation is written as follows:\n", + "\n", + "$$\hat{y} = a \cdot x + b$$\n", + "\n", + "The model coefficients are $a$, which we call the **slope**, and $b$, which we call the **intercept**. Notice that this is just the equation of a straight line for one variable. \n", + "\n", + "But, what are the best values of $a$ and $b$? In linear regression, $a$ and $b$ are chosen to minimize the squared error between the predictions and the known labels. This quantity is known as the **sum squared error** or SSE. For $n$ **training cases** the SSE is computed as follows:\n", + "\n", + "$$SSE = \sum_{i=1}^n \big( f(x_i) - y_i \big)^2\\\n", + "= \sum_{i=1}^n \big( \hat{y}_i - y_i \big)^2\\\n", + "= \sum_{i=1}^n \big( a \cdot x_i + b - y_i \big)^2$$\n", + "\n", + "The approach to regression that minimizes SSE is known as the **method of least squares**." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Execute a first linear regression example \n", + "\n", + "With this bit of theory in mind, you will now train and evaluate a linear regression model. In this case you will use simulated data, which means that you can compare the computed results to the known properties of the data. \n", + "\n", + "As a first step, execute the code in the cell below to load the packages you will need to run the rest of this notebook. 
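To make the method of least squares concrete: for single regression the SSE-minimizing slope and intercept have a simple closed form. The sketch below is illustrative only and is not part of the lab code; the function name `least_squares_fit` and the small data arrays are invented for this example.

```python
import numpy as np

def least_squares_fit(x, y):
    # SSE-minimizing coefficients for y_hat = a * x + b:
    # a = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2),  b = y_bar - a * x_bar
    x_bar, y_bar = x.mean(), y.mean()
    a = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    b = y_bar - a * x_bar
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 1.2, 1.9, 3.2, 3.9])
a, b = least_squares_fit(x, y)  # a ≈ 0.96, b ≈ 0.14
```

Because these invented points lie close to the line $y = x$, the fitted slope is near 1 and the intercept near 0, as expected.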
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Import packages\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import sklearn.model_selection as ms\n", + "import sklearn.metrics as sklm\n", + "from sklearn import preprocessing\n", + "from sklearn import linear_model\n", + "import scipy.stats as ss\n", + "import seaborn as sns\n", + "import math\n", + "\n", + "# make plots appear inline in the notebook\n", + "%matplotlib inline " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below simulates the data and plots the result. The data has the following properties:\n", + "\n", + "- The `x` variable is uniformly distributed between 0.0 and 10.0.\n", + "- The `y` variable equals the `x` variable plus a Normally distributed random component. As a result, for the un-scalled data, the slope coefficient should be 1.0 and the intercept 0.0. \n", + "\n", + "Execute this code and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "nr.seed(34567)\n", + "x = np.arange(start = 0.0, stop = 10.0, step = 0.1)\n", + "y = np.add(x, nr.normal(scale = 1.0, size = x.shape[0]))\n", + "\n", + "sns.regplot(x, y, fit_reg = False)\n", + "plt.xlabel('X')\n", + "plt.ylabel('Y')\n", + "plt.title('Data for regression')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, these data follow a straight line trend. However, there is some dispersion of these data as a result of the addition of the Normally distributed noise. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Split the dataset\n", + "\n", + "When performing any type of machine learning, good data preparation is required to ensure good model performance. 
Poor data preparation is often the source of poor machine learning model performance. \n", + "\n", + "The first step in preparing these data is to create an independently sampled **training dataset** and **test dataset**. In most cases, an independently sampled **evaluation dataset** will also be used. In this case, no model improvement or comparison will be performed so this additional step is unnecessary. \n", + "\n", + "If the same data are used to train and test a machine learning model, there is a high likelihood that the model will simply be learning the training data. In technical terms one can say that there is **information leakage** between the training and test processes. In this case, the model may not **generalize** well. A model that generalizes well produces consistent results when presented with new cases, never before encountered. Conversely, a model with poor generalization might give unexpected results when presented with a new case. \n", + "\n", + "The random sub-samples of the data are created using a process called **Bernoulli sampling**. Bernoulli sampling accepts a sample into the data subset with probability $p$. In this case, the probability that a given case is in the training dataset is $p$. The probability a case is in the test dataset then becomes $1-p$. \n", + "\n", + "The `train_test_split` function from the `sklearn.model_selection` module performs the required random sampling. Strictly speaking, `train_test_split` draws a random split of a fixed size rather than performing independent Bernoulli trials, but the effect is similar. The `train_test_split` function samples the index for the array containing the features and label values. The code in the cell below performs this split. Execute this code."
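Bernoulli sampling itself can be sketched in a few lines. This snippet is illustrative only and not part of the lab code: each case joins the training subset independently with probability $p$, so the subset sizes are random rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(9988)
n, p = 100, 0.75  # 100 cases; each joins the training set with probability p

# One Bernoulli trial per case decides which subset it lands in
in_train = rng.uniform(size=n) < p
train_idx = np.flatnonzero(in_train)
test_idx = np.flatnonzero(~in_train)
```

Each case appears in exactly one of the two subsets, so no case is shared between training and test.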
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(9988)\n", + "indx = range(len(x))\n", + "indx = ms.train_test_split(indx, test_size = 50)\n", + "x_train = np.ravel(x[indx[0]])\n", + "y_train = np.ravel(y[indx[0]])\n", + "x_test = np.ravel(x[indx[1]])\n", + "y_test = np.ravel(y[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Scale numeric features\n", + "\n", + "Now that the dataset is split, the numeric feature column must be re-scaled. Rescaling of numeric features is extremely important. The numeric range of a feature should not determine how much influence that feature has on the training of the machine learning model. \n", + "\n", + "For example, consider a data set with two features, age in years, typically measured in a few tens, and income, typically measured in tens or hundreds of thousands. There is no reason to believe that income is more important than age in some model, simply because its range of values is greater. To prevent this problem numeric features are scaled to the same range. \n", + "\n", + "There are many possible scaling methods. One simple method is known as **Min-Max** normalization. The data are scaled using the following formula to be in the range $[0, 1]$:\n", + "\n", + "$$x\_scaled_i = \frac{(x_i - Min(x))}{(Max(x) - Min(x))}$$\n", + "\n", + "where,\n", + "$x_i $ is the $i$th sample value,\n", + "$Min(X) $ is the minimum value of all samples,\n", + "$Max(X) $ is the maximum value of all samples.\n", + "\n", + "In general, Min-Max normalization is a good choice for cases where the value being scaled has a complex distribution. For example, a variable with a distribution with multiple modes might be a good candidate for Min-Max normalization. 
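A minimal sketch of Min-Max normalization (illustrative only; scikit-learn's `MinMaxScaler` provides an equivalent transformation). As with the Z-Score scaling used later in this lab, the minimum and maximum must come from the training data only.

```python
import numpy as np

def min_max_scale(x_train, x_test):
    # Scale both subsets using the training data's min and max only
    x_min, x_max = x_train.min(), x_train.max()
    return (x_train - x_min) / (x_max - x_min), (x_test - x_min) / (x_max - x_min)

train = np.array([2.0, 4.0, 6.0, 10.0])  # invented values for illustration
test = np.array([4.0, 8.0])
train_scaled, test_scaled = min_max_scale(train, test)  # train_scaled -> [0.0, 0.25, 0.5, 1.0]
```

Because the test values are scaled with the training statistics, they are not guaranteed to fall exactly in $[0, 1]$.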
Notice that the presence of a few outliers can distort the result by giving unrepresentative values of $Min(X)$ or $Max(X)$.\n", + "\n", + "\n", + "For this lab you will use **Z-Score** normalization. Z-Score normalization transforms a variable so that it has zero mean and unit standard deviation (and therefore unit variance). Z-Score normalization is performed using the following formula:\n", + "\n", + "$$x\_scaled_i = \frac{\big(x_i - \mu \big)}{\sigma}$$\n", + "\n", + "where,\n", + "$\mu $ is the mean of the variable $X$,\n", + "$\sigma $ is the standard deviation of the variable $X$." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below uses the `StandardScaler` class from the `sklearn.preprocessing` package. This class computes the scaling coefficients from the training data. The resulting transformation is then applied to the training and test data using the `transform` method. \n", + "\n", + "Notice that the scaling transform is computed only on the training data. The scaling transform should always be computed on the training data, not the test or evaluation data. \n", + "\n", + "Generally, a numeric label does not need to be scaled. Other transformations may be required, however. \n", + "\n", + "Execute the code in the cell below that applies the Z-Score transformation to the training and test feature. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Scale the feature, being sure to use the scale of the training\n", + "## data not the test data. \n", + "scaler = preprocessing.StandardScaler().fit(x_train.reshape(-1,1))\n", + "x_train = scaler.transform(x_train.reshape(-1,1)) \n", + "x_test = scaler.transform(x_test.reshape(-1,1)) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Train the regression model\n", + "\n", + "With the data prepared, it is time to train a regression model. This is done with the `sklearn.linear_model` package. The steps for training most scikit-learn models are the same as used here:\n", + "\n", + "1. A model object is instantiated. Additional model specification can be performed at instantiation time.\n", + "2. The model is fit using a numpy array of the features and the labels. In this case, there is only one feature so the `reshape` method is used to create an array of the correct dimension.\n", + "\n", + "You can follow this link to find additional [documentation on linear regression models with scikit-learn](http://scikit-learn.org/stable/modules/linear_model.html#ordinary-least-squares). \n", + "\n", + "Execute the code in the cell below to instantiate and fit the model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## define and fit the linear regression model\n", + "lin_mod = linear_model.LinearRegression()\n", + "lin_mod.fit(x_train.reshape(-1,1), y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As a first verification of this model you can print the model coefficients. These coefficients are attributes of the model object. Execute the code in the cell below and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(lin_mod.intercept_)\n", + "print(lin_mod.coef_)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Since the feature was Z-Score scaled before fitting, the intercept is close to the mean of the label values (roughly $5.0$) and the slope is close to the standard deviation of the original feature (roughly $2.9$), rather than the $0.0$ and $1.0$ used to simulate the un-scaled data. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, you will plot the predicted values computed from the test features. The `predict` method is applied to the model with the test data. A plot of the raw label values and the line of the predicted values or **scores** is then displayed. Execute the code and examine the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_regression(x, y_score, y):\n", + " ## Plot the result\n", + " sns.regplot(x, y, fit_reg=False)\n", + " plt.plot(x, y_score, c = 'red')\n", + " plt.xlabel('X')\n", + " plt.ylabel('Y')\n", + " plt.title('Fit of model to test data')\n", + "\n", + "y_score = lin_mod.predict(x_test.reshape(-1,1)) \n", + "\n", + "plot_regression(x_test, y_score, y_test)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The red line appears to be a good fit to the data. The errors between the scored values and the residuals appear to be minimal. However, an objective evaluation of model performance is require." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "## Evaluate model performance\n", + "\n", + "With the model trained, it is time to evaluate the performance. This is done using the test dataset, so that there is no information leakage from the model training. \n", + "\n", + "As a first step, a set of performance metric are computed. There are many possible metrics used for the evaluation of regression models. Generally, these metrics are functions of the **residual value**, or difference between the predicted value or score and actual label value:\n", + "\n", + "$$r_i = f(x_i) - y_i = \\hat{y}_i - y_i$$\n", + "\n", + "In this lab, you will work with some of the more common metrics:\n", + "\n", + "- **Mean squared error** or MSE, \n", + "$$MSE = \\frac{1}{N} \\sum_{i=1}^N (f(x_i) - y_i)^2$$\n", + "\n", + "The mean squared error is identical to the variance of the residuals (with a slight bias). Recall that this metric is the one linear regression minimizes. Notice that mean square error is in units of the square of the label values. 
\n", + "\n", + "- **Root mean squred error** or RMSE, \n", + "$$RMSE = \\sqrt{ \\frac{1}{N} \\sum_{i=1}^N (f(x_i) - y_i)^2}$$\n", + "\n", + "The root mean squared error is identical to the standard deviation of the residuals (again, with a slight bias). Root mean square error is in the same units as the label values. \n", + "\n", + "- **Mean absolute error** or MAE,\n", + "$$MAE = \\frac{1}{N} \\sum_{i=1}^N |f(x_i) - y_i|$$ \n", + "where $||$ is the absolute value operator. \n", + "\n", + "The similar in interpretation to the root mean squared error. You may find this measure more intuitive since it is simply the average of the magnitude of the residuals. \n", + "\n", + "- **Median absolute error**,\n", + "$$Median\\ Absolute\\ Error = Median \\big( \\sum_{i=1}^N |f(x_i) - y_i| \\big)$$ \n", + "\n", + "The median absolute error is a robust measure of the location parameter of the absolute residuals. If this measure is significantly different from the mean absolute error, it is likely that there are outliers in the residuals. \n", + "\n", + "- **R squared or $R^2$**, also known as the **coefficient of determination**, \n", + "$$R^2 = 1 - \\frac{SS_{res}}{SS_{tot}}$$ \n", + "where, \n", + "$SS_{res} = \\sum_{i=1}^N r_i^2$, or the sum of the squared residuals, \n", + "$SS_{res} = \\sum_{i=1}^N y_i^2$, or the sum of the squared label values. \n", + "\n", + "In other words, $R^2$ is measure of the reduction in sum of squared values between the raw label values and the residuals. If the model has not reduced the sum of squares of the labels (a useless model!), $R^2 = 0$. On the other hand, if the model fits the data perfectly so all $r_i = 0$, then $R^2 = 1$. 
\n", + "\n", + "- **Adjusted R squared or $R^2_{adj}$** is $R^2$ adjusted for degrees of freedom in the model,\n", + "$$R^2_{adj} = 1 - \\frac{var(r)}{var(y)} = 1 - \\frac{\\frac{SS_{res}}{(n - p -1)}}{\\frac{SS_{tot}}{(n-1)}}$$ \n", + "where, \n", + "$var(r) = $ the variance of the residuals, \n", + "$var(y) = $ the variance of the labels,\n", + "$n = $ the number of samples or cases,\n", + "$p = $ number of model parameters. \n", + "\n", + "The interpretation of $R^2_{adj}$ is the same as $R^2$. In many cases there will be little difference. However if the number of parameters is significant with respect to the number of cases, $R^2$ will give an overly optimistic measure of model performance. In general, the difference between $R^2_{adj}$ and $R^2$ becomes less significant as the number of cases $n$ grows. However, even for 'big data' models there can be a significant difference if there are a large number of model parameters. \n", + "\n", + "****\n", + "**Note:** Is it possible to get values of $R^2$ outside the range $\\{ 0,1 \\}$? Ordinarily no. But there are exceptions. \n", + "\n", + "$R^2$ can only be greater than $1$ in degenerate cases. For example, if all label values are the same. But, in this case, you do not need a model to predict the label!\n", + "\n", + "What if you find your model gives an $R^2$ less than $0$? What can this possibly mean? This invariably means that there is a bug in your code and that the residuals of your model have greater dispersion than the original labels!\n", + "****\n", + "\n", + "The code in the cell below uses functions from the `sklearn.metrics` package to compute some common metrics. There is no function for $R^2_{adj}$ in `sklearn.metrics`, but the adjustments for degrees of freedom are easily computed. \n", + "\n", + "You can follow this link to find additional [documentation on linear regression merics built into scikit-learn](http://scikit-learn.org/stable/modules/model_evaluation.html#regression-metrics). 
\n", + "\n", + "Execute the code in the cell below, examine the results, and answer **Question 1** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def print_metrics(y_true, y_predicted, n_parameters):\n", + " ## First compute R^2 and the adjusted R^2\n", + " r2 = sklm.r2_score(y_true, y_predicted)\n", + " r2_adj = r2 - (n_parameters - 1)/(y_true.shape[0] - n_parameters) * (1 - r2)\n", + " \n", + " ## Print the usual metrics and the R^2 values\n", + " print('Mean Square Error = ' + str(sklm.mean_squared_error(y_true, y_predicted)))\n", + " print('Root Mean Square Error = ' + str(math.sqrt(sklm.mean_squared_error(y_true, y_predicted))))\n", + " print('Mean Absolute Error = ' + str(sklm.mean_absolute_error(y_true, y_predicted)))\n", + " print('Median Absolute Error = ' + str(sklm.median_absolute_error(y_true, y_predicted)))\n", + " print('R^2 = ' + str(r2))\n", + " print('Adjusted R^2 = ' + str(r2_adj))\n", + " \n", + "print_metrics(y_test, y_score, 2) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "How can you interpret these results:\n", + "- The MSE and RMSE are as expected. The standard deviation of the simulated data is $1.0$. \n", + "- The MAE and median absolute error have small values and are close together, indicating a good model fit and few significant outliers in the residuals. \n", + "- The $R^2_{adj}$ and $R^2$ are both fairly close to one, indicating that the model is making useful predictions that are much better than the simple average of the label values." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The residuals of a linear regression model should have an approximately Normal distribution. This condition can be easily tested using graphical methods, specifically a histogram and a Quantile-Quantile Normal plot. 
\n", + "\n", + "****\n", + "**Note:** A common misconception is that the features or label of a linear regression model must have Normal distributions. This is not the case! Rather, the residuals (errors) of the model should be Normally distributed. \n", + "**** \n", + "\n", + "The code in the cell below plots a kernel density plot and histogram of the residuals of the regression model. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def hist_resids(y_test, y_score):\n", + " ## first compute vector of residuals. \n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now make the residual plots\n", + " sns.distplot(resids)\n", + " plt.title('Histogram of residuals')\n", + " plt.xlabel('Residual value')\n", + " plt.ylabel('count')\n", + " \n", + "hist_resids(y_test, y_score) " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "This histogram and the kernel density plot look approximately Normal, but with some deviations. Overall, these residuals look reasonable for a real-world model. " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "Another useful plot is the **Quantile-Quantile Normal plot**, or Q-Q Normal plot. This plot displays quantiles of a standard Normal distribution on the horizontal axis and the quantiles of the residuals on the vertical axis. If the residuals were perfectly Normally distributed, these points would fall on a straight line. In real-world problems, you should expect the straight line relationship to be approximate. \n", + "\n", + "Execute the code in the cell below and examine the resulting plot. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def resid_qq(y_test, y_score):\n", + " ## first compute vector of residuals. 
\n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now make the residual plots\n", + " ss.probplot(resids.flatten(), plot = plt)\n", + " plt.title('Quantile-Quantile Normal plot of residuals')\n", + " plt.xlabel('Quantiles of standard Normal distribution')\n", + " plt.ylabel('Quantiles of residuals')\n", + " \n", + "resid_qq(y_test, y_score) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that these points nearly fall along the straight line. This indicates that the residuals have a distribution which is approximately Normal. \n", + "\n", + "You will now make one last diagnostic plot for this regression model, known as a **residual plot**. A plot of residuals vs. predicted values (scores) shows if there is structure in the residuals. For an ideal regression model the variance or dispersion of the residuals should not change with the values of the predicted values. It has been said that the ideal residual plot should look like a 'fuzzy caterpillar' with no change vs. the predicted value. \n", + "\n", + "Any structure in this plot that varies with the predicted values indicates that the quality of the model fit changes across the range of predictions. For example, if the residuals increase with the predicted values, the model can be said to predict only the smaller label values well. The opposite situation indicates that only large label values are well predicted. Changes in the mid-range indicate that there is some nonlinear change with predicted values. In other words, in any of these cases the model is not accurately computing the predicted values. \n", + "\n", + "Execute the code in the cell below to display and examine the residual plot for the regression model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def resid_plot(y_test, y_score):\n", + " ## first compute vector of residuals.
\n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now make the residual plots\n", + " sns.regplot(y_score, resids, fit_reg=False)\n", + " plt.title('Residuals vs. predicted values')\n", + " plt.xlabel('Predicted values')\n", + " plt.ylabel('Residual')\n", + " \n", + "resid_plot(y_test, y_score) " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ + "This residual plot looks fairly well behaved. The dispersion is reasonably constant over the range of the predicted values. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have performed a complete machine learning process for a linear regression model. The same steps are followed for creating and testing any machine learning model. The steps in this process include:\n", + "\n", + "1. Simulated a dataset. In a typical regression problem, detailed data exploration would be performed.\n", + "2. Prepared the data. In this case preparation included splitting the data into training and test subsets and scaling the features. \n", + "3. Constructed the regression model using training data with scikit-learn.\n", + "4. Evaluated the results of the model using the test data. In this case the residuals were found to be reasonably small and well behaved.
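These four steps can be sketched end-to-end in a few lines. The sketch below is a minimal reconstruction with hypothetical simulated data and variable names, not the lab's exact code:

```python
import numpy as np
from sklearn import preprocessing, linear_model
import sklearn.metrics as sklm

# 1. Simulate a dataset: y = x + standard Normal noise.
rng = np.random.RandomState(9988)
x = rng.normal(size=200)
y = x + rng.normal(size=200)

# 2. Prepare the data: split into train/test subsets, then scale the
#    feature using coefficients computed on the training data only.
x_train, x_test = x[:100].reshape(-1, 1), x[100:].reshape(-1, 1)
y_train, y_test = y[:100], y[100:]
scaler = preprocessing.StandardScaler().fit(x_train)
x_train, x_test = scaler.transform(x_train), scaler.transform(x_test)

# 3. Construct the regression model on the training data.
lin_mod = linear_model.LinearRegression().fit(x_train, y_train)

# 4. Evaluate on the test data.
y_score = lin_mod.predict(x_test)
print(sklm.r2_score(y_test, y_score))
```

With a true signal-to-noise ratio of one, the test-set $R^2$ should land somewhere near $0.5$, well above the useless-model value of $0$.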
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/Module4/ApplyingLinearRegression.ipynb b/Module4/ApplyingLinearRegression.ipynb index 55f6286..de59d52 100644 --- a/Module4/ApplyingLinearRegression.ipynb +++ b/Module4/ApplyingLinearRegression.ipynb @@ -12,7 +12,7 @@ "\n", "In this lab will learn to:\n", "\n", - "1. Use categorical data with Scikit-Learn. \n", + "1. Use categorical data with scikit-learn. \n", "2. Apply transformations to features and labels to improve model performance. \n", "3. Compare regression models to improve model performance. " ] @@ -30,7 +30,7 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -53,276 +53,24 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The code in the cell below loads the dataset and performs several data cleanup steps. Execute this code and ensure that the expected columns are present. " + "The code in the cell below loads the dataset which was prepared using steps from the Data Preparation lab.Execute this code and ensure that the expected columns are present. 
" ] }, { "cell_type": "code", - "execution_count": 3, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Index(['symboling', 'normalized_losses', 'make', 'fuel_type', 'aspiration',\n", - " 'num_of_doors', 'body_style', 'drive_wheels', 'engine_location',\n", - " 'wheel_base', 'length', 'width', 'height', 'curb_weight', 'engine_type',\n", - " 'num_of_cylinders', 'engine_size', 'fuel_system', 'bore', 'stroke',\n", - " 'compression_ratio', 'horsepower', 'peak_rpm', 'city_mpg',\n", - " 'highway_mpg', 'price'],\n", - " dtype='object')\n" - ] - } - ], - "source": [ - "auto_prices = pd.read_csv('Automobile price data _Raw_.csv')\n", - "\n", - "def clean_auto_data(auto_prices):\n", - " 'Function to load the auto price data set from a .csv file' \n", - " import pandas as pd\n", - " import numpy as np\n", - " \n", - " ## Remove rows with missing values, accounting for mising values coded as '?'\n", - " cols = ['price', 'bore', 'stroke', \n", - " 'horsepower', 'peak-rpm']\n", - " for column in cols:\n", - " auto_prices.loc[auto_prices[column] == '?', column] = np.nan\n", - " auto_prices.dropna(axis = 0, inplace = True)\n", - "\n", - " ## Convert some columns to numeric values\n", - " for column in cols:\n", - " auto_prices[column] = pd.to_numeric(auto_prices[column])\n", - " \n", - " ## fix column names so the '-' character becomes '_'\n", - " cols = auto_prices.columns\n", - " auto_prices.columns = [str.replace('-', '_') for str in cols]\n", - " \n", - " return auto_prices\n", - "auto_prices = clean_auto_data(auto_prices)\n", - "\n", - "print(auto_prices.columns)" - ] - }, - { - "cell_type": "markdown", + "execution_count": null, "metadata": {}, + "outputs": [], "source": [ - "As a next step, execute the code in the cell below to display and examine the first few rows of the dataset. 
" + "auto_prices = pd.read_csv('Auto_Data_Preped.csv')\n", + "auto_prices.columns" ] }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
symbolingnormalized_lossesmakefuel_typeaspirationnum_of_doorsbody_styledrive_wheelsengine_locationwheel_base...engine_sizefuel_systemborestrokecompression_ratiohorsepowerpeak_rpmcity_mpghighway_mpgprice
03?alfa-romerogasstdtwoconvertiblerwdfront88.6...130mpfi3.472.689.01115000212713495
13?alfa-romerogasstdtwoconvertiblerwdfront88.6...130mpfi3.472.689.01115000212716500
21?alfa-romerogasstdtwohatchbackrwdfront94.5...152mpfi2.683.479.01545000192616500
32164audigasstdfoursedanfwdfront99.8...109mpfi3.193.4010.01025500243013950
42164audigasstdfoursedan4wdfront99.4...136mpfi3.193.408.01155500182217450
\n", - "

5 rows × 26 columns

\n", - "
" - ], - "text/plain": [ - " symboling normalized_losses make fuel_type aspiration num_of_doors \\\n", - "0 3 ? alfa-romero gas std two \n", - "1 3 ? alfa-romero gas std two \n", - "2 1 ? alfa-romero gas std two \n", - "3 2 164 audi gas std four \n", - "4 2 164 audi gas std four \n", - "\n", - " body_style drive_wheels engine_location wheel_base ... engine_size \\\n", - "0 convertible rwd front 88.6 ... 130 \n", - "1 convertible rwd front 88.6 ... 130 \n", - "2 hatchback rwd front 94.5 ... 152 \n", - "3 sedan fwd front 99.8 ... 109 \n", - "4 sedan 4wd front 99.4 ... 136 \n", - "\n", - " fuel_system bore stroke compression_ratio horsepower peak_rpm city_mpg \\\n", - "0 mpfi 3.47 2.68 9.0 111 5000 21 \n", - "1 mpfi 3.47 2.68 9.0 111 5000 21 \n", - "2 mpfi 2.68 3.47 9.0 154 5000 19 \n", - "3 mpfi 3.19 3.40 10.0 102 5500 24 \n", - "4 mpfi 3.19 3.40 8.0 115 5500 18 \n", - "\n", - " highway_mpg price \n", - "0 27 13495 \n", - "1 27 16500 \n", - "2 26 16500 \n", - "3 30 13950 \n", - "4 22 17450 \n", - "\n", - "[5 rows x 26 columns]" - ] - }, - "execution_count": 4, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ "auto_prices.head()" ] @@ -334,428 +82,24 @@ "Notice that there are both numeric and categorical features. " ] }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "## Transform some variables\n", - "\n", - "Before building any machine learning model data transformations should be applied. Generally, deterimining which transformations to apply is performed by careful exploration of the data set. Failure to perform these steps generally results in machine learning models with suboptimal performance. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Agregating categorical variables\n", - "\n", - "When a dataset contains categorical variables these need to be investigated to ensure that each category has sufficient samples. 
It is commonly the case that some categories may have very few samples, or have so many similar categories as to be meaningless. \n", - "\n", - "As a specific case, you will examine the number of cylinders in the cars. Execute the cell below to print a frequency table for this variable and examine the result. " - ] - }, - { - "cell_type": "code", - "execution_count": 5, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
fuel_type
num_of_cylinders
eight4
five10
four155
six24
three1
twelve1
\n", - "
" - ], - "text/plain": [ - " fuel_type\n", - "num_of_cylinders \n", - "eight 4\n", - "five 10\n", - "four 155\n", - "six 24\n", - "three 1\n", - "twelve 1" - ] - }, - "execution_count": 5, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "auto_prices[['fuel_type', 'num_of_cylinders']].groupby('num_of_cylinders').count()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Notice that there is only one car with three and twelve cylinders. There are only four cars with eight cylinders, and 10 cars with five cylinders. It is likely that all of these categories will not have statistically significant difference in predicting auto price. It is clear that these categories need to be aggregated. \n", - "\n", - "The code in the cell below uses a Python dictionary to recode the number of cylinder categories into a smaller number categories. Execute this code and examine the resulting frequency table." - ] - }, - { - "cell_type": "code", - "execution_count": 6, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
fuel_type
num_of_cylinders
eight_twelve5
five_six34
three_four156
\n", - "
" - ], - "text/plain": [ - " fuel_type\n", - "num_of_cylinders \n", - "eight_twelve 5\n", - "five_six 34\n", - "three_four 156" - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "cylinder_categories = {'three':'three_four', 'four':'three_four', \n", - " 'five':'five_six', 'six':'five_six',\n", - " 'eight':'eight_twelve', 'twelve':'eight_twelve'}\n", - "auto_prices['num_of_cylinders'] = [cylinder_categories[x] for x in auto_prices['num_of_cylinders']]\n", - "auto_prices[['fuel_type', 'num_of_cylinders']].groupby('num_of_cylinders').count()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "There are now three categories. One of these categories only has five members. However, it is likely that these autos will have different pricing from others. \n", - "\n", - "Now, execute the code in the cell below and examine the frequency table for the `body_style` feature." - ] - }, - { - "cell_type": "code", - "execution_count": 7, - "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
fuel_type
body_style
convertible6
hardtop8
hatchback63
sedan94
wagon24
\n", - "
" - ], - "text/plain": [ - " fuel_type\n", - "body_style \n", - "convertible 6\n", - "hardtop 8\n", - "hatchback 63\n", - "sedan 94\n", - "wagon 24" - ] - }, - "execution_count": 7, - "metadata": {}, - "output_type": "execute_result" - } - ], - "source": [ - "auto_prices[['fuel_type', 'body_style']].groupby('body_style').count()" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Two of these categories have a limited number of members, but for now this feature will not be agregated. This can be done later to improve model performance. " - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Transforming numeric variables\n", - "\n", - "To improve performance of machine learning models transformations of the values are often applied. Typically, transformations are used to make the relationships betweeen variables more linear. In other cases, transformations are performed to make distributions closer to Normal, or at least more symmetric. These transformations can include taking logarithms, exponential transformations and power transformations. \n", - "\n", - "In this case, you will transform the label, the price of the car. Execute the code in the cell below to display and examine a histogram of the label. 
" - ] - }, - { - "cell_type": "code", - "execution_count": 8, - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZ4AAAEWCAYAAABWn/G6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl8XWWd+PHP92ZPs7VJ2qRJN9pCm5aWtmFXAUEoCBYd1CIoOijOiDrK6Ag6P4dx+w06yriAy0+cUQZsARcqUiq7ILQlQFu6N93TJk3S7Ptyv78/zpOSpllu29x77r35vl+v+8q5z3nOc75PueSb85znPkdUFWOMMSZSAn4HYIwxZmyxxGOMMSaiLPEYY4yJKEs8xhhjIsoSjzHGmIiyxGOMMSaiLPGYMUNEtojIpX7H4ScReb+IHBSRFhFZNArt3SQifxmN2MzYIfY9HhMPRGQf8ElVfaZf2cdd2TtOop3pwF4gSVV7RjdK/4nIbuAOVX3c71jM2GVXPMZEkIgk+hzCNGDLaDQUBX0xMcoSjxkzRGSfiFzhts8TkTIRaRKRIyLyA1ftr+5ngxuOulBEAiLyryKyX0SqReQ3IpLdr92PuX1HReT/DDjP3SLymIj8r4g0AR93535VRBpEpFJEfiIiyf3aUxH5jIjsEpFmEfmmiMx0xzSJyCP96w/o46CxikiKiLQACcBGd+Uz2PEqIp8XkT0iUisi3xORgNv3cRH5m4jcKyJ1wN2u7OV+x88TkadFpM79u361X1x3ishu9+/0iIhMcPtS3b/PUfdv8pqITDqV/8YmNljiMWPVD4EfqmoWMBN4xJW/y/3MUdUMVX0V+Lh7XQacAWQAPwEQkRLgfuAmoBDIBooGnGsZ8BiQAzwE9AJfBPKAC4HLgc8MOGYpsAS4APgX4BfuHFOA+cCNQ/Rr0FhVtVNVM1ydhao6c+h/Gt4PlAKLXex/32/f+cAeYCLw7f4HiUgm8AzwFDAZmAU863Z/HrgeuMTtqwfuc/tuwft3mwLkAv8AtA8Tn4lxlnhMPPmj+4u5QUQa8BLCULqBWSKSp6otqrp2mLo3AT9Q1T2q2gLcBSx3Q003AH9S1ZdVtQv4OjDwxumrqvpHVQ2qaruqvq6qa1W1R1X3AT/H+4Xc3z2q2qSqW4DNwF/c+RuB1cBQEwOGizVU96hqnaoeAP6L45PcYVX9sYt9YHK4FqhS1e+raoeqNqvqOrfv08DXVLVCVTuBu4EbXFzdeAlnlqr2un+fppOI18QYSzwmnlyvqjl9L068iujvVuBMYLsb2rl2mLqTgf393u8HEoFJbt/Bvh2q2gYcHXD8wf5vRORMEXlCRKrc8Nt38K5++jvSb7t9kPcZDG64WEPVP979rs3B9g00BRh0CA/v3tIf+v1RsA3vym8S8CCwBlghIodF5LsiknQS8ZoYY4nHjEmquktVb8QbMroHeExExnHi1QrAYbxfnH2mAj14yaASKO7bISJpeH+9H3e6Ae9/CmwHZruhvq8Ccuq9CTnWUE0ZcPzhfu+HmwZ7EG/Ycqh9V/f/w0BVU1X1kKp2q+q/q2oJcBHeldPHTiJeE2Ms8ZgxSURuFpF8VQ0CDa64F6gBgnj3R/r8FviiiMwQkQy8K5SVbrr1Y8B1InKRu+H/74ycRDKBJqBFROYA/zhqHRs+1lB9WUTGi8gU4J+AlSEe9wRQICJfcJMZMkXkfLfvZ8C3RWQagIjki8gyt32ZiJwtIgl4/y7deP8tTJyyxGPGqqXAFjfT64fAcndfog3vpvnf3LDQBcCv8IaD/or3HZ8O4HMA7h7M54AVeFc/zUA10DnMub8
EfMTV/X+E/os9FEPGehIeB14HNgB/Bh4I5SBVbQbeA1wHVAG78CY5gPdvvAr4i4g0A2vxJioAFOAl8Ca8IbgXgf89yZhNDLEvkBozitxVRgPeMNpev+M5WSKieLGX+x2LiV92xWPMaRKR60Qk3d0j+k/gLWCfv1EZE70s8Rhz+pbh3YA/DMzGG7azoQRjhmBDbcYYYyLKrniMMcZElC3yN4i8vDydPn2632EYY0xMef3112tVNX+kepZ4BjF9+nTKysr8DsMYY2KKiOwfuZYNtRljjIkwSzzGGGMiyhKPMcaYiLLEY4wxJqIs8RhjjIkoSzzGGGMiyhKPMcaYiLLEY4wxJqIs8RhjjIkoW7lgjHl43YHTOv4j508dpUiMMWOVXfEYY4yJKEs8xhhjIsoSjzHGmIiyxGOMMSaiLPEYY4yJKEs8xhhjIsoSjzHGmIiyxGOMMSaiLPEYY4yJqLAmHhFZKiI7RKRcRO4cZH+KiKx0+9eJyPR+++5y5TtE5KqR2hSRz7oyFZG8fuUiIj9y+zaJyOLw9dgYY8xIwpZ4RCQBuA+4GigBbhSRkgHVbgXqVXUWcC9wjzu2BFgOzAOWAveLSMIIbf4NuALYP+AcVwOz3es24Kej2U9jjDEnJ5xXPOcB5aq6R1W7gBXAsgF1lgG/dtuPAZeLiLjyFaraqap7gXLX3pBtquqbqrpvkDiWAb9Rz1ogR0QKR7WnxhhjQhbOxFMEHOz3vsKVDVpHVXuARiB3mGNDafNU4kBEbhORMhEpq6mpGaFJY4wxpyqciUcGKdMQ65xs+enGgar+QlVLVbU0Pz9/hCaNMcacqnAmngpgSr/3xcDhoeqISCKQDdQNc2wobZ5KHMYYYyIknInnNWC2iMwQkWS8yQKrBtRZBdzitm8AnlNVdeXL3ay3GXgTA9aH2OZAq4CPudltFwCNqlo5Gh00xhhz8sL2IDhV7RGRzwJrgATgV6q6RUS+AZSp6irgAeBBESnHu9JZ7o7dIiKPAFuBHuB2Ve0Fb9r0wDZd+eeBfwEKgE0i8qSqfhJ4ErgGb4JCG/CJcPXZGGPMyMS7wDD9lZaWallZmd9hhIU9gdQYEy4i8rqqlo5Uz1YuMMYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkSUJR5jjDERZYnHGGNMRFniMcYYE1GWeIwxxkRUWBOPiCwVkR0iUi4idw6yP0VEVrr960Rker99d7nyHSJy1UhtisgM18Yu12ayK58qIs+LyJsisklErglnn40xxgwvbIlHRBKA+4CrgRLgRhEpGVDtVqBeVWcB9wL3uGNLgOXAPGApcL+IJIzQ5j3Avao6G6h3bQP8K/CIqi5ybd4fjv4aY4wJTTiveM4DylV1j6p2ASuAZQPqLAN+7bYfAy4XEXHlK1S1U1X3AuWuvUHbdMe827WBa/N6t61AltvOBg6Pcj/HjN6gsmL9Aa6890U+/WAZe2pa/A7JGBODEsPYdhFwsN/7CuD8oeqoao+INAK5rnztgGOL3PZgbeYCDaraM0j9u4G/iMjngHHAFYMFKyK3AbcBTJ06NaQOjiXVzR385tX91LV2UVKYxcu7anl221/59CVn8KUrz8LL/cYYM7JwXvEM9ptIQ6wzWuUANwL/o6rFwDXAgyJyQr9V9ReqWqqqpfn5+YM0N3apKn988xDtXb08cEspf/78O3jhy5dx7YJC7nt+N6s22kWkMSZ04Uw8FcCUfu+LOXGY61gdEUnEGwqrG+bYocprgRzXxsBz3Qo8AqCqrwKpQN5p9GvMefNgA/uOtrF0fgGXz52
EiJCfmcL3P3QOC6fkcPeqLdQ0d/odpjEmRoQz8bwGzHazzZLxbuyvGlBnFXCL274BeE5V1ZUvd7PeZgCzgfVDtemOed61gWvzcbd9ALgcQETm4iWemlHvbZxq7+pl9eYqpoxPY8m08cftSwgI/3nDAlo7e/n645t9itAYE2vClnjc/ZbPAmuAbXgzy7aIyDdE5H2u2gNAroiUA3cAd7pjt+BdpWwFngJuV9Xeodp0bX0FuMO1levaBvhn4FMishH4LfBxl6hMCJ7dfoS2zh6WnVNEYJD7OLMnZfKF98xm9eYqntt+xIcIjTGxRux38IlKS0u1rKzM7zDC4uF1B0Ku290b5DtPbmNuYRYfKvVGOD9y/okTL3p6g1zyvReYMiGNFbddOGqxGmNii4i8rqqlI9UL56w2E+O2VTbR2RNk8dS3h9iGSlwLirNZvbmK/1yzg8k5aUO2OVjiMsaMLbZkjhnShoMNZKUmckb+uBHrlk6bQHJCgFd210YgMmNMLLPEYwbV2tnDziPNLJySM+i9nYHSkhNYPG08Gysaae7ojkCExphYZYnHDGrToUaCCudMyQn5mItm5hIMKmv31IUxMmNMrLPEYwa14UA9BVmpFGYPfb9moLyMFM6clMnr++sI2qQVY8wQLPGYE9S3dXGwvv2krnb6nDM1h6aOHvbVtoYhMmNMPLDEY05QXu0t/jmnIPOkj51bkEVyQoCNFQ2jHZYxJk5Y4jEn2F3TQmZqIvmZKSd9bHJigJLJWWw+1ERPbzAM0RljYp0lHnOcoCq7q1uYmZ9xyitOLyzOob27l13V9tgEY8yJLPGY4xxp6qC1q5dZ+Rmn3MasiRmkJyew4aANtxljTmSJxxxnd403KWDmxFNPPAkB4eyibG/lg+7e0QrNGBMnLPGY4+yubiEvI5nstKTTamdBcQ49QWWnDbcZYwawxGOO6Q0qe4+2MvM0htn6TJ2QTnpyAtsqm0YhMmNMPLHEY46pqG+jqyc4KoknISDMKchke1UTvUH7Mqkx5m2WeMwxu2taEAhpUdBQzC3MoqM7yL6j9mVSY8zbLPGYYw7UtTExK4X05NF5WsbsiZkkBsSG24wxx7HEYwBQVSrq25kyPn3U2kxODDBrYgbbKpuwBw4aY/pY4jEA1LV20dbVO6qJB7wldOrbuqlq6hjVdo0xscsSjwHgYH0bAMUTQl+NOhRzCjMRsOE2Y8wxlngMAAfr2klOCDApK3VU281MTaJ4fBrbKptHtV1jTOyyxGMA74qnaHxaSE8bPVklhVkcaminsd2eTGqMscRjgJ7eIJWNHUwZP7rDbH3mFmYBNtxmjPFY4jFUNnbQG1SKR3liQZ/8zBRyxyVb4jHGAJZ4DG9PLJgyITyJR0SYW5jFnppWmjtsuM2YsS6kxCMivxOR94qIJao4dLCujazUxNNeGHQ4cwuz6FXlxZ01YTuHMSY2hJpIfgp8BNglIv8hInPCGJOJsIr69rBd7fSZlustGvrM1iNhPY8xJvqFlHhU9RlVvQlYDOwDnhaRV0TkEyISvj+TTdh1dvdytLWLyTnhmVjQJyDCnIIsntteTbc9EtuYMS3koTMRyQU+DnwSeBP4IV4iejoskZmIOOJWFCgY5e/vDGZuYSZNHT28tq8u7OcyxkSvUO/x/B54CUgHrlPV96nqSlX9HHD6a+gb31T2JZ7s8CeeWRMzSE4M8MzW6rCfyxgTvUK94vmlqpao6v9V1UoAEUkBUNXSoQ4SkaUiskNEykXkzkH2p4jISrd/nYhM77fvLle+Q0SuGqlNEZnh2tjl2kzut+9DIrJVRLaIyMMh9nlMONLUQUpigJwwTizok5KYwMUzc3l6W5UtGmrMGBZq4vnWIGWvDneAiCQA9wFXAyXAjSJSMqDarUC9qs4C7gXucceWAMuBecBS4H4RSRihzXuAe1V1NlDv2kZEZgN3ARer6jzgCyH2eUyoauygIDsVCcOKBYO5omQSB+va2WWPxDZmzBo
(base64-encoded PNG of the price histogram elided)\n",
-        "text/plain": [
-         ""
-        ]
-       },
-       "metadata": {},
-       "output_type": "display_data"
-      }
-     ],
-     "source": [
-      "def hist_plot(vals, lab):\n",
-      "    ## Distribution plot of values\n",
-      "    sns.distplot(vals)\n",
-      "    plt.title('Histogram of ' + lab)\n",
-      "    plt.xlabel('Value')\n",
-      "    plt.ylabel('Density')\n",
-      "    \n",
-      "labels = np.array(auto_prices['price'])\n",
-      "hist_plot(labels, 'prices')"
-     ]
-    },
-    {
-     "cell_type": "markdown",
-     "metadata": {},
-     "source": [
-      "The distribution of auto price is both quite skewed to the right and multimodal. Given the skew and the fact that there are no values less than or equal to zero, a log transformation might be appropriate.\n",
-      "\n",
-      "The code in the cell below displays a histogram of the logarithm of prices. Execute this code and examine the result. 
" - ] - }, - { - "cell_type": "code", - "execution_count": 9, - "metadata": { - "scrolled": true - }, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl8VPW9//HXZyb7vhICISRAQHZZRFBB3HGpaKt1qW21Wttavdfa29b2+utt7ar1emurtdrW61Zc6rUWvbhVZVEEQfYtEMKSECALJAGyJ5/fHzPMHUNCBsjkzEw+z8djHpmZ8z1n3hnCfOZ8v+d8j6gqxhhjDIDL6QDGGGNChxUFY4wxPlYUjDHG+FhRMMYY42NFwRhjjI8VBWOMMT5WFEzQiMhGEZntdA4nicjVIlImIodFZFIXy1VERjiRrVOOmSJS7HQO4zwrCuakiMhOEbmw03M3i8iHRx+r6lhVXdjDdgq8H4xRQYrqtIeAO1U1SVVXOx2mO6q6RFVHOZ3DOM+KgoloIVBshgIbHc5wXCHwHpkQYkXBBI3/3oSITBORlSJSLyL7ReRhb7PF3p+13i6WGSLiEpH7RGSXiFSKyLMikuq33a94l9WIyP/r9Do/EZFXROR5EakHbva+9sciUisie0XkURGJ8dueisgdIrJNRA6JyM9EZLh3nXoRedm/faffscusIhIrIocBN7BWRLYH8H6letev8m7vPhFxeZe5ReQ/RaRaRHaIyJ3H28Pyvic/FJFNInJQRP5bROK8y2aLSLmI/EBE9gH/ffQ5v/WHiMir3iw1IvKo37Kvichm73bfFpGh3udFRP7L+z7Uicg6ERnX0+9tQosVBdNXHgEeUdUUYDjwsvf5Wd6fad4ulo+Bm72384BhQBLwKICIjAH+AHwJyAVSgcGdXmsu8AqQBvwVaAe+A2QBM4ALgDs6rTMHmAJMB74PPOl9jSHAOOCGbn6vLrOqarOqJnnbTFTV4d2/NT6/9/4+w4Bzga8At3iXfR24FDgdmAxcFcD2vgRcguf9Hgnc57dsIJCBZ0/mdv+VRMQNvAHsAgrwvL8vepddBfwI+DyQDSwBXvCuejGef8+ReN7764CaAHKaUKKqdrPbCd+AncBhoNbv1gB82KnNhd77i4GfAlmdtlMAKBDl99x7wB1+j0cBrUAU8GPgBb9lCUCL3+v8BFjcQ/a7gb/7PVbgbL/HnwI/8Hv8n8Bvu9lWt1n9tj3iOFkUGIFnj6IZGOO37BvAQu/994Fv+C27sPP71sW/zzf9Hl8GbPfen+19z+L8ls8Gyr33ZwBVXW0beBO41e+xy/vvPhQ4H9iKp7C6nP4btdvJ3WxPwZyKq1Q17eiNY799+7sVzzfILSKyQkSuOE7bQXi+pR61C09ByPEuKzu6QFUbOPbbaJn/AxEZKSJviMg+b5fSL/HsNfjb73e/sYvHSXTteFlPRBYQ08W2ju4Ffeb37nS/O/5tdnm3cVSVqjZ1s94QYJeqtnWxbCjwiLcrrhY4AAgwWFXfx7NH9xiwX0SeFJGUAHKaEGJFwfQJVd2mqjcAA4AHgFdEJBHPt93OKvB8+ByVD7Th+aDeC+QdXSAi8UBm55fr9PhxYAtQpJ7uqx/h+SDrDcfLeiKq8exhdN7WHu/9z/zeeD64e+LfJt+b9ajjTY9cBuR3M15RhmePJc3vFq+qSwFU9XeqOgUYi+dLwPcCyGlCiBUF0ydE5CYRyVbVDjxdTeDp668COvD0ox/1AvAdESkUkSQ83+xf8n5zfQX4nIic5R38/Sk9f8AnA/XAYRE5DfhWr/1ix88aMFVtxzPO8gsRSfY
O3t4DPO9t8jLwryIyWETSgB8EsNlvi0ieiGTgKYQvBRjnEzxF6NcikigicSJytnfZH4EfishY8A2OX+u9f4aInCki0cARoAnPv7EJI1YUTF+ZA2z0HpHzCHC9qjZ5u39+AXzk7ZKYDjwFPIdnHGIHng+XuwBUdaP3/ot4PrgOAZV4+uO782/Ajd62fyLwD8dAdJv1JNyF58O0FPgQmOfdPnhyvwOsA1YDC/DskRzvQ3eed51S7+3ngYTwFqjP4Rnr2A2U4xk0RlX/jmdP70VvV9wGPAPgACnenAfxdFfV4DlPw4QRUbWL7Jjw5f12Xouna2iH03n6iohcCvxRVYd2s3wncJuq/rNPg5mwZ3sKJuyIyOdEJME7JvEQsB7P0TYRS0TiReQyEYkSkcHAfwB/dzqXiTxWFEw4motn0LQCKMLTFRXpu7yCZ/zkIJ7uo814Ds81pldZ95Exxhgf21MwxhjjE3YTYWVlZWlBQYHTMYwxJqx8+umn1aqa3VO7sCsKBQUFrFy50ukYxhgTVkRkV8+trPvIGGOMHysKxhhjfKwoGGOM8bGiYIwxxseKgjHGGB8rCsYYY3ysKBhjjPGxomCMMcbHioIxxhifsDuj2QTfvOW7g7btG8/MD9q2jTGnzvYUjDHG+FhRMMYY42NFwRhjjE/QioKIPCUilSKyoZvlIiK/E5ESEVknIpODlcUYY0xggrmn8DQw5zjLL8VzKcUi4Hbg8SBmMcYYE4CgFQVVXQwcOE6TucCz6rEMSBOR3GDlMcYY0zMnxxQGA2V+j8u9zx1DRG4XkZUisrKqqqpPwhljTH/kZFGQLp7Trhqq6pOqOlVVp2Zn93g1OWOMMSfJyaJQDgzxe5wHVDiUxRhjDM4WhfnAV7xHIU0H6lR1r4N5jDGm3wvaNBci8gIwG8gSkXLgP4BoAFX9I7AAuAwoARqAW4KVxRhjTGCCVhRU9YYelivw7WC9vjHGmBNnZzQbY4zxsaJgjDHGx4qCMcYYHysKxhhjfKwoGGOM8bGiYIwxxseKgjHGGB8rCsYYY3ysKBhjjPGxomCMMcbHioIxxhgfKwrGGGN8rCgYY4zxsaJgjDHGx4qCMcYYHysKxhhjfKwoGGOM8bGiYIwxxseKgjHGGB8rCsYYY3ysKBhjjPGxomCMMcbHioIxxhgfKwrGGGN8rCgYY4zxsaJgjDHGx4qCMcYYHysKxhhjfKwoGGOM8bGiYIwxxseKgjHGGJ+oYG5cROYAjwBu4M+q+utOy/OBZ4A0b5t7VXVBMDOZvtHc1s6OqiOUVh8h2u1iQHIsQzISnI5ljOlB0IqCiLiBx4CLgHJghYjMV9VNfs3uA15W1cdFZAywACgIViYTfO0dygfFlSwqrqJdlSiX0N6hKCBARV0j37lwJNnJsU5HNcZ0IZh7CtOAElUtBRCRF4G5gH9RUCDFez8VqAhiHhNktQ0tvLSyjF01DUzMS2VqQQZDMxJQoOZwCyt2HeDlFWX8Y/UeHrp2IpeOz3U6sjGmk2COKQwGyvwel3uf8/cT4CYRKcezl3BXVxsSkdtFZKWIrKyqqgpGVnOKDjW18sTiUvbWNfHFqXlcd0Y+w7OTiHK7iHa7GJgax+cmDOLde85l5MBk7pi3iueW7XI6tjGmk2AWBeniOe30+AbgaVXNAy4DnhORYzKp6pOqOlVVp2ZnZwchqjkVre0dPL9sFw0tbXz9nGGcPiS927aFWYnMu206548awP97bQOPfVDSh0mNMT0JZlEoB4b4Pc7j2O6hW4GXAVT1YyAOyApiJtPLVJXXVu+h7GAj104ZwuD0+B7XiY9x88SXp3DV6YP4zdvF/GPNnj5IaowJRDCLwgqgSEQKRSQGuB6Y36nNbuACABEZjacoWP9QGFlXXsfqslouGD2AcYNTA14vyu3iwWsmMq0wg++9so5Pdx0MYkpjTKCCVhRUtQ24E3gb2IznKKONInK/iFzpbfZd4OsishZ4AbhZVTt3MZkQ1dT
azoINexmcFs95owac8PoxUS7+eNMUBqbE8Y3nVlJZ3xSElMaYExHUk9dUdYGqjlTV4ar6C+9zP1bV+d77m1T1bFWdqKqnq+o7wcxjetf7Wyo53NTGlRMH4ZKuhpB6lpEYw1++OpXDzW382yvr6Oiw7wTGOMnOaDYnZV99E0u3VzO1IP2UT0oryknm3y8bzeKtVTz78c5eyWeMOTlWFMxJeWfjPmKiXFw8ZmCvbO+m6UM5b1Q2v3xzC1v3H+qVbRpjTpwVBXPC9tU3sWXfIc4enkVibO+c/ygiPHjNRJJio/j+K+tot24kYxxhRcGcsMVbq4hxu5gxPLNXt5udHMuPrxjDmrJa/rrcTmwzxglWFMwJOXCkhbVltUwrzCAhpvdnSZl7+iBmFmXx4FvF7K1r7PXtG2OOz4qCOSGLt1XhcgnnjAjOOYYiwi+uGk9bRwf/8Y+NQXkNY0z3rCiYgDU0t7Fq10Em56eREh8dtNfJz0zgXy4o4p1N+/mguDJor2OMOZYVBROw1WW1tHUo04f17lhCV247ZxjDshP56fyNNLe1B/31jDEeVhRMQFSVFTsPMCQ9ntzUnuc3OlUxUS5+euVYdtY08KfFpUF/PWOMhxUFE5DdBxqoPNTMGQUZffaaM4uyuXTcQB79oITygw199rrG9GdWFExAPtlxgNgoFxPy0vr0de+7YgyC8PM3Nvfp6xrTX1lRMD1qbGln/Z46Jg5JIyaqb/9kBqfFc+f5I3hr4z4Wb7UJdI0JNisKpkdryz0DzH3ZdeTvtpmFFGYl8hMbdDYm6KwomB6tLa9lQHIsg1LjHHn92Cg3//G5MZRWH+EvH+5wJIMx/UXvn5JqIkptQwu7ahq4cHQOcpLTY/ubt3z3Sa87OjeF/3p3KyikJcR8ZtmNZ+afajRjDLanYHqwrrwOgIl5gV9VLViuGJ+LKizYsM/pKMZELCsK5rjWldeSlx5PZlKs01FIT4zh3FHZbNhTR0nlYafjGBORrCiYblUdaqairqnPD0M9nllF2WQkxvD62graOjqcjmNMxLGiYLq1trwWASYMdr7r6Khot4srxudSdbiZpSU1TscxJuJYUTDdWr+njoKsxKBOfncyTstN4bSBybxfXEldY6vTcYyJKFYUTJeqDjVTdaiZsYNSnI7SpSsmDKKjQ3lzw16noxgTUawomC5t2lsPwJjc0CwKGYkxzBqZzbryOkqrbNDZmN5iRcF0aVNFHYPT4o85HyCUnDsym/SEaOavraC13QadjekNVhTMMeobWyk72MiYEO06Oira7eLy8YOoPNTMM0t3Oh3HmIhgRcEcI9S7jvyNzk1mZE4Sv/3nNirrm5yOY0zYs6JgjrFpbz2ZiTEMSHb+hLWeiAhXTBhES1sHv1xg02sbc6qsKJjPqGtspbTqMGMGpfTKXEd9ISsplm+eO4zX1lTw9kabAsOYU2FFwXzG4q1VdGh4dB35u/P8IsYOSuGHr66n6lCz03GMCVtWFMxnfLClkvhoN0MyEpyOckJiolz89rrTOdzcxg9fXYeqOh3JmLAUUFEQkf8RkctFxIpIBGvvUBZurWLUwGRcYdJ15K8oJ5kfzDmNf26u5NmPdzkdx5iwFOiH/OPAjcA2Efm1iJwWxEzGIWvLazlwpIVROclORzlpt5xVwAWnDeDn/7uJVbsPOh3HmLATUFFQ1X+q6peAycBO4F0RWSoit4hItxPjiMgcESkWkRIRubebNl8UkU0islFE5p3ML2F6xwdbKnEJFOUkOR3lpLlcwsNfPJ2BqXHc8fwqag7b+IIxJyLg7iARyQRuBm4DVgOP4CkS73bT3g08BlwKjAFuEJExndoUAT8EzlbVscDdJ/4rmN7y/pZKpgxNJyEmvC/Il5oQzeNfmsKBhha+9fwqmlrtus7GBCrQMYVXgSVAAvA5Vb1SVV9S1buA7r5WTgNKVLVUVVuAF4G5ndp8HXhMVQ8CqGrlyfwS5tTtr29iY0U95502wOkovWL
c4FQeunYin+w8wD0vr6GjwwaejQlEoHsKf1bVMar6K1XdCyAisQCqOrWbdQYDZX6Py73P+RsJjBSRj0RkmYjM6WpDInK7iKwUkZVVVVUBRjYn4oMtnnp8foQUBYArJw7ivstHs2D9Pu5/Y5MdkWRMAALtJ/g5sKDTcx/j6T7qTleHr3T+XxkFFAGzgTxgiYiMU9Xaz6yk+iTwJMDUqVPtf3YQLNpaRW5qHKNyklm1q7bnFULMvOW7u3w+ISaKs4dn8vTSnRTvP8Tl43NP6MiqG8/M762IxoSF4xYFERmI59t9vIhM4v8+6FPwdCUdTzkwxO9xHlDRRZtlqtoK7BCRYjxFYkVg8U1vaGvv4KOSai4dlxs2ZzGfiMvG5wLw0fYaWlo7uHry4LA85NaYvtDTnsIleAaX84CH/Z4/BPyoh3VXAEUiUgjsAa7Hc1irv9eAG4CnRSQLT3dSaUDJTa9Zt6eO+qY2Zo7McjpKUIgIl43PJTbazftbKqlrauWLU4eQFBveA+rGBMNx/1eo6jPAMyLyBVX9nxPZsKq2icidwNuAG3hKVTeKyP3ASlWd7112sYhsAtqB76mqXXi3jy3ZWo0InD08MosCeArDhaNzSI2P5vW1FTz6/jZumJbP0MxEp6MZE1J66j66SVWfBwpE5J7Oy1X14S5W81++gE5jEar6Y7/7CtzjvRmHLNlWxYTBqaQnhu4FdXrLGQUZDEqLZ97yXTy5uJSpBelcPGYgibbXYAzQc/fR0a9R4Xs2kzmu+qZWVpfVcsfs4U5H6TOD0+K56/wi3t9SydLt1azfU8f0YZlML8wkJf6z52J2N4DdG2wQ24SinrqPnvD+/GnfxDF97ePtNbR3KDOLsp2O0qfiot1cNj6XKUPTeXfTfhYVV7F4axWjcpI5LTeFUTnJxxQIY/qDgPaZReRBPIelNgJvAROBu71dSyaMLdlWRWKMm0n5aU5HcUROShw3TR/KgSMtLCutYcOeOjbvOwRAcmwUuWlxZCTGkBIXTUp8tOdnXBQp8dHERbsdTm9M7wu0I/ViVf2+iFyN5zDSa4EPACsKYW7JtmpmDM8i2t2/J8DNSIzhsvG5XDpuIPvqmyitOkJFbSN765rYfaCBptaOY9aJiXL5rlCXmxrP8AFJ5KbG2eGuJqwFWhSO7kdfBrygqgci8Xj2/mZXzRF21TRw6zmFTkcJGSJCbmo8uanxn3m+pa2D+qZW6htbvT/bqG9qpfpwM7tqGlhbXgcbITE2iklD0pg+LJOMfjBwbyJPoEXhdRHZgqf76A4RyQbsKulhbsm2aoB+N55wMmKiXGQlxZKV1PV1qw81tVJSeZjNe+tZur2aj0qqGTc4lcvG55JqYxMmjARUFFT1XhF5AKhX1XYROcKxk9uZMLNkWxV56fEUZIbXVdZCUXJcNJPy05mUn05dYyvLSmv4qKSarfsPcdGYHKYPy7RuJRMWTuTg7NF4zlfwX+fZXs5j+khrewdLS2q4YuKgiJzawkmp8dFcMnYgZxRk8I81e3hj3V521jRwzeQ8YqL699iNCX2BHn30HDAcWIPnzGPwTG5nRSFMrS2r5VBzG7OKIvcsZqdlJMZw81kFfFhSzVsb9lHb0MJN04eSEmfdSSZ0BbqnMBUYozb3cMRYvK0al8BZETy1RSgQEWYWZZOZGMtLK3fz5yU7uH3WMJt3yYSsQPdlNwADgxnE9K0l26qYOCSN1AT71toXxgxK4eazCqltaOGZpTtptqvBmRAVaFHIAjaJyNsiMv/oLZjBTPDUNbSytqzWjjrqY4VZidwwLZ+9dY08v3wXre3HnvtgjNMC3Yf9STBDmL61dHs1HYqNJzhgdG4KV0/K439WlfPQO8X88NLRTkcy5jMC2lNQ1UXATiDae38FsCqIuUwQLd5WTXJsFBOH9M+pLZw2ZWg6ZxRk8MSiUt7bvN/pOMZ8RkBFQUS+DrwCPOF9ajCeC+SYMKOqLN5axYzhmf1+agsnXTEhl9G
5Kdzz8lr21DY6HccYn0A/Fb4NnA3UA6jqNiByrvDej+ysaWBPbSMzR9p4gpOi3S7+8KXJtLZ38MNX12MH9plQEWhRaFbVlqMPvCew2V9xGFqyrQqw8YRQUJiVyPcvGcXirVX8Y03ny5cb44xAi8IiEfkREC8iFwF/A14PXiwTLIu3VpOfkWCXoQwRX55RwKT8NH76+kZqDjc7HceYgIvCvUAVsB74Bp5LbN4XrFAmOFrbO/h4ezUzbS8hZLhdwgNfmMDh5jZ+8b+bnY5jTMBHH3XgGVi+Q1WvUdU/2dnN4Wf17lqOtLTb+QkhZmROMrfPGsarq/ewevdBp+OYfu64RUE8fiIi1cAWoFhEqkTkx30Tz/SmJduqcLuEGcMznY5iOrlj9giyk2P52RubbNDZOKqnPYW78Rx1dIaqZqpqBnAmcLaIfCfo6UyvWrytmtOHpNn8/iEoMTaK7108ilW7a3lj3V6n45h+rKei8BXgBlXdcfQJVS0FbvIuM2GitqGFdeW1Np4Qwr4wJY8xuSn8+s0tNNncSMYhPRWFaFWt7vykqlbxf5foNGHgw5JqVO0qa6HM7RLuu3w0e2ob+evy3U7HMf1UT0Wh5SSXmRCzZGs1yXFRTMxLdTqKOY6zRmRx1vBMHl9YQkNLm9NxTD/UU1GYKCL1XdwOAeP7IqA5darKkm1VnD08iyib2iLkfffikVQfbuGZpbucjmL6oeN+QqiqW1VTurglq6p1H4WJ7VVHqKhrYpZNbREWpgzNYPaobJ5YvJ1DTa1OxzH9jH1t7AeOTm1hg8zh47sXjaK2oZWnPtzpdBTTz1hR6AeWbKumMCuRIRkJTkcxARqfl8qFowfw30t32NiC6VNWFCJcc1s7H2+vsb2EMPSt2SOobWjlhU/KnI5i+hErChHu050HaWxt55wRVhTCzZSh6UwrzODPS0ppabNLd5q+EdSiICJzRKRYREpE5N7jtLtGRFREpgYzT3+0cGsV0W7hLCsKYemO2cPZW9fEa2v2OB3F9BNBKwoi4gYeAy4FxgA3iMiYLtolA/8CLA9Wlv5sYXElZxRkkBQb6OW4TSg5d2Q2Y3JT+OOi7bR32JxIJviCuacwDShR1VLvBXpeBOZ20e5nwINAUxCz9EsVtY1s3X+Y2aPsUNRwJSJ8a/ZwSquO8M7GfU7HMf1AMIvCYMB/hKzc+5yPiEwChqjqG8fbkIjcLiIrRWRlVVVV7yeNUIu2et6r2aPsyqnh7NJxAxmamcDji7bbDKom6IJZFKSL53x/0SLiAv4L+G5PG1LVJ1V1qqpOzc62b72BWlhcyaDUOIoGJDkdxZyCKLeLb8wazrryOj4qqXE6jolwwSwK5cAQv8d5gP+FaJOBccBCEdkJTAfm22Bz72hp6+CjkhrOHTUAka7qswknX5gymAHJsTy+qMTpKCbCBbMorACKRKRQRGKA64H5Rxeqap2qZqlqgaoWAMuAK1V1ZRAz9Ruf7jrI4eY2G0+IELFRbm49p5CPSmpYU1brdBwTwYJ2SIqqtonIncDbgBt4SlU3isj9wEpVnX/8LZiezDvO9MpvbdiLS2DPwcbjtjPOOdF/lxi3i7hoF/f9fT03njm023Y3npl/qtFMPxbU4xRVdQGwoNNzXV7KU1VnBzNLf7N1/2GGZiYSF+12OorpJbHRbs4szGTx1ipqDjeTmRTrdCQTgeyM5ghU19jKvvomRuUkOx3F9LIZwzNxuYQPS4659pUxvcKKQgTauv8QACOtKESclLhoTh+S5hszMqa3WVGIQFv3HyIlLoqcFOteiEQzR2TR1qEsK7XDU03vs6IQYdo7lJLKw4zMSbZDUSPUgJQ4ThuYzLLSGpsoz/Q6KwoRZveBBprbOqzrKMLNLMqmoaWdT3cfdDqKiTBWFCLM1v2HcAmMsLOYI1pBZgJD0uP5qKSaDpv6wvQiKwoRZuv+Q+Rn2KGokU5EmFmUzYEjLWysqHc6jokgVhQiSG1DC3vrmhg10Lq
O+oMxg1LITIxhybYqmyjP9BorChFkyz7Poaijc60o9AcuEc4pyqL8YCOl1UecjmMihBWFCLJ5bz2ZiTFk25mu/cbk/HSSYqNYVGxTypveYUUhQjS1tlNadYTRuSl2KGo/Eu12cc6ILEqqDlN+sMHpOCYCWFGIENsqD9OuyujcFKejmD52ZmEGcdEuFtregukFVhQixOa99STEuMnPSHA6iuljsdFuZgzLYtPeevbX21VtzamxohAB2juU4n2HGJWTjNtlXUf90VnDM4l2CwuLK52OYsKcFYUIsOvAERpb263rqB9LjI1i+rBM1pXXUVJ52Ok4JoxZUYgAGyvqiXIJRTl2FnN/NrMom2i3i9+9t83pKCaMWVEIcx2qbNxTR1FOMrFRdhZzf5bk3Vt4fV0F27zTpxtzoqwohLnyg43UN7UxbpB1HRmYWZRFQrSbR2xvwZwkKwphbuOeOtwinDbQioLxjC3cfHYB/7t+Lxv21Dkdx4QhKwphTFXZUFHH8AGJxMdY15HxuH3WcFLjo3ngrS1ORzFhyIpCGNtb18TBhlbGDUp1OooJIanx0dx1fhFLtlWzeKud0GZOjBWFMLahog6XYIeimmPcND2fIRnx/OrNLXR02AyqJnBWFMKUqrK+vI7CrEQSY6OcjmNCTGyUm+9dchqb99bzyqpyp+OYMGJFIUyt31NHzZEWJualOR3FhKgrxucyZWg6D7y5hbqGVqfjmDBhRSFMzV9TgdsljLXxBNMNl0u4f+5YDja08Jt3bNDZBMaKQhhq71BeX1fByJxkO+rIHNfYQal8ZUYBf12+m3XltU7HMWHAikIYWr6jhv31zUzMs70E07N7Lh5JVlIsP3x1Pa3tHU7HMSHOikIYen1tBYkxbjthzQQkJS6an80dy8aKen5vZzqbHlhRCDMtbR0sWL+Pi8cOJCbK/vlMYOaMy+Xzkwfz2MLtrN590Ok4JoTZp0qYeX/LfuoaW7ny9EFORzFh5idXjiUnOZbvvryWI81tTscxIcqKQph5eWU5A1PimFWU7XQUE2ZS4qJ56IsT2VlzhO+9shZVO6nNHCuoRUFE5ohIsYiUiMi9XSy/R0Q2icg6EXlPRIYGM0+421/fxMLiSj4/ebBdYc2clLOGZ3HvpaexYP0+/rBwu9NxTAgK2qmwIuIGHgMuAsqBFSIyX1U3+TVbDUxV1QYR+RbwIHBdsDKFu1dX7aFD4dqpQ5yOYsLY12cOY1NFPQ+9U0zRgCQuHjswoPXmLd8dtEw3npkBCC2cAAARZUlEQVQftG2bExPMPYVpQImqlqpqC/AiMNe/gap+oKoN3ofLgLwg5glrqsrfVpZxRkE6hVmJTscxYUxE+PUXJjAhL407561mkU2aZ/wEsygMBsr8Hpd7n+vOrcCbXS0QkdtFZKWIrKyq6p9/wKt2H6S0+ojtJZheERft5plbzmDEgCRuf3YlS0uqnY5kQkQwi0JXnd5djmyJyE3AVOA3XS1X1SdVdaqqTs3O7p8DrPOWl5EQ4+by8blORzERIi0hhudvO5OCzERufnoFr63e43QkEwKCWRTKAf+vtXlARedGInIh8O/AlaraHMQ8YevAkRZeX1fB1ZMG24yopldlJMbwwu3TmTQkjbtfWsOv3txMu0213a8FsyisAIpEpFBEYoDrgfn+DURkEvAEnoJQGcQsYe3llWW0tHXwlRkFTkcxESgj0bPHcNP0fJ5YVMrVf/jILuXZjwWtKKhqG3An8DawGXhZVTeKyP0icqW32W+AJOBvIrJGROZ3s7l+q71Dee7jXZxZmMGogclOxzERKtrt4udXjef3N0yioraJKx/9kPteW8/umoaeVzYRJah9Eaq6AFjQ6bkf+92/MJivHwk+2FLJntpGfnTZaKejmH7gcxMHMasom9+8s4WXVpQxb/luLh4zkCsm5tLU2k5ctM3KG+msgzrEPbtsFzkpsVw8NsfpKKafSE2I5udXjeeu84t4eulOXlpRxlsb9+E
WITctjkGp8eSkxJISH01KXDTxMW7iot3ERbmIctskCeHOikII27KvnsVbq7jnopFE238208dyUuL4wZzT+LeLR7Fq90F+/942yg42sm5PLU07u56CO8olxHoLRFy0m9hoF3FRbpLioshOimVAcix56Ql2HZAQZkUhhD2xqJSEGDdfnm6zfxjnuF3CGQUZzBnnORxaVTnc3EZ9UxuHGltpbG2nqa2D5tZ2mlrbaWrtoKnt/+5XNzWzo/oIja3tgOdY9ZyUOEYMSGLc4FSGpMc7+NuZzqwohKjygw3MX1vBV2cUkJ4Y43QcY3xEhOS4aJLjoiEtsA90VeVISzv765vYVdPAzpojfFxaw4cl1aQlRFPX1MoNZ+Tb33oIsKIQov68ZAcC3Daz0OkoxpwyESEpNoqk7CSGZycB0NTazua99Xy6+yAPvlXMI//cxg3T8vn2eSPITo51OHH/ZUUhBB040sKLK3Yz9/TBDArwm5gx4SYu2s2k/HQm5aczZWg6f/mwlOeW7eLllWXcdk4h35o9wsYeHGCjlyHoycWlNLd18M1zhzkdxZg+MWpgMg9eM5F3vzOL808bwO/eL+Gi/1rEB1vsnNa+ZkUhxFTWN/H00h3MnTiIohw7Wc30L8Oyk3j0xsm88PXpxEW7ueXpFdzz0hrqm1qdjtZvWPdRiPn9+yW0tSt3XzjS6SgmTAXzugfB0lXmr8wYysLiKl5bs4f3tlRy7ZQ8hnnHIwJl12k4cbanEEJ21zTwwie7+eIZQyiwayaYfi7K5eLC0Tl8Y9Zwot3CXz7cwaLiSjrsMqJBZUUhhDz8bjFul/Av5xc5HcWYkDEkI4FvnzeC8XmpvL1pP39dtovGlnanY0UsKwohYsXOA7y2poJbzylkYGqc03GMCSmxUW6umzqEKybkUrz/EI8tLKGittHpWBHJikIIaGvv4Mf/2Ehuahx3nj/C6TjGhCQR4azhWdw+cxht7R38cdF2Vu066HSsiGNFIQTM+2Q3m/fWc9/lY0iIsbF/Y44nPzORO88vIj8zgVdWlfP6ugq7MFAvsqLgsMr6Jh56u5izR2Ry2fiBTscxJiwkxUZxy1mFnD08k4+31/DfH+3gSHOb07EighUFB6kq9766nua2Du6fOw6Rri5rbYzpitslXD5hENdMyWP3gQYbZ+glVhQc9NKKMt7fUsm9l57mmw/GGHNiJuenc/usYajCE4u3s6681ulIYc2KgkPKDjTwszc2MWNYJl+1ay8bc0ry0hO4Y/ZwBqXG8+KKMt7asM/OZzhJVhQc0NTazh1/XYVLhN9cOwGXy7qNjDlVyXHR3DqzkGmFGSzeVsWzH++krsGmxzhRVhT6mKryo7+vZ/2eOh6+7nTy0hOcjmRMxIhyubjq9MFcdfpgtlceYe5jH7Jt/yGnY4UVKwp97OmlO3l11R7uvrCIi8bYdZeNCYZphRncNrOQw83tXPXYR/xtZRlq3UkBsaLQh/533V5+9sYmLhydY1NZGBNkQzMTeeOucxg3OJXvvbKOO19Ybd1JAbCi0EcWFldy90urmZyfzu9uON3GEYzpAwNT45j39el8f84o3t6wjzmPLGZZaY3TsUKaFYU+sHhrFd98/lNG5iTz1C1n2FnLxvQht0u4Y/YI/udbZxEX7eaGPy3jV29upqnVJtXrihWFIPvbyjK+9vQKCrOSePZr00iJi3Y6kjH90sQhabxx1zlcN3UITywq5ZLfLubDbdVOxwo5VhSCpL1DefidYr73yjqmD8vk5W9MJzPJLkZujJMSY6P49RcmMO+2M3GJcNNflvOt5z9lV80Rp6OFDOvHCIKK2kbufmkNn+w4wLVT8vjF1eOJibL6a0yoOGtEFm/+60z+tLiUxxdt573Nldw0fSjfnD2MAcn9e+p6Kwq9qL1DeeGT3fzm7WJa2zt4+IsT+fzkPKdjGWO6EBft5q4LivjiGUP4z3eKeebjnfx1+S6+dOZQbjm7gCEZ/fMcIisKvUBV+bC
kml+/uYWNFfVMH5bBL68ef8LXkzXG9L2clDgevGYid8wewaMflPDMxzt5eukOLhqTw5fOHMrZI7Jw96OjBa0onILmtnbe3bSfJxaVsn5PHbmpcfz+hklcMSHXZjw1JswUZCXy0LUT+e7FI3nu413M+2Q3b2/cT25qHFeePog5YwcyMS8t4g8nl3A7y2/q1Km6cuVKx16/ua2dT3Yc4J2N+5m/toK6xlYKMhP4xrnD+fzkwcRGufssy7zlu/vstYwJRzeemX/S6za1tvPe5kr+9mkZH26rpq1DyUmJZcawTKYVZjKtMIPh2Ylh8wVQRD5V1ak9tQvqnoKIzAEeAdzAn1X1152WxwLPAlOAGuA6Vd0ZzEwnorW9gz0HG9lRfYQ1ZbWsLqtl5c4DNLS0Exvl4pKxA7lmSl6/2700pj+Ii3Zz+YRcLp+QS11DK+9t2c97myv5sKSG19ZUAJCZGMOk/HRG5iQxPDuJ4QOSGJadGNaHngetKIiIG3gMuAgoB1aIyHxV3eTX7FbgoKqOEJHrgQeA64KRp6GljdqGVppa22lu6/D9PNzURm1jK7UNLRxsaOFgQyt7Djayq+YI5QcbafNe5k8ERuUk84XJeZx3WjYzhmURH9N3ewXGGOekJkTz+cl5fH5yHqrKzpoGPtlRw/IdB1hbVsvC4krfZwVARmIMA5Jjyfa7pSfEkBQbRXJcFEmxUSTGRhEb5SLaffQmRLtdxES5iHJJl18046LdxEUH93MnmHsK04ASVS0FEJEXgbmAf1GYC/zEe/8V4FEREQ1Cn9YzS3fxwFtbjtvG7RLS4qPJTYtj7OBUrpgwiKGZCRRkJTI6N4WkWBuCMaa/ExEKsxIpzErkujM83VOt7R3sPtDA9srDlFQdZs/BRioPNVN5qJnSqiNUHWqmpb3jlF/751eN46bpQ095O8cTzE+5wUCZ3+Ny4Mzu2qhqm4jUAZnAZ04zFJHbgdu9Dw+LSPEpZsvq/BphwnL3Lcvdt3o995d6c2Pd67P3+8sPwJdPfvWAqkkwi0JXneyd9wACaYOqPgk82RuhAERkZSADLqHGcvcty923LHdoCOZptuXAEL/HeUBFd21EJApIBQ4EMZMxxpjjCGZRWAEUiUihiMQA1wPzO7WZD3zVe/8a4P1gjCcYY4wJTNC6j7xjBHcCb+M5JPUpVd0oIvcDK1V1PvAX4DkRKcGzh3B9sPJ00mtdUX3Mcvcty923LHcICLuT14wxxgSPTd1pjDHGx4qCMcYYn4guCiLyHRHZKCIbROQFEYnrtDxWRF4SkRIRWS4iBc4k/awAct8sIlUissZ7u82prP5E5F+9mTeKyN1dLBcR+Z33/V4nIpOdyNlZALlni0id3/v9Y4dyPiUilSKywe+5DBF5V0S2eX+md7PuV71ttonIV7tqEyynmLvd733vfKBKUHWT+1rv30mHiHR7GKqIzBGRYu/f+r19k7iXqGpE3vCcGLcDiPc+fhm4uVObO4A/eu9fD7wUJrlvBh51OmunTOOADUACngMY/gkUdWpzGfAmnvNTpgPLwyT3bOCNEMg6C5gMbPB77kHgXu/9e4EHulgvAyj1/kz33k8P9dzeZYdD7P0eDYwCFgJTu1nPDWwHhgExwFpgjNN/P4HeInpPAc9/8njvORAJHHuexFzgGe/9V4ALJDSmPOwpdygaDSxT1QZVbQMWAVd3ajMXeFY9lgFpIpLb10E7CSR3SFDVxRx7Ho//3/AzwFVdrHoJ8K6qHlDVg8C7wJygBe3kFHI7qqvcqrpZVXuaUcE3xY+qtgBHp/gJCxFbFFR1D/AQsBvYC9Sp6judmn1mmg3g6DQbjgkwN8AXvF0wr4jIkC6W97UNwCwRyRSRBDx7BZ1zdTX1yeA+ytedQHIDzBCRtSLypoiM7duIx5WjqnsBvD8HdNEmFN/3QHIDxInIShFZJiIhVzi6EYrvd8Aitih4+yjnAoXAICBRRG7q3KyLVR09RjfA3K8DBao6AU9
3xzM4TFU345nl9l3gLTy7zG2dmoXc+x1g7lXAUFWdCPweeK1PQ566kHvfT0C+eqaQuBH4rYgMdzpQAML5/Y7cogBcCOxQ1SpVbQVeBc7q1CYUp9noMbeq1qhqs/fhn/Bcj8JxqvoXVZ2sqrPwvI/bOjUJZOqTPtdTblWtV9XD3vsLgGgRyXIgalf2H+2C8/6s7KJNKL7vgeRGVSu8P0vx9ONP6quApyAU3++ARXJR2A1MF5EE7zjBBcDmTm1CcZqNHnN36oe/svNyp4jIAO/PfODzwAudmswHvuI9Cmk6nq6xvX0c8xg95RaRgUfHmkRkGp7/NzV9nbMb/n/DXwX+0UWbt4GLRSTduyd6sfc5J/WY25s31ns/Czibz069H6oCmeIndDk90h3MG/BTYAuefuPngFjgfuBK7/I44G9ACfAJMMzpzAHm/hWwEU9XxwfAaU5n9uZaguc/7VrgAu9z3wS+6b0veC68tB1YTzdHb4Rg7jv93u9lwFkO5XwBzzhTK55vo7fiGQN7D8/ezXtAhrftVDxXOzy67te8f+clwC3hkBvPHvJ67/u+Hrg1BHJf7b3fDOwH3va2HQQs8Fv3MmCr92/9353+Gz+Rm01zYYwxxieSu4+MMcacICsKxhhjfKwoGGOM8bGiYIwxxseKgjHGGB8rCsZ0IiILReSSTs/dLSJ/OM46h4OfzJjgs6JgzLFe4NhLw17PsSfjGRNxrCgYc6xXgCv8zqYtwHNy0hoReU9EVonIehE5ZuZL77UX3vB7/KiI3Oy9P0VEFonIpyLydgjMEGvMMawoGNOJqtbgOcP96PTS1wMvAY3A1ao6GTgP+M9Ap1oXkWg8k+ldo6pTgKeAX/R2dmNOVZTTAYwJUUe7kP7h/fk1PNN0/FJEZgEdeKZDzgH2BbC9UXgu6POut4648UyhYExIsaJgTNdeAx72XjI0XlVXebuBsoEpqtoqIjvxzJ/lr43P7oEfXS7ARlWdEdzYxpwa6z4ypgvqmSp7IZ5unqMDzKlApbcgnAcM7WLVXcAY8Vz/OxXPLLcAxUC2iMwAT3dSiF2sxxjA9hSMOZ4X8FzP4uiRSH8FXheRlcAaPDPZfoaqlonIy8A6PDOArvY+3yIi1wC/8xaLKOC3eGZfNSZk2CypxhhjfKz7yBhjjI8VBWOMMT5WFIwxxvhYUTDGGONjRcEYY4yPFQVjjDE+VhSMMcb4/H+OnpzaDV0W+gAAAABJRU5ErkJggg==\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], - "source": [ - "labels = np.log(labels)\n", - "hist_plot(labels, 'log prices')" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "The distribution of the logarithm of price is more symmetric, but still shows some multimodal tendancy and skew. None the less, this is an improvement so we will use these values as our label.\n", - "\n", - "Execute the code in the cell below add the logarithmic transformation of the label to the dataset. 
" - ] - }, - { - "cell_type": "code", - "execution_count": 10, - "metadata": {}, - "outputs": [], - "source": [ - "auto_prices['log_price'] = labels" - ] - }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Prepare the model matrix.\n", "\n", - "All Scikit-learn models require a numpy array of numeric only values for the features. The resulting array is often referred to as the **model matrix**. \n", + "All scikit-learn models require a numpy array of numeric only values for the features. The resulting array is often referred to as the **model matrix**. \n", "\n", - "To create a model matrix from cases with both numeric and categorical varibles requires two steps. First, the numberic features must be rescaled. Second, the categorical variables must be convereted to a set of **dummy variables** to encode the presence or not of each category. " + "To create a model matrix from cases with both numeric and categorical variables requires two steps. First, the numeric features must be rescaled. Second, the categorical variables must be converted to a set of **dummy variables** to encode the presence or not of each category. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ - "### Rescale numeric features\n", - "\n", - "Numeric features must be rescaled so they have a similar range of values. Rescaling prevents features from having an undue influence on model training simply because then have a larger range of numeric variables. \n", - "\n", - "The code in the cell below creates a numpy model matrix of the numeric features in two steps:\n", - "1. The subset of the variables containing the numeric features are converted to a numpy array. \n", - "2. The `scale` function for the `sklearn.preprocessing` package is applied to these features. \n", + "### Create dummy variables from categorical features\n", "\n", - "Execute this code to create the first few columns of the feature matrix. 
\n", - "\n", - "***\n", - "**Note:** You can safely ignore the warning that integer values are being coerced to floating point. This step is necessary to scale the feature. \n", - "***" - ] - }, - { - "cell_type": "code", - "execution_count": 11, - "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "C:\\Users\\StevePC2\\Anaconda3\\lib\\site-packages\\sklearn\\utils\\validation.py:475: DataConversionWarning: Data with input dtype int64 was converted to float64 by the scale function.\n", - " warnings.warn(msg, DataConversionWarning)\n" - ] - } - ], - "source": [ - "Features = np.array(auto_prices[['curb_weight', 'horsepower', 'city_mpg']])\n", - "Features = preprocessing.scale(Features)" - ] - }, - { - "cell_type": "markdown", - "metadata": {}, - "source": [ - "### Create dummy variables from categrorical features\n", - "\n", - "Now, you must create dummy variables for the categrorical features. Dummy variables encode categorical features as a set of binary variables. There is one dummy variable for each possible category. For each case all of the values in the dummy variables are set to zero, except the one corresponding to the category value, which is set to one. In this way, a categorical variable with any number of categories can be encoded as series of numeric features which Scikit-learn can operate on. This procecss is referred to as **one hot encoding** since only one dummy variable is coded as 1 (hot) per case. \n", + "Now, you must create dummy variables for the categorical features. Dummy variables encode categorical features as a set of binary variables. There is one dummy variable for each possible category. For each case all of the values in the dummy variables are set to zero, except the one corresponding to the category value, which is set to one. In this way, a categorical variable with any number of categories can be encoded as series of numeric features which scikit-learn can operate on. 
This process is referred to as **one hot encoding** since only one dummy variable is coded as 1 (hot) per case. \n", "\n", "The `sklearn.preprocessing` package contains functions to encode categorical features as dummy variables in two steps;\n", "1. The categories are encoded as numbers starting with 0. For example, if there are 5 categories, they are encoded as the set $\\{ 0,1,2,3,4 \\}$.\n", @@ -765,39 +109,25 @@ "\n", "1. An encoder object is created using the `LabelEncoder` method.\n", "2. The encoder is `fit` to the unique string values of the feature. \n", - "3. The `transformation` method then applies the numeric encoding to the origninal feature. \n", + "3. The `transformation` method then applies the numeric encoding to the original feature. \n", "\n", "Execute the code in the cell below and examine the result. " ] }, { "cell_type": "code", - "execution_count": 12, + "execution_count": null, "metadata": { "scrolled": false }, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "['convertible' 'hatchback' 'sedan' 'wagon' 'hardtop']\n", - "[0 0 2 3 3 3 3 4 3 3 3 3 3 3 3 3 3 2 2 3 2 2 2 2 3 3 3 4 2 2 2 2 2 2 3 4 2\n", - " 2 3 3 3 3 3 2 3 3 3 2 2 2 3 3 2 3 2 3 3 2 3 3 3 4 1 3 3 0 3 1 2 2 2 2 2 2\n", - " 2 2 2 2 3 3 3 3 3 3 3 3 4 3 2 3 4 1 2 3 3 4 3 2 2 2 3 3 4 4 3 3 4 4 3 3 3\n", - " 2 2 2 3 3 4 2 2 1 1 0 2 3 2 3 2 3 2 2 2 3 3 3 3 3 4 4 4 4 2 2 2 4 4 4 3 2\n", - " 3 2 3 2 3 3 2 3 2 1 1 2 1 2 0 3 3 2 3 2 2 2 3 4 3 3 3 3 3 3 3 0 2 3 3 4 3\n", - " 4 3 4 3 4 3 3 3 3 3]\n" - ] - } - ], + "outputs": [], "source": [ "print(auto_prices['body_style'].unique())\n", - "cat_features = auto_prices['body_style']\n", + "Features = auto_prices['body_style']\n", "enc = preprocessing.LabelEncoder()\n", - "enc.fit(cat_features)\n", - "enc_cat_features = enc.transform(cat_features)\n", - "print(enc_cat_features)" + "enc.fit(Features)\n", + "Features = enc.transform(Features)\n", + "print(Features)" ] }, { @@ -808,7 +138,7 @@ "\n", "For the next step 
in the process, the numerically coded categorical variable is converted to a set of dummy variables following these steps:\n", "1. A one hot encoder object is created using the `OneHotEncoder` method from the `sklearn.preprocessing` module.\n", - "2. The numericaly coded categorical feature is fit with the one hot encoder. \n", + "2. The numerically coded categorical feature is fit with the one hot encoder. \n", "3. The dummy variables are encoded using the `transform` method on the encodings.\n", "\n", "Execute the code in the cell below and examine the result. " @@ -816,34 +146,16 @@ }, { "cell_type": "code", - "execution_count": 13, - "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "array([[1., 0., 0., 0., 0.],\n", - " [1., 0., 0., 0., 0.],\n", - " [0., 0., 1., 0., 0.],\n", - " [0., 0., 0., 1., 0.],\n", - " [0., 0., 0., 1., 0.],\n", - " [0., 0., 0., 1., 0.],\n", - " [0., 0., 0., 1., 0.],\n", - " [0., 0., 0., 0., 1.],\n", - " [0., 0., 0., 1., 0.],\n", - " [0., 0., 0., 1., 0.]])" - ] - }, - "execution_count": 13, - "metadata": {}, - "output_type": "execute_result" - } - ], + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], "source": [ "ohe = preprocessing.OneHotEncoder()\n", - "encoded = ohe.fit(enc_cat_features.reshape(-1,1))\n", - "encoded_array = encoded.transform(enc_cat_features.reshape(-1,1)).toarray()\n", - "encoded_array[:10,:]" + "encoded = ohe.fit(Features.reshape(-1,1))\n", + "Features = encoded.transform(Features.reshape(-1,1)).toarray()\n", + "Features[:10,:]" ] }, { @@ -852,32 +164,18 @@ "source": [ "Notice that the `body_style` feature has been encoded as five columns. Each of these columns is a dummy variable representing one category. Each row has one and only one dummy variable with a 1, and the rest 0s. This is the one hot encoding. \n", "\n", - "Now, you need to one hot encode all five categorical variables and append them as columns to the model matrix with the scaled numeric variables. 
The code in the cell below executes a `for` loop that calls the `encode_string` function and uses the numpy `concatenate` function to add the dummy variables to the model matrix. The `encode_string` function uses the same process discussed above. Execute this code and verify the result." + "Now, you need to one hot encode all five categorical variables and append them as columns to the model matrix with the scaled numeric variables. The code in the cell below executes a `for` loop that calls the `encode_string` function and uses the numpy `concatenate` function to add the dummy variables to the model matrix. The `encode_string` function uses the same process discussed above. \n", + "\n", + "Execute this code, verify the result, and answer **Question 1** on the course page." ] }, { "cell_type": "code", - "execution_count": 14, - "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "(195, 28)\n", - "[[-0.02101769 0.2045987 -0.68510498 1. 0. 0.\n", - " 0. 0. 1. 0. 0. 0.\n", - " 0. 1. 0. 0. 0. 0.\n", - " 1. 0. 0. 0. 0. 1.\n", - " 0. 0. 0. 0. ]\n", - " [-0.02101769 0.2045987 -0.68510498 1. 0. 0.\n", - " 0. 0. 1. 0. 0. 0.\n", - " 0. 1. 0. 0. 0. 0.\n", - " 1. 0. 0. 0. 0. 1.\n", - " 0. 0. 0. 0. 
]]\n" - ] - } - ], + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], "source": [ "def encode_string(cat_feature):\n", " ## First encode the strings to numeric categories\n", " enc = preprocessing.LabelEncoder()\n", " enc.fit(cat_feature)\n", " enc_cat_feature = enc.transform(cat_feature)\n", " ## Now, apply one hot encoding\n", " ohe = preprocessing.OneHotEncoder()\n", - " encoded = ohe.fit(enc_cat_features.reshape(-1,1))\n", - " return encoded.transform(enc_cat_features.reshape(-1,1)).toarray()\n", + " encoded = ohe.fit(enc_cat_feature.reshape(-1,1))\n", + " return encoded.transform(enc_cat_feature.reshape(-1,1)).toarray()\n", " \n", "\n", - "categorical_columns = ['fuel_type', 'aspiration', 'body_style', \n", - " 'drive_wheels', 'num_of_cylinders']\n", + "categorical_columns = ['fuel_type', 'aspiration', 'drive_wheels', 'num_of_cylinders']\n", "\n", "for col in categorical_columns:\n", " temp = encode_string(auto_prices[col])\n", @@ -905,7 +202,33 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Notice that the model matrix now has 28 columns, 3 numeric features and 25 dummy variables which encode the five categorical features. " + "Notice that the model matrix now has 14 features which encode the five original categorical features. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Add the numeric features\n", "\n", "To complete the model matrix, execute the code in the cell below to concatenate the three numeric features." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "Features = np.concatenate([Features, np.array(auto_prices[['curb_weight', 'horsepower', 'city_mpg']])], axis = 1)\n", "Features[:2,:]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There are now 17 features, 14 dummy variables and 3 numeric features. 
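The model-matrix assembly described above amounts to a column-wise concatenation; a small self-contained sketch (made-up numbers, not the lab's actual matrix):

```python
import numpy as np

# Hypothetical pieces: 4 cases, 3 dummy columns plus 2 numeric columns.
dummies = np.array([[1., 0., 0.],
                    [0., 1., 0.],
                    [0., 1., 0.],
                    [0., 0., 1.]])
numeric = np.array([[2548., 111.],
                    [2823., 154.],
                    [2337., 102.],
                    [2844., 115.]])

# axis=1 appends the numeric columns to the right of the dummy columns.
model_matrix = np.concatenate([dummies, numeric], axis=1)
print(model_matrix.shape)  # (4, 5)
```

Keeping the dummy columns first makes it easy to slice out the numeric block later, e.g. for scaling.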
" ] }, { @@ -919,12 +242,13 @@ }, { "cell_type": "code", - "execution_count": 15, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ "## Randomly sample cases to create independent training and test data\n", "nr.seed(9988)\n", + "labels = np.array(auto_prices['log_price'])\n", "indx = range(Features.shape[0])\n", "indx = ms.train_test_split(indx, test_size = 40)\n", "x_train = Features[indx[0],:]\n", @@ -933,25 +257,58 @@ "y_test = np.ravel(labels[indx[1]])" ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Rescale numeric features\n", + "\n", + "Numeric features must be rescaled so they have a similar range of values. Rescaling prevents features from having an undue influence on model training simply because then have a larger range of numeric variables. \n", + "\n", + "The code in the cell below uses the `StandardScaler` function from the Scikit Learn preprocessing package to Zscore scale the numeric features. Notice that the scaler is fit only on the training data. The trained scaler is these applied to the test data. Test data should always be scaled using the parameters from the training data. \n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "scaler = preprocessing.StandardScaler().fit(x_train[:,14:])\n", + "x_train[:,14:] = scaler.transform(x_train[:,14:])\n", + "x_test[:,14:] = scaler.transform(x_test[:,14:])\n", + "print(x_train.shape)\n", + "x_train[:5,:]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that the numeric features have been rescaled are required. " + ] + }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Construct the linear regression model\n", "\n", - "With data prepared and split into training and test subsets, you will now compute the linear regression model. There are 28 features, so the model will require at least 28 coefficients. 
The equation for such a **multiple regression** problem can be writen as:\n", + "With data prepared and split into training and test subsets, you will now compute the linear regression model. With the dummy variables created there are 17 features, so the model will require 17 coefficients. There is no intercept specified since we are working with dummy variables. The equation for such a **multiple regression** problem can be written as:\n", "\n", "$$\hat{y} = f(\vec{x}) = \vec{\beta} \cdot \vec{x} + b\\ = \beta_1 x_1 + \beta_2 x_2 + \cdots + \beta_n x_n + b$$ \n", - "where; \n", + "where: \n", "$\hat{y}$ are the predicted values or scores, \n", "$\vec{x}$ is the vector of feature values with components $\{ x_1, x_2, \cdots, x_n \}$, \n", "$\vec{\beta}$ is the vector of model coefficients with components $\{ \beta_1, \beta_2, \cdots, \beta_n \}$, \n", "$b$ is the intercept term, if there is one.\n", "\n", - "You can think of the linear regression function $f(\vec{x})$ as the dot product between the beta vector $\vec{\beta}$ and the feature vector $\vec{x}$, plus the intercecpt term $b$.\n", + "You can think of the linear regression function $f(\vec{x})$ as the dot product between the beta vector $\vec{\beta}$ and the feature vector $\vec{x}$, plus the intercept term $b$.\n", "\n", "The code in the cell below uses the `sklearn.linear_model` module to compute a least squares linear model as follows:\n", - "1. A linear regresson model object is created with the `LinearRegression` method. Notice, that in this case, no intercept will be fit. The intercept value or **bias** will be accomodated in the coefficients of the dummy variables for the categorical features. \n", + "1. A linear regression model object is created with the `LinearRegression` method. Notice that, in this case, no intercept will be fit. The intercept value or **bias** will be accommodated in the coefficients of the dummy variables for the categorical features. \n", "2. 
The model is fit using the `fit` method with the numpy array of features and the label. \n", "\n", "Execute this code. " @@ -959,20 +316,9 @@ }, { "cell_type": "code", - "execution_count": 16, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "LinearRegression(copy_X=True, fit_intercept=False, n_jobs=1, normalize=False)" - ] - }, - "execution_count": 16, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ "## define and fit the linear regression model\n", "lin_mod = linear_model.LinearRegression(fit_intercept = False)\n", @@ -983,29 +329,14 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The model has been fit to the traning data. Execute the code in the cell below to examine the value of the intercept term and coefficients. " + "The model has been fit to the training data. Execute the code in the cell below to examine the value of the intercept term and coefficients. " ] }, { "cell_type": "code", - "execution_count": 17, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "0.0\n", - "[ 3.04458965e-01 1.68257692e-01 -1.62967987e-03 1.92995654e+00\n", - " 1.88843159e+00 1.85339386e+00 1.88038986e+00 1.84961133e+00\n", - " 1.92995654e+00 1.88843159e+00 1.85339386e+00 1.88038986e+00\n", - " 1.84961133e+00 1.92995654e+00 1.88843159e+00 1.85339386e+00\n", - " 1.88038986e+00 1.84961133e+00 1.92995654e+00 1.88843159e+00\n", - " 1.85339386e+00 1.88038986e+00 1.84961133e+00 1.92995654e+00\n", - " 1.88843159e+00 1.85339386e+00 1.88038986e+00 1.84961133e+00]\n" - ] - } - ], + "outputs": [], "source": [ "print(lin_mod.intercept_)\n", "print(lin_mod.coef_)" @@ -1016,7 +347,7 @@ "metadata": {}, "source": [ "As expected, the intercept term is `0.0`. Roughly speaking, you can interpret the coefficients of the model as follows: \n", - "1. 
The price of autos increases with weight (first coefficient), horsepower (second coefficinet) and weakly decreases with fuel efficiency (third coefficient). \n", + "1. The price of autos increases with weight (first coefficient), horsepower (second coefficient) and weakly decreases with fuel efficiency (third coefficient). \n", "2. The coefficients for the dummy variables are in a similar range, indicating the bias or intercept has been incorporated in these. " ] }, @@ -1026,27 +357,14 @@ "source": [ "## Evaluate the model\n", "\n", - "You will now use the test dataset to evaluate the performance of the regression model. As a first step, execute the code in the cell below to compute and display various performance metrics and examine the results. " + "You will now use the test dataset to evaluate the performance of the regression model. As a first step, execute the code in the cell below to compute and display various performance metrics and examine the results. Then, answer **Question 2** on the course page." ] }, { "cell_type": "code", - "execution_count": 18, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stdout", - "output_type": "stream", - "text": [ - "Mean Square Error = 0.023079885624992524\n", - "Root Mean Square Error = 0.1519206556890554\n", - "Mean Absolute Error = 0.12184546775225154\n", - "Median Absolute Error = 0.10035871414425745\n", - "R^2 = 0.9200320659887041\n", - "Adjusted R^2 = 0.7401042144632884\n" - ] - } - ], + "outputs": [], "source": [ "def print_metrics(y_true, y_predicted, n_parameters):\n", " ## First compute R^2 and the adjusted R^2\n", @@ -1069,27 +387,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "At first glance, these metrics look promising. The RMSE, MAE and median absolute error are all small and in a similar range. However, notice that the $R^2$ and $R^2_{adj}$ are rather different. This model has a large number of parameters compared to the number of cases available. 
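For reference, one common form of the adjusted R² penalty can be computed as below (illustrative numbers only; the lab's `print_metrics` function may use a slightly different variant):

```python
def r2_adjusted(r2, n_cases, n_parameters):
    # Shrinks R^2 as the parameter count approaches the number of cases.
    return 1.0 - (1.0 - r2) * (n_cases - 1) / (n_cases - n_parameters - 1)

# A high R^2 deflates sharply when parameters are numerous relative to cases:
print(round(r2_adjusted(0.9, 50, 30), 3))  # 0.742
```

With many more cases than parameters the penalty nearly vanishes, which is why a large gap between R² and adjusted R² signals possible over-fitting.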
This result indicates that the model may be over-fit and might not generalize well. \n", + "At first glance, these metrics look promising. The RMSE, MAE and median absolute error are all small and in a similar range. However, notice that the $R^2$ and $R^2_{adj}$ are rather different. This model has a large number of parameters compared to the number of cases available. This result indicates that the model may be overfit and might not generalize well. \n", "\n", "To continue the evaluation of the model performance, execute the code in the cell below to display a histogram of the residuals. " ] }, { "cell_type": "code", - "execution_count": 19, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl4HGeZ7/3v3VJrX63dsix53x07drxk32MSiEPIvkAgkBMCwzADcw7wzsvJwDDAwDBDCFsSQvZASEhI4iTOvnuJt3jfbdmyFsuStVm7dJ8/uiw6siy1JXVXd+v+XFdd7u6qrvqVZPXdz1NVT4mqYowxxgB43A5gjDEmfFhRMMYY08OKgjHGmB5WFIwxxvSwomCMMaaHFQVjjDE9rCiYoBORLSJyvts53CQinxWRgyLSJCJzg7idc0RkRz/zHxKRfx+G7ZSIiIpI7FDXZcKLFQUzJCKyX0Qu7vXabSLy/vHnqjpDVd8eYD3R/iHzc+DrqpqiquuDtRFVfU9VpwRr/Sb6WVEwI0IYFJtiYEsgC4ZBVjOCWVEwQeffmhCRBSKyRkQaRKRKRH7hLPau82+d08WyWEQ8IvKvIlIqIodF5BERSfdb7+edeTUi8v/32s7dIvK0iDwmIg3Abc62V4hInYhUiMi9IhLntz4VkbtEZJeINIrID0VkgvOeBhF5yn/5XvvYZ1YRiReRJiAG+FhE9pzk/SoiXxORXcAu57WpIvKaiNSKyA4Ruc5v+ctFZKuT85CIfNt5/XwRKfNbbq6IrHOW+zOQ4DfvEy06vxwTncdXiMh6Z98Pisjd/fyObxORvc529onIzSdb1oQ5VbXJpkFPwH7g4l6v3Qa839cywArgVudxCrDIeVwCKBDr974vAbuB8c6yfwUedeZNB5qAs4E4fN0zHX7budt5fhW+Lz+JwDxgERDrbG8b8E2/7SnwPJAGzADagDec7acDW4EvnOTncNKsfuue2M/PUYHXgFFO1mTgIPBFJ+/pwBFghrN8BXCO8zgTON15fD5Q5jyOA0qBfwK8wDXOz+Tf+/o99c7prGuW8/ObDVQBV/X+fTlZG4ApzryC4zltirzJWgpmODznfPuuE5E64Df9LNsBTBSRbFVtUtWV/Sx7M/ALVd2rqk3Ad4EbnO6Va4AXVPV9VW0Hvo/vQ8rfClV9TlW7VbVFVdeq6kpV7VTV/cDvgfN6veenqtqgqluAzcCrzvbrgZeBkx0k7i9roH
ycHQAAAABJRU5ErkJggg==\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "outputs": [], "source": [ "def hist_resids(y_test, y_score):\n", " ## first compute vector of residuals. \n", @@ -1114,20 +421,9 @@ }, { "cell_type": "code", - "execution_count": 20, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY0AAAEWCAYAAACaBstRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3XmcVNWd9/HPFwEjA66gxoXGR9pMXHBrNbMkMYpGBJfkMYshExMnNkRRFOLEhIygMzxxEgOKskhWZ9KJidlEZNXJPmpsjQTBENDYykgU2g2DhkZ+zx/3FlQXVd236a6uXr7v16tfVffeU/eeKpr69Tm/e85RRGBmZpZFn0pXwMzMug8HDTMzy8xBw8zMMnPQMDOzzBw0zMwsMwcNMzPLzEHDykbSKkmnlzh2uqT1HXSdn0v6dEecq6vLf6+Sxkpa1gnXHCYpJPXt4PN22O+AdR4HDUPSM5LekPS6pD9L+rakge09b0QcExE/74AqWhERURcRZ7dWTtI0Sd/pjDpZz+egYTnnRcRA4ATgRODzFa5Pj9fRf7mbdQYHDWsmIv4MLCUJHgBI2lPSzZKelfSCpHmS9kqPDZa0UNIrkl6S9CtJfdJjz0gamT7fK23BvCxpNXBK/nXT7o/hedvflvTv6fP90mtsTF+/UNJhxeovabikX0h6VdImSd8vUW6JpAkF+1ZI+qASMyW9mJ7n95KOzfL5pe/jKklPp9f/St7n8UlJv0nP/RIwLd1/qaQn0/e2VFJV3vnOkvSHtB63A8o79klJv87bPkbS8vTf4QVJX5B0DvAF4CNpS3JFWnYfSd+QtEHS/0r6d0l7pMf2SP+9N0l6Ghjdwvu9TtIPC/bdKmlW+vxT6XvbnH4m41r57Ir+DqTbYyQ9nv6u/Y+kEXnHPpe+j82S1kg6s9R1rH0cNKyZ9Mt4FLAub/d/AEeRBJLhwKHA9emxycB6YAhwEMkXVLG5aaYCR6Y/7wcuaUO1+gDfAqqAocAbwO0lyv4bsAzYDzgMuK1Eue8CF+c2JB2dnv8+4GzgPSTveV/gI0BjG+r7AaAGOAm4ALg079hpwNPAgcB0SReSfGYfJPkMfwV8L63TYOBHwBeBwcBTwD8Uu6CkQcD9wBLgEJJ/pwciYgnw/4DvR8TAiDg+fcmdwLa03Inpe87lhS4DxqT7a4CLWniv3wPOlbR3Wo89gA+TfL4AL6bn2hv4FDBT0kktnK+o9DXfBMYBBwB3AAvSP2jeAUwATomIQSS/X8+09RqWjYOG5fxU0mbgOZL/6FMBJInkS+SaiHgpIjaTfAl9NH1dE/B2oCoimiLiV1F8QrMPA9PTczwHzMpasYhojIgfRcSW9PrTgfeWKN5E8uV/SES8GRG/LlHuJ8AJeX/VjwV+HBF/Tc8xCPhbQBHxZERsyFpf4D/S9/kscAt5wQl4PiJui4htEfEGyZfgl9JrbCP5bHP1OhdYHRE/jIim9Fx/LnHNMcCfI+Kr6fveHBEPFyso6SCSPwyujoi/RMSLwEx2/pt+GLglIp6LiJeAL5V6oxHRADwGXJjuOgPYEhEPpcfvi4inIvELkoD+7tIfXUmXAX
dExMMR8VZE3An8FXgX8BawJ3C0pH4R8UxEPLUb17AMHDQs58L0r7TTSb4sB6f7hwADgEfTboFXSP6aHZIe/wpJq2RZ2v1wXYnzH0ISkHIaslZM0gBJd0hqkPQa8Etg31x3SoF/IenC+a2Su7cuLVKGNPjcx84vyo8Cdemx/yZpycwGXpA0P/eXdEaF7/OQEscgCXC35n22L6X1P5SCzywNxoWvzzmcpCWSRRXQD9iQd907SFo/FF6X1v+t8lttH2NnKwNJoyQ9lHaZvUISCAcXOUeWOk/O1Tc91+EkfxysA64m6e57UdJdkg5p4VzWDg4a1kz61+C3gZvTXZtIuoOOiYh905990qQ56V+0kyPi/wDnAZNK9CdvIPlPnjO04PgWkuCUc3De88nAO4DTImJvkq4jyOvfz6v/nyPisog4hOSv+Dn5/eQFvgdcLOnvgL2An+WdZ1ZEnAwcQ9JNdW2JcxRT+D6fz69iQdnngHF5n+2+EbFXRPwPBZ9Z2uo7nOKeI+n6K6bYNf8KDM675t4RcUx6vLV/q0J3A6enXZsfIA0akvYk6V67GTgoIvYFFlHk3y3V0u/AcyQt1fzPaUBEfA8gIr4bEf9IElyCpEvVysBBw4q5BThL0gkRsR34Gklf9IEAkg6V9P70+RglyWcBr5F0FbxV5Jw/AD6vJKl9GHBlwfHHgY+lSdhzaN79NIgkcL0iaX/SrrNiJH1IO5PkL5N8gRSrDyRfYFXAjSR9/tvTc5wi6TRJ/YC/AG+2cI5irk3f5+HARKBoMj41j+RzOSa99j6SPpQeuw84Rklyvi9wFc2/SPMtBA6WdHXazz9I0mnpsReAYUoT8mlX2zLgq5L2ltRH0pGScp/5D4CrJB0maT+gVOuR9HwbgZ+T5J3+FBFPpof6k3QbbQS2SRpFkjsppaXfga8B49N/F0n6G0mj0/f5DklnpEHqTZLflbb8e1kbOGjYLtIvgf8E/jXd9TmSLqiH0u6h+0n+8geoTrdfBx4E5pQYm3EDSTfHn0i+sP6r4PhEkpbKKyT5hZ/mHbuFpCWwCXiIpHuslFOAhyW9DiwAJkbEn0q8z78CPwZGktelQpK0/RpJ0GkgSYLfDKDkjqTFLVwf4B7gUZIvwfuAb5QqGBE/Ifmr+K70s32CJN9ARGwCPgTclNahGvhNifNsBs4i+Qz/DKwF3pcevjt9bJT0WPr8EyRf6qvT9/lDktwU6XtfCqwgyVf8uJX3C8nn1+xzTOt0FUkQepmk62pBC+co+TsQEfUkeY3b03OtAz6ZHt6T5DPalL73A0luLrAykBdhMus4kgKoTvvZzXoctzTMzCwzBw0zM8vM3VNmZpaZWxpmZpZZj5swbfDgwTFs2LBKV8PMrFt59NFHN0XEkNbK9bigMWzYMOrr6ytdDTOzbkVSplka3D1lZmaZOWiYmVlmDhpmZpaZg4aZmWXmoGFmZpk5aJiZdXN1dTBsGPTpkzzW1ZXvWj3ullszs96krg5qa2HLlmS7oSHZBhg7tuOv55aGmVk3NmXKzoCRs2VLsr8cHDTMzLqxZ59t2/72ctAwM+vGhpZYjLfU/vZy0DAz68amT4cBA5rvGzAg2V8ODhpmZt3Y2LEwfz5UVYGUPM6fX54kOFQ4aEg6R9IaSesklVy8XtJFkkJSTWfWz8ysOxg7Fp55BrZvTx7LFTCggkFD0h7AbGAUcDRwsaSji5QbRLI4/cOdW0MzMytUyZbGqcC6iHg6IrYCdwEXFCn3b8CXgTc7s3JmZrarSgaNQ4Hn8rbXp/t2kHQicHhELGzpRJJqJdVLqt+4cWPH19TMzIDKBg0V2bdjwXJJfYCZwOTWThQR8yOiJiJqhgxpdeEpMzPbTZUMGuuBw/O2DwOez9seBBwL/FzSM8C7gAVOhpuZVU4lg8YjQLWkIyT1Bz4KLMgdjIhXI2JwRAyLiGHAQ8D5EeG1XM3MKqRiQSMitgETgKXAk8APImKVpBslnV+pepmZWWkVne
U2IhYBiwr2XV+i7OmdUSczMyvNI8LNzCwzBw0zM8vMQcPMzDJz0DAzs8wcNMzMLDMHDTMzy8xBw8zMMnPQMDOzzBw0zMwsMwcNM7OeIqL1Mu3koGFm1t1t2gRXXAG1tWW/lIOGmVl3tXUrzJwJw4fDHXfAXnuVvbXhoGFm1t1EwMKFcNxxMGkSnHYarFgBs2aBiq1v13EcNMzMupNVq+D974fzzku2Fy6EJUvgmGM65fIOGmZm3UEub3H88fDII3DLLfDEEzB6dNlbF/kqup6GmZm1YutWmDMHbrgBNm+G8eOT5wccUJHqOGiYmXVFEbBoUZKz+OMf4eyzYcaMTuuGKqWi3VOSzpG0RtI6SdcVOT5e0kpJj0v6taSjK1FPM7NOtWoVnHMOjBmTbHdy3qIlFQsakvYAZgOjgKOBi4sEhe9GxHERcQLwZWBGJ1fTzKzz5OctfvvbiuUtWlLJlsapwLqIeDoitgJ3ARfkF4iI1/I2/wYo/3BHM7POtnVrEiCqq5PxFuPHw7p1MHEi9OtX6do1U8mcxqHAc3nb64HTCgtJugKYBPQHzih2Ikm1QC3A0KFDO7yiZmZl0UXzFi2pZEujWFtrl5ZERMyOiCOBzwFfLHaiiJgfETURUTNkyJAOrqaZWRl04bxFSyoZNNYDh+dtHwY830L5u4ALy1ojM7Ny6wZ5i5ZUMmg8AlRLOkJSf+CjwIL8ApKq8zZHA2s7sX5mZh2nG+UtWlKxnEZEbJM0AVgK7AF8MyJWSboRqI+IBcAESSOBJuBl4JJK1dfMbLd0w7xFSyo6uC8iFgGLCvZdn/d8YqdXysyso6xalQSLZcvgqKOSvMW553aLbqhSPPeUmVlH27QJJkzotnmLlngaETOzjlI4T9S4ccnzwYMrXbMO46BhZtZePSxv0RJ3T5mZtUc3HW+xuxw0zMx2Rzcfb7G73D1lZtYWXWx9i87mloaZWRYRcN99ybrc11wDp56arMt9++27BIy6Ohg2DPr0SR7r6ipS47Jw0DAza01+3kJKgkeJvEVdHdTWQkNDEmcaGpLtnhI4HDTMzEoplrdYubLFAXpTpsCWLc33bdmS7O8JnNMwMytUmLf4zGdg2rRMeYtnn23b/u7GLQ0zs5xSeYvbbsuc6C61pE9PWerHQcPMDJrlLV7dLD554H30WbaEYaOPaVM+Yvp0GDCg+b4BA5L9PYGDhpn1bgV5i/qP38LQV1Zy54vnEqjNieyxY2H+fKiqStIeVVXJ9tix5X0bnUURPWvZ7Zqamqivr690Ncysqysx3mLYyQfQ0LBr8aoqeOaZTq9lp5H0aETUtFbOiXAz611amSeqpyey28vdU2bW4+UG2x2rVfxyQMvzRPX0RHZ7OWiYWbeVZeR1XR18/rJNXNtwBY9zPMe9+Vuu7XcL3/1C8Xmienoiu70cNMysW8o08nrrVp668hZWvFHNOO7gDsZRzVpubprIF6YWX5e7pyey26uiiXBJ5wC3kqwR/vWIuKng+CTg08A2YCNwaUQUSVHt5ES4We8wbBilE9Z/ap63WMrZTGIGq9nZDSXB9u2dV9+uLmsivGItDUl7ALOBUcDRwMWSji4o9jugJiJGAD8Evty5tTSzSmmt66lUYnpgQ/P1LT41ZCHnsKRZwADnKHZXJbunTgXWRcTTEbEVuAu4IL9ARPwsInKzuDwEHNbJdTSzCsjS9VT4pX8Am7idJG+Rv77FyJmjGTCged7COYrdV8mgcSjwXN72+nRfKf8MLC52QFKtpHpJ9Rs3buzAKppZOZVqTWSZ9C+XsO7HViZyC2tJ8hZPnTUe1q2DiROhXz/nKDpYJcdpFJsismiCRdLHgRrgvcWOR8R8YD4kOY2OqqCZlU+uNZELDrnWBGQbKzH2Y8Ehv1vE0Fsnc+S2NfzybWfz2rQZjPncrtOVjx3rINFRKhk01gOH520fBjxfWEjSSGAK8N6I+Gsn1c3Myqyl1sTQocWT3Du6pFatgkmTeN+yZX
DUUTBjIe9pYbpy6ziV7J56BKiWdISk/sBHgQX5BSSdCNwBnB8RL1agjmZWJi21JkqNlbj5ut65LndXUrGgERHbgAnAUuBJ4AcRsUrSjZLOT4t9BRgI3C3pcUkLSpzOzLqoUnmLlkZeF+Yhhg/dyi8+eAsXfb4a7rgjmSdq7dodeQvrPJ6w0MzKpjBvAUmLYf785HmpYzvyD4XzRJ11FsycWXSZVWufLj9Ow8x6vpbyFq3e1bSq+XgLFi6EpUsdMCrMLQ0zK5s+fZLGQqEWR2Nv2gRTpybdUIMGJc8vvxz69y9rXXs7tzTMrOLaNGPs1q1JYru6IG9x9dUOGF2Ig4aZlU2mGWML1+U+5ZRkXe7bb4fBgzu1vtY6Bw0zKxvnLXoer9xnZmVVdDR2Yd5i5kznLboJBw0z6zxNTTB7dvN1uadNczdUN+KgYWbllxtvMXkyrFnj8RbdmHMaZtYhSq5/sWoVjBqV5C0inLfo5hw0zKxNigWHYutffP6yTaw5e0IyT9TDDycti5UrPU9UN+fuKTPLrNR05nvttXNfX5q4nDlMe2Mag5Zvhiuct+hJ3NIws8xKTQvS2AgQnMt9rOQ4buVqHuEUTsDjLXoaBw0zy6zUdObvZDWLGcV9jEEEo1nI+1nK61XOW/Q0Dhpmllnh9B/708gsruT3jOA0HuYaZnAcK1lEsi631+HueRw0zCyz3LQgfWniKm5lHcP5DHN56qzxPDB3LT+puoZt6u91uHuwFhPhkjZTfN1uARERe5elVmbWJY39WHDI44sZesukXdblfgdw0fhK19DKrcWgERGDOqsiZtbFrV6drMu9dKnX5e7F2tQ9JelASUNzP+29uKRzJK2RtE7SdUWOv0fSY5K2Sbqovdczs93Q2AhXXgkjRiTjLbwud6+WKWhIOl/SWuBPwC+AZ4DF7bmwpD2A2cAo4GjgYklHFxR7Fvgk8N32XMvMdkNTE9x6KwwfDnPnel1uA7K3NP4NeBfwx4g4AjgT+E07r30qsC4ino6IrcBdwAX5BSLimYj4PVBqjS8z62j561tcfbXXt7BmsgaNpohoBPpI6hMRPwNOaOe1DwWey9ten+5rM0m1kuol1W/cuLGd1TLrxfLXt/A8UVZE1qDxiqSBwC+BOkm3Atvaee1inaG7tWB5RMyPiJqIqBkyZEg7q2XWC23aBBPSeaJ++1vPE2UlZQ0aFwBvANcAS4CngPPaee31wOF524cBz7fznGbWFrm8RXU1zJsH48Z5XW5rUaagERF/iYi3ImJbRNwZEbPS7qr2eASolnSEpP7AR4EF7TynmWVRkLfYcNgpnHXgCvrMnc2wmsE7Zq4tOtW59WqZZrktGOTXH+gH/KU9g/siYpukCcBSYA/gmxGxStKNQH1ELJB0CvATYD/gPEk3RIQ7V83aY9WqZDGkdLzFzyffy+g5o9nyRtIN1dAAn/pU0iu1dSs79tXWJs89yrt3U0Tb0wiSLgROjYgvdHyV2qempibq6+srXQ2zrmfTpmSK8nnzYODAZI3uK65g2FH9aWjIdoqqKnjmmXJW0ipF0qMRUdNaud2aeyoifgqcsTuvNbNOlp+3mDuXP75vHCcOWkefyde0KWBA6VlurffI2j31wbzNPkANu3mnk5l1kiLrci88YwYf+bdjmy2iJCVFsyic5dZ6n6wr9+XfKbWNZET4BcWLmlkl1NUliyQ9+yycefAq7hw8iUNWLuOpvkdxNfeycs1oXn9MuyyiFLFr4OjXr3lOA5LZbT3VuWUKGhHxqXJXxMx2X24Z1rdtaWQWUxm/YR6bNwzis3vMZNa2y2mifzIpTwkRSb7i2WeT1kQuOOSCUG6fk+DW2tTot9FCN1REXNXhNTKzNpv6hSY+vWUO05jG3rzGPMYzlRtofCvbtB+lEtwOElaotUR4PfAo8DbgJGBt+nMC8FZ5q2ZmrUrHWyx8due63MezggnMppFsAcPdTtYWLQaNdCDfnUA18L6IuC0ibiOZsL
C9c0+Z2W6qq4ORh6xmaZ9knqi+fYIx3Mv7Wcoqjm3xtQcckLQsJLzCnrVZ1kT4IcAg4KV0e2C6z8w62d3zGnntymks2TaX1xnI1czka30uZ1vf/pCXuC6VzL71VgcJ231Zg8ZNwO8k/Szdfi8wrSw1MrPimppgzhzOnjSNgdvz8hYMhm1JC2LgQCezrbwyjwiXdDBwWrr5cET8uWy1agePCLceJwIWL4ZJk2DNGpZxFpOYsUs3lATbvfKM7aYOGREu6W/Tx5NIuqOeS38OSfeZWTmtXg2jRiVTlEfAvfdSO7R43sID76wztNY9NQmoBb5a5FjgqUTMyqOxMZknau5cGDQoWd/i8suhf3+mv5qMycgfpOc7oKyztBg0IqI2fXxf51THrJdL8xZMmwabNyfrck+b1myZ1VxOwrkKq4RMExZK+pCkQenzL0r6saQTy1s1s15m0aLm63I//viOdbkL17aAZDDe9u3JowOGdZass9z+a0RslvSPwPuBO4F55auWWS9SmLfIrct9bJK3yE0R0tCQHM6tbeFFkawSsgaN3Ojv0cDciLiHZDEmM9tdjY1w5ZUwYgQ8+GDJdbmnTGGXSQa3bEn2m3W2rEHjfyXdAXwYWCRpzza81szy5da3GD48yV+MGwfr1pVcl7vUGhZe28IqIesX/4dJlmU9JyJeAfYHrm3vxSWdI2mNpHWSrityfE9J30+PPyxpWHuvaVYxufUt8vMWK1bA7NnNEt2F+Yv99y9+Ot9ia5WQKWhExBbgReAf013bSCYu3G2S9gBmA6OAo4GLJR1dUOyfgZcjYjgwE/iP9lzTrGIK8xYLFuzIW+QHicGD4dJLm+cvXntt1waIb7G1Ssl699RU4HPA59Nd/YDvtPPapwLrIuLpiNgK3MWuCztdQJJ0B/ghcKaU19lr1tXl5y0efpj6j8+k+s2V9LngPIYdIS6/vHmSu7Gx+VxRkPRmDRrkSQata8g699QHgBOBxwAi4vncLbjtcCjJ6PKc9eycpmSXMhGxTdKrwAHApvxCkmpJBiEy1G126wqampKBedOmwauvwvjx/PC4G7hk8uBmS63Om5dtqdWXXoJNm1ovZ1ZuWXMaWyOZpCoAJP1NB1y7WIuh8L9PljJExPyIqImImiFDhnRA1czaIZe3mDgRamp25C0+e9PgokutZuG/hayryBo0fpDePbWvpMuA+4Gvt/Pa64HD87YPA54vVUZSX2Afdk7Pbta1FJknKn+8xe7e7eT8hXUlWRPhN5PkFH4EvAO4PiJmtfPajwDVko6Q1B/4KLCgoMwC4JL0+UXAf0fWaXnNOkt+3uKhh2DGjGS8xZgxzcZblGotFGbp+vVLpjl3/sK6osxjLSJieURcGxGfBf5bUrt+jSNiGzCB5FbeJ4EfRMQqSTdKOj8t9g3gAEnrSCZP3OW2XLOKKTbeYu1auOaaouMtpk9PWg35BgxIppfKT3J/61tJ/sJThFhX1GIiXNLewBUkCekFwPJ0+1rgcaBdExlExCJgUcG+6/Oevwl8qD3XMOtwufEWkyfDmjUwciTMnEndimOZUtPyIkiXXJK81BMNWnfV4iJMku4BXgYeJFkXfD+S6UMmRsTjnVLDNvIiTFZWq1YliyEtW8ZrB1dz9Vsz+PbG0ex/gNi8ufntsqWWW3V3k3VFWRdhai1orIyI49Lne5Dc6jo0IjZ3WE07mIOGlcPd8xrZ8i9TGbt5Hls0kGV/N5VP/+4KXn2j7VOwVVUl3U5mXUnWoNHaOI2m3JOIeEvSn7pywDDrcE1N1F86h5HfmcbevMYdjOP6uJGXHhyc+XbZQp4zyrqz1oLG8ZJeS58L2CvdFhARsXdZa2dWSYsWwaRJ1BRbl7sd9/B5zIV1Z62t3LdHZ1XErMtYvTpJci9ZAkcdxXncy0JGU3ysaWmlchoec2Hdmac3N8spMd5iZdUYigWM1sZXfOtb8M1ves4o61myzj1l1nMVmSeKG27YMV359OnJpIL5U4
AMGJD99lkHCetJ3NKw3q3IPFF1fz+bYTWDm63HPX/+ri2GOXO8Trf1Pm5pWO+Ul7d47eCjuHrIvXx7+Wj2P735eIvcetzz5/s2WTNwS8N6m4K8Rf3HZzL01ZV8a+MYAhVdz8LrcZvt5KBhvUNTE8yaBdXVSb9SbS2sXctFv7o60wA9j60wSzhoWI+VW0Z1tBbx1IAkb7Hh0BrOPmgFfebNYVjNYBoasp3LYyvMEg4a1iPV1cGMT69mbsMo7mM027YFF+5xL8PWLGX5hmN3rL+dZfFgj60w28lBw3qEXKuiTx844fBGtnz6Sh5+cwTv4iGuYQbHsZJ73hrD1qbmUSLC61mYtYXvnrJur64uSVFs3dLEBOYybf009uFV5jGeqdxAI4NbfH1EEhw8XblZ6xw0rNupq2u+RsXrr8N7tyxiBpP4W9awnJFcw8yd80S1wrPOmmXnoGHdSq5VkRudPaBhNXOZzCiW8EeqGcO93FdinijPBWXWfhXJaUjaX9JySWvTx/1KlFsi6RVJCzu7jtY1TZmSBIz9aWQWV/J7RvB3PMg1zOBYnuA+ds4TdcABuy6j6rmgzNqnUi2N64AHIuImSdel258rUu4rwABgXGdWzrqu5xuauJK5TCPJW9zBOKZyA5sY0qzcgAHJ8t2eC8qsY1Xq7qkLgDvT53cCFxYrFBEPAF70yRKLFvFk3+OYxUQe5WSOZwVXMIdNDNmlVeEWhFl5VCpoHBQRGwDSxwPbczJJtZLqJdVv3LixQypolZe7jfYYrebne42C0aMZMng7/3fPezmbZTsS3blWhScPNCu/sgUNSfdLeqLIzwUdfa2ImB8RNRFRM2TIkNZfYF1eXR1cd1kjkxuuZAUjOOHNB/lcvxncd9MTfPAbY6iqklsVZhVQtpxGRIwsdUzSC5LeHhEbJL0deLFc9bBuqKmJtVfOZcUbO/MW13MjjU2DqZrqloRZJVWqe2oBcEn6/BLgngrVw7qadH2LaS83z1vkBuh54kCzyqpU0LgJOEvSWuCsdBtJNZK+nisk6VfA3cCZktZLen9Famvlt3o1jEryFmzfzqVDmuctcjxxoFllVeSW24hoBM4ssr8e+HTe9rs7s15WAY2NrLl4Gkcun8vrDOS2/WYwfMoVnNm3P98vssSqB+KZVZYnLLQOlT9x4LBhyXaxY8Ormqj/xCz+WlXN8OVzuINxDGcd1798DZ++PFnfotgSq85lmFWWIqLSdehQNTU1UV9fX+lq9EqFU3xA0jqYPz95njs2ip3zRP2830gmNO06T5TngzLrXJIejYia1sp57inrMLkpPvLlL5VatWU1X82bJ+o8FrCwaee0H/mc8Dbrmtw9ZR2m1Bf96w2NfLZh5zxRk/gqx/IECzmPYgEDnPA266ocNKxNWspZFH7R96WJK5nFuj7VfIY5zKeWatYyk0k0keQtDjgg6cLK54S3WdfloGGZ5XIWDQ3sWC61tnZn4Jg+fWcAOIfF/J4RzGIibxx9Mou/tIJrB8wCz2vzAAAND0lEQVRpNrFgbvoPJ7zNug8nwi2zYcOSQFEoP2l975dXM2jqZE5/cwlP963m2au+yuk3jwFpl8WTvEKeWdeRNRHuoGGZ9emTtDAKSbB9YyPccAPMmQMDB8L118OECdC/f+dX1MzazHdPWYcbOnTXlkZfmpiy71yongavvgrjxyfBY3DL63KbWffknIY101KiOz9nAUneYqVGMO3liXDyybBiBcye7YBh1oO5pWE7FA7OyyW6Ick95PIP37p2NZM3JOMtXjuoGr52bzJnlIrfPmtmPYdbGrZDa4PzaGxk7MNXcf+LIxi1z4MwYwZ7NzwBY8Y4YJj1Em5p2A6lBuc939AEs+bCtGnOW5j1cm5p2A7FRmGfw2JW9x0BE523MDMHjV6jpQR3Tn6i+52sZhGjWMy5HDj4Lbj3Xli2DI49dtcXmlmv4e6pXqC1BHfO2LHQf3Mjr//LDfzT5jls0UAe/dgMTv7mFR
5vYWaAWxrdVpaWQ06rCW6ApiaYNYsPfaGaT/1lNn3HX8beL6zl5O9c44BhZjtUJGhI2l/Scklr08f9ipQ5QdKDklZJ+r2kj1Sirl1Ra3NAFSqV4N6xf/FiGFGQt5g7F4YMKf5CM+u1KtXSuA54ICKqgQfS7UJbgE9ExDHAOcAtkvbtxDp2WZlaDnlKTTN+xsHputznngtvvQULFjhvYWYtqlTQuAC4M31+J3BhYYGI+GNErE2fPw+8CPhPXzK0HAoUjuTen0Zm972KZS+MgAeT8RY88QScd57HW5hZiyoVNA6KiA0A6eOBLRWWdCrQH3iqE+rW5ZVqOZTaP3ZsMt34kUObuIpZPNWnmvFvzabPuFpYuxaucd7CzLIp291Tku4HDi5yqEQnSsnzvB34L+CSiNheokwtUAswtBcs+TZ9evG1uFtauGjs/osZO2AS8Ac4YyTMnOluKDNrs7IFjYgYWeqYpBckvT0iNqRB4cUS5fYG7gO+GBEPtXCt+cB8SKZGb1/Nu77cbbKZ1qZYvRomT4YlS6C6Gu65x91QZrbbKtU9tQC4JH1+CXBPYQFJ/YGfAP8ZEXd3Yt26hbFjk4WPtm9PHncJGI2NcNVVyV1R+XmL8893wDCz3VapoHETcJaktcBZ6TaSaiR9PS3zYeA9wCclPZ7+nFCZ6nastoyxaLN0vAXV1cl0H7XOW5hZx/HKfZ2scHQ2JPmIDlkXe/FimDQJ/vAHGOm8hZlll3XlPo8I72RtHWORyWqPtzCzzuGg0cnaOsaiRYV5i69+1eMtzKysHDQ6WVvHWBRVmLe47LIkbzFpkvMWZlZWDhqdrHB0NrQ+xqKZ/HmiTjoJHn/c80SZWadx0OhkudHZVVVJD1JVVcYkeLG8xfLlcNxxnVJvMzPwehoVMXZsG+6UamxMlladMwcGDkzyFhMmuBvKzCrCQaOrampKup1y63LX1sKNN7obyswqykGjK8ofb3Hmmcl4C3dDmVkX4JxGV/Lkk0nOwnkLM+uiHDS6gpdeSsZbHHcc/M//eLyFmXVZ7p6qpKYmmDcPpk5N8hbjxiVJb+ctzKyLctColCVLkrzFk096nigz6zbcPdXZcnmLUaNg2zbPE2Vm3YqDRmcpzFt4XW4z64bcPVVuzluYWQ/ioFFOHm9hZj2Mu6fKweMtzKyHctDoSB5vYWY9XEWChqT9JS2XtDZ93K9ImSpJj6Zrg6+SNL4Sdc2kqQluuw2GD/f6FmbWo1WqpXEd8EBEVAMPpNuFNgB/HxEnAKcB10k6pBPrmE1ufYurrvL6FmbW41UqaFwA3Jk+vxO4sLBARGyNiL+mm3vS1brSnLcws16oUl/EB0XEBoD08cBihSQdLun3wHPAf0TE8yXK1Uqql1S/cePGslUacN7CzHq1st1yK+l+4OAih6ZkPUdEPAeMSLulfirphxHxQpFy84H5ADU1NbGbVW5Z4XgLr29hZr1Q2VoaETEyIo4t8nMP8IKktwOkjy+2cq7ngVXAu8tV37o6GDYM+vRJHuvq8g46b2FmBlSue2oBcEn6/BLgnsICkg6TtFf6fD/gH4A15ahMXV3ScGhogIjksbYW7v1yXt5i2za45x7nLcysV6tU0LgJOEvSWuCsdBtJNZK+npZ5J/CwpBXAL4CbI2JlOSozZQps2bJzez9e4ktbrmLU546D3/wGbr4ZVq2C88933sLMejVFlCcFUCk1NTVRX1/fptf06ZO0MPrSxHjmcQNT2YdX+Rq1jH/ReQsz6/kkPRoRNa2V89xTwNChoIY/sYhzeSd/4H7O5BpmsrnqOMY7XpiZ7dC1xj5UyPTp8NJeh/EUR3I+93AWy3l6wHFMn17pmpmZdS1uaQBjxwL0Y8KUhTz7LFQNTQJJst/MzHIcNFJjxzpImJm1xt1TZmaWmYOGmZll5qBhZmaZOWiYmVlmDhpmZpaZg4aZmWXmoGFmZpn1uLmnJG0EGsp4icHApjKev7vz59Myfz4t8+fTsnJ+Pl
UR0erEST0uaJSbpPosk3r1Vv58WubPp2X+fFrWFT4fd0+ZmVlmDhpmZpaZg0bbza90Bbo4fz4t8+fTMn8+Lav45+OchpmZZeaWhpmZZeagYWZmmTlo7AZJX5H0B0m/l/QTSftWuk5diaQPSVolabsk3z4JSDpH0hpJ6yRdV+n6dDWSvinpRUlPVLouXY2kwyX9TNKT6f+riZWsj4PG7lkOHBsRI4A/Ap+vcH26mieADwK/rHRFugJJewCzgVHA0cDFko6ubK26nG8D51S6El3UNmByRLwTeBdwRSV/fxw0dkNELIuIbenmQ8BhlaxPVxMRT0bEmkrXows5FVgXEU9HxFbgLuCCCtepS4mIXwIvVboeXVFEbIiIx9Lnm4EngUMrVR8Hjfa7FFhc6UpYl3Yo8Fze9noq+J/eui9Jw4ATgYcrVQevEV6CpPuBg4scmhIR96RlppA0Hes6s25dQZbPx3ZQkX2+193aRNJA4EfA1RHxWqXq4aBRQkSMbOm4pEuAMcCZ0QsHu7T2+Vgz64HD87YPA56vUF2sG5LUjyRg1EXEjytZF3dP7QZJ5wCfA86PiC2Vro91eY8A1ZKOkNQf+CiwoMJ1sm5CkoBvAE9GxIxK18dBY/fcDgwClkt6XNK8SleoK5H0AUnrgb8D7pO0tNJ1qqT0pokJwFKSJOYPImJVZWvVtUj6HvAg8A5J6yX9c6Xr1IX8A/BPwBnp983jks6tVGU8jYiZmWXmloaZmWXmoGFmZpk5aJiZWWYOGmZmlpmDhpmZZeagYT2GpLfS2xGfkHS3pAHtONfpkhamz89vaWZaSftKunw3rjFN0md3t44dfR6zLBw0rCd5IyJOiIhjga3A+PyDSrT5dz4iFkTETS0U2Rdoc9Aw644cNKyn+hUwXNKwdB2COcBjwOGSzpb0oKTH0hbJQNix5sUfJP2aZGp30v2flHR7+vygdA2VFenP3wM3AUemrZyvpOWulfRIuubKDXnnmpKuq3E/8I7CSkvaR9IzueAmaYCk5yT1k3RZes4Vkn5UrCUl6ee5NUwkDZb0TPp8j3QdmFydxqX73y7pl3kttHd3xIdvPZeDhvU4kvqSrF2xMt31DuA/I+JE4C/AF4GREXESUA9MkvQ24GvAecC7KT4ZI8As4BcRcTxwErAKuA54Km3lXCvpbKCaZEr0E4CTJb1H0skkU4icSBKUTik8eUS8CqwA3pvuOg9YGhFNwI8j4pT02k8CbRk1/c/AqxFxSnrdyyQdAXwsPf8JwPHA4204p/VCnrDQepK9JOW+9H5FMl/PIUBDRDyU7n8XyUJIv0mm9KE/yfQVfwv8KSLWAkj6DlBb5BpnAJ8AiIi3gFcl7VdQ5uz053fp9kCSIDII+EluvjJJpeaf+j7wEeBnJEFmTrr/WEn/TtIdNpBkWpKszgZGSLoo3d4nrdMjwDfTCfF+GhEOGtYiBw3rSd5I/2LeIQ0Mf8nfBSyPiIsLyp1Ax01XLuBLEXFHwTWuzniNBcCXJO0PnAz8d7r/28CFEbFC0ieB04u8dhs7exDeVlCnKyNil0Aj6T3AaOC/JH0lIv4zQx2tl3L3lPU2DwH/IGk47MgZHAX8AThC0pFpuYtLvP4B4DPpa/eQtDewmaQVkbMUuDQvV3KopANJlr/9gKS9JA0i6XraRUS8DvwWuBVYmLZoSK+xIW0VjC1Rv2dIAg3ARXn7lwKfSV+LpKMk/Y2kKuDFiPgaScvspBLnNQPc0rBeJiI2pn+lf0/SnunuL0bEHyXVkszKuwn4NXBskVNMBOans7C+BXwmIh6U9BtJTwCL07zGO4EH05bO68DHI+IxSd8nyRs0kHShlfJ94G6atyb+lWTFtgaSfM2gXV/GzcAPJP0TO1soAF8HhgGPKanURuDC9PzXSmpK6/mJFupk5lluzcwsO3dPmZlZZg4aZmaWmYOGmZll5qBhZmaZOWiYmVlmDhpmZpaZg4aZmWX2/wH8djF1oOO97AAAAABJRU
5ErkJggg==\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "outputs": [], "source": [ "def resid_qq(y_test, y_score):\n", " ## first compute vector of residuals. \n", @@ -1145,27 +441,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "As with the histogram, the Q-Q Normal plot indicates the residuals are close to Normally distributed, show some skew (deviation from the stright line). This is particularly for large positive residuls. \n", + "As with the histogram, the Q-Q Normal plot indicates the residuals are close to Normally distributed, show some skew (deviation from the straight line). This is particularly for large residuals. \n", "\n", - "There is one more diagnostic plot. Execute the code in the cell below to dislplay the plot of residuals vs. predicted values. " + "There is one more diagnostic plot. Execute the code in the cell below to display the plot of residuals vs. predicted values. " ] }, { "cell_type": "code", - "execution_count": 21, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAY0AAAEWCAYAAACaBstRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3XuYHGWZ9/HvL5PJCQIJJkEgCRCNssRVwAHxFKMcBBWCXh5AXcET6Lusurq84uqyGtcVhF3UFV8T0QVPgKBoLlGRgzEeCDIgp0gkIaCJRCZAgMAkZGZyv3/UM6TT6Zmpmenp6u75fa5rrumuqq66u6an7n4O9TyKCMzMzPIYU3QAZmbWOJw0zMwsNycNMzPLzUnDzMxyc9IwM7PcnDTMzCw3Jw0bMZJWSlrQx7oFktZX6TjLJL2vGvuqd6XvVdI7JP2iBsc8QFJIGlvl/VbtM2C146RhSHpA0hZJT0r6m6RLJO0+3P1GxLyIWFaFEK2CiPhuRBw70HaSPi3pO7WIyZqfk4b1OiEidgcOAQ4FPlFwPE2v2t/czWrBScN2EhF/A64lSx4ASBov6QJJf5H0kKSvSZqY1k2T9BNJj0l6VNKvJY1J6x6QdHR6PDGVYDZJ+iNweOlxU/XHc0ueXyLpP9LjqekYG9PrfyJpZqX4JT1X0q8kPS7pYUlX9LHdzyWdWbbsDklvUuZCSR1pP3dKekGe85fex4ckrU3HP7/kfJwm6bdp348Cn07L3yPpnvTerpW0f8n+jpG0KsXxFUAl606T9JuS5/MkXZf+Dg9J+ldJxwH/CrwtlSTvSNvuKekbkjZI+quk/5DUkta1pL/3w5LWAq/v5/2eLemqsmVfkvTl9Pjd6b1tTufkjAHOXcXPQHr+Bkm3p8/a7yS9sGTdx9P72CzpT5KO6us4NjxOGraTdDE+HlhTsvg84HlkieS5wH7AOWndx4D1wHRgb7ILVKWxaf4deE76eS1w6iDCGgP8L7A/MBvYAnylj20/C/wCmArMBP6nj+2+B5zS+0TSwWn/1wDHAvPJ3vMU4G3AI4OI941AG3AYsBB4T8m6lwBrgRnA5ySdRHbO3kR2Dn8NXJZimgb8APgUMA24D3h5pQNKmgxcD/wc2Jfs73RDRPwc+E/giojYPSJelF5yKdCdtjs0vefedqH3A29Iy9uAN/fzXi8DXidpjxRHC/BWsvML0JH2tQfwbuBCSYf1s7+K0mu+CZwBPAtYDCxNX2ieD5wJHB4Rk8k+Xw8M9hiWj5OG9fqRpM3AOrJ/9H8HkCSyi8g/R8SjEbGZ7CJ0cnpdF7APsH9EdEXEr6PygGZvBT6X9rEO+HLewCLikYj4QUR0puN/DnhVH5t3kV38942IrRHxmz62uxo4pORb/TuAH0bE02kfk4GDAEXEPRGxIW+8wHnpff4F+CIlyQl4MCL+JyK6I2IL2UXw8+kY3WTntjeu1wF/jIirIqIr7etvfRzzDcDfIuK/0vveHBE3V9pQ0t5kXww+EhFPRUQHcCE7/qZvBb4YEesi4lHg83290Yj4M3AbcFJa9BqgMyJWpPXXRMR9kfkVWUJ/Zd+nrk/vBxZHxM0R0RMRlwJPA0cCPcB44GBJrRHxQETcN4RjWA5OGtbrpPQtbQHZxXJaWj4dmATcmqoFHiP7Njs9rT+frFTyi1T9cHYf+9+XLCH1+nPewCRNkrRY0p8lPQEsB6b0VqeU+b9kVTi/V9Z76z0VtiEln2vYcaE8GfhuWncjWUnmIuAhSUt6v0nnVP4+9+1jHWQJ7ksl5/bRFP9+lJ2zlIzLX99rFllJJI/9gVZgQ8lxF5OVfig/LgP/rUpLbW9nRykDScdLWpGqzB4jS4TTKuwjT8wf64037WsW2ZeDNcBHyKr7OiRdLmnffvZlw+CkYTtJ3wYvAS5Iix4mqw6aFxFT0s+eqdGc9I32YxExBzgB+Ggf9ckbyP7Je80uW99Jlpx6Pbvk8ce
A5wMviYg9yKqOoKR+vyT+v0XE+yNiX7Jv8V8trScvcxlwiqSXAhOBX5bs58sR8WJgHlk11Vl97KOS8vf5YGmIZduuA84oObdTImJiRPyOsnOWSn2zqGwdWdVfJZWO+TQwreSYe0TEvLR+oL9VuSuBBalq842kpCFpPFn12gXA3hExBfgpFf5uSX+fgXVkJdXS8zQpIi4DiIjvRcQryJJLkFWp2ghw0rBKvggcI+mQiNgOfJ2sLnoGgKT9JL02PX6DssZnAU+QVRX0VNjn94FPKGvUngn8U9n624G3p0bY49i5+mkyWeJ6TNJepKqzSiS9RTsayTeRXUAqxQPZBWx/YBFZnf/2tI/DJb1EUivwFLC1n31UclZ6n7OADwMVG+OTr5Gdl3np2HtKektadw0wT1nj/FjgQ+x8IS31E+DZkj6S6vknS3pJWvcQcIBSg3yqavsF8F+S9pA0RtJzJPWe8+8DH5I0U9JUoK/SI2l/G4FlZO1O90fEPWnVOLJqo41At6TjydpO+tLfZ+DrwAfS30WSdpP0+vQ+ny/pNSlJbSX7rAzm72WD4KRhu0gXgW8B/5YWfZysCmpFqh66nuybP8Dc9PxJ4Cbgq33cm/EZsmqO+8kuWN8uW/9hspLKY2TtCz8qWfdFspLAw8AKsuqxvhwO3CzpSWAp8OGIuL+P9/k08EPgaEqqVMgabb9OlnT+TNYIfgGAsh5JP+vn+AA/Bm4luwheA3yjrw0j4mqyb8WXp3N7N1l7AxHxMPAW4NwUw1zgt33sZzNwDNk5/BuwGnh1Wn1l+v2IpNvS43eRXdT/mN7nVWRtU6T3fi1wB1l7xQ8HeL+Qnb+dzmOK6UNkSWgTWdXV0n720ednICLaydo1vpL2tQY4La0eT3aOHk7vfQZZ5wIbAfIkTGbVIymAuame3azpuKRhZma5OWmYmVlurp4yM7PcXNIwM7Pcmm7AtGnTpsUBBxxQdBhmZg3l1ltvfTgipg+0XdMljQMOOID29vaiwzAzayiSco3S4OopMzPLzUnDzMxyc9IwM7PcnDTMzCw3Jw0zM8ut6XpPmY2EZas6WLx8Les2dTJr6iTOmD+HBQfNGPiFZk3GJQ2zASxb1cE5S1fSsXkrUya20rF5K+csXcmyVR1Fh2ZWc04aZgNYvHwtrS1i0rixSNnv1haxePnaokMzqzknDbMBrNvUycTWnWeWndjawvpNnQVFZFYcJw2zAcyaOoktXTtPBLelq4eZUyf18Qqz5uWkYTaAM+bPoasn6NzWTUT2u6snOGP+nKJDM6s5Jw2zASw4aAaLTpzHjMkTeHxLFzMmT2DRifPce8pGpUK73KbJ478EtAAXR8S5fWz3ZrJ5jg9PcwWb1dSCg2Y4SZhRYElDUgtwEXA8cDBwiqSDK2w3mWxy+ptrG6GZmZUrsnrqCGBNRKyNiG3A5cDCCtt9FvgCsLWWwZmZ2a6KTBr7AetKnq9Py54h6VBgVkT8pL8dSTpdUruk9o0bN1Y/UjMzA4pNGqqw7JkJyyWNAS4EPjbQjiJiSUS0RUTb9OkDTjxlZmZDVGTSWA/MKnk+E3iw5Plk4AXAMkkPAEcCSyW11SxCMzPbSZFJ4xZgrqQDJY0DTgaW9q6MiMcjYlpEHBARBwArgBPde8rMrDiFJY2I6AbOBK4F7gG+HxErJS2SdGJRcZmZWd8KvU8jIn4K/LRs2Tl9bLugFjGZmVnffEe4mZnl5qRhZma5OWmYmVluThpmZpabk4aZmeXmpGFmZrk5aZiZWW5OGmZmlpuThpmZ5eakYWZmuTlpmJlZbk4aZmaWm5OGmZnl5qRhZma5OWmYmVluThpmZpabk4aZmeXmpGFmZrkVmjQkHSfpT5LWSDq7wvoPSLpL0u2SfiPp4CLiNDOzTGFJQ1ILcBFwPHAwcEqFpPC9iPj7iDgE+ALw3zUO08zMShRZ0jgCWBMRayNiG3A5sLB0g4h4ouTpbkDUMD4zMysztsBj7wesK3m+Hnh
J+UaS/hH4KDAOeE2lHUk6HTgdYPbs2VUP1MzMMkWWNFRh2S4liYi4KCKeA3wc+FSlHUXEkohoi4i26dOnVzlMMzPrVWTSWA/MKnk+E3iwn+0vB04a0YjMzKxfRSaNW4C5kg6UNA44GVhauoGkuSVPXw+srmF8ZmZWprA2jYjolnQmcC3QAnwzIlZKWgS0R8RS4ExJRwNdwCbg1KLiNTOzYhvCiYifAj8tW3ZOyeMP1zwoMzPrk+8INzOz3Jw0zMwsNycNMzPLzUnDzMxyc9IwM7PcCu09ZWY2mixb1cHi5WtZt6mTWVMnccb8OSw4aEbRYQ2KSxpmZjWwbFUH5yxdScfmrUyZ2ErH5q2cs3Qly1Z1FB3aoDhpmJnVwOLla2ltEZPGjUXKfre2iMXL1xYd2qA4aZiZ1cC6TZ1MbG3ZadnE1hbWb+osKKKhcdIwM6uBWVMnsaWrZ6dlW7p6mDl1UkERDY2ThplZDZwxfw5dPUHntm4ist9dPcEZ8+cUHdqguPfUKNYMPTnMGsWCg2awiKxtY/2mTmY26P+ck8Yo1duTo7VFO/XkWAQN9yE2axQLDprR8P9frp4apZqlJ4eZ1ZaTxijVLD05zKy2XD01Ss2aOomOzVuZNG7HR6ARe3JYfm7D2pXPyeC5pDFKNUtPDsunWe5Griafk6EpNGlIOk7SnyStkXR2hfUflfRHSXdKukHS/kXE2YwWHDSDRSfOY8bkCTy+pYsZkyew6MR5/pbVpNyGtSufk6EprHpKUgtwEXAMsB64RdLSiPhjyWZ/ANoiolPSB4EvAG+rfbQ7NFNxthl6clg+6zZ1MmVi607LRnsbls/J0BRZ0jgCWBMRayNiG3A5sLB0g4j4ZUT0/gVXADNrHONOXJy1RtUsdyNXk8/J0BSZNPYD1pU8X5+W9eW9wM8qrZB0uqR2Se0bN26sYog7a/Ti7LJVHZyyZAWvOO9GTlmywsluFHEb1q58ToamyKShCsui4obSO4E24PxK6yNiSUS0RUTb9OnTqxjizhq5m6pLSaOb27B25XMyNEV2uV0PzCp5PhN4sHwjSUcDnwReFRFP1yi2ihq5m2ppKQlg0rixdG7rZvHytf4nGSXchrUrn5PBK7KkcQswV9KBksYBJwNLSzeQdCiwGDgxIgr/StzIxdlGLiWZWf0oLGlERDdwJnAtcA/w/YhYKWmRpBPTZucDuwNXSrpd0tI+dlcTjVycdaOfmVWDIio2IzSstra2aG9vLzqMulM6QOHE1ha2dPXQ1RMNk/SssmbqAm7FknRrRLQNtJ3vCB8lGrmUZJW5c4MVwWNPjSJu9Gsu7txgRXBJw6xBuXODFcFJw6xBuXODFcFJw6xBNXIXcGtcThpmDcqdG6wIbgg3a2Du3GC15pKGmZnl5pKGWU6+kc7MScPqTL1emEvvqC+9kW4R1EV8ZrXi6imrG/V8h3Ojz6ViVi1OGlY36vnC7BvpzDJOGlY36vnC7BvpzDJOGlY36vnC7BvpzDJOGlY36vnC7BvpzDL99p6StJnK83YLiIjYY0SislFpwUEzWETWtrF+Uycz66j3FPhGOjMYIGlExORaBWIGvjCb1btBVU9JmiFpdu/PcA8u6ThJf5K0RtLZFdbPl3SbpG5Jbx7u8czMbHhyJQ1JJ0paDdwP/Ap4APjZcA4sqQW4CDgeOBg4RdLBZZv9BTgN+N5wjmVmZtWRt6TxWeBI4N6IOBA4CvjtMI99BLAmItZGxDbgcmBh6QYR8UBE3AlsH+axzMysCvImja6IeAQYI2lMRPwSOGSYx94PWFfyfH1aZmZmdSrv2FOPSdodWA58V1IH0D3MY6vCsko9tQbekXQ6cDrA7NnDbmoxM7M+5C1pLAS2AP8M/By4DzhhmMdeD8wqeT4TeHAoO4qIJRHRFhFt06dPH2ZYZmbWl1wljYh4quTppVU69i3AXEkHAn8FTgbeXqV9m5nZCMiVNMpu8hsHtAJPDefmvojolnQmcC3QAnw
zIlZKWgS0R8RSSYcDVwNTgRMkfSYi5g31mNa/eh2W3Gw0qtf/R0UMvhlB0knAERHxr9UPaXja2tqivb296DAaTul8ERNbW9jS1UNXT3ioDLMCFPH/KOnWiGgbaLshjT0VET8CXjOU19oOy1Z1cMqSFbzivBs5ZcmKQueNqOdhyc1Gm3r+f8xbPfWmkqdjgDaG2NPJMvU2E9y6TZ1Mmdi607J6GZbcbLSp5//HvCWNE0p+XgtspuxGPBucevsmUc/DkpuNNvX8/5i399S7RzqQ0abobxLljWwvnbMXV932Vzq3de9Uh5pnWPJ6bbAza1RnzJ/DOUtXDun/caQNNDT6/9BPNVREfKjqEY0Ss6ZOomPzViaN2/EnqNU3iUpVY1fd9lfefNh+3LT20UENS15v1WxmzaCepwkYqKTR2w3p5WSDCl6Rnr8FuHWkghoNivwmUVo1BjBp3Fg6t3Vz09pHuez0I6uyr8XL19bFB9ysUdXrNAEDzadxKYCk04BXR0RXev414BcjHl0TK/KbRDWrxoquZjOz2so79tS+wGTg0fR897TMhqGobxLVrBorspqt0bjtx5pB3t5T5wJ/kHSJpEuA24D/HLGobERVcy7uep7Xu570tv10bN66U9tPkffmmA1FrqQREf8LvIRsSI+rgZf2Vl1Z41lw0AwWnTiPGZMn8PiWLmZMnjDkO02rua9mVm9drM2GaqDeUwdFxCpJh6VFvfNf7Ctp34i4bWTDs5FSzaqxem2wqydu+7FmMVCbxkfJ5qn4rwrrAg8lYpaL236sWQzUe+r09PvVtQnHrDnV881aZoORq01D0lskTU6PPyXph5IOHdnQzJqH236sWeTtcvtvEXGlpFeQjT11AfA1ssZxGyJ3wRxd3PZjzSBvl9vekbNeD/y/iPgx2WRMNkTugmlmjShv0virpMXAW4GfSho/iNdaBe6CaWaNKO+F/61k07IeFxGPAXsBZ41YVKPAuk2dTGxt2WmZu2CaWb3Le3NfJ9ABvCIt6gZWD/fgko6T9CdJaySdXWH9eElXpPU3SzpguMccjmrOtFfP4+WbmfUlb++pfwc+DnwiLWoFvjOcA0tqAS4CjicbQfcUSQeXbfZeYFNEPBe4EDhvOMccjmq3QRQ9/EY9TTVrZo0jb/XUG4ETgacAIuJBsgEMh+MIYE1ErI2IbcDl7Dob4EKgd7iSq4CjJGmYxx2SardBFNkFs9oJ0AnIbPTI2+V2W0SEpACQtFsVjr0fO4YlAVjPrl14n9kmIrolPQ48C3i4dCNJp5Pduc7s2bOrENquRmIYiKK6YFZzDgxPwmQ2uuQtaXw/9Z6aIun9wPXAxcM8dqUSQ/ksgXm2ISKWRERbRLRNnz59mGFV1kxtENVshHcvMLPRJW9D+AVk1UM/AJ4PnBMRXx7msdcDs0qezwQe7GsbSWOBPdkxp0dNFd0GUU3VTIDuBWY2uuS+1yIirouIsyLiX4AbJb1jmMe+BZgr6UBJ44CTgaVl2ywFTk2P3wzcGBF9zlk+kpppGIhqJsBmKoGZ2cAGGhp9D+AfydoWlgLXpednAbcD3x3qgVMbxZlk93+0AN+MiJWSFgHtEbEU+AbwbUlryEoYJw/1eNXQLMNAVHOqWQ/EZza6qL8v7pJ+DGwCbgKOAqaSDR/y4Yi4vSYRDlJbW1u0t7cXHUbN9I5ftbpjM9u6t9PaIp639x41HceqN4Zaz3VuZtUj6daIaBtwuwGSxl0R8ffpcQtZr6XZEbG5apFW2WhKGr09l7p6enh487Znug08a7dxjBvb0rDVZ7YrD25pIy1v0hioTaOr90FE9AD313PCGE2WrergQ5f/gb8+1slDTzwNgrFjxjAGsXlrt3swNREPbmn1ZKCk8SJJT6SfzcALex9LeqIWAdquei8indt6GDtGbA/o7gl6tgcSbOvZ7h5MTcTdmq2eDDRzX0t/660YvReR8WPH0N0TjFEQAd3bt9M6ZgzjWsa4B1MT8fziVk8
8vHkD6r03Ytru49lOMEYigO0B2wkmTxjrHkxNxN2arZ44aTSg3ovIHhNb2XfPiYwfO4YxgpYxYs8JYzlw2u5uBG8izXRjqTW+vGNPWR0pvTdi8oSxjG0RXT3hRNGkqnlfjdlwOWk0IF9EdjYauqM2y42l1vicNOpcXxdEX0QyHmXXrLbcplHH3D9/YO6OalZbThp1zBfEgXmUXbPacvVUjQ2m/r3e++fXQ1vCrKmT6Ni89ZkJpcDdUc1GkksaNTTY6qZ67p9fL1Vn7o5qVltOGjU02Oqmal4Qqz2Pd71UnTXTPCdmjcDVUzU02OqmanWtHYkeRvVUdeaeZGa146RRQ0Opfx/MBbGvNobSUgHApHFj6dzWzeLla4d8sXVbgtno5OqpGhrJ+vf+2hhGooeR2xLMRqdCkoakvSRdJ2l1+j21j+1+LukxST+pdYwjYSTr3/trYxiJBnW3JZiNTkVVT50N3BAR50o6Oz3/eIXtzgcmAWfUMriRNFL17/21MXx24QtGZB5vtyWYjT5FVU8tBC5Njy8FTqq0UUTcAHimwBz6K024VGBm1VJUSWPviNgAEBEbJA3r6iXpdOB0gNmzZw95P/Vws9pQlY58W6k04VKBmVXDiJU0JF0v6e4KPwurfayIWBIRbRHRNn369CHto15uVhsqlybMrBZGrKQREUf3tU7SQ5L2SaWMfYDCr8wj0S211lyaMLORVlSbxlLg1PT4VODHBcXxDA98Z2Y2sKKSxrnAMZJWA8ek50hqk3Rx70aSfg1cCRwlab2k145UQPU8zpOZWb0opCE8Ih4BjqqwvB14X8nzV9YqpoEakovWyI30ZtY8PIxIUs9TqFZ77KjyBPTSOXtx09pHnZDMbECKiKJjqKq2trZob28vOoyqOmXJil3Geerc1s2MyRO47PQjB7Wv0gQ0sbWFR556mo7N25i++zim7T7+mRKWe16ZjS6Sbo2ItoG289hTDaCajfTlw408saWbMYLNW7s9O6CZDchJowFUs5G+PAFt69nOGGW/e7nXmJn1xUmjAVRzRNnyBDSuZQzbI/vdy73GbDiqPeGX1RcnjQZQzbu9yxPQHhPHsj1g8oSxHuLchq3RR1awgbn3VIOo1t3e5b3EDnjW7pxyeNZ7qt56jVnjaYaRFax/ThqjUKUE9KGCYrHmUk/TANvIcPWUmVWNR1Zofk4aZlY1nga4+bl6yqxGRsNQMPU8soJVh5OGWQ1UeyiYeuYh+pubq6fMaqD8TnzfeW+NyknDrAY8X4s1CycNsxpwryJrFm7TMBtBvY3f9z70BE8+3cNeu7XyrN3G1918LWZ5OWk0udHQY6delTZ+77PnRB5+8mkefaqLrp5g7ozJ/ltYQ3LSaGKjqcdOPSofUmP65AnsNn7skOZBMasXhbRpSNpL0nWSVqffUytsc4ikmyStlHSnpLcVEWsjc4+dYrnx25pRUQ3hZwM3RMRc4Ib0vFwn8K6ImAccB3xR0pQaxtjwfNEqlhu/rRkVlTQWApemx5cCJ5VvEBH3RsTq9PhBoAOYXrMIm4AvWsXykBrWjIpKGntHxAaA9LvfCnZJRwDjgPv6WH+6pHZJ7Rs3bqx6sI3KF61iVXMeFLN6oYgYmR1L1wPPrrDqk8ClETGlZNtNEbFLu0Zatw+wDDg1IlYMdNy2trZob28fWtBNqLf3lMcBMrP+SLo1ItoG2m7Eek9FxNF9rZP0kKR9ImJDSgoVp/WStAdwDfCpPAnDduVxgMysmorqcrsUOBU4N/3+cfkGksYBVwPfiograxuemVljqPW9WEW1aZwLHCNpNXBMeo6kNkkXp23eCswHTpN0e/o5pJhwzczqTxFzshdS0oiIR4CjKixvB96XHn8H+E6NQzMzaxhFzMnuAQvNzBpUEfdiOWmYmTWoIu7FctIwM2tQRdyL5aRhZtagiriB1KPcmpk1sFrfi+WShpmZ5eakYWZmuTlpmJlZbk4aZmaWm5OGmZnl5qRhZma5OWm
YmVluThpmZpabk4aZmeXmpGFmZrk5aZiZWW5OGmZmllshSUPSXpKuk7Q6/Z5aYZv9Jd2apnldKekDRcRqZmY7FFXSOBu4ISLmAjek5+U2AC+LiEOAlwBnS9q3hjGamVmZopLGQuDS9PhS4KTyDSJiW0Q8nZ6Ox1VpZmaFK+pCvHdEbABIvysOBi9plqQ7gXXAeRHxYB/bnS6pXVL7xo0bRyxoM7PRbsQmYZJ0PfDsCqs+mXcfEbEOeGGqlvqRpKsi4qEK2y0BlgC0tbXFEEM2M7MBjFjSiIij+1on6SFJ+0TEBkn7AB0D7OtBSSuBVwJXVTlUMzPLqajpXpcCpwLnpt8/Lt9A0kzgkYjYknpXvRz475pGadaklq3qYPHytazb1MmsqZM4Y/6cmk4Zao2rqDaNc4FjJK0GjknPkdQm6eK0zd8BN0u6A/gVcEFE3FVItGZNZNmqDs5ZupKOzVuZMrGVjs1bOWfpSpat6rfAbwYUVNKIiEeAoyosbwfelx5fB7ywxqGZNb3Fy9fS2iImjcv+/SeNG0vntm4WL1/r0oYNyN1YzUaZdZs6mdjastOyia0trN/UWVBE1kicNMxGmVlTJ7Glq2enZVu6epg5dVJBEVkjcdIwG2XOmD+Hrp6gc1s3Ednvrp7gjPlzig7NGoCThtkos+CgGSw6cR4zJk/g8S1dzJg8gUUnznN7huVSVJdbMyvQgoNmOEnYkLikYWZmuTlpmJlZbk4aZmaWm5OGmZnl5qRhZma5KaK5RhKXtBH4cxV3OQ14uIr7qzbHN3z1HqPjG556jw/qI8b9I2L6QBs1XdKoNkntEdFWdBx9cXzDV+8xOr7hqff4oDFi7OXqKTMzy81Jw8zMcnPSGNiSogMYgOMbvnqP0fENT73HB40RI+A2DTMzGwSXNMzMLDcnDTMzy23UJg1J/yxppaS7JV0maULZ+gsl3Z5+7pX0WMm6npJ1S0cwxg+n+FZK+kiF9ZL0ZUlrJN0p6bCSdadKWp1+Ti0ovnekuO6U9DtJLypZ94Cku9I5bC8ovgWSHi/5W55Tsu44SX9K5/bsguI7qyS2u9Pnbq+0bkTOn6RvSuqQdHfJsr0kXZc+S9dJmtrHayt+5iTgzpMAAAAIvklEQVS9OMW6Jn1eVev4JB0i6aZ0ru+U9LaSdZdIur/kXB9S6/jSdhWvK5IOlHRzev0VksYNNb6qiIhR9wPsB9wPTEzPvw+c1s/2/wR8s+T5kzWI8QXA3cAksiHsrwfmlm3zOuBngIAjgZvT8r2Aten31PR4agHxvaz3uMDxvfGl5w8A0wo+fwuAn1R4bQtwHzAHGAfcARxc6/jKtj8BuHGkzx8wHzgMuLtk2ReAs9Pjs4HzKryuz88c8Hvgpelz+jPg+ALie17v+QX2BTYAU9LzS4A3F3n+0rqK1xWy69PJ6fHXgA9W++8+mJ9RW9Ig+0edKGks2T/ug/1sewpwWU2i2uHvgBUR0RkR3cCvgDeWbbMQ+FZkVgBTJO0DvBa4LiIejYhNwHXAcbWOLyJ+l44PsAKYWeUYhhVfP44A1kTE2ojYBlxOdq6LjK8mn8GIWA48WrZ4IXBpenwpcFKFl1b8zKXP4x4RcVNkV71v9fH6EY0vIu6NiNXp8YNABzDg3c+1iq8vqVT2GuCqobx+JIzKpBERfwUuAP5C9o3j8Yj4RaVtJe0PHAjcWLJ4gqR2SSskjdQf8G5gvqRnSZpEVqqYVbbNfsC6kufr07K+ltc6vlLvJfuW2SuAX0i6VdLpVY5tMPG9VNIdkn4maV5aVlfnL60/DvhByeKRPn+l9o6IDQDpd6XZm/r7LK6vsLzW8T1D0hFkJcj7ShZ/LlVbXShpfEHxVbquPAt4LH2xgJE5f4MyKmfuS3WKC8mSwWPAlZLeGRHfqbD5ycBVEdFTsmx2RDwoaQ5wo6S7IuK+Cq8dsoi4R9J5ZN/YniSrIuku26xS3XD0s7zW8QEg6dVkSeMVJYtfns7hDOA6Sav
St7Raxncb2Xg7T0p6HfAjYC51dv7IqqZ+GxGl32BH9PwNQWGfxcFIJZ9vA6dGxPa0+BPA38gSyRLg48CiAsLb5boCPFFhu0LvkxiVJQ3gaOD+iNgYEV3AD8nq3ys5mbJqgVS8JSLWAsuAQ0ciyIj4RkQcFhHzyYq8q8s2Wc/O305nklWz9bW81vEh6YXAxcDCiHik5LW957ADuJqsSqim8UXEExHxZHr8U6BV0jTq6Pwl/X0GR+z8lXgoXWx7L7odFbbp77M4s8LyWseHpD2Aa4BPpepcIPv2n6p4nwb+l+qfy1zx9XFdeZis2rn3C/6IfBYHY7Qmjb8AR0qalOoMjwLuKd9I0vPJGvVuKlk2tbf4mi4wLwf+OBJBpm+RSJoNvIld67SXAu9S5kiyarYNwLXAsSnWqcCxaVlN40vLfwj8Q0TcW7J8N0mTex+n+O6mynLE9+zenjypymIM8AhwCzA39VoZR3bRrnovuRx/XyTtCbwK+HHJspqcvxJLgd7eUKeWxlKi4mcufR43Szoynet39fH6EY0v/R2vJmsDvLJsXe8FXWTtBdU+l3niq3hdSe1AvwTe3N/ra6rIVvgif4DPAKvIPiDfBsaTFUlPLNnm08C5Za97GXAXWXXCXcB7RzDGX5MlpDuAo9KyDwAfSI8FXERWN3sX0Fby2vcAa9LPuwuK72JgE3B7+mlPy+ek19wBrAQ+WVB8Z6bj30HWUP+ykte+Drg3ndtC4kvPTwMuL3vdiJ0/ssS1AegiKyW8l6xe/QayktANwF5p2zbg4oE+c2m7u9O5/AppJIpaxge8M73m9pKfQ9K6G9P/z93Ad4DdC4ivz+tK+nv/Pp3XK4HxI/F5zPvjYUTMzCy30Vo9ZWZmQ+CkYWZmuTlpmJlZbk4aZmaWm5OGmZnl5qRhTaNklNC7JV2Zht8Y6r4WSPpJenyi+hnpVtIUSf9nCMf4tKR/GWqM1d6PWR5OGtZMtkTEIRHxAmAb2T0Pz0g3QQ76Mx8RSyPi3H42mQIMOmmYNSInDWtWvwaeK+kASfdI+irZWFOzJB2rbG6F21KJZHd4Zg6NVZJ+Q3aHNmn5aZK+kh7vLenqNMjhHZJeBpwLPCeVcs5P250l6ZY0CN5nSvb1SWXzdFwPPL88aEl7KpsrY0x6PknSOkmtkt6f9nmHpB9UKklJWiapLT2eJumB9LhF0vklMZ2Rlu8jaXlJCe2V1Tj51rycNKzppHF6jie7sxayi/O3IuJQ4CngU8DREXEY0A58VNkkXF8nGxzwlcCz+9j9l4FfRcSLyOZNWEk2R8J9qZRzlqRjyQY+PAI4BHixpPmSXkw2JMmhZEnp8PKdR8TjZHcFvyotOoFsOI4u4IcRcXg69j1kdxvn9V6yYWYOT8d9v6QDgben/R8CvIjsTmmzPo3KUW6taU2U1HvR+zXwDbIJd/4cOwaoOxI4GPhtGnZqHNnYYgeRDWK5GkDSd4BKQ46/hmz8JCIb+fhx7ToT27Hp5w/p+e5kSWQycHVEdKZj9DWe1RXA28jGHDoZ+Gpa/gJJ/0FWHbY7gxtP7FjghZJ6xzDaM8V0C/BNSa3AjyLCScP65aRhzWRL+sb8jJQYnipdRDZZ0Cll2x1C9YacFvD5iFhcdoyP5DzGUuDzyqZ2fTE75nK5BDgpIu6QdBrZzIPlutlRg1A6hbGAf4qIXRKNpPnA64FvSzo/Ir6VI0YbpVw9ZaPNCuDlkp4Lz7QZPI9s8MoDJT0nbXdKH6+/Afhgem2LsuG2N5OVInpdC7ynpK1kvzSi7XLgjZImplFqT6h0gMiGa/898CWy6Wh753KZDGxIpYJ39BHfA2SJBnaMjNob0wfTa5H0vDRa7v5AR0R8naxkdhhm/XBJw0aViNiYvqVfph0ztH0qIu5VNgPeNZIeBn5DNo93uQ8DSyS9F+ghm6/5Jkm/lXQ38LPUrvF3wE2ppPMk8M6IuE3SFWTtBn8mq0LryxVkI5ouKFn2b8DN6bV3sXO
i6nUB8H1J/8DOs01eDBwA3JaGAN9INgz4AuAsSV0pznf1E5OZR7k1M7P8XD1lZma5OWmYmVluThpmZpabk4aZmeXmpGFmZrk5aZiZWW5OGmZmltv/B2XFmFclXBjsAAAAAElFTkSuQmCC\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "outputs": [], "source": [ "def resid_plot(y_test, y_score):\n", " ## first compute vector of residuals. \n", @@ -1175,7 +460,7 @@ " plt.title('Residuals vs. predicted values')\n", " plt.xlabel('Predicted values')\n", " plt.ylabel('Residual')\n", - " \n", + "\n", "resid_plot(y_test, y_score) " ] }, @@ -1183,27 +468,16 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This plot looks reasonable. The residual values appear to have a fairly constant dispursion as the predicted value changes. A few large residuals are noticable, particularly on the positive side.\n", + "This plot looks reasonable. The residual values appear to have a fairly constant dispersion as the predicted value changes. A few large residuals are noticeable, particularly on the positive side.\n", "\n", "But, wait! This residual plot is for the log of the auto price. What does the plot look like when transformed to real prices? Execute the code in the cell below to find out. 
" ] }, { "cell_type": "code", - "execution_count": 22, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZcAAAEWCAYAAACqitpwAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3XuYXFWZ7/Hvr6s7STd0SJAEgQRDMMoQRyO0yhwxE28QUAE94pCZM+BlTHRk1PEywtFBJg5nQHFURsXEkRFG5SKK5lEQEI1Rhwgd5BYJJgQwTSIdJECgc+nLe/7Yq0l1U31JZ1dXV/Xv8zz11K61b+/e6dRbe62111ZEYGZmlqe6SgdgZma1x8nFzMxy5+RiZma5c3IxM7PcObmYmVnunFzMzCx3Ti5WcZLWSlowwLwFktpy2s9KSX+Xx7bGuuJjlfQ3km4ahX3OkhSS6nPebm5/AzZ6nFxs2CQ9JGmHpKcl/VHSNyXtv6/bjYi5EbEyhxCthIj4dkScMNRyks6X9K3RiMlqn5OL7a23RMT+wDzg5cC5FY6n5uV9JWA2GpxcbEQi4o/AjWRJBgBJEyVdLOkPkh6V9DVJjWneQZJ+JOkJSY9L+qWkujTvIUlvSNON6Ypom6TfAa8o3m+qdnlh0edvSvrXND017WNrWv9HkmaUil/SCyX9QtKTkh6TdPUAy/1E0tn9yu6S9DZlviCpPW3nbkkvGc75S8fxQUkb0/4/V3Q+3inp12nbjwPnp/J3S7ovHduNkl5QtL03SlqX4vgyoKJ575T0q6LPcyXdnP4dHpX0fyUtBP4v8FfpyvSutOwBkr4haYukRyT9q6RCmldI/96PSdoIvGmQ4z1H0rX9yr4k6ZI0/a50bNvTOVkyxLkr+TeQPr9Z0p3pb+1/JL20aN4n0nFsl3S/pNcPtB/bN04uNiLpS/skYENR8UXAi8gSzguBw4Dz0ryPAm3ANOBgsi+yUmMPfRo4Mr1OBM7ai7DqgP8CXgAcDuwAvjzAsp8BbgKmAjOA/xhgue8Ai3o/SDo6bf/HwAnAfLJjngL8FfCnvYj3rUALcAxwKvDuonmvAjYC04ELJJ1Gds7eRnYOfwlcmWI6CPge8CngIOAB4NWldiipGfgp8BPgULJ/p1si4ifA/wOujoj9I+JlaZXLga603MvTMfe2W70XeHMqbwHePsixXgmcLGlyiqMAvIPs/AK0p21NBt4FfEHSMYNsr6S0zmXAEuB5wDJgRfrh82LgbOAVEdFM9vf10N7uw4bHycX21g8kbQc2kX0hfBpAksi+bP4xIh6PiO1kX1ZnpPU6gUOAF0REZ0T8MkoPbPcO4IK0jU3AJcMNLCL+FBHfi4iOtP8LgL8cYPFOsiRxaETsjIhfDbDcdcC8oquEvwG+HxG70jaagaMARcR9EbFluPECF6Xj/APwRYqSGLA5Iv4jIroiYgfZl+W/pX10kZ3b3rhOBn4XEddGRGfa1h8H2OebgT9GxOfTcW+PiN+UWlDSwWQ/ID4cEc9ERDvwBfb8m74D+GJEbIqIx4F/G+hAI+Jh4A7gtFT0OqAjIlan+T+OiAci8wuyxP+agU/dgN4LLIuI30REd0RcDuwCjgO6gYnA0ZIaIuKhiHhgBPuwYXBysb11WvrVt4DsS/WgVD4NaALWpOqIJ8h+HU9L8z9HdpVzU6r2OGeA7R9Klrh6PTzcwCQ1SVom6WFJTwGrgCm91Tj9/BNZ1dFtynqrvbvEMqQk9WP2fKGeAXw7zfsZ2ZXRV4BHJS3v/WU+TP2P89AB5kGWCL9UdG4fT/EfRr9zlpJ2//V7zSS7shmOFwANwJai/S4ju5q
i/34Z+t+q+Crwr9lz1YKkkyStTlV1T5AlzINKbGM4MX+0N960rZlkPyI2AB8mq2Zsl3SVpEMH2ZbtAycXG5H06/KbwMWp6DGyaqi5ETElvQ5Ijf+kX8gfjYjZwFuAjwxQ372F7Mug1+H95neQJbFezy+a/ijwYuBVETGZrMoKitofiuL/Y0S8NyIOJbsq+GpxPX4/VwKLJP0F0Aj8vGg7l0TEscBcsuqxjw+wjVL6H+fm4hD7LbsJWFJ0bqdERGNE/A/9zlm6ipxJaZvIqhxLKbXPXcBBRfucHBFz0/yh/q36+y6wIFWpvpWUXCRNJKvWuxg4OCKmANdT4t8tGexvYBPZlW/xeWqKiCsBIuI7EXE8WRIKsqpcKwMnF9sXXwTeKGleRPQAXyerK58OIOkwSSem6Tcra0QX8BRZFUV3iW1eA5yrrHF+BvAP/ebfCfx1akxeSN9qr2ayBPeEpANJVXalSDpdexr7t5F90ZSKB7IvuhcAS8naJHrSNl4h6VWSGoBngJ2DbKOUj6fjnAl8CCjZqSD5Gtl5mZv2fYCk09O8HwNzlXUyqAc+SN8v3GI/Ap4v6cOpHaJZ0qvSvEeBWUodC1IV303A5yVNllQn6UhJvef8GuCDkmZImgoMdDVK2t5WYCVZu9iDEXFfmjWBrLpqK9Al6SSytp2BDPY38HXgfenfRZL2k/SmdJwvlvS6lMx2kv2t7M2/l+0FJxcbsfRlcQXwz6noE2RVX6tTtdRPya4kAOakz08DtwJfHeDeln8hq155kOyL7b/7zf8Q2ZXPE2TtHz8omvdFsiuLx4DVZNVyA3kF8BtJTwMrgA9FxIMDHOcu4PvAGyiqyiFrfP46WXJ6mKwx/2IAZT2wbhhk/wA/BNaQfVn+GPjGQAtGxHVkv7KvSuf2XrL2ECLiMeB04MIUwxzg1wNsZzvwRrJz+EdgPfDaNPu76f1Pku5I02eSffn/Lh3ntWRtZ6RjvxG4i6w95ftDHC9k56/PeUwxfZAsWW0jqzJbMcg2BvwbiIhWsnaXL6dtbQDemWZPJDtHj6Vjn07WScLKQH5YmNnokxTAnNQOYFZzfOViZma5c3IxM7PcuVrMzMxy5ysXMzPL3bgdEO+ggw6KWbNmVToMM7OqsmbNmsciYtpQy43b5DJr1ixaW1srHYaZWVWRNKxRM1wtZmZmuXNyMTOz3Dm5mJlZ7pxczMwsd6OSXCRdpuxpffcWlR2o7Gl469P71FQuSZdI2qDsyX7HFK1zVlp+vaSzisqPlXRPWueSNDiimZlVyGhduXwTWNiv7ByyJ+DNAW5hz4iqJ5ENvDcHWAxcClkyIhvl9lXAK4FP9yaktMziovX678vMLHcr17WzaPlqjr/oZyxavpqV69orHdKYMSrJJSJWkT3cqNipZI9QJb2fVlR+RXoi3Wqyhz0dQvZI0pvTk/u2ATcDC9O8yRFxa3pI0hVF2zIzK4uV69o5b8Va2rfvZEpjA+3bd3LeirVOMEkl21wO7n0kbHrvfbrdYfR9ul1bKhusvK1E+XNIWiypVVLr1q1bczkIMxuflq3aSENBNE2oR8reGwpi2aqNlQ5tTBiLDfql2ktiBOXPLYxYHhEtEdEybdqQN5iamQ1o07YOGhv6PkG7saFA27aOCkU0tlQyuTyaqrRI773Xkm30fXTqDLLHvw5WPqNEuZlZ2cyc2sSOzr4PstzR2c2MqU0DrDG+VDK5rAB6e3ydRfZUvt7yM1OvseOAJ1O12Y3ACemxsFPJHoN6Y5q3XdJxqZfYmUXbMjMriyXzZ9PZHXTs7iIie+/sDpbMn13p0MaEURlbTNKVwALgIEltZL2+LgSukfQe4A9kj2mF7HnlJ5M9nrQDeBdARDwu6TPA7Wm5pRHR20ng/WQ90hqBG9LLzKxsFhw1naVkbS9t2zqYMbWJJfNns+Co6UOuOx6M2+e5tLS0hAeuNDPbO5LWRETLUMuNxQZ9MzOrck4uZmaWOycXMzPLnZOLmZn
lzsnFzMxy5+RiZma5c3IxM7PcObmYmVnunFzMzCx3Ti5mZpY7JxczM8udk4uZmeXOycXMzHLn5GJmZrlzcjEzs9w5uZiZWe6cXMzMLHej8phjGz9Wrmtn2aqNbNrWwUw/9tVs3KrYlYukF0u6s+j1lKQPSzpf0iNF5ScXrXOupA2S7pd0YlH5wlS2QdI5lTkiW7munfNWrKV9+06mNDbQvn0n561Yy8p17ZUOzcxGWcWSS0TcHxHzImIecCzQAVyXZn+hd15EXA8g6WjgDGAusBD4qqSCpALwFeAk4GhgUVrWRtmyVRtpKIimCfVI2XtDQSxbtbHSoZnZKBsr1WKvBx6IiIclDbTMqcBVEbELeFDSBuCVad6GiNgIIOmqtOzvyhyz9bNpWwdTGhv6lDU2FGjb1lGhiMysUsZKg/4ZwJVFn8+WdLekyyRNTWWHAZuKlmlLZQOV2yibObWJHZ3dfcp2dHYzY2pThSIys0qpeHKRNAE4BfhuKroUOBKYB2wBPt+7aInVY5DyUvtaLKlVUuvWrVv3KW57riXzZ9PZHXTs7iIie+/sDpbMn13p0MxslFU8uZC1ldwREY8CRMSjEdEdET3A19lT9dUGzCxabwaweZDy54iI5RHREhEt06ZNy/kwbMFR01l6ylymN0/iyR2dTG+exNJT5rq3mNk4NBbaXBZRVCUm6ZCI2JI+vhW4N02vAL4j6d+BQ4E5wG1kVy5zJB0BPEJWxfbXoxS79bPgqOlOJjYkd1mvfRVNLpKagDcCS4qKPytpHlnV1kO98yJiraRryBrqu4APRER32s7ZwI1AAbgsItaO2kGY2V7p7bLeUFCfLutLwQmmhiiiZPNEzWtpaYnW1tZKh2E27ixavpr27TtpmrDnt23H7i6mN0/iysXHVTAyGw5JayKiZajlxkKbi5mNI5u2ddDYUOhT5i7rtcfJxcxGlbusjw9OLmY2qtxlfXxwcjGzUeUu6+PDWOiKbGbjjLus1z5fuZiZWe6cXMzMLHdOLmZmljsnFzMzy52Ti5mZ5c7JxczMcufkYmZmuXNyMTOz3Dm5mJlZ7pxczMwsd04uZmaWOycXMzPLnZOLmZnlzsnFzMxyV/HkIukhSfdIulNSayo7UNLNktan96mpXJIukbRB0t2Sjinazllp+fWSzqrU8ZiZ2RhILslrI2JeRLSkz+cAt0TEHOCW9BngJGBOei0GLoUsGQGfBl4FvBL4dG9CMjOz0TdWkkt/pwKXp+nLgdOKyq+IzGpgiqRDgBOBmyPi8YjYBtwMLBztoM3MLDMWkksAN0laI2lxKjs4IrYApPfeR9YdBmwqWrctlQ1U3oekxZJaJbVu3bo158MwM7NeY+Exx6+OiM2SpgM3S1o3yLIqURaDlPctiFgOLAdoaWl5znwzM8tHxa9cImJzem8HriNrM3k0VXeR3tvT4m3AzKLVZwCbByk3M7MKqGhykbSfpObeaeAE4F5gBdDb4+ss4IdpegVwZuo1dhzwZKo2uxE4QdLU1JB/QiozG9LKde0sWr6a4y/6GYuWr2bluvahVzKzQVW6Wuxg4DpJvbF8JyJ+Iul24BpJ7wH+AJyelr8eOBnYAHQA7wKIiMclfQa4PS23NCIeH73DsGq1cl07561YS0NBTGlsoH37Ts5bsZalwIKjpg+5vpmVpojx2fTQ0tISra2tlQ7DKmzR8tW0b99J04Q9v7M6dncxvXkSVy4+roKRmY1NktYU3TYyoIq3uZhV0qZtHTQ2FPqUNTYUaNvWUaGIzGqDk4uNazOnNrGjs7tP2Y7ObmZMbapQRGa1wcnFxrUl82fT2R107O4iInvv7A6WzJ9d6dDMqpqTi41rC46aztJT5jK9eRJP7uhkevMklp4y1435Zvuo0r3FzCpuwVHTnUzMcuYrFzMzy52Ti5mZ5c7JxczMcufkYmZmuXODvtWclevaWbZqI5u2dTBzahNL5s92g73ZKPOVi9WU3rHC2rfv7DNWmAejNBt
dTi5WU5at2khDQTRNqEfK3hsKYtmqjZUOzWxccXKxmuKxwszGBre5WE2ZObXpOaMce6wwqza10G7oKxerKR4rzKpdrbQbOrlYTfFYYVbtaqXd0NViVnM8VphVs03bOpjS2NCnrBrbDX3lYmY2htTKM4YqllwkzZT0c0n3SVor6UOp/HxJj0i6M71OLlrnXEkbJN0v6cSi8oWpbIOkcypxPGZmeaiVdsNKVot1AR+NiDskNQNrJN2c5n0hIi4uXljS0cAZwFzgUOCnkl6UZn8FeCPQBtwuaUVE/G5UjsLMLEcLjprOUrK2l7ZtHcyo0t5iFUsuEbEF2JKmt0u6DzhskFVOBa6KiF3Ag5I2AK9M8zZExEYASVelZZ1czKwq1UK74aDJRdJ2IErNAiIiJucRhKRZwMuB3wCvBs6WdCbQSnZ1s40s8awuWq2NPcloU7/yV+URVyXUQv92M7NB21wiojkiJpd4NeeYWPYHvgd8OCKeAi4FjgTmkV3ZfL530VIhDlJeal+LJbVKat26des+x563Wunfbma2V9VikqYDk3o/R8Qf9mXnkhrIEsu3I+L7aZuPFs3/OvCj9LENmFm0+gxgc5oeqLyPiFgOLAdoaWkpmYAqqbh/O0DThHo6dnexbNXGqrl68ZWXmcEwe4tJOkXSeuBB4BfAQ8AN+7JjSQK+AdwXEf9eVH5I0WJvBe5N0yuAMyRNlHQEMAe4DbgdmCPpCEkTyBr9V+xLbJVS7eNi+crLzHoNtyvyZ4DjgN9HxBHA64Ff7+O+Xw38LfC6ft2OPyvpHkl3A68F/hEgItYC15A11P8E+EBEdEdEF3A2cCNwH3BNWrbqVHv/9lq5s9jM9t1wq8U6I+JPkuok1UXEzyVdtC87johfUbq95PpB1rkAuKBE+fWDrVcNVq5rZ9szu3joT8/QUFfHwZMnUl+oq6r+7bVyZ7GZ7bvhXrk8kRreVwHflvQlsvtULAe91UmdPcGMKY0gaHtiJxMKdVU1Lla1X3mZWX6Gm1xOBXaQVVH9BHgAeEu5ghpviquTJjdOYM70ZmY9r4kpTROqJrFA7dxZbGb7bljVYhHxTNHHy8sUy7hV7uqk0erBVSt3FpvZvhtWcul3M+UEoAF4Jq97XapRnl/Y5XzAVW+VW0NBfXpwLYWyJRgnEzMbVrVYv5spJwH/G/hyeUMbu/LuclvO6iT34DKzShjRqMgR8QPgdTnHUjXy/sIu5wOuqv3eGTOrTsOtFntb0cc6oIUBhlgZD8rRRlKu6iQ/U97MKmG4Vy5vKXqdCGwn60E2LlVTl1v34DKzShhub7F3lTuQarJk/mzOW7GWjt1dNDYU2NHZPWa/sN2Dy8wqYagh9/+DQaq/IuKDuUdUBartC9s9uMxstA115dKa3l8NHA1cnT6fDqwpV1DVwF/YZmYDGzS5RMTlAJLeCbw2IjrT568BN5U9unFsvA1dP96O16zWDXfgykOBZuDx9Hn/VGZlMJwbH2vpy3i0b/Q0s/IbbnK5EPitpJ+nz38JnF+WiGrESL/8V65r54NX/ZaO3d1MrK/joP0nMrmxoc9Dw2rty7gWHpJmZn0N9w79/yJ7Lv116fUXvVVm9lwjvYO/d71ndndRqIOu7mDzkzt4akdnn/toau2ue9/oaVZ7Bk0uko5K78eQVYNtSq9DU5mVMNIv/971JtUXAFFXJ+oQjz29q899NLX2ZVxN9w2Z2fAMVS32EWAx8PkS84JxPATMYEZ6B3/vetOaJ7L5iZ30EKBgV1f0uY+m1u66r6b7hsxseAa9comIxen9tSVeTiwDGOkv8d71mic1cOiUSdTXia6eoGlCoc9YY7V21305x1Yzs8pQxNBDhEk6HfhJRGyX9CngGOAzEfHbcgdYLi0tLdHa2jr0giOwcl07H7/2Lrbv7KKrp4f6ujqaJ9Xzube/bNAvzOKG+uJf8KW+aHs7DFTDTZxmVjskrYmIlqGWG25vsX+OiO9KOp5sbLGLga+RNfK
PCZIWAl8CCsB/RsSF5dzfUL3BAkAgCTS8UT4XHDWdt7c9wX/+6kGe2d3NfhMK/N3xR5RMGr6J08zGsuEml946njcBl0bEDyWdX56Q9p6kAvAV4I1AG3C7pBUR8bty7G+orsDLVm3kgMYGDjmg8dl1htO1duW6dq694xGmNU/k8HTlcu0dj/DSGVP6rFdL97iYWW0a7qjIj0haBrwDuF7SxL1YdzS8EtgQERsjYjdwFWUctXmo3mAj7c01nF5meT+ozMysHIZ75fIOYCFwcUQ8IekQ4OPlC2uvHUbWRbpXGyWq7CQtJuv9xuGHHz7inQ3VG2yw3lwr17Vz4Q33sWHrM3T3BPUFceRB+3HOSX82rF5mvuHQzKrBcG+i7ADageNTURewvlxBjYBKlD2nmSMilkdES0S0TJs2bcQ7G6o32EC9uf5i9oF87Nq7WN/+NF09QQCd3cH69qf52LV30TyxfsheZrV2j4uZ1aZhJRdJnwY+AZybihqAb5UrqBFoA2YWfZ4BbC7XzobqCty/a+2EQh1NDXV8ZeUDPP7Mbrojy4aprZ+egKd3Zdsaqouxbzg0s2ow3HaTtwKnAM8ARMRmsoEsx4rbgTmSjpA0ATgDWFGunQ12X8bKde0sWr6aT/3wXgBOP3YGz+zuprMn6O7poaf/9VTqSdbdEzyzu3vI+z1q7R6XfdF7ro+/6GcsWr7a7U5mY8hw21x2R0RIynrYSvuVMaa9FhFdks4GbiTrinxZRKzNez9D9dIq1YvsKysf4MD9GjigcRIT6wt0d3YTkSUUAaSrmEKdmDG1acguxtX2oLJyqbXBO81qzXCvXK5JvcWmSHov8FPgP8sX1t6LiOsj4kURcWREXJD39ofTS6tUb6/unuDJjk4ApjVP7HPCe5NMAJ1dPaxv375Xv8CHc+/McI+t2q4Aam3wTrNaM9wG/YuBa4HvAS8GzouIS8oZ2FgznC+zUo3tE+vr2NXdA0DzpAZmHthEoVT3A2C/CYUhuxbn3RW5Wrs2u2OD2dg27HtVIuLmiPh4RHwM+JmkvyljXGPOcL7MSjW2N0+qp76u7tk2kkKdqC/U8fzJE/nzww5gvwkFJhbqqC/U8djTu4f8BZ73L/ZqvQJwxwazsW2oIfcnSzpX0pclnaDM2cBGsntfxo3hfJmVamyfUF/gAwuOpKFOrG9/mrZtO+juCSbWZ6d+d3dP1mtM2TQM/gs871/s1XoF4I4NZmPbUFcu/01WDXYP8HfATcDpwKkRUbY74Mei4XyZDdSL7KUzptDR2cOMqY3Mmb4/9XXikSd28lTqphyRtb8UJDZufZr7tjzF5id3cuxnbnpOG0jev9ir9QrAIymbjW2Djoos6Z6I+PM0XQAeAw6PiO2jFF/ZjGRU5JGORLxo+eo+d+xv39lJ27Yd1BfEwc0TeeSJnTz77yDo7oE6ZT3InrffBCbUF/p0dR7uyMnDPaY8t2dmtS2vUZE7eyciolvSg7WQWEZqpCMR9x/WpXlSA4dNCf741C56Al44bT/+sG0Hu7t66Imgvg4aCgV6eoLtO7t4/gH1zw7vkndXZHdtNrNyGCq5vEzSU2laQGP6LCAiYnJZo6sRpcYaqy/UcczhU7ly8XEAHH/Rz5jS2MD9j26nUJd1J+tth+nfBpL3cPsevt/M8jbUkygLETE5vZojor5o2ollmIbTXtPb9tHbBgNZO8yEQl1VtIGYmRUbS8Pm16yhGp9Xrmtn2zO7eOhPz9DZ3UNXdw9dPT30EDRPqncvKDOrOsMd/sX20UBVT8UN6jOmNPLo9l109QQNabDLIw7a320gZlZ1nFwqrP/zWSY3TqBjdxfTmyc92x5jZlZtXC1WYdV6E6OZ2WCcXCqsWm9iNDMbjJNLhXkYEzOrRU4uFeZhTMysFrlBfwzwTYxmVmt85WJmZrlzcjEzs9xVpFpM0ueAtwC7gQeAd0XEE5JmAfcB96dFV0fE+9I6xwLfBBqB64EPRURIOhC4GpgFPAS
8IyK2jdaxWD56R5zetK2DmR4806zqVerK5WbgJRHxUuD3wLlF8x6IiHnp9b6i8kuBxcCc9FqYys8BbomIOcAt6bNVkWp91LKZDawiySUiboqIrvRxNTBjsOUlHQJMjohbI3vwyRXAaWn2qcDlafryonIb41aua2fR8tUs+dYa2p/aSXdPVNWjls1sYGOhzeXdwA1Fn4+Q9FtJv5D0mlR2GNBWtExbKgM4OCK2AKT3AetSJC2W1CqpdevWrfkdge214quVngh6Itj8xE6278weIeRRCsyqW9naXCT9FHh+iVmfjIgfpmU+CXQB307ztpA96fJPqY3lB5Lmkj0/pr+BH6E5gIhYDiyH7EmUe7u+5ad4TLUJhTq6ugMEW7fvonlSg0cpMKtyZUsuEfGGweZLOgt4M/D6VNVFROwCdqXpNZIeAF5EdqVSXHU2A9icph+VdEhEbEnVZ66orwLFT+c8aP+JbH5yBwrY1dXjUQrMakBFqsUkLQQ+AZwSER1F5dMkFdL0bLKG+42pumu7pOMkCTgT+GFabQVwVpo+q6jcxrDiMdUmNzZw6AGN1NWJQl2dRykwqwGVukP/y8BE4OYsVzzb5Xg+sFRSF9ANvC8iHk/rvJ89XZFvYE87zYXANZLeA/wBOH20DsJGbsn82Zy3Yi0du7tobChQX5CTilkNUcT4bHpoaWmJ1tbWSocxrvXe29K2rYMZ4/TeFt/fY9VG0pqIaBlqOY8tZhUz3sdUK34KafH9PUthXJ8Xqw1joSuy2bhU3GPO9/dYrXFyMasQP4XUapmTi1mF+CmkVsucXMwqxE8htVrm5GJWIX4KqdUy9xYzq6Dx3mPOapevXMzMLHdOLmZmljsnFzMzy52Ti5mZ5c7JxczMcufkYmZmuXNyMTOz3Dm5mJlZ7pxczMwsd04uZmaWOycXMzPLnZOLmZnlriLJRdL5kh6RdGd6nVw071xJGyTdL+nEovKFqWyDpHOKyo+Q9BtJ6yVdLWnCaB+PmZn1Vckrly9ExLz0uh5A0tHAGcBcYCHwVUkFSQXgK8BJwNHAorQswEVpW3OAbcB7RvtAzMysr7FWLXYqcFVE7IqIB4ENwCvTa0NEbIyI3cBVwKmSBLwOuDatfzlwWgXiNjOzIpVMLmdLulvSZZKmprLDgE1Fy7SlsoHKnwc8ERFd/cpLkrRYUquk1q1bt+Z1HGZm1k/Zkoukn0q6t8TrVOBS4EhgHrAF+HzvaiU2FSMoLykilkdES0S0TJs2ba+Ox8z+hITMAAANcklEQVTMhq9sT6KMiDcMZzlJXwd+lD62ATOLZs8ANqfpUuWPAVMk1aerl+LlzUZk5bp2lq3ayKZtHcyc2sSS+bP9tEizvVSp3mKHFH18K3Bvml4BnCFpoqQjgDnAbcDtwJzUM2wCWaP/iogI4OfA29P6ZwE/HI1jsNq0cl07561YS/v2nUxpbKB9+07OW7GWlevaKx2aWVWpVJvLZyXdI+lu4LXAPwJExFrgGuB3wE+AD0REd7oqORu4EbgPuCYtC/AJ4COSNpC1wXxjdA/FasmyVRtpKIimCfVI2XtDQSxbtbHSoZlVlbJViw0mIv52kHkXABeUKL8euL5E+Uay3mRm+2zTtg6mNDb0KWtsKNC2raNCEZlVp7HWFdmsomZObWJHZ3efsh2d3cyY2lShiMyqk5OLWZEl82fT2R107O4iInvv7A6WzJ9d6dDMqoqTi1mRBUdNZ+kpc5nePIknd3QyvXkSS0+Z695iZnupIm0uZmPZgqOmO5mY7SNfuZiZWe6cXMzMLHdOLmZmljsnFzMzy52Ti5mZ5c7JxczMcufkYmZmuXNyMTOz3Dm5mJlZ7pxczMwsd04uZmaWOycXMzPLnZOLmZnlzsnFzMxyV5HkIulqSXem10OS7kzlsyTtKJr3taJ1jpV0j6QNki6RpFR+oKSbJa1P71MrcUxmZrZHRZJLRPxVRMyLiHnA94DvF81+oHdeRLyvqPxSYDEwJ70
WpvJzgFsiYg5wS/psZmYVVNFqsXT18Q7gyiGWOwSYHBG3RkQAVwCnpdmnApen6cuLys3MrEIq3ebyGuDRiFhfVHaEpN9K+oWk16Syw4C2omXaUhnAwRGxBSC9D/gIQUmLJbVKat26dWt+R2FmZn2U7THHkn4KPL/ErE9GxA/T9CL6XrVsAQ6PiD9JOhb4gaS5gEpsJ/Y2pohYDiwHaGlp2ev1zcxseMqWXCLiDYPNl1QPvA04tmidXcCuNL1G0gPAi8iuVGYUrT4D2JymH5V0SERsSdVn7fkdhZmZjUQlq8XeAKyLiGeruyRNk1RI07PJGu43puqu7ZKOS+00ZwK9Vz8rgLPS9FlF5WZmViFlu3IZhjN4bkP+fGCppC6gG3hfRDye5r0f+CbQCNyQXgAXAtdIeg/wB+D0MsdtZlZ1Vq5rZ9mqjWza1sHMqU0smT+bBUcN2ES9z5R1vhp/WlpaorW1tdJhmJmV3cp17Zy3Yi0NBdHYUGBHZzed3cHSU+budYKRtCYiWoZartK9xczMrMyWrdpIQ0E0TahHyt4bCmLZqo1l26eTi5lZjdu0rYPGhkKfssaGAm3bOsq2TycXM7MaN3NqEzs6u/uU7ejsZsbUprLt08nFzKzGLZk/m87uoGN3FxHZe2d3sGT+7LLt08nFzKzGLThqOktPmcv05kk8uaOT6c2TRtSYvzcq2RXZzMxGyYKjppc1mfTnKxczM8udk4uZmeXOycXMzHLn5GJmZrlzcjEzs9yN27HFJG0FHi4qOgh4rELhjFS1xVxt8YJjHg3VFi9UX8x5xvuCiJg21ELjNrn0J6l1OIOxjSXVFnO1xQuOeTRUW7xQfTFXIl5Xi5mZWe6cXMzMLHdOLnssr3QAI1BtMVdbvOCYR0O1xQvVF/Oox+s2FzMzy52vXMzMLHdOLmZmlruaTy6SHpJ0j6Q7JbWmsgMl3SxpfXqfmsol6RJJGyTdLemYou2clZZfL+msHOO7TFK7pHuLynKLT9Kx6fg3pHVVppjPl/RIOs93Sjq5aN65af/3SzqxqHxhKtsg6Zyi8iMk/SYdy9WSJuxjvDMl/VzSfZLWSvpQKh+z53mQmMfkeZY0SdJtku5K8f7LYPuQNDF93pDmzxrpcZQh5m9KerDoHM9L5RX/u0jbLEj6raQfpc9j8xxHRE2/gIeAg/qVfRY4J02fA1yUpk8GbgAEHAf8JpUfCGxM71PT9NSc4psPHAPcW474gNuAv0jr3ACcVKaYzwc+VmLZo4G7gInAEcADQCG9HgBmAxPSMkenda4BzkjTXwPev4/xHgIck6abgd+nuMbseR4k5jF5ntNx75+mG4DfpHNXch/A3wNfS9NnAFeP9DjKEPM3gbeXWL7ifxdpmx8BvgP8aLB/x0qf45q/chnAqcDlafpy4LSi8isisxqYIukQ4ETg5oh4PCK2ATcDC/MIJCJWAY+XI740b3JE3BrZX9UVRdvKO+aBnApcFRG7IuJBYAPwyvTaEBEbI2I3cBVwavpl9zrg2rR+8fGPNN4tEXFHmt4O3Accxhg+z4PEPJCKnud0rp5OHxvSKwbZR/G5vxZ4fYppr45jpPEOEfNAKv53IWkG8CbgP9Pnwf4dK3qOx0NyCeAmSWskLU5lB0fEFsj+EwO9T9A5DNhUtG5bKhuovFzyiu+wNN2/vFzOTtUFlylVMQ0RW6ny5wFPRERXOWJOVQMvJ/uVWhXnuV/MMEbPc6quuRNoJ/uCfWCQfTwbV5r/ZIppVP8P9o85InrP8QXpHH9B0sT+MQ8ztnL8XXwR+CegJ30e7N+xoud4PCSXV0fEMcBJwAckzR9k2VL1oTFI+Wjb2/hGM+5LgSOBecAW4POpfMzELGl/4HvAhyPiqcEW3cvYRjPmMXueI6I7IuYBM8h+Bf/ZIPuoeLzw3JglvQQ4FzgKeAVZVdcnxkLMkt4MtEfEmuLiQfZR0XhrPrlExOb03g5cR/ZH/2i6ZCW9t6fF24CZRavPADY
PUl4uecXXlqb7l+cuIh5N/1F7gK+TneeRxPwYWXVDfb/yfSKpgexL+tsR8f1UPKbPc6mYx/p5TjE+Aawka5cYaB/PxpXmH0BW1VqR/4NFMS9MVZIREbuA/2Lk5zjvv4tXA6dIeoisyup1ZFcyY/Mcj7SxphpewH5Ac9H0/5C1lXyOvg25n03Tb6Jvg91tsafB7kGyxrqpafrAHOOcRd/G8dziA25Py/Y2KJ5cppgPKZr+R7I6XYC59G083EjWcFifpo9gT+Ph3LTOd+nbQPn3+xiryOq7v9ivfMye50FiHpPnGZgGTEnTjcAvgTcPtA/gA/RtbL5mpMdRhpgPKfo3+CJw4Vj5uyiKfQF7GvTH5DnO5ctxrL7Iej3clV5rgU+m8ucBtwDr03vvH4KAr5DVFd8DtBRt691kDV8bgHflGOOVZNUbnWS/HN6TZ3xAC3BvWufLpFEZyhDzf6eY7gZW0PdL8JNp//dT1FuGrPfN79O8T/b7d7stHct3gYn7GO/xZJf3dwN3ptfJY/k8DxLzmDzPwEuB36a47gXOG2wfwKT0eUOaP3ukx1GGmH+WzvG9wLfY06Os4n8XRdtdwJ7kMibPsYd/MTOz3NV8m4uZmY0+JxczM8udk4uZmeXOycXMzHLn5GJmZrlzcrFxR1J3Gu32XknfldS0D9taUDQ67SmDjSQraYqkvx/BPs6X9LGRxpj3dsyGw8nFxqMdETEvIl4C7AbeVzwzDa2+1/83ImJFRFw4yCJTyEaqNat5Ti423v0SeKGkWcqenfJV4A5gpqQTJN0q6Y50hbM/PPvMi3WSfgW8rXdDkt4p6ctp+mBJ1yl7Vshdkv4XcCFwZLpq+lxa7uOSbk+DJP5L0bY+mZ6r8VPgxf2DlnSAsmcV1aXPTZI2SWqQ9N60zbskfa/UlZmklZJa0vRBaUiR3oEcP1cU05JUfoikVUVXfK/J4+Rb7XJysXErjbd0Etnd1pB9iV8RES8HngE+BbwhsoFPW4GPSJpENqbXW4DXAM8fYPOXAL+IiJeRPftmLdkQMw+kq6aPSzoBmEM2dtU84FhJ8yUdSzZcx8vJktcr+m88Ip4kG3niL1PRW4AbI6IT+H5EvCLt+z6yERSG6z3AkxHxirTf90o6AvjrtP15wMvIRgwwG1D90IuY1ZzGNMw6ZFcu3wAOBR6O7DkdkI0HdTTw6+wRGEwAbiUbLffBiFgPIOlbwGKe63XAmZCNvAs8WTQ8fq8T0uu36fP+ZMmmGbguIjrSPlYMcBxXA38F/JwsGX01lb9E0r+SVcPtD9w42MkoEdNLJb09fT4gxXQ7cFkaTPMHEeHkYoNycrHxaEf6Bf6slECeKS4ie77Hon7LzSO/xxYI+LeIWNZvHx8e5j5WAP8m6UDgWLIxsSB7kuJpEXGXpHeSjUPVXxd7ai4m9YvpHyLiOQkpPa7iTcB/S/pcRFwxjBhtnHK1mFlpq4FXS3ohPNum8SJgHXCEpCPTcosGWP8W4P1p3YKkycB2squSXjcC7y5qyzlM0nRgFfBWSY2SmsmqvJ4jsqco3gZ8iWwQw+40qxnYkq4y/maA+B4iS0gAby8qvxF4f1oXSS+StJ+kF5A9S+TrZFd6x2A2CF+5mJUQEVvTr/4rtedJhJ+KiN8re6LpjyU9BvwKeEmJTXwIWC7pPUA32XPNb5X0a0n3Ajekdpc/A25NV05PA/8nIu6QdDVZu8bDZFV3A7mabOTbBUVl/0z21MqHydqTmp+7GhcD10j6W/Zc8UD2+NxZwB3KgtpK9tjcBcDHJXWmOM8cJCYzj4psZmb5c7WYmZnlzsnFzMxy5+RiZma5c3IxM7PcObmYmVnunFzMzCx3Ti5mZpa7/w8n6Gpj+dexmwAAAABJRU5ErkJggg==\n", - "text/plain": [ - "" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + 
"outputs": [], "source": [ "y_score_untransform = np.exp(y_score)\n", "y_test_untransform = np.exp(y_test)\n", @@ -1214,7 +488,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Notice that the untransformed residuals show a definite trend. The dispursion of the residuals has a cone-like pattern increasing to the right. The regression model seems to do a good job of predicting the price of low cost cars, but becomes progressively worse as the price of the car increases. " + "Notice that the untransformed residuals show a definite trend. The dispersion of the residuals has a cone-like pattern increasing to the right. The regression model seems to do a good job of predicting the price of low cost cars, but becomes progressively worse as the price of the car increases. " ] }, { @@ -1228,10 +502,10 @@ "2. Aggregated categories of a categorical variable to improve the statistical representation. \n", "3. Scaled the numeric features. \n", "4. Recoded the categorical features as binary dummy variables. \n", - "5. Fit the linear regression model using Scikit-learn. \n", + "5. Fit the linear regression model using scikit-learn. \n", "6. Evaluated the performance of the model using both numeric and graphical methods. \n", "\n", - "It is clear from the outcome of the performance evaluation that this model needs to be improved. As it is, the model shows poor generalizatiion. " + "It is clear from the outcome of the performance evaluation that this model needs to be improved. As it is, the model shows poor generalization. 
" ] }, { diff --git a/Module4/Auto_Data_Preped.csv b/Module4/Auto_Data_Preped.csv new file mode 100644 index 0000000..06ab859 --- /dev/null +++ b/Module4/Auto_Data_Preped.csv @@ -0,0 +1,196 @@ +symboling,make,fuel_type,aspiration,num_of_doors,body_style,drive_wheels,engine_location,wheel_base,length,width,height,curb_weight,engine_type,num_of_cylinders,engine_size,fuel_system,bore,stroke,compression_ratio,horsepower,peak_rpm,city_mpg,highway_mpg,price,log_price +3,alfa-romero,gas,std,two,hardtop_convert,rwd,front,88.6,168.8,64.1,48.8,2548,dohc,three_four,130,mpfi,3.47,2.68,9.0,111,5000,21,27,13495,9.510074525452104 +3,alfa-romero,gas,std,two,hardtop_convert,rwd,front,88.6,168.8,64.1,48.8,2548,dohc,three_four,130,mpfi,3.47,2.68,9.0,111,5000,21,27,16500,9.711115659888671 +1,alfa-romero,gas,std,two,hatchback,rwd,front,94.5,171.2,65.5,52.4,2823,ohcv,five_six,152,mpfi,2.68,3.47,9.0,154,5000,19,26,16500,9.711115659888671 +2,audi,gas,std,four,sedan,fwd,front,99.8,176.6,66.2,54.3,2337,ohc,three_four,109,mpfi,3.19,3.4,10.0,102,5500,24,30,13950,9.543234787249512 +2,audi,gas,std,four,sedan,4wd,front,99.4,176.6,66.4,54.3,2824,ohc,five_six,136,mpfi,3.19,3.4,8.0,115,5500,18,22,17450,9.767094927630573 +2,audi,gas,std,two,sedan,fwd,front,99.8,177.3,66.3,53.1,2507,ohc,five_six,136,mpfi,3.19,3.4,8.5,110,5500,19,25,15250,9.632334782035558 +1,audi,gas,std,four,sedan,fwd,front,105.8,192.7,71.4,55.7,2844,ohc,five_six,136,mpfi,3.19,3.4,8.5,110,5500,19,25,17710,9.781884730776879 +1,audi,gas,std,four,wagon,fwd,front,105.8,192.7,71.4,55.7,2954,ohc,five_six,136,mpfi,3.19,3.4,8.5,110,5500,19,25,18920,9.847974842605868 +1,audi,gas,turbo,four,sedan,fwd,front,105.8,192.7,71.4,55.9,3086,ohc,five_six,131,mpfi,3.13,3.4,8.3,140,5500,17,20,23875,10.08058716534893 +2,bmw,gas,std,two,sedan,rwd,front,101.2,176.8,64.8,54.3,2395,ohc,three_four,108,mpfi,3.5,2.8,8.8,101,5800,23,29,16430,9.706864211031315 
+0,bmw,gas,std,four,sedan,rwd,front,101.2,176.8,64.8,54.3,2395,ohc,three_four,108,mpfi,3.5,2.8,8.8,101,5800,23,29,16925,9.736547097780475 +0,bmw,gas,std,two,sedan,rwd,front,101.2,176.8,64.8,54.3,2710,ohc,five_six,164,mpfi,3.31,3.19,9.0,121,4250,21,28,20970,9.950848123895966 +0,bmw,gas,std,four,sedan,rwd,front,101.2,176.8,64.8,54.3,2765,ohc,five_six,164,mpfi,3.31,3.19,9.0,121,4250,21,28,21105,9.9572652582166 +1,bmw,gas,std,four,sedan,rwd,front,103.5,189.0,66.9,55.7,3055,ohc,five_six,164,mpfi,3.31,3.19,9.0,121,4250,20,25,24565,10.109077944602749 +0,bmw,gas,std,four,sedan,rwd,front,103.5,189.0,66.9,55.7,3230,ohc,five_six,209,mpfi,3.62,3.39,8.0,182,5400,16,22,30760,10.333970423619581 +0,bmw,gas,std,two,sedan,rwd,front,103.5,193.8,67.9,53.7,3380,ohc,five_six,209,mpfi,3.62,3.39,8.0,182,5400,16,22,41315,10.628980909135285 +0,bmw,gas,std,four,sedan,rwd,front,110.0,197.0,70.9,56.3,3505,ohc,five_six,209,mpfi,3.62,3.39,8.0,182,5400,15,20,36880,10.51542467767053 +2,chevrolet,gas,std,two,hatchback,fwd,front,88.4,141.1,60.3,53.2,1488,l,three_four,61,2bbl,2.91,3.03,9.5,48,5100,47,53,5151,8.546946149565585 +1,chevrolet,gas,std,two,hatchback,fwd,front,94.5,155.9,63.6,52.0,1874,ohc,three_four,90,2bbl,3.03,3.11,9.6,70,5400,38,43,6295,8.747510946478448 +0,chevrolet,gas,std,four,sedan,fwd,front,94.5,158.8,63.6,52.0,1909,ohc,three_four,90,2bbl,3.03,3.11,9.6,70,5400,38,43,6575,8.791029857045965 +1,dodge,gas,std,two,hatchback,fwd,front,93.7,157.3,63.8,50.8,1876,ohc,three_four,90,2bbl,2.97,3.23,9.41,68,5500,37,41,5572,8.625509334899697 +1,dodge,gas,std,two,hatchback,fwd,front,93.7,157.3,63.8,50.8,1876,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,31,38,6377,8.760453046315272 +1,dodge,gas,turbo,two,hatchback,fwd,front,93.7,157.3,63.8,50.8,2128,ohc,three_four,98,mpfi,3.03,3.39,7.6,102,5500,24,30,7957,8.981807323377534 +1,dodge,gas,std,four,hatchback,fwd,front,93.7,157.3,63.8,50.6,1967,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,31,38,6229,8.736971085254146 
+1,dodge,gas,std,four,sedan,fwd,front,93.7,157.3,63.8,50.6,1989,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,31,38,6692,8.808668062106715 +1,dodge,gas,std,four,sedan,fwd,front,93.7,157.3,63.8,50.6,1989,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,31,38,7609,8.937087036176523 +1,dodge,gas,turbo,?,sedan,fwd,front,93.7,157.3,63.8,50.6,2191,ohc,three_four,98,mpfi,3.03,3.39,7.6,102,5500,24,30,8558,9.054621796976763 +-1,dodge,gas,std,four,wagon,fwd,front,103.3,174.6,64.6,59.8,2535,ohc,three_four,122,2bbl,3.34,3.46,8.5,88,5000,24,30,8921,9.096163326913784 +3,dodge,gas,turbo,two,hatchback,fwd,front,95.9,173.2,66.3,50.2,2811,ohc,three_four,156,mfi,3.6,3.9,7.0,145,5000,19,24,12964,9.469931564261438 +2,honda,gas,std,two,hatchback,fwd,front,86.6,144.6,63.9,50.8,1713,ohc,three_four,92,1bbl,2.91,3.41,9.6,58,4800,49,54,6479,8.776321456449958 +2,honda,gas,std,two,hatchback,fwd,front,86.6,144.6,63.9,50.8,1819,ohc,three_four,92,1bbl,2.91,3.41,9.2,76,6000,31,38,6855,8.832733591996416 +1,honda,gas,std,two,hatchback,fwd,front,93.7,150.0,64.0,52.6,1837,ohc,three_four,79,1bbl,2.91,3.07,10.1,60,5500,38,42,5399,8.593969030218288 +1,honda,gas,std,two,hatchback,fwd,front,93.7,150.0,64.0,52.6,1940,ohc,three_four,92,1bbl,2.91,3.41,9.2,76,6000,30,34,6529,8.784009071186633 +1,honda,gas,std,two,hatchback,fwd,front,93.7,150.0,64.0,52.6,1956,ohc,three_four,92,1bbl,2.91,3.41,9.2,76,6000,30,34,7129,8.871926251117628 +0,honda,gas,std,four,sedan,fwd,front,96.5,163.4,64.0,54.5,2010,ohc,three_four,92,1bbl,2.91,3.41,9.2,76,6000,30,34,7295,8.894944460956886 +0,honda,gas,std,four,wagon,fwd,front,96.5,157.1,63.9,58.3,2024,ohc,three_four,92,1bbl,2.92,3.41,9.2,76,6000,30,34,7295,8.894944460956886 +0,honda,gas,std,two,hatchback,fwd,front,96.5,167.5,65.2,53.3,2236,ohc,three_four,110,1bbl,3.15,3.58,9.0,86,5800,27,33,7895,8.973984926689743 +0,honda,gas,std,two,hatchback,fwd,front,96.5,167.5,65.2,53.3,2289,ohc,three_four,110,1bbl,3.15,3.58,9.0,86,5800,27,33,9095,9.115480090952223 
+0,honda,gas,std,four,sedan,fwd,front,96.5,175.4,65.2,54.1,2304,ohc,three_four,110,1bbl,3.15,3.58,9.0,86,5800,27,33,8845,9.087607606593886 +0,honda,gas,std,four,sedan,fwd,front,96.5,175.4,62.5,54.1,2372,ohc,three_four,110,1bbl,3.15,3.58,9.0,86,5800,27,33,10295,9.23941361946189 +0,honda,gas,std,four,sedan,fwd,front,96.5,175.4,65.2,54.1,2465,ohc,three_four,110,mpfi,3.15,3.58,9.0,101,5800,24,28,12945,9.468464892185638 +1,honda,gas,std,two,sedan,fwd,front,96.5,169.1,66.0,51.0,2293,ohc,three_four,110,2bbl,3.15,3.58,9.1,100,5500,25,31,10345,9.244258590179644 +0,isuzu,gas,std,four,sedan,rwd,front,94.3,170.7,61.8,53.5,2337,ohc,three_four,111,2bbl,3.31,3.23,8.5,78,4800,24,29,6785,8.82246957226897 +2,isuzu,gas,std,two,hatchback,rwd,front,96.0,172.6,65.2,51.4,2734,ohc,three_four,119,spfi,3.43,3.23,9.2,90,5000,24,29,11048,9.310004695089129 +0,jaguar,gas,std,four,sedan,rwd,front,113.0,199.6,69.6,52.8,4066,dohc,five_six,258,mpfi,3.63,4.17,8.1,176,4750,15,19,32250,10.381273322223919 +0,jaguar,gas,std,four,sedan,rwd,front,113.0,199.6,69.6,52.8,4066,dohc,five_six,258,mpfi,3.63,4.17,8.1,176,4750,15,19,35550,10.478695435231387 +0,jaguar,gas,std,two,sedan,rwd,front,102.0,191.7,70.6,47.8,3950,ohcv,eight_twelve,326,mpfi,3.54,2.76,11.5,262,5000,13,17,36000,10.491274217438248 +1,mazda,gas,std,two,hatchback,fwd,front,93.1,159.1,64.2,54.1,1890,ohc,three_four,91,2bbl,3.03,3.15,9.0,68,5000,30,31,5195,8.555451903533328 +1,mazda,gas,std,two,hatchback,fwd,front,93.1,159.1,64.2,54.1,1900,ohc,three_four,91,2bbl,3.03,3.15,9.0,68,5000,31,38,6095,8.715224041915372 +1,mazda,gas,std,two,hatchback,fwd,front,93.1,159.1,64.2,54.1,1905,ohc,three_four,91,2bbl,3.03,3.15,9.0,68,5000,31,38,6795,8.823942326585245 +1,mazda,gas,std,four,sedan,fwd,front,93.1,166.8,64.2,54.1,1945,ohc,three_four,91,2bbl,3.03,3.15,9.0,68,5000,31,38,6695,8.809116258125274 +1,mazda,gas,std,four,sedan,fwd,front,93.1,166.8,64.2,54.1,1950,ohc,three_four,91,2bbl,3.08,3.15,9.0,68,5000,31,38,7395,8.9085593751449 
+1,mazda,gas,std,two,hatchback,fwd,front,98.8,177.8,66.5,53.7,2385,ohc,three_four,122,2bbl,3.39,3.39,8.6,84,4800,26,32,8845,9.087607606593886 +0,mazda,gas,std,four,sedan,fwd,front,98.8,177.8,66.5,55.5,2410,ohc,three_four,122,2bbl,3.39,3.39,8.6,84,4800,26,32,8495,9.047233034106032 +1,mazda,gas,std,two,hatchback,fwd,front,98.8,177.8,66.5,53.7,2385,ohc,three_four,122,2bbl,3.39,3.39,8.6,84,4800,26,32,10595,9.2681374707024 +0,mazda,gas,std,four,sedan,fwd,front,98.8,177.8,66.5,55.5,2410,ohc,three_four,122,2bbl,3.39,3.39,8.6,84,4800,26,32,10245,9.234545060673 +0,mazda,diesel,std,?,sedan,fwd,front,98.8,177.8,66.5,55.5,2443,ohc,three_four,122,idi,3.39,3.39,22.7,64,4650,36,42,10795,9.286838342948908 +0,mazda,gas,std,four,hatchback,fwd,front,98.8,177.8,66.5,55.5,2425,ohc,three_four,122,2bbl,3.39,3.39,8.6,84,4800,26,32,11245,9.327678864393416 +0,mazda,gas,std,four,sedan,rwd,front,104.9,175.0,66.1,54.4,2670,ohc,three_four,140,mpfi,3.76,3.16,8.0,120,5000,19,27,18280,9.813562845008141 +0,mazda,diesel,std,four,sedan,rwd,front,104.9,175.0,66.1,54.4,2700,ohc,three_four,134,idi,3.43,3.64,22.0,72,4200,31,39,18344,9.81705782453774 +-1,mercedes-benz,diesel,turbo,four,sedan,rwd,front,110.0,190.9,70.3,56.5,3515,ohc,five_six,183,idi,3.58,3.64,21.5,123,4350,22,25,25552,10.148470870454794 +-1,mercedes-benz,diesel,turbo,four,wagon,rwd,front,110.0,190.9,70.3,58.7,3750,ohc,five_six,183,idi,3.58,3.64,21.5,123,4350,22,25,28248,10.248777937608223 +0,mercedes-benz,diesel,turbo,two,hardtop_convert,rwd,front,106.7,187.5,70.3,54.9,3495,ohc,five_six,183,idi,3.58,3.64,21.5,123,4350,22,25,28176,10.246225830735987 +-1,mercedes-benz,diesel,turbo,four,sedan,rwd,front,115.6,202.6,71.7,56.3,3770,ohc,five_six,183,idi,3.58,3.64,21.5,123,4350,22,25,31600,10.360912399575003 +-1,mercedes-benz,gas,std,four,sedan,rwd,front,115.6,202.6,71.7,56.5,3740,ohcv,eight_twelve,234,mpfi,3.46,3.1,8.3,155,4750,16,18,34184,10.439512977323862 
+3,mercedes-benz,gas,std,two,hardtop_convert,rwd,front,96.6,180.3,70.5,50.8,3685,ohcv,eight_twelve,234,mpfi,3.46,3.1,8.3,155,4750,16,18,35056,10.464702061835247 +0,mercedes-benz,gas,std,four,sedan,rwd,front,120.9,208.1,71.7,56.7,3900,ohcv,eight_twelve,308,mpfi,3.8,3.35,8.0,184,4500,14,16,40960,10.62035125971339 +1,mercedes-benz,gas,std,two,hardtop_convert,rwd,front,112.0,199.2,72.0,55.4,3715,ohcv,eight_twelve,304,mpfi,3.8,3.35,8.0,184,4500,14,16,45400,10.72326738402944 +1,mercury,gas,turbo,two,hatchback,rwd,front,102.7,178.4,68.0,54.8,2910,ohc,three_four,140,mpfi,3.78,3.12,8.0,175,5000,19,24,16503,9.711297461543568 +2,mitsubishi,gas,std,two,hatchback,fwd,front,93.7,157.3,64.4,50.8,1918,ohc,three_four,92,2bbl,2.97,3.23,9.4,68,5500,37,41,5389,8.592115117933497 +2,mitsubishi,gas,std,two,hatchback,fwd,front,93.7,157.3,64.4,50.8,1944,ohc,three_four,92,2bbl,2.97,3.23,9.4,68,5500,31,38,6189,8.73052880173936 +2,mitsubishi,gas,std,two,hatchback,fwd,front,93.7,157.3,64.4,50.8,2004,ohc,three_four,92,2bbl,2.97,3.23,9.4,68,5500,31,38,6669,8.805225202632306 +1,mitsubishi,gas,turbo,two,hatchback,fwd,front,93.0,157.3,63.8,50.8,2145,ohc,three_four,98,spdi,3.03,3.39,7.6,102,5500,24,30,7689,8.94754601503218 +3,mitsubishi,gas,turbo,two,hatchback,fwd,front,96.3,173.0,65.4,49.4,2370,ohc,three_four,110,spdi,3.17,3.46,7.5,116,5500,23,30,9959,9.20623194393164 +3,mitsubishi,gas,std,two,hatchback,fwd,front,96.3,173.0,65.4,49.4,2328,ohc,three_four,122,2bbl,3.35,3.46,8.5,88,5000,25,32,8499,9.047703788498627 +3,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.9,173.2,66.3,50.2,2833,ohc,three_four,156,spdi,3.58,3.86,7.0,145,5000,19,24,12629,9.44375103564617 +3,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.9,173.2,66.3,50.2,2921,ohc,three_four,156,spdi,3.59,3.86,7.0,145,5000,19,24,14869,9.607033787697222 +3,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.9,173.2,66.3,50.2,2926,ohc,three_four,156,spdi,3.59,3.86,7.0,145,5000,19,24,14489,9.581145019820722 
+1,mitsubishi,gas,std,four,sedan,fwd,front,96.3,172.4,65.4,51.6,2365,ohc,three_four,122,2bbl,3.35,3.46,8.5,88,5000,25,32,6989,8.85209276347713 +1,mitsubishi,gas,std,four,sedan,fwd,front,96.3,172.4,65.4,51.6,2405,ohc,three_four,122,2bbl,3.35,3.46,8.5,88,5000,25,32,8189,9.010547069270189 +1,mitsubishi,gas,turbo,four,sedan,fwd,front,96.3,172.4,65.4,51.6,2403,ohc,three_four,110,spdi,3.17,3.46,7.5,116,5500,23,30,9279,9.13550906135318 +-1,mitsubishi,gas,std,four,sedan,fwd,front,96.3,172.4,65.4,51.6,2403,ohc,three_four,110,spdi,3.17,3.46,7.5,116,5500,23,30,9279,9.13550906135318 +1,nissan,gas,std,two,sedan,fwd,front,94.5,165.3,63.8,54.5,1889,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,5499,8.612321536507814 +1,nissan,diesel,std,two,sedan,fwd,front,94.5,165.3,63.8,54.5,2017,ohc,three_four,103,idi,2.99,3.47,21.9,55,4800,45,50,7099,8.867709208039386 +1,nissan,gas,std,two,sedan,fwd,front,94.5,165.3,63.8,54.5,1918,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,6649,8.802221746402456 +1,nissan,gas,std,four,sedan,fwd,front,94.5,165.3,63.8,54.5,1938,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,6849,8.831857935197906 +1,nissan,gas,std,four,wagon,fwd,front,94.5,170.2,63.8,53.5,2024,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,7349,8.90231952852887 +1,nissan,gas,std,two,sedan,fwd,front,94.5,165.3,63.8,54.5,1951,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,7299,8.895492631451633 +1,nissan,gas,std,two,hatchback,fwd,front,94.5,165.6,63.8,53.3,2028,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,7799,8.961750799330497 +1,nissan,gas,std,four,sedan,fwd,front,94.5,165.3,63.8,54.5,1971,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,7499,8.92252495730139 +1,nissan,gas,std,four,wagon,fwd,front,94.5,170.2,63.8,53.5,2037,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,7999,8.987071812848821 +2,nissan,gas,std,two,hardtop_convert,fwd,front,95.1,162.4,63.8,53.3,2008,ohc,three_four,97,2bbl,3.15,3.29,9.4,69,5200,31,37,8249,9.017847259860732 
+0,nissan,gas,std,four,hatchback,fwd,front,97.2,173.4,65.2,54.7,2324,ohc,three_four,120,2bbl,3.33,3.47,8.5,97,5200,27,34,8949,9.099297073182859 +0,nissan,gas,std,four,sedan,fwd,front,97.2,173.4,65.2,54.7,2302,ohc,three_four,120,2bbl,3.33,3.47,8.5,97,5200,27,34,9549,9.164191715950203 +0,nissan,gas,std,four,sedan,fwd,front,100.4,181.7,66.5,55.1,3095,ohcv,five_six,181,mpfi,3.43,3.27,9.0,152,5200,17,22,13499,9.510370887608827 +0,nissan,gas,std,four,wagon,fwd,front,100.4,184.6,66.5,56.1,3296,ohcv,five_six,181,mpfi,3.43,3.27,9.0,152,5200,17,22,14399,9.57491403870827 +0,nissan,gas,std,four,sedan,fwd,front,100.4,184.6,66.5,55.1,3060,ohcv,five_six,181,mpfi,3.43,3.27,9.0,152,5200,19,25,13499,9.510370887608827 +3,nissan,gas,std,two,hatchback,rwd,front,91.3,170.7,67.9,49.7,3071,ohcv,five_six,181,mpfi,3.43,3.27,9.0,160,5200,19,25,17199,9.752606521576492 +3,nissan,gas,turbo,two,hatchback,rwd,front,91.3,170.7,67.9,49.7,3139,ohcv,five_six,181,mpfi,3.43,3.27,7.8,200,5200,17,23,19699,9.888323152016355 +1,nissan,gas,std,two,hatchback,rwd,front,99.2,178.5,67.9,49.7,3139,ohcv,five_six,181,mpfi,3.43,3.27,9.0,160,5200,19,25,18399,9.820051594294094 +0,peugot,gas,std,four,sedan,rwd,front,107.9,186.7,68.4,56.7,3020,l,three_four,120,mpfi,3.46,3.19,8.4,97,5000,19,24,11900,9.38429367909962 +0,peugot,diesel,turbo,four,sedan,rwd,front,107.9,186.7,68.4,56.7,3197,l,three_four,152,idi,3.7,3.52,21.0,95,4150,28,33,13200,9.487972108574462 +0,peugot,gas,std,four,wagon,rwd,front,114.2,198.9,68.4,58.7,3230,l,three_four,120,mpfi,3.46,3.19,8.4,97,5000,19,24,12440,9.42867236629317 +0,peugot,diesel,turbo,four,wagon,rwd,front,114.2,198.9,68.4,58.7,3430,l,three_four,152,idi,3.7,3.52,21.0,95,4150,25,25,13860,9.536762272743895 +0,peugot,gas,std,four,sedan,rwd,front,107.9,186.7,68.4,56.7,3075,l,three_four,120,mpfi,3.46,2.19,8.4,95,5000,19,24,15580,9.65374331942474 +0,peugot,diesel,turbo,four,sedan,rwd,front,107.9,186.7,68.4,56.7,3252,l,three_four,152,idi,3.7,3.52,21.0,95,4150,28,33,16900,9.735068900911164 
+0,peugot,gas,std,four,wagon,rwd,front,114.2,198.9,68.4,56.7,3285,l,three_four,120,mpfi,3.46,2.19,8.4,95,5000,19,24,16695,9.722864552377755 +0,peugot,diesel,turbo,four,wagon,rwd,front,114.2,198.9,68.4,58.7,3485,l,three_four,152,idi,3.7,3.52,21.0,95,4150,25,25,17075,9.74537068443899 +0,peugot,gas,std,four,sedan,rwd,front,107.9,186.7,68.4,56.7,3075,l,three_four,120,mpfi,3.46,3.19,8.4,97,5000,19,24,16630,9.718963572186974 +0,peugot,diesel,turbo,four,sedan,rwd,front,107.9,186.7,68.4,56.7,3252,l,three_four,152,idi,3.7,3.52,21.0,95,4150,28,33,17950,9.795345393916424 +0,peugot,gas,turbo,four,sedan,rwd,front,108.0,186.7,68.3,56.0,3130,l,three_four,134,mpfi,3.61,3.21,7.0,142,5600,18,24,18150,9.806425839692997 +1,plymouth,gas,std,two,hatchback,fwd,front,93.7,157.3,63.8,50.8,1918,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,37,41,5572,8.625509334899697 +1,plymouth,gas,turbo,two,hatchback,fwd,front,93.7,157.3,63.8,50.8,2128,ohc,three_four,98,spdi,3.03,3.39,7.6,102,5500,24,30,7957,8.981807323377534 +1,plymouth,gas,std,four,hatchback,fwd,front,93.7,157.3,63.8,50.6,1967,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,31,38,6229,8.736971085254146 +1,plymouth,gas,std,four,sedan,fwd,front,93.7,167.3,63.8,50.8,1989,ohc,three_four,90,2bbl,2.97,3.23,9.4,68,5500,31,38,6692,8.808668062106715 +1,plymouth,gas,std,four,sedan,fwd,front,93.7,167.3,63.8,50.8,2191,ohc,three_four,98,2bbl,2.97,3.23,9.4,68,5500,31,38,7609,8.937087036176523 +-1,plymouth,gas,std,four,wagon,fwd,front,103.3,174.6,64.6,59.8,2535,ohc,three_four,122,2bbl,3.35,3.46,8.5,88,5000,24,30,8921,9.096163326913784 +3,plymouth,gas,turbo,two,hatchback,rwd,front,95.9,173.2,66.3,50.2,2818,ohc,three_four,156,spdi,3.59,3.86,7.0,145,5000,19,24,12764,9.454383987398135 +3,porsche,gas,std,two,hatchback,rwd,front,94.5,168.9,68.3,50.2,2778,ohc,three_four,151,mpfi,3.94,3.11,9.5,143,5500,19,27,22018,9.999615579630348 
+3,porsche,gas,std,two,hardtop_convert,rwd,rear,89.5,168.9,65.0,51.6,2756,ohcf,five_six,194,mpfi,3.74,2.9,9.5,207,5900,17,25,32528,10.389856535868129 +3,porsche,gas,std,two,hardtop_convert,rwd,rear,89.5,168.9,65.0,51.6,2756,ohcf,five_six,194,mpfi,3.74,2.9,9.5,207,5900,17,25,34028,10.434938994095775 +3,porsche,gas,std,two,hardtop_convert,rwd,rear,89.5,168.9,65.0,51.6,2800,ohcf,five_six,194,mpfi,3.74,2.9,9.5,207,5900,17,25,37028,10.519429662187102 +3,saab,gas,std,two,hatchback,fwd,front,99.1,186.6,66.5,56.1,2658,ohc,three_four,121,mpfi,3.54,3.07,9.31,110,5250,21,28,11850,9.380083146563278 +2,saab,gas,std,four,sedan,fwd,front,99.1,186.6,66.5,56.1,2695,ohc,three_four,121,mpfi,3.54,3.07,9.3,110,5250,21,28,12170,9.406729185981574 +3,saab,gas,std,two,hatchback,fwd,front,99.1,186.6,66.5,56.1,2707,ohc,three_four,121,mpfi,2.54,2.07,9.3,110,5250,21,28,15040,9.618468597503831 +2,saab,gas,std,four,sedan,fwd,front,99.1,186.6,66.5,56.1,2758,ohc,three_four,121,mpfi,3.54,3.07,9.3,110,5250,21,28,15510,9.649240256170584 +3,saab,gas,turbo,two,hatchback,fwd,front,99.1,186.6,66.5,56.1,2808,dohc,three_four,121,mpfi,3.54,3.07,9.0,160,5500,19,26,18150,9.806425839692997 +2,saab,gas,turbo,four,sedan,fwd,front,99.1,186.6,66.5,56.1,2847,dohc,three_four,121,mpfi,3.54,3.07,9.0,160,5500,19,26,18620,9.831991550831058 +2,subaru,gas,std,two,hatchback,fwd,front,93.7,156.9,63.4,53.7,2050,ohcf,three_four,97,2bbl,3.62,2.36,9.0,69,4900,31,36,5118,8.540519016719735 +2,subaru,gas,std,two,hatchback,fwd,front,93.7,157.9,63.6,53.7,2120,ohcf,three_four,108,2bbl,3.62,2.64,8.7,73,4400,26,31,7053,8.86120833720818 +2,subaru,gas,std,two,hatchback,4wd,front,93.3,157.3,63.8,55.7,2240,ohcf,three_four,108,2bbl,3.62,2.64,8.7,73,4400,26,31,7603,8.936298185228436 +0,subaru,gas,std,four,sedan,fwd,front,97.2,172.0,65.4,52.5,2145,ohcf,three_four,108,2bbl,3.62,2.64,9.5,82,4800,32,37,7126,8.871505346165781 
+0,subaru,gas,std,four,sedan,fwd,front,97.2,172.0,65.4,52.5,2190,ohcf,three_four,108,2bbl,3.62,2.64,9.5,82,4400,28,33,7775,8.958668737047434 +0,subaru,gas,std,four,sedan,fwd,front,97.2,172.0,65.4,52.5,2340,ohcf,three_four,108,mpfi,3.62,2.64,9.0,94,5200,26,32,9960,9.206332350578643 +0,subaru,gas,std,four,sedan,4wd,front,97.0,172.0,65.4,54.3,2385,ohcf,three_four,108,2bbl,3.62,2.64,9.0,82,4800,24,25,9233,9.130539301772627 +0,subaru,gas,turbo,four,sedan,4wd,front,97.0,172.0,65.4,54.3,2510,ohcf,three_four,108,mpfi,3.62,2.64,7.7,111,4800,24,29,11259,9.32892308780313 +0,subaru,gas,std,four,wagon,fwd,front,97.0,173.5,65.4,53.0,2290,ohcf,three_four,108,2bbl,3.62,2.64,9.0,82,4800,28,32,7463,8.917712757131387 +0,subaru,gas,std,four,wagon,fwd,front,97.0,173.5,65.4,53.0,2455,ohcf,three_four,108,mpfi,3.62,2.64,9.0,94,5200,25,31,10198,9.2299469016151 +0,subaru,gas,std,four,wagon,4wd,front,96.9,173.6,65.4,54.9,2420,ohcf,three_four,108,2bbl,3.62,2.64,9.0,82,4800,23,29,8013,8.98882050177807 +0,subaru,gas,turbo,four,wagon,4wd,front,96.9,173.6,65.4,54.9,2650,ohcf,three_four,108,mpfi,3.62,2.64,7.7,111,4800,23,23,11694,9.366831168735615 +1,toyota,gas,std,two,hatchback,fwd,front,95.7,158.7,63.6,54.5,1985,ohc,three_four,92,2bbl,3.05,3.03,9.0,62,4800,35,39,5348,8.584477938221834 +1,toyota,gas,std,two,hatchback,fwd,front,95.7,158.7,63.6,54.5,2040,ohc,three_four,92,2bbl,3.05,3.03,9.0,62,4800,31,38,6338,8.754318540250866 +1,toyota,gas,std,four,hatchback,fwd,front,95.7,158.7,63.6,54.5,2015,ohc,three_four,92,2bbl,3.05,3.03,9.0,62,4800,31,38,6488,8.77770959579525 +0,toyota,gas,std,four,wagon,fwd,front,95.7,169.7,63.6,59.1,2280,ohc,three_four,92,2bbl,3.05,3.03,9.0,62,4800,31,37,6918,8.841881989497114 +0,toyota,gas,std,four,wagon,4wd,front,95.7,169.7,63.6,59.1,2290,ohc,three_four,92,2bbl,3.05,3.03,9.0,62,4800,27,32,7898,8.974364841846596 +0,toyota,gas,std,four,wagon,4wd,front,95.7,169.7,63.6,59.1,3110,ohc,three_four,92,2bbl,3.05,3.03,9.0,62,4800,27,32,8778,9.080003870248179 
+0,toyota,gas,std,four,sedan,fwd,front,95.7,166.3,64.4,53.0,2081,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,30,37,6938,8.844768827529695 +0,toyota,gas,std,four,hatchback,fwd,front,95.7,166.3,64.4,52.8,2109,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,30,37,7198,8.881558488638976 +0,toyota,diesel,std,four,sedan,fwd,front,95.7,166.3,64.4,53.0,2275,ohc,three_four,110,idi,3.27,3.35,22.5,56,4500,34,36,7898,8.974364841846596 +0,toyota,diesel,std,four,hatchback,fwd,front,95.7,166.3,64.4,52.8,2275,ohc,three_four,110,idi,3.27,3.35,22.5,56,4500,38,47,7788,8.960339366492091 +0,toyota,gas,std,four,sedan,fwd,front,95.7,166.3,64.4,53.0,2094,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,38,47,7738,8.953898535260459 +0,toyota,gas,std,four,hatchback,fwd,front,95.7,166.3,64.4,52.8,2122,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,28,34,8358,9.03097444300786 +0,toyota,gas,std,four,sedan,fwd,front,95.7,166.3,64.4,52.8,2140,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,28,34,9258,9.133243321591216 +1,toyota,gas,std,two,sedan,rwd,front,94.5,168.7,64.0,52.6,2169,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,29,34,8058,8.994420665751292 +1,toyota,gas,std,two,hatchback,rwd,front,94.5,168.7,64.0,52.6,2204,ohc,three_four,98,2bbl,3.19,3.03,9.0,70,4800,29,34,8238,9.016512874996026 +1,toyota,gas,std,two,sedan,rwd,front,94.5,168.7,64.0,52.6,2265,dohc,three_four,98,mpfi,3.24,3.08,9.4,112,6600,26,29,9298,9.13755460225053 +1,toyota,gas,std,two,hatchback,rwd,front,94.5,168.7,64.0,52.6,2300,dohc,three_four,98,mpfi,3.24,3.08,9.4,112,6600,26,29,9538,9.16303909885817 +2,toyota,gas,std,two,hardtop_convert,rwd,front,98.4,176.2,65.6,52.0,2540,ohc,three_four,146,mpfi,3.62,3.5,9.3,116,4800,24,30,8449,9.041803370152845 +2,toyota,gas,std,two,hardtop_convert,rwd,front,98.4,176.2,65.6,52.0,2536,ohc,three_four,146,mpfi,3.62,3.5,9.3,116,4800,24,30,9639,9.173572647783969 +2,toyota,gas,std,two,hatchback,rwd,front,98.4,176.2,65.6,52.0,2551,ohc,three_four,146,mpfi,3.62,3.5,9.3,116,4800,24,30,9989,9.20923976653215 
+2,toyota,gas,std,two,hardtop_convert,rwd,front,98.4,176.2,65.6,52.0,2679,ohc,three_four,146,mpfi,3.62,3.5,9.3,116,4800,24,30,11199,9.323579767582693 +2,toyota,gas,std,two,hatchback,rwd,front,98.4,176.2,65.6,52.0,2714,ohc,three_four,146,mpfi,3.62,3.5,9.3,116,4800,24,30,11549,9.354354132115088 +2,toyota,gas,std,two,hardtop_convert,rwd,front,98.4,176.2,65.6,53.0,2975,ohc,three_four,146,mpfi,3.62,3.5,9.3,116,4800,24,30,17669,9.77956697061665 +-1,toyota,gas,std,four,sedan,fwd,front,102.4,175.6,66.5,54.9,2326,ohc,three_four,122,mpfi,3.31,3.54,8.7,92,4200,29,34,8948,9.09918532261002 +-1,toyota,diesel,turbo,four,sedan,fwd,front,102.4,175.6,66.5,54.9,2480,ohc,three_four,110,idi,3.27,3.35,22.5,73,4500,30,33,10698,9.277812087091196 +-1,toyota,gas,std,four,hatchback,fwd,front,102.4,175.6,66.5,53.9,2414,ohc,three_four,122,mpfi,3.31,3.54,8.7,92,4200,27,32,9988,9.209139651399664 +-1,toyota,gas,std,four,sedan,fwd,front,102.4,175.6,66.5,54.9,2414,ohc,three_four,122,mpfi,3.31,3.54,8.7,92,4200,27,32,10898,9.296334565143043 +-1,toyota,gas,std,four,hatchback,fwd,front,102.4,175.6,66.5,53.9,2458,ohc,three_four,122,mpfi,3.31,3.54,8.7,92,4200,27,32,11248,9.327945614050446 +3,toyota,gas,std,two,hatchback,rwd,front,102.9,183.5,67.7,52.0,2976,dohc,five_six,171,mpfi,3.27,3.35,9.3,161,5200,20,24,16558,9.71462464769875 +3,toyota,gas,std,two,hatchback,rwd,front,102.9,183.5,67.7,52.0,3016,dohc,five_six,171,mpfi,3.27,3.35,9.3,161,5200,19,24,15998,9.680218993408767 +-1,toyota,gas,std,four,sedan,rwd,front,104.5,187.8,66.5,54.1,3131,dohc,five_six,171,mpfi,3.27,3.35,9.2,156,5200,20,24,15690,9.660778845727078 +-1,toyota,gas,std,four,wagon,rwd,front,104.5,187.8,66.5,54.1,3151,dohc,five_six,161,mpfi,3.27,3.35,9.2,156,5200,19,24,15750,9.66459564425378 +2,volkswagen,diesel,std,two,sedan,fwd,front,97.3,171.7,65.5,55.7,2261,ohc,three_four,97,idi,3.01,3.4,23.0,52,4800,37,46,7775,8.958668737047434 
+2,volkswagen,gas,std,two,sedan,fwd,front,97.3,171.7,65.5,55.7,2209,ohc,three_four,109,mpfi,3.19,3.4,9.0,85,5250,27,34,7975,8.984066927653044 +2,volkswagen,diesel,std,four,sedan,fwd,front,97.3,171.7,65.5,55.7,2264,ohc,three_four,97,idi,3.01,3.4,23.0,52,4800,37,46,7995,8.986571625268054 +2,volkswagen,gas,std,four,sedan,fwd,front,97.3,171.7,65.5,55.7,2212,ohc,three_four,109,mpfi,3.19,3.4,9.0,85,5250,27,34,8195,9.01127949117793 +2,volkswagen,gas,std,four,sedan,fwd,front,97.3,171.7,65.5,55.7,2275,ohc,three_four,109,mpfi,3.19,3.4,9.0,85,5250,27,34,8495,9.047233034106032 +2,volkswagen,diesel,turbo,four,sedan,fwd,front,97.3,171.7,65.5,55.7,2319,ohc,three_four,97,idi,3.01,3.4,23.0,68,4500,37,42,9495,9.158520623246385 +2,volkswagen,gas,std,four,sedan,fwd,front,97.3,171.7,65.5,55.7,2300,ohc,three_four,109,mpfi,3.19,3.4,10.0,100,5500,26,32,9995,9.209840246934501 +3,volkswagen,gas,std,two,hardtop_convert,fwd,front,94.5,159.3,64.2,55.6,2254,ohc,three_four,109,mpfi,3.19,3.4,8.5,90,5500,24,29,11595,9.358329249689632 +3,volkswagen,gas,std,two,hatchback,fwd,front,94.5,165.7,64.0,51.4,2221,ohc,three_four,109,mpfi,3.19,3.4,8.5,90,5500,24,29,9980,9.20833836930551 +0,volkswagen,gas,std,four,sedan,fwd,front,100.4,180.2,66.9,55.1,2661,ohc,five_six,136,mpfi,3.19,3.4,8.5,110,5500,19,24,13295,9.49514330367712 +0,volkswagen,diesel,turbo,four,sedan,fwd,front,100.4,180.2,66.9,55.1,2579,ohc,three_four,97,idi,3.01,3.4,23.0,68,4500,33,38,13845,9.535679435605061 +0,volkswagen,gas,std,four,wagon,fwd,front,100.4,183.1,66.9,55.1,2563,ohc,three_four,109,mpfi,3.19,3.4,9.0,88,5500,25,31,12290,9.416541202560081 +-2,volvo,gas,std,four,sedan,rwd,front,104.3,188.8,67.2,56.2,2912,ohc,three_four,141,mpfi,3.78,3.15,9.5,114,5400,23,28,12940,9.468078568054892 +-1,volvo,gas,std,four,wagon,rwd,front,104.3,188.8,67.2,57.5,3034,ohc,three_four,141,mpfi,3.78,3.15,9.5,114,5400,23,28,13415,9.504128762859725 
+-2,volvo,gas,std,four,sedan,rwd,front,104.3,188.8,67.2,56.2,2935,ohc,three_four,141,mpfi,3.78,3.15,9.5,114,5400,24,28,15985,9.679406061493943 +-1,volvo,gas,std,four,wagon,rwd,front,104.3,188.8,67.2,57.5,3042,ohc,three_four,141,mpfi,3.78,3.15,9.5,114,5400,24,28,16515,9.71202433782489 +-2,volvo,gas,turbo,four,sedan,rwd,front,104.3,188.8,67.2,56.2,3045,ohc,three_four,130,mpfi,3.62,3.15,7.5,162,5100,17,22,18420,9.821192309809298 +-1,volvo,gas,turbo,four,wagon,rwd,front,104.3,188.8,67.2,57.5,3157,ohc,three_four,130,mpfi,3.62,3.15,7.5,162,5100,17,22,18950,9.849559210510572 +-1,volvo,gas,std,four,sedan,rwd,front,109.1,188.8,68.9,55.5,2952,ohc,three_four,141,mpfi,3.78,3.15,9.5,114,5400,23,28,16845,9.731809155840653 +-1,volvo,gas,turbo,four,sedan,rwd,front,109.1,188.8,68.8,55.5,3049,ohc,three_four,141,mpfi,3.78,3.15,8.7,160,5300,19,25,19045,9.854559878912704 +-1,volvo,gas,std,four,sedan,rwd,front,109.1,188.8,68.9,55.5,3012,ohcv,five_six,173,mpfi,3.58,2.87,8.8,134,5500,18,23,21485,9.975110296209095 +-1,volvo,diesel,turbo,four,sedan,rwd,front,109.1,188.8,68.9,55.5,3217,ohc,five_six,145,idi,3.01,3.4,23.0,106,4800,26,27,22470,10.019936365179374 +-1,volvo,gas,turbo,four,sedan,rwd,front,109.1,188.8,68.9,55.5,3062,ohc,three_four,141,mpfi,3.78,3.15,9.5,114,5400,19,25,22625,10.026810768568128 diff --git a/Module4/Classification.ipynb b/Module4/Classification.ipynb new file mode 100644 index 0000000..6b344d9 --- /dev/null +++ b/Module4/Classification.ipynb @@ -0,0 +1,762 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Applications of Classification\n", + "\n", + "In this lab you will perform **two-class classification** using **logistic regression**. A classifier is a machine learning model that separates the **label** into categories or **classes**. 
In other words, classification models are **supervised** machine learning models which predict a categorical label.\n", + "\n", + "The German Credit bank customer data is used to determine if a particular person is a good or bad credit risk. Thus, the customer's credit risk is the class you must predict. In this case, the cost to the bank of issuing a loan to a bad risk customer is five times that of denying a loan to a good customer. This fact will become important when evaluating the performance of the model. \n", + "\n", + "Logistic regression is a linear model but with a nonlinear response. The response is binary, $\\{ 0,1 \\}$, or positive and negative. The response is the prediction of the category. \n", + "\n", + "In this lab you will learn the following: \n", + "- Preparing data for classification models using scikit-learn. \n", + "- Constructing a classification model using scikit-learn.\n", + "- Evaluating the performance of the classification model. \n", + "- Using techniques such as reweighting the labels and changing the decision threshold to change the trade-off between false positive and false negative error rates. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Basics of logistic regression\n", + "\n", + "In this section some basic properties of the logistic regression model are presented. \n", + "\n", + "First, execute the code in the cell below to load the packages required to run this notebook. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import math\n", + "from sklearn import preprocessing\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Logistic regression is widely used as a classification model. Logistic regression is linear model, with a binary response, `{False, True}` or `{0, 1}`. You can think of this response as having a Binomial distribution. For linear regression the response is just, well, linear. Logistic regression is a linear regression model with a nonlinear output. The response of the linear model is transformed or 'squashed' to values close to 0 and 1 using a **sigmoidal function**, also known as the **logistic function**. The result of this transformation is a response which is the log likelihood for each of the two classes. \n", + "\n", + "The sigmoidal or logistic function can be expressed as follows:\n", + "\n", + "$$f(x) = \\frac{1}{1 + e^{-\\kappa(x - x_0)}} \\\\\n", + "\\kappa = steepness$$\n", + "\n", + "Execute the code in the cell below to compute and plot an example of the logistic function." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "xseq = np.arange(-7, 7, 0.1)\n", + "\n", + "logistic = [math.exp(v)/(1 + math.exp(v)) for v in xseq]\n", + "\n", + "plt.plot(xseq, logistic, color = 'red')\n", + "plt.plot([-7,7], [0.5,0.5], color = 'blue')\n", + "plt.plot([0,0], [0,1], color = 'blue')\n", + "plt.title('Logistic function for two-class classification')\n", + "plt.ylabel('probability of the positive class')\n", + "plt.xlabel('Value of output from linear regression')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's make this a bit more concrete with a simple example. Say we have a linear model:\n", + "\n", + "$$\\hat{y} = \\beta_0 + \\beta_1\\ x$$\n", + "\n", + "Now, depending on the value of $\\hat{y}$ we want to classify the output from a logistic regression model as either `0` or `1`. Substituting the linear model into the logistic function creates the following expression:\n", + "\n", + "$$F(\\hat{y}) = \\frac{1}{1 + e^{-\\kappa(\\beta_0 + \\beta_1\\ x)}} $$\n", + "\n", + "In this way we transform the continuous output of the linear model, defined on $-\\infty \\le \\hat{y} \\le \\infty$, to a response bounded by $0 \\le F(\\hat{y}) \\le 1$, which can then be thresholded to produce a binary prediction." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load and prepare the data set\n", + "\n", + "As a first step, load the dataset. The code in the cell below loads the dataset and assigns human-readable names to the columns. Execute this code and examine the result. \n", + "\n", + "You should by now be very familiar with the next few sections as we have covered them in detail in previous labs."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit = pd.read_csv('German_Credit_Preped.csv')\n", + "print(credit.shape)\n", + "credit.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are 22 columns, 1 customer identifier column, 20 features, plus a label column. These features represent information a bank might have on its customers. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There is one other aspect of this data set which you should be aware of. The label has significant **class imbalance**. Class imbalance means that there are unequal numbers of cases for the categories of the label. \n", + "\n", + "To examine the class imbalance in these data, execute the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "credit_counts = credit[['credit_history', 'bad_credit']].groupby('bad_credit').count()\n", + "print(credit_counts)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that only 30% of the cases have bad credit. This is not surprising, since a bank would typically retain customers with good credit. However, this imbalance will bias the training of any model. \n", + "\n", + "Before proceeding, answer **Question 1** on the course page." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prepare data for scikit-learn model\n", + "\n", + "With the data prepared, it is time to create the numpy arrays required for the scikit-learn model. \n", + "\n", + "The code in the cell below creates a numpy array of the label values. Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "labels = np.array(credit['bad_credit'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you need to create the numpy feature array or **model matrix**. As first step, the categorical variables need to be recoded as binary dummy variables. As discussed in another lesson this is a three step process:\n", + "\n", + "1. Encode the categorical string variables as integers.\n", + "2. Transform the integer coded variables to dummy variables. \n", + "3. Append each dummy coded categorical variable to the model matrix. \n", + "\n", + "Execute the code in the cell below to perform this processing and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def encode_string(cat_features):\n", + " ## First encode the strings to numeric categories\n", + " enc = preprocessing.LabelEncoder()\n", + " enc.fit(cat_features)\n", + " enc_cat_features = enc.transform(cat_features)\n", + " ## Now, apply one hot encoding\n", + " ohe = preprocessing.OneHotEncoder()\n", + " encoded = ohe.fit(enc_cat_features.reshape(-1,1))\n", + " return encoded.transform(enc_cat_features.reshape(-1,1)).toarray()\n", + "\n", + "categorical_columns = ['credit_history', 'purpose', 'gender_status', \n", + " 'time_in_residence', 'property']\n", + "\n", + "Features = encode_string(credit['checking_account_status'])\n", + "for col in categorical_columns:\n", + " temp = encode_string(credit[col])\n", + " Features = np.concatenate([Features, temp], axis = 1)\n", + "\n", + "print(Features.shape)\n", + "print(Features[:2, :]) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next the numeric features must be concatenated to the numpy array by executing the code in the cell below. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.concatenate([Features, np.array(credit[['loan_duration_mo', 'loan_amount', \n", + " 'payment_pcnt_income', 'age_yrs']])], axis = 1)\n", + "print(Features.shape)\n", + "print(Features[:2, :]) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the dummy variables the original 6 categorical features are now 31 dummy variables. With the 4 numeric features there are a total of 35. \n", + "\n", + "Now, answer **Question 2** on the course page." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You must split the cases into training and test data sets. This step is critical. If machine learning models are tested on the training data, the results will be both biased and overly optimistic.\n", + "\n", + "The code in the cell below performs the following processing:\n", + "1. An index vector is Bernoulli sampled using the `train_test_split` function from the `model_selection` package of scikit-learn. \n", + "2. The first column of the resulting index array contains the indices of the samples for the training cases. \n", + "3. The second column of the resulting index array contains the indices of the samples for the test cases. \n", + "\n", + "Execute the code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(9988)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There is just one more step in preparing this data. 
Numeric features must be rescaled so they have a similar range of values. Rescaling prevents features from having an undue influence on model training simply because they have a larger numeric range.\n", + "\n", + "The code in the cell below uses the `StandardScaler` function from the scikit-learn `preprocessing` package to Z-score scale the numeric features. Notice that the scaler is fit only on the training data. The trained scaler is then applied to the test data. Test data should always be scaled using the parameters from the training data.\n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## The four numeric features occupy the last four columns (31 onward)\n", + "scaler = preprocessing.StandardScaler().fit(X_train[:,31:])\n", + "X_train[:,31:] = scaler.transform(X_train[:,31:])\n", + "X_test[:,31:] = scaler.transform(X_test[:,31:])\n", + "X_train[:2,]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The four numeric features are now scaled. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Construct the logistic regression model\n", + "\n", + "Now, it is time to compute the logistic regression model. The code in the cell below does the following:\n", + "1. Define a logistic regression model object using the `LogisticRegression` class from the scikit-learn `linear_model` package.\n", + "2. Fit the linear model using the numpy arrays of the features and the labels for the training data set.\n", + "\n", + "Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "logistic_mod = linear_model.LogisticRegression() \n", + "logistic_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The model has been computed. Notice that the configuration of the model object has been printed. 
In this case, only default settings are shown, since no arguments were given to create the model object. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, print and examine the model coefficients by executing the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(logistic_mod.intercept_)\n", + "print(logistic_mod.coef_)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First of all, notice that the model coefficients look just as they would for a linear regression model. This is expected, as previously explained. Additionally, nearly all the coefficients have the same magnitude, indicating that this model is likely to be overfit, given the number of features. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Recall that the logistic regression model outputs probabilities for each class. The class with the highest probability is taken as the score (prediction). Execute the code in the cell below to compute and display a sample of these class probabilities for the test feature set. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "probabilities = logistic_mod.predict_proba(X_test)\n", + "print(probabilities[:15,:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The first column is the probability of a score of $0$ and the second column is the probability of a score of $1$. Notice that for most, but not all cases, the probability of a score of $0$ is higher than the probability of a score of $1$. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Score and evaluate the classification model\n", + "\n", + "Now that the class probabilities have been computed, these values must be transformed into actual class scores. 
Recall that the class probabilities for two-class logistic regression are computed by applying the sigmoid or logistic transformation to the output of the linear model. The simple choice is to set the threshold between the two classes at $0.5$. The code in the cell below applies this initial threshold to the probability of a score of $1$ for the test data. A few examples along with the known labels are then displayed. Execute this code and examine the result." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "scores = score_model(probabilities, 0.5)\n", + "print(np.array(scores[:15]))\n", + "print(y_test[:15])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some of the positive ($1$) predictions agree with the test labels in the second row, but several do not." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Given the results of the test data, how can you quantify the performance of the model? In general, you must **always use multiple metrics to evaluate the performance of any machine learning model**, including classifiers. \n", + "\n", + "For classifiers there are a number of metrics commonly used. The **confusion matrix** lays out the correctly and incorrectly classified cases in a tabular format. There are various metrics derived from the values in the confusion matrix. Some of the common cases are briefly reviewed below. \n", + "\n", + "**Confusion matrix**\n", + "\n", + "As already stated, the confusion matrix lays out correctly and incorrectly classified cases. 
For the binary (two-class) case the confusion matrix is organized as follows:\n", + "\n", + "| | Scored Positive | Scored Negative| \n", + "|------|:------:|:------:| \n", + "|**Actual Positive** | True Positive | False Negative |\n", + "|**Actual Negative**| False Positive | True Negative | \n", + "\n", + "Here the four elements in the matrix are defined as: \n", + "**True Positive** or **TP** are cases with positive labels which have been correctly classified as positive. \n", + "**True Negative** or **TN** are cases with negative labels which have been correctly classified as negative. \n", + "**False Positive** or **FP** are cases with negative labels which have been incorrectly classified as positive. \n", + "**False Negative** or **FN** are cases with positive labels which have been incorrectly classified as negative.\n", + "\n", + "When creating a confusion matrix it is important to understand and maintain a convention for differentiating positive and negative label values. The usual convention is to call the $1$ case positive and the $0$ case negative. \n", + "\n", + "Notice that there is an ambiguity in which case is considered positive and which is considered negative when the confusion matrix is computed. Whenever you examine a confusion matrix it is a good idea to spend a moment and decide which case is which. This step will help you relate the results to the problem at hand. \n", + "\n", + "**Accuracy**\n", + "\n", + "Accuracy is a simple and often misused metric. In simple terms, accuracy is the fraction of cases correctly classified. For a two-class classifier accuracy is written as:\n", + "\n", + "$$Accuracy = \\frac{TP+TN}{TP+FP+TN+FN}$$\n", + "\n", + "Accuracy can be quite misleading. For example, say a classifier is used to detect fraudulent accounts and the rate of fraud is less than 1%. A naive model would simply classify all accounts as not fraudulent. This model has accuracy exceeding 0.99. This sounds impressive, but is clearly useless. 
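The fraud example can be verified with a quick calculation; the counts below are hypothetical (10,000 accounts with an assumed 0.8% fraud rate), chosen only to illustrate the point:

```python
## Confusion-matrix counts for a naive model that scores every account as not fraudulent
tp, fn = 0, 80      ## all 80 fraudulent (positive) accounts are missed
fp, tn = 0, 9920    ## all 9920 legitimate (negative) accounts are correctly passed

accuracy = (tp + tn) / (tp + fp + tn + fn)
print(accuracy)  # 0.992
```

Despite an accuracy above 0.99, the model finds zero fraudulent accounts, which is why accuracy alone should never be trusted on imbalanced data.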
\n", + "\n", + "**Precision**\n", + "\n", + "Precision is the fraction of correctly classified label cases out of all cases classified with that label value. We can express precision by the following relationship:\n", + "\n", + "$$Precision = \\frac{M_{i,i}}{\\sum_j M_{i,j}}$$\n", + "\n", + "In other words, the precision statistic is the number of correctly classified cases for the label value divided by all the cases in the column. Thus, precision is sensitive to the number of cases correctly classified for a given score value. \n", + "\n", + "**Recall** \n", + "\n", + "Recall is the fraction of cases of a label value correctly classified out of all cases that actually have that label value. We can express recall by the following relationship:\n", + "\n", + "$$Recall = \\frac{M_{i,i}}{\\sum_i M_{i,j}}$$\n", + "\n", + "In other words, the recall statistic is the number of correctly classified cases for the label value divided by all the cases in the row. Thus, precision is sensitive to the number of cases correctly classified for a given true label value. \n", + "\n", + "**F1**\n", + "\n", + "The F1 statistic is weighted average of precision and recall. We can express F1 by the following relationship:\n", + "\n", + "$$F1 = 2 * \\frac{precision * recall}{precision + recall}$$\n", + "\n", + "In other words, F1 is a weighted metric for overall model performance. \n", + "\n", + "**ROC** and **AUC**\n", + "\n", + "The receiver operating characteristic or ROC is a curve that displays the relationship between the true positive rate on the vertical axis and false positive rate on the horizontal axis. The ROC curve shows the tradeoff between true positive rate and false positive rate. An example is illustrated below. \n", + "\n", + "In principle, you can pick the desired operating point for a classifier on this curve. Towards the left favors low false positive rate at the expense of true positive rate. 
Towards the right favors high true positive rate at the expense of higher false positive rate. \n", + "\n", + "\n", + "\n", + "\"drawing\"\n", + "\n", + "
**ROC curve with values of AUC for balanced two-class problem**
\n", + "\n", + "The AUC is the area or integral under the ROC curve, and it provides a measure of the overall performance of the classifier. But how can you interpret a specific AUC value? The higher the AUC, the smaller the increase in false positive rate required to achieve a given true positive rate. For an ideal classifier the AUC is 1.0: a true positive rate of 1.0 is achieved with a false positive rate of 0. This behavior means that AUC is useful for comparing classifiers. The classifier with the higher AUC is generally the better one. \n", + "\n", + "For balanced cases, random guessing gives an AUC of 0.5. A balanced case has equal numbers of positive and negative cases. So Bernoulli sampling (random guessing) with a probability $p$ for the positive case will produce a ROC curve that runs diagonally from $0.0,0.0$ to $1.0,1.0$. The area under this triangular region is 0.5. It is often said that a classifier with an AUC of greater than 0.5 is better than random guessing. But, **for unbalanced cases this statement is not true in general**. \n", + "\n", + "****\n", + "**Note:** The term receiver operating characteristic may seem a bit odd in the machine learning context. This term arose in the early days of radar engineering as a metric to measure the tradeoff between a radar receiver correctly detecting a target, say an aircraft, and producing a positive response from noise, such as flying birds or clouds. A radar receiver would be adjusted to the desired operating point along its ROC curve. \n", + "****" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The code in the cell below implements a function that computes and displays the aforementioned classifier performance metrics. The metrics are computed using the `precision_recall_fscore_support` and `accuracy_score` functions from the `metrics` package of scikit-learn. The confusion matrix is computed using the `confusion_matrix` function from this same package. 
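The claim above, that random guessing on a balanced label gives an AUC of 0.5, can be checked numerically with the same scikit-learn functions used in this lab. This is a side illustration with synthetic data, not part of the lab's pipeline; the seed and sample size are arbitrary.

```python
# Verify that random scores on balanced labels yield an AUC near 0.5.
import numpy as np
import sklearn.metrics as sklm

rng = np.random.default_rng(42)
labels = np.repeat([0, 1], 500)     # balanced: 500 negative, 500 positive cases
probs = rng.uniform(size=1000)      # Bernoulli-style random guessing scores

fpr, tpr, _ = sklm.roc_curve(labels, probs)
print(sklm.auc(fpr, tpr))           # close to 0.5, the diagonal ROC curve
```

With more samples the computed AUC converges to 0.5; for an unbalanced label the 0.5 baseline interpretation no longer holds, as noted above.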
Execute this code and examine the results for the logistic regression model. " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def print_metrics(labels, scores):\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + "\n", + "\n", + " \n", + "print_metrics(y_test, scores) " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Examine these results:\n", + "1. The confusion matrix shows the following characteristics: a) most of the positive cases are correctly classified, 182 vs. 30; however, b) many negative cases are scored incorrectly, with only 49 correct vs. 39 incorrect. \n", + "2. The overall accuracy is 0.77. However, as just observed, this figure is **extremely misleading**! In fact, the negative cases are poorly classified, and it is these bad credit customers the bank cares most about. This is not an unusual situation. Accuracy figures should always be regarded with healthy skepticism.\n", + "3. The class imbalance is confirmed. Of the 300 test cases 212 are positive and 88 are negative. \n", + "4. The precision, recall and F1 all show that positive cases are classified reasonably well, but the negative cases are not. 
As already mentioned, it is these negative cases that are of greatest importance to the bank. \n", + "\n", + "Finally, the code in the cell below computes and displays the ROC curve and AUC. The `roc_curve` and `auc` functions from the scikit-learn `metrics` package are used to compute these values. \n", + "\n", + "Execute this code, examine the result, and answer **Question 3** on the course page." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## and threshold along with the AUC\n", + " fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n", + " auc = sklm.auc(fpr, tpr)\n", + " \n", + " ## Plot the result\n", + " plt.title('Receiver Operating Characteristic')\n", + " plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n", + " plt.legend(loc = 'lower right')\n", + " plt.plot([0, 1], [0, 1],'r--')\n", + " plt.xlim([0, 1])\n", + " plt.ylim([0, 1])\n", + " plt.ylabel('True Positive Rate')\n", + " plt.xlabel('False Positive Rate')\n", + " plt.show()\n", + " \n", + "plot_auc(y_test, probabilities) " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The ROC curve is above the diagonal red-dotted line and the AUC is 0.77. But, given the class imbalance of roughly two positive cases for each negative case, how good is this? \n", + "\n", + "One point of comparison is a naive 'classifier' that scores all cases as positive. The code in the cell below hard-codes such a 'classifier'; it is not really a classifier at all. The ROC curve and AUC are then computed and displayed. Run this code and examine the result. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "probs_positive = np.concatenate((np.ones((probabilities.shape[0], 1)), \n", + " np.zeros((probabilities.shape[0], 1))),\n", + " axis = 1)\n", + "scores_positive = score_model(probs_positive, 0.5)\n", + "print_metrics(y_test, scores_positive) \n", + "plot_auc(y_test, probs_positive) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice, the accuracy from this 'classifier' is 0.71. This reflects the class imbalance. The ROC curve is directly along the diagonal which gives an AUC of 0.5. The logistic regression classifier is definitely better than this!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Compute a weighted model\n", + "\n", + "Recall that a falsely classifying a bad credit risk customer as good costs the bank five times more than classifying a good credit risk customer as bad. Given this situation, the results of the first model are not that good. There are two reasons for this:\n", + "\n", + "1. The class imbalance in the label has biased the training of the model. As you observed from the accuracy of the naive 'classifier' is not that different from the logistic regression model. \n", + "2. Nothing has been done to weight the results toward correctly classifying the bad credit risk customers at the expense of the good credit risk customers.\n", + "\n", + "One approach to these problems is to weight the classes when computing the logistic regression model. The code in the cell below adds a `class_weight` argument to the call to the `LogisticRegression` function. In this case weights are chosen as $0.45, 0.55$ but you can also give another combination. 
Execute this code." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": true }, "outputs": [], "source": [ "logistic_mod = linear_model.LogisticRegression(class_weight = {0:0.45, 1:0.55}) \n", + "logistic_mod.fit(X_train, y_train)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Next, execute the code in the cell below to compute and display the class probabilities for each case. " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "probabilities = logistic_mod.predict_proba(X_test)\n", + "print(probabilities[:15,:])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "By inspection, the above probabilities are not terribly different from those of the unweighted model. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To determine whether there is any significant difference from the unweighted model, compute the scores and the metrics and display the metrics by executing the code in the cell below. \n", + "\n", + "Then, answer **Question 4** on the course page." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "scrolled": false }, "outputs": [], "source": [ "scores = score_model(probabilities, 0.5)\n", + "print_metrics(y_test, scores) \n", + "plot_auc(y_test, probabilities) " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The accuracy changes only slightly with respect to the unweighted model. The change would be larger if the weighting were skewed more strongly toward one class. The precision, recall and F1 are slightly better for the negative cases. Reweighting the labels has moved the results in the desired direction, at least a bit.\n", + "\n", + "Notice also that the ROC curve and AUC are essentially unchanged. The trade-off between true positive and false positive is similar to the unweighted model. 
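The `score_model` function used throughout this notebook is defined in an earlier cell that is not shown here. A minimal sketch consistent with how it is called in this lab might look like the following; the assumption that column 1 of the `predict_proba` output holds the probability of class $1$ (bad credit) is inferred from the calls above, not confirmed by the source.

```python
# Minimal sketch (assumed, not the lab's exact definition) of a thresholding
# scorer: assign class 1 when its predicted probability exceeds the threshold.
import numpy as np

def score_model(probs, threshold):
    # probs is an (n, 2) array from predict_proba; column 1 is P(class = 1).
    return np.array([1 if p > threshold else 0 for p in probs[:, 1]])

probs = np.array([[0.8, 0.2], [0.4, 0.6], [0.55, 0.45]])
print(score_model(probs, 0.5))   # [0 1 0]
print(score_model(probs, 0.3))   # [0 1 1] -- lowering the threshold yields more 1s
```

Lowering the threshold produces more class-1 scores, which is exactly the lever the next section uses to favor correct identification of the bad credit cases.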
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Find a better threshold\n", + "\n", + "There is another way to tip the model scoring toward correctly identifying the bad credit cases. The scoring threshold can be adjusted. Until now, the scores have been computed from the probabilities using a threshold of 0.5. However, there is no reason to think this is the correct choice. Recall that the score is determined by setting the threshold along the sigmoidal or logistic function. It is possible to favor either positive or negative cases by changing the threshold along this curve. \n", + "\n", + "The code in the cell below contains a function for scoring and evaluating the model for a given threshold value. The `for` loop iterates over the list of five candidate threshold values. Execute this code and examine how changing the threshold value changes the scoring for the model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def test_threshold(probs, labels, threshold):\n", + " scores = score_model(probs, threshold)\n", + " print('')\n", + " print('For threshold = ' + str(threshold))\n", + " print_metrics(labels, scores)\n", + "\n", + "thresholds = [0.45, 0.40, 0.35, 0.3, 0.25]\n", + "for t in thresholds:\n", + " test_threshold(probabilities, y_test, t)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the threshold is decreased the number of correctly classified negative cases (bad credit customers) increases at the expense of correctly classifying positive cases (good credit customers). At the same time, accuracy decreases. However, as you have observed, accuracy is not a particularly useful metric here. \n", + "\n", + "Exactly which threshold to pick is a business decision. 
Notice that with a threshold value of 0.25 the number of false negatives (misclassified good credit customers) is about four times that of false positives (misclassified bad credit customers). " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Summary\n", + "\n", + "In this lesson you have done the following:\n", + "1. Prepared the credit risk data set for modeling with scikit-learn. The steps included scaling the numeric features and dummy-variable coding the categorical features. The result is a numpy array of features and a numpy array of the label values. \n", + "2. Computed a logistic regression model. \n", + "3. Evaluated the performance of the model using multiple metrics. It is clear that accuracy is not a particularly useful metric here. Because of the class imbalance, the naive 'classifier' produced an accuracy only somewhat worse than that of the logistic regression model. The confusion matrix and the precision, recall and F1 statistics gave meaningful measures of model performance, especially when considered together. \n", + "4. Reweighted the labels and changed the decision threshold for the reweighted model. These steps helped overcome both the class imbalance problem and the asymmetric cost of misclassification to the bank. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3.6", + "language": "python", + "name": "python36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module4/German_Credit_Preped.csv b/Module4/German_Credit_Preped.csv new file mode 100644 index 0000000..bb1ea3c --- /dev/null +++ b/Module4/German_Credit_Preped.csv @@ -0,0 +1,1001 @@ +customer_id,checking_account_status,loan_duration_mo,credit_history,purpose,loan_amount,savings_account_balance,time_employed_yrs,payment_pcnt_income,gender_status,other_signators,time_in_residence,property,age_yrs,other_credit_outstanding,home_ownership,number_loans,job_category,dependents,telephone,foreign_worker,bad_credit +1122334,< 0 DM,6,critical account - other non-bank loans,radio/television,1169,unknown/none,>= 7 years,4,male-single,none,4,real estate,67,none,own,2,skilled,1,yes,yes,0 +6156361,0 - 200 DM,48,current loans paid,radio/television,5951,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,real estate,22,none,own,1,skilled,1,none,yes,1 +2051359,none,12,critical account - other non-bank loans,education,2096,< 100 DM,4 - 7 years,2,male-single,none,3,real estate,49,none,own,1,unskilled-resident,2,none,yes,0 +8740590,< 0 DM,42,current loans paid,furniture/equipment,7882,< 100 DM,4 - 7 years,2,male-single,guarantor,4,building society savings/life insurance,45,none,for free,1,skilled,2,none,yes,0 +3924540,< 0 DM,24,past payment delays,car (new),4870,< 100 DM,1 - 4 years,3,male-single,none,4,unknown-none,53,none,for free,2,skilled,2,none,yes,1 +3115687,none,36,current loans paid,education,9055,unknown/none,1 - 4 
years,2,male-single,none,4,unknown-none,35,none,for free,1,unskilled-resident,2,yes,yes,0 +8251714,none,24,current loans paid,furniture/equipment,2835,500 - 1000 DM,>= 7 years,3,male-single,none,4,building society savings/life insurance,53,none,own,1,skilled,1,none,yes,0 +2272783,0 - 200 DM,36,current loans paid,car (used),6948,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,35,none,rent,1,highly skilled,1,yes,yes,0 +1865292,none,12,current loans paid,radio/television,3059,>= 1000 DM,4 - 7 years,2,male-divorced/separated,none,4,real estate,61,none,own,1,unskilled-resident,1,none,yes,0 +8369450,0 - 200 DM,30,critical account - other non-bank loans,car (new),5234,< 100 DM,unemployed,4,male-married/widowed,none,2,car or other,28,none,own,2,highly skilled,1,none,yes,1 +2859658,0 - 200 DM,12,current loans paid,car (new),1295,< 100 DM,< 1 year,3,female-divorced/separated/married,none,1,car or other,25,none,rent,1,skilled,1,none,yes,1 +7677202,< 0 DM,48,current loans paid,business,4308,< 100 DM,< 1 year,3,female-divorced/separated/married,none,4,building society savings/life insurance,24,none,rent,1,skilled,1,none,yes,1 +6368245,0 - 200 DM,12,current loans paid,radio/television,1567,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,1,car or other,22,none,own,1,skilled,1,yes,yes,0 +8363768,< 0 DM,24,critical account - other non-bank loans,car (new),1199,< 100 DM,>= 7 years,4,male-single,none,4,car or other,60,none,own,2,unskilled-resident,1,none,yes,1 +6846974,< 0 DM,15,current loans paid,car (new),1403,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,28,none,rent,1,skilled,1,none,yes,0 +6293067,< 0 DM,24,current loans paid,radio/television,1282,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,none,2,car or other,32,none,own,1,unskilled-resident,1,none,yes,1 +2807551,none,24,critical account - other non-bank loans,radio/television,2424,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life 
insurance,53,none,own,2,skilled,1,none,yes,0 +8394907,< 0 DM,30,no credit - paid,business,8072,unknown/none,< 1 year,2,male-single,none,3,car or other,25,bank,own,3,skilled,1,none,yes,0 +9516779,0 - 200 DM,24,current loans paid,car (used),12579,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,2,unknown-none,44,none,for free,1,highly skilled,1,yes,yes,1 +1750839,none,24,current loans paid,radio/television,3430,500 - 1000 DM,>= 7 years,3,male-single,none,2,car or other,31,none,own,1,skilled,2,yes,yes,0 +9007770,none,9,critical account - other non-bank loans,car (new),2134,< 100 DM,1 - 4 years,4,male-single,none,4,car or other,48,none,own,3,skilled,1,yes,yes,0 +5611578,< 0 DM,6,current loans paid,radio/television,2647,500 - 1000 DM,1 - 4 years,2,male-single,none,3,real estate,44,none,rent,1,skilled,2,none,yes,0 +3677496,< 0 DM,10,critical account - other non-bank loans,car (new),2241,< 100 DM,< 1 year,1,male-single,none,3,real estate,48,none,rent,2,unskilled-resident,2,none,no,0 +3265162,0 - 200 DM,12,critical account - other non-bank loans,car (used),1804,100 - 500 DM,< 1 year,3,male-single,none,4,building society savings/life insurance,44,none,own,1,skilled,1,none,yes,0 +7795200,none,10,critical account - other non-bank loans,furniture/equipment,2069,unknown/none,1 - 4 years,2,male-married/widowed,none,1,car or other,26,none,own,2,skilled,1,none,no,0 +4362293,< 0 DM,6,current loans paid,furniture/equipment,1374,< 100 DM,1 - 4 years,1,male-single,none,2,real estate,36,bank,own,1,unskilled-resident,1,yes,yes,0 +4506902,none,6,no credit - paid,radio/television,426,< 100 DM,>= 7 years,4,male-married/widowed,none,4,car or other,39,none,own,1,unskilled-resident,1,none,yes,0 +8881521,> 200 DM or salary assignment,12,all loans at bank paid,radio/television,409,>= 1000 DM,1 - 4 years,3,female-divorced/separated/married,none,3,real estate,42,none,rent,2,skilled,1,none,yes,0 +1398341,0 - 200 DM,7,current loans paid,radio/television,2415,< 100 DM,1 - 4 
years,3,male-single,guarantor,2,real estate,34,none,own,1,skilled,1,none,yes,0 +7333893,< 0 DM,60,past payment delays,business,6836,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,63,none,own,2,skilled,1,yes,yes,1 +7890417,0 - 200 DM,18,current loans paid,business,1913,>= 1000 DM,< 1 year,3,male-married/widowed,none,3,real estate,36,bank,own,1,skilled,1,yes,yes,0 +2017286,< 0 DM,24,current loans paid,furniture/equipment,4020,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,27,stores,own,1,skilled,1,none,yes,0 +7081292,0 - 200 DM,18,current loans paid,car (new),5866,100 - 500 DM,1 - 4 years,2,male-single,none,2,car or other,30,none,own,2,skilled,1,yes,yes,0 +8807838,none,12,critical account - other non-bank loans,business,1264,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,57,none,rent,1,unskilled-resident,1,none,yes,0 +1221741,> 200 DM or salary assignment,12,current loans paid,furniture/equipment,1474,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,building society savings/life insurance,33,bank,own,1,highly skilled,1,yes,yes,0 +3977061,0 - 200 DM,45,critical account - other non-bank loans,radio/television,4746,< 100 DM,< 1 year,4,male-single,none,2,building society savings/life insurance,25,none,own,2,unskilled-resident,1,none,yes,1 +6028350,none,48,critical account - other non-bank loans,education,6110,< 100 DM,1 - 4 years,1,male-single,none,3,unknown-none,31,bank,for free,1,skilled,1,yes,yes,0 +2903516,> 200 DM or salary assignment,18,current loans paid,radio/television,2100,< 100 DM,1 - 4 years,4,male-single,co-applicant,2,real estate,37,stores,own,1,skilled,1,none,yes,1 +8335289,> 200 DM or salary assignment,10,current loans paid,domestic appliances,1225,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,37,none,own,1,skilled,1,yes,yes,0 +8596814,0 - 200 DM,9,current loans paid,radio/television,458,< 100 DM,1 - 4 years,4,male-single,none,3,real estate,24,none,own,1,skilled,1,none,yes,0 +1473485,none,30,current loans 
paid,radio/television,2333,500 - 1000 DM,>= 7 years,4,male-single,none,2,car or other,30,bank,own,1,highly skilled,1,none,yes,0 +5043986,0 - 200 DM,12,current loans paid,radio/television,1158,500 - 1000 DM,1 - 4 years,3,male-divorced/separated,none,1,car or other,26,none,own,1,skilled,1,yes,yes,0 +7015188,0 - 200 DM,18,past payment delays,repairs,6204,< 100 DM,1 - 4 years,2,male-single,none,4,real estate,44,none,own,1,unskilled-resident,2,yes,yes,0 +5609809,< 0 DM,30,critical account - other non-bank loans,car (used),6187,100 - 500 DM,4 - 7 years,1,male-married/widowed,none,4,car or other,24,none,rent,2,skilled,1,none,yes,0 +6574474,< 0 DM,48,critical account - other non-bank loans,car (used),6143,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,4,unknown-none,58,stores,for free,2,unskilled-resident,1,none,yes,1 +6668437,none,11,critical account - other non-bank loans,car (new),1393,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,car or other,35,none,own,2,highly skilled,1,none,yes,0 +7468489,none,36,current loans paid,radio/television,2299,500 - 1000 DM,>= 7 years,4,male-single,none,4,car or other,39,none,own,1,skilled,1,none,yes,0 +6045059,< 0 DM,6,current loans paid,car (used),1352,500 - 1000 DM,unemployed,1,female-divorced/separated/married,none,2,building society savings/life insurance,23,none,rent,1,unemployed-unskilled-non-resident,1,yes,yes,0 +3678840,none,11,critical account - other non-bank loans,car (new),7228,< 100 DM,1 - 4 years,1,male-single,none,4,building society savings/life insurance,39,none,own,2,unskilled-resident,1,none,yes,0 +9514325,none,12,current loans paid,radio/television,2073,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,co-applicant,2,real estate,28,none,own,1,skilled,1,none,yes,0 +7764002,0 - 200 DM,24,past payment delays,furniture/equipment,2333,unknown/none,< 1 year,4,male-single,none,2,building society savings/life insurance,29,bank,own,1,unskilled-resident,1,none,yes,0 +4029031,0 - 200 
DM,27,past payment delays,car (used),5965,< 100 DM,>= 7 years,1,male-single,none,2,car or other,30,none,own,2,highly skilled,1,yes,yes,0 +1318782,none,12,current loans paid,radio/television,1262,< 100 DM,1 - 4 years,3,male-single,none,2,car or other,25,none,own,1,skilled,1,none,yes,0 +1442097,none,18,current loans paid,car (used),3378,unknown/none,1 - 4 years,2,male-single,none,1,building society savings/life insurance,31,none,own,1,skilled,1,yes,yes,0 +7784864,0 - 200 DM,36,past payment delays,car (new),2225,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,57,bank,for free,2,skilled,1,yes,yes,1 +4572543,none,6,all loans at bank paid,car (new),783,unknown/none,1 - 4 years,1,male-single,guarantor,2,real estate,26,stores,own,1,unskilled-resident,2,none,yes,0 +4365555,0 - 200 DM,12,current loans paid,radio/television,6468,unknown/none,unemployed,2,male-single,none,1,unknown-none,52,none,own,1,highly skilled,1,yes,yes,1 +6134562,none,36,critical account - other non-bank loans,radio/television,9566,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,car or other,31,stores,own,2,skilled,1,none,yes,0 +3409265,> 200 DM or salary assignment,18,current loans paid,car (new),1961,< 100 DM,>= 7 years,3,female-divorced/separated/married,none,2,car or other,23,none,own,1,highly skilled,1,none,yes,0 +1504574,< 0 DM,36,critical account - other non-bank loans,furniture/equipment,6229,< 100 DM,< 1 year,4,female-divorced/separated/married,co-applicant,4,unknown-none,23,none,rent,2,unskilled-resident,1,yes,yes,1 +3256655,0 - 200 DM,9,current loans paid,business,1391,< 100 DM,1 - 4 years,2,male-married/widowed,none,1,real estate,27,bank,own,1,skilled,1,yes,yes,0 +8806796,0 - 200 DM,15,critical account - other non-bank loans,radio/television,1537,unknown/none,>= 7 years,4,male-single,guarantor,4,real estate,50,none,own,2,skilled,1,yes,yes,0 +1364372,0 - 200 DM,36,no credit - paid,business,1953,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,61,none,for 
free,1,highly skilled,1,yes,yes,1 +8688712,0 - 200 DM,48,no credit - paid,business,14421,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,25,none,own,1,skilled,1,yes,yes,1 +3747039,none,24,current loans paid,radio/television,3181,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,26,none,own,1,skilled,1,yes,yes,0 +3453642,none,27,current loans paid,repairs,5190,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life insurance,48,none,own,4,skilled,2,yes,yes,0 +5520489,none,12,current loans paid,radio/television,2171,< 100 DM,< 1 year,2,female-divorced/separated/married,none,2,car or other,29,bank,own,1,skilled,1,none,yes,0 +1647906,0 - 200 DM,12,current loans paid,car (new),1007,>= 1000 DM,1 - 4 years,4,male-married/widowed,none,1,real estate,22,none,own,1,skilled,1,none,yes,0 +2987381,none,36,current loans paid,education,1819,< 100 DM,1 - 4 years,4,male-single,none,4,unknown-none,37,stores,for free,1,skilled,1,yes,yes,1 +3683782,none,36,current loans paid,radio/television,2394,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,25,none,own,1,skilled,1,none,yes,0 +6065675,none,36,current loans paid,car (used),8133,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,2,building society savings/life insurance,30,bank,own,1,skilled,1,none,yes,0 +1510672,none,7,critical account - other non-bank loans,radio/television,730,unknown/none,>= 7 years,4,male-single,none,2,building society savings/life insurance,46,none,rent,2,unskilled-resident,1,yes,yes,0 +6390550,< 0 DM,8,critical account - other non-bank loans,other,1164,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,51,bank,for free,2,highly skilled,2,yes,yes,0 +6771352,0 - 200 DM,42,critical account - other non-bank loans,business,5954,< 100 DM,4 - 7 years,2,female-divorced/separated/married,none,1,real estate,41,bank,own,2,unskilled-resident,1,none,yes,0 +4802045,< 0 DM,36,current loans 
paid,education,1977,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,40,none,own,1,highly skilled,1,yes,yes,1 +8911390,< 0 DM,12,critical account - other non-bank loans,car (used),1526,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,66,none,for free,2,highly skilled,1,none,yes,0 +1139059,< 0 DM,42,current loans paid,radio/television,3965,< 100 DM,< 1 year,4,male-single,none,3,car or other,34,none,own,1,skilled,1,none,yes,1 +7399701,0 - 200 DM,11,past payment delays,radio/television,4771,< 100 DM,4 - 7 years,2,male-single,none,4,building society savings/life insurance,51,none,own,1,skilled,1,none,yes,0 +5604915,none,54,no credit - paid,car (used),9436,unknown/none,1 - 4 years,2,male-single,none,2,building society savings/life insurance,39,none,own,1,unskilled-resident,2,none,yes,0 +7683335,0 - 200 DM,30,current loans paid,furniture/equipment,3832,< 100 DM,< 1 year,2,male-married/widowed,none,1,building society savings/life insurance,22,none,own,1,skilled,1,none,yes,0 +5503995,none,24,current loans paid,radio/television,5943,unknown/none,< 1 year,1,female-divorced/separated/married,none,1,car or other,44,none,own,2,skilled,1,yes,yes,1 +8801062,none,15,current loans paid,radio/television,1213,500 - 1000 DM,>= 7 years,4,male-single,none,3,building society savings/life insurance,47,stores,own,1,skilled,1,yes,yes,0 +1817735,none,18,current loans paid,business,1568,100 - 500 DM,1 - 4 years,3,female-divorced/separated/married,none,4,building society savings/life insurance,24,none,rent,1,unskilled-resident,1,none,yes,0 +1806409,< 0 DM,24,current loans paid,other,1755,< 100 DM,>= 7 years,4,female-divorced/separated/married,guarantor,4,real estate,58,none,own,1,unskilled-resident,1,yes,yes,0 +3028735,< 0 DM,10,current loans paid,radio/television,2315,< 100 DM,>= 7 years,3,male-single,none,4,real estate,52,none,own,1,unskilled-resident,1,none,yes,0 +3707969,none,12,critical account - other non-bank loans,business,1412,< 100 DM,1 - 4 
years,4,female-divorced/separated/married,guarantor,2,real estate,29,none,own,2,highly skilled,1,yes,yes,0 +7327747,0 - 200 DM,18,critical account - other non-bank loans,furniture/equipment,1295,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,building society savings/life insurance,27,none,own,2,skilled,1,none,yes,0 +5790827,0 - 200 DM,36,current loans paid,education,12612,100 - 500 DM,1 - 4 years,1,male-single,none,4,unknown-none,47,none,for free,1,skilled,2,yes,yes,1 +1812102,< 0 DM,18,current loans paid,car (new),2249,100 - 500 DM,4 - 7 years,4,male-single,none,3,car or other,30,none,own,1,highly skilled,2,yes,yes,0 +4049219,< 0 DM,12,no credit - paid,repairs,1108,< 100 DM,4 - 7 years,4,male-single,none,3,real estate,28,none,own,2,skilled,1,none,yes,1 +6177564,none,12,critical account - other non-bank loans,radio/television,618,< 100 DM,>= 7 years,4,male-single,none,4,real estate,56,none,own,1,skilled,1,none,yes,0 +9127163,< 0 DM,12,critical account - other non-bank loans,car (used),1409,< 100 DM,>= 7 years,4,male-single,none,3,real estate,54,none,own,1,skilled,1,none,yes,0 +2628060,none,12,critical account - other non-bank loans,radio/television,797,unknown/none,>= 7 years,4,female-divorced/separated/married,none,3,building society savings/life insurance,33,bank,own,1,unskilled-resident,2,none,yes,1 +2031456,> 200 DM or salary assignment,24,critical account - other non-bank loans,furniture/equipment,3617,unknown/none,>= 7 years,4,male-single,co-applicant,4,unknown-none,20,none,rent,2,skilled,1,none,yes,0 +3725303,0 - 200 DM,12,current loans paid,car (new),1318,>= 1000 DM,>= 7 years,4,male-single,none,4,real estate,54,none,own,1,skilled,1,yes,yes,0 +8326409,0 - 200 DM,54,no credit - paid,business,15945,< 100 DM,< 1 year,3,male-single,none,4,unknown-none,58,none,rent,1,skilled,1,yes,yes,1 +1957928,none,12,critical account - other non-bank loans,education,2012,unknown/none,4 - 7 years,4,female-divorced/separated/married,none,2,car or 
other,61,none,own,1,skilled,1,none,yes,0 +6428170,0 - 200 DM,18,current loans paid,business,2622,100 - 500 DM,1 - 4 years,4,male-single,none,4,car or other,34,none,own,1,skilled,1,none,yes,0 +4371483,0 - 200 DM,36,critical account - other non-bank loans,radio/television,2337,< 100 DM,>= 7 years,4,male-single,none,4,real estate,36,none,own,1,skilled,1,none,yes,0 +7476603,0 - 200 DM,20,past payment delays,car (used),7057,unknown/none,4 - 7 years,3,male-single,none,4,building society savings/life insurance,36,bank,rent,2,highly skilled,2,yes,yes,0 +2083425,none,24,current loans paid,car (new),1469,100 - 500 DM,>= 7 years,4,male-married/widowed,none,4,real estate,41,none,rent,1,unskilled-resident,1,none,yes,0 +7663021,0 - 200 DM,36,current loans paid,radio/television,2323,< 100 DM,4 - 7 years,4,male-single,none,4,car or other,24,none,rent,1,skilled,1,none,yes,0 +8330126,none,6,past payment delays,radio/television,932,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,real estate,24,none,own,1,skilled,1,none,yes,0 +4762252,0 - 200 DM,9,critical account - other non-bank loans,furniture/equipment,1919,< 100 DM,4 - 7 years,4,male-single,none,3,car or other,35,none,rent,1,skilled,1,yes,yes,0 +8744199,none,12,current loans paid,car (used),2445,unknown/none,< 1 year,2,male-married/widowed,none,4,car or other,26,none,rent,1,skilled,1,yes,yes,0 +7423252,0 - 200 DM,24,critical account - other non-bank loans,other,11938,< 100 DM,1 - 4 years,2,male-single,co-applicant,3,car or other,39,none,own,2,highly skilled,2,yes,yes,1 +7539327,none,18,all loans at bank paid,car (new),6458,< 100 DM,>= 7 years,2,male-single,none,4,unknown-none,39,bank,own,2,highly skilled,2,yes,yes,1 +9510423,0 - 200 DM,12,current loans paid,car (new),6078,< 100 DM,4 - 7 years,2,male-single,none,2,car or other,32,none,own,1,skilled,1,none,yes,0 +7532368,< 0 DM,24,current loans paid,furniture/equipment,7721,unknown/none,< 1 year,1,female-divorced/separated/married,none,2,building society 
savings/life insurance,30,none,own,1,skilled,1,yes,no,0 +1929784,0 - 200 DM,14,current loans paid,business,1410,500 - 1000 DM,>= 7 years,1,male-married/widowed,none,2,real estate,35,none,own,1,skilled,1,yes,yes,0 +9517102,> 200 DM or salary assignment,6,critical account - other non-bank loans,car (new),1299,< 100 DM,1 - 4 years,1,male-single,none,1,real estate,74,none,own,3,unemployed-unskilled-non-resident,2,none,no,0 +2801710,0 - 200 DM,9,current loans paid,furniture/equipment,918,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,1,building society savings/life insurance,30,none,own,1,skilled,1,none,yes,1 +3066581,0 - 200 DM,6,past payment delays,business,1449,100 - 500 DM,>= 7 years,1,male-divorced/separated,none,2,car or other,31,bank,own,2,skilled,2,none,yes,0 +2924368,> 200 DM or salary assignment,15,current loans paid,education,392,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,23,none,rent,1,skilled,1,yes,yes,0 +5814710,0 - 200 DM,18,current loans paid,car (new),6260,< 100 DM,4 - 7 years,3,male-single,none,3,real estate,28,none,rent,1,unskilled-resident,1,none,yes,0 +5186756,none,36,critical account - other non-bank loans,car (new),7855,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,real estate,25,stores,own,2,skilled,1,yes,yes,1 +3473006,< 0 DM,12,current loans paid,radio/television,1680,500 - 1000 DM,>= 7 years,3,male-married/widowed,none,1,real estate,35,none,own,1,skilled,1,none,yes,0 +7491936,none,48,critical account - other non-bank loans,radio/television,3578,unknown/none,>= 7 years,4,male-single,none,1,real estate,47,none,own,1,skilled,1,yes,yes,0 +8010233,< 0 DM,42,current loans paid,radio/television,7174,unknown/none,4 - 7 years,4,female-divorced/separated/married,none,3,car or other,30,none,own,1,highly skilled,1,yes,yes,1 +4298578,< 0 DM,10,critical account - other non-bank loans,furniture/equipment,2132,unknown/none,< 1 
year,2,female-divorced/separated/married,co-applicant,3,real estate,27,none,rent,2,skilled,1,none,no,0 +2515323,< 0 DM,33,critical account - other non-bank loans,furniture/equipment,4281,500 - 1000 DM,1 - 4 years,1,female-divorced/separated/married,none,4,car or other,23,none,own,2,skilled,1,none,yes,1 +2317941,0 - 200 DM,12,critical account - other non-bank loans,car (new),2366,500 - 1000 DM,4 - 7 years,3,male-divorced/separated,none,3,car or other,36,none,own,1,highly skilled,1,yes,yes,0 +3906586,< 0 DM,21,current loans paid,radio/television,1835,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,real estate,25,none,own,2,skilled,1,yes,yes,1 +4863589,none,24,critical account - other non-bank loans,car (used),3868,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,2,car or other,41,none,rent,2,highly skilled,1,yes,yes,0 +4625102,none,12,current loans paid,furniture/equipment,1768,< 100 DM,1 - 4 years,3,male-single,none,2,real estate,24,none,rent,1,unskilled-resident,1,none,yes,0 +8954004,> 200 DM or salary assignment,10,critical account - other non-bank loans,car (new),781,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,63,none,for free,2,skilled,1,yes,yes,0 +1498295,0 - 200 DM,18,current loans paid,furniture/equipment,1924,unknown/none,< 1 year,4,female-divorced/separated/married,none,3,real estate,27,none,rent,1,skilled,1,none,yes,1 +4851363,< 0 DM,12,critical account - other non-bank loans,car (new),2121,< 100 DM,1 - 4 years,4,male-single,none,2,building society savings/life insurance,30,none,own,2,skilled,1,none,yes,0 +1018706,< 0 DM,12,current loans paid,radio/television,701,< 100 DM,1 - 4 years,4,male-married/widowed,none,2,real estate,40,none,own,1,unskilled-resident,1,none,yes,0 +9208119,0 - 200 DM,12,current loans paid,repairs,639,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,30,none,own,1,skilled,1,none,yes,1 +4300293,0 - 200 DM,12,critical account - other non-bank loans,car (used),1860,< 100 
DM,unemployed,4,male-single,none,2,car or other,34,none,own,2,highly skilled,1,yes,yes,0 +7728014,< 0 DM,12,critical account - other non-bank loans,car (new),3499,< 100 DM,1 - 4 years,3,female-divorced/separated/married,co-applicant,2,real estate,29,none,own,2,skilled,1,none,yes,1 +7990216,0 - 200 DM,48,current loans paid,car (new),8487,unknown/none,4 - 7 years,1,female-divorced/separated/married,none,2,car or other,24,none,own,1,skilled,1,none,yes,0 +1318029,< 0 DM,36,past payment delays,education,6887,< 100 DM,1 - 4 years,4,male-single,none,3,building society savings/life insurance,29,stores,own,1,skilled,1,yes,yes,1 +9949768,none,15,current loans paid,furniture/equipment,2708,< 100 DM,< 1 year,2,male-single,none,3,building society savings/life insurance,27,bank,own,2,unskilled-resident,1,none,yes,0 +7906171,none,18,current loans paid,furniture/equipment,1984,< 100 DM,1 - 4 years,4,male-single,none,4,unknown-none,47,bank,for free,2,skilled,1,none,yes,0 +4495501,none,60,current loans paid,radio/television,10144,100 - 500 DM,4 - 7 years,2,female-divorced/separated/married,none,4,real estate,21,none,own,1,skilled,1,yes,yes,0 +4891536,none,12,critical account - other non-bank loans,radio/television,1240,unknown/none,>= 7 years,4,female-divorced/separated/married,none,2,real estate,38,none,own,2,skilled,1,yes,yes,0 +6332313,none,27,past payment delays,car (used),8613,>= 1000 DM,1 - 4 years,2,male-single,none,2,car or other,27,none,own,2,skilled,1,none,yes,0 +2114106,0 - 200 DM,12,current loans paid,radio/television,766,500 - 1000 DM,1 - 4 years,4,male-single,none,3,real estate,66,none,own,1,unskilled-resident,1,none,yes,1 +8221756,0 - 200 DM,15,critical account - other non-bank loans,radio/television,2728,unknown/none,4 - 7 years,4,male-single,guarantor,2,real estate,35,bank,own,3,skilled,1,yes,yes,0 +7308446,> 200 DM or salary assignment,12,current loans paid,radio/television,1881,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,car or 
other,44,none,rent,1,unskilled-resident,1,yes,yes,0 +5759358,> 200 DM or salary assignment,6,current loans paid,car (new),709,>= 1000 DM,< 1 year,2,male-married/widowed,none,2,real estate,27,none,own,1,unemployed-unskilled-non-resident,1,none,no,0 +1432196,0 - 200 DM,36,current loans paid,radio/television,4795,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,unknown-none,30,none,own,1,highly skilled,1,yes,yes,0 +2429343,< 0 DM,27,current loans paid,radio/television,3416,< 100 DM,1 - 4 years,3,male-single,none,2,car or other,27,none,own,1,highly skilled,1,none,yes,0 +1581866,< 0 DM,18,current loans paid,furniture/equipment,2462,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,22,none,own,1,skilled,1,none,yes,1 +6989622,none,21,critical account - other non-bank loans,furniture/equipment,2288,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,23,none,own,1,skilled,1,yes,yes,0 +2025676,0 - 200 DM,48,all loans at bank paid,business,3566,100 - 500 DM,4 - 7 years,4,male-single,none,2,car or other,30,none,own,1,skilled,1,none,yes,0 +3000727,< 0 DM,6,critical account - other non-bank loans,car (new),860,< 100 DM,>= 7 years,1,female-divorced/separated/married,none,4,unknown-none,39,none,own,2,skilled,1,yes,yes,0 +3953814,none,12,critical account - other non-bank loans,car (new),682,100 - 500 DM,4 - 7 years,4,female-divorced/separated/married,none,3,car or other,51,none,own,2,skilled,1,yes,yes,0 +7408825,< 0 DM,36,critical account - other non-bank loans,furniture/equipment,5371,< 100 DM,1 - 4 years,3,male-single,guarantor,2,building society savings/life insurance,28,none,own,2,skilled,1,none,yes,0 +6226321,none,18,critical account - other non-bank loans,radio/television,1582,>= 1000 DM,>= 7 years,4,male-single,none,4,car or other,46,none,own,2,skilled,1,none,yes,0 +6083792,none,6,current loans paid,radio/television,1346,100 - 500 DM,>= 7 years,2,male-single,none,4,unknown-none,42,bank,for 
free,1,skilled,2,yes,yes,0 +5122662,none,10,current loans paid,radio/television,1924,< 100 DM,1 - 4 years,1,male-single,none,4,building society savings/life insurance,38,none,own,1,skilled,1,yes,no,0 +9746441,> 200 DM or salary assignment,36,current loans paid,radio/television,5848,< 100 DM,1 - 4 years,4,male-single,none,1,car or other,24,none,own,1,skilled,1,none,yes,0 +7353198,0 - 200 DM,24,critical account - other non-bank loans,car (used),7758,>= 1000 DM,>= 7 years,2,female-divorced/separated/married,none,4,unknown-none,29,none,rent,1,skilled,1,none,yes,0 +7766762,0 - 200 DM,24,past payment delays,business,6967,100 - 500 DM,4 - 7 years,4,male-single,none,4,car or other,36,none,rent,1,highly skilled,1,yes,yes,0 +5175459,< 0 DM,12,current loans paid,furniture/equipment,1282,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,20,none,rent,1,skilled,1,none,yes,1 +5520884,< 0 DM,9,critical account - other non-bank loans,repairs,1288,100 - 500 DM,>= 7 years,3,male-single,guarantor,4,real estate,48,none,own,2,skilled,2,none,no,0 +1137438,< 0 DM,12,all loans at bank paid,retraining,339,< 100 DM,>= 7 years,4,male-married/widowed,none,1,car or other,45,bank,own,1,unskilled-resident,1,none,yes,0 +3892302,0 - 200 DM,24,current loans paid,car (new),3512,100 - 500 DM,4 - 7 years,2,male-single,none,3,car or other,38,bank,own,2,skilled,1,yes,yes,0 +9957275,none,6,critical account - other non-bank loans,radio/television,1898,unknown/none,1 - 4 years,1,male-single,none,2,real estate,34,none,own,2,unskilled-resident,2,none,yes,0 +4695507,none,24,critical account - other non-bank loans,radio/television,2872,100 - 500 DM,>= 7 years,3,male-single,none,4,real estate,36,none,own,1,skilled,2,yes,yes,0 +7775118,none,18,critical account - other non-bank loans,car (new),1055,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,building society savings/life insurance,30,none,own,2,skilled,1,none,yes,0 +4717846,none,15,current loans paid,domestic 
appliances,1262,500 - 1000 DM,4 - 7 years,4,male-single,none,3,building society savings/life insurance,36,none,own,2,skilled,1,yes,yes,0 +3539651,0 - 200 DM,10,current loans paid,car (new),7308,< 100 DM,unemployed,2,male-single,none,4,unknown-none,70,bank,for free,1,highly skilled,1,yes,yes,0 +6446799,none,36,current loans paid,car (new),909,500 - 1000 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,36,none,own,1,skilled,1,none,yes,0 +5749718,none,6,current loans paid,furniture/equipment,2978,500 - 1000 DM,1 - 4 years,1,male-single,none,2,car or other,32,none,own,1,skilled,1,yes,yes,0 +8930405,< 0 DM,18,current loans paid,furniture/equipment,1131,< 100 DM,unemployed,4,female-divorced/separated/married,none,2,car or other,33,none,own,1,skilled,1,none,yes,1 +4030908,0 - 200 DM,11,current loans paid,furniture/equipment,1577,>= 1000 DM,< 1 year,4,female-divorced/separated/married,none,1,real estate,20,none,own,1,skilled,1,none,yes,0 +1327333,none,24,current loans paid,furniture/equipment,3972,< 100 DM,4 - 7 years,2,female-divorced/separated/married,none,4,building society savings/life insurance,25,none,rent,1,skilled,1,yes,yes,0 +5329060,0 - 200 DM,24,critical account - other non-bank loans,business,1935,< 100 DM,>= 7 years,4,male-divorced/separated,none,4,real estate,31,none,own,2,skilled,1,yes,yes,1 +7922005,< 0 DM,15,no credit - paid,car (new),950,< 100 DM,>= 7 years,4,male-single,none,3,car or other,33,none,rent,2,skilled,2,none,yes,1 +6450905,none,12,current loans paid,furniture/equipment,763,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,1,real estate,26,none,own,1,skilled,1,yes,yes,0 +1909580,0 - 200 DM,24,past payment delays,furniture/equipment,2064,< 100 DM,unemployed,3,female-divorced/separated/married,none,2,building society savings/life insurance,34,none,own,1,highly skilled,1,yes,yes,1 +2714562,0 - 200 DM,8,current loans paid,radio/television,1414,< 100 DM,1 - 4 years,4,male-single,guarantor,2,real 
estate,33,none,own,1,skilled,1,none,no,0 +6826470,< 0 DM,21,past payment delays,education,3414,< 100 DM,< 1 year,2,male-single,none,1,building society savings/life insurance,26,none,own,2,skilled,1,none,yes,1 +4340005,none,30,all loans at bank paid,car (used),7485,unknown/none,unemployed,4,female-divorced/separated/married,none,1,real estate,53,bank,own,1,highly skilled,1,yes,yes,1 +7800786,< 0 DM,12,current loans paid,furniture/equipment,2577,< 100 DM,1 - 4 years,2,male-divorced/separated,none,1,car or other,42,none,own,1,skilled,1,none,yes,0 +9266615,< 0 DM,6,critical account - other non-bank loans,radio/television,338,500 - 1000 DM,>= 7 years,4,male-single,none,4,car or other,52,none,own,2,skilled,1,none,yes,0 +6212932,none,12,current loans paid,radio/television,1963,< 100 DM,4 - 7 years,4,male-single,none,2,car or other,31,none,rent,2,highly skilled,2,yes,yes,0 +5419741,< 0 DM,21,critical account - other non-bank loans,car (new),571,< 100 DM,>= 7 years,4,male-single,none,4,real estate,65,none,own,2,skilled,1,none,yes,0 +9177065,none,36,past payment delays,business,9572,< 100 DM,< 1 year,1,male-divorced/separated,none,1,car or other,28,none,own,2,skilled,1,none,yes,1 +9804802,0 - 200 DM,36,past payment delays,business,4455,< 100 DM,1 - 4 years,2,male-divorced/separated,none,2,real estate,30,stores,own,2,highly skilled,1,yes,yes,1 +8560790,< 0 DM,21,all loans at bank paid,car (new),1647,unknown/none,1 - 4 years,4,male-single,none,2,building society savings/life insurance,40,none,own,2,unskilled-resident,2,none,yes,1 +2450622,none,24,critical account - other non-bank loans,furniture/equipment,3777,>= 1000 DM,1 - 4 years,4,male-single,none,4,real estate,50,none,own,1,skilled,1,yes,yes,0 +6376471,0 - 200 DM,18,critical account - other non-bank loans,car (new),884,< 100 DM,>= 7 years,4,male-single,none,4,car or other,36,bank,own,1,skilled,2,yes,yes,1 +2549308,none,15,critical account - other non-bank loans,radio/television,1360,< 100 DM,1 - 4 
years,4,male-single,none,2,building society savings/life insurance,31,none,own,2,skilled,1,none,yes,0 +5285284,0 - 200 DM,9,all loans at bank paid,car (used),5129,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,4,unknown-none,74,bank,for free,1,highly skilled,2,yes,yes,1 +3020208,0 - 200 DM,16,critical account - other non-bank loans,car (new),1175,< 100 DM,unemployed,2,male-single,none,3,car or other,68,none,for free,3,unemployed-unskilled-non-resident,1,yes,yes,0 +4186149,< 0 DM,12,current loans paid,radio/television,674,100 - 500 DM,4 - 7 years,4,male-married/widowed,none,1,building society savings/life insurance,20,none,own,1,skilled,1,none,yes,1 +4599122,0 - 200 DM,18,no credit - paid,furniture/equipment,3244,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,4,car or other,33,bank,own,2,skilled,1,yes,yes,0 +1730668,none,24,current loans paid,business,4591,>= 1000 DM,1 - 4 years,2,male-single,none,3,building society savings/life insurance,54,none,own,3,highly skilled,1,yes,yes,1 +3082071,0 - 200 DM,48,no credit - paid,business,3844,100 - 500 DM,4 - 7 years,4,male-single,none,4,unknown-none,34,none,for free,1,unskilled-resident,2,none,yes,1 +5960666,0 - 200 DM,27,current loans paid,business,3915,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,36,none,own,1,skilled,2,yes,yes,1 +7033466,none,6,current loans paid,radio/television,2108,< 100 DM,4 - 7 years,2,male-married/widowed,none,2,real estate,29,none,rent,1,skilled,1,none,yes,0 +4027322,0 - 200 DM,45,current loans paid,radio/television,3031,100 - 500 DM,1 - 4 years,4,male-single,guarantor,4,building society savings/life insurance,21,none,rent,1,skilled,1,none,yes,1 +4067442,0 - 200 DM,9,critical account - other non-bank loans,education,1501,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,3,car or other,34,none,own,2,highly skilled,1,yes,yes,1 +2816577,none,6,critical account - other non-bank loans,radio/television,1382,< 100 DM,1 - 4 
years,1,female-divorced/separated/married,none,1,car or other,28,none,own,2,skilled,1,yes,yes,0 +2964220,0 - 200 DM,12,current loans paid,furniture/equipment,951,100 - 500 DM,< 1 year,4,female-divorced/separated/married,none,4,car or other,27,bank,rent,4,skilled,1,none,yes,1 +4483104,0 - 200 DM,24,current loans paid,car (used),2760,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,36,bank,for free,1,skilled,1,yes,yes,0 +6604417,0 - 200 DM,18,past payment delays,furniture/equipment,4297,< 100 DM,>= 7 years,4,male-divorced/separated,none,3,unknown-none,40,none,own,1,highly skilled,1,yes,yes,1 +7854022,none,9,critical account - other non-bank loans,education,936,500 - 1000 DM,>= 7 years,4,male-single,none,2,car or other,52,none,own,2,skilled,1,yes,yes,0 +5232363,< 0 DM,12,current loans paid,car (new),1168,< 100 DM,1 - 4 years,4,male-married/widowed,none,3,real estate,27,none,own,1,unskilled-resident,1,none,yes,0 +7627208,none,27,past payment delays,business,5117,< 100 DM,4 - 7 years,3,male-single,none,4,car or other,26,none,own,2,skilled,1,none,yes,0 +8257351,< 0 DM,12,current loans paid,retraining,902,< 100 DM,4 - 7 years,4,male-married/widowed,none,4,building society savings/life insurance,21,none,rent,1,skilled,1,none,yes,1 +4920965,none,12,critical account - other non-bank loans,car (new),1495,< 100 DM,>= 7 years,4,male-single,none,1,real estate,38,none,own,2,unskilled-resident,2,none,yes,0 +7198693,< 0 DM,30,critical account - other non-bank loans,car (used),10623,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,38,none,for free,3,highly skilled,2,yes,yes,0 +6463715,none,12,critical account - other non-bank loans,furniture/equipment,1935,< 100 DM,>= 7 years,4,male-single,none,4,real estate,43,none,own,3,skilled,1,yes,yes,0 +3336919,0 - 200 DM,12,critical account - other non-bank loans,domestic appliances,1424,< 100 DM,4 - 7 years,4,male-single,none,3,building society savings/life insurance,26,none,own,1,skilled,1,none,yes,0 +1727035,< 0 DM,24,current 
loans paid,business,6568,< 100 DM,1 - 4 years,2,male-married/widowed,none,2,car or other,21,stores,own,1,unskilled-resident,1,none,yes,0 +1425487,none,12,current loans paid,car (used),1413,>= 1000 DM,4 - 7 years,3,male-single,none,2,building society savings/life insurance,55,none,own,1,skilled,1,none,no,0 +3808400,none,9,critical account - other non-bank loans,radio/television,3074,unknown/none,1 - 4 years,1,male-single,none,2,real estate,33,none,own,2,skilled,2,none,yes,0 +9925070,none,36,current loans paid,radio/television,3835,unknown/none,>= 7 years,2,female-divorced/separated/married,none,4,real estate,45,none,own,1,unskilled-resident,1,yes,yes,0 +3589132,< 0 DM,27,no credit - paid,business,5293,< 100 DM,unemployed,2,male-single,none,4,building society savings/life insurance,50,stores,own,2,skilled,1,yes,yes,1 +1102227,> 200 DM or salary assignment,30,past payment delays,business,1908,< 100 DM,>= 7 years,4,male-single,none,4,real estate,66,none,own,1,highly skilled,1,yes,yes,1 +3188138,none,36,critical account - other non-bank loans,radio/television,3342,unknown/none,>= 7 years,4,male-single,none,2,car or other,51,none,own,1,skilled,1,yes,yes,0 +2870117,0 - 200 DM,6,critical account - other non-bank loans,retraining,932,unknown/none,4 - 7 years,1,female-divorced/separated/married,none,3,building society savings/life insurance,39,none,own,2,unskilled-resident,1,none,yes,0 +3212489,< 0 DM,18,no credit - paid,business,3104,< 100 DM,4 - 7 years,3,male-single,none,1,building society savings/life insurance,31,bank,own,1,skilled,1,yes,yes,0 +8004570,> 200 DM or salary assignment,36,current loans paid,radio/television,3913,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,23,none,own,1,skilled,1,yes,yes,0 +1878782,< 0 DM,24,current loans paid,furniture/equipment,3021,< 100 DM,1 - 4 years,2,male-divorced/separated,none,2,real estate,24,none,rent,1,unskilled-resident,1,none,yes,0 +8474591,none,10,current loans paid,car (new),1364,< 100 DM,1 - 4 
years,2,female-divorced/separated/married,none,4,car or other,64,none,own,1,skilled,1,yes,yes,0 +7696088,0 - 200 DM,12,current loans paid,radio/television,625,< 100 DM,< 1 year,4,male-married/widowed,guarantor,1,real estate,26,bank,own,1,unskilled-resident,1,none,yes,0 +6328300,< 0 DM,12,current loans paid,education,1200,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,23,bank,rent,1,skilled,1,yes,yes,0 +2321864,none,12,current loans paid,radio/television,707,< 100 DM,1 - 4 years,4,male-single,none,2,real estate,30,bank,own,2,skilled,1,none,yes,0 +4113839,none,24,past payment delays,business,2978,unknown/none,1 - 4 years,4,male-single,none,4,real estate,32,none,own,2,skilled,2,yes,yes,0 +6328203,none,15,current loans paid,car (used),4657,< 100 DM,1 - 4 years,3,male-single,none,2,car or other,30,none,own,1,skilled,1,yes,yes,0 +4449813,none,36,no credit - paid,repairs,2613,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,27,none,own,2,skilled,1,none,yes,0 +2935635,0 - 200 DM,48,current loans paid,radio/television,10961,>= 1000 DM,4 - 7 years,1,male-single,co-applicant,2,unknown-none,27,bank,own,2,skilled,1,yes,yes,1 +4544060,< 0 DM,12,current loans paid,furniture/equipment,7865,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,53,none,for free,1,highly skilled,1,yes,yes,1 +8787497,none,9,current loans paid,radio/television,1478,< 100 DM,4 - 7 years,4,male-single,none,2,car or other,22,none,own,1,skilled,1,none,yes,1 +5985557,< 0 DM,24,current loans paid,furniture/equipment,3149,< 100 DM,< 1 year,4,male-single,none,1,unknown-none,22,bank,for free,1,skilled,1,none,yes,0 +7271945,> 200 DM or salary assignment,36,current loans paid,radio/television,4210,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,26,none,own,1,skilled,1,none,yes,1 +1813988,none,9,current loans paid,car (new),2507,500 - 1000 DM,>= 7 years,2,male-single,none,4,unknown-none,51,none,for free,1,unskilled-resident,1,none,yes,0 
+7860808,none,12,current loans paid,radio/television,2141,100 - 500 DM,4 - 7 years,3,male-single,none,1,unknown-none,35,none,own,1,skilled,1,none,yes,0 +2709717,0 - 200 DM,18,current loans paid,radio/television,866,< 100 DM,1 - 4 years,4,male-married/widowed,guarantor,2,real estate,25,none,own,1,unskilled-resident,1,none,yes,0 +8437780,none,4,critical account - other non-bank loans,radio/television,1544,< 100 DM,4 - 7 years,2,male-single,none,1,real estate,42,none,own,3,unskilled-resident,2,none,yes,0 +9635061,< 0 DM,24,current loans paid,radio/television,1823,< 100 DM,unemployed,4,male-single,none,2,car or other,30,stores,own,1,highly skilled,2,none,yes,1 +7532911,0 - 200 DM,6,current loans paid,car (new),14555,unknown/none,unemployed,1,male-single,none,2,building society savings/life insurance,23,none,own,1,unemployed-unskilled-non-resident,1,yes,yes,1 +7932675,0 - 200 DM,21,current loans paid,business,2767,100 - 500 DM,>= 7 years,4,male-divorced/separated,none,2,car or other,61,bank,rent,2,unskilled-resident,1,none,yes,1 +7028331,none,12,critical account - other non-bank loans,radio/television,1291,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,building society savings/life insurance,35,none,own,2,skilled,1,none,yes,0 +5361518,< 0 DM,30,current loans paid,radio/television,2522,< 100 DM,>= 7 years,1,male-single,guarantor,3,building society savings/life insurance,39,none,own,1,skilled,2,none,yes,0 +2444643,< 0 DM,24,current loans paid,car (new),915,unknown/none,>= 7 years,4,female-divorced/separated/married,none,2,car or other,29,bank,own,1,skilled,1,none,yes,1 +2566697,none,6,current loans paid,radio/television,1595,< 100 DM,4 - 7 years,3,male-single,none,2,building society savings/life insurance,51,none,own,1,skilled,2,none,yes,0 +4776602,0 - 200 DM,12,current loans paid,radio/television,1155,< 100 DM,>= 7 years,3,male-married/widowed,guarantor,3,real estate,40,bank,own,2,unskilled-resident,1,none,yes,0 +5490556,< 0 DM,48,no credit - paid,car 
(used),4605,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,24,none,for free,2,skilled,2,none,yes,1 +9532249,none,12,critical account - other non-bank loans,business,1185,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,real estate,27,none,own,2,skilled,1,none,yes,0 +8266385,none,12,all loans at bank paid,retraining,3447,500 - 1000 DM,1 - 4 years,4,female-divorced/separated/married,none,3,real estate,35,none,own,1,unskilled-resident,2,none,yes,0 +2445116,none,24,current loans paid,business,1258,< 100 DM,4 - 7 years,4,male-single,none,1,real estate,25,none,own,1,skilled,1,yes,yes,0 +4852106,none,12,critical account - other non-bank loans,radio/television,717,< 100 DM,>= 7 years,4,male-single,none,4,real estate,52,none,own,3,skilled,1,none,yes,0 +9087286,none,6,no credit - paid,car (new),1204,100 - 500 DM,1 - 4 years,4,male-single,none,1,unknown-none,35,bank,rent,1,skilled,1,none,no,0 +7735589,> 200 DM or salary assignment,24,current loans paid,furniture/equipment,1925,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,26,none,own,1,skilled,1,none,yes,0 +5075795,none,18,current loans paid,radio/television,433,< 100 DM,unemployed,3,female-divorced/separated/married,co-applicant,4,real estate,22,none,rent,1,skilled,1,none,yes,1 +9404354,< 0 DM,6,critical account - other non-bank loans,car (new),666,>= 1000 DM,4 - 7 years,3,female-divorced/separated/married,none,4,real estate,39,none,own,2,unskilled-resident,1,yes,yes,0 +7374672,> 200 DM or salary assignment,12,current loans paid,furniture/equipment,2251,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,2,car or other,46,none,own,1,unskilled-resident,1,none,yes,0 +5537602,0 - 200 DM,30,current loans paid,car (new),2150,< 100 DM,1 - 4 years,4,female-divorced/separated/married,guarantor,2,unknown-none,24,bank,own,1,skilled,1,none,yes,1 +2779845,none,24,past payment delays,furniture/equipment,4151,100 - 500 DM,1 - 4 years,2,male-single,none,3,building society savings/life 
insurance,35,none,own,2,skilled,1,none,yes,0 +9096946,0 - 200 DM,9,current loans paid,furniture/equipment,2030,unknown/none,4 - 7 years,2,male-single,none,1,car or other,24,none,own,1,skilled,1,yes,yes,0 +2649014,0 - 200 DM,60,past payment delays,radio/television,7418,unknown/none,1 - 4 years,1,male-single,none,1,real estate,27,none,own,1,unskilled-resident,1,none,yes,0 +8542371,none,24,critical account - other non-bank loans,radio/television,2684,< 100 DM,1 - 4 years,4,male-single,none,2,real estate,35,none,own,2,unskilled-resident,1,none,yes,0 +3719734,< 0 DM,12,all loans at bank paid,radio/television,2149,< 100 DM,1 - 4 years,4,male-divorced/separated,none,1,unknown-none,29,none,for free,1,skilled,1,none,yes,1 +5769297,none,15,current loans paid,car (used),3812,100 - 500 DM,< 1 year,1,female-divorced/separated/married,none,4,car or other,23,none,own,1,skilled,1,yes,yes,0 +8002907,none,11,critical account - other non-bank loans,radio/television,1154,100 - 500 DM,unemployed,4,female-divorced/separated/married,none,4,real estate,57,none,own,3,unskilled-resident,1,none,yes,0 +4171983,< 0 DM,12,current loans paid,furniture/equipment,1657,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,27,none,own,1,skilled,1,none,yes,0 +7094119,< 0 DM,24,current loans paid,radio/television,1603,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,4,car or other,55,none,own,1,skilled,1,none,yes,0 +5079269,< 0 DM,18,critical account - other non-bank loans,car (new),5302,< 100 DM,>= 7 years,2,male-single,none,4,unknown-none,36,none,for free,3,highly skilled,1,yes,yes,0 +8926902,none,12,critical account - other non-bank loans,education,2748,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,4,unknown-none,57,bank,for free,3,unskilled-resident,1,none,yes,0 +5810771,none,10,critical account - other non-bank loans,car (new),1231,< 100 DM,>= 7 years,3,male-single,none,4,real estate,32,none,own,2,unskilled-resident,2,none,no,0 +7823319,0 - 200 DM,15,current loans 
paid,radio/television,802,< 100 DM,>= 7 years,4,male-single,none,3,car or other,37,none,own,1,skilled,2,none,yes,1 +5303576,none,36,critical account - other non-bank loans,business,6304,unknown/none,>= 7 years,4,male-single,none,4,real estate,36,none,own,2,skilled,1,none,yes,0 +3572402,none,24,current loans paid,radio/television,1533,< 100 DM,< 1 year,4,female-divorced/separated/married,none,3,car or other,38,stores,own,1,skilled,1,yes,yes,0 +3892416,< 0 DM,14,current loans paid,car (new),8978,< 100 DM,>= 7 years,1,male-divorced/separated,none,4,building society savings/life insurance,45,none,own,1,highly skilled,1,yes,no,1 +4517508,none,24,current loans paid,radio/television,999,unknown/none,>= 7 years,4,male-single,none,2,car or other,25,none,own,2,skilled,1,none,yes,0 +8060870,none,18,current loans paid,car (new),2662,unknown/none,4 - 7 years,4,male-single,none,3,building society savings/life insurance,32,none,own,1,skilled,1,none,no,0 +2081683,none,12,critical account - other non-bank loans,furniture/equipment,1402,500 - 1000 DM,4 - 7 years,3,female-divorced/separated/married,none,4,car or other,37,none,rent,1,skilled,1,yes,yes,0 +6752061,0 - 200 DM,48,all loans at bank paid,car (new),12169,unknown/none,unemployed,4,male-single,co-applicant,4,unknown-none,36,none,for free,1,highly skilled,1,yes,yes,0 +7081559,0 - 200 DM,48,current loans paid,radio/television,3060,< 100 DM,4 - 7 years,4,male-single,none,4,real estate,28,none,own,2,skilled,1,none,yes,1 +9948056,< 0 DM,30,current loans paid,repairs,11998,< 100 DM,< 1 year,1,male-divorced/separated,none,1,unknown-none,34,none,own,1,unskilled-resident,1,yes,yes,1 +4447932,none,9,current loans paid,radio/television,2697,< 100 DM,1 - 4 years,1,male-single,none,2,real estate,32,none,own,1,skilled,2,none,yes,0 +5069388,none,18,critical account - other non-bank loans,radio/television,2404,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,car or other,26,none,own,2,skilled,1,none,yes,0 +5910940,< 0 
DM,12,current loans paid,furniture/equipment,1262,unknown/none,>= 7 years,2,male-divorced/separated,none,4,building society savings/life insurance,49,none,own,1,unskilled-resident,1,yes,yes,0 +3733764,none,6,current loans paid,furniture/equipment,4611,< 100 DM,< 1 year,1,female-divorced/separated/married,none,4,building society savings/life insurance,32,none,own,1,skilled,1,none,yes,1 +5012199,none,24,current loans paid,radio/television,1901,100 - 500 DM,1 - 4 years,4,male-single,none,4,car or other,29,none,rent,1,highly skilled,1,yes,yes,0 +6009011,none,15,critical account - other non-bank loans,car (used),3368,>= 1000 DM,>= 7 years,3,male-single,none,4,unknown-none,23,none,rent,2,skilled,1,yes,yes,0 +5953648,none,12,current loans paid,furniture/equipment,1574,< 100 DM,1 - 4 years,4,male-single,none,2,real estate,50,none,own,1,skilled,1,none,yes,0 +1511501,> 200 DM or salary assignment,18,all loans at bank paid,radio/television,1445,unknown/none,4 - 7 years,4,male-single,none,4,car or other,49,bank,own,1,unskilled-resident,1,none,yes,0 +5113857,none,15,critical account - other non-bank loans,furniture/equipment,1520,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life insurance,63,none,own,1,skilled,1,none,yes,0 +3627592,0 - 200 DM,24,critical account - other non-bank loans,car (new),3878,100 - 500 DM,< 1 year,4,male-divorced/separated,none,2,car or other,37,none,own,1,skilled,1,yes,yes,0 +8803488,< 0 DM,47,current loans paid,car (new),10722,< 100 DM,< 1 year,1,female-divorced/separated/married,none,1,real estate,35,none,own,1,unskilled-resident,1,yes,yes,0 +9505849,< 0 DM,48,current loans paid,car (used),4788,< 100 DM,4 - 7 years,4,male-single,none,3,building society savings/life insurance,26,none,own,1,skilled,2,none,yes,0 +3200360,0 - 200 DM,48,past payment delays,other,7582,100 - 500 DM,unemployed,2,male-single,none,4,unknown-none,31,none,for free,1,highly skilled,1,yes,yes,0 +5155553,0 - 200 DM,12,current loans 
paid,radio/television,1092,< 100 DM,1 - 4 years,4,female-divorced/separated/married,guarantor,4,real estate,49,none,own,2,skilled,1,yes,yes,0 +1380636,< 0 DM,24,past payment delays,radio/television,1024,< 100 DM,< 1 year,4,male-married/widowed,none,4,real estate,48,stores,own,1,skilled,1,none,yes,1 +3468954,none,12,current loans paid,business,1076,< 100 DM,1 - 4 years,2,male-married/widowed,none,2,real estate,26,none,own,1,skilled,1,yes,no,0 +1526208,0 - 200 DM,36,current loans paid,car (used),9398,< 100 DM,< 1 year,1,male-married/widowed,none,4,car or other,28,none,rent,1,highly skilled,1,yes,yes,1 +2689247,< 0 DM,24,critical account - other non-bank loans,car (used),6419,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,4,unknown-none,44,none,for free,2,highly skilled,2,yes,yes,0 +2359358,> 200 DM or salary assignment,42,critical account - other non-bank loans,car (used),4796,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,56,none,for free,1,skilled,1,none,yes,0 +1099632,none,48,critical account - other non-bank loans,business,7629,unknown/none,>= 7 years,4,male-divorced/separated,none,2,car or other,46,bank,own,2,highly skilled,2,none,yes,0 +3753189,0 - 200 DM,48,current loans paid,furniture/equipment,9960,< 100 DM,< 1 year,1,female-divorced/separated/married,none,2,car or other,26,none,own,1,skilled,1,yes,yes,1 +9668162,none,12,current loans paid,car (used),4675,unknown/none,< 1 year,1,female-divorced/separated/married,none,4,car or other,20,none,rent,1,skilled,1,none,yes,0 +4302438,none,10,current loans paid,car (new),1287,unknown/none,>= 7 years,4,male-single,co-applicant,2,building society savings/life insurance,45,none,own,1,unskilled-resident,1,none,no,0 +1985883,none,18,current loans paid,furniture/equipment,2515,< 100 DM,1 - 4 years,3,male-single,none,4,real estate,43,none,own,1,skilled,1,yes,yes,0 +7530685,0 - 200 DM,21,critical account - other non-bank loans,furniture/equipment,2745,>= 1000 DM,4 - 7 years,3,male-single,none,2,car or 
other,32,none,own,2,skilled,1,yes,yes,0 +3965440,none,6,current loans paid,car (new),672,< 100 DM,unemployed,1,female-divorced/separated/married,none,4,real estate,54,none,own,1,unemployed-unskilled-non-resident,1,yes,yes,0 +9545127,0 - 200 DM,36,no credit - paid,radio/television,3804,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,1,car or other,42,none,own,1,skilled,1,yes,yes,1 +2663117,> 200 DM or salary assignment,24,critical account - other non-bank loans,car (new),1344,unknown/none,4 - 7 years,4,male-single,none,2,real estate,37,bank,own,2,unskilled-resident,2,none,yes,1 +8785566,< 0 DM,10,critical account - other non-bank loans,car (new),1038,< 100 DM,4 - 7 years,4,male-single,co-applicant,3,building society savings/life insurance,49,none,own,2,skilled,1,yes,yes,0 +8697852,none,48,critical account - other non-bank loans,car (new),10127,500 - 1000 DM,1 - 4 years,2,male-single,none,2,unknown-none,44,bank,for free,1,skilled,1,none,yes,1 +9910337,none,6,current loans paid,furniture/equipment,1543,>= 1000 DM,1 - 4 years,4,male-divorced/separated,none,2,real estate,33,none,own,1,skilled,1,none,yes,0 +7200952,none,30,current loans paid,car (used),4811,unknown/none,4 - 7 years,2,female-divorced/separated/married,none,4,building society savings/life insurance,24,stores,rent,1,unskilled-resident,1,none,yes,0 +2482842,< 0 DM,12,current loans paid,radio/television,727,100 - 500 DM,< 1 year,4,male-married/widowed,none,3,unknown-none,33,none,own,1,unskilled-resident,1,yes,yes,1 +5873455,0 - 200 DM,8,current loans paid,furniture/equipment,1237,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,4,real estate,24,none,own,1,skilled,1,none,yes,1 +9224996,0 - 200 DM,9,current loans paid,car (new),276,< 100 DM,1 - 4 years,4,male-married/widowed,none,4,real estate,22,none,rent,1,unskilled-resident,1,none,yes,0 +9804740,0 - 200 DM,48,current loans paid,other,5381,unknown/none,unemployed,3,male-single,none,4,unknown-none,40,bank,for 
free,1,unemployed-unskilled-non-resident,1,yes,yes,0 +5724920,none,24,current loans paid,furniture/equipment,5511,100 - 500 DM,1 - 4 years,4,male-single,none,1,car or other,25,stores,own,1,skilled,1,none,yes,0 +5401207,> 200 DM or salary assignment,24,current loans paid,furniture/equipment,3749,< 100 DM,< 1 year,2,female-divorced/separated/married,none,4,car or other,26,none,own,1,skilled,1,none,yes,0 +6883393,0 - 200 DM,12,current loans paid,car (new),685,< 100 DM,4 - 7 years,2,male-married/widowed,none,3,car or other,25,bank,own,1,unskilled-resident,1,none,yes,1 +9941505,> 200 DM or salary assignment,4,current loans paid,car (new),1494,unknown/none,< 1 year,1,male-single,none,2,real estate,29,none,own,1,unskilled-resident,2,none,no,0 +1310810,< 0 DM,36,all loans at bank paid,furniture/equipment,2746,< 100 DM,>= 7 years,4,male-single,none,4,car or other,31,bank,own,1,skilled,1,none,yes,1 +6083949,< 0 DM,12,current loans paid,furniture/equipment,708,< 100 DM,1 - 4 years,2,male-single,guarantor,3,building society savings/life insurance,38,none,own,1,unskilled-resident,2,none,yes,0 +7464596,0 - 200 DM,24,current loans paid,furniture/equipment,4351,unknown/none,1 - 4 years,1,female-divorced/separated/married,none,4,building society savings/life insurance,48,none,own,1,unskilled-resident,1,yes,yes,0 +2298828,none,12,critical account - other non-bank loans,education,701,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,32,none,own,2,skilled,1,none,yes,0 +2953405,< 0 DM,15,past payment delays,furniture/equipment,3643,< 100 DM,>= 7 years,1,female-divorced/separated/married,none,4,building society savings/life insurance,27,none,own,2,unskilled-resident,1,none,yes,0 +5680012,0 - 200 DM,30,critical account - other non-bank loans,car (new),4249,< 100 DM,unemployed,4,male-married/widowed,none,2,car or other,28,none,own,2,highly skilled,1,none,yes,1 +3243360,< 0 DM,24,current loans paid,radio/television,1938,< 100 DM,< 1 year,4,male-divorced/separated,none,3,building 
society savings/life insurance,32,none,own,1,skilled,1,none,yes,1 +6960053,< 0 DM,24,current loans paid,car (used),2910,< 100 DM,4 - 7 years,2,male-single,none,1,unknown-none,34,none,for free,1,highly skilled,1,yes,yes,0 +8870244,< 0 DM,18,current loans paid,furniture/equipment,2659,>= 1000 DM,1 - 4 years,4,male-single,none,2,car or other,28,none,own,1,skilled,1,none,yes,0 +9717422,none,18,critical account - other non-bank loans,car (new),1028,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,3,real estate,36,none,own,2,skilled,1,none,yes,0 +9554635,< 0 DM,8,critical account - other non-bank loans,car (new),3398,< 100 DM,4 - 7 years,1,male-single,none,4,real estate,39,none,own,2,unskilled-resident,1,none,no,0 +3536846,none,12,critical account - other non-bank loans,furniture/equipment,5801,unknown/none,>= 7 years,2,male-single,none,4,building society savings/life insurance,49,none,rent,1,skilled,1,yes,yes,0 +7951981,none,24,current loans paid,car (new),1525,>= 1000 DM,4 - 7 years,4,female-divorced/separated/married,none,3,car or other,34,none,own,1,skilled,2,yes,yes,0 +5678097,> 200 DM or salary assignment,36,current loans paid,radio/television,4473,< 100 DM,>= 7 years,4,male-single,none,2,car or other,31,none,own,1,skilled,1,none,yes,0 +1771168,0 - 200 DM,6,current loans paid,radio/television,1068,< 100 DM,>= 7 years,4,male-single,none,4,car or other,28,none,own,1,skilled,2,none,yes,0 +7161704,< 0 DM,24,critical account - other non-bank loans,car (used),6615,< 100 DM,unemployed,2,male-single,none,4,unknown-none,75,none,for free,2,highly skilled,1,yes,yes,0 +8866806,none,18,critical account - other non-bank loans,education,1864,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,none,2,real estate,30,none,own,2,skilled,1,none,yes,1 +3955252,0 - 200 DM,60,current loans paid,car (new),7408,100 - 500 DM,< 1 year,4,female-divorced/separated/married,none,2,building society savings/life insurance,24,none,own,1,highly skilled,1,none,yes,1 
+2239899,none,48,critical account - other non-bank loans,car (used),11590,100 - 500 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,24,bank,rent,2,unskilled-resident,1,none,yes,1 +6209500,< 0 DM,24,no credit - paid,furniture/equipment,4110,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,23,bank,rent,2,skilled,2,none,yes,1 +7504124,< 0 DM,6,critical account - other non-bank loans,furniture/equipment,3384,< 100 DM,1 - 4 years,1,male-divorced/separated,none,4,real estate,44,none,rent,1,highly skilled,1,yes,yes,1 +1114455,0 - 200 DM,13,current loans paid,radio/television,2101,< 100 DM,< 1 year,2,female-divorced/separated/married,guarantor,4,building society savings/life insurance,23,none,own,1,unskilled-resident,1,none,yes,0 +1803583,< 0 DM,15,current loans paid,domestic appliances,1275,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,2,car or other,24,none,rent,1,skilled,1,none,yes,1 +8565471,< 0 DM,24,current loans paid,furniture/equipment,4169,< 100 DM,1 - 4 years,4,male-single,none,4,building society savings/life insurance,28,none,own,1,skilled,1,none,yes,0 +5974627,0 - 200 DM,10,current loans paid,furniture/equipment,1521,< 100 DM,1 - 4 years,4,male-divorced/separated,none,2,car or other,31,none,own,1,unskilled-resident,1,none,yes,0 +6416552,0 - 200 DM,24,critical account - other non-bank loans,education,5743,< 100 DM,< 1 year,2,female-divorced/separated/married,none,4,unknown-none,24,none,for free,2,skilled,1,yes,yes,0 +5807460,< 0 DM,21,current loans paid,furniture/equipment,3599,< 100 DM,4 - 7 years,1,female-divorced/separated/married,none,4,car or other,26,none,rent,1,unskilled-resident,1,none,yes,0 +4314639,0 - 200 DM,18,current loans paid,radio/television,3213,500 - 1000 DM,< 1 year,1,male-married/widowed,none,3,real estate,25,none,rent,1,skilled,1,none,yes,0 +3529638,0 - 200 DM,18,current loans paid,business,4439,< 100 DM,>= 7 years,1,male-single,co-applicant,1,real estate,33,bank,own,1,highly 
skilled,1,yes,yes,0 +1222158,> 200 DM or salary assignment,10,current loans paid,car (new),3949,< 100 DM,< 1 year,1,male-single,guarantor,1,building society savings/life insurance,37,none,own,1,unskilled-resident,2,none,yes,0 +1440915,none,15,critical account - other non-bank loans,radio/television,1459,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,car or other,43,none,own,1,unskilled-resident,1,none,yes,0 +3930437,0 - 200 DM,13,critical account - other non-bank loans,radio/television,882,< 100 DM,< 1 year,4,male-single,guarantor,4,real estate,23,none,own,2,skilled,1,none,yes,0 +6612328,0 - 200 DM,24,current loans paid,radio/television,3758,500 - 1000 DM,unemployed,1,female-divorced/separated/married,none,4,unknown-none,23,none,rent,1,unemployed-unskilled-non-resident,1,none,yes,0 +9945625,none,6,past payment delays,business,1743,100 - 500 DM,1 - 4 years,1,male-single,none,2,real estate,34,none,own,2,unskilled-resident,1,none,yes,0 +5001052,0 - 200 DM,9,critical account - other non-bank loans,education,1136,>= 1000 DM,>= 7 years,4,male-single,none,3,unknown-none,32,none,for free,2,skilled,2,none,yes,1 +4733785,none,9,current loans paid,domestic appliances,1236,< 100 DM,< 1 year,1,female-divorced/separated/married,none,4,real estate,23,none,rent,1,skilled,1,yes,yes,0 +1525448,0 - 200 DM,9,current loans paid,furniture/equipment,959,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,2,car or other,29,none,own,1,skilled,1,none,no,1 +6875226,none,18,critical account - other non-bank loans,car (used),3229,unknown/none,unemployed,2,male-single,none,4,unknown-none,38,none,own,1,highly skilled,1,yes,yes,0 +6189987,< 0 DM,12,no credit - paid,radio/television,6199,< 100 DM,1 - 4 years,4,male-single,none,2,building society savings/life insurance,28,none,rent,2,skilled,1,yes,yes,1 +1690342,none,10,current loans paid,education,727,500 - 1000 DM,>= 7 years,4,male-single,none,4,unknown-none,46,none,for free,1,skilled,1,yes,yes,0 +5643344,0 - 200 
DM,24,current loans paid,car (new),1246,< 100 DM,< 1 year,4,male-single,none,2,real estate,23,stores,own,1,unskilled-resident,1,none,yes,1 +9430378,none,12,critical account - other non-bank loans,radio/television,2331,unknown/none,>= 7 years,1,male-single,co-applicant,4,real estate,49,none,own,1,skilled,1,yes,yes,0 +9982432,none,36,past payment delays,radio/television,4463,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,26,none,own,2,highly skilled,1,yes,yes,1 +4344490,none,12,current loans paid,radio/television,776,< 100 DM,1 - 4 years,4,male-married/widowed,none,2,real estate,28,none,own,1,skilled,1,none,yes,0 +4256428,< 0 DM,30,current loans paid,furniture/equipment,2406,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,4,real estate,23,none,rent,1,skilled,1,none,yes,1 +9723360,0 - 200 DM,18,current loans paid,education,1239,unknown/none,1 - 4 years,4,male-single,none,4,unknown-none,61,none,for free,1,skilled,1,none,yes,0 +6762528,> 200 DM or salary assignment,12,current loans paid,radio/television,3399,unknown/none,>= 7 years,2,male-single,none,3,car or other,37,none,own,1,highly skilled,1,none,yes,0 +5280442,> 200 DM or salary assignment,12,past payment delays,car (new),2247,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,car or other,36,stores,own,2,skilled,1,yes,yes,0 +7631660,none,6,current loans paid,furniture/equipment,1766,< 100 DM,1 - 4 years,1,male-married/widowed,none,2,building society savings/life insurance,21,none,rent,1,skilled,1,none,yes,0 +9842159,< 0 DM,18,current loans paid,furniture/equipment,2473,< 100 DM,unemployed,4,male-single,none,1,car or other,25,none,own,1,unemployed-unskilled-non-resident,1,none,yes,1 +3343196,none,12,current loans paid,business,1542,< 100 DM,4 - 7 years,2,male-single,none,4,car or other,36,none,own,1,skilled,1,yes,yes,0 +1342256,none,18,critical account - other non-bank loans,car (used),3850,< 100 DM,4 - 7 years,3,male-single,none,1,car or other,27,none,own,2,skilled,1,none,yes,0 
+7966598,< 0 DM,18,current loans paid,furniture/equipment,3650,< 100 DM,< 1 year,1,female-divorced/separated/married,none,4,car or other,22,none,rent,1,skilled,1,none,yes,0 +9974040,< 0 DM,36,current loans paid,furniture/equipment,3446,< 100 DM,>= 7 years,4,male-single,none,2,car or other,42,none,own,1,skilled,2,none,yes,1 +9518729,0 - 200 DM,18,current loans paid,furniture/equipment,3001,< 100 DM,4 - 7 years,2,female-divorced/separated/married,none,4,real estate,40,none,rent,1,skilled,1,none,yes,0 +6655507,none,36,current loans paid,car (new),3079,unknown/none,1 - 4 years,4,male-single,none,4,real estate,36,none,own,1,skilled,1,none,yes,0 +1296391,none,18,critical account - other non-bank loans,radio/television,6070,< 100 DM,>= 7 years,3,male-single,none,4,car or other,33,none,own,2,skilled,1,yes,yes,0 +2366810,none,10,critical account - other non-bank loans,furniture/equipment,2146,< 100 DM,< 1 year,1,female-divorced/separated/married,none,3,real estate,23,none,rent,2,skilled,1,none,yes,0 +1470952,none,60,critical account - other non-bank loans,car (new),13756,unknown/none,>= 7 years,2,male-single,none,4,unknown-none,63,bank,for free,1,highly skilled,1,yes,yes,0 +4427304,0 - 200 DM,60,all loans at bank paid,other,14782,100 - 500 DM,>= 7 years,3,female-divorced/separated/married,none,4,unknown-none,60,bank,for free,2,highly skilled,1,yes,yes,1 +3112559,< 0 DM,48,all loans at bank paid,business,7685,< 100 DM,4 - 7 years,2,female-divorced/separated/married,guarantor,4,car or other,37,none,rent,1,skilled,1,none,yes,1 +4850866,none,18,past payment delays,radio/television,2320,< 100 DM,unemployed,2,male-married/widowed,none,3,real estate,34,none,own,2,skilled,1,none,yes,0 +1019796,none,7,past payment delays,radio/television,846,unknown/none,>= 7 years,3,male-single,none,4,unknown-none,36,none,for free,1,skilled,1,none,yes,0 +1707851,0 - 200 DM,36,current loans paid,car (new),14318,< 100 DM,>= 7 years,4,male-single,none,2,unknown-none,57,none,for free,1,highly 
skilled,1,yes,yes,1 +4197078,none,6,critical account - other non-bank loans,car (new),362,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,52,none,own,2,unskilled-resident,1,none,yes,0 +6008260,< 0 DM,20,current loans paid,furniture/equipment,2212,unknown/none,4 - 7 years,4,male-single,none,4,car or other,39,none,own,1,skilled,1,yes,yes,0 +4238613,0 - 200 DM,18,current loans paid,car (used),12976,< 100 DM,unemployed,3,female-divorced/separated/married,none,4,unknown-none,38,none,for free,1,highly skilled,1,yes,yes,1 +8881848,none,22,current loans paid,car (new),1283,unknown/none,4 - 7 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,25,none,rent,1,skilled,1,none,yes,0 +1176152,> 200 DM or salary assignment,12,current loans paid,car (new),1330,< 100 DM,< 1 year,4,male-single,none,1,real estate,26,none,own,1,skilled,1,none,yes,0 +5328999,none,30,past payment delays,business,4272,100 - 500 DM,1 - 4 years,2,male-single,none,2,building society savings/life insurance,26,none,own,2,unskilled-resident,1,none,yes,0 +2017205,none,18,critical account - other non-bank loans,radio/television,2238,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,1,car or other,25,none,own,2,skilled,1,none,yes,0 +9326362,none,18,current loans paid,radio/television,1126,unknown/none,< 1 year,4,female-divorced/separated/married,none,2,real estate,21,none,rent,1,skilled,1,yes,yes,0 +6571397,0 - 200 DM,18,critical account - other non-bank loans,furniture/equipment,7374,< 100 DM,unemployed,4,male-single,none,4,building society savings/life insurance,40,stores,own,2,highly skilled,1,yes,yes,0 +8705378,0 - 200 DM,15,critical account - other non-bank loans,business,2326,500 - 1000 DM,1 - 4 years,2,male-single,none,4,car or other,27,bank,own,1,skilled,1,none,yes,0 +3969916,none,9,current loans paid,business,1449,< 100 DM,4 - 7 years,3,female-divorced/separated/married,none,2,car or other,27,none,own,2,skilled,1,none,yes,0 
+1797600,none,18,current loans paid,car (new),1820,< 100 DM,1 - 4 years,2,male-married/widowed,none,2,building society savings/life insurance,30,none,own,1,highly skilled,1,yes,yes,0 +8978303,0 - 200 DM,12,current loans paid,furniture/equipment,983,>= 1000 DM,< 1 year,1,female-divorced/separated/married,none,4,real estate,19,none,rent,1,unskilled-resident,1,none,yes,0 +9199505,< 0 DM,36,current loans paid,car (new),3249,< 100 DM,4 - 7 years,2,male-single,none,4,unknown-none,39,bank,for free,1,highly skilled,2,yes,yes,0 +5334896,< 0 DM,6,critical account - other non-bank loans,radio/television,1957,< 100 DM,4 - 7 years,1,female-divorced/separated/married,none,4,car or other,31,none,own,1,skilled,1,none,yes,0 +5018277,none,9,critical account - other non-bank loans,furniture/equipment,2406,< 100 DM,unemployed,2,male-single,none,3,car or other,31,none,own,1,highly skilled,1,none,yes,0 +9994482,0 - 200 DM,39,past payment delays,education,11760,100 - 500 DM,4 - 7 years,2,male-single,none,3,unknown-none,32,none,rent,1,skilled,1,yes,yes,0 +2520877,< 0 DM,12,current loans paid,furniture/equipment,2578,< 100 DM,unemployed,3,female-divorced/separated/married,none,4,unknown-none,55,none,for free,1,highly skilled,1,none,yes,0 +9506380,< 0 DM,36,critical account - other non-bank loans,furniture/equipment,2348,< 100 DM,1 - 4 years,3,male-married/widowed,none,2,building society savings/life insurance,46,none,own,2,skilled,1,yes,yes,0 +7043482,0 - 200 DM,12,current loans paid,car (new),1223,< 100 DM,>= 7 years,1,male-divorced/separated,none,1,real estate,46,none,rent,2,skilled,1,none,yes,1 +1638879,none,24,critical account - other non-bank loans,radio/television,1516,>= 1000 DM,1 - 4 years,4,female-divorced/separated/married,none,1,real estate,43,none,own,2,unskilled-resident,1,none,yes,0 +5296104,none,18,current loans paid,radio/television,1473,< 100 DM,< 1 year,3,male-married/widowed,none,4,real estate,39,none,own,1,skilled,1,yes,yes,0 +7379526,0 - 200 DM,18,critical account - 
other non-bank loans,business,1887,unknown/none,1 - 4 years,4,male-married/widowed,none,4,real estate,28,bank,own,2,skilled,1,none,yes,0 +3276838,none,24,past payment delays,business,8648,< 100 DM,< 1 year,2,male-single,none,2,car or other,27,bank,own,2,skilled,1,yes,yes,1 +8629007,none,14,past payment delays,car (new),802,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,27,none,own,2,unskilled-resident,1,none,yes,0 +6989363,0 - 200 DM,18,past payment delays,car (new),2899,unknown/none,>= 7 years,4,male-single,none,4,car or other,43,none,own,1,skilled,2,none,yes,0 +3486169,0 - 200 DM,24,current loans paid,radio/television,2039,< 100 DM,< 1 year,1,male-married/widowed,none,1,building society savings/life insurance,22,none,own,1,skilled,1,yes,yes,1 +5914287,none,24,critical account - other non-bank loans,car (used),2197,unknown/none,4 - 7 years,4,male-single,none,4,car or other,43,none,own,2,skilled,2,yes,yes,0 +2842414,< 0 DM,15,current loans paid,radio/television,1053,< 100 DM,< 1 year,4,male-married/widowed,none,2,real estate,27,none,own,1,skilled,1,none,no,0 +6610185,none,24,current loans paid,radio/television,3235,500 - 1000 DM,>= 7 years,3,male-divorced/separated,none,2,car or other,26,none,own,1,highly skilled,1,yes,yes,0 +7974174,> 200 DM or salary assignment,12,critical account - other non-bank loans,car (new),939,500 - 1000 DM,4 - 7 years,4,male-married/widowed,none,2,real estate,28,none,own,3,skilled,1,yes,yes,1 +6076228,0 - 200 DM,24,current loans paid,radio/television,1967,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,4,car or other,20,none,own,1,skilled,1,yes,yes,0 +2636238,none,33,critical account - other non-bank loans,car (used),7253,< 100 DM,4 - 7 years,3,male-single,none,2,car or other,35,none,own,2,highly skilled,1,yes,yes,0 +3698559,none,12,critical account - other non-bank loans,business,2292,< 100 DM,unemployed,4,male-single,none,2,car or other,42,stores,own,2,highly skilled,1,yes,yes,1 +1480210,none,10,current loans 
paid,car (new),1597,500 - 1000 DM,1 - 4 years,3,male-single,none,2,unknown-none,40,none,rent,1,unskilled-resident,2,none,no,0 +8545813,< 0 DM,24,current loans paid,car (new),1381,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,2,building society savings/life insurance,35,none,own,1,skilled,1,none,yes,1 +8778324,none,36,critical account - other non-bank loans,car (used),5842,< 100 DM,>= 7 years,2,male-single,none,2,building society savings/life insurance,35,none,own,2,skilled,2,yes,yes,0 +4645046,< 0 DM,12,current loans paid,car (new),2579,< 100 DM,< 1 year,4,male-single,none,1,real estate,33,none,own,1,unskilled-resident,2,none,yes,1 +8672645,< 0 DM,18,past payment delays,education,8471,unknown/none,1 - 4 years,1,female-divorced/separated/married,none,2,car or other,23,none,rent,2,skilled,1,yes,yes,0 +6875256,none,21,current loans paid,car (new),2782,500 - 1000 DM,4 - 7 years,1,female-divorced/separated/married,none,2,car or other,31,bank,own,1,highly skilled,1,none,yes,0 +9774597,0 - 200 DM,18,current loans paid,car (new),1042,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,2,building society savings/life insurance,33,none,own,1,skilled,1,none,yes,1 +4501534,none,15,current loans paid,car (new),3186,>= 1000 DM,4 - 7 years,2,female-divorced/separated/married,none,3,car or other,20,none,rent,1,skilled,1,none,yes,0 +2225902,0 - 200 DM,12,current loans paid,car (used),2028,unknown/none,1 - 4 years,4,male-single,none,2,car or other,30,none,own,1,skilled,1,none,yes,0 +5678751,0 - 200 DM,12,critical account - other non-bank loans,car (new),958,< 100 DM,4 - 7 years,2,male-single,none,3,real estate,47,none,own,2,unskilled-resident,2,none,yes,0 +5549074,none,21,past payment delays,furniture/equipment,1591,100 - 500 DM,4 - 7 years,4,male-single,none,3,real estate,34,none,own,2,highly skilled,1,none,yes,0 +4247210,0 - 200 DM,12,current loans paid,furniture/equipment,2762,unknown/none,>= 7 
years,1,female-divorced/separated/married,none,2,building society savings/life insurance,25,bank,own,1,skilled,1,yes,yes,1 +2450182,0 - 200 DM,18,current loans paid,car (used),2779,< 100 DM,1 - 4 years,1,male-married/widowed,none,3,car or other,21,none,rent,1,skilled,1,yes,yes,0 +9966032,none,28,critical account - other non-bank loans,radio/television,2743,< 100 DM,>= 7 years,4,male-single,none,2,car or other,29,none,own,2,skilled,1,none,yes,0 +3797662,none,18,critical account - other non-bank loans,radio/television,1149,>= 1000 DM,1 - 4 years,4,male-single,none,3,real estate,46,none,own,2,skilled,1,none,yes,0 +7318192,none,9,current loans paid,furniture/equipment,1313,< 100 DM,>= 7 years,1,male-single,none,4,car or other,20,none,own,1,skilled,1,none,yes,0 +8969170,< 0 DM,18,critical account - other non-bank loans,repairs,1190,< 100 DM,unemployed,2,female-divorced/separated/married,none,4,unknown-none,55,none,for free,3,unemployed-unskilled-non-resident,2,none,yes,1 +6376369,none,5,current loans paid,business,3448,< 100 DM,4 - 7 years,1,male-single,none,4,real estate,74,none,own,1,unskilled-resident,1,none,yes,0 +7195551,0 - 200 DM,24,current loans paid,other,11328,< 100 DM,1 - 4 years,2,male-single,co-applicant,3,car or other,29,bank,own,2,highly skilled,1,yes,yes,1 +9907330,< 0 DM,6,critical account - other non-bank loans,furniture/equipment,1872,< 100 DM,unemployed,4,male-single,none,4,unknown-none,36,none,for free,3,highly skilled,1,yes,yes,0 +3653212,none,24,critical account - other non-bank loans,repairs,2058,< 100 DM,1 - 4 years,4,male-divorced/separated,none,2,real estate,33,none,own,2,skilled,1,yes,yes,0 +1329531,< 0 DM,9,current loans paid,furniture/equipment,2136,< 100 DM,1 - 4 years,3,male-single,none,2,real estate,25,none,own,1,skilled,1,none,yes,0 +1320796,0 - 200 DM,12,current loans paid,radio/television,1484,unknown/none,1 - 4 years,2,male-married/widowed,none,1,real estate,25,none,own,1,skilled,1,yes,yes,1 +5337783,none,6,current loans 
paid,repairs,660,500 - 1000 DM,4 - 7 years,2,male-married/widowed,none,4,real estate,23,none,rent,1,unskilled-resident,1,none,yes,0 +8339835,none,24,critical account - other non-bank loans,car (new),1287,>= 1000 DM,>= 7 years,4,female-divorced/separated/married,none,4,real estate,37,none,own,2,skilled,1,yes,yes,0 +9464784,< 0 DM,42,critical account - other non-bank loans,repairs,3394,< 100 DM,unemployed,4,male-single,co-applicant,4,car or other,65,none,own,2,unemployed-unskilled-non-resident,1,none,yes,0 +3252011,> 200 DM or salary assignment,12,all loans at bank paid,business,609,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,real estate,26,none,own,1,unemployed-unskilled-non-resident,1,none,yes,1 +4946613,none,12,current loans paid,car (new),1884,< 100 DM,>= 7 years,4,male-single,none,4,car or other,39,none,own,1,highly skilled,1,yes,yes,0 +8418012,< 0 DM,12,current loans paid,furniture/equipment,1620,< 100 DM,1 - 4 years,2,female-divorced/separated/married,co-applicant,3,building society savings/life insurance,30,none,own,1,skilled,1,none,yes,0 +7819002,0 - 200 DM,20,past payment delays,other,2629,< 100 DM,1 - 4 years,2,male-single,none,3,car or other,29,bank,own,2,skilled,1,yes,yes,0 +8650326,none,12,current loans paid,education,719,< 100 DM,>= 7 years,4,male-single,none,4,car or other,41,bank,own,1,unskilled-resident,2,none,yes,1 +6493165,0 - 200 DM,48,critical account - other non-bank loans,furniture/equipment,5096,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,3,car or other,30,none,own,1,highly skilled,1,yes,yes,1 +8221917,none,9,critical account - other non-bank loans,education,1244,unknown/none,>= 7 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,41,none,rent,2,unskilled-resident,1,none,yes,0 +6793340,< 0 DM,36,current loans paid,car (new),1842,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,car or other,34,none,own,1,skilled,1,yes,yes,1 +5800337,0 - 200 DM,7,current loans 
paid,radio/television,2576,< 100 DM,1 - 4 years,2,male-single,guarantor,2,real estate,35,none,own,1,skilled,1,none,yes,0 +3033374,> 200 DM or salary assignment,12,current loans paid,furniture/equipment,1424,unknown/none,>= 7 years,3,female-divorced/separated/married,none,4,real estate,55,none,own,1,highly skilled,1,yes,yes,0 +6260138,0 - 200 DM,15,past payment delays,repairs,1512,>= 1000 DM,1 - 4 years,3,male-married/widowed,none,3,building society savings/life insurance,61,stores,own,2,skilled,1,none,yes,1 +5464982,none,36,critical account - other non-bank loans,car (used),11054,unknown/none,1 - 4 years,4,male-single,none,2,car or other,30,none,own,1,highly skilled,1,yes,yes,0 +1123635,none,6,current loans paid,radio/television,518,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,1,real estate,29,none,own,1,skilled,1,none,yes,0 +7267566,none,12,no credit - paid,furniture/equipment,2759,< 100 DM,>= 7 years,2,male-single,none,4,building society savings/life insurance,34,none,own,2,skilled,1,none,yes,0 +7836298,none,24,current loans paid,car (used),2670,< 100 DM,>= 7 years,4,male-single,none,4,car or other,35,none,own,1,highly skilled,1,yes,yes,0 +5804062,< 0 DM,24,current loans paid,car (new),4817,< 100 DM,4 - 7 years,2,male-single,co-applicant,3,building society savings/life insurance,31,none,own,1,skilled,1,yes,yes,1 +9314182,none,24,current loans paid,car (used),2679,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,unknown-none,29,none,own,1,highly skilled,1,yes,yes,0 +1557803,< 0 DM,11,critical account - other non-bank loans,car (new),3905,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,36,none,rent,2,skilled,2,none,yes,0 +3608529,< 0 DM,12,current loans paid,car (used),3386,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,35,none,for free,1,skilled,1,yes,yes,1 +8714298,< 0 DM,6,current loans paid,domestic appliances,343,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,real estate,27,none,own,1,skilled,1,none,yes,0 
+2245784,none,18,current loans paid,radio/television,4594,< 100 DM,< 1 year,3,male-single,none,2,car or other,32,none,own,1,skilled,1,yes,yes,0 +6049064,< 0 DM,36,current loans paid,furniture/equipment,3620,< 100 DM,1 - 4 years,1,male-single,guarantor,2,building society savings/life insurance,37,none,own,1,skilled,2,none,yes,0 +6641808,< 0 DM,15,current loans paid,car (new),1721,< 100 DM,< 1 year,2,male-single,none,3,real estate,36,none,own,1,skilled,1,none,yes,0 +5057956,0 - 200 DM,12,current loans paid,furniture/equipment,3017,< 100 DM,< 1 year,3,female-divorced/separated/married,none,1,real estate,34,none,rent,1,highly skilled,1,none,yes,0 +4441263,0 - 200 DM,12,current loans paid,retraining,754,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life insurance,38,none,own,2,skilled,1,none,yes,0 +3291069,none,18,current loans paid,business,1950,< 100 DM,4 - 7 years,4,male-single,none,1,car or other,34,stores,own,2,skilled,1,yes,yes,0 +7518282,< 0 DM,24,current loans paid,car (used),2924,< 100 DM,1 - 4 years,3,male-single,guarantor,4,unknown-none,63,bank,own,1,skilled,2,yes,yes,0 +1425368,< 0 DM,24,past payment delays,radio/television,1659,< 100 DM,< 1 year,4,female-divorced/separated/married,none,2,car or other,29,none,rent,1,unskilled-resident,1,yes,yes,1 +7122922,none,48,past payment delays,radio/television,7238,unknown/none,>= 7 years,3,male-single,none,3,car or other,32,bank,own,2,skilled,2,none,yes,0 +6752438,none,33,past payment delays,business,2764,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,car or other,26,none,own,2,skilled,1,yes,yes,0 +6177001,none,24,past payment delays,car (used),4679,< 100 DM,4 - 7 years,3,male-single,none,3,car or other,35,none,own,2,unskilled-resident,1,yes,yes,0 +6983743,0 - 200 DM,24,current loans paid,radio/television,3092,100 - 500 DM,< 1 year,3,male-married/widowed,none,2,car or other,22,none,rent,1,skilled,1,yes,yes,1 +6691152,< 0 DM,6,current loans paid,education,448,< 100 DM,< 1 
year,4,female-divorced/separated/married,none,4,building society savings/life insurance,23,none,own,1,skilled,1,none,yes,1 +8808032,< 0 DM,9,current loans paid,car (new),654,< 100 DM,1 - 4 years,4,male-single,none,3,car or other,28,none,own,1,unskilled-resident,1,none,yes,1 +9230494,none,6,current loans paid,retraining,1238,unknown/none,unemployed,4,male-single,none,4,building society savings/life insurance,36,none,own,1,highly skilled,2,yes,yes,0 +7173285,0 - 200 DM,18,critical account - other non-bank loans,radio/television,1245,< 100 DM,1 - 4 years,4,male-married/widowed,none,2,car or other,33,none,own,1,skilled,1,none,yes,1 +5890956,< 0 DM,18,no credit - paid,furniture/equipment,3114,< 100 DM,< 1 year,1,female-divorced/separated/married,none,4,building society savings/life insurance,26,none,rent,1,skilled,1,none,yes,1 +9796998,none,39,current loans paid,car (used),2569,500 - 1000 DM,1 - 4 years,4,male-single,none,4,car or other,24,none,own,1,skilled,1,none,yes,0 +9197850,> 200 DM or salary assignment,24,current loans paid,radio/television,5152,< 100 DM,4 - 7 years,4,male-single,none,2,car or other,25,bank,own,1,skilled,1,none,yes,0 +2693563,0 - 200 DM,12,current loans paid,business,1037,100 - 500 DM,4 - 7 years,3,male-single,none,4,real estate,39,none,own,1,unskilled-resident,1,none,yes,0 +4912289,< 0 DM,15,critical account - other non-bank loans,furniture/equipment,1478,< 100 DM,>= 7 years,4,male-single,none,4,car or other,44,none,own,2,skilled,2,yes,yes,0 +6383336,0 - 200 DM,12,critical account - other non-bank loans,radio/television,3573,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,1,real estate,23,none,own,1,unskilled-resident,1,none,yes,0 +7709637,0 - 200 DM,24,current loans paid,car (new),1201,< 100 DM,< 1 year,4,male-single,none,1,building society savings/life insurance,26,none,own,1,skilled,1,none,yes,0 +4072676,< 0 DM,30,current loans paid,furniture/equipment,3622,>= 1000 DM,>= 7 
years,4,female-divorced/separated/married,none,4,building society savings/life insurance,57,none,rent,2,skilled,1,yes,yes,0 +9216806,none,15,past payment delays,furniture/equipment,960,>= 1000 DM,4 - 7 years,3,female-divorced/separated/married,none,2,building society savings/life insurance,30,none,own,2,skilled,1,none,yes,0 +2718715,none,12,critical account - other non-bank loans,car (new),1163,500 - 1000 DM,1 - 4 years,4,male-single,none,4,real estate,44,none,own,1,skilled,1,yes,yes,0 +3865105,0 - 200 DM,6,past payment delays,car (new),1209,< 100 DM,unemployed,4,male-single,none,4,building society savings/life insurance,47,none,own,1,highly skilled,1,yes,yes,1 +9640833,none,12,current loans paid,radio/television,3077,< 100 DM,1 - 4 years,2,male-single,none,4,car or other,52,none,own,1,skilled,1,yes,yes,0 +2667404,none,24,current loans paid,car (new),3757,< 100 DM,>= 7 years,4,female-divorced/separated/married,co-applicant,4,unknown-none,62,none,for free,1,skilled,1,yes,yes,0 +4167143,none,10,current loans paid,car (new),1418,100 - 500 DM,1 - 4 years,3,male-single,none,2,real estate,35,none,rent,1,unskilled-resident,1,none,no,0 +7957595,0 - 200 DM,18,current loans paid,radio/television,1113,< 100 DM,1 - 4 years,4,female-divorced/separated/married,guarantor,4,real estate,26,none,own,1,unskilled-resident,2,none,yes,0 +5378110,none,6,current loans paid,car (new),3518,< 100 DM,1 - 4 years,2,male-single,guarantor,3,building society savings/life insurance,26,none,rent,1,skilled,1,none,yes,0 +7398663,none,12,critical account - other non-bank loans,radio/television,1934,< 100 DM,>= 7 years,2,male-single,none,2,unknown-none,26,none,own,2,skilled,1,none,yes,0 +2689827,0 - 200 DM,27,no credit - paid,business,8318,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,4,unknown-none,42,none,for free,2,highly skilled,1,yes,yes,1 +4156618,none,6,critical account - other non-bank loans,radio/television,1237,100 - 500 DM,1 - 4 
years,1,female-divorced/separated/married,none,1,building society savings/life insurance,27,none,own,2,skilled,1,none,yes,0 +6319109,0 - 200 DM,6,current loans paid,radio/television,368,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life insurance,38,none,own,1,skilled,1,none,yes,0 +7120920,< 0 DM,12,critical account - other non-bank loans,car (new),2122,< 100 DM,1 - 4 years,3,male-single,none,2,real estate,39,none,rent,2,unskilled-resident,2,none,no,0 +1078929,< 0 DM,24,current loans paid,furniture/equipment,2996,unknown/none,1 - 4 years,2,male-married/widowed,none,4,car or other,20,none,own,1,skilled,1,none,yes,1 +1023769,0 - 200 DM,36,current loans paid,furniture/equipment,9034,100 - 500 DM,< 1 year,4,male-single,co-applicant,1,unknown-none,29,none,rent,1,highly skilled,1,yes,yes,1 +5288061,none,24,critical account - other non-bank loans,furniture/equipment,1585,< 100 DM,4 - 7 years,4,male-single,none,3,building society savings/life insurance,40,none,own,2,skilled,1,none,yes,0 +6898416,0 - 200 DM,18,current loans paid,radio/television,1301,< 100 DM,>= 7 years,4,male-married/widowed,guarantor,2,real estate,32,none,own,1,unskilled-resident,1,none,yes,0 +3933279,> 200 DM or salary assignment,6,critical account - other non-bank loans,car (new),1323,100 - 500 DM,>= 7 years,2,male-divorced/separated,none,4,car or other,28,none,own,2,skilled,2,yes,yes,0 +2929379,< 0 DM,24,current loans paid,car (new),3123,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,building society savings/life insurance,27,none,own,1,skilled,1,none,yes,1 +7338113,< 0 DM,36,current loans paid,car (used),5493,< 100 DM,>= 7 years,2,male-single,none,4,unknown-none,42,none,for free,1,skilled,2,none,yes,0 +1383056,> 200 DM or salary assignment,9,current loans paid,radio/television,1126,100 - 500 DM,>= 7 years,2,male-divorced/separated,none,4,real estate,49,none,own,1,skilled,1,none,yes,0 +9894792,0 - 200 DM,24,critical account - other non-bank 
loans,radio/television,1216,100 - 500 DM,< 1 year,4,male-single,none,4,unknown-none,38,bank,own,2,skilled,2,none,yes,1 +4821533,< 0 DM,24,current loans paid,car (new),1207,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,24,none,rent,1,skilled,1,none,yes,1 +2599724,none,10,current loans paid,car (new),1309,unknown/none,1 - 4 years,4,male-single,guarantor,4,building society savings/life insurance,27,none,own,1,unskilled-resident,1,none,yes,1 +7122065,> 200 DM or salary assignment,15,critical account - other non-bank loans,car (used),2360,500 - 1000 DM,1 - 4 years,2,male-single,none,2,car or other,36,none,own,1,skilled,1,yes,yes,0 +4771670,0 - 200 DM,15,all loans at bank paid,car (new),6850,100 - 500 DM,unemployed,1,male-single,none,2,building society savings/life insurance,34,none,own,1,highly skilled,2,yes,yes,1 +7524556,none,24,current loans paid,radio/television,1413,< 100 DM,1 - 4 years,4,male-married/widowed,none,2,building society savings/life insurance,28,none,own,1,skilled,1,none,yes,0 +8295830,none,39,current loans paid,car (used),8588,100 - 500 DM,>= 7 years,4,male-single,none,2,car or other,45,none,own,1,highly skilled,1,yes,yes,0 +5815745,< 0 DM,12,current loans paid,car (new),759,< 100 DM,4 - 7 years,4,male-single,none,2,real estate,26,none,own,1,skilled,1,none,yes,1 +2954111,none,36,current loans paid,car (used),4686,< 100 DM,1 - 4 years,2,male-single,none,2,unknown-none,32,none,for free,1,highly skilled,1,yes,yes,0 +3356855,> 200 DM or salary assignment,15,current loans paid,business,2687,< 100 DM,4 - 7 years,2,male-single,none,4,building society savings/life insurance,26,none,rent,1,skilled,1,yes,yes,0 +2676379,0 - 200 DM,12,past payment delays,radio/television,585,< 100 DM,1 - 4 years,4,male-married/widowed,co-applicant,4,real estate,20,none,rent,2,skilled,1,none,yes,0 +8727753,none,24,current loans paid,car (new),2255,unknown/none,< 1 year,4,male-single,none,1,building society savings/life 
insurance,54,none,own,1,skilled,1,none,yes,0 +6823953,< 0 DM,6,critical account - other non-bank loans,car (new),609,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,3,building society savings/life insurance,37,none,own,2,skilled,1,none,no,0 +7281914,< 0 DM,6,critical account - other non-bank loans,car (new),1361,< 100 DM,< 1 year,2,male-single,none,4,real estate,40,none,own,1,unskilled-resident,2,none,no,0 +7911515,none,36,critical account - other non-bank loans,furniture/equipment,7127,< 100 DM,< 1 year,2,female-divorced/separated/married,none,4,building society savings/life insurance,23,none,rent,2,skilled,1,yes,yes,1 +6306871,< 0 DM,6,current loans paid,car (new),1203,100 - 500 DM,>= 7 years,3,male-single,none,2,building society savings/life insurance,43,none,own,1,skilled,1,yes,yes,0 +6961576,none,6,critical account - other non-bank loans,radio/television,700,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,36,none,for free,2,skilled,1,none,yes,0 +3215162,none,24,critical account - other non-bank loans,repairs,5507,< 100 DM,>= 7 years,3,male-single,none,4,unknown-none,44,none,for free,2,skilled,1,none,yes,0 +3199662,< 0 DM,18,current loans paid,radio/television,3190,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,real estate,24,none,own,1,skilled,1,none,yes,1 +3873021,< 0 DM,48,no credit - paid,furniture/equipment,7119,< 100 DM,1 - 4 years,3,male-single,none,4,unknown-none,53,none,for free,2,skilled,2,none,yes,1 +9751634,none,24,current loans paid,car (used),3488,100 - 500 DM,4 - 7 years,3,female-divorced/separated/married,none,4,car or other,23,none,own,1,skilled,1,none,yes,0 +5857366,0 - 200 DM,26,current loans paid,car (used),7966,< 100 DM,< 1 year,2,male-single,none,3,car or other,30,none,own,2,skilled,1,none,yes,0 +2509169,none,15,critical account - other non-bank loans,education,1532,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,none,3,car or other,31,none,own,1,skilled,1,none,yes,0 
+2621457,none,4,critical account - other non-bank loans,radio/television,1503,< 100 DM,4 - 7 years,2,male-single,none,1,real estate,42,none,own,2,unskilled-resident,2,none,yes,0 +8637579,< 0 DM,36,current loans paid,radio/television,2302,< 100 DM,1 - 4 years,4,male-divorced/separated,none,4,car or other,31,none,rent,1,skilled,1,none,yes,1 +4032468,< 0 DM,6,current loans paid,car (new),662,< 100 DM,< 1 year,3,male-single,none,4,real estate,41,none,own,1,unskilled-resident,2,yes,yes,0 +9481590,0 - 200 DM,36,current loans paid,education,2273,< 100 DM,4 - 7 years,3,male-single,none,1,car or other,32,none,own,2,skilled,2,none,yes,0 +7828502,0 - 200 DM,15,current loans paid,car (new),2631,100 - 500 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,28,none,rent,2,skilled,1,yes,yes,1 +6996047,none,12,past payment delays,car (used),1503,< 100 DM,1 - 4 years,4,male-married/widowed,none,4,real estate,41,none,rent,1,skilled,1,none,yes,0 +8785106,none,24,current loans paid,radio/television,1311,100 - 500 DM,4 - 7 years,4,male-married/widowed,none,3,building society savings/life insurance,26,none,own,1,skilled,1,yes,yes,0 +3836745,none,24,current loans paid,radio/television,3105,unknown/none,< 1 year,4,male-single,none,2,car or other,25,none,own,2,skilled,1,none,yes,0 +8659616,> 200 DM or salary assignment,21,critical account - other non-bank loans,education,2319,< 100 DM,< 1 year,2,male-divorced/separated,none,1,car or other,33,none,rent,1,skilled,1,none,yes,1 +5027921,< 0 DM,6,current loans paid,car (new),1374,unknown/none,unemployed,4,female-divorced/separated/married,none,3,building society savings/life insurance,75,none,own,1,highly skilled,1,yes,yes,0 +9292167,0 - 200 DM,18,critical account - other non-bank loans,furniture/equipment,3612,< 100 DM,>= 7 years,3,female-divorced/separated/married,none,4,building society savings/life insurance,37,none,own,1,skilled,1,yes,yes,0 +9619216,< 0 DM,48,current loans paid,car (new),7763,< 100 DM,>= 7 
years,4,male-single,none,4,unknown-none,42,bank,for free,1,highly skilled,1,none,yes,1 +5342884,> 200 DM or salary assignment,18,current loans paid,furniture/equipment,3049,< 100 DM,< 1 year,1,female-divorced/separated/married,none,1,building society savings/life insurance,45,stores,own,1,unskilled-resident,1,none,yes,0 +3841990,0 - 200 DM,12,current loans paid,radio/television,1534,< 100 DM,< 1 year,1,male-married/widowed,none,1,real estate,23,none,rent,1,skilled,1,none,yes,1 +9862078,none,24,past payment delays,car (new),2032,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,60,none,for free,2,skilled,1,yes,yes,0 +5779045,< 0 DM,30,current loans paid,furniture/equipment,6350,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life insurance,31,none,own,1,skilled,1,none,yes,1 +4285035,> 200 DM or salary assignment,18,current loans paid,furniture/equipment,2864,< 100 DM,1 - 4 years,2,male-single,none,1,real estate,34,none,own,1,unskilled-resident,2,none,yes,1 +9708453,none,12,critical account - other non-bank loans,car (new),1255,< 100 DM,>= 7 years,4,male-single,none,4,real estate,61,none,own,2,unskilled-resident,1,none,yes,0 +2572735,< 0 DM,24,past payment delays,car (new),1333,< 100 DM,unemployed,4,male-single,none,2,real estate,43,none,for free,2,skilled,2,none,yes,1 +9506182,none,24,critical account - other non-bank loans,car (new),2022,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,37,none,own,1,skilled,1,yes,yes,0 +3835775,none,24,current loans paid,radio/television,1552,< 100 DM,4 - 7 years,3,male-single,none,1,car or other,32,bank,own,1,skilled,2,none,yes,0 +8995384,< 0 DM,12,all loans at bank paid,radio/television,626,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,4,real estate,24,bank,own,1,unskilled-resident,1,none,yes,1 +1831870,none,48,critical account - other non-bank loans,car (used),8858,unknown/none,4 - 7 years,2,male-single,none,1,unknown-none,35,none,for 
free,2,skilled,1,yes,yes,0 +6930739,none,12,critical account - other non-bank loans,repairs,996,unknown/none,4 - 7 years,4,female-divorced/separated/married,none,4,real estate,23,none,own,2,skilled,1,none,yes,0 +2082338,none,6,all loans at bank paid,radio/television,1750,500 - 1000 DM,>= 7 years,2,male-single,none,4,building society savings/life insurance,45,bank,own,1,unskilled-resident,2,none,yes,0 +4367265,< 0 DM,48,current loans paid,radio/television,6999,< 100 DM,4 - 7 years,1,male-married/widowed,guarantor,1,real estate,34,none,own,2,skilled,1,yes,yes,1 +6993548,0 - 200 DM,12,critical account - other non-bank loans,car (new),1995,100 - 500 DM,< 1 year,4,male-single,none,1,car or other,27,none,own,1,skilled,1,none,yes,0 +5774450,0 - 200 DM,9,current loans paid,education,1199,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,67,none,own,2,highly skilled,1,yes,yes,0 +1859841,0 - 200 DM,12,current loans paid,radio/television,1331,< 100 DM,< 1 year,2,male-single,none,1,car or other,22,stores,own,1,skilled,1,none,yes,1 +2420058,0 - 200 DM,18,no credit - paid,car (new),2278,100 - 500 DM,< 1 year,3,female-divorced/separated/married,none,3,car or other,28,none,own,2,skilled,1,none,yes,1 +5136657,none,21,no credit - paid,car (new),5003,unknown/none,1 - 4 years,1,female-divorced/separated/married,none,4,building society savings/life insurance,29,bank,own,2,skilled,1,yes,yes,1 +4921496,< 0 DM,24,all loans at bank paid,furniture/equipment,3552,< 100 DM,4 - 7 years,3,male-single,none,4,car or other,27,bank,own,1,skilled,1,none,yes,1 +1259242,0 - 200 DM,18,critical account - other non-bank loans,furniture/equipment,1928,< 100 DM,< 1 year,2,male-single,none,2,real estate,31,none,own,2,unskilled-resident,1,none,yes,1 +4695993,< 0 DM,24,current loans paid,car (used),2964,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,49,bank,for free,1,skilled,2,yes,yes,0 +3427239,< 0 DM,24,all loans at bank 
paid,radio/television,1546,< 100 DM,4 - 7 years,4,male-single,guarantor,4,car or other,24,bank,rent,1,unskilled-resident,1,none,yes,1 +4030339,> 200 DM or salary assignment,6,past payment delays,radio/television,683,< 100 DM,< 1 year,2,female-divorced/separated/married,none,1,building society savings/life insurance,29,bank,own,1,skilled,1,none,yes,0 +6457386,0 - 200 DM,36,current loans paid,car (new),12389,unknown/none,1 - 4 years,1,male-single,none,4,unknown-none,37,none,for free,1,skilled,1,yes,yes,1 +2492415,0 - 200 DM,24,past payment delays,business,4712,unknown/none,1 - 4 years,4,male-single,none,2,building society savings/life insurance,37,bank,own,2,highly skilled,1,yes,yes,0 +3819769,0 - 200 DM,24,past payment delays,radio/television,1553,100 - 500 DM,4 - 7 years,3,female-divorced/separated/married,none,2,building society savings/life insurance,23,none,rent,2,skilled,1,yes,yes,0 +3248150,< 0 DM,12,current loans paid,car (new),1372,< 100 DM,4 - 7 years,2,male-divorced/separated,none,3,car or other,36,none,own,1,skilled,1,none,yes,1 +2140412,none,24,critical account - other non-bank loans,radio/television,2578,>= 1000 DM,>= 7 years,2,male-single,none,2,car or other,34,none,own,1,skilled,1,none,yes,0 +8918532,0 - 200 DM,48,current loans paid,radio/television,3979,unknown/none,4 - 7 years,4,male-single,none,1,car or other,41,none,own,2,skilled,2,yes,yes,0 +9205934,< 0 DM,48,current loans paid,radio/television,6758,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,car or other,31,none,own,1,skilled,1,yes,yes,1 +6546357,< 0 DM,24,current loans paid,furniture/equipment,3234,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,real estate,23,none,rent,1,unskilled-resident,1,yes,yes,1 +4352053,none,30,critical account - other non-bank loans,radio/television,5954,< 100 DM,4 - 7 years,3,male-single,co-applicant,2,car or other,38,none,own,1,skilled,1,none,yes,0 +6620263,none,24,current loans paid,car 
(used),5433,unknown/none,unemployed,2,female-divorced/separated/married,none,4,building society savings/life insurance,26,none,rent,1,highly skilled,1,yes,yes,0 +1039684,< 0 DM,15,current loans paid,business,806,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,22,none,own,1,unskilled-resident,1,none,yes,0 +1999133,0 - 200 DM,9,current loans paid,radio/television,1082,< 100 DM,>= 7 years,4,male-single,none,4,car or other,27,none,own,2,unskilled-resident,1,none,yes,0 +6641350,none,15,critical account - other non-bank loans,furniture/equipment,2788,< 100 DM,4 - 7 years,2,female-divorced/separated/married,co-applicant,3,car or other,24,bank,own,2,skilled,1,none,yes,0 +9791462,0 - 200 DM,12,current loans paid,radio/television,2930,< 100 DM,4 - 7 years,2,female-divorced/separated/married,none,1,real estate,27,none,own,1,skilled,1,none,yes,0 +2964444,none,24,critical account - other non-bank loans,education,1927,unknown/none,1 - 4 years,3,female-divorced/separated/married,none,2,car or other,33,none,own,2,skilled,1,yes,yes,0 +4612811,0 - 200 DM,36,critical account - other non-bank loans,car (new),2820,< 100 DM,< 1 year,4,male-divorced/separated,none,4,car or other,27,none,own,2,skilled,1,none,yes,1 +8885321,none,24,current loans paid,retraining,937,< 100 DM,< 1 year,4,male-married/widowed,none,3,car or other,27,none,own,2,unskilled-resident,1,none,yes,0 +8198782,0 - 200 DM,18,critical account - other non-bank loans,car (new),1056,< 100 DM,>= 7 years,3,male-single,guarantor,3,real estate,30,bank,own,2,skilled,1,none,yes,1 +2454414,0 - 200 DM,12,critical account - other non-bank loans,car (new),3124,< 100 DM,< 1 year,1,male-single,none,3,real estate,49,bank,own,2,unskilled-resident,2,none,yes,0 +5678348,none,9,current loans paid,furniture/equipment,1388,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,real estate,26,none,rent,1,skilled,1,none,yes,0 +5182819,0 - 200 DM,36,current loans paid,repairs,2384,< 
100 DM,< 1 year,4,male-single,none,1,unknown-none,33,none,rent,1,unskilled-resident,1,none,yes,1 +2727231,none,12,current loans paid,car (new),2133,unknown/none,>= 7 years,4,female-divorced/separated/married,none,4,unknown-none,52,none,for free,1,highly skilled,1,yes,yes,0 +1623348,< 0 DM,18,current loans paid,furniture/equipment,2039,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,4,real estate,20,bank,rent,1,skilled,1,none,yes,1 +2307018,< 0 DM,9,critical account - other non-bank loans,car (new),2799,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,36,none,rent,2,skilled,2,none,yes,0 +1856079,< 0 DM,12,current loans paid,furniture/equipment,1289,< 100 DM,1 - 4 years,4,male-single,guarantor,1,building society savings/life insurance,21,none,own,1,unskilled-resident,1,none,yes,0 +6274836,< 0 DM,18,current loans paid,domestic appliances,1217,< 100 DM,1 - 4 years,4,male-married/widowed,none,3,real estate,47,none,own,1,unskilled-resident,1,yes,yes,1 +6514251,< 0 DM,12,critical account - other non-bank loans,furniture/equipment,2246,< 100 DM,>= 7 years,3,male-single,none,3,building society savings/life insurance,60,none,own,2,skilled,1,none,yes,1 +4867066,< 0 DM,12,critical account - other non-bank loans,radio/television,385,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,3,real estate,58,none,own,4,unskilled-resident,1,yes,yes,0 +4883891,0 - 200 DM,24,past payment delays,car (new),1965,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,42,none,rent,2,skilled,1,yes,yes,0 +9366782,none,21,current loans paid,business,1572,>= 1000 DM,>= 7 years,4,female-divorced/separated/married,none,4,real estate,36,bank,own,1,unskilled-resident,1,none,yes,0 +8194462,0 - 200 DM,24,current loans paid,car (new),2718,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,4,building society savings/life insurance,20,none,rent,1,unskilled-resident,1,yes,yes,1 +3570555,< 0 DM,24,all loans at bank 
paid,other,1358,unknown/none,>= 7 years,4,male-single,none,3,car or other,40,stores,own,1,highly skilled,1,yes,yes,1 +1491305,0 - 200 DM,6,all loans at bank paid,car (new),931,100 - 500 DM,< 1 year,1,female-divorced/separated/married,none,1,building society savings/life insurance,32,stores,own,1,unskilled-resident,1,none,yes,1 +3303554,< 0 DM,24,current loans paid,car (new),1442,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,4,car or other,23,none,rent,2,skilled,1,none,yes,1 +3657362,0 - 200 DM,24,no credit - paid,business,4241,< 100 DM,1 - 4 years,1,male-single,none,4,real estate,36,none,own,3,unskilled-resident,1,yes,yes,1 +5220673,none,18,critical account - other non-bank loans,car (new),2775,< 100 DM,4 - 7 years,2,male-single,none,2,building society savings/life insurance,31,bank,own,2,skilled,1,none,yes,1 +5869534,none,24,past payment delays,business,3863,< 100 DM,1 - 4 years,1,male-single,none,2,unknown-none,32,none,for free,1,skilled,1,none,yes,0 +2213067,0 - 200 DM,7,current loans paid,radio/television,2329,< 100 DM,< 1 year,1,female-divorced/separated/married,guarantor,1,real estate,45,none,own,1,skilled,1,none,yes,0 +1024391,0 - 200 DM,24,all loans at bank paid,education,1837,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,4,unknown-none,34,bank,for free,1,unskilled-resident,1,none,yes,1 +5699583,none,36,current loans paid,furniture/equipment,3349,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,car or other,28,none,own,1,highly skilled,1,yes,yes,1 +2566539,> 200 DM or salary assignment,10,current loans paid,furniture/equipment,1275,< 100 DM,< 1 year,4,female-divorced/separated/married,none,2,building society savings/life insurance,23,none,own,1,skilled,1,none,yes,0 +6680952,< 0 DM,24,all loans at bank paid,furniture/equipment,2828,500 - 1000 DM,1 - 4 years,4,male-single,none,4,real estate,22,stores,own,1,skilled,1,yes,yes,0 +7697857,none,24,critical account - other non-bank loans,business,4526,< 100 DM,1 - 4 
years,3,male-single,none,2,real estate,74,none,own,1,highly skilled,1,yes,yes,0 +1147017,0 - 200 DM,36,current loans paid,radio/television,2671,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,co-applicant,4,unknown-none,50,none,for free,1,skilled,1,none,yes,1 +1877439,none,18,current loans paid,radio/television,2051,< 100 DM,< 1 year,4,male-single,none,1,real estate,33,none,own,1,skilled,1,none,yes,0 +8278164,none,15,current loans paid,car (used),1300,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,45,bank,for free,1,skilled,2,none,yes,0 +5491895,< 0 DM,12,current loans paid,domestic appliances,741,100 - 500 DM,unemployed,4,female-divorced/separated/married,none,3,building society savings/life insurance,22,none,own,1,skilled,1,none,yes,1 +1885075,> 200 DM or salary assignment,10,current loans paid,car (new),1240,100 - 500 DM,>= 7 years,1,female-divorced/separated/married,none,4,unknown-none,48,none,for free,1,unskilled-resident,2,none,yes,1 +6750986,< 0 DM,21,current loans paid,radio/television,3357,>= 1000 DM,< 1 year,4,female-divorced/separated/married,none,2,car or other,29,bank,own,1,skilled,1,none,yes,0 +8900389,< 0 DM,24,all loans at bank paid,car (used),3632,< 100 DM,1 - 4 years,1,female-divorced/separated/married,guarantor,4,car or other,22,bank,rent,1,skilled,1,none,no,0 +3583117,none,18,past payment delays,furniture/equipment,1808,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,1,real estate,22,none,own,1,skilled,1,none,yes,1 +6196790,0 - 200 DM,48,no credit - paid,business,12204,unknown/none,1 - 4 years,2,male-single,none,2,car or other,48,bank,own,1,highly skilled,1,yes,yes,0 +9626587,0 - 200 DM,60,past payment delays,radio/television,9157,unknown/none,1 - 4 years,2,male-single,none,2,unknown-none,27,none,for free,1,highly skilled,1,none,yes,0 +1451248,< 0 DM,6,critical account - other non-bank loans,car (new),3676,< 100 DM,1 - 4 years,1,male-single,none,3,real estate,37,none,rent,3,skilled,2,none,yes,0 +2753919,0 - 
200 DM,30,current loans paid,furniture/equipment,3441,100 - 500 DM,1 - 4 years,2,female-divorced/separated/married,co-applicant,4,car or other,21,none,rent,1,skilled,1,none,yes,1 +5526862,none,12,current loans paid,car (new),640,< 100 DM,1 - 4 years,4,male-divorced/separated,none,2,real estate,49,none,own,1,unskilled-resident,1,none,yes,0 +4872213,0 - 200 DM,21,critical account - other non-bank loans,business,3652,< 100 DM,4 - 7 years,2,male-single,none,3,building society savings/life insurance,27,none,own,2,skilled,1,none,yes,0 +2393150,none,18,critical account - other non-bank loans,car (new),1530,< 100 DM,1 - 4 years,3,male-single,none,2,building society savings/life insurance,32,bank,own,2,skilled,1,none,yes,1 +5713333,none,48,current loans paid,business,3914,unknown/none,1 - 4 years,4,male-divorced/separated,none,2,real estate,38,bank,own,1,skilled,1,none,yes,1 +1937588,< 0 DM,12,current loans paid,furniture/equipment,1858,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,car or other,22,none,rent,1,skilled,1,none,yes,0 +9355241,< 0 DM,18,current loans paid,radio/television,2600,< 100 DM,1 - 4 years,4,male-single,none,4,unknown-none,65,none,for free,2,skilled,1,none,yes,1 +5508301,none,15,current loans paid,radio/television,1979,unknown/none,>= 7 years,4,male-single,none,2,car or other,35,none,own,1,skilled,1,none,yes,0 +3586335,> 200 DM or salary assignment,6,current loans paid,furniture/equipment,2116,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,41,none,own,1,skilled,1,yes,yes,0 +1838843,0 - 200 DM,9,all loans at bank paid,car (new),1437,100 - 500 DM,4 - 7 years,2,male-single,none,3,unknown-none,29,none,own,1,skilled,1,none,yes,1 +4763001,none,42,critical account - other non-bank loans,furniture/equipment,4042,500 - 1000 DM,1 - 4 years,4,male-single,none,4,real estate,36,none,own,2,skilled,1,yes,yes,0 +4124475,none,9,current loans paid,education,3832,unknown/none,>= 7 years,1,male-single,none,4,real 
estate,64,none,own,1,unskilled-resident,1,none,yes,0 +1746129,< 0 DM,24,current loans paid,radio/television,3660,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,28,none,own,1,skilled,1,none,yes,0 +8578734,< 0 DM,18,all loans at bank paid,furniture/equipment,1553,< 100 DM,1 - 4 years,4,male-single,none,3,car or other,44,bank,own,1,skilled,1,none,yes,1 +3544577,0 - 200 DM,15,current loans paid,radio/television,1444,unknown/none,< 1 year,4,male-single,none,1,building society savings/life insurance,23,none,own,1,skilled,1,none,yes,0 +5640616,none,9,current loans paid,furniture/equipment,1980,< 100 DM,< 1 year,2,female-divorced/separated/married,co-applicant,2,car or other,19,none,rent,2,skilled,1,none,yes,1 +3695680,0 - 200 DM,24,current loans paid,car (new),1355,< 100 DM,< 1 year,3,female-divorced/separated/married,none,4,car or other,25,none,own,1,unskilled-resident,1,yes,yes,1 +1665899,none,12,current loans paid,education,1393,< 100 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,47,bank,own,3,skilled,2,yes,yes,0 +3336776,none,24,current loans paid,radio/television,1376,500 - 1000 DM,4 - 7 years,4,female-divorced/separated/married,none,1,car or other,28,none,own,1,skilled,1,none,yes,0 +3175919,none,60,past payment delays,radio/television,15653,< 100 DM,4 - 7 years,2,male-single,none,4,car or other,21,none,own,2,skilled,1,yes,yes,0 +6193893,none,12,current loans paid,radio/television,1493,< 100 DM,< 1 year,4,female-divorced/separated/married,none,3,car or other,34,none,own,1,skilled,2,none,yes,0 +9308059,< 0 DM,42,past payment delays,radio/television,4370,< 100 DM,4 - 7 years,3,male-single,none,2,building society savings/life insurance,26,bank,own,2,skilled,2,yes,yes,1 +7240888,< 0 DM,18,current loans paid,education,750,< 100 DM,unemployed,4,female-divorced/separated/married,none,1,real estate,27,none,own,1,unemployed-unskilled-non-resident,1,none,yes,1 +7390720,0 - 200 DM,15,current loans paid,repairs,1308,< 
100 DM,>= 7 years,4,male-single,none,4,car or other,38,none,own,2,unskilled-resident,1,none,yes,0 +2244055,none,15,current loans paid,education,4623,100 - 500 DM,1 - 4 years,3,male-single,none,2,building society savings/life insurance,40,none,own,1,highly skilled,1,yes,yes,1 +9290774,none,24,critical account - other non-bank loans,radio/television,1851,< 100 DM,4 - 7 years,4,male-married/widowed,guarantor,2,car or other,33,none,own,2,skilled,1,yes,yes,0 +9895994,< 0 DM,18,critical account - other non-bank loans,radio/television,1880,< 100 DM,4 - 7 years,4,male-married/widowed,none,1,building society savings/life insurance,32,none,own,2,highly skilled,1,yes,yes,0 +5985690,none,36,past payment delays,business,7980,unknown/none,< 1 year,4,male-single,none,4,car or other,27,none,rent,2,skilled,1,yes,yes,1 +1445890,< 0 DM,30,no credit - paid,furniture/equipment,4583,< 100 DM,1 - 4 years,2,male-divorced/separated,guarantor,2,real estate,32,none,own,2,skilled,1,none,yes,0 +9446992,none,12,current loans paid,car (new),1386,500 - 1000 DM,1 - 4 years,2,female-divorced/separated/married,none,2,building society savings/life insurance,26,none,own,1,skilled,1,none,yes,1 +7946983,> 200 DM or salary assignment,24,current loans paid,car (new),947,< 100 DM,4 - 7 years,4,male-single,none,3,unknown-none,38,bank,for free,1,skilled,2,none,yes,1 +3808692,< 0 DM,12,current loans paid,education,684,< 100 DM,1 - 4 years,4,male-single,none,4,car or other,40,none,rent,1,unskilled-resident,2,none,yes,1 +6960835,< 0 DM,48,current loans paid,education,7476,< 100 DM,4 - 7 years,4,male-single,none,1,unknown-none,50,none,for free,1,highly skilled,1,yes,yes,0 +6472767,0 - 200 DM,12,current loans paid,furniture/equipment,1922,< 100 DM,1 - 4 years,4,male-single,none,2,building society savings/life insurance,37,none,own,1,unskilled-resident,1,none,yes,1 +9116367,< 0 DM,24,current loans paid,car (new),2303,< 100 DM,>= 7 years,4,male-single,co-applicant,1,real estate,45,none,own,1,skilled,1,none,yes,1 
+2210355,0 - 200 DM,36,past payment delays,car (new),8086,100 - 500 DM,>= 7 years,2,male-single,none,4,car or other,42,none,own,4,highly skilled,1,yes,yes,1 +2597995,none,24,critical account - other non-bank loans,car (used),2346,< 100 DM,4 - 7 years,4,male-single,none,3,car or other,35,none,own,2,skilled,1,yes,yes,0 +9519867,< 0 DM,14,current loans paid,car (new),3973,< 100 DM,unemployed,1,male-single,none,4,unknown-none,22,none,for free,1,skilled,1,none,yes,0 +3698602,0 - 200 DM,12,current loans paid,car (new),888,< 100 DM,>= 7 years,4,male-single,none,4,car or other,41,bank,own,1,unskilled-resident,2,none,yes,1 +7748894,none,48,current loans paid,radio/television,10222,unknown/none,4 - 7 years,4,male-single,none,3,car or other,37,stores,own,1,skilled,1,yes,yes,0 +5309238,0 - 200 DM,30,no credit - paid,business,4221,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,1,car or other,28,none,own,2,skilled,1,none,yes,0 +4970061,0 - 200 DM,18,critical account - other non-bank loans,furniture/equipment,6361,< 100 DM,>= 7 years,2,male-single,none,1,unknown-none,41,none,own,1,skilled,1,yes,yes,0 +2854826,> 200 DM or salary assignment,12,current loans paid,radio/television,1297,< 100 DM,1 - 4 years,3,male-married/widowed,none,4,real estate,23,none,rent,1,skilled,1,none,yes,0 +3463374,< 0 DM,12,current loans paid,car (new),900,unknown/none,1 - 4 years,4,male-married/widowed,none,2,car or other,23,none,own,1,skilled,1,none,yes,1 +3481971,none,21,current loans paid,furniture/equipment,2241,< 100 DM,>= 7 years,4,male-single,none,2,real estate,50,none,own,2,skilled,1,none,yes,0 +1153712,0 - 200 DM,6,past payment delays,furniture/equipment,1050,< 100 DM,unemployed,4,male-single,none,1,building society savings/life insurance,35,stores,own,2,highly skilled,1,yes,yes,0 +3077979,> 200 DM or salary assignment,6,critical account - other non-bank loans,education,1047,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,4,building society savings/life 
insurance,50,none,own,1,unskilled-resident,1,none,yes,0 +6586335,none,24,critical account - other non-bank loans,other,6314,< 100 DM,unemployed,4,male-single,co-applicant,2,unknown-none,27,bank,own,2,highly skilled,1,yes,yes,0 +2300255,0 - 200 DM,30,all loans at bank paid,furniture/equipment,3496,>= 1000 DM,1 - 4 years,4,male-single,none,2,car or other,34,stores,own,1,skilled,2,yes,yes,0 +6107764,none,48,all loans at bank paid,business,3609,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,1,real estate,27,stores,own,1,skilled,1,none,yes,0 +9756743,< 0 DM,12,critical account - other non-bank loans,car (new),4843,< 100 DM,>= 7 years,3,male-single,co-applicant,4,building society savings/life insurance,43,none,rent,2,skilled,1,yes,yes,1 +2412093,> 200 DM or salary assignment,30,critical account - other non-bank loans,radio/television,3017,< 100 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,47,none,own,1,skilled,1,none,yes,0 +2967207,none,24,critical account - other non-bank loans,business,4139,100 - 500 DM,1 - 4 years,3,male-single,none,3,building society savings/life insurance,27,none,own,2,unskilled-resident,1,yes,yes,0 +5294255,none,36,current loans paid,business,5742,100 - 500 DM,4 - 7 years,2,male-single,none,2,car or other,31,none,own,2,skilled,1,yes,yes,0 +3960084,none,60,current loans paid,car (new),10366,< 100 DM,>= 7 years,2,male-single,none,4,building society savings/life insurance,42,none,own,1,highly skilled,1,yes,yes,0 +5926774,none,15,current loans paid,car (used),3029,< 100 DM,4 - 7 years,2,male-single,none,2,car or other,33,none,own,1,skilled,1,none,yes,0 +5969658,none,24,current loans paid,car (new),1393,< 100 DM,1 - 4 years,2,male-single,guarantor,2,real estate,31,none,own,1,skilled,1,yes,yes,0 +4016637,none,6,critical account - other non-bank loans,car (new),2080,500 - 1000 DM,1 - 4 years,1,male-married/widowed,none,2,car or other,24,none,own,1,skilled,1,none,yes,0 +5319265,none,21,past payment 
delays,business,2580,500 - 1000 DM,< 1 year,4,male-single,none,2,real estate,41,bank,own,1,unskilled-resident,2,none,yes,1 +8616661,none,30,critical account - other non-bank loans,radio/television,4530,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,4,car or other,26,none,rent,1,highly skilled,1,yes,yes,0 +9306673,none,24,critical account - other non-bank loans,furniture/equipment,5150,< 100 DM,>= 7 years,4,male-single,none,4,car or other,33,none,own,1,skilled,1,yes,yes,0 +7790498,0 - 200 DM,72,current loans paid,radio/television,5595,100 - 500 DM,1 - 4 years,2,male-married/widowed,none,2,car or other,24,none,own,1,skilled,1,none,yes,1 +3314262,< 0 DM,24,current loans paid,radio/television,2384,< 100 DM,>= 7 years,4,male-single,none,4,real estate,64,bank,rent,1,unskilled-resident,1,none,yes,0 +8563405,none,18,current loans paid,radio/television,1453,< 100 DM,< 1 year,3,female-divorced/separated/married,none,1,real estate,26,none,own,1,skilled,1,none,yes,0 +8021407,none,6,current loans paid,education,1538,< 100 DM,< 1 year,1,female-divorced/separated/married,none,2,unknown-none,56,none,own,1,skilled,1,none,yes,0 +1620760,none,12,current loans paid,radio/television,2279,unknown/none,1 - 4 years,4,male-single,none,4,unknown-none,37,none,for free,1,skilled,1,yes,yes,0 +7861677,none,15,past payment delays,radio/television,1478,< 100 DM,1 - 4 years,4,male-married/widowed,none,3,real estate,33,bank,own,2,skilled,1,none,yes,0 +5532106,none,24,critical account - other non-bank loans,radio/television,5103,< 100 DM,< 1 year,3,male-married/widowed,none,3,unknown-none,47,none,for free,3,skilled,1,yes,yes,0 +6577647,0 - 200 DM,36,past payment delays,business,9857,100 - 500 DM,4 - 7 years,1,male-single,none,3,building society savings/life insurance,31,none,own,2,unskilled-resident,2,yes,yes,0 +2592971,none,60,current loans paid,car (new),6527,unknown/none,1 - 4 years,4,male-single,none,4,unknown-none,34,none,for free,1,skilled,2,yes,yes,0 +7186212,> 200 DM or salary 
assignment,10,critical account - other non-bank loans,radio/television,1347,unknown/none,4 - 7 years,4,male-single,none,2,building society savings/life insurance,27,none,own,2,skilled,1,yes,yes,0 +5547078,0 - 200 DM,36,past payment delays,car (new),2862,100 - 500 DM,>= 7 years,4,male-single,none,3,unknown-none,30,none,for free,1,skilled,1,none,yes,0 +8856174,none,9,current loans paid,radio/television,2753,100 - 500 DM,>= 7 years,3,male-single,co-applicant,4,car or other,35,none,own,1,skilled,1,yes,yes,0 +3625778,< 0 DM,12,current loans paid,car (new),3651,>= 1000 DM,1 - 4 years,1,male-single,none,3,building society savings/life insurance,31,none,own,1,skilled,2,none,yes,0 +6075200,< 0 DM,15,critical account - other non-bank loans,furniture/equipment,975,< 100 DM,1 - 4 years,2,male-divorced/separated,none,3,building society savings/life insurance,25,none,own,2,skilled,1,none,yes,0 +2626457,0 - 200 DM,15,current loans paid,repairs,2631,100 - 500 DM,1 - 4 years,3,female-divorced/separated/married,none,2,real estate,25,none,own,1,unskilled-resident,1,none,yes,0 +7031172,0 - 200 DM,24,current loans paid,radio/television,2896,100 - 500 DM,< 1 year,2,male-single,none,1,car or other,29,none,own,1,skilled,1,none,yes,0 +7612370,< 0 DM,6,critical account - other non-bank loans,car (new),4716,unknown/none,< 1 year,1,male-single,none,3,real estate,44,none,own,2,unskilled-resident,2,none,yes,0 +3634470,none,24,current loans paid,radio/television,2284,< 100 DM,4 - 7 years,4,male-single,none,2,car or other,28,none,own,1,skilled,1,yes,yes,0 +1891464,none,6,current loans paid,car (used),1236,500 - 1000 DM,1 - 4 years,2,male-single,none,4,building society savings/life insurance,50,none,rent,1,skilled,1,none,yes,0 +4385444,0 - 200 DM,12,current loans paid,radio/television,1103,< 100 DM,4 - 7 years,4,male-single,guarantor,3,real estate,29,none,own,2,skilled,1,none,no,0 +4245378,none,12,critical account - other non-bank loans,car (new),926,< 100 
DM,unemployed,1,female-divorced/separated/married,none,2,building society savings/life insurance,38,none,own,1,unemployed-unskilled-non-resident,1,none,yes,0 +8748357,none,18,critical account - other non-bank loans,radio/television,1800,< 100 DM,1 - 4 years,4,male-single,none,2,car or other,24,none,own,2,skilled,1,none,yes,0 +6154067,> 200 DM or salary assignment,15,current loans paid,education,1905,< 100 DM,>= 7 years,4,male-single,none,4,car or other,40,none,rent,1,highly skilled,1,yes,yes,0 +8475240,none,12,current loans paid,furniture/equipment,1123,500 - 1000 DM,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,29,none,rent,1,unskilled-resident,1,none,yes,1 +8497474,< 0 DM,48,critical account - other non-bank loans,car (used),6331,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,46,none,for free,2,skilled,1,yes,yes,1 +4216405,> 200 DM or salary assignment,24,current loans paid,radio/television,1377,100 - 500 DM,>= 7 years,4,female-divorced/separated/married,none,2,unknown-none,47,none,for free,1,skilled,1,yes,yes,0 +2028520,0 - 200 DM,30,past payment delays,business,2503,100 - 500 DM,>= 7 years,4,male-single,none,2,building society savings/life insurance,41,stores,own,2,skilled,1,none,yes,0 +6062683,0 - 200 DM,27,current loans paid,business,2528,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,building society savings/life insurance,32,none,own,1,skilled,2,yes,yes,0 +3591516,none,15,current loans paid,car (new),5324,500 - 1000 DM,>= 7 years,1,female-divorced/separated/married,none,4,unknown-none,35,none,for free,1,skilled,1,none,yes,0 +8852736,0 - 200 DM,48,current loans paid,car (new),6560,100 - 500 DM,4 - 7 years,3,male-single,none,2,building society savings/life insurance,24,none,own,1,skilled,1,none,yes,1 +2188239,0 - 200 DM,12,no credit - paid,furniture/equipment,2969,< 100 DM,< 1 year,4,female-divorced/separated/married,none,3,building society savings/life insurance,25,none,rent,2,skilled,1,none,yes,1 +8356737,0 - 200 
DM,9,current loans paid,radio/television,1206,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,4,real estate,25,none,own,1,skilled,1,none,yes,0 +7311456,0 - 200 DM,9,current loans paid,radio/television,2118,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,37,none,own,1,unskilled-resident,2,none,yes,0 +2717625,none,18,critical account - other non-bank loans,radio/television,629,500 - 1000 DM,>= 7 years,4,male-single,none,3,building society savings/life insurance,32,bank,own,2,highly skilled,1,yes,yes,0 +1257668,< 0 DM,6,all loans at bank paid,education,1198,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,4,unknown-none,35,none,for free,1,skilled,1,none,yes,1 +9770090,none,21,current loans paid,car (used),2476,unknown/none,>= 7 years,4,male-single,none,4,real estate,46,none,own,1,highly skilled,1,yes,yes,0 +5376189,< 0 DM,9,critical account - other non-bank loans,radio/television,1138,< 100 DM,1 - 4 years,4,male-single,none,4,real estate,25,none,own,2,unskilled-resident,1,none,yes,0 +7684771,0 - 200 DM,60,current loans paid,car (new),14027,< 100 DM,4 - 7 years,4,male-single,none,2,unknown-none,27,none,own,1,highly skilled,1,yes,yes,1 +5147905,none,30,critical account - other non-bank loans,car (used),7596,unknown/none,>= 7 years,1,male-single,none,4,car or other,63,none,own,2,skilled,1,none,yes,0 +9729748,none,30,critical account - other non-bank loans,radio/television,3077,unknown/none,>= 7 years,3,male-single,none,2,car or other,40,none,own,2,skilled,2,yes,yes,0 +3354393,none,18,current loans paid,radio/television,1505,< 100 DM,1 - 4 years,4,male-single,none,2,unknown-none,32,none,for free,1,highly skilled,1,yes,yes,0 +2738606,> 200 DM or salary assignment,24,critical account - other non-bank loans,radio/television,3148,unknown/none,1 - 4 years,3,male-single,none,2,car or other,31,none,own,2,skilled,1,yes,yes,0 +7723603,0 - 200 DM,20,no credit - paid,car (used),6148,100 - 500 DM,>= 7 years,3,male-married/widowed,none,4,car or 
other,31,bank,own,2,skilled,1,yes,yes,0 +4275545,> 200 DM or salary assignment,9,no credit - paid,radio/television,1337,< 100 DM,< 1 year,4,male-single,none,2,car or other,34,none,own,2,highly skilled,1,yes,yes,1 +8516067,0 - 200 DM,6,all loans at bank paid,education,433,>= 1000 DM,< 1 year,4,female-divorced/separated/married,none,2,building society savings/life insurance,24,bank,rent,1,skilled,2,none,yes,1 +8269904,< 0 DM,12,current loans paid,car (new),1228,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,real estate,24,none,own,1,unskilled-resident,1,none,yes,1 +3797846,0 - 200 DM,9,current loans paid,radio/television,790,500 - 1000 DM,1 - 4 years,4,female-divorced/separated/married,none,3,real estate,66,none,own,1,unskilled-resident,1,none,yes,0 +1759390,none,27,current loans paid,car (new),2570,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,3,real estate,21,none,rent,1,skilled,1,none,yes,1 +8406970,none,6,critical account - other non-bank loans,car (new),250,>= 1000 DM,1 - 4 years,2,female-divorced/separated/married,none,2,real estate,41,bank,own,2,unskilled-resident,1,none,yes,0 +4726527,none,15,critical account - other non-bank loans,radio/television,1316,500 - 1000 DM,1 - 4 years,2,male-married/widowed,none,2,building society savings/life insurance,47,none,own,2,unskilled-resident,1,none,yes,0 +7293796,< 0 DM,18,current loans paid,radio/television,1882,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,25,bank,rent,2,skilled,1,none,yes,1 +8019704,0 - 200 DM,48,all loans at bank paid,business,6416,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,3,unknown-none,59,none,rent,1,skilled,1,none,yes,1 +1119926,> 200 DM or salary assignment,24,critical account - other non-bank loans,business,1275,>= 1000 DM,1 - 4 years,2,male-divorced/separated,none,4,real estate,36,none,own,2,skilled,1,yes,yes,0 +8976091,0 - 200 DM,24,past payment delays,radio/television,6403,< 100 DM,< 1 
year,1,male-single,none,2,car or other,33,none,own,1,skilled,1,none,yes,0 +2467002,< 0 DM,24,current loans paid,radio/television,1987,< 100 DM,1 - 4 years,2,male-single,none,4,real estate,21,none,rent,1,unskilled-resident,2,none,yes,1 +1767395,0 - 200 DM,8,current loans paid,radio/television,760,< 100 DM,4 - 7 years,4,female-divorced/separated/married,guarantor,2,real estate,44,none,own,1,unskilled-resident,1,none,yes,0 +1412724,none,24,current loans paid,car (used),2603,>= 1000 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,28,none,rent,1,skilled,1,yes,yes,0 +7497034,none,4,critical account - other non-bank loans,car (new),3380,< 100 DM,4 - 7 years,1,female-divorced/separated/married,none,1,real estate,37,none,own,1,skilled,2,none,yes,0 +1760449,0 - 200 DM,36,all loans at bank paid,domestic appliances,3990,unknown/none,< 1 year,3,female-divorced/separated/married,none,2,unknown-none,29,bank,own,1,unemployed-unskilled-non-resident,1,none,yes,0 +3573461,0 - 200 DM,24,current loans paid,car (used),11560,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,4,car or other,23,none,rent,2,highly skilled,1,none,yes,1 +5869439,< 0 DM,18,current loans paid,car (new),4380,100 - 500 DM,1 - 4 years,3,male-single,none,4,car or other,35,none,own,1,unskilled-resident,2,yes,yes,0 +5359049,none,6,critical account - other non-bank loans,car (new),6761,< 100 DM,4 - 7 years,1,male-single,none,3,unknown-none,45,none,own,2,highly skilled,2,yes,yes,0 +7610754,0 - 200 DM,30,no credit - paid,business,4280,100 - 500 DM,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,26,none,rent,2,unskilled-resident,1,none,yes,1 +7876212,< 0 DM,24,all loans at bank paid,car (new),2325,100 - 500 DM,4 - 7 years,2,male-single,none,3,car or other,32,bank,own,1,skilled,1,none,yes,0 +7058762,0 - 200 DM,10,all loans at bank paid,radio/television,1048,< 100 DM,1 - 4 years,4,male-single,none,4,real estate,23,stores,own,1,unskilled-resident,1,none,yes,0 
+4452376,none,21,current loans paid,radio/television,3160,unknown/none,>= 7 years,4,male-single,none,3,building society savings/life insurance,41,none,own,1,skilled,1,yes,yes,0 +3813837,< 0 DM,24,all loans at bank paid,furniture/equipment,2483,500 - 1000 DM,1 - 4 years,4,male-single,none,4,real estate,22,stores,own,1,skilled,1,yes,yes,0 +4808834,< 0 DM,39,critical account - other non-bank loans,furniture/equipment,14179,unknown/none,4 - 7 years,4,male-single,none,4,building society savings/life insurance,30,none,own,2,highly skilled,1,yes,yes,0 +6902723,< 0 DM,13,critical account - other non-bank loans,business,1797,< 100 DM,< 1 year,3,male-single,none,1,building society savings/life insurance,28,bank,own,2,unskilled-resident,1,none,yes,0 +8866025,< 0 DM,15,current loans paid,car (new),2511,< 100 DM,unemployed,1,female-divorced/separated/married,none,4,car or other,23,none,rent,1,skilled,1,none,yes,0 +5625748,< 0 DM,12,current loans paid,car (new),1274,< 100 DM,< 1 year,3,female-divorced/separated/married,none,1,real estate,37,none,own,1,unskilled-resident,1,none,yes,1 +3897156,none,21,current loans paid,car (used),5248,unknown/none,1 - 4 years,1,male-single,none,3,car or other,26,none,own,1,skilled,1,none,yes,0 +1337550,< 0 DM,6,current loans paid,furniture/equipment,428,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,1,building society savings/life insurance,49,bank,own,1,skilled,1,yes,yes,0 +4131044,< 0 DM,18,current loans paid,car (new),976,< 100 DM,< 1 year,1,female-divorced/separated/married,none,2,car or other,23,none,own,1,unskilled-resident,1,none,yes,1 +3163950,0 - 200 DM,12,current loans paid,business,841,100 - 500 DM,4 - 7 years,2,female-divorced/separated/married,none,4,real estate,23,none,rent,1,unskilled-resident,1,none,yes,0 +1657747,none,30,critical account - other non-bank loans,radio/television,5771,< 100 DM,4 - 7 years,4,female-divorced/separated/married,none,2,car or other,25,none,own,2,skilled,1,none,yes,0 +9250778,none,12,past 
payment delays,repairs,1555,>= 1000 DM,>= 7 years,4,male-single,none,4,unknown-none,55,none,for free,2,skilled,2,none,yes,1 +7923489,< 0 DM,24,current loans paid,car (new),1285,unknown/none,4 - 7 years,4,female-divorced/separated/married,none,4,unknown-none,32,none,rent,1,skilled,1,none,yes,1 +1700698,> 200 DM or salary assignment,15,critical account - other non-bank loans,radio/television,1271,unknown/none,1 - 4 years,3,male-single,none,4,unknown-none,39,none,for free,2,skilled,1,yes,yes,1 +8022152,< 0 DM,12,critical account - other non-bank loans,car (new),691,< 100 DM,>= 7 years,4,male-single,none,3,building society savings/life insurance,35,none,own,2,skilled,1,none,yes,1 +1605780,none,15,critical account - other non-bank loans,car (new),5045,unknown/none,>= 7 years,1,female-divorced/separated/married,none,4,car or other,59,none,own,1,skilled,1,yes,yes,0 +8975629,< 0 DM,18,critical account - other non-bank loans,furniture/equipment,2124,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,4,real estate,24,none,rent,2,skilled,1,none,yes,1 +4062254,< 0 DM,12,current loans paid,radio/television,2214,< 100 DM,1 - 4 years,4,male-single,none,3,building society savings/life insurance,24,none,own,1,unskilled-resident,1,none,yes,0 +9324382,none,21,critical account - other non-bank loans,car (new),12680,unknown/none,>= 7 years,4,male-single,none,4,unknown-none,30,none,for free,1,highly skilled,1,yes,yes,1 +1025006,none,24,critical account - other non-bank loans,car (new),2463,100 - 500 DM,4 - 7 years,4,male-married/widowed,none,3,building society savings/life insurance,27,none,own,2,skilled,1,yes,yes,0 +1846586,< 0 DM,30,current loans paid,furniture/equipment,3108,< 100 DM,< 1 year,2,male-divorced/separated,none,4,building society savings/life insurance,31,none,own,1,unskilled-resident,1,none,yes,1 +8968659,none,10,current loans paid,car (used),2901,unknown/none,< 1 year,1,female-divorced/separated/married,none,4,real estate,31,none,rent,1,skilled,1,none,yes,0 
+2233391,0 - 200 DM,12,critical account - other non-bank loans,furniture/equipment,3617,< 100 DM,>= 7 years,1,male-single,none,4,car or other,28,none,rent,3,skilled,1,yes,yes,0 +1491565,none,12,critical account - other non-bank loans,radio/television,1655,< 100 DM,>= 7 years,2,male-single,none,4,real estate,63,none,own,2,unskilled-resident,1,yes,yes,0 +4567772,< 0 DM,24,current loans paid,car (used),2812,unknown/none,>= 7 years,2,female-divorced/separated/married,none,4,real estate,26,none,rent,1,skilled,1,none,yes,0 +7000454,< 0 DM,36,critical account - other non-bank loans,education,8065,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,unknown-none,25,none,own,2,highly skilled,1,yes,yes,1 +8450534,none,21,critical account - other non-bank loans,car (used),3275,< 100 DM,>= 7 years,1,male-single,none,4,car or other,36,none,own,1,highly skilled,1,yes,yes,0 +9182898,none,24,critical account - other non-bank loans,radio/television,2223,100 - 500 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,52,bank,own,2,skilled,1,none,yes,0 +4237007,> 200 DM or salary assignment,12,critical account - other non-bank loans,car (new),1480,500 - 1000 DM,unemployed,2,male-single,none,4,unknown-none,66,bank,for free,3,unemployed-unskilled-non-resident,1,none,yes,0 +2480667,< 0 DM,24,current loans paid,car (new),1371,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,4,real estate,25,none,rent,1,skilled,1,none,yes,1 +2564769,none,36,critical account - other non-bank loans,car (new),3535,< 100 DM,4 - 7 years,4,male-single,none,4,car or other,37,none,own,2,skilled,1,yes,yes,0 +6154723,< 0 DM,18,current loans paid,radio/television,3509,< 100 DM,4 - 7 years,4,female-divorced/separated/married,guarantor,1,real estate,25,none,own,1,skilled,1,none,yes,0 +4299505,none,36,critical account - other non-bank loans,car (used),5711,>= 1000 DM,>= 7 years,4,male-single,none,2,car or other,38,none,own,2,highly skilled,1,yes,yes,0 +2287593,0 - 200 
DM,18,current loans paid,repairs,3872,< 100 DM,unemployed,2,female-divorced/separated/married,none,4,car or other,67,none,own,1,skilled,1,yes,yes,0 +3094366,0 - 200 DM,39,critical account - other non-bank loans,radio/television,4933,< 100 DM,4 - 7 years,2,male-single,guarantor,2,real estate,25,none,own,2,skilled,1,none,yes,1 +7536220,none,24,critical account - other non-bank loans,car (new),1940,>= 1000 DM,>= 7 years,4,male-single,none,4,real estate,60,none,own,1,skilled,1,yes,yes,0 +9783545,0 - 200 DM,12,no credit - paid,retraining,1410,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,31,none,own,1,unskilled-resident,1,yes,yes,0 +6146218,0 - 200 DM,12,current loans paid,car (new),836,100 - 500 DM,< 1 year,4,female-divorced/separated/married,none,2,building society savings/life insurance,23,bank,own,1,unskilled-resident,1,none,yes,1 +7934311,0 - 200 DM,20,current loans paid,car (used),6468,unknown/none,unemployed,1,male-divorced/separated,none,4,real estate,60,none,own,1,highly skilled,1,yes,yes,0 +7300802,0 - 200 DM,18,current loans paid,business,1941,>= 1000 DM,1 - 4 years,4,male-single,none,2,building society savings/life insurance,35,none,own,1,unskilled-resident,1,yes,yes,0 +7658137,none,22,current loans paid,radio/television,2675,500 - 1000 DM,>= 7 years,3,male-single,none,4,car or other,40,none,own,1,skilled,1,none,yes,0 +8404631,none,48,critical account - other non-bank loans,car (used),2751,unknown/none,>= 7 years,4,male-single,none,3,car or other,38,none,own,2,skilled,2,yes,yes,0 +6944710,0 - 200 DM,48,past payment delays,education,6224,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,50,none,for free,1,skilled,1,none,yes,1 +1514807,< 0 DM,40,critical account - other non-bank loans,education,5998,< 100 DM,1 - 4 years,4,male-single,none,3,unknown-none,27,bank,own,1,skilled,1,yes,yes,1 +2003384,0 - 200 DM,21,current loans paid,business,1188,< 100 DM,>= 7 years,2,female-divorced/separated/married,none,4,building society savings/life 
insurance,39,none,own,1,skilled,2,none,yes,1 +5659086,none,24,current loans paid,car (used),6313,unknown/none,>= 7 years,3,male-single,none,4,car or other,41,none,own,1,highly skilled,2,yes,yes,0 +4628471,none,6,critical account - other non-bank loans,furniture/equipment,1221,unknown/none,1 - 4 years,1,male-married/widowed,none,2,building society savings/life insurance,27,none,own,2,skilled,1,none,yes,0 +5403673,> 200 DM or salary assignment,24,current loans paid,furniture/equipment,2892,< 100 DM,>= 7 years,3,male-divorced/separated,none,4,unknown-none,51,none,for free,1,skilled,1,none,yes,0 +8614255,none,24,current loans paid,furniture/equipment,3062,500 - 1000 DM,>= 7 years,4,male-single,none,3,unknown-none,32,none,rent,1,skilled,1,yes,yes,0 +7923482,none,9,current loans paid,furniture/equipment,2301,100 - 500 DM,< 1 year,2,female-divorced/separated/married,none,4,building society savings/life insurance,22,none,rent,1,skilled,1,none,yes,0 +2383650,< 0 DM,18,current loans paid,car (used),7511,unknown/none,>= 7 years,1,male-single,none,4,building society savings/life insurance,51,none,for free,1,skilled,2,yes,yes,1 +6958612,none,12,critical account - other non-bank loans,furniture/equipment,1258,< 100 DM,< 1 year,2,female-divorced/separated/married,none,4,building society savings/life insurance,22,none,rent,2,unskilled-resident,1,none,yes,0 +1883328,none,24,past payment delays,car (new),717,unknown/none,>= 7 years,4,male-married/widowed,none,4,car or other,54,none,own,2,skilled,1,yes,yes,0 +8386168,0 - 200 DM,9,current loans paid,car (new),1549,unknown/none,< 1 year,4,male-single,none,2,real estate,35,none,own,1,unemployed-unskilled-non-resident,1,none,yes,0 +9660695,none,24,critical account - other non-bank loans,education,1597,< 100 DM,>= 7 years,4,male-single,none,4,unknown-none,54,none,for free,2,skilled,2,none,yes,0 +5786462,0 - 200 DM,18,critical account - other non-bank loans,radio/television,1795,< 100 DM,>= 7 
years,3,female-divorced/separated/married,guarantor,4,real estate,48,bank,rent,2,unskilled-resident,1,yes,yes,0 +1678929,< 0 DM,20,critical account - other non-bank loans,furniture/equipment,4272,< 100 DM,>= 7 years,1,female-divorced/separated/married,none,4,building society savings/life insurance,24,none,own,2,skilled,1,none,yes,0 +4809900,none,12,critical account - other non-bank loans,radio/television,976,unknown/none,>= 7 years,4,male-single,none,4,car or other,35,none,own,2,skilled,1,none,yes,0 +6983851,0 - 200 DM,12,current loans paid,car (new),7472,unknown/none,unemployed,1,female-divorced/separated/married,none,2,real estate,24,none,rent,1,unemployed-unskilled-non-resident,1,none,yes,0 +9335369,< 0 DM,36,current loans paid,car (new),9271,< 100 DM,4 - 7 years,2,male-single,none,1,car or other,24,none,own,1,skilled,1,yes,yes,1 +7140312,0 - 200 DM,6,current loans paid,radio/television,590,< 100 DM,< 1 year,3,male-married/widowed,none,3,real estate,26,none,own,1,unskilled-resident,1,none,no,0 +4639968,none,12,critical account - other non-bank loans,radio/television,930,unknown/none,>= 7 years,4,male-single,none,4,real estate,65,none,own,4,skilled,1,none,yes,0 +9646710,0 - 200 DM,42,all loans at bank paid,car (used),9283,< 100 DM,unemployed,1,male-single,none,2,unknown-none,55,bank,for free,1,highly skilled,1,yes,yes,0 +1322037,0 - 200 DM,15,no credit - paid,car (new),1778,< 100 DM,< 1 year,2,female-divorced/separated/married,none,1,real estate,26,none,rent,2,unemployed-unskilled-non-resident,1,none,yes,1 +3439631,0 - 200 DM,8,current loans paid,business,907,< 100 DM,< 1 year,3,male-married/widowed,none,2,real estate,26,none,own,1,skilled,1,yes,yes,0 +7403964,0 - 200 DM,6,current loans paid,radio/television,484,< 100 DM,4 - 7 years,3,male-married/widowed,guarantor,3,real estate,28,bank,own,1,unskilled-resident,1,none,yes,0 +6675884,< 0 DM,36,critical account - other non-bank loans,car (used),9629,< 100 DM,4 - 7 years,4,male-single,none,4,car or 
other,24,none,own,2,skilled,1,yes,yes,1 +5903457,< 0 DM,48,current loans paid,domestic appliances,3051,< 100 DM,1 - 4 years,3,male-single,none,4,car or other,54,none,own,1,skilled,1,none,yes,1 +6892095,< 0 DM,48,current loans paid,car (new),3931,< 100 DM,4 - 7 years,4,male-single,none,4,unknown-none,46,none,for free,1,skilled,2,none,yes,1 +2535665,0 - 200 DM,36,past payment delays,car (new),7432,< 100 DM,1 - 4 years,2,female-divorced/separated/married,none,2,building society savings/life insurance,54,none,rent,1,skilled,1,none,yes,0 +2113858,none,6,current loans paid,domestic appliances,1338,500 - 1000 DM,1 - 4 years,1,male-divorced/separated,none,4,real estate,62,none,own,1,skilled,1,none,yes,0 +6940571,none,6,critical account - other non-bank loans,radio/television,1554,< 100 DM,4 - 7 years,1,female-divorced/separated/married,none,2,car or other,24,none,rent,2,skilled,1,yes,yes,0 +1818305,< 0 DM,36,current loans paid,other,15857,< 100 DM,unemployed,2,male-divorced/separated,co-applicant,3,car or other,43,none,own,1,highly skilled,1,none,yes,0 +1136050,< 0 DM,18,current loans paid,radio/television,1345,< 100 DM,1 - 4 years,4,male-married/widowed,none,3,real estate,26,bank,own,1,skilled,1,none,yes,1 +5408293,none,12,current loans paid,car (new),1101,< 100 DM,1 - 4 years,3,male-married/widowed,none,2,real estate,27,none,own,2,skilled,1,yes,yes,0 +6973245,> 200 DM or salary assignment,12,current loans paid,radio/television,3016,< 100 DM,1 - 4 years,3,male-married/widowed,none,1,car or other,24,none,own,1,skilled,1,none,yes,0 +3458174,< 0 DM,36,current loans paid,furniture/equipment,2712,< 100 DM,>= 7 years,2,male-single,none,2,building society savings/life insurance,41,bank,own,1,skilled,2,none,yes,1 +1641060,< 0 DM,8,critical account - other non-bank loans,car (new),731,< 100 DM,>= 7 years,4,male-single,none,4,real estate,47,none,own,2,unskilled-resident,1,none,yes,0 +3831409,none,18,critical account - other non-bank loans,furniture/equipment,3780,< 100 DM,< 1 
year,3,male-divorced/separated,none,2,car or other,35,none,own,2,highly skilled,1,yes,yes,0 +7489128,< 0 DM,21,critical account - other non-bank loans,car (new),1602,< 100 DM,>= 7 years,4,male-married/widowed,none,3,car or other,30,none,own,2,skilled,1,yes,yes,0 +2737942,< 0 DM,18,critical account - other non-bank loans,car (new),3966,< 100 DM,>= 7 years,1,female-divorced/separated/married,none,4,real estate,33,bank,rent,3,skilled,1,yes,yes,1 +3155576,none,18,no credit - paid,business,4165,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,36,stores,own,2,skilled,2,none,yes,1 +6658732,< 0 DM,36,current loans paid,car (used),8335,unknown/none,>= 7 years,3,male-single,none,4,unknown-none,47,none,for free,1,skilled,1,none,yes,1 +6286813,0 - 200 DM,48,past payment delays,business,6681,unknown/none,1 - 4 years,4,male-single,none,4,unknown-none,38,none,for free,1,skilled,2,yes,yes,0 +7203909,none,24,past payment delays,business,2375,500 - 1000 DM,1 - 4 years,4,male-single,none,2,car or other,44,none,own,2,skilled,2,yes,yes,0 +8071605,< 0 DM,18,current loans paid,car (new),1216,< 100 DM,< 1 year,4,female-divorced/separated/married,none,3,car or other,23,none,rent,1,skilled,1,yes,yes,1 +2576628,< 0 DM,45,no credit - paid,business,11816,< 100 DM,>= 7 years,2,male-single,none,4,car or other,29,none,rent,2,skilled,1,none,yes,1 +3544441,0 - 200 DM,24,current loans paid,radio/television,5084,unknown/none,>= 7 years,2,female-divorced/separated/married,none,4,car or other,42,none,own,1,skilled,1,yes,yes,0 +6494860,> 200 DM or salary assignment,15,current loans paid,radio/television,2327,< 100 DM,< 1 year,2,female-divorced/separated/married,none,3,real estate,25,none,own,1,unskilled-resident,1,none,yes,1 +5247218,< 0 DM,12,no credit - paid,car (new),1082,< 100 DM,1 - 4 years,4,male-single,none,4,car or other,48,bank,own,2,skilled,1,none,yes,1 +7024848,none,12,current loans paid,radio/television,886,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,2,car or 
other,21,none,own,1,skilled,1,none,yes,0 +7182646,none,4,current loans paid,furniture/equipment,601,< 100 DM,< 1 year,1,female-divorced/separated/married,none,3,real estate,23,none,rent,1,unskilled-resident,2,none,yes,0 +6839088,< 0 DM,24,critical account - other non-bank loans,car (used),2957,< 100 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,63,none,own,2,skilled,1,yes,yes,0 +9873458,none,24,critical account - other non-bank loans,radio/television,2611,< 100 DM,>= 7 years,4,male-married/widowed,co-applicant,3,real estate,46,none,own,2,skilled,1,none,yes,0 +1896222,< 0 DM,36,current loans paid,furniture/equipment,5179,< 100 DM,4 - 7 years,4,male-single,none,2,building society savings/life insurance,29,none,own,1,skilled,1,none,yes,1 +3865701,none,21,past payment delays,car (used),2993,< 100 DM,1 - 4 years,3,male-single,none,2,real estate,28,stores,own,2,unskilled-resident,1,none,yes,0 +1157229,none,18,current loans paid,repairs,1943,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,real estate,23,none,own,1,skilled,1,none,yes,1 +3448928,none,24,all loans at bank paid,business,1559,< 100 DM,4 - 7 years,4,male-single,none,4,car or other,50,bank,own,1,skilled,1,yes,yes,0 +4318645,none,18,current loans paid,furniture/equipment,3422,< 100 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,47,bank,own,3,skilled,2,yes,yes,0 +7624820,0 - 200 DM,21,current loans paid,furniture/equipment,3976,unknown/none,4 - 7 years,2,male-single,none,3,car or other,35,none,own,1,skilled,1,yes,yes,0 +8556957,none,18,current loans paid,car (new),6761,unknown/none,1 - 4 years,2,male-single,none,4,car or other,68,none,rent,2,skilled,1,none,yes,1 +9294026,none,24,current loans paid,car (new),1249,< 100 DM,< 1 year,4,male-married/widowed,none,2,real estate,28,none,own,1,skilled,1,none,yes,0 +8814062,< 0 DM,9,current loans paid,radio/television,1364,< 100 DM,4 - 7 years,3,male-single,none,4,real 
estate,59,none,own,1,skilled,1,none,yes,0 +7578622,< 0 DM,12,current loans paid,radio/television,709,< 100 DM,>= 7 years,4,male-single,none,4,real estate,57,stores,own,1,unskilled-resident,1,none,yes,1 +1129973,< 0 DM,20,critical account - other non-bank loans,car (new),2235,< 100 DM,1 - 4 years,4,male-married/widowed,guarantor,2,building society savings/life insurance,33,bank,rent,2,skilled,1,none,no,1 +3785461,none,24,critical account - other non-bank loans,car (used),4042,unknown/none,4 - 7 years,3,male-single,none,4,building society savings/life insurance,43,none,own,2,skilled,1,yes,yes,0 +6563566,none,15,critical account - other non-bank loans,radio/television,1471,< 100 DM,1 - 4 years,4,male-single,none,4,unknown-none,35,none,for free,2,skilled,1,yes,yes,0 +8205907,< 0 DM,18,all loans at bank paid,car (new),1442,< 100 DM,4 - 7 years,4,male-single,none,4,unknown-none,32,none,for free,2,unskilled-resident,2,none,yes,1 +4983715,none,36,past payment delays,car (new),10875,< 100 DM,>= 7 years,2,male-single,none,2,car or other,45,none,own,2,skilled,2,yes,yes,0 +1607673,none,24,current loans paid,car (new),1474,100 - 500 DM,< 1 year,4,male-married/widowed,none,3,real estate,33,none,own,1,skilled,1,yes,yes,0 +7984360,none,10,current loans paid,retraining,894,unknown/none,4 - 7 years,4,female-divorced/separated/married,none,3,building society savings/life insurance,40,none,own,1,skilled,1,yes,yes,0 +8489771,none,15,critical account - other non-bank loans,furniture/equipment,3343,< 100 DM,1 - 4 years,4,male-single,none,2,unknown-none,28,none,for free,1,skilled,1,yes,yes,0 +1769803,< 0 DM,15,current loans paid,car (new),3959,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,building society savings/life insurance,29,none,own,1,skilled,1,yes,yes,1 +8600642,none,9,current loans paid,car (new),3577,100 - 500 DM,1 - 4 years,1,male-single,guarantor,2,real estate,26,none,rent,1,skilled,2,none,no,0 +2919320,none,24,critical account - other non-bank loans,car 
(used),5804,>= 1000 DM,1 - 4 years,4,male-single,none,2,real estate,27,none,own,2,skilled,1,none,yes,0 +2941810,none,18,past payment delays,business,2169,< 100 DM,1 - 4 years,4,male-married/widowed,none,2,car or other,28,none,own,1,skilled,1,yes,yes,1 +5203823,< 0 DM,24,current loans paid,radio/television,2439,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,real estate,35,none,own,1,skilled,1,yes,yes,1 +4987253,none,27,critical account - other non-bank loans,furniture/equipment,4526,>= 1000 DM,< 1 year,4,male-single,none,2,real estate,32,stores,own,2,unskilled-resident,2,yes,yes,0 +6273169,none,10,current loans paid,furniture/equipment,2210,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,25,bank,rent,1,unskilled-resident,1,none,yes,1 +6364124,none,15,current loans paid,furniture/equipment,2221,500 - 1000 DM,1 - 4 years,2,female-divorced/separated/married,none,4,car or other,20,none,rent,1,skilled,1,none,yes,0 +1218238,< 0 DM,18,current loans paid,radio/television,2389,< 100 DM,< 1 year,4,female-divorced/separated/married,none,1,car or other,27,stores,own,1,skilled,1,none,yes,0 +8334248,none,12,critical account - other non-bank loans,furniture/equipment,3331,< 100 DM,>= 7 years,2,male-single,none,4,building society savings/life insurance,42,stores,own,1,skilled,1,none,yes,0 +7214897,none,36,current loans paid,business,7409,unknown/none,>= 7 years,3,male-single,none,2,building society savings/life insurance,37,none,own,2,skilled,1,none,yes,0 +7299602,< 0 DM,12,current loans paid,furniture/equipment,652,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,24,none,rent,1,skilled,1,none,yes,0 +8110158,none,36,past payment delays,furniture/equipment,7678,500 - 1000 DM,4 - 7 years,2,female-divorced/separated/married,none,4,car or other,40,none,own,2,skilled,1,yes,yes,0 +3654815,> 200 DM or salary assignment,6,critical account - other non-bank loans,car (new),1343,< 100 DM,>= 7 
years,1,male-single,none,4,real estate,46,none,own,2,skilled,2,none,no,0 +6572384,< 0 DM,24,critical account - other non-bank loans,business,1382,100 - 500 DM,4 - 7 years,4,male-single,none,1,real estate,26,none,own,2,skilled,1,yes,yes,0 +4718363,none,15,current loans paid,domestic appliances,874,unknown/none,< 1 year,4,female-divorced/separated/married,none,1,real estate,24,none,own,1,skilled,1,none,yes,0 +4110964,< 0 DM,12,current loans paid,furniture/equipment,3590,< 100 DM,1 - 4 years,2,male-single,co-applicant,2,building society savings/life insurance,29,none,own,1,unskilled-resident,2,none,yes,0 +5456397,0 - 200 DM,11,critical account - other non-bank loans,car (new),1322,>= 1000 DM,1 - 4 years,4,female-divorced/separated/married,none,4,car or other,40,none,own,2,skilled,1,none,yes,0 +7560842,< 0 DM,18,all loans at bank paid,radio/television,1940,< 100 DM,< 1 year,3,male-single,co-applicant,4,unknown-none,36,bank,for free,1,highly skilled,1,yes,yes,0 +2867079,none,36,current loans paid,radio/television,3595,< 100 DM,>= 7 years,4,male-single,none,2,car or other,28,none,own,1,skilled,1,none,yes,0 +9165349,< 0 DM,9,current loans paid,car (new),1422,< 100 DM,< 1 year,3,male-single,none,2,unknown-none,27,none,for free,1,highly skilled,1,yes,yes,1 +5074170,none,30,critical account - other non-bank loans,radio/television,6742,unknown/none,4 - 7 years,2,male-single,none,3,building society savings/life insurance,36,none,own,2,skilled,1,none,yes,0 +1674264,none,24,current loans paid,car (used),7814,< 100 DM,4 - 7 years,3,male-single,none,3,car or other,38,none,own,1,highly skilled,1,yes,yes,0 +7549449,none,24,current loans paid,car (used),9277,unknown/none,1 - 4 years,2,male-divorced/separated,none,4,unknown-none,48,none,for free,1,skilled,1,yes,yes,0 +9453883,0 - 200 DM,30,critical account - other non-bank loans,car (new),2181,unknown/none,>= 7 years,4,male-single,none,4,real estate,36,none,own,2,skilled,1,none,yes,0 +7539149,none,18,critical account - other non-bank 
loans,radio/television,1098,< 100 DM,unemployed,4,female-divorced/separated/married,none,4,car or other,65,none,own,2,unemployed-unskilled-non-resident,1,none,yes,0 +2758767,0 - 200 DM,24,current loans paid,furniture/equipment,4057,< 100 DM,4 - 7 years,3,male-divorced/separated,none,3,car or other,43,none,own,1,skilled,1,yes,yes,1 +5771164,< 0 DM,12,current loans paid,education,795,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,53,none,own,1,skilled,1,none,yes,1 +7250365,0 - 200 DM,24,critical account - other non-bank loans,business,2825,unknown/none,4 - 7 years,4,male-single,none,3,unknown-none,34,none,own,2,skilled,2,yes,yes,0 +6463462,0 - 200 DM,48,current loans paid,business,15672,< 100 DM,1 - 4 years,2,male-single,none,2,car or other,23,none,own,1,skilled,1,yes,yes,1 +3678150,none,36,critical account - other non-bank loans,car (new),6614,< 100 DM,>= 7 years,4,male-single,none,4,car or other,34,none,own,2,highly skilled,1,yes,yes,0 +7159187,none,28,all loans at bank paid,car (used),7824,unknown/none,< 1 year,3,male-single,guarantor,4,real estate,40,bank,rent,2,skilled,2,yes,yes,0 +2996743,< 0 DM,27,critical account - other non-bank loans,business,2442,< 100 DM,>= 7 years,4,male-single,none,4,car or other,43,stores,own,4,highly skilled,2,yes,yes,0 +3755936,none,15,critical account - other non-bank loans,radio/television,1829,< 100 DM,>= 7 years,4,male-single,none,4,car or other,46,none,own,2,skilled,1,yes,yes,0 +9858169,< 0 DM,12,critical account - other non-bank loans,car (new),2171,< 100 DM,1 - 4 years,4,male-single,none,4,building society savings/life insurance,38,bank,own,2,unskilled-resident,1,none,no,0 +6011271,0 - 200 DM,36,critical account - other non-bank loans,car (used),5800,< 100 DM,1 - 4 years,3,male-single,none,4,car or other,34,none,own,2,skilled,1,yes,yes,0 +2676319,none,18,critical account - other non-bank loans,radio/television,1169,unknown/none,1 - 4 years,4,male-single,none,3,building 
society savings/life insurance,29,none,own,2,skilled,1,yes,yes,0 +9543202,none,36,past payment delays,car (used),8947,unknown/none,4 - 7 years,3,male-single,none,2,car or other,31,stores,own,1,highly skilled,2,yes,yes,0 +4682068,< 0 DM,21,current loans paid,radio/television,2606,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,28,none,rent,1,highly skilled,1,yes,yes,0 +5609033,none,12,critical account - other non-bank loans,furniture/equipment,1592,>= 1000 DM,4 - 7 years,3,female-divorced/separated/married,none,2,building society savings/life insurance,35,none,own,1,skilled,1,none,no,0 +1863246,none,15,current loans paid,furniture/equipment,2186,unknown/none,4 - 7 years,1,female-divorced/separated/married,none,4,real estate,33,bank,rent,1,unskilled-resident,1,none,yes,0 +3453367,< 0 DM,18,current loans paid,furniture/equipment,4153,< 100 DM,1 - 4 years,2,male-single,co-applicant,3,car or other,42,none,own,1,skilled,1,none,yes,1 +5078589,< 0 DM,16,critical account - other non-bank loans,car (new),2625,< 100 DM,>= 7 years,2,male-single,guarantor,4,building society savings/life insurance,43,bank,rent,1,skilled,1,yes,yes,1 +7409626,none,20,critical account - other non-bank loans,car (new),3485,unknown/none,< 1 year,2,male-divorced/separated,none,4,real estate,44,none,own,2,skilled,1,yes,yes,0 +3861438,none,36,critical account - other non-bank loans,car (used),10477,unknown/none,>= 7 years,2,male-single,none,4,unknown-none,42,none,for free,2,skilled,1,none,yes,0 +6653362,none,15,current loans paid,radio/television,1386,unknown/none,1 - 4 years,4,male-married/widowed,none,2,real estate,40,none,rent,1,skilled,1,yes,yes,0 +2279281,none,24,current loans paid,radio/television,1278,< 100 DM,>= 7 years,4,male-single,none,1,real estate,36,none,own,1,highly skilled,1,yes,yes,0 +9374296,< 0 DM,12,current loans paid,radio/television,1107,< 100 DM,1 - 4 years,2,male-single,none,2,real estate,20,none,rent,1,highly 
skilled,2,yes,yes,0 +2769534,< 0 DM,21,current loans paid,car (new),3763,unknown/none,4 - 7 years,2,male-single,co-applicant,2,real estate,24,none,own,1,unskilled-resident,1,none,no,0 +2117451,0 - 200 DM,36,current loans paid,education,3711,unknown/none,1 - 4 years,2,male-married/widowed,none,2,car or other,27,none,own,1,skilled,1,none,yes,0 +5910784,none,15,past payment delays,car (used),3594,< 100 DM,< 1 year,1,female-divorced/separated/married,none,2,building society savings/life insurance,46,none,own,2,unskilled-resident,1,none,yes,0 +6446222,0 - 200 DM,9,current loans paid,car (new),3195,unknown/none,1 - 4 years,1,female-divorced/separated/married,none,2,real estate,33,none,own,1,unskilled-resident,1,none,yes,0 +1562036,none,36,past payment delays,radio/television,4454,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,4,real estate,34,none,own,2,skilled,1,none,yes,0 +4535110,0 - 200 DM,24,critical account - other non-bank loans,furniture/equipment,4736,< 100 DM,< 1 year,2,female-divorced/separated/married,none,4,car or other,25,bank,own,1,unskilled-resident,1,none,yes,1 +4788744,0 - 200 DM,30,current loans paid,radio/television,2991,unknown/none,>= 7 years,2,female-divorced/separated/married,none,4,car or other,25,none,own,1,skilled,1,none,yes,0 +6790659,none,11,current loans paid,business,2142,>= 1000 DM,>= 7 years,1,male-divorced/separated,none,2,real estate,28,none,own,1,skilled,1,yes,yes,0 +6687237,< 0 DM,24,all loans at bank paid,business,3161,< 100 DM,1 - 4 years,4,male-single,none,2,building society savings/life insurance,31,none,rent,1,skilled,1,yes,yes,1 +3700907,0 - 200 DM,48,no credit - paid,other,18424,< 100 DM,1 - 4 years,1,female-divorced/separated/married,none,2,building society savings/life insurance,32,bank,own,1,highly skilled,1,yes,no,1 +2284183,none,10,current loans paid,car (used),2848,100 - 500 DM,1 - 4 years,1,male-single,co-applicant,2,real estate,32,none,own,1,skilled,2,none,yes,0 +9968219,< 0 DM,6,current loans paid,car 
(new),14896,< 100 DM,>= 7 years,1,male-single,none,4,unknown-none,68,bank,own,1,highly skilled,1,yes,yes,1 +6987335,< 0 DM,24,current loans paid,furniture/equipment,2359,100 - 500 DM,unemployed,1,male-divorced/separated,none,1,building society savings/life insurance,33,none,own,1,skilled,1,none,yes,1 +8752925,< 0 DM,24,current loans paid,furniture/equipment,3345,< 100 DM,>= 7 years,4,male-single,none,2,building society savings/life insurance,39,none,rent,1,highly skilled,1,yes,yes,1 +5367701,none,18,critical account - other non-bank loans,furniture/equipment,1817,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,unknown-none,28,none,own,2,skilled,1,none,yes,0 +5393100,none,48,past payment delays,radio/television,12749,500 - 1000 DM,4 - 7 years,4,male-single,none,1,car or other,37,none,own,1,highly skilled,1,yes,yes,0 +4459246,< 0 DM,9,current loans paid,radio/television,1366,< 100 DM,< 1 year,3,female-divorced/separated/married,none,4,building society savings/life insurance,22,none,rent,1,skilled,1,none,yes,1 +1342448,0 - 200 DM,12,current loans paid,car (new),2002,< 100 DM,4 - 7 years,3,male-single,none,4,building society savings/life insurance,30,none,rent,1,skilled,2,yes,yes,0 +1646042,< 0 DM,24,all loans at bank paid,furniture/equipment,6872,< 100 DM,< 1 year,2,male-divorced/separated,none,1,building society savings/life insurance,55,bank,own,1,skilled,1,yes,yes,1 +8308521,< 0 DM,12,all loans at bank paid,car (new),697,< 100 DM,< 1 year,4,male-single,none,2,car or other,46,bank,own,2,skilled,1,yes,yes,1 +6172317,< 0 DM,18,critical account - other non-bank loans,furniture/equipment,1049,< 100 DM,< 1 year,4,female-divorced/separated/married,none,4,building society savings/life insurance,21,none,rent,1,skilled,1,none,yes,0 +5333717,< 0 DM,48,current loans paid,car (used),10297,< 100 DM,4 - 7 years,4,male-single,none,4,unknown-none,39,stores,for free,3,skilled,2,yes,yes,1 +3861659,none,30,current loans paid,radio/television,1867,unknown/none,>= 7 
years,4,male-single,none,4,car or other,58,none,own,1,skilled,1,yes,yes,0 +6910543,< 0 DM,12,past payment delays,car (new),1344,< 100 DM,1 - 4 years,4,male-single,none,2,real estate,43,none,own,2,unskilled-resident,2,none,yes,0 +1061602,< 0 DM,24,current loans paid,furniture/equipment,1747,< 100 DM,< 1 year,4,male-single,co-applicant,1,building society savings/life insurance,24,none,own,1,unskilled-resident,1,none,no,0 +4970330,0 - 200 DM,9,current loans paid,radio/television,1670,< 100 DM,< 1 year,4,female-divorced/separated/married,none,2,car or other,22,none,own,1,skilled,1,yes,yes,1 +9608683,none,9,critical account - other non-bank loans,car (new),1224,< 100 DM,1 - 4 years,3,male-single,none,1,real estate,30,none,own,2,skilled,1,none,yes,0 +8847915,none,12,critical account - other non-bank loans,radio/television,522,500 - 1000 DM,>= 7 years,4,male-single,none,4,building society savings/life insurance,42,none,own,2,skilled,2,yes,yes,0 +4860879,< 0 DM,12,current loans paid,radio/television,1498,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,1,car or other,23,bank,own,1,skilled,1,none,yes,0 +1242372,0 - 200 DM,30,past payment delays,radio/television,1919,100 - 500 DM,< 1 year,4,male-single,none,3,unknown-none,30,stores,own,2,highly skilled,1,none,yes,1 +5171329,> 200 DM or salary assignment,9,current loans paid,radio/television,745,< 100 DM,1 - 4 years,3,female-divorced/separated/married,none,2,real estate,28,none,own,1,unskilled-resident,1,none,yes,1 +8086940,0 - 200 DM,6,current loans paid,radio/television,2063,< 100 DM,< 1 year,4,male-married/widowed,none,3,car or other,30,none,rent,1,highly skilled,1,yes,yes,0 +5621508,0 - 200 DM,60,current loans paid,education,6288,< 100 DM,1 - 4 years,4,male-single,none,4,unknown-none,42,none,for free,1,skilled,1,none,yes,1 +5701557,none,24,critical account - other non-bank loans,car (used),6842,unknown/none,1 - 4 years,2,male-single,none,4,building society savings/life insurance,46,none,own,2,highly 
skilled,2,yes,yes,0 +1490791,none,12,current loans paid,car (new),3527,unknown/none,< 1 year,2,male-single,none,3,building society savings/life insurance,45,none,own,1,highly skilled,2,yes,yes,0 +7984042,none,10,current loans paid,car (new),1546,< 100 DM,1 - 4 years,3,male-single,none,2,real estate,31,none,own,1,unskilled-resident,2,none,no,0 +4745533,none,24,current loans paid,furniture/equipment,929,unknown/none,4 - 7 years,4,male-single,none,2,car or other,31,stores,own,1,skilled,1,yes,yes,0 +1360133,none,4,critical account - other non-bank loans,car (new),1455,< 100 DM,4 - 7 years,2,male-single,none,1,real estate,42,none,own,3,unskilled-resident,2,none,yes,0 +9152373,< 0 DM,15,current loans paid,furniture/equipment,1845,< 100 DM,< 1 year,4,female-divorced/separated/married,guarantor,1,building society savings/life insurance,46,none,rent,1,skilled,1,none,yes,0 +8743438,0 - 200 DM,48,no credit - paid,car (new),8358,500 - 1000 DM,< 1 year,1,female-divorced/separated/married,none,1,car or other,30,none,own,2,skilled,1,none,yes,0 +4419761,< 0 DM,24,all loans at bank paid,furniture/equipment,3349,500 - 1000 DM,< 1 year,4,male-single,none,4,unknown-none,30,none,for free,1,skilled,2,yes,yes,1 +5135310,none,12,current loans paid,car (new),2859,unknown/none,unemployed,4,male-single,none,4,unknown-none,38,none,own,1,highly skilled,1,yes,yes,0 +6359252,none,18,current loans paid,furniture/equipment,1533,< 100 DM,< 1 year,4,male-married/widowed,co-applicant,1,building society savings/life insurance,43,none,own,1,unskilled-resident,2,none,yes,1 +2579500,none,24,current loans paid,radio/television,3621,100 - 500 DM,>= 7 years,2,male-single,none,4,car or other,31,none,own,2,skilled,1,none,yes,1 +5856859,0 - 200 DM,18,critical account - other non-bank loans,business,3590,< 100 DM,unemployed,3,male-married/widowed,none,3,car or other,40,none,own,3,unemployed-unskilled-non-resident,2,yes,yes,0 +4531283,< 0 DM,36,past payment delays,business,2145,< 100 DM,4 - 7 
years,2,male-single,none,1,car or other,24,none,own,2,skilled,1,yes,yes,1 +5825152,0 - 200 DM,24,current loans paid,car (used),4113,500 - 1000 DM,< 1 year,3,female-divorced/separated/married,none,4,car or other,28,none,rent,1,skilled,1,none,yes,1 +7225727,none,36,current loans paid,furniture/equipment,10974,< 100 DM,unemployed,4,female-divorced/separated/married,none,2,car or other,26,none,own,2,highly skilled,1,yes,yes,1 +7714243,< 0 DM,12,current loans paid,car (new),1893,< 100 DM,1 - 4 years,4,female-divorced/separated/married,guarantor,4,building society savings/life insurance,29,none,own,1,skilled,1,yes,yes,0 +4837212,< 0 DM,24,critical account - other non-bank loans,radio/television,1231,>= 1000 DM,>= 7 years,4,female-divorced/separated/married,none,4,building society savings/life insurance,57,none,rent,2,highly skilled,1,yes,yes,0 +6919828,> 200 DM or salary assignment,30,critical account - other non-bank loans,radio/television,3656,unknown/none,>= 7 years,4,male-single,none,4,building society savings/life insurance,49,stores,own,2,unskilled-resident,1,none,yes,0 +2791847,0 - 200 DM,9,critical account - other non-bank loans,radio/television,1154,< 100 DM,>= 7 years,2,male-single,none,4,real estate,37,none,own,3,unskilled-resident,1,none,yes,0 +2849845,< 0 DM,28,current loans paid,car (new),4006,< 100 DM,1 - 4 years,3,male-single,none,2,car or other,45,none,own,1,unskilled-resident,1,none,yes,1 +3667502,0 - 200 DM,24,current loans paid,furniture/equipment,3069,100 - 500 DM,>= 7 years,4,male-single,none,4,unknown-none,30,none,for free,1,skilled,1,none,yes,0 +3226277,none,6,critical account - other non-bank loans,radio/television,1740,< 100 DM,>= 7 years,2,male-married/widowed,none,2,real estate,30,none,rent,2,skilled,1,none,yes,0 +6097436,0 - 200 DM,21,past payment delays,car (new),2353,< 100 DM,1 - 4 years,1,male-divorced/separated,none,4,building society savings/life insurance,47,none,own,2,skilled,1,none,yes,0 +1545817,none,15,current loans paid,car 
(new),3556,unknown/none,1 - 4 years,3,male-single,none,2,unknown-none,29,none,own,1,skilled,1,none,yes,0 +7606017,none,24,current loans paid,radio/television,2397,500 - 1000 DM,>= 7 years,3,male-single,none,2,car or other,35,bank,own,2,skilled,1,yes,yes,1 +6567094,0 - 200 DM,6,current loans paid,repairs,454,< 100 DM,< 1 year,3,male-married/widowed,none,1,building society savings/life insurance,22,none,own,1,unskilled-resident,1,none,yes,0 +1721068,0 - 200 DM,30,current loans paid,radio/television,1715,unknown/none,1 - 4 years,4,female-divorced/separated/married,none,1,car or other,26,none,own,1,skilled,1,none,yes,0 +5114820,0 - 200 DM,27,critical account - other non-bank loans,radio/television,2520,500 - 1000 DM,1 - 4 years,4,male-single,none,2,building society savings/life insurance,23,none,own,2,unskilled-resident,1,none,yes,1 +7437809,none,15,current loans paid,radio/television,3568,< 100 DM,>= 7 years,4,female-divorced/separated/married,none,2,car or other,54,bank,rent,1,highly skilled,1,yes,yes,0 +9159315,none,42,current loans paid,radio/television,7166,unknown/none,4 - 7 years,2,male-married/widowed,none,4,building society savings/life insurance,29,none,rent,1,skilled,1,yes,yes,0 +5764003,< 0 DM,11,critical account - other non-bank loans,car (new),3939,< 100 DM,1 - 4 years,1,male-single,none,2,real estate,40,none,own,2,unskilled-resident,2,none,yes,0 +1419235,0 - 200 DM,15,current loans paid,repairs,1514,100 - 500 DM,1 - 4 years,4,male-single,guarantor,2,real estate,22,none,own,1,skilled,1,none,yes,0 +5842641,none,24,current loans paid,car (new),7393,< 100 DM,1 - 4 years,1,male-single,none,4,building society savings/life insurance,43,none,own,1,unskilled-resident,2,none,yes,0 +1259519,< 0 DM,24,all loans at bank paid,car (new),1193,< 100 DM,unemployed,1,female-divorced/separated/married,co-applicant,4,unknown-none,29,none,rent,2,unemployed-unskilled-non-resident,1,none,yes,1 +3793320,< 0 DM,60,current loans paid,business,7297,< 100 DM,>= 7 
years,4,male-single,co-applicant,4,unknown-none,36,none,rent,1,skilled,1,none,yes,1 +7276981,none,30,critical account - other non-bank loans,radio/television,2831,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,2,car or other,33,none,own,1,skilled,1,yes,yes,0 +3160914,> 200 DM or salary assignment,24,current loans paid,radio/television,1258,500 - 1000 DM,1 - 4 years,3,female-divorced/separated/married,none,3,car or other,57,none,own,1,unskilled-resident,1,none,yes,0 +4116332,0 - 200 DM,6,current loans paid,radio/television,753,< 100 DM,1 - 4 years,2,female-divorced/separated/married,guarantor,3,real estate,64,none,own,1,skilled,1,none,yes,0 +2401016,0 - 200 DM,18,past payment delays,business,2427,unknown/none,>= 7 years,4,male-single,none,2,building society savings/life insurance,42,none,own,2,skilled,1,none,yes,0 +6097153,none,24,past payment delays,car (new),2538,< 100 DM,>= 7 years,4,male-single,none,4,car or other,47,none,own,2,unskilled-resident,2,none,yes,1 +3797948,0 - 200 DM,15,all loans at bank paid,car (new),1264,100 - 500 DM,1 - 4 years,2,male-married/widowed,none,2,building society savings/life insurance,25,none,rent,1,skilled,1,none,yes,1 +6006685,0 - 200 DM,30,critical account - other non-bank loans,furniture/equipment,8386,< 100 DM,4 - 7 years,2,male-single,none,2,building society savings/life insurance,49,none,own,1,skilled,1,none,yes,1 +6720199,none,48,current loans paid,business,4844,< 100 DM,unemployed,3,male-single,none,2,car or other,33,bank,rent,1,highly skilled,1,yes,yes,1 +2846538,> 200 DM or salary assignment,21,current loans paid,car (new),2923,100 - 500 DM,1 - 4 years,1,female-divorced/separated/married,none,1,car or other,28,bank,own,1,highly skilled,1,yes,yes,0 +1852011,< 0 DM,36,current loans paid,car (used),8229,< 100 DM,1 - 4 years,2,male-single,none,2,building society savings/life insurance,26,none,own,1,skilled,2,none,yes,1 +6947593,none,24,critical account - other non-bank loans,furniture/equipment,2028,< 100 DM,4 - 
7 years,2,male-single,none,2,building society savings/life insurance,30,none,own,2,unskilled-resident,1,none,yes,0 +5518158,< 0 DM,15,critical account - other non-bank loans,furniture/equipment,1433,< 100 DM,1 - 4 years,4,female-divorced/separated/married,none,3,building society savings/life insurance,25,none,rent,2,skilled,1,none,yes,0 +8456653,> 200 DM or salary assignment,42,no credit - paid,business,6289,< 100 DM,< 1 year,2,male-divorced/separated,none,1,building society savings/life insurance,33,none,own,2,skilled,1,none,yes,0 +7323240,none,13,current loans paid,radio/television,1409,100 - 500 DM,unemployed,2,female-divorced/separated/married,none,4,real estate,64,none,own,1,skilled,1,none,yes,0 +5716073,< 0 DM,24,current loans paid,car (used),6579,< 100 DM,unemployed,4,male-single,none,2,unknown-none,29,none,for free,1,highly skilled,1,yes,yes,0 +4639718,0 - 200 DM,24,critical account - other non-bank loans,radio/television,1743,< 100 DM,>= 7 years,4,male-single,none,2,building society savings/life insurance,48,none,own,2,unskilled-resident,1,none,yes,0 +2797680,none,12,critical account - other non-bank loans,education,3565,unknown/none,< 1 year,2,male-single,none,1,building society savings/life insurance,37,none,own,2,unskilled-resident,2,none,yes,0 +3956429,none,15,all loans at bank paid,radio/television,1569,100 - 500 DM,>= 7 years,4,male-single,none,4,car or other,34,bank,own,1,unskilled-resident,2,none,yes,0 +3257050,< 0 DM,18,current loans paid,radio/television,1936,unknown/none,4 - 7 years,2,male-married/widowed,none,4,car or other,23,none,rent,2,unskilled-resident,1,none,yes,0 +3566008,< 0 DM,36,current loans paid,furniture/equipment,3959,< 100 DM,unemployed,4,male-single,none,3,building society savings/life insurance,30,none,own,1,highly skilled,1,yes,yes,0 +5331918,none,12,current loans paid,car (new),2390,unknown/none,>= 7 years,4,male-single,none,3,car or other,50,none,own,1,skilled,1,yes,yes,0 +9671059,none,12,current loans 
paid,furniture/equipment,1736,< 100 DM,4 - 7 years,3,female-divorced/separated/married,none,4,real estate,31,none,own,1,unskilled-resident,1,none,yes,0 +2180183,< 0 DM,30,current loans paid,car (used),3857,< 100 DM,1 - 4 years,4,male-divorced/separated,none,4,building society savings/life insurance,40,none,own,1,highly skilled,1,yes,yes,0 +3130615,none,12,current loans paid,radio/television,804,< 100 DM,>= 7 years,4,male-single,none,4,car or other,38,none,own,1,skilled,1,none,yes,0 +6267789,< 0 DM,45,current loans paid,radio/television,1845,< 100 DM,1 - 4 years,4,male-single,none,4,unknown-none,23,none,for free,1,skilled,1,yes,yes,1 +6959896,0 - 200 DM,45,critical account - other non-bank loans,car (used),4576,100 - 500 DM,unemployed,3,male-single,none,4,car or other,27,none,own,1,skilled,1,none,yes,0 diff --git a/Module4/IntroductionToRegression.ipynb b/Module4/IntroductionToRegression.ipynb index c3e4c51..92a5398 100644 --- a/Module4/IntroductionToRegression.ipynb +++ b/Module4/IntroductionToRegression.ipynb @@ -4,16 +4,15 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# Module 4 \n", "# Introduction to Regression\n", "\n", "## Introduction\n", "\n", - "In this lab you will learn to apply linear regression modules using the Python Scikit-Learn package. In particularly;\n", + "In this lab you will learn to apply linear regression models using the Python scikit-learn package. In particular, you will:\n", "\n", "1. Understand the basics of applying regression models for prediction. \n", "2. Evaluate the performance of regression models. \n", - "2. Apply a recipe for using Scikit-Learn to define, train and test machine learning models. " + "3. Apply a recipe for using scikit-learn to define, train and test machine learning models. " ] }, { @@ -26,9 +25,9 @@ "\n", "The method of regression is one of the oldest and most widely used analytics methods. The goal of regression is to produce a model that represents the ‘best fit’ to some observed data.
Typically the model is a function describing some type of curve (lines, parabolas, etc.) that is determined by a set of parameters (e.g., slope and intercept). “Best fit” means that there is an optimal set of parameters according to an evaluation criteria we choose.\n", "\n", - "A regression models attempt to predict the value of one variable, known as the **dependent variable**, **response variable** or **label**, using the values of other variables, known as **independent variables**, **explainatory variables** or **features**. Single regression has one label used to predict one feature. Multiple regression uses two of more feature variables. \n", + "A regression model attempts to predict the value of one variable, known as the **dependent variable**, **response variable** or **label**, using the values of other variables, known as **independent variables**, **explanatory variables** or **features**. Single regression uses a single feature to predict the label. Multiple regression uses two or more feature variables. \n", "\n", - "In mathematical form the goal of regression is to find a function of some features $X$ which predicts the label value $y$. This function can be writen as follows:\n", + "In mathematical form the goal of regression is to find a function of some features $X$ which predicts the label value $y$. This function can be written as follows:\n", "\n", "$$\hat{y} = f(X)$$\n", "\n", @@ -45,19 +44,19 @@ "\n", "In this lab, you will work with linear regression models. Linear regression are a foundational form of regression. Once you understand a bit about linear regression you will know quite a lot about machine learning in general. \n", "\n", - "The simplist case of linear regression is know as **single regression**, since there is a single feature. The function $f(X)$ is **linear in the model coefficients**. 
For a single vector of features $x$ the linear regression equation is written as follows:\n", + "The simplest case of linear regression is known as **single regression**, since there is a single feature. The function $f(X)$ is **linear in the model coefficients**. For a single vector of features $x$ the linear regression equation is written as follows:\n", "\n", "$$\hat{y} = a \cdot x + b$$\n", "\n", "The model coefficients are $a$, which we call the **slope** and $b$, which we call the **intercept**. Notice that this is just the equation of a straight line for one variable. \n", "\n", - "But, what are the best values of $a$ and $b$? In linear regression, $a$ and $b$ are chosen to minimize the squared error between the predictions and the known labels. This quantity is known as the **mean squared error** or MSE. For $n$ **training cases** the MSE is computed as follows:\n", + "But, what are the best values of $a$ and $b$? In linear regression, $a$ and $b$ are chosen to minimize the squared error between the predictions and the known labels. This quantity is known as the **sum of squared errors** or SSE. For $n$ **training cases** the SSE is computed as follows:\n", "\n", - "$$MSE = \sum_{i=1}^n \big( f(x_i) - y_i \big)^2\\\n", + "$$SSE = \sum_{i=1}^n \big( f(x_i) - y_i \big)^2\\\n", "= \sum_{i=1}^n \big( \hat{y}_i - y_i \big)^2\\\n", "= \sum_{i=1}^n \big( a \cdot x_i + b - y_i \big)^2$$\n", "\n", - "The approach to regression that minimizes MSE is know as the **method of least squares**." + "The approach to regression that minimizes SSE is known as the **method of least squares**." ] }, { @@ -66,7 +65,7 @@ "source": [ "### Execute a first linear regression example \n", "\n", - "With this bit of theory in mind, you will now train and evaluate a linear regresson model. In this case you will use simulated data, which means that you can compare the computed results to the known properties of the data. \n", + "With this bit of theory in mind, you will now train and evaluate a linear regression model. 
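The least-squares fit discussed above can be sketched in a few lines of Python. This is a hedged illustration, not the notebook's own code: it assumes NumPy is available and uses `np.polyfit`, which performs the least-squares minimization directly, on data simulated with the same properties as in the lab (x uniform on [0, 10], y equal to x plus normally distributed noise).

```python
import numpy as np

# Simulate data matching the lab's description: x uniform on [0, 10],
# y = x plus normally distributed noise, so the true slope is 1.0 and
# the true intercept is 0.0.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=100)
y = x + rng.normal(0.0, 1.0, size=100)

# np.polyfit with deg=1 finds the least-squares line y_hat = a*x + b.
a, b = np.polyfit(x, y, deg=1)

# The quantity being minimized: the sum of squared errors (SSE).
y_hat = a * x + b
sse = np.sum((y_hat - y) ** 2)
print(a, b, sse)  # slope near 1.0, intercept near 0.0
```

With enough simulated points the fitted coefficients should land close to the known true values, which is exactly why the lab uses simulated data to check the model.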
In this case you will use simulated data, which means that you can compare the computed results to the known properties of the data. \n", "\n", "As a first step, execute the code in the cell below to load the packages you will need to run the rest of this notebook. " ] @@ -100,7 +99,7 @@ "The code in the cell below simulates the data and plots the result. The data has the following properties:\n", "\n", "- The `x` variable is uniformly distributed between 0.0 and 10.0.\n", - "- The `y` variiable equals the `x` variable plus a Normally distributed rabdom component. As a result, for the un-scalled data, the slope coefficient should be 1.0 and the intercept 0.0. \n", + "- The `y` variable equals the `x` variable plus a Normally distributed random component. As a result, for the un-scalled data, the slope coefficient should be 1.0 and the intercept 0.0. \n", "\n", "Execute this code and examine the result. " ] @@ -109,13 +108,13 @@ "cell_type": "code", "execution_count": 2, "metadata": { - "scrolled": true + "scrolled": false }, "outputs": [ { "data": { "text/plain": [ - "" + "Text(0.5, 1.0, 'Data for regression')" ] }, "execution_count": 2, @@ -124,9 +123,9 @@ }, { "data": { - "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAhsAAAGJCAYAAAAjYfFoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3XmcXGWd7/FP9ZL0gh2aJhuLRJLxlyCCSBhgQFkcEUZx\nv+I2VwKOzgBzdQZnvCiKy+gdEMeLuDAuwDhzZUaHURkQrooCekVJQggyhh9JICzZu9N0SC9JV3Xd\nP86p7upOdXdVd506p6q+79erX+k6VXXqqR9Fn189z+95nlQ2m0VEREQkKg1xN0BERERqm5INERER\niZSSDREREYmUkg0RERGJlJINERERiZSSDREREYmUkg0RERGJlJINERERiZSSDREREYlUU9wNEKkX\nZnYf8Oq8Q1mgH3DgO8DX3D1T4jmPA77p7meUoX1nA/8IHAPc6+6vn+0564WZXQN80t0b426LSBIp\n2RCpnCzwMPAXQApoBA4DLgC+BJwJXFTiOf8bcFqZ2veFsF0XALvKdM568U3g7rgbIZJUSjZEKmuv\nu6+ecOwuM3PgBjP7obvfVsL5UmVsWxdwv7v/ooznrAvuvg3YFnc7RJIqpY3YRCrDzH4BZN393AL3\npYBngCfd/azwWAtwDfA24MXAfuC3wN+4+/qw6/6a8BRZ4NPu/hkz6wI+A7weWAzsA+4H/srdny7w\n2scAT4XnSIX/nuPuD5jZSuCzwClAc3ie/+nuvw+fexbwC+DPgY8BhwJvc/d7C7zOCPAp4ELgZcDn\n3f3vzOxo4DrgPKAFeBD4iLs/kvfcRcD/Bl4bHvoe0AO8x91fEj7mKeAHwAnAHwH/4u4fMLNO4O+B\nNwHzgEeAj7v7z/PO/9owZscDw8ADwEfd3cP7jyXofToDaAXWA59197vD+z9FMIzSkHfOi4CPAMvD\n/wY/BK5y9+fD+68B3gt8CPhfgAFPh+f9l4nxE6lmKhAVSQB3zwL3AqeaWe7/y38GLgY+R3CR/SuC\ni/T/Ce//FvBtguTgtPA2wI+BPwb+JnzeNcBrgK9P8vLbwufvBO4Kf3/YzM4B/l94/ouBS4GjgV+b\n2UsnnOOTwF8DlwO/nuKtXhW2/23A7WFi9CBwEnAZ8E6Cv0sPmJkBmNkcgoTmdOAvw7acCFwZti3f\n5QQJ2RuBb5vZ3PC5F4av/RbgWeCesEYFM3sJQSLwEPAG4BKCC/9d4f2p8Pc24D3huXuAH4VJCGE7\nRttiZlcD3w1j8VaCJOvtwC/CNuUsBm4kSGT+hCDp+6cC8RWpahpGEUmOHQS9B11m9jzQDlzh7reH\n9//SzOYB15vZAnffambPAeSGZsxsMfAC8GF3fzB83gNm9gfAnxV6UXcfBh4ys/3A7rxz/T3wBPD6\nMBnCzH4KbCboBXhn3mm+6u7/UcR7fMDd/3fuhpl9DugETnP358JjdwOPh69xEcG3/5cCJ+d6O8Je\noicLnP9pd/943vn/DHg5cKq7rwkP3xMW614LnAr8IUGPyv9y9x3h854D3mRm7cAhBMnHp939/4b3\nP0SQxOUnDrnXPBT4OHCTu38o7/h/EfSYrAJuCg+3Ape6+33hYzYS9G68niD2IjVByYZIcuTqL7Jh\nAvAnAGZ2BMHF9qUE37yhwEUOwN23E/Rq5IZH/oCgG/+MyZ5TiJm1ASuBT+USjfD8fWb2nwRFpPnW\nF3nqiY87l2BYY7uZ5c/kuAd4d/j7OQTDS6PDKu6+z8zuBM6ecL5HJtw+lyCJW5d3/hRwJ3BtmLz9\nhmCIao2ZfZ+g0PO+vOSk38x+D3zLzM4H/i9wt7t/ZJL3eDowB/jX/IPu/iszezps8015d/0m7/fn\nwn/bJzm3SFXSMIpIchwFDBJ00WNmrwsvcs8RdPO/h+CiCFMUhprZe8KL2pPAbQTd/gMltuXQ8DV2\nFLhvR3h/TpagJqEYEx/XRTBsM5z3c4Bgxk5HWLcyn8KzY3YWe
f7FBc5/bdjuxWEdy6sJLvqXEiQb\nO8zss3nn+WPgVoK6kn8BdprZv4bJykSd4b/FxA53H8r7PZfY6W+z1BR9oEUSIPzWfTbwK3fPhrUA\nPyCYKnusux8aFo7+5zTnORP4J+D7wJHuPt/dzyOoiyjF8wQX40UF7lsMdJd4vqle537gZIKelNzP\nKQTDGwcIkq2FBZ67oMjzPzHF+Z8CcPc17v52gqnIryHovfiYmb0tvH+Hu1/h7kcQ1JdcS1B38ncF\nXnMPQaIWdexEqoaSDZFk+HOCi1OuiPNkgmGPa919S97j/iT8N/f/7sRFwE4nuNB9Oq/+oJHgG3nR\n3H0AWAO8IyyQJDzXPIKhnF+Wcr4p3E9QD7HR3R/O/QDvI6hlGAkf8xIzOyGvHa0cPJQz2fmPJqhF\nyT//+cDfAmkz+5CZbTGzZndPh/UTHySI4zFmdpqZ7TCzkwHc/VF3/yTwO4IF0Cb6LUEP1LvyD5rZ\nqwhmFZUrdiJVQzUbIpXVYWanhr83AIcTXPg+APyzu/8ovO9hgkTiOjP7IkHisYqxC2xuTD83jfKd\nBMMAD4XHv2pmNxMMI1xGUCSJmbW7e3+Rbb2KoHbibjP7atiGqwjqET6T97jZrPXxDwQFoPea2fUE\nQ0jvJBjO+HD4mO8C/5Ng9sfVQB/BzJz5BMWUU7kFuAL4mZl9nmB68XkEicYN7p4xs58TTI39oZl9\nhSDufw4MAXcQzF4ZAP7ZzD5NMBTyWoIZMV+a+ILu3hsW137CzNIEvVHHEsTsMYLVYkXqino2RCrr\nJILpkL8m+Ib7HYJE4IPufnHuQe6+meCieyTwI4KCwhGCoZYs8KrwobcDqwnqCT7i7vcTTP88nWAK\n7PXAFoLpl+Q9r5Bx0zfDdSj+mGCmxm0ES5k/TTCzY8OE5xVj3PnD19hOsCbGUwS9OncQDHNc4u43\nho/JECQIa4GvEQwTPUYwzJRfo1Ho/AME7/mXBEMfPwbeDPytu18ZPuZ3BFNjX0SQ2NxOUHfxWnff\n5O77CZKL/yJY6+MegjqYD7j7PxeKg7t/miDJOyd8T58A/g14lbsPFnrOVO9DpNolalGvcP75GuBy\nd39gwn0dwO+Bj7m7vhmI1Ilw/5flE6fWmtlvgWfDWgsRSbDEDKOEicZtwHGTPOQ6guIqEakvhwDf\nN7OvAf9BsBbJRQR1LX8TZ8NEpDiJGEYxsxUE480vmeT+MxmbLy8idcTdHyLYcG4lwdDJ94GlwOsm\n9oCKSDIlpWfjLIKlmq9mwnoA4VLF3yAY//xm5ZsmInELh1CKWaFURBIoEcmGu4+uphduh5Dv48Ba\nd/9ZgftEREQk4RKRbEwmLAz7AOG0PREREak+iU42CIZPPunus1pxL5vNZlOp2SwFICIiUrdmfQFN\n1NRXADPLrSWwJfzZx9gbbSNYme8X7v76Ek6b3bt3kExmpHwNrTKNjQ10dLSiOCgOOYpFQHEYo1gE\nFIcxYSxmnWwkuWfjOWDZhGP3Eyyq891ST5bJjJBO1/eHBhSHHMVhjGIRUBzGKBYBxaF8EptshHsi\nPJl/LFz6d3e46qCIiIhUgUSsszHBVOM6yRrzERERkWklrmfD3RunuO/YSrZFREREZi+JPRsiIiJS\nQ5RsiIiISKSUbIiIiEiklGyIiIhIpJRsiIiISKSUbIiIiEiklGyIiIhIpJRsiIiISKSUbIiIiEik\nlGyIiIhIpJRsiIiISKSUbIiIiEiklGyIiIhIpJRsiIiISKSUbIiIiEiklGyIiIhIpJRsiIiISKSU\nbIiIiEiklGyIiIhIpJRsiIiISKSUbIiIiEiklGyIiIhIpJRsiIiISKSUbIiIiEiklGyIiIhIpJRs\niIiISKSUbIiIiEikmuJuQD4zmwusAS539wfCY6cBXwROAJ4Drnf3b8fXShERESlFYno2wkTjNuC4\nvGMLgR8DPwdeAXwKuNHML
[… base64-encoded PNG output elided …]\n",
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEWCAYAAABrDZDcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzt3XuUHGd55/HvMz1XjUaWhDS2dUNWEJYvB4I9IQI7xovZxARjs+eEi3bNEgKMNxvCNcHmsoY1h2AfOGzIJgFpzXXtiLCGrAUBYhbjI8habCTDCTY22MgXybI1I1vSjDT36Wf/6OpxaTQ9XX2pqu6u3+cczmh6qrveanzqqfd53/d5zd0REZHsaku7ASIiki4FAhGRjFMgEBHJOAUCEZGMUyAQEck4BQIRkYxTIJCWZmaXmNnDZnbCzF6XdnvSYGYfMrNb026HNC7TOgKJi5k9BpwJzACzwC+ArwI73D0f4f0bgUeBDnefqbINPwB2uftnq3m/SBaoRyBxe6279wHPB24Grge+kOD5nw88UM0bzaw9wjG5aj47qc8TiUKBQBLh7sfdfRfwRuAtZnYhgJm9xsx+amYjZnbAzD4Wetvu4OexILXzMjP7DTO728yeMbMjZna7mS1f6Jxm9mtgE/Ct4P1dZrbGzHaZ2bNm9oiZvSN0/MfM7A4zu83MRoA/XOAzv2xmnzOz75jZSeDfBJ/7aTN7wswOm9nnzawn9J4PmNlTZnbIzN5uZm5mL6jm88xslZl928yOBdfwIzNrC/52vZk9aWajZvZLM7sidF23hdpztZk9EHzGPWZ2Xuhvj5nZn5nZv5rZcTP7ezPrjvx/tDQlBQJJlLv/P+Ag8DvBSyeB/wgsB14D/HEol39Z8HO5uy9193sBAz4JrAHOA9YDHytxrt8AnqDQK1nq7pPAzuD8a4A/AP6ieMMMXAPcEbTn9hKX8e+BTwB9wI+BW4AXAr8JvABYC9wIYGZXAu8DXhX87RW1fB7w/qD9qymk3T4EuJmdC7wT+K2gB/Z7wGPzT2RmLwy+g/cEn/EdCoGyM3TYG4ArgXOAF7FAQJTWokAgaTgErARw93vc/efunnf3f6Vwk1roZklw/CPu/n13n3T3YeAzix0fZmbrgUuB6919wt1/BtwKvDl02L3u/r+D9oyX+Kg73f2fg3GOSeAdwHvd/Vl3HwX+AnhTcOwbgC+5+wPuPgb81xo/bxo4G3i+u0+7+4+8MNA3C3QB55tZh7s/5u6/XuBcbwT+MfgOp4FPAz3Ay0PH/JW7H3L3Z4FvUQhI0sIUCCQNa4FnAczst83sh2Y2bGbHgf8ErCr1RjPrN7OvBSmQEeC2xY6fZw1QvLkWPR60p+hAhM8JH7MaWALsC1Itx4DvBa8Xz3mgxHur+bxPAY8Ad5nZfjO7AQoBksJT/seAoeA7WrPAudZQuGaC9+WD84e/g6dD/x4Dli7wOdJCFAgkUWb2WxRuOj8OXvo7YBew3t3PAD5PIf0DsNCUtk8Gr7/I3ZcB14aOL+cQsNLM+kKvbQCeDP0eZRpd+JgjwDhwgbsvD/53hrsXb55PAetCx6+v5fPcfdTd3+/um4DXAu8rprbc/e/c/VIKA+ROIcU036Hg7wCYmQVtenKBYyUjFAgkEWa2zMyuAr4G3ObuPw/+1EfhKX3CzF5KIV9eNAzkKQz4Ejr+BIUB5LXAn0dtg7sfAP4v8Ekz6zazFwFvo/RYQJTPzAP/A/hvZtYPYGZrzez3gkO+DrzVzM4zsyU8l+uv6vPM7Coze0FwAx+hkBKaNbNzzeyVZtYFTFAIJrMLnOLrwGvM7Aoz66Aw5jAZfC+SUQoEErdvmdkohfTDhynk9N8a+vt/Bm4KjrmRwo0KgCCn/gngn4M0yVYKOfaLgOPAPwLfrLA924CNFJ6M/wH4qLt/v4rrCrueQrpmT5Cu+j/AucE1fBf4K+CHwTH3Bu+ZrObzgM
3B7yeCz/pbd7+HwvjAzRR6FE8D/RQGkk/h7r+k0Iv678Gxr6UwmD5VxXVLi9CCMpEEBVM17we6ql0kJ1Jv6hGIxMzM/p2ZdZrZCgp5+28pCEgjUSAQid91FMY7fk0hb//H6TZH5FRKDYmIZFxsPQIz+6KZDZnZ/aHXPmVmDwXL1//BSpQGEBGR5MTWIzCzyyjMbPiquxfryvwucLe7z5jZLQDufn25z1q1apVv3LgxlnaKiLSqffv2HXH31eWOK1tdsVruvtsKZYTDr90V+nUPhVovZW3cuJG9e/fWr3EiIhlgZo+XPyrdweI/Ar6b4vlFRISUAoGZfZjCZiUlV3Sa2aCZ7TWzvcPDw8k1TkQkYxIPBGb2FuAq4D/4IgMU7r7D3QfcfWD16rIpLhERqVJsYwQLCWqzXw+8IigfICIiKYtz+uhOCrVQzjWzg2b2NuCvKRQN+76Z/czMPh/X+UVEJJo4Zw1tW+DlJPeqFRGpm3seGmL77v0cODrG+hVLuO6yTVy+pT/tZtWFSkyIiJRxz0ND3LjrAYZGJ1je08HQ6AQ37nqAex4aSrtpdaFAICJSxvbd++nIGUs62zEr/OzIGdt370+7aXWhQCAiUsaBo2P0dOROea2nI8fBo60x50WBQESkjPUrljA+feqGb+PTs6xbsSSlFtWXAoGISBnXXbaJ6VlnbGoG98LP6Vnnuss2lX9zE1AgEBEp4/It/dx09QX093VzfHya/r5ubrr6gpaZNZTogjIRkWZ1+Zb+lrnxz6cegYhIxikQiIhknAKBiEjGKRCIiGScAoGISMYpEIiIZJymj4qINIA0q5uqRyAikrK0q5sqEIiIpCzt6qYKBCIiKUu7uqkCgYhIytKubqpAICKSsrSrmyoQiIikLO3qppo+KiJSJ7VMAU2zuql6BCIidZD2FNBaqEcgIplXj8Vc4SmgAEs62xmbmmH77v0Nv49BbIHAzL4IXAUMufuFwWsrgb8HNgKPAW9w96NxtUFEpJzik3xHzk55kr8p+HvUAHHg6BjLezpOea1ZNriPMzX0ZeDKea/dAPzA3TcDPwh+FxFJTanFXDd/98GKUj1pTwGtRWyBwN13A8/Oe/ka4CvBv78CvC6u84uIRFFqMdejz4xVtNo37SmgtUh6jOBMd38KwN2fMrOSiTMzGwQGATZs2JBQ80Qka9avWMLQ6MRcbh+Ye7KPsto3PL6wtDOHmXF8fJrezhyduTY+cuf9rN+dbBG5SjXsrCF33+HuA+4+sHr16rSbIyItqtST/KZVvWVTPfNnCk3nnZNTs7z+4nWMTeeZms03xQyipHsEh83s7KA3cDbQmN+KiGTG5Vv6uYnCWMHBo2OsCwaFAW7c9QBjUzP0dOQYn549LdVTaqbQrT9+lNV9XQvOICq+L41y06UkHQh2AW8Bbg5+3pnw+UVETlNqMddCASJ8XKmZQienZtmwQFrp4cMjJWcopRkM4pw+uhO4HFhlZgeBj1IIAF83s7cBTwCvj+v8IiK1rg8ot9q31PhCb2ehBzH/9alZ54wGXGsQ56yhbe5+trt3uPs6d/+Cuz/j7le4++bg5/xZRSIidZHESt9S4wtvv/ScU14fHp3g4NFxTkzO8NSxcUYnpuc+oxHWGmhlsYi0pGpX+lbSiyg1vnD5ln5etG4523fv5+HDI4xOzrKyt4OR8RmmZvMcOjbBmuXQ193REGsNFAhEpCVVs9J3sVXGiwWDhf5WfH3bjj1z6aPO3DSHjo/jOEMjE+TarCHWGjTs9FERkVpUs9I3ji0jwwvWlvV0sOaMHjpzbUzOeuLlpktRIBCRllTNSt84toycH5CW9XRw1hndvHTjSnYObk09CIACgYg0kXseGmLbjj1cesvdbNuxZ9GB32o2e4mjXlAzlJ4wd0+7DWUNDAz43r17026GiKQonL8PL/CqZ2olrnMUB6BLrUeIi5ntc/eBcsdpsFgkI+pRcz9NSdT7X2wWUK2f28jftQKBSAZUMx
um0SRV77/Rb9pxUCAQyYBm3j2rqNQq3qj5+2bvEcVJg8UiGRDHbJik1TLo2sz7CSdBgUAkA5p596yiamYBFc1fHzAz6wyNTnDdbfvKzj7KAqWGRDLguss2lS2p3Ayqzd+HxxdGxgurew3IuzfleEm9qUcgkgG1PE23gnCP6MiJSdowzIyu9lxdVg83O/UIRDIi7tkwjTwYG+4RTc3mMQA3Vvd1AemNl5T6zpL+LhUIRKRmaUxPrfQm+tz6gEJa6KwzuunrLqSL0hgvKfWd/cHBY9xx35OJfpdaWSyScfV4+gxX2Cwam5qhv6+bnYNb693k01YAHzkxydGxaTpzxtSss7K3g+f1di24MjiJFcpRlPrOhkcnT9nmsvh6Nd9l1JXFGiMQybB6TatMenpqeBbQ6MQMz5ycIu+FjePz7jxzYpoTkzML5v8bZbyk1Hd2cmo28am+Sg2JZFjUhWbleg21LvaqVHgW0NzgbxtMzzqdZjgwPDpJX3fHgjfRRlg9XOk2l3GmrtQjEMmwKE/yUXoNcVbYXKjiaHgW0NRsHjNwh7bgp1nhdWjc9RJRt7lMolqpAoFIhkVZaBZls5a40i2lgtDLNq2cu1l25tqYdccdVvV2kseZzTsdbdaQJZ+LSn1n73rVCxNPXSk1JJJhURaaRS32Vs90SzEVdd8TR+dm+BSD0NjUDPfuf5abrr6A7bv3c3xsipn8cwPEbW3G0bFplnS109/X3VDTWOcrt81lUhQIRDIsStnlpPP/4Vk9eXcMTtnsvRiEwjfLcL3/c1Yt5eYGvvk3olQCgZm9F3g74MDPgbe6+0QabRHJunJPn0mVp1ioF9CZa2Nm1sGeG/xdKAg1wuBvM0t8jMDM1gLvAgbc/UIgB7wp6XaISDRJTLcMjwXk3cm7c+jYBL2dOfI47s7kzGxD5/ybWVqpoXagx8ymgSXAoZTaISIRVPPEXclCtfCAdLgXcHJqljVn9HB4dAJza/icf7NKPBC4+5Nm9mngCWAcuMvd75p/nJkNAoMAGzZsSLaRIlKTxUpOAKcFiPCA9KqlXYXqoA6TM3nac5a5InlJS7zEhJmtAL4BvBE4Bvwv4A53v63Ue1RiQqS5lCqf0NFmjE3nTyvv0NuZY2o2P3f8yPg0h0cncIeLNqxouF5AIxfYC2vkEhOvAh5192F3nwa+Cbw8hXaISExKLVR79JmxBdckuPspi6iKvYDt117MzsGtDXWTbcXdztIIBE8AW81siZkZcAXwYArtEJGYlFqoBpSsr9MI9X+iiLLArtmkMUbwEzO7A7gPmAF+CuxIuh0iEp9SU043reotWUenWaaARl1g10xSKTHh7h919y3ufqG7v9ndJ9Noh4jEo9SU0+uv3HJKCmh4dIKDR8d5eGi0afYOboX9n+fTymIRiUWpJ/ziSuaHD48wOjk7VxqiWfYObpX9n8NUdE5EEnX5ln52Dm5l85nLWLeih1VLu5sq194o+xnUk3oEIhE0y3TBZtLMufZmGc+ISoFApIw09uONW5TAFnfwS7qYnZSm1JBIGa02XTDKPPioc+UX2jQmqjg3s5HKqEcgUkaaKYyoT+XV1vWBhbenjHJMrT2lKCWwJRkKBCJlVJPCqEdaJeqNttIbcpTAFuWYqPsdL6bVcu3NSqkhkTIqTWHUqwRB1JRUpamrKPPgoxwTZb9jaQ4KBCJlVDpdsF5jClFvtJXekKMEtijHtOLCqqxSakgkgkpSGPUaU4iakqo0dRUlNx/lmFZcWJVVCgQidVavaZFRb7TV3JCjBLZyx2iwt3UoEIjUWb2elKPeaOt5Q650kFuDva0h8Y1pqqGNaaTZFG+ozfSkHJ59FA5g9SifoJXZ6Yi6MY16BCIxaMYn5ajTQSu9qbfiyuxWo1lDIgJEm31UzdTYVluZ3YrUIxBpYZU8vUcZ5K5mEVkzF5fLCvUIRFpUpU/vUdYOVLOITOsNGp8CgUiTKlfwrdKUTJSFc9Xc1FVcrvEpNSTShK
IMwFaTkik3yF3tmgWtN2hsCgQiTShKrj6Oev/V3tSbcRZVligQiDShKE/7cZWA0E299WiMQKQJRcnVt+LeuhKPVHoEZrYcuBW4EHDgj9z93jTaItKMoj7t6+ldokirR/BZ4HvuvgV4MfBgSu0QaUp62pd6SrxHYGbLgMuAPwRw9ylgKul2iDQ7Pe1LvaSRGtoEDANfMrMXA/uAd7v7yfBBZjYIDAJs2LAh8UaKJElF2SRNiVcfNbMBYA9wibv/xMw+C4y4+38p9R5VH5VWVu+qnwoqUhS1+mgaYwQHgYPu/pPg9zuAi1Joh0hDqGdRtnrtlyzZknggcPengQNmdm7w0hXAL5Juh0ijqOcm8Kr0KdVIa0HZnwK3m1knsB94a0rtEIlduVRNPVcAq9KnVCOV6aPu/jN3H3D3F7n769z9aBrtEIlblFRNPYuyqdKnVEMri0ViND9VMzPrDI1OcN1t++YqhtZzTYAqfUo1VGtIJEbhVM3I+DSHjo9jQN79tIqh9ZjZo0qfUg0FApEYhfP/R05M0oaBQVeuLdLuXtXQQjOplFJDIjEKp2qmZvM4jjus7usCNJArjUGBQCRG4fx/mxltZqxZ3k1fdyFdpIFcaQQlA4GZfcfMNibXFJHWdPmWfnYObmX7tRfTv6ybXJvh7gyPTnDw6DgPD40uuNWkSFIW6xF8GbjLzD5sZh2LHCciEYR7B08fH+fo2DQrezs4a1m3VgBLqhatNWRmvcCNwJXA/wTyxb+5+2dib11AtYYkDXHW7Nm2Y89pi8jGpmbo7+tm5+DWupxDpF61hqaBk0AX0DfvfyItK+6aPfUsKyFSq5LTR83sSuAzwC7gInfXf6GSGVE2h4fqew1xbCwvUq3F1hF8GHi9uz+QVGNEkhDl5h2lZk+4fHS411BcILaYajeWV4lpiUPJ1JC7/46CgLSaqCmfKDV7aqn0WU1ZCZWYlrhoZbFkStSUT5Qn9lorfVa6Ajhq20UqpQVlkilRB2mjPLEnXelTA8wSF/UIpOHEmQdfaJD2yIlJxqZmufSWu085X7kn9mrz/OWUun4NMEtc1COQhhJ3Hnx+mebh0QmGT0zR25Wr+Hz1LB9dtNj1q8S0xEU9AknNQk++cefB55dpHpuapb+vk1VLuyOfL84ey2LXv3Nwq0pMSywUCCQVpaZejk3NcNay7lOOrXcePJzyufSWuysa8K1lymgU5QagVWJa4qBAIKmY/+Rb3LlrcibPiYkZzjqjvhU665V3j7vHonEASYPGCCQV4RkwxZ278nnHHGbyzsGj44yMT9UlD17PvHvcM3c0DiBpUCCQVISnXhZ37jIzejpzrF3eQ3vOeHpksi4DsIst/Kp0wDfuKaNxDECLlKPUkKQiPPVyajaPAbixuq+Lvu4O+rrbOT4+XZdKnJXk3YsppI/cef+CA8FxTRkN0ziAJC21HoGZ5czsp2b27bTaIOlJcueuqE/xUaau6oldWlGaPYJ3Aw8Cy1Jsg6So+ORbvAEXd+6q91N21Kf4qAPBemKXVpNKIDCzdcBrgE8A70ujDdI45s/tjzI/vpK5/FE/v9baQSLNKq0ewV8CH2CRDW7MbBAYBNiwYUNCzZK0VPKUXc1c/iifr6mbklWJjxGY2VXAkLvvW+w4d9/h7gPuPrB69eqEWifNYLFZQPc8NMS2HXu49Ja7K94QXlM3JavS6BFcAlxtZr8PdAPLzOw2d782hbZIE5qfwhkZn+bIiUn2D5/kvieOsrK3g+f1dlW86reaFJVIK0g8ELj7B4EPApjZ5cCfKQhIKQuNBYRTOMXFaABmkHfnmRPTdLXn6OvuqHjVrwaCJYu0oEwaVqnpnC/btHIuhXPkxCQAFixIy5lhBsOjhdc12CtSXqoLytz9HuCeNNuQFc24122p6Zz37n+Wm66+gO279/PYM2N05Yz+Zd0Mj04yM+tYG0zN5hkZn+bw6ATusG3HntivuRm/YxHQyuJMiLtiZlxKjQU89kzhCb
84iFtME7lTSBPlwdx58lghZbR2eXfs19ys37EIKDWUCbVssp6m8Irg4ljA1GyerpwtmCbq627neb2dtJnhZrS3GetW9LCspzP2a27W71gEFAgyIYm9bqNO26xkemd4Omd4LKB/WffcjbaYJiqWfDhn1VK2X3sxZy7r5gX9S+dKVsRxzWHaT1iamVJDGRD3QqmoaZFK0yfh6ZzhsYDizb14o11ops/63ckuDtNiNGlm6hFkQNwLpaKmRapJn1y+pZ+dg1t56caVnL2855Qn/MVutEkvDtNiNGlmCgQZEHfFzKhpkVrSJ5XeaJOuEqqqpNLMlBrKiDgXSpVLixSnVQ6PTnJkdLKqbSirWfWb9OIwLUaTZpX5QKC537VbqMzz8fFpOnNtXPzxuzgxOcvK3g7OWtbFk8cmOHh0nLXLnfZcW0XpE91oReKR6UAQ19zvrASX8HUu7cxhZhwfn6a3M4dRWNQ1MZ2fK/uwZnk3a5f3cHh0gqdHJrlow4qW/W5EmkmmxwjimPsdZZerVjD/OqfzzsmpWT5+zYWs6O1iWU8HSzrbmZrNn1L2YVlPBy9YvZT+vi52Dm5VEBBpAJnuEcSxEUnUXa4aRbW9l8WuM/y9dubaTin7AJpWKdJoMt0jiLqXbSWaaWFRLb2Xxa4z/L2uWtpFHmc273S0maZVijSgTAeCOOZ+xxFc4lJLamyx6wx/r+GyD0u62jWtUqQBZTo1FMdGJFE3Sm8EUVNjC6WPFrvO+d/rOauWcrMGhUUalrl72m0oa2BgwPfu3Zt2MyIr3jhrDS5xzz7atmPPafP/x6Zm6O/rZufg1rk2FGdWhW/4N119AaDdvEQamZntc/eBsscpEDSmxW7Ai91sKwke889x5MQkR8em6etuZ3N/H9ddtontu/eXDRYi0piiBoJMjxE0smry95UO/obLIjx9fJyjY9PBwq/n6vc/PDTaNIPfIlIdBYIGVcnso2Jp5+tu28fQyASzea+4qNvmM5exbkUPq5Z2n/LeqZl80wx+i0h1MjdY3Cyrfheq33PkxCRjU7Ncesvdc20H5tI7eXcMOHRsgjXLoa+7I/LTe6mB486czc0AavTBbxGpTqZ6BM2w6rf4dP+rwyMcPDrOkRMTuDvDoxMMn5iityt3Sttv+d5DcymkzlxbsIn7c5u3R316LzUddPOZy1RVU6TFZapH0Kirfou9lF8dHpkr0Hb2GT0cOTHJsyenmZ51pmby9Pd1smpp9ylt33/kJJv7lwKFxVuHjo9jDpMz+YrWRZSbDqobv0jrylQgiKOkRK3CM3fCBdq62nOs7uumN1iEVartUHhyX9LZzrLg74dHJzA3+vu6I6e+4lhTISLNIfFAYGbrga8CZwF5YIe7fzaJczfidoLhXkqxQJtTSO2Ec/yl2n7O85YwNp2fe5Jvz1nV6Zvwk3+xl/KRO+9v6LEUEaldGmMEM8D73f08YCvwJ2Z2fhInbsTtBMOzgzpzbbiD2ekF2kq1/YZXn1f3HH4zjKWISP0k3iNw96eAp4J/j5rZg8Ba4Bdxn7sR0x/hJ/1ijp88pxVoK9f2Sq6h3MypRh1LEZF4pLqy2Mw2AruBC919ZN7fBoFBgA0bNlz8+OOPJ96+JERZ3VvPm2+UFcuX3nI3y3s6MLO597k7x8en+dH1r6xbW0QkXlFXFqc2WGxmS4FvAO+ZHwQA3H0HsAMKJSYSbl5iki7QFuVpf7GxlGZZhyEi0aUSCMysg0IQuN3dv5lGGxpJktMzo8ycKjWV9GWbVsaytaeIpCvxwWIr5Bu+ADzo7p9J+vytorjw7NJb7mbbjj2RB3Kj7JcQrkEUHoC+d/+zdd/aU0TSl0aP4BLgzcDPzexnwWsfcvfvpNCWphTO81f6ZB51v4SFeikfufP+hluHISK1S2PW0I8BK3tgjVo5l13LrJ5aZk414joMEaldS64sruWJuZZzJhV4atlZrPikX03bmmn3NRGJriWLztWyF281qlmAVW2OH6
Ll+eNYFFZq7KBVeloiWdWSPYKkawpVmqqptccS5ck8rkVhKkAn0npaskcQ5Ym5nirZRAZq77FEeTKvtE0ikl0t2SNIOpdd6SBqPXos5Z7MNbArIlG1ZI8g6Vx2pcXskuixNGKBPRFpTC3ZI4Bkc9mVTslMosfSiAX2RKQxpVp0LqqBgQHfu3dv2s2oq+LUTt2kRSQuDV90LotaeZGbiDSvlhwjaETa7EVEGpUCQUKSXuQmIhKVAkFCNK9fRBqVAkFCkl7kJiISlQJBhaqtEaR5/SLSqBQIKlDLgK8KtolIo9L00QrUWshNBdtEpBGpR1ABDfiKSCtSj6AC1RRy0yIyEWl0mQgEtd6Mi+//1eERTkzOsrK3g+f1dpWtEZTGTmkiIpVq+dRQrSt6w+8/+4weVizp4NmT0zw9MlF2wFeLyESkGbR8j6DWAd7571/d101vVzv9fd3sHNy66HuT3ilNRKQaLd8jqHWAt5b3axGZiDSDVAKBmV1pZr80s0fM7IY4z1XrzbiW92sRmYg0g8QDgZnlgL8BXg2cD2wzs/PjOl+tN+PF3l9ulbEWkYlIM0hjjOClwCPuvh/AzL4GXAP8Io6TVbtTV3im0dLOHGbG8fHpufcDkWYEaRGZiDS6NALBWuBA6PeDwG/HecJKb8bzp30Wponm+fg1F859zrYde2oahBYRaRRpjBHYAq+dtl+mmQ2a2V4z2zs8PJxAs54TZdqnVhmLSKtIIxAcBNaHfl8HHJp/kLvvcPcBdx9YvXp1Yo2DaDd5zQgSkVaRRiD4F2CzmZ1jZp3Am4BdKbSjpCg3ec0IEpFWkXggcPcZ4J3APwEPAl939weSbsdiotzkNSNIRFqFuZ+Wnm84AwMDvnfv3kTPWZw1VMlMIxGRRmJm+9x9oNxxLV9iolqa9ikiWdHyJSZERGRxCgQiIhmnQCAiknEKBCIiGadAICKScQoEIiIZp0AgIpJxCgQiIhmnQCAiknFaWRwS3oxmvcpKiEhGqEcQKG5GMzQ6ccqOY/O3nxQRaTUKBIEom9GIiLQiBYKAdhwTkaxSIAhoxzERySoFgoB2HBORrFIgCGjHMRHJKk0fDdFmNCKSReoRiIhknAKBiEjGKRCIiGScAoGISMbLpB7oAAAEPElEQVQpEIiIZJy5e9ptKMvMhoHHq3z7KuBIHZvTLLJ43Vm8Zsjmdeuao3m+u68ud1BTBIJamNledx9Iux1Jy+J1Z/GaIZvXrWuuL6WGREQyToFARCTjshAIdqTdgJRk8bqzeM2QzevWNddRy48RiIjI4rLQIxARkUUoEIiIZFxLBwIzu9LMfmlmj5jZDWm3J25mtt7MfmhmD5rZA2b27rTblBQzy5nZT83s22m3JSlmttzM7jCzh4L/z1+WdpuSYGbvDf77vt/MdppZd9ptqjcz+6KZDZnZ/aHXVprZ983s4eDninqdr2UDgZnlgL8BXg2cD2wzs/PTbVXsZoD3u/t5wFbgTzJwzUXvBh5MuxEJ+yzwPXffAryYDFy/ma0F3gUMuPuFQA54U7qtisWXgSvnvXYD8AN33wz8IPi9Llo2EAAvBR5x9/3uPgV8Dbgm5TbFyt2fcvf7gn+PUrgxrE23VfEzs3XAa4Bb025LUsxsGXAZ8AUAd59y92Pptiox7UCPmbUDS4BDKben7tx9N/DsvJevAb4S/PsrwOvqdb5WDgRrgQOh3w+SgZtikZltBF4C/CTdliTiL4EPAPm0G5KgTcAw8KUgJXarmfWm3ai4ufuTwKeBJ4CngOPufle6rUrMme7+FBQe+oC67aLVyoHAFngtE3NlzWwp8A3gPe4+knZ74mRmVwFD7r4v7bYkrB24CPicu78EOEkdUwWNKsiLXwOcA6wBes3s2nRb1fxaORAcBNaHfl9HC3Yh5zOzDgpB4HZ3/2ba7UnAJcDVZvYYhfTfK83stnSblIiDwEF3L/b47qAQGFrdq4BH3X3Y3aeBbwIvT7lNSTlsZmcDBD+H6v
XBrRwI/gXYbGbnmFknhQGlXSm3KVZmZhRyxg+6+2fSbk8S3P2D7r7O3TdS+P/4bndv+SdEd38aOGBm5wYvXQH8IsUmJeUJYKuZLQn+e7+CDAySB3YBbwn+/Rbgznp9cMtuXu/uM2b2TuCfKMws+KK7P5Bys+J2CfBm4Odm9rPgtQ+5+3dSbJPE50+B24MHnf3AW1NuT+zc/SdmdgdwH4VZcj+lBctNmNlO4HJglZkdBD4K3Ax83czeRiEgvr5u51OJCRGRbGvl1JCIiESgQCAiknEKBCIiGadAICKScQoEIiIZp0AgUqGgyuujZrYy+H1F8Pvz026bSDUUCEQq5O4HgM9RmNdN8HOHuz+eXqtEqqd1BCJVCEp57AO+CLwDeElQ5Vak6bTsymKROLn7tJn9OfA94HcVBKSZKTUkUr1XUyiFfGHaDRGphQKBSBXM7DeBf0thJ7j3FqtCijQjBQKRCgVVLz9HYb+HJ4BPUdgsRaQpKRCIVO4dwBPu/v3g978FtpjZK1Jsk0jVNGtIRCTj1CMQEck4BQIRkYxTIBARyTgFAhGRjFMgEBHJOAUCEZGMUyAQEcm4/w+JdR0h5ZHtkgAAAABJRU5ErkJggg==\n", "text/plain": [ - "" + "
" ] }, "metadata": {}, @@ -161,11 +160,11 @@ "\n", "The first step in preparing these data is to create **independently sampled** **training dataset** and **test data set**. In most cases, an independently sampled **evaluation dataset** will also be used. In this case, no model improvement or comparison will be performed so this additional step is unnecessary. \n", "\n", - "If the same data are used to train and test a machine learning model, there is a high likelihood that the model will simply be learning the training data. In technical terms one can say that there is **information leakage** between the training and test processes. In this case, the model may not **generalize** well. A model that generalizes well produces consistent results when presented with new cases, never before encountered. conversly, a model with poor generalization might give unexptected results when presented with a new case. \n", + "If the same data are used to train and test a machine learning model, there is a high likelihood that the model will simply be learning the training data. In technical terms one can say that there is **information leakage** between the training and test processes. In this case, the model may not **generalize** well. A model that generalizes well produces consistent results when presented with new cases, never before encountered. Conversely, a model with poor generalization might give unexpected results when presented with a new case. \n", "\n", - "The random sub-samples of the data are created using a process called **Bernoulli sampling**. Bernoulli sampling aceptes a sample into the data subset with probability $p$. In this case, the probability that a given case is in the training dataset is $p$. The probability a case is in the test dataset then becomes $1-p$. \n", + "The random sub-samples of the data are created using a process called **Bernoulli sampling**. Bernoulli sampling accepts a sample into the data subset with probability $p$. 
In this case, the probability that a given case is in the training dataset is $p$. The probability a case is in the test dataset then becomes $1-p$. \n", "\n", + "The `train_test_split` function from the `sklearn.model_selection` module performs the required Bernoulli sampling. The `train_test_split` function samples the index for the array containing the features and label values. The code in the cell below performs this split. Execute this code." ] }, { @@ -190,7 +189,7 @@ "source": [ "#### Scale numeric features\n", "\n", - "Now that the dataset is split, the numeric feature column must be re-scaled. Rescaling of numeric features is extremely important. If The numeric range of a feature should not determine how much that feature determines the traning of the machine learning model. \n", + "Now that the dataset is split, the numeric feature column must be re-scaled. Rescaling of numeric features is extremely important. The numeric range of a feature should not determine how much that feature determines the training of the machine learning model. \n", "\n", "For example, consider a data set with two features, age in years, typically measured in a few tens, and income, typically measured in tens or hundreds of thousands. There is no reason to believe that income is more important than age in some model, simply because its range of values is greater. To prevent this problem, numeric features are scaled to the same range. \n", "\n", @@ -203,7 +202,7 @@ "$Min(X) $ is the minimum value of all samples,\n", "$Max(X) $ is the maximum value of all samples.\n", "\n", - "In general, Min-Max normalizaton is a good choice for cases where the value being scaled has a complex distribution. 
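As an aside from the notebook text above, the Bernoulli split it describes can be sketched with NumPy alone. All data here is made up, and note one nuance: `train_test_split` draws a fixed-size random split, whereas literal Bernoulli sampling makes an independent accept/reject draw per case.

```python
import numpy as np

rng = np.random.default_rng(42)

# Made-up dataset: 1000 cases of a single feature and a label.
features = rng.normal(size=1000)
labels = (features > 0).astype(int)

# Bernoulli sampling: each case enters the training subset with
# probability p, and the test subset with probability 1 - p.
p = 0.75
in_train = rng.random(1000) < p

train_features, test_features = features[in_train], features[~in_train]

# Every case lands in exactly one of the two subsets.
print(len(train_features) + len(test_features))  # 1000
```

Because the draws are independent, the realized training fraction is only approximately $p$; `train_test_split(X, y, test_size=0.25)` instead guarantees the subset sizes exactly.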
For example, a variable with a distribution with multiple modes might be a good candidate for Min-Max normalization. Notice that the presence of a few outliers can distort the result by giving unrepresentative values of $Min(X)$ or $Max(X)$.\n", + "In general, Min-Max normalization is a good choice for cases where the value being scaled has a complex distribution. For example, a variable with a distribution with multiple modes might be a good candidate for Min-Max normalization. Notice that the presence of a few outliers can distort the result by giving unrepresentative values of $Min(X)$ or $Max(X)$.\n", "\n", "\n", "For this lab you will use **Z-Score** normalization. Z-Score normalization transforms a variable so that it has zero mean and unit standard deviation (or variance). Z-Score normalization is performed using the following formula:\n", @@ -221,7 +220,7 @@ "source": [ "The code in the cell below uses the `StandardScaler` function from the `sklearn.preprocessing` package. This function computes the scaling coefficients for the training data. The resulting transformation is then applied to the training and test data using the `transform` method. \n", "\n", - "Notice that the scalling transform is computed only on the training data. The scaling transform should always be computed on the training data, not the test or evaluation data. \n", + "Notice that the scaling transform is computed only on the training data. The scaling transform should always be computed on the training data, not the test or evaluation data. \n", "\n", "Generally, a numeric label does not need to be scaled. Other transformations may be required, however. \n", "\n", @@ -247,12 +246,12 @@ "source": [ "## Train the regression model\n", "\n", - "With the data prepared, it is time to train a regression model. This is done with the `sklearn.linear_model` package. 
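The Z-Score transform discussed above can be checked with a small NumPy-only sketch. The age and income values below are made up for illustration; the notebook itself computes the same coefficients with `StandardScaler`.

```python
import numpy as np

# Made-up feature columns with very different ranges: age and income.
x_train = np.array([[25.0, 40000.0],
                    [35.0, 60000.0],
                    [45.0, 80000.0]])
x_test = np.array([[30.0, 50000.0]])

# The scaling coefficients are computed from the training data only.
mu = x_train.mean(axis=0)
sigma = x_train.std(axis=0)

# The same transform is then applied to both training and test data.
z_train = (x_train - mu) / sigma
z_test = (x_test - mu) / sigma

print(bool(np.allclose(z_train.mean(axis=0), 0.0)))  # True: zero mean
print(bool(np.allclose(z_train.std(axis=0), 1.0)))   # True: unit standard deviation
```

Computing `mu` and `sigma` on the training data only, then reusing them on the test data, is exactly the fit/transform discipline the notebook applies with `StandardScaler`.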
The steps for training most Scikit-Learn models are the same as used here:\n", + "With the data prepared, it is time to train a regression model. This is done with the `sklearn.linear_model` package. The steps for training most scikit-learn models are the same as used here:\n", "\n", "1. A model object is instantiated. Additional model specification can be performed at instantiation time.\n", "2. The model is fit using a numpy array of the features and the labels. In this case, there is only one feature so the `reshape` method is used to create an array of the correct dimension.\n", "\n", - "You can follow this link to find additional [documentation on linear regression models with Scikit-learn](http://scikit-learn.org/stable/modules/linear_model.html#ordinary-least-squares). \n", + "You can follow this link to find additional [documentation on linear regression models with scikit-learn](http://scikit-learn.org/stable/modules/linear_model.html#ordinary-least-squares). \n", "\n", "Execute the code in the cell below to instantiate and fit the model. " ] @@ -265,7 +264,8 @@ { "data": { "text/plain": [ - "LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)" + "LinearRegression(copy_X=True, fit_intercept=True, n_jobs=None,\n", + " normalize=False)" ] }, "execution_count": 5, @@ -295,8 +295,8 @@ "name": "stdout", "output_type": "stream", "text": [ - "[ 0.08516893]\n", - "[[ 1.03285118]]\n" + "[0.08516893]\n", + "[[1.03285118]]\n" ] } ], @@ -309,8 +309,13 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "These coefficients are close to the values used in the simulation, $0.0$ and $1.0$. \n", - "\n", + "These coefficients are close to the values used in the simulation, $0.0$ and $1.0$. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ "Next, you will plot the predicted values computed from the training features. The `predict` method is applied to the model with the training data. 
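The two steps above can be sketched as follows, using simulated data with the intercept and slope ($0.0$ and $1.0$) that the lab's coefficient printout suggests; the simulated data itself is an assumption, not the notebook's:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simulated single-feature data: y = 0.0 + 1.0 * x plus a little noise.
rng = np.random.RandomState(42)
x = rng.uniform(size=50)
y = x + rng.normal(scale=0.05, size=50)

# Step 1: instantiate the model object.
lin_mod = LinearRegression()

# Step 2: fit it. reshape(-1, 1) turns the 1-D feature vector into the
# 2-D (n_samples, n_features) array that scikit-learn expects.
lin_mod.fit(x.reshape(-1, 1), y)

print(lin_mod.intercept_, lin_mod.coef_)  # close to 0.0 and [1.0]
```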
A plot of the raw label values and the line of the predicted values or **scores** is then displayed. Execute the code and examine the results. " ] }, {
@@ -321,9 +326,9 @@
"outputs": [ { "data": {
- "image/png": "[base64-encoded PNG of the fitted-line plot, omitted]",
+ "image/png": "[base64-encoded PNG of the re-rendered fitted-line plot, omitted]",
"text/plain": [
- ""
+ ""
] }, "metadata": {}, "output_type": "display_data" } ],
@@ -348,7 +353,7 @@
"cell_type": "markdown", "metadata": {}, "source": [
- "The red line appears to be a good fit to the data. The errors betweene the scored values and the residuals appear to be minimal. However, an objective evaluation of model performance is require."
+ "The red line appears to be a good fit to the data. The errors between the scored values and the residuals appear to be minimal. However, an objective evaluation of model performance is required."
] }, {
@@ -361,7 +366,7 @@
"\n", "With the model trained, it is time to evaluate the performance. This is done using the test dataset, so that there is no information leakage from the model training. \n", "\n",
- "As a first step, a set of performance metric are computed. There are many possible metrics used for the evaluation of regression models. Generally, these metrics are functions of the **residual value**, or differnce between the predicted value or score and actual actual label value:\n",
+ "As a first step, a set of performance metrics is computed. There are many possible metrics used for the evaluation of regression models. Generally, these metrics are functions of the **residual value**, or difference between the predicted value or score and the actual label value:\n",
"\n", "$$r_i = f(x_i) - y_i = \hat{y}_i - y_i$$\n", "\n",
@@ -375,13 +380,13 @@
"- **Root mean squared error** or RMSE, \n", "$$RMSE = \sqrt{ \frac{1}{N} \sum_{i=1}^N (f(x_i) - y_i)^2}$$\n", "\n",
- "The root mean squared error is identical to the standard deviation of the residuals (agian, with a slight bias). Root mean square error is in the same units as the label values. \n",
+ "The root mean squared error is identical to the standard deviation of the residuals (again, with a slight bias). Root mean squared error is in the same units as the label values. \n",
"\n", "- **Mean absolute error** or MAE,\n", "$$MAE = \frac{1}{N} \sum_{i=1}^N |f(x_i) - y_i|$$ \n", "where $||$ is the absolute value operator. 
\n", "\n", - "The similar in interpretation to the root mean squared error. You may find this measure more intuative since it is simply the average of the magnitude of the residuals. \n", + "The similar in interpretation to the root mean squared error. You may find this measure more intuitive since it is simply the average of the magnitude of the residuals. \n", "\n", "- **Median absolute error**,\n", "$$Median\\ Absolute\\ Error = Median \\big( \\sum_{i=1}^N |f(x_i) - y_i| \\big)$$ \n", @@ -392,7 +397,7 @@ "$$R^2 = 1 - \\frac{SS_{res}}{SS_{tot}}$$ \n", "where, \n", "$SS_{res} = \\sum_{i=1}^N r_i^2$, or the sum of the squared residuals, \n", - "$SS_{res} = \\sum_{i=1}^N y_i^2$, or the sum of the squared label values. \n", + "$SS_{tot} = \\sum_{i=1}^N y_i^2$, or the sum of the squared label values. \n", "\n", "In other words, $R^2$ is measure of the reduction in sum of squared values between the raw label values and the residuals. If the model has not reduced the sum of squares of the labels (a useless model!), $R^2 = 0$. On the other hand, if the model fits the data perfectly so all $r_i = 0$, then $R^2 = 1$. \n", "\n", @@ -414,11 +419,11 @@ "What if you find your model gives an $R^2$ less than $0$? What can this possibly mean? This invariably means that there is a bug in your code and that the residuals of your model have greater dispersion than the original labels!\n", "****\n", "\n", - "The code in the cell below uses functions from the `sklearn.metrics` package to compute some common metrics. There is no fuction for $R^2_{adj}$ in `sklearn.metrics`, but the adjustments for degrees of freedom are easily computed. \n", + "The code in the cell below uses functions from the `sklearn.metrics` package to compute some common metrics. There is no function for $R^2_{adj}$ in `sklearn.metrics`, but the adjustments for degrees of freedom are easily computed. 
\n", "\n", - "You can follow this link to find additional [documentation on linear regression merics built into Scikit-learn](http://scikit-learn.org/stable/modules/model_evaluation.html#regression-metrics). \n", + "You can follow this link to find additional [documentation on linear regression merics built into scikit-learn](http://scikit-learn.org/stable/modules/model_evaluation.html#regression-metrics). \n", "\n", - "Execute the code in the cell below and examine the results. " + "Execute the code in the cell below, examine the results, and answer **Question 1** on the course page. " ] }, { @@ -430,12 +435,12 @@ "name": "stdout", "output_type": "stream", "text": [ - "Mean Square Error = 1.01784801883\n", + "Mean Square Error = 1.0178480188322825\n", "Root Mean Square Error = 1.008884541873986\n", - "Mean Absolute Error = 0.763059846639\n", - "Median Absolute Error = 0.603143592808\n", - "R^2 = 0.897901443698\n", - "Adjusted R^2 = 0.895774390442\n" + "Mean Absolute Error = 0.763059846639255\n", + "Median Absolute Error = 0.6031435928079797\n", + "R^2 = 0.8979014436983853\n", + "Adjusted R^2 = 0.7914582679796807\n" ] } ], @@ -443,7 +448,7 @@ "def print_metrics(y_true, y_predicted, n_parameters):\n", " ## First compute R^2 and the adjusted R^2\n", " r2 = sklm.r2_score(y_true, y_predicted)\n", - " r2_adj = r2 - (n_parameters - 1)/(y_true.shape[0] - n_parameters) * (1 - r2)\n", + " r2_adj = r2 - (y_true.shape[0] - 1)/(y_true.shape[0] - n_parameters - 1) * (1 - r2)\n", " \n", " ## Print the usual metrics and the R^2 values\n", " print('Mean Square Error = ' + str(sklm.mean_squared_error(y_true, y_predicted)))\n", @@ -481,14 +486,25 @@ }, { "cell_type": "code", - "execution_count": 12, + "execution_count": 9, "metadata": {}, "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "C:\\Users\\StevePC2\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; 
use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n", + " return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n", + "C:\\Users\\StevePC2\\Anaconda3\\lib\\site-packages\\matplotlib\\axes\\_axes.py:6521: MatplotlibDeprecationWarning: \n", + "The 'normed' kwarg was deprecated in Matplotlib 2.1 and will be removed in 3.1. Use 'density' instead.\n", + " alternative=\"'density'\", removal=\"3.1\")\n" + ] + }, { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAhoAAAGJCAYAAADMo5pWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzs3XmcW1d9///XlTT76tlX7/axEydOnAUHyEYgIVAopCxt\ngQKlC+VL+bbh229XSkvbHxRCSktL+fEDCmkLlBJKWMIWkpCQPY7t2Il94n3s2Tz7qhmNpPv740rj\n8XhszyLN1Ujv5+Phh607V7qfo5Glj875nHMc13URERERSYeA3wGIiIhI9lKiISIiImmjRENERETS\nRomGiIiIpI0SDREREUkbJRoiIiKSNko0REREJG2UaIiIiEjaKNEQERGRtFGiIZIixpivGGOOXeDn\nx40xX55x+9jM2/N4/DcYY7661DhXOmPMNmPMbmPMhDFmvw/X/zdjzNGLnPMeY0zcGLM6xddek3jc\n30jl44qkU8jvAESyiJv4c6Gfz/QmYHgBj3/nRR4/V3wUaAV+Gejx4fofA8ovcs7FXgsiOUOJhohP\nrLV7/Y5hhaoG9llrf+zHxa215+21EpFzKdEQ8Ykx5jjwoLX2NxO3fw34Y2AzMAr8GPi/1tpOY8xD\nwI2J82LAzdbaR4wxDcD/A7waqAH2AX9rrf3ejOuUAp/G60EpBr4PPAXcba0NJM55CDgFFAK3A49Z\na28zxqzF+wZ/C1ALDAA/Av7QWtufuO8x4N+ASuBdQAHwXeB3gQ8m/pQBDwC/ba0duMBzcsH2GGPi\neD0FTuJ5eK+19p45Hud87SkA/gb4VaAOsMDfWWu/OeO+O4BPAlfjDS8/BfyFtfapxM+/AtxorV2X\nuO0Afw78diLmnwCPzIrnrPskjq0BjgHvSbbBGHMZ8FfA9Ynn8zRwL97rYHKOdjqJ9vw60AR0AN8A\n/tJaGz3f8yyynJRoiKSYMSY4x2FnjmPTXevGmFcA9+B9yDwKtAB3AV8DbgY+APxH4j6/BxwwxtQB\nzwLjwJ8A/cB7gO8YY95prf164uG/C2wH/hQ4mXisj3Nu1/7bgX8H3gAEjDFFwMNAd+KaQ8DLgb8G\nxhKPk/RhvA/Yt+N9QH8CuApoB34LWAd8FugEfn+O54J5tmcn8K8znocL1Uqc1Z7Ese8A1wF/CRwA\n3gx8wxiTb639D2NMGV4i9UDiZ4XAR4AfGWNWW2tHOHdY5FOJNn0MeBp4W6L9M110KCWRZD0KPAG8\nG5jES5I+jPc8fnKOu/0J8H68YbVjwMvwfreTeL8nEd8p0RBJrbXA1Hl+dqEPmlfifXh/0lo7BWCM\n6QOuAbDWHjDGDAOutfaZxM8/gjeMsNNaeyrxOD8yxlTjJSlfN8a8CrgJeLO19r7E/X4E7Ae2zIph\nEnj/jOtvB0
4Av2GtPZE45+fGmJ2Jx5xpCHi7tTYOPGiMeQ/eN+xrrLWjicd7HfCKCzwHH75Ye6y1\nT89+Hi5gdnteA9wGvM1a+63EOT9N9Ph8whjzNeASvF6Jf7LWPpm430Hgd/B6ZUZmXsAYU4GXZNxl\nrf27GY/ZnLjWQlwG7AZ+xVo7njj2oDHmVrzne65E4wbg2Rm9Oo8aY8aBwQVeWyRtlGiIpFYH3jfo\nuXowvjfHsaSfA38HvGCM+RZwP/DTi9Qh3Ag8PuNDOek/gC8bY7bg9YZEkkkGgLXWNcb8F15R5UwH\nkh/KifP2AjcaYxxjzEZgE94H8VZgdq/N04kkI6kbGEkmGQl9wLaltMdae/AC95/trPYArwLiwP2z\nep2+B7wzEdt+vALTHxhjvok3fPUTa+2fnuca1+G9j35/1vFvssBEw1r7U7wkJWSM2QpsxEs+6oDe\n89ztIbwk6RG8nqsfWGs/t5DriqSbpreKpFbEWrvbWvvc7D9A5Hx3Snx7vh04Avwh3hh/uzHmgxe4\nVhXQNcfx5LFKvLqKvjnO6Z7j2OjsA8aYO/HqBCzwJbxkYIxzE6m5Zs+MzRn1+c2nPQsxuz3VeO95\no3i9Tsk//4WXgDRZa8fwepe+jzcEci/QY4z5V2NM3hzXWJX4e3Yi0LnAWEkkdJ/AGzLajzfUdAUQ\nZu7EFWvtJ4H/BRThDde8YIzZZ4y5aaHXF0kXJRoiGcJa+1Nr7e14H16/BDwP/KMx5qrz3KUfaJjj\neFPi7168gsiaOc6pv1g8xphfxxuy+DhQa61tsta+EXjpYvddpPm0ZykG8YY+rsKrI5n551rgcQBr\n7SFr7bvxnreX4xW6/i7woTkesxcvCZj9fFbPuu1ybi9Q6azbfwr8AV7xbKW1dq219m1cZAqvtfZf\nrbXX4D1378Erxr3XGKMea8kISjREMoAx5lPGmKcBrLUT1tr7gT/C+xBbkzgtNutuPwdeboxpnXX8\nnUCXtfZw4pyQMeYNs8558zzCegUwYK29e8YMk1K8b/zpeO+YT3uW+vilQGBWT9N2vCLckDHmV4wx\np40xddZa11r7lLX2g3hJypo5HvNxvB6Ht846/sZZt4eBGmNM/oxj13N23c4rgBestfckik5J1Hpc\nxnmeb2PMY8aYzwBYa3sTtRr/jNf7c7G1PkSWhTJekczwM+APE9Mg/wPvW+n/xRv2eDBxziCw0xhz\nM17R4N1400l/Zoz568S578ErHHwvgLX2UWPMA3g1Dn+OV9z5PrwPr4stKPU08H5jzF14dQzNwP/B\n+/Z+3imqS3A3XlJx3vYs0f14szq+a4z5G7xZJy/Dm51xv7W23xjzGN6H+n2JYYxhvKmw5cC3Zj+g\ntXYs8Vh/kyjCfBB4PV6P1Ezfxysa/ZIx5kvA5XgzRWYmj08Df2GM+WO8mSeb8Ho58oGS87Tp58CH\njTHdeElPC15R7cPJ5FDEb+rREEmti60M6s5121r7I+AdwKV4dQH/ifchd5O1NjmD4J/xagruB15r\nre3GK0bcBfwT8N94HzRvnLW2xNvxCgU/njhnAm+K6OwahrNit9Z+FW/K5lsT1/wrvOmuvwtUGWPM\nedp1oefivM9Poj0vn0d7Lvg45zvHWuvi1cF8He8D/Ed4s0nuAn4tcU4XXhHnIPBFvAThCuAOa+3M\ntTHcGY/7Cbwhj7cA9+EVld4569oP4CVpr8R7Lt+Kt67JzLUuPo73e/lQ4pwPc2bK86XGmGQPxcx2\nfQSviPi9wA8TbflhIhaRjOC4rv+r5CYW0fkccAfeHPpPW2vvPs+59+FV9bt43cou8IZEV7OIzJDY\na+M64DszF3wyxvw3sM5ae7VvwYlITsiUoZO7gB14XaRrgXuMMcettd+e49yteKvgPTjjWDq6cUWy\nQRz4Ct5QwJfwuupfi1ej8R7/whKRXOF7j4Yxphivcvs2a+2jiWN/DtxirX3V
rHPz8abMbU1BYZhI\nTjDG3Ii3EuaVQB7wIl6v4TcveEcRkRTIhB6N7XhxPDHj2C+AP5vjXIP3De2CWzSLyBnW2p/j7VUi\nIrLsMqEYtBHonbUBUDdQmFh6eKateAVy/2GM6TDGPGWMee1yBSoiIiILkwmJRjHengQzJW8XzDq+\nBW8FvB/iVYbfD3wvsduiiIiIZJhMGDqZ4NyEInl7fOZBa+3HjDH/aK0dShzal1g18XfwdjC8KNd1\nXceZczVfERERubAFf4BmQqLRjrdiXmDGpkwNQHjG+gHTZiQZSQfwNnqaF8dxGB4OE4vFL37yChUM\nBigvL1I7s0SutBNyp61qZ3bJtXYuVCYkGnvwFiHaSWKvAbylec/ZAtoY829A3Fr7vhmHr8DbE2Le\nYrE40Wj2vhiS1M7skivthNxpq9qZXXKlnQvle6JhrQ0bY+4BPm+M+U3OLKH7bgBjTD0wZK2dwFvd\n8OvGmIfxkpJ34O0P8Nt+xC4iIiIXlgnFoOAt17sLbxGuzwIfsdbel/hZJ952zVhr/wf4APAXwD68\nFUJvs9a2LXvEIiIiclG+92iA16uBt1b/ORsnWWsDs25/GfjyMoUmIiIiS5ApPRoiIiKShZRoiIiI\nSNoo0RAREZG0UaIhIiIiaaNEQ0RERNJGiYaIiIikjRINERERSRslGiIiIpI2SjREREQkbZRoiIiI\nSNoo0RAREZG0UaIhIiIiaaNEQ0RERNJGiYaIiIikjRINERERSRslGiIiIpI2SjREREQkbUJ+ByAi\n6RGPx+nv7/c7DKqqqggE9J1GJFcp0RDJUv39/fzkyYOUllb4FsPo6BC37txCTU2NbzGIiL+UaIhk\nsdLSCsorq/wOQ0RymPozRUREJG2UaIiIiEjaKNEQERGRtFGiISIiImmjRENERETSRomGiIiIpI2m\nt4pISriuS3vvGKPjU8Rdl7gLgdgEsbjrd2gi4iMlGiKyZIOjkzz5QjenB8Ln/OxEz0v86qvhio01\nOI7jQ3Qi4iclGiKyaLG4y74jfew/2sfsjgvHAdeFnqFJPnvvPrasruR9r7+E6opCf4IVEV8o0RCR\nRXFdlyf2d3G0YxiAQMDh8g3VXLJ2FcGA13PxwuFOjnROMDQ+xcG2Qe76xm7+9J1XUV6S72foIrKM\nVAwqIoty8MTgdJJRt6qIN75iLZdvqCYUDOA4Do7jsLq2kD96m+H2nasB6B4Ic/c39zA+EfUzdBFZ\nRko0RGTBuvrGedaeBqCiNJ9brmo5by9FfijAW2/aOJ1stHWP8k/3Pk9kKrZs8YqIf5RoiMiCjIan\n+PmeDlwX8kIBbr6ymbzQxd9K3nLjBm7Y3gjASycH+eqPDqY7VBHJAEo0RGTeXNfl8X1dTCZ6I67f\n3jjvegvHcfiN27Zw1eZaAJ54oZs9h3vTFquIZAYlGiIybx29Y3T1jwNw2YZqWmpLF3T/QMDhva/b\nQkWpl5z8+4+t6jVEspwSDRGZl7jrssv2AFBUEGTbuqpFPU5xYR6/casBYGBkkm89fDhlMYpI5lGi\nISLzcqxjmMHRCADbN9bMqy7jfK7cXMs1W+oAeHhPBwdPDKQkRhHJPEo0ROSiYvE4ew559RTlxXls\nbK5Y8mO+4zWbKSn0lvL56o8OEo3Fl/yYIpJ5lGiIyEXZtkHGErUUV26uJRBY+lLi5SX5/OotmwBv\nfY3H9nUu+TFFJPMo0RCRC4rF4uw70g9ATUUhq+sXVgB6Iddta6CltgSA7z52nEhUa2uIZBslGiJy\nQSe6R6ans25P8cZoAcfhzdevB7zC0Ieea0/ZY4tIZlCiISIXZNsGASgrzqOppjjlj3/FphrWNZYD\n8L3HjhOe1HRXkWyiRENEzmtgZIKewQkANrdWpmWbd8dxuOMGr1djeCzC9x49mvJriIh/lGiIyHkl\nezMCAYcNKZhpcj6XrF2Faa0E4NsPH9Yi
XiJZRImGiMxpKhqf3p11bUMZhfnBtF3LcRzuuNHr1RgL\nT/HwbtVqiGQLJRoiMqejHcNEYy7AdG9DOm1qqWRTi9dr8pOn27SuhkiWUKIhIudwXZeXTnrDJqvK\nCqipLFyW696+cw0A/SOTPHPw9LJcU0TSS4mGiJyjb3iSgZFJwOvNSEcR6Fx2bK6lscZbV+PHT7fh\nuu6yXFdE0keJhoic40SXV5sRcBzWNpYt23UDAYdfvmEDAG3doxxMFKOKyMqlRENEzuK6Lsc7RwBo\nri0hPy99RaBzueWaVkqK8gCvV0NEVraMSDSMMQXGmC8ZYwaMMe3GmDvncZ+1xpgRY8wNyxGjSK7o\nG5qY3tdkTcPy9WYkFeaHuGVHMwDPH+mjo3ds2WMQkdTJiEQDuAvYAdwEfAD4qDHmjovc51+B1C9T\nKJLjjnd5vRmBgENrXer2NVmIV1/TSijo1YU8+NwpX2IQkdTwPdEwxhQD7wM+ZK3da629D/gk8MEL\n3OcdgD/vgCJZzHXd6USjpbaEvJA/bxGVpQVcbeoAeHx/FxMRLeAlslL5nmgA24EQ8MSMY78AXjbX\nycaYauATwO8Ay1MKL5Ijeocmplfl9GPYZKabrvSGTyYiMZ58sdvXWERk8TIh0WgEeq21M7+ydAOF\niaRitruBr1hrDyxLdCI5JFkEGgw4tNT622m4qaWC5sQW8g8/166priIrVMjvAPDqLCZnHUveLph5\n0BjzauDlwG8v5YLBYCbkV+mTbJ/amR0W285QyCEQcAgG5tfx57ouJ7oTwyZ1pSlZcjwQcAiFHELz\nHIKZ3dZbrmrhnh9Z2k6P0nZ6NK37rSwnvXazS661c6EyIdGYYFZCMeP2ePKAMaYQ+Dzwe9bayFIu\nWF5etJS7rxhqZ3ZZaDuj0XGKivIpLp7932tunb1j08MmZk3VvO93IZHJfCorS1i1qmRB90u29fXX\nb+CbDx5mIhLj0X1dXL2tackxZRK9drNLrrRzoTIh0WgHaowxAWttcnODBiBsrZ25Ws+1wDrgXmPM\nzK9oPzTGfNVa+4H5XnB4OEwsi/dRCAYDlJcXqZ1ZYrHtHBwcIxyOkF8wu8Nwbi+19XvXCzjUVRQw\nPj6/+11IOBxhcHCMUGh+E8Tmaut12xp46Ll2Ht3TzltuXE9pYo2NlUyv3eySa+1cqExINPYAU8BO\n4PHEseuBZ2ad9xSwadaxw3gzVh5YyAVjsTjRaPa+GJLUzuyy0HZGoy7xuEssPr/ahlOnRwFoqC4m\nEHDmfb8LicddolF3wb+fmW29cXsTDz3XzlQ0zs93t3PbtauXHFem0Gs3u+RKOxfK90TDWhs2xtwD\nfN4Y85tAC/Bh4N0Axph6YMhaOwEcnXlfYwxAh7W2d3mjFskuo+EpBke9EcnmmoUNc6Tb6voyNjSX\nc6R9mEef7+TWa1qXbe8VEVm6TKlcuRPYBTwIfBb4SGI9DYBO4G3nuZ/K0EVSoKPnzOqbyZkemeT6\ny73ajI7eMY52DvscjYgshO89GuD1agDvTfyZ/bPzJkPW2uXdhEEkS51KLPNdUZJPWXF+yh43Ho/T\n39837/NDIYdodJzBwTGi0TPfI9bXBskLOUxFXR54+hgVr2xZUBxVVVUEApnyvUokt2REoiEi/onF\n4nT1eYlGqnszxkaHeGRPN3V185soFgg4FBXlEw5HiM+qEWlclU9bzyS7Dg1QWxaYXqL8YkZHh7h1\n5xZqamoWHL+ILJ0SDZEc1z0QJhrzPtTTMWxSXFJOeWXVvM4NBhyKiwvIL5g8pxh167pC2npOEo25\nDEyEsmZNDZFsp75EkRzXnqjPCAUd6lZl7j6FdauKKCv2prYebh/yORoRmS8lGiI57lSPN621qaZk\n3quI+sFxHDYmejG6+8OMjC9p3T4RWSZKNERy2PBYhJHxKSDzprXOZUNz+fROiofbNftEZCVQoiGS\nw9rP
mtbq7yZq81FcmEdToo7kSPsQcW20JpLxlGiI5LCOxGyTVWUFFBeujNrw5PDJ+ESUzt7xi5wt\nIn5ToiGSo+Jxl+5+74O6sTpzi0Bna6krpSDPW0JHRaEimU+JhkiO6h2amJ7WupISjWDAYX1TOQAn\nu0eYiER9jkhELkSJhkiOSi7S5Thk9LTWuWxs8YZP4i4c6xjxORoRuRAlGiI5qjMxbFJTUUReaGW9\nFawqK6C6vBDwhk9cFYWKZKyV9e4iIikRjcXpGZgAVtawyUwbW7zhk4GRSfqHJ32ORkTOR4mGSA46\nPRCenhrasEITjXWN5dMLjKkoVCRzKdEQyUFdfd6wSTDgUFtZ6HM0i5OfF2R1vbf2x7GOYWKxuM8R\nichclGiI5KBkfUbdqiKCK3j79GRRaCQap+30qM/RiMhcVu47jIgsSmQqRv/Qyq7PSGqoKqYksdDY\nEQ2fiGQkJRoiOaarf5zkHI2VWp+R5DjO9HbxHb3jjE1M+RyRiMymREMkx3Qlhk3yQgGqyldmfcZM\nG5rLp/99VButiWQcJRoiOSZZCNpQVUzAydxt4eerrDif+lVFgNbUEMlESjREcshEJMbgaASA+qoi\nn6NJnWRR6Mj4FD2DYZ+jEZGZlGiI5JDTA2d2O61fYcuOX8jq+jJCweSaGho+EckkSjREcsjpAe/b\nfl4wwKryAp+jSZ28UIA1DWUAHO8cZiqqNTVEMoUSDZEc0p1INGpXFWVFfcZMGxOzT6Ixl7ZubbQm\nkimUaIjkiKlonP5hb/2MZPFkNqlbVURZcR4ARzR8IpIxlGiI5IiewTDJCRnZVAiaNHNNja7+cUbG\nIz5HJCKgREMkZySHTYIBh+qKlb9+xlw2NJ1ZU0O9GiKZQYmGSI44nVioq6aicEXvb3IhJUV508uq\nH+0Y1poaIhkgO99tROQssXicnsT+JvVV2TOtdS7JotDR8BTd/VpTQ8RvSjREckDv0ATxuPftvi4L\nC0Fnaq0vJS/kvbUd1kZrIr5ToiGSA04nvtk7DtRWZneiEQoGWNforalxomtEa2qI+EyJhkgO6E6s\nCFpdXjj9bT+bJWefxOIu7f2TPkcjktuy/x1HJMfF4y49A159RrYPmyTVVBRSUZoPwInTEz5HI5Lb\nlGiIZLnB0UmmYt7wQa4kGo7jsCmx0drAaJROFYWK+EaJhkiWOz1jN9NcSTQA1jeVTy+z/owd8Dka\nkdylREMky/UkFuoqK86jMD/kczTLpzA/xOr6UgCeOzzAVDTmc0QiuUmJhkiW6xlM1Gdk+WyTuWxM\nDJ+MT8bY9VKPz9GI5CYlGiJZbCISYzQ8BXg7tuaaxupiigu8t7lH93b6HI1IblKiIZLF+kej0//O\n9vUz5uI4DmvqvH1dDpwY4HRimq+ILB8lGiJZrH/E683ICwWoTEz3zDWrawtJ1ITy6PPq1RBZbko0\nRLJYMtGorSzESX7a5pii/CBbWr2VQn/xfCexuFYKFVlOSjREstRUNM7gmDd0kovDJjNda6oAGBqL\n8PyRPp+jEcktSjREslR7X5jEPmo5n2hsaS2fXin0kT0dPkcjkluUaIhkqRPdXuGjA9RUFvobjM+C\nAYdXXtYIwPNH+xgY0f4nIssld1bvEckxJ7rHAKgsKyA/FPQ5Gv/E43H6+/vY1lrKD54A14WfPHmY\nW66sX9Y4qqqqCAT03U5yjxINkSzkui7HT3s9Grk+bDI2OsQje7qpq2ukpjyP3uEpHt3fQ2EotmwF\nsqOjQ9y6cws1NTXLcj2RTKJEQyQL9QxNMBr2CkHrVuX2sAlAcUk55ZVVbF0X4tG9nYxPxhmLFdJU\nU+J3aCJZT/14IlnoyKmh6X/neo/GTKvrSsnP8972Ds14jkQkfZRoiGShw+3eh2hBnkNpUZ7P0WSO\nYDDAhiZv/5OT3SOEJ6MXuYeILJUSDZEslEw0qkrzcnahrvPZ1OolGn
H3zPMkIumjREMky4Qno5zq\nGQWgqky9GbNVlhZQn9hg7tDJIVzX9TkikeyWEcWgxpgC4HPAHcA48Glr7d3nOfcdwF8CrcBzwB9a\na59ZrlhFMt3RzmGSn51KNOa2eXUl3QNhRsNTdPSO0Vxb6ndIIlkrU3o07gJ2ADcBHwA+aoy5Y/ZJ\nxphXAl8E/gq4BHgC+KExpnjZIhXJcMlC0GDAobIkI75LZJzV9WUU5ntri9i2QZ+jEcluvicaiSTh\nfcCHrLV7rbX3AZ8EPjjH6Q3Ax6y1X7fWHgc+BlThJR0iwpm6g5aaIoIB1WfMJRhw2Nji1Wq094wx\nGp7yOSKR7OV7ogFsxxvCeWLGsV8AL5t9orX2W9bajwMYYwqBO4Fu4MVliFMk48VdlyMdwwCsqVdH\n34VsbqkEwEVTXUXSKRMSjUag11o7c55ZN1BojKme6w7GmFcBo8BHgD+w1o6nP0yRzNfZOzY9ZXNN\nnRajupDS4jyaa73n6PCpQeJxFYWKpEMmDOAWA7N3OEreLjjPffbh1XT8EvBVY8wxa+3T871gMJgJ\n+VX6JNundmaHhbTzaOfI9L83NJXw/JGwr8MnjuMQDDjzjiG5F4j3dzztMWxdvYr2njHCkzHae0ZZ\n21iekmvOFgg4hEIOodDZv0u9drNDrrVzoTIh0Zjg3IQieXvOngprbQ/QAzxvjLkOeD8w70SjvDw3\nVkpUO7PLfNrZlpjWWl9VzOrmSg51DFNcfL58Pf2KivIJhvIWHENhYepmy1wohk1r83nyQDej41O8\ndGqYSzbUpuy6M0Um86msLGHVqrN7mfTazS650s6FyoREox2oMcYErLXJrzANQNhae1Y5uDHmaiBm\nrd094/CLwNaFXHB4OEwslppvS5koGAxQXl6kdmaJhbTzhSN9AGxoKmdwcIxwOEJ+gX9boofDEYIh\nGB+fXwyBQIDCwjwmJqaIx1PzO71YDJtbKnjupV7ae0bpPD1MRWnqE7NwOMLg4BihkFc3o9dudsm1\ndi5UJiQae4ApYCfweOLY9cBca2O8D1gHvHbGsauAXQu5YCwWJxrN3hdDktqZXS7WzpHxCF39Xifg\n+qZyolGXeNwl5mPtget6159/DF774vF4yuK+WAwbmivYfagX14UDJwa5ZmtdSq47UzzuEo265/z+\n9NrNLrnSzoXyPdGw1oaNMfcAnzfG/CbQAnwYeDeAMaYeGLLWTgBfAJ40xvw+8EPgXcA1ib9FctqR\n9uHpf29sruDc0ieZS1FBiNX1ZZzoGuFIxxBXbq4hlOVj7SLLKVP+N92J1yvxIPBZ4COJ9TQAOoG3\nASSGTN4M/BawF69n41ZrbeeyRyySYaY3UssPTs+mkPkxrd5U18hUnBNdIxc5W0QWwvceDfB6NYD3\nJv7M/llg1u37gfuXKTSRFSOZaKxvLCcYyJTvECtDfVUR5SX5DI9FsG2DbGiu8DskkayhdyORLBCN\nxTnW6Q2dbNSH5II5jsPmxK6uvUMT9A1P+ByRSPZQoiGSBU6eHmUqUYSmb+OLs6G5YnqtjZe0/4lI\nyijREMkCyWETgA3N6Vl0KtsV5AVZ21gGwLHOYSJTMZ8jEskOSjREssCRRKLRVFNCSQoXu8o1ZvUq\nAKKxM3vGiMjSKNEQyQLJHo0NTerNWIqaikKqywsBb/jEdbX/ichSKdEQWeH6hyfoH/bWzFAh6NKZ\n1d5U16GxCN39YZ+jEVn5lGiIrHAzu/g3tijRWKq1jWXk53lvjfakikJFlkqJhsgKd/iUN2xSUhii\nvqrY52hWvlAwwIYmL2Fr6x5hfCLqc0QiK5sSDZEVbro+o7mCgOPflvDZJDl84rpw+JR6NUSWIuWJ\nhjGmIdU+gQOLAAAgAElEQVSPKSJzi0zFaOv2lszW+hmpU16ST2O11zv00skh4j5uTCey0i0q0TDG\nxIwxtXMcXwscXmpQIjI/x7tGpn
clVSFoaiV7NcYno5zqGfU5GpGVa957nSR2Vn1n4qYD/I8xJjLr\ntCZgIEWxichFJNfPcBxYl1hsSlKjpbaU4oIQ45NRbNsgq+v1/IosxkJ6NL4DHAdOJG6fSvw7+ec4\n8BPgTakLT0QuJFmf0VpXSmF+RuyRmDUCgTP7n3T2jTM8Nvt7lYjMx7zfmay1/cBvAhhjAP63tVZL\n54n4xHXd6R4NDZukx8aWSvYe6cN1wbYNcs3WOr9DEllxFvUVyFr7XgBjTD2QjzeUMvPnbUsPTUQu\npGcwzPD4FKBEI12KC0Osri/jRNcIRzqGuHJzDaGgJuuJLMSiEg1jzHXAV4ENs37kAC4QXGJcInIR\nZ2+kpkQjXUxrJSe6RohMxTneOaJF0UQWaLGDup8FOoH/Awxd5FwRSYPD7d7IZUVJPjUVhT5Hk73q\nq4qoKMlnaCyCbRtUoiGyQItNNLYBV1prD6QyGBGZv5n1GY4W6kobx3HYvLqSZw6cpm94gt6hMDUV\nRX6HJbJiLHaw8SRQmspARGT+wjPWdtCwSfptaConFPSSOdumlUJFFmKxicbfAv9ojLnMGJOXyoBE\n5OKOdg6T3MFchaDpl58XZF1jOQDHO0eYjMR8jkhk5Vjs0MlfAKuBPTA93XWatVbFoCJplBw2CQUd\n1jSoc3E5mNWVHDo1RCzucqRjiEvWVvkdksiKsNhE429TGoWILEhyxsma+jLyQsrrl0NVeSG1lYX0\nDE5g2wbZumaVamNE5mGx62h8NdWBiMj8xF2XI4kZJ6rPWF5mdSU9g12MjE/R2TdOU02J3yGJZLzF\nrqPxlxf6ubX2Y4sLR0QuprN3jPBkFFB9xnJbU1/GM3k9TE7FsG2DSjRE5mGxQyfvneNx6oEp4LEl\nRSQiF3Sk48zK/+rRWF7BYICNLRW8cKyfUz2jjE9MUVyoeniRC1ns0Mm62ceMMeXAl4DHlxqUiJzf\n4VNefUZ1eSGrygp8jib3bG71Eg3X9RZNu3xDtd8hiWS0lC3an9hg7aPAh1P1mCJyrmQhqFao9EdZ\ncT6N1cUAHDo5SDw5z1hE5pTq3YEqgMoUP6aIJAyPR+jqHwdUn+GnTa3e29zYRJTO3jGfoxHJbKks\nBi0H3g48uKSIROS8Dp08s7XQJvVo+Ka1rpTC/CATkRgvnRyiuVZrmYicT6qKQQEiwM+AP1t8OCJy\nIYdOectfFxWEaNGHm2+CAYcNzTOLQqMUFy727VQku6WsGFRE0i+ZaGxsriAQ0GJRftrUMrModEhF\noSLnsegU3BjjALcBl+FNa30BeNBaq00ARNJgIhLlRJe3kdrmVg2b+K28JJ+G6mK6+sY5dHKQbeur\nCGilUJFzLLZGowr4MXAVMAQ4eDUau4wxr7HWantDkRQ70j48PcNhU4tqrjPB5pYKuvrGE0Wh4zTX\nagEvkdkWO+vkLqAYuMJau8paWwlcCRQCH09VcCJyhm0bALyN1NY1lvkcjQC01pdRmO/tNfPSSX2/\nEpnLYhONNwAfsNY+nzxgrd0L/D7w5lQEJiJnS36QrWss10ZqGcIrCvW2j08WhYrI2RabaOQBXXMc\n78IbQhGRFIrG4tMLdW1u1bBJJkkOYyWLQkXkbItNNHYBvzfH8Q8AuxcfjojM5Wj7EJGpOKD1MzJN\neUk+DVVnVgp1tVKoyFkWO+vkL4CHjDHXcWYTtVcCV+DNRBGRFHrhaB/gVV1rRdDMs6m1gq5+ryi0\nQ0WhImdZVI+GtfYJ4AbgJF5i8Vq8YtBXWGsfSl14IgLw4jEv0WipK9VuoRlodX0pBXle3UxyrRMR\n8Swq0TDG7ADuB45bay+11l4CtAP3GWMuTWWAIrnOdV1ePNYPaNgkUwUDgemi0JOnVRQqMtNiazTu\nBr7L2cuNb8BbW+MflhqUiJzR0TvG8FgEUCFoJkv+blwXjqgoVGTaYhONq4C/tdZGkgcSK4J+HHhZ
\nKgITEc+BEwPT/zZKNDJWeUk+9VVFgDf7REWhIp7FJhojwPo5jjcBk4sPR0RmSyYajdXFVJQW+ByN\nXEhyquvI+BTd/WGfoxHJDIuddXIv8DljzO8BTyWOXQP8C/DtVAQmIl59xsFEorF1zSqfo5GLWV1f\nSl4owFQ0zqFTgzRUF/sdkojvFtuj8SfAYeCneHudDAEPAC8Cf5Sa0ESkvXeMkfEpALaurfI5GrmY\nUDDA+iavKPRE9yiTU9pjUmSx28SPAa8zxmzmzO6tB6y1h1IZnEius21npkpuWa36jJVgU0sFtm2Q\neNzlWMcwW9QTJTlu0dvEA1hrXwJeSlEsIjJLctiktb6MitICotG4zxHJxVSVF1JdXkDf8CSHTg1h\nlCBKjlvs0ImIpFncdbGJjdQu21DtczSyEBsTRaEDI5P0Das+XnKbEg2RDNXeM8Zo2KvPuHxjrc/R\nyEKsaywjGHAAOKyVQiXHKdEQyVAH286sn3HpevVorCT5eUHWNJQBcKxjhGhMa2pI7lpSjUaqGGMK\ngM8BdwDjwKettXef59zXA38LbASOAB+x1n5vuWIVWS7J+ozm2hIqywoYGNCy1ivJppYKjnYMMxWL\n09Gv4RPJXZnSo3EXsAO4CW+r+Y8aY+6YfZIx5nK8NTy+CGwHvgB8yxhz2fKFKpJ+cdflpUR9htbP\nWJnqVhVRXuxtgHfitBbvktzle6JhjCkG3gd8yFq711p7H/BJ4INznP5rwM+stf9irT1qrf0c8BDw\ntuWLWCT9Tp0eZSyxMZcSjZXJcRw2JpaM7xuJcnpwwueIRPzhe6KB1zMRAp6YcewXzL1nylfwFgub\nTVtaSlZ58bg3bOKA1mFYwTY0leN4NaE8bfv9DUbEJ5mQaDQCvdbamQPQ3UChMeasCjjr2Ze8ndiS\n/ha8VUlFssYLx70PpdX1ZZQV5/scjSxWUUGI1rpSAHYdGiAa0zooknsyoRi0mHM3YkvePu8OUsaY\nGrx6jUettd9dyAWDwUzIr9In2T61c2WKRGPT9RmXbahedDtDIYdAwJmeZukHx/GuP98YAoHAjL9T\n86G80BhSbXNrJW3do4xNxNh3rJ9rttRl7Wt3NrUzuyy2fZmQaExwbkKRvD0+1x2MMfV4+6y4wFsX\nesHy8qKF3mVFUjtXpj0vnWYqsQLoddubptu30HZGo+MUFeVTXOzfjq9FRfkEQ3kLjqGwMM/3GFJl\n05p8HtvXSXgyxuP7u7j1unXTP8u21+75qJ25LRMSjXagxhgTsNYmv8I0AGFr7Tkr3RhjmoEHgRhw\nk7W2b6EXHB4OE8viLsxgMEB5eZHauUI98XwHAPl5ARoqChkeDi+qnYODY4TDEfIL/JtaGQ5HCIZg\nfHx+MQQCAQoL85iYmCIeT83vdKExpMOa2gIOnhrnuYOnOXy8j7qq4qx87c6Wrf9HZ8u1di5UJiQa\ne/A2ZdsJPJ44dj3wzOwTEzNUfpQ4/2Zrbc9iLhiLxXNizwi1c2Xad8TLnbesXoUD029cC21nNOoS\nj7vE4v4tFuW63vXnH4PXvng8nrK4Fx5D6rXWFGJPjeMCP9/dzh03bQCy77V7PmpnbvN9QMlaGwbu\nAT5vjLnaGPMm4MPAZ8AbJjHGFCZO/3NgHfAeIJD4Wb0xptyH0EVSbmh0kpOnRwG4VNvCZ42SwiAb\nm72i0Eef7yTuaqVQyR2+JxoJdwK78IZEPou32ud9iZ91cmadjDuAIuApoGPGn88sa7QiaZKc1gpw\n6TolGtnkms3e77NveIIXj2mqq+SOTBg6SfZqvDfxZ/bPAjP+vXU54xJZbvsTH0CrygporC72ORpJ\npW1ryykpDDE2EeXhPR1cf9Vqv0MSWRaZ0qMhkvNc1+XFxPoZl66rwnH8m5YqqRcKBnj5tkYAdh08\nzdCo9j+R3KBEQyRDtPeMMTQWAVSfka2u3+4lGrG4y0O7Tvkc
jcjyUKIhkiGSwyYOcMlaLTuejVpq\nS9nQ5NWu/+SpE7gqCpUcoERDJEM8f6QXgDUNWnY8m12/vQmAk90jHGkf9jkakfRToiGSAcYmpnjp\n5BAAV2ys8TkaSadrttRRkBcE4Od72n2ORiT9lGiIZIB9R/qm11a4YpMSjWxWVBBi56X1ADz5Qjfh\nyehF7iGysinREMkAew57wyZV5QXTu31K9rrximYAJqdiPHPwtM/RiKRXRqyjIZKN4vE4/f0XX5gp\nGotP12eY5lL6+s7evicUcohGxxkcHCManX/xYH9/H66Py27L+W1oLmd1QxltXSM8sreDGxJ1GyLZ\nSImGSJr09/fzkycPUlpaccHzeoYiTEQS+yO4UR7f33nWzwMBh6KifMLhCPEFJA5dHW2UVlRTQfWC\nY5f0chyHW1+2hi/et5+jHcOc6hmlpVY9WZKdlGiIpFFpaQXllRdeE8N2el3noaDD+tY6gsGzRzSD\nAYfi4gLyCyYXtDHYyPDAxU8S39y0o4V/+94LxOIuj+7t5NdevcnvkETSQjUaIj5yXXd6E7WmmpJz\nkgzJXhWlBVy1pQ6Ax/d3MqVdPyVL6V1NxEdDoxFGw1MA6jrPQTdd4dVmjE1E2WVVFCrZSYmGiI9O\n9oxO/7u5tsTHSMQPl6yroqaiEICH93T4HI1IeijREPHRyW4v0aitLKSoQCVTuSbgONx0pTfV9aWT\ng7T3jvkckUjqKdEQ8cnIeITeoQkAVteX+RyN+OWVlzcSDHg79T68WyuFSvZRoiHikxNdI9P/Xtug\nRCNXlRfnc/V0UWgXk5GYzxGJpJYSDRGfHE8kGrWVRZQU5fkcjfgpWRQanozy9IFun6MRSS0lGiI+\nGB6L0D88CcDaRvVm5LrNrZU01XjFwA9p+ESyjBINER8c7zyzPfga1WfkPMdxpns1jneNcKxT28dL\n9lCiIeKD5LBJQ1UxxYWabSLw8m0N5Od5b8kPPadeDckeSjREltnAyCSDoxFARaByRnFhHtdd2gDA\nky92Mzwe8TkikdRQoiGyzJK9GY4Dqxu0GqiccctVLYC3o+8jWsBLsoQSDZFl5LrudH1GY3Uxhfka\nNpEzWmpL2bK6EvCKQqMx7X8iK58SDZFldHogzMi4t7fJ2oZyn6ORTPTqq1sBb4jtuZd6fI5GZOmU\naIgso0OnhgDICwZYo/oMmcMVG2uoLvf2P3lg1ymfoxFZOiUaIsskMhWbXg10XVMZeSH995NzBQLO\ndK3G4VNDZ60gK7IS6Z1OZJkc7RgmFncB2NRS6XM0ksmu3944PdX1p8+e9DkakaVRoiGyDFzXnR42\nqSovoDqxNbjIXEoK83j5tkYAnnqxm/7hCZ8jElk8JRoiy6BveJKBEW/J8Y0tFT5HIyvBbde04gCx\nuKteDVnRNLdOslY0GvX1+rHYmV04D50cBCAYcFjfqNkmcnH1VcXsMLXssj08vKeDN7x8LcWF2nxP\nVh4lGpKVRkaG+e5Pn6SkzL/eg56O49SvuYSpaJzjnV5B35qGMvLzgr7FJCvLa1+2ml22h8lIjIf3\ndPC6nWv8DklkwZRoSFZyXZeSiloqq+t8i2F4sBfwZg5MJRZe2tSqYROZvw1NFZjWSuzJQX76zEle\nc3WrZivJiqNXrEgaxeMuLx7vB6C6opC6yiKfI5KV5vadqwEYGovwxAtdPkcjsnBKNETSqL1/krEJ\nr1Zk27oqHMfxOSJZaS5bX01zbQkAP3yqjVhcy5LLyqJEQyRNXBcOdYwDUFacR2u9NlCThXMch9e9\nzKvN6O4f56kXu32OSGRhlGiIpMlAOMDwuDfz5JK1VQTUmyGLdO0ldTRUFQPw3ceOq1dDVhQlGiJp\n0jbo1VoX5gfZ0KwprbJ4wUCAN75iLeBtzPfEfvVqyMqhREMkDXqHJhgMe9NYt6xZRSio/2qyNNdu\nraexOtmrcUxbyMuKoXc/
kRRzXZfdie29gwEwrdrXRJYuEHD45VeuA7xE9vH9moEiK4MSDZEU6+gd\np7PPKwLd1FRMQb4W6JLUuHpL3fQMlO89dpypqHo1JPMp0RBJobjr8lyiNyM/6LKxsdjniCSbBByH\nNyV6NfqGJ3hgl/ZAkcynREMkhY51DE9vnra2aopQUDNNJLV2bK5lY7O3wuz3HjvO0FjE54hELkyJ\nhkiKRGNxdh/ylh2vKMmnoSx2kXuILJzjOPzaqzcBMBGJ8T+PHPU5IpELU6IhkiIvHh9gPLEK6A5T\nS0CdGZIm6xrLecW2BgAe3dtBW/eIzxGJnJ8SDZEUGByd5PnDfQDUryqiJVGwJ5Iud9y4gYK8IC7w\njZ8dwnVdv0MSmZMSDZElirsuj+/rIu66BAIOOy+t154mknarygp43XXe0uQH2wZ56oAW8ZLMpERD\nZIkOHB+gd2gCgCs2VlNRWuBzRJIrbrumlZqKQgC+9tNDKgyVjKREQ2QJhsci7EkUgFZXFHLJ2iqf\nI5Jckp8X5L2v2wrAaHiK//yJ9TkikXMp0RBZpFjc5RfPdxKLuwQcePm2BgKqAJVltnXNKm7e0QzA\ns7aHZw6e9jkikbMp0RBZpGcOnJ4eMrl8Yw2ryjRkIv54600bpodQ/v3HluFxDaFI5siIRMMYU2CM\n+ZIxZsAY026MuXMe93mlMebIcsQnMtvhU0O8dHIQgKaaErat15CJ+KcwP8R7bt8CeEMoX/zei8Tj\nmoUimSHkdwAJdwE7gJuAtcA9xpjj1tpvz3WyMeYy4L+B8HIFKJLUOzTBky96Ff6lRXlcf3kjAc0y\nEZ9dsraKW3a08LPnTrH/WD//8+hRfuXGDQDE43H6+/uXPaZQyCEaHWdwcIxIxFvALhDw9/ttVVWV\n7zHkGt8TDWNMMfA+4DZr7V5grzHmk8AHgXMSDWPM7wKfAo4AFcsZq8hoeIqHd7cTj7sEAw43Xdmk\nTdMkY7z9lo2c6B7hcPsQP3jiBGsbyrjK1NHf389PnjxIaenyvmUGAg5FRfmEwxE6Tp0gEMqjrq5x\nWWOYaXR0iFt3bqGmpsa3GHKR74kGsB0vjidmHPsF8GfnOf824F1AJfDR9IYmckZ4MsoDz5ycXv3z\num0NVJUX+hyVyBmhYIAPvHkbf/2VZxgajfDFHxygobqEAqC0tILyyuUd4gsGHIqLC8gvmGRosB8n\nmL/sMYj/MqH/qBHotdZGZxzrBgqNMdWzT7bW3mGtvW/ZohMBIlMxfrbrFMPjUwDs2FzD+qZyn6MS\nOVdlaQH/602XEQw4TEZi/MM399A3POl3WJLDMqFHoxiY/b8geTstZfzBYCbkV+mTbF8utzMUChAM\nOgRTMN10Khrnoefa6U+8WV+2vortGy/e9RoMBggElh5DcjzZ+zs+7/s5jnftVDwHi7XQGBbb1lTG\nkA6BgEMo5BAKnf2aTdf/0S1rV/Hu27fw5R8coH94ki/cf5RrN5ezapmfg5m/z0z8PaRKrr3nLlQm\nJBoTnJtQJG+Pp+OC5eVF6XjYjJPL7QwEohQV5lNcvLRcdSIS5adPttE94NUdX7q+muuvbJ7XEuNF\nhfkUFS09hqTCwrwFnV9UlE8wlJey6y/GYmNYaFvTEUMqRSbzqawsYdWqs/fASef/0Te/ajOBUJAv\n3refgdEpfvHiIL/SUEdZcX7arnk+hYV5Gf17SJVcec9dqExINNqBGmNMwFqb/ArTAISttYPpuODw\ncJhYLDXfljJRMBigvLwop9s5PDxGeCJCwfjiu4zHJ6b48dMnGRz11iTY2FzO1aaGcHh+axSEJyLk\nhyPkFyyt2zoQCFBYmMfExBTx+Px/n+FwhGAIxpfwHCzVQmNYbFtTGUM6hMMRBgfHCIWKgeX7P3rD\nZQ2MjU3y9QcOMTYR494HD3HzjmZqK5fnA3Hm7zMTfw+pkmvvuQuVCYnGHmAK2Ak8njh2Pf
BMui4Y\ni8WJRrP3xZCUy+2MRuPEYi6xRa4lMDwW4YFnTzEa9moytqyp5JotdbguxOa5S2YsFiceX3wMZ3ht\ni8fjC3os1/WuvfTrL97CY1hcW1MbQ+rF4y7RqHvO63Q5/o++5upWhoZGuP+ZLsYmotz/RBvXXlLH\nppaKZdj878zvM5N/D6mSK++5C+X7gJK1NgzcA3zeGHO1MeZNwIeBzwAYY+qNMSrtl2XT1TfO/U+e\nmE4yLt9QzTVb6rQjq6xYN22v4+qNZYSCDnHX5ckXunl8XxcTkejF7yyyRL4nGgl3AruAB4HPAh+Z\nMbOkE3ibX4FJbjl0cpCfPnuSyFQcB7h2ax1XbKpRkiErXktNIbfvXENZsVf/cqRjmO88cowXj/f7\n2ssg2S8Thk6SvRrvTfyZ/bM5kyFr7VeBr6Y5NMkRcddl18EeDpwYACAvGOCGKxppri31OTKR1FlV\nVsDrr1vD0wdOc7RjmEg0zrMHe7Btg5jWStY3l1OYnxEfC5JF9IqSnBeJxnh0byftPWOAt6z4q3Y0\nU6lN0iQL5ecFeeXljZjWSp456G0MODI+xbO2h+de6qGlrpSmmhLqVhVRUZKv3jxZMiUaktNGx6d4\n8LlT0zNL6lYVcdOVTfpWJ1mvdlURt+9czfHOEQ6cGKB3aIK4C23do7R1jwKQnxegoiSfkqI8Sgrz\nKMgPEgo4hIIBAgFw3cQfANfFJXnbO+g43jTveCzG4Eic/LwY5eEpigtCBHxcT0OWl95NJWf1DIZ5\n6Ll2JhKbPW1oKmfntnqC2nBJcoTjOKxrKmddUzmDo5McPjVEW/fodCF0ZCpOz+AEPYMTKbriJM8e\nPYoDlBTlUVVeQFV5IdXlhdRXFRHK8gWvcpUSDclJxzqHeWxf1/RW2js213Dpuip1E0vOqiwt4Oot\ndVy9pY7xiSlOD4TpGZxgbGKK0fAUY+EoU9EYqagbdfE2KBwNT033ngQDDg3VxTTXlrCmvoyiAn08\nZQv9JiWnuK7LvqP97DnUC3hvbtdvb2R1fZnPkYlkjuLCPNY25rG28dz9fOJxl2jcWyPGcRwcBxyS\nfwOOg+P9RTDgUFCYz/BwmBMnjjHlhigqq2Y8PMXQWIS+4QkGRyaJuxCLu7T3jNHeM8azB07TWl/G\n5tYKGqqK9QVghVOiITkjFnd5cn8XRzqGASgqCHLzjhZqKrRMi8h8BQIO+YHgvM51HK+eo7AgRFG+\nQ3EwSFPD2Ul9LO7SOxjmVM8Y7T2jDI5GiLtwomuEE10jVJTms31DNWsaypRwrFBKNCQnTEZiPLy7\nfXrPklVlBbxqRzMlRanbU0NEFi4YcKivKqa+qpirTC2Do5McOjnEkY4hIlNxhkYjPLK3k8ojfVyx\nqYbWulIlHCuMEg3JeiPjEX727Jkt3ptrSrjhiibyUryDo4gsXWVpAddsrWPH5hqOdg6z70g/o+Ep\nBkcjPLy7g6aaYq7dWk95yfJvDieLo0RDstrAyCQPPHuS8KQ3s8Ss9vYs0dQ6kcwWDAbY1FLJhqYK\njrQPsfdIH+MTUTp6x/nuY8e5bH0V29ZX+7rtvMyPEg3JWgOjUZ463EZkytvk6GpTyyXrqnyOSkQW\nIhBw2NRaybqmcvYd6eOFY/3E4y57D/dx8vQo11/eREWpejcymfqOJSsdah/hcTsyvWfJy7c1KMkQ\nWcFCwQBXbq7ll16xlvpV3lbl/cOTfP/x47x0chB3nrsqy/JTj4ZknReO9/OF7x8iFoeA43DDFZq+\nKv6Kx+P09/dN3w6FHKLRcQYHx4hG0/8B2d/fh5slG6dVlhZw67WtvHh8gN0v9XizyV7opqtvnOu2\nNaj2KgMp0ZCs8sKxfv7p3ueZirkEA3Dzjmaaakr8Dkty3NjoEI/s6aauzlvqPhBwKCrKJxyOTC8a\nl05dHW2UVlRTQXXar7UcHMfh0nVVNFQX84u9nQyNRT
jeNcLg6CQ372imrFhDKZlEiYZkjf3H+vin\nb+0jGouTF3K4ekOJkgzJGMUl5ZRXesN3wYBDcXEB+QWTy7JF+8jwQNqv4Yfq8kJed90antjflUg0\nIvzgiRPcsL1J//cziPqYJCvYtgE+e6+XZOTnBfid12+itlxrZIhku7xQgOu3N3KVqcXB25/lZ7tO\ncejUkN+hSYISDVnxjrQP8ZlvPc9UNE5+KMAfvGU7m5pVkyGSK5JDKbdc3UJeKIDrwhP7u9hzqFdF\nohlAiYasaMc6hvjU13czGYkRCjp88I7L2LJmld9hiYgPmmpKuP1lqykp9KoCnj/Sd9bmieIPJRqy\nYnX3j/OX/+8TjE9ECTgOv/fL29i2PjuK3URkcSrLCrh95xqqygsAONoxzM/3dBCLx32OLHcp0ZAV\naWh0kk99bTeDo5M4wG+9YStXbq71OywRyQDFhSFuu3Y1jdXFAJw8PcqDu9qJxtSz4QclGrLihCej\n/MN/7+X0oLdB2jtvM+y8pMHnqEQkk+SFArxqRzMtdaUAdPaN88TBISYiMZ8jyz1KNGRFmYrG+edv\n76OtexSAt96yiddc0+pzVCKSiYLBADdd0cTaRq84vG9kii/cf5TR8JTPkeUWJRqyYsRdly/94EUO\nnPDWBLhhexPvun2rz1GJSCYLBBxeeXkjG1sqADjVG+bvv/YcQ6OTPkeWO5RoyIrgui7feOAQTx84\nDcD2DdW89/VbcBzt3CgiFxZwHK67tJ4NDd4eKe09Y3z8P5+jb2jC58hygxINWRHuf/IED+w6BcCG\n5nLe/6ZtBAN6+YrI/DiOw7Y1Jbz6yjoATg94PRu9iVovSR+9U0vG+8Xzndz786MANFYX87/fsp2C\nvKDPUYnISuM4Drde1cBbb9oAQO/QBH//tec4PTDuc2TZTYmGZLS9h3v5yg8PArCqrIA733YFpUVa\nWlxEFu/2nWv41VdtBKBveJK//9puuvuVbKSLEg3JWEfah/jX7+wn7roUF4S4823bqa4o9DssEckC\nt167ml9/9SYABkYm+cTXnqOzb8znqLKTEg3JSB29Y3zmv/cSicbJCwX40Fsup7m21O+wRCSLvPrq\nVl/rSpIAABUOSURBVN5162YAhkYj/P3XdtPeq2Qj1ZRoSMbpHQrz6f/aw9hEFMeB97/xUja3Vvod\nlohkoZt3tPDu1xocYHgswie/9hynTo/6HVZWUaIhGWVoLMKnv7GHgRFvjvtv3Ga0tLiIpNWNVzTz\nntdtwQFGxqf45Nd309Y94ndYWUOJhmSM8Ykp7v6vPXQPeNPN3nrzBm68otnnqEQkF1x/eRPv+6Wt\nOA6Mhqf41Nd3c6JLyUYqKNGQjDA5FeMz33qek4kuy9dft4bbX7bG56hEJJe8fFsjv/2GSwg4DmMT\nUT719d0c6xz2O6wVT4mG+C4ai/Mv/7OPw6eGALj5ymbuuGG9z1GJSC7aeUkDv/vLlxJwHMYno9z1\njd3YtgG/w1rRlGiIr+Jxly9870X2H+0H4GWX1POOWzdraXER8c01W+r4vTdtIxhwCE/GuPube3n+\nSK/fYa1YSjTEN3HX5Z4fH+TZg97+JZdvqOZ9r99KQEmGiPjsKlPLh95yOfmhAFPROJ+9dx9Pvtjl\nd1grkhIN8UXcdbnnRwd5ZG8nAJtbK/nAm7YRCuolKSKZ4bL11dz59isoKggRi7v8f999kZ8+c9Lv\nsFYcvavLsovHXf7t/gPTScb6pnI+9CuXk6/9S0Qkw2xureSPf/1KyovzcIGv/+wQ3/jZIeKu63do\nK4YSDVlWsXicL99/gMf2eV2QG5sr+PDbr6C4MORzZCIic1tdX8afvesq6ld528z/5JmTfP47+5mK\nxnyObGVQoiHLJjIV41++vZ//v707D6+iuhs4/r33hiQkLEIQEozs+hPZF3EBi2jdX+sKWrWvFtu+\nWtGqbW31seqjfa
[base64-encoded image/png plot output elided]
KCqnUDDkJ+vHPxAROaIyEsictTfpfR/gclDfqZ20P3jwzxOPI2sgRqcNw9YMaTY3gRk+V//JPAxoFxE1orIqpNkOqKqg4tl+TDbDUtEVojImyJSLyItwGf5x/cLVe0AbvC/XiMifxKReYEex7iLFQUzrlT1oKreCGQC3wWeEZEE/vFbPkA1vg/HE3KBPnwf1DVAzokXRCQOSB96uCGPfw7sA2arajLw/wFy+r9NwFkDNThvJbB2cLFV35lLnwNQ1c2qejW+9/E54Olh9lcDTBORwb9j7qD7HUD8iQciksX7PYmvO226qqYAv+Ak75eqvqqqH8HXdbQP+OXov65xIysKZlyJyM0ikqGqXqDZ/3Q/UI+vC2LGoM1/C/yziBSISCK+b/a/U9U+4BngShE51z/4+y1G/4BPwtf33e7/Jvu5MfvFRs56Ol4C5ojILSIS5b8tF5GzRCRaRG4SkRRV7cX3O/UPs4/1+ArTl0QkUkQ+ARQNen0HsEBEzhaRWHzjMIMlAY2q2iUiRcCa4YKKyBQRucpf3LvxjfUMl8eEACsKZrxdCuz2n5HzY2C1qnb5u3++A7zj7y5ZCfwKeBzfmUmlQBfwRQB/n/UXgafwfSNuA+rwfSidzFfxfbC14fsm+7sx/L1OmvV0qGobcAmwGl8r5Ci+llWMf5NbgDJ/N9hngZuH2UcP8Al8A8pN+Lp4/jjo9QPAvcDr+M54WjdkF58H7hWRNnxjNsO1RsD3OfIVf85GfOMOI3UhGheT93c3GhOa/N/Om/F1DZU6nceYUGUtBROyRORKEYn3d1v8AHgP3+mvxpjTZEXBhLKr8XVZVAOz8XVFWdPXmDNg3UfGGGMGWEvBGGPMgJCbeGvy5Mman5/vdAxjjAkpW7ZsOaaqGaNtF3JFIT8/n+LiYqdjGGNMSBGRgK5mt+4jY4wxA6woGGOMGWBFwRhjzAArCsYYYwZYUTDGGDPAioIxxpgBVhSMMcYMsKJgjDFmgBUFY4wxA0LuimYTvp7cWBHU/a9ZkTv6RsZMcNZSMMYYM8CKgjHGmAFWFIwxxgywomCMMWaAFQVjjDED7OwjE5aaO3v4y55aalu76PcqcVERZCbF8KGzMhERp+MZ41pWFExY8aqy7uAx3thXC8DMjEQiPMLRli7ufKyYovw07vvkImZkJDqc1Bh3sqJgwspfdh/l7YPHOCs7mSsWZ5MaHw1Av1fxeOAHr+7nxl9u4PefOZfc9HiH0xrjPjamYMLG1oom3j54jBUFadyyMm+gIABEeISbVuTx27tW0t3nZc3DG6huPu5gWmPcyYqCCQsVDR08u+0IMzISuGLx1JNuNy8rmcduL6Kls5dP/3ozXb3945jSGPezomBCXr9X+cO2IyTHRrKmKJcIz8gDyYtzJvGTNUvZX9vGj984OE4pjQkNVhRMyCsub6S+rZvLF2UTHx3YMNkH52ZyQ+F0Hlx7iG0VTUFOaEzosKJgQlp3bz+v760jPz2Bs7KTT+ln/+2Ks5iSHMtXf7/DupGM8bOiYELa2wfr6eju42OLsk75+oPk2Cju++RiDtV38Oi7ZcEJaEyIsaJgQlZndx/rSo6xOCeFnNTTO730wjkZfHBuBg+8WUJLZ+8YJzQm9FhRMCGruLyJ3n7lg3Mzz2g//3LpPNq6+3hgbckYJTMmdFlRMCHJq8qG0gZmTE5gSnLsGe3rrOxkrjl7Go++U0ZNi127YCY2KwomJO2raaO5s5dVM9PHZH///JE5qMJP3rDWgpnYrCiYkLThcAMpcVHMyzq1M45OZnpaPNcV5vCHLVXUtXaNyT6NCUVWFEzIqWvtoqS+nRUFaaNeqHYq7rpgBn1eL796p2zM9mlMqLGiYELOloomPAKF+Wljut+89AQuW5TNbzaU09plZyKZicmKggkpXlV2VrUwOzOJxJixn+T3sxfMpK27jyc3Voz5vo0JBVYUTEipbOyk5XgvS6anBGX/i3JSOG9WOr9aV0pPnzcoxzDGz
YJaFETkUhHZLyIlIvL1Eba7VkRURAqDmceEvh1VzURFyClPaXEq/tf5M6hr6+bPu2qCdgxj3CpoRUFEIoCfAZcB84EbRWT+MNslAV8CNgYriwkP/V7lvSOtzM1KJiYyImjHuWB2BgWTE/gfm/rCTEDBbCkUASWqelhVe4CngKuH2e7bwPcAOw/QjOjwsXY6uvtYkhOcrqMTPB7hlpV5bK1o5r2qlqAeyxi3CWZRmAZUDnpc5X9ugIgsBaar6ktBzGHCxM7KFmIiPcyZkhT0Y11bmEN8dASPrS8L+rGMcZNgFoXhTiDXgRdFPMB/A18ZdUcid4lIsYgU19fXj2FEEyr6vcqemlbmZycTFRH88yOSY6P4xDnTeH5HNU0dPUE/njFuEcx/XVXA9EGPc4DqQY+TgIXAWyJSBqwEXhhusFlVH1LVQlUtzMjICGJk41YVjZ0c7+1nXhAHmIf61Kp8evq8PF1cOfrGxoSJYBaFzcBsESkQkWhgNfDCiRdVtUVVJ6tqvqrmAxuAq1S1OIiZTIjaf7SVCBFmZyaO2zHnTElieX4qT22uRFVH/wFjwkDQioKq9gFfAF4F9gJPq+puEblXRK4K1nFNeNp7tI2CyQnERgXvrKPhrF6eS+mxDjYcbhzX4xrjlLG/JHQQVX0ZeHnIc988ybYXBTOLCV0N7d3Ut3VTdIbTWpzOVco9fV5iozzc9+e93LA8d8Rt16wY+XVjQoFd0Wxcb9/RNgDmZQX/rKOhoiM9nD09ld3VrXT29I378Y0Zb1YUjOvtO9pKRlIM6Ykxjhx/eX4qfV5lW0WzI8c3ZjxZUTCu1tXbT+mxDs5yoJVwQnZKHDmpcWwua7QBZxP2rCgYVztU345XYY6DRQFgeX4adW3dVDZ2OprDmGCzomBcraSunegID7lp8Y7mWJyTQnSkh81lTY7mMCbYrCgYVyupa6dgcgKRHmf/VGMiI1iSk8LOI8109fY7msWYYLKiYFyrqbOHho4eZo3jBWsjWZ6fRm+/sqPKBpxN+LKiYFyrpK4dwDVFYdqkOLJTYtlcaheymfBlRcG4VkldO8mxkWQmOXMq6lAiwvL8NKpbujjSdNzpOMYEhRUF40peVQ7VtzMzIxGR4SbcdcaSnElERQiby621YMKTFQXjSkdbuujs6XdN19EJcdERLJyawo7KZlvD2YQlKwrGlU6MJ8x0WVEA34Bzd5+X947Yqmwm/FhRMK50qL6dzKQYkmOjnI7yD/LS48lIjGFzmXUhmfBjRcG4Tr9XKW/sZEZGgtNRhiUiFOanUtHYSW2rLS1uwosVBeM61c3H6enzUjDZfV1HJyzNTSVChGJrLZgwY0XBuM7hYx0AFEx2Z0sBIDEmkrOmJrOtspm+fhtwNuHDioJxndJjvvGExJigrgF1xpbnp9LZ08/umlanoxgzZqwoGFfp9yplDZ2ubiWcMDMjkdT4KOtCMmHFioJxlb+PJ7i/KHhEWJaXxqH6Dhrau52OY8yYsKJgXKU0BMYTBluWl4oAW8ptSm0THqwoGFcpPdZBRmIMSS68PmE4KXFRzM1KYktFkw04m7BgRcG4hm88oSNkWgknLM9Po62rj7/uq3M6ijFnzIqCcY3a1i66+7zkh1hRmDMliaTYSH63udLpKMacMSsKxjXKGnzjCfnpzi69eaoiPMKy3FTe3F9HTYtNqW1CmxUF4xrlDZ2kxEUxKT7a6SinbFleKl6F3xdXOR3FmDNiRcG4gqpS3tBBXoi1Ek5IT4zhvFnp/G5zJV6vOh3HmNNmRcG4wpHm47R29ZGXFppFAWD18lyONB9nXckxp6MYc9qsKBhXKC7zneeflx5ag8yDXbJgCmkJ0fxmY7nTUYw5bVYUjCsUlzcSE+khKyXW6SinLSYygusLp/PanlobcDYhy4qCcYXisiZy0+LxuGg95tNx04pcFPjtxgqnoxhzWtw9DaWZEFqO97K/to2L52U6HeWMPOkvBHMyk/j1O2VMTooh0jO23
7vWrMgd0/0ZM5S1FIzjtlY0oQr5ITyeMNjKGWm0dfexp9qm1Dahx4qCcdyWsiYiPML01NA982iw2VOSSI2PYmOpTaltQo8VBeO4zWWNLJiaTHRkePw5ekRYUZBO6bEOW8PZhJzw+FdoQlZPn5cdVc0U5qU5HWVMnZOXSoRH2Fja4HQUY06JFQXjqN3VLXT1einMT3U6yphKjIlk0bQUtlU0093X73QcYwIW1KIgIpeKyH4RKRGRrw/z+mdF5D0R2S4i60RkfjDzGPc5cdFaYV54FQWAlQVpdPd52V7Z7HQUYwIWtKIgIhHAz4DLgPnAjcN86D+pqotU9Wzge8APg5XHuFNxeSO5afFkJofuRWsnMz0tnuyUWDYebkTV5kMyoSGYLYUioERVD6tqD/AUcPXgDVR18Dl7CYD9y5lAVJXisqaw6zo6QURYWZDO0dYuyhs6nY5jTECCWRSmAYNXHanyP/c+InK3iBzC11L40nA7EpG7RKRYRIrr6+uDEtaMv7KGTho6esJukHmwJdMnERvlYf1hG3A2oSGYRWG4+Qr+oSWgqj9T1ZnAvwL/PtyOVPUhVS1U1cKMjIwxjmmcsrnMdx7/8jBtKQBER3oozEtjd3ULrcd7nY5jzKiCWRSqgOmDHucA1SNs/xTw8SDmMS6zpayJlLgoZmYkOh0lqFYUpKEKm8rsYjbjfsEsCpuB2SJSICLRwGrghcEbiMjsQQ8vBw4GMY9xmc3ljRTmpeLxhPYkeKNJT4xhzpQkNpU20uf1Oh3HmBEFrSioah/wBeBVYC/wtKruFpF7ReQq/2ZfEJHdIrId+N/ArcHKY9ylob2bw/UdFOaH73jCYKtmptPe3ceuIzYfknG3oM6SqqovAy8Pee6bg+5/OZjHN+61pdx/fUIYjycMNiszkfSEaDYcbuDs6ZOcjmPMSdkVzcYRW8qbiI7wsGhaitNRxoVHhJUz0qlo7ORIky3AY9zLioJxxOayRhblpBAbFeF0lHGzLC+V6Ag7PdW4mxUFM+66evt570jLhOk6OiE2KoKluZPYWdVMR3ef03GMGZYVBTPudla10NuvYX3R2smsnJFOn1cpttNTjUtZUTDjrrjc94G4LAwnwRvNlORYZkxOYGNpI/1em9XFuI8VBTPuisuamJmRQFpCtNNRHLFqZjrNx3vZf9ROTzXuY0XBjCuvV9lS3sTyCXJ9wnDmZSWTEhfFuzbgbFzIioIZVyX17bQc752QXUcnRHiElQVpHK635TqN+1hRMONqk38x+6KCidtSACjMTyPSI2yw1oJxmYCKgoi8Echzxoxmc1kjmUkx5KbFOx3FUQkxkSzOmcS2ima6em25TuMeIxYFEYkVkTRgsoikikia/5YPTB2PgCZ8qCqbShspKkhDJLwnwQvEqhnp9PR72VrR5HQUYwaMNvfRZ4B/wlcAtvD3NRJa8S21aUzAqpqOU9PSNeG7jk6YlhpHblo86w81sHJGOh4rlMYFRmwpqOqPVbUA+KqqzlDVAv9tiareP04ZTZj4+6I6VhROWDkjnYaOHkrq2p2OYgwQ4CypqvpTETkXyB/8M6r6WJBymTC0qbSR5NhI5k5JcjqKayyclszL70Wy4XADc+x9MS4QUFEQkceBmcB24MSomAJWFEzANpU1sjw/LewX1TkVkR4PRQVpvLmvjsaOngl7QZ9xj0DXUygE5quqXZdvTssx/6I61xdOH33jCaYoP4239tex4XADH1uU7XQcM8EFep3CLiArmEFMeNtcauMJJ5McF8WCqSkUlzfS02fLdRpnBVoUJgN7RORVEXnhxC2YwUx42VTWSGzUxFlU51StmpFOV6+XHZXNTkcxE1yg3Uf3BDOECX+byxpZOj2V6Ei7iH44eenxZKfEsv5wA4X5qXYdh3FMoGcfrQ12EBO+2rp62VPdyhcunu10FNcSEVbNSOeP245Q1tBJweQEpyOZCSrQaS7aRKTVf+sSkX4RsXl/TUC2lDfhVd+Aqjm5x
TmTiIuKsOU6jaMCbSm87wRqEfk4UBSURCbsbC5rJNIjnJM3yekorhYd6aEwP5V3So7RcryXlLgopyOZCei0OnhV9Tng4jHOYsLUptJGFkxLIT460CGsiWtFQTqq2HKdxjGBXrz2iUEPPfiuW7BrFsyounr72VHZwq3n5jkdJSSkJUQzKzOR4vImLpqbSYRd6GfGWaAthSsH3T4KtAFXByuUCR87q1ro6fdSVJDudJSQUVSQRsvxXg7UtjkdxUxAgY4pfDrYQUx42lTqGzQtnMArrZ2qeVnJJMVGsqm0kbOyk52OYyaYQM8+yhGRZ0WkTkRqReQPIpIT7HAm9G0qa2LOlERSbU6fgEV4hMK8NA7UttHU2eN0HDPBBNp99GvgBXzrKkwDXvQ/Z8xJ9fV72VreZFNbnIbl+b6WlQ04m/EWaFHIUNVfq2qf//YokBHEXCYM7DzSQnt3H6tm2njCqZoUH82cKUkUlzfR77VzOsz4CbQoHBORm0Ukwn+7GbArbMyI1h/y/YmsmmFF4XQUFaTR1tXH3hq7TtSMn0CLwu3A9cBRoAa4FrDBZzOid0qOMS8rifTEGKejhKS5WUmkxEUNrFhnzHgItCh8G7hVVTNUNRNfkbgnaKlMyOvq7ae4vInzZk12OkrI8ohQmJ/Kwbp2GjtswNmMj0CLwmJVbTrxQFUbgaXBiWTCwdbyJnr6vJxr4wlnpDAvDY/4rgo3ZjwEWhQ8IjJwormIpBH4tNtmAnr3UAMRHqGowM48OhMpcVHMzUpmS0UTfV5bgMcEX6Af7P8FvCsiz+Cb3uJ64DtBS2VC3juHjrE4J4WkWJvU7UytKEhjb00re6ptwNkEX0AtBVV9DPgkUAvUA59Q1ceDGcyErrauXnZWtXDeTBtPGAuzMhNJjY+yLiQzLgKeJVVV96jq/ar6U1XdE8jPiMilIrJfREpE5OvDvP6/RWSPiOwUkTdExGZNCwObShvp96qNJ4wRjwjL89M4fKyDw/XtTscxYS5oayOKSATwM+AyYD5wo4jMH7LZNqBQVRcDzwDfC1YeM37+dvAYsVEezrH5jsbMsrxUPAK/3VThdBQT5oK5YG4RUKKqh1W1B3iKITOrquqbqtrpf7gBsPmUwsDaA/WsmpFObFSE01HCRlJsFPOzk3lmSxVdvf1OxzFhLJhFYRpQOehxlf+5k7kD+PNwL4jIXSJSLCLF9fX1YxjRjLXyhg5Kj3Vw4RybBWWsFRWk09TZyyu7jjodxYSxYBaF4VYHGXYSF/+0GYXA94d7XVUfUtVCVS3MyLAPGzd7+4CvaF84N9PhJOFnRkYC+enxPLnRupBM8ASzKFQB0wc9zgGqh24kIh8G/g24SlW7g5jHjIO1B+rJTYsnPz3e6ShhxyPC6qJcNpU1ctAW4DFBEsyisBmYLSIFIhINrMY3/fYAEVkKPIivINQFMYsZB919/bx7qIGL5mYgYstIBsO1y3KIihCetAFnEyRBKwqq2gd8AXgV2As8raq7ReReEbnKv9n3gUTg9yKyXUReOMnuTAgoLmuis6ffxhOCaHJiDJcuzOYPNuBsgiSoU1Wo6svAy0Oe++ag+x8O5vHN+Fp7oJ7oCA8rbarsoFpTlMuLO6p5aWcN1y6zE/bM2Apm95GZYN7cV8fyglQSYmxarGBaOSONGRkJPLmx3OkoJgxZUTBjovRYBwfr2vnwWVOcjhL2RIQ1RblsrWi2BXjMmLOiYMbEa3t8585/ZL4VhfHwyXNyiI702OmpZsxZUTBj4i+7a1kwNZmcVDsVdTykJkRz+aJsntt2hM6ePqfjmDBinb8mICN9I23r6mVLeRMXz8u0b67jaM2KXJ7ddoQXd1Rzw/Jcp+OYMGEtBXPG9h9tQ4H5U5OdjjKhFOalMjsz0QqxGVNWFMwZ21PTSmp8FFnJsU5HmVBEhDUrctlR1cLOqman45gwYUXBnJHuvn5K6tqZn51sVzE74JPLcoiPjuDRd8ucjmLChBUFc0b2H
W2jz6vMn5ridJQJKTk2imuX5fDSjhrq22zqMHPmrCiYM7KjspmUuCjybAI8x9x6bj49/V4bWzBjwoqCOW2d3X0cqG1jcU4KHus6cszMjEQunJPBExvL6enzOh3HhDgrCua0vVfdgldhSc4kp6NMeLedl099Wzcvv1fjdBQT4qwomNO2o7KFjKQYslPsrCOnXTg7gxmTE/i1DTibM2RFwZyW5s4eyho6WJIzyc46cgGPR7j13Hx2VDazraLJ6TgmhFlRMKdlZ1ULAEty7Kwjt/jkshySYiL59TtlTkcxIcyKgjllqsrmskby0uJJT4xxOo7xS4yJ5LrC6bz8Xg21rV1OxzEhyoqCOWWHj3XQ0NFDUUGa01HMELeem0e/Kk9ssLUWzOmxomBO2cbSRuKiIlg4zbqO3CYvPYEP+ScmtOU6zemwomBOSVtXL3uqW1iWl0pUhP35uNGd58+goaOHp4srnY5iQpD9qzanZEt5E16F5fnWdeRWKwrSWJaXyoNrD9PbbxezmVNj6ymEifGY4qDfq2wqa2RGRgIZSTbA7FYiwt0fnMntjxbz/PZqrl2W43QkE0KspWAC9t6RFpo7ezl3RrrTUcwoPjg3k3lZSfz8rRK8XnU6jgkhVhRMQLyqrD1QR2ZSDPOybTEdtxMRPv/BWRyq7+CV3UedjmNCiBUFE5D9R9uobe3mwjkZNvldiLh8UTYzMxL44WsH6LfWggmQFQUzKlXlrf11pMZHsdgmvwsZER7hK5fMpaSunee2HXE6jgkRVhTMqA7Vd1DZdJzzZ2cQ4bFWQii5dEEWC6cl89+vH7BptU1ArCiYEXlVeWVXDSlxUSzLS3U6jjlFHo/w1UvmUtV0nKc22yI8ZnRWFMyItlU0Ud3SxaULsuxitRB14ZwMigrS+PHrB2k53ut0HONy9q/cnFR3Xz9/2V3L9NQ4FttsqCFLRPjmFfNp7OzhR68fcDqOcTkrCuak1h6op627j8sXT7U1E0Lcwmkp3FiUy2Pry9l/tM3pOMbFrCiYYVU3H+ftA/WcPX0SuWnxTscxY+Brl8wlMSaSe17YjaqdomqGZ0XB/IO+fi/PbKkiITqSKxZlOx3HjJHUhGi+eskc1h9u4JktVU7HMS5lRcH8g7/uq+NoaxfXLJ1GfIxNjxVOblqRR1FBGve+uIeqpk6n4xgXsqJg3udQfTtrD9SzLC/VprMIQx6P8F/XLcGrytd+v9PmRTL/wIqCGdDY0cOTGyvISIrhcus2ClvT0+L59yvms/5wA4+sK3U6jnEZKwoG8J1++viGMgBuWZlHbFSEs4FMUK1ePp2PLpjCfa/sY93BY07HMS4S1A5jEbkU+DEQATysqvcNef0C4EfAYmC1qj4TzDxmeH39Xn67qYK61m5uOy+f9ERbKyHciQj/df3ZfPKBd7n7ya08f/d55E9OGJd1OYJtzYpcpyOEtKC1FEQkAvgZcBkwH7hRROYP2awCuA14Mlg5zMj6vcpvN1dyoLada5ZOY3ZmktORzDhJjInk4VsL8Qjc/j+bqW/rdjqScYFgdh8VASWqelhVe4CngKsHb6CqZaq6E7CZuhzQ5/Xyu+JK9ta0cuWSqRTaEpsTzvS0eB68pZCa5i5WP7Se1i6bBmOiC2ZRmAYMXjm8yv/cKRORu0SkWESK6+vrxyTcRNfd18/j68vZdaSJ3gCTAAASdUlEQVSFyxZmscpWU5uwigrS+J/bi6hp6eKXbx+mqaPH6UjGQcEsCsPNi3Ba57+p6kOqWqiqhRkZGWcYy7R39/HIulJK6tr5xNJpnD/b3tOJrqggjcfvKKKjp4+fvnmQvTWtTkcyDglmUagCpg96nANUB/F4JgDVzcf52ZslHG3p4uaVedZlZAYsy0vj7otmkZYQzeMbynlhRzWd3X1OxzLjLJhnH20GZotIAXAEWA2sCeLxzCh2VDXzx61VxEdH8pkLZzJtUpzTkYzLpCfG8JkLZvLKrqNsONzA9somLpqTSWF+K
vHRdnX7RBC0/8uq2iciXwBexXdK6q9UdbeI3AsUq+oLIrIceBZIBa4UkW+p6oJgZZqovKq8tqeWtQfqyUuPZ01RLkmxUU7HMi4VFeHhyiVTWV6Qxqu7jvLK7qO8treWeVlJnJWdTF5aPGkJ0TZzbpgKaulX1ZeBl4c8981B9zfj61YyQdLV28/vNleyv7aN5flpXLkkm0iPXbNoRpeVHMut5+ZT3XycbRVNbK9sZne1b6whNsrDpLhoUuKiiIwQPCJEeASP+K6BUPWt7a34/isiREd4iI70kBQbSUpcFOmJMUxJiiHSFm9yFWsPhrGG9m4e21BOQ3s3Vy2Zyko7w8ichqmT4pg6KY7LFmVT19ZNRUMnNS3HaTneS2tXL339ild9LVKvKqq+s0zEXyAE32s9/UpPXz+9/X8/38QjkJUSy6yMJOZkJZKXlmDrgDvMikKYOlTfPnB16u3nFTAjI9HhRCbUeUTISo4lKzn2tPehqnT3eWk+3kt9WzfVzcepaOxkXUk9bx+sJyk2knNyU1men0ZaQvQYpjeBsqIQhjYcbuClndVMTozhlpV5Nm2FcQ0RITYqgqyoCLKSY1k0zbfMa1dvPwfr2tla3sTbB+p5+0A9S6ZP4qI5GWSeQREyp86KQhjxqvLSzmo2HG5kXlYS1xdOt4ntTEiIjYpg0bQUFk1LoeV4L++WHGNDaQM7KpspzE/jkvlTSLC1PcaFvcthos/r5eniKnYdaeH8WZP56MIsPHZ2iAlBKXFRXLYom/PnZLB2fx3rDzew60gLly7IojA/1c56CjIb9g8DnT1975uy4rJF2VYQTMhLjInk8sVT+eLFs8lKieXZ7Ud4bH25zc8UZFYUQlxLZy83P7zRpqwwYWtKcix3fKCAKxZnc6i+nZ+8cZADtW1Oxwpb1n0UwurauvjUI5s4XN/BjUW5LPQP2pnwFQ7rHZwOjwjnzpzMrMxEntpUyf+8W8bFZ2XywbmZ1ioeY9ZSCFHH2rtZ/dAGKho7eeS2QisIZkLITIrlsxfOZMn0Sbyxt44nN1bQ02cz748lKwohqKWzl1se2UR183Ee/XSRdRmZCSU60sN1y3K4fFE2e2taeWTdYdpt4r4xY0UhxHR09/HpRzdRUtfGg7cUUlRgs5yaiUdEOG/WZNasyKWmpYtfrD3EsXZbOW4sWFEIIV29/dz1eDHbK5v5yeqlXDjHWghmYlswNYU7z59BV28/v1h7iPKGDqcjhTwrCiGit9/LF3+7jXdKGvjetUu4bFG205GMcYXctHg+d+FM4qIieGRdKa/tqXU6UkizohACvF7la7/fwWt7avnWVQu4dplNLGvMYOmJMXz2wplkpcTy2Se28Ny2I05HCllWFFxOVfmP53fx3PZqvvbRudx6br7TkYxxpYSYSO44r4Dl+an889PbeXxDudORQpIVBRdTVe57ZR+/2VjBZy+cyecvmul0JGNcLSYqgkc/XcTFczP5j+d28cBbJU5HCjlWFFzsgbcO8eDaw9y8Mpd/vXSuzfliTABioyL4xS3LuGrJVL73yn6++8o+VHX0HzSAXdHsWo++U8r3X93PNUunce9VC60gGHMKoiI8/PcNZ5MYG8nP3zpEW1cv9161EI8t4DMqKwou9MyWKu55cQ+XzJ/C969dbH/IxpyGCI/wnY8vJCk2kgfXHqa9q4/vX7eEKFv+c0RWFFzmz+/V8C/P7OD82ZP56Zqltn6tMWdARPjGZWeRHBvF91/dT3t3P/evWWrrjIzAPnFc5K39dXzpqW0szU3lwVuWERNpf7jGjIW7PziLe69ewOt7a7n90c102LQYJ2VFwSXe2l/HXY9vYXZmEr+6bTnx0daIM2YsfWpVPj+8fgkbSxu56eGNNHf2OB3JlawouMBf99Vy12NbmJ2ZyG/uXEFKXJTTkYwJS584J4cHbjqHPdWtrH5oA3VtXU5Hch0rCg57bU8tn3l8C3OzknjyzpWkJkQ7HcmYsPbRBVn86rblV
DR2cv0v1lPR0Ol0JFexouCgV3Yd5XNPbGH+1BSeuHMFKfHWQjBmPHxg9mQev2MFTZ29XPPAO2wpb3Q6kmtYUXDI89uPcPeTW1mUk8LjdxRZl5Ex42xZXirPfv5ckmIjufGXG3l+u82XBFYUxp2q8ou1h/jyU9tZlpvKY7cXkRxrBcEYJ8zISOTZz5/H2dMn8eWntvPtl/bQ1z+xV3KzojCOevu9/Mfzu7jvz/u4YnE2j91RRJIVBGMclZoQzW/uXMFt5+bzyLpSbnp4I7WtE3cA2orCOKlr7WLNLzfwxIYKPnPhDH6y2i6gMcYtoiI83HPVAv7ruiXsrGrh0h+9zesTdF0GKwrj4N2SY1zx03XsOtLKT25cyjcuO8umrjDGhT65LIcXv/gBslPiuPOxYr7+h520HO91Ota4sqIQRF29/Xz7pT2seXgjiTGRPHv3uVy1ZKrTsYwxI5iVmcizd5/LZy6cwdPFlXzkh2t5ZVfNhJlp1YpCkKw9UM9lP/4bj6wr5dZVefzpS+czLyvZ6VjGmADEREbwjcvO4rm7zyMtIZrPPrGVmx/ZyN6aVqejBZ3NpTDGSura+MGrB3hl91EKJifw+B1FnD87w+lYxpjTsDhnEi9+8QM8ubGC/379AJf/5G9cuWQqX7x4FrMyk5yOFxRWFMbIgdo2fv7WIZ7ffoTYqAi+9tG53Hl+gU1qZ0yIi4rwcOu5+Vx99lR+/tYhHltfzgs7qvno/CxuPTeflTPSwmq9EysKZ+B4Tz9v7KvliQ3lbDjcSFxUBP/rghl85oKZpNl0FcaElUnx0XzjY2dx1wUzeGRdKU9uquCV3UeZlZnINUuncdWSqUxPi3c65hmTUBs8KSws1OLiYseO39DezbqSY/x1Xx2v76mlo6efaZPiuGVVHjcUTnds7qInN1Y4clxj3GbNitxxOU5Xbz8v7qjmd5srKS5vAmB+djIXz8vkgjkZLM5JcdVp5yKyRVULR9suqC0FEbkU+DEQATysqvcNeT0GeAxYBjQAN6hqWTAzBcrrVWrbuig91kHpsQ52VrawvbKZ/bVtAKTGR3HV2VO5cvFUVsxIJ8JOMTVmQomNiuC6wulcVzidysZOXtpZw5v76vj52kPc/2YJ0REeFk5LZl52MnMyE5k9JYnZUxLJSIxxdXdT0IqCiEQAPwM+AlQBm0XkBVXdM2izO4AmVZ0lIquB7wI3BCNPfVs31c3H6e7z0t3XT3evl+4+L+3dvTR19tLU0UNjRw9NnT1UNR2nrKGDrt6/X+4+KT6Ks6dP4sol2Zw/O4OF01KsEBhjAJieFs/nLprJ5y6aSUtnL5vKGikua2RrRRMv7aimtevvi/okxUSSmRxDZlKs/78xpCZEkxgTSXx0JIkxEcRHRxIT6SEyQvCIEOnxEOERslNig94bEcyWQhFQoqqHAUTkKeBqYHBRuBq4x3//GeB+ERENQp/WH7ZWcd+f95309ehID+kJ0UyKj2bapDjOmzWZ/MkJFKQnkD85nmmT4lxd3Y0x7pASH8VH5k/hI/OnAL75zurbujlY186B2jbKjnVQ19ZNfVs32yqaqWvret8X0JF8++MLuWVlXjDjB7UoTAMqBz2uAlacbBtV7RORFiAdODZ4IxG5C7jL/7BdRPYHIe/kg0OO62KTsazBEkp5Leswbhqb3bjyvf3Ud+FT//h0oFkDqibBLArDfa0e2gIIZBtU9SHgobEIdTIiUhzIIIwbWNbgCaW8ljV4QinvWGcN5hXNVcD0QY9zgOqTbSMikUAKYKtdGGOMQ4JZFDYDs0WkQESigdXAC0O2eQG41X//WuCvwRhPMMYYE5igdR/5xwi+ALyK75TUX6nqbhG5FyhW1ReAR4DHRaQEXwthdbDyBCCo3VNjzLIGTyjltazBE0p5xzRryF28ZowxJnhsllRjjDEDrCgYY4wZYEXBT0S+LSI7RWS7iPxFRFy9Go6IfF9E9vkzPysik5zOdDIicp2I7BYRr4i48
jQ/EblURPaLSImIfN3pPCMRkV+JSJ2I7HI6y2hEZLqIvCkie/1/A192OtPJiEisiGwSkR3+rN9yOtNoRCRCRLaJyEtjtU8rCn/3fVVdrKpnAy8B33Q60CheAxaq6mLgAPANh/OMZBfwCeBtp4MMZ9CULJcB84EbRWS+s6lG9ChwqdMhAtQHfEVVzwJWAne7+L3tBi5W1SXA2cClIrLS4Uyj+TKwdyx3aEXBT1UHL6mUwDAX0bmJqv5FVU9MqLIB33UgrqSqe1U1GFehj5WBKVlUtQc4MSWLK6nq24TI9TyqWqOqW/332/B9gE1zNtXw1Kfd/zDKf3Pt54CI5ACXAw+P5X6tKAwiIt8RkUp8V8q7vaUw2O3An50OEcKGm5LFlR9coUxE8oGlwEZnk5ycvztmO1AHvKaqrs0K/Aj4FyCwiZMCNKGKgoi8LiK7hrldDaCq/6aq04HfAF9wNu3oef3b/Bu+JvpvnEsaWFYXC2i6FXP6RCQR+APwT0Na5a6iqv3+LuQcoEhEFjqdaTgicgVQp6pbxnrfE2rlNVX9cICbPgn8Cfj/gxhnVKPlFZFbgSuADzl9JfgpvLduFMiULOY0iUgUvoLwG1X9o9N5AqGqzSLyFr6xGzcO6J8HXCUiHwNigWQReUJVbz7THU+olsJIRGT2oIdXASefZ9sF/AsY/Stwlap2Op0nxAUyJYs5DeKbb/4RYK+q/tDpPCMRkYwTZ/GJSBzwYVz6OaCq31DVHFXNx/f3+texKAhgRWGw+/zdHTuBS/CN6rvZ/UAS8Jr/NNpfOB3oZETkGhGpAlYBfxKRV53ONJh/wP7ElCx7gadVdbezqU5ORH4LrAfmikiViNzhdKYRnAfcAlzs/zvd7v9260bZwJv+z4DN+MYUxuxUz1Bh01wYY4wZYC0FY4wxA6woGGOMGWBFwRhjzAArCsYYYwZYUTDGGDPAioIJaSLS7z/NcZeIvHi6s8WKyMPDTdQmIreJyP1nkK999K3Gbz/GjMaKggl1x1X1bFVdiG+SuLtPZyeqeqeq7hnbaMaEHisKJpysZ9BEdiLyNRHZ7F9z4lv+5xJE5E/+OfN3icgN/uffOrHWg4h8WkQOiMhafBdfndjfoyJy7aDH7f7/JorIGyKyVUTeG22+JxH5roh8ftDje0TkK4HsR0QuGjx3vojcLyK3+e8vE5G1IrJFRF4VkexTfP+MmVhzH5nw5V8T4UP4plRARC4BZuObFluAF0TkAiADqFbVy/3bpQzZTzbwLWAZ0AK8CWwb5fBdwDWq2ioik4ENIvLCCPNRPYVvhssH/I+vxzfHzqnuZ3DuKOCnwNWqWu8vdt/BN4OuMQGzomBCXZx/quN8YAu+xYfAN1XJJfz9Az0RX5H4G/ADEfku8JKq/m3I/lYAb6lqPYCI/A6YM0oGAf6vv+h48bVWpgBHh9tYVbeJSKb4VvfLAJpUtcL/wR7wfoaYCyzEN+0JQARQE8DPGfM+VhRMqDuuqmf7v/G/hG9M4Sf4Pqj/U1UfHPoDIrIM+BjwnyLyF1W9d8gmJ/tm3oe/y9U/0Vu0//mb8H24L1PVXhEpwzdz5UieAa4FsvC1HALdz0AGvxOvC7BbVVeNclxjRmRjCiYsqGoL8CXgq/5v3K8Ct/vn8UdEpg36dt6pqk8APwDOGbKrjcBFIpLu3891g14rw9etBL6V2aL891PwzW3fKyIfBPICiPwUvtktr8VXIALdTzkwX0Ri/IXwQ/7n9wMZIrLK//tGiciCAHIY8z7WUjBhw98tswNYraqPi8hZwHp/d0o7cDMwC/i+iHiBXuBzQ/ZRIyL34Bu0rgG24uuKAfgl8LyIbALeADr8z/8GeFFEioHtBDDdsqruFpEk4IiqnujmGXU/qlopIk8DO4GD+LvHVLXHPwj+E3+xiMQ3buHa2V6NO9ksqcYYYwZY95ExxpgBVhSMMcYMsKJgjDFmgBUFY4wxA6woGGOMGWBFwRhjzAArCsYYYwb8P2wX7
LMBEm7CAAAAAElFTkSuQmCC\n", "text/plain": [ - "" + "
" ] }, "metadata": {}, @@ -514,8 +530,15 @@ "collapsed": true }, "source": [ - "This histogram and the kernel density plot look approximately Normal, but with some deviations. Overall, these residuals look reasonable for a real-world model. \n", - "\n", + "This histogram and the kernel density plot look approximately Normal, but with some deviations. Overall, these residuals look reasonable for a real-world model. " + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": true + }, + "source": [ "Another useful plot is the **Quantile-Quantile Normal plot**, or Q-Q Normal plot. This plot displays quantiles of a standard Normal distribution on the horizontal axis and the quantiles of the residuals on the vertical axis. If the residuals were perfectly Normally distributed, these points would fall on a straight line. In real-world problems, you should expect the straight line relationship to be approximate. \n", "\n", "Execute the code in the cell below and examine the resulting plot. 
" @@ -523,14 +546,14 @@ }, { "cell_type": "code", - "execution_count": 13, + "execution_count": 10, "metadata": {}, "outputs": [ { "data": { - "image/png": [base64-encoded Q-Q Normal plot image elided]
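The Q-Q Normal plot described in the markdown hunk above (Normal quantiles on the horizontal axis, residual quantiles on the vertical axis) can be sketched as follows. The notebook's actual plotting cell is not visible in this hunk, so this is only a minimal illustration using `scipy.stats.probplot`; the lab itself may use a different helper, and `residuals` here is synthetic stand-in data rather than real model residuals.

```python
# Minimal Q-Q Normal plot sketch (not the notebook's own cell).
# `residuals` is synthetic stand-in data for model residuals.
import numpy as np
import scipy.stats as ss

rng = np.random.default_rng(42)
residuals = rng.normal(loc=0.0, scale=1.0, size=200)

# probplot pairs the theoretical quantiles of a standard Normal
# distribution (x axis) with the ordered residuals (y axis), and also
# fits a least-squares straight line through the points.
(theory_q, ordered_resid), (slope, intercept, r) = ss.probplot(residuals, dist="norm")

# For approximately Normal residuals the points hug a straight line,
# so the correlation coefficient r of the fit is close to 1.
print(f"fit correlation r = {r:.3f}")
```

Passing `plot=plt` (with `matplotlib.pyplot` imported as `plt`) to `probplot` would draw the plot directly instead of returning the quantile arrays.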
fPhfV4C1jVSPvfiJoR/BSo7Hp+N\nW2O1Kvx7D+DaWjVNpcC/cZOQWJt9Iinjo4F3rLWVic7/GWOKgE0xXlM6OCUX0pH8F7gMtxliF+BG\nIB74h7X2i2r7/Tn87ypjTPWRHk/iJgh9rbVPGGM+Be4IdwR9Fni6vpEDxph9gR3ZeiOo9BDuN76I\nWWs/AI4xxviMMXsBewO/B/bDvYnW5WVgqDHmXeAx4Clr7QONXOoR3G+gZwK3hKv8j2LrzT4Xtznp\nHWPMw8DTwCvW2neieT1hFbjV6JWvscQY8xRwSq39Pqh8EK7B6Q3MqvV3Wod74zsR98Z3FPBmZWIR\nPv9bxphvG4jnSNzmho+qx4RbxvX5M/A+bi1G9XieYWuZ9QVKKxOL8HmLwq+19mgmqu2zLpzcnMXW\n5OIs4CFrbXl4n/MBjDHb4ya+ewN/Ce9bX8LboCjK+GVghjFmNfAEkFM9yRGpTc0i0pEUWGvfs9a+\na61dhfvB2A23BqH6N/juuAnIp7g1HZU/a3BrOiqr00/AbVI5Cfeb5XpjzIPh2ojauob//bXW9p9i\neSHhb8k/AxZYgVsrU0i1/hvVWWsfwv2mW4DbLPC2MeZrY8wZ9V3DWrsFt9nj7PCmM3Grxh8PP/8N\n7g0xF7dvytNAnjHmhhheUp61NlRr28/UrFmpHDpcqSvuZ9QEav6dyoA/ADuH9+vGtuUODZd99/D1\no9Edt4mkdizDgYxw00RXoK4+MZG8D/4JnGiM6WqM+ROwV3gbAMaYP4Vrz37GTWiGAcHw03W+LyIQ\nURlba2fhNlUlAzOAT4wxHxljjo3xutLBKbmQDsta+zPuB+Lu1Ox7sAn3RnYs8KdaP4fhVjNjrc2z\n1o601u6C284+E/g7dfd7qLy57VRre/davztQ1TZP+HFq9R2MMecAc3Crq3ew1u5irT0Nd76Ohl7v\nv6y1x4SveUY4pnvDHRfr80/g4HA/gzOBR8Lf4CvP+Y619nTcG/jxuDU41xlj/t5QLHWoXQ7gllVD\nN/h83PK6lbr/TheF9/uVbcu9vmtW2oTbD6QGY0yfcC1Ufce8ChxaRyyH496QfwW2r/73jSCWSg/h\ndtIchPu3WGutfSMcVzpucrcZd+6NdGttFm6H4cbUru1Kq/Y40jLGWrvEWnsYbvPMYNzakkeMMaoB\nl20ouZAOzVr7CO63vLPDPeLBHfEA7o373cof3BvUdKC7MSbLGJNnjDk0fJ4PrbWTgY+APeq4zhfA\nd7g39epOo2YHzvzwv7tV29aXmo4ENlprb60cGWKMScOt/q/z/2y4RuXRcCwF4dc9Hbfps96OjcBz\nuB0XRwOHUPOb8mhjzDpjTLy1tiI8ImEo7rfkbcqgEcnhkSeV507G7Z/wQn0HhGtW3gX2rfV3+hS3\nj8Ox4V1fBI4wxlTWZGCM+T3Qi/r9H9Ar3Gmy8pgk4FHcfgawtVagUmUn2y9qxXMhMCRcM/Mibpn/\ntdp543Frvxpkrd2M24F0IG6n13urPb0vboIyv9ZInf7hf+v7LM+n5nsN3PdR5TUbKuMbCJexMeZ1\nY8zc8DG/hvteLMQd1ZPR2GuTzkcZp3QGY3CTgvnGmEOstR8bY+4Dbg+PiHgH98P7RuAr3BqCBNxm\niH8aY7KBPNxmloNwhxnWZQJwnzFmOfAwcATb9rfIwf2WeLsxZjZup8LJbE06wO2YN8wYMwe3H8iu\nuCMadsLt4FeXl4Al4XM+hVvTMCX8Wj6o5xistSFjzIPAKOAHa+3Ltc45A/iPMWYh7s12GFuHKla2\n/+8JfFrXkNdqfMBKY8xE4BfckQYpuGVefZ/argNyjDH34vZFiMMti8NwEwxwR9JcjDvSYwpuP5vp\nuP1F6nMXcAXwRPiYX3HfJ/G4N01wayoODo+meAv373Ye8GL4b7MBt1/EkPCxWGtfMsY8h9tXZyfC\nw31xa0nWNxBPpX/i9oXxU
y3Rw20eyweuN8YEcZsuTg9fG6BG7Vc1q4CzjDFrgC9xaxz2rLVPQ2Wc\nHd7nVWCcMWY97iib3YBxuH1w6hwaLZ2bai6kI6lziKe19nNgHm6P+uHhzYNx50UYiluzcS1wP3CS\ntdax1pbiftv8BPfm9QxuLcRl1trqH/pV17TWPoh7s8nC7bfQH7eDafVYvsAdIroH7gf/KNyRKz9W\n2+du3BvnGbiJwlTcYaNDgW6Vwx6pNqTQWrsc9ybWD/fGvxT4OPx6an8Dr+2fuJ8FNaZHD3d2/Avu\nEM/7cW96XYETq3WQHYB7szm4kWs4uGU/CXgAd9jnkdbar2vtU0O4Y+TJuDezh4G7cZsfjrfhSbHC\nN7ejcBPDu3CTgIVsm1RVL68tuDVGubjDd/+Fm9wca62t7Ag6B7cJ4BngEGvtT7gJ41rcTo5P4DYf\nXGytXVDtOoPYOvrjQdwarWWNlE+lp3ATyLestV9WK4d83PefD7f55J5wmfTF7WdTvfarejleift+\nmI1bfgW4SXCVSMoYt6PzjbjNJE+Hy+Zp3ARHZBs+x2n7M8WG24MX4VYXbwAWVhtCKCJtmIlgDRYR\n6VjafM2F2TqZ0Xrgj7jVshONMY1NSiMiIiKtoM0nF7jtzO8Bl1trv7LWPoPbaeqohg8TkTak7VeR\niohn2kWzSHXGmCNxx+YPC/eIFxERkTakXY0WMcasw52zYBXukDERERFpY9pDs0h1f8PtvX4wbg9+\nERERaWPaXbMIQHh2wHuBdBvBcr+O4zg+X6yz44qIiHRqUd9A23yziDFmR6CPtfbxaps/xZ3kKIO6\n5/GvwefzkZ9fTDBYe2kDqUsg4CcjI1llFiWVW/RUZrFRuUVPZRabynKLVptPLoBM4NHwstCVi//8\nCXcp5IhnhgsGQ1RU6A0VDZVZbFRu0VOZxUblFj2VWctoD8nF27jTM98ZXikyE5hF3YtHiYiISCtr\n8x06w4sBDcRd5+ENYDkw11q7sMEDRUREpFW0h5oLrLV5aA57ERGRdqHN11yIiIhI+6LkQkRERDyl\n5EJEREQ8peRCREREPKXkQkRERDyl5EJEREQ8peRCREREPKXkQkRERDyl5EJEREQ81S5m6BQREWkt\njgO5uQHy8nz06OGQlRXEF/Ui5J2LkgsREZF65OTEkZ2dyLp1Wyv6e/YMMWVKKQMGVLRiZG2bmkVE\nRETqkJMTx5AhSTUSC4B16/wMGZJETo6+n9dHyYWIiEgtjgPZ2YmEQnW3f4RCPqZNS8RxWjiwdkLJ\nhYiISC25uYFtaixqW7vWz5o1gRaKqH1RciEiIlJLXl5kPTYj3a+zUXIhIiJSS48ekbV3RLpfZ6Pk\nQkREpJasrCA9e4Ya3CczM0Tv3sEWiqh9UXIhIiJSi88HU6aU4vfXXTPh9ztMnlyq+S7qoeRCRESk\nDgMGVLBiRQmZmTVrMDIzQ6xYUaJ5LhqgQboiIiL1GDCggv79K8jNDbB+vTtDZ+/emqGzMUouRERE\nGuDzQZ8+6lsRDTWLiIiIiKeUXIiIiIinlFyIiIiIp9TnQkRE2g0tf94+KLkQEZF2Qcuftx9qFhER\nkTZPy5+3L0ouRESkRTgOvPlmgMcei+PNNwMRL1feFpY/96/Pw7eloPku0MEouRARkWaXkxNH796p\nDByYwtChyQwcmELv3qkR1Ti06vLnhYWkTbiS7gfsQ5f+J3h//g5KyYWIiDSrpjZptNby53Frcul2\n3BEk33WHu8Hnp1mrRzoQJRciItJsvGjSaPHlz0tKSM2eRJfTTiawbi0Apf3/wqZ/P4GGpkRGyYWI\niDQbL5o0WnL587gP3qPriUeTsmgePschlLEd+YuWk3/XvTg77NDk83cWSi5ERKTZeNGk0SLLn5eX\nkzLrJrr0+zNx9n8AlB13PBtX51J6xlmqsYiSkgsREWk2XjVpNOfy54HPPqXLKceTOmcGvmA
QJyWV\ngjnz2Pzgo4R22TXm83ZmGhgsIiLNprJJo6GmkUibNDxf/jwYJHnxAlJnTsdXVgZAWZ8jKZi3mFDP\nzBhPKqDkQkREmlFlk8aQIUl1duqMtknDq+XPA19/Sfqo4cS/vQYAJzGRwuunUHzZ5eBXpX5TqQRF\nRKRZNWeTRtRCIZJWLKPrcUdWJRblBx/Cxhdfo3jYSCUWHlHNhYiINDvPmzRi8e23pF1wIfGvvgKA\nEx9P0VXXUDRqLMTpdugllaaIiLQIr5o0ouY4JNz3T7h+AvH5+QBU7PcH8hcuI3jAgS0fTyeg5EJE\nRDos//o80sZdQeJzzwDg+P0UXXElReMmQGJiK0fXcbWL5MIYswswHzgOKAIeAq611pa1amAiItJm\nJf7nEdImXIl/40Z3wz77ULBwGaV/PLR1A+sE2kVyATwCbACOBLoDdwEVwITWDEpERNoe34YNpF0z\njqTHH63aVjL0cpJunU2w1IGKhmf7lKZr88mFMcYAhwM7WWt/DW+bDMxGyYWIiFST8OzTpF85Cv8v\nPwMQ3P13FMxbjHPssSSlpEBpYStH2Dm0+eQCyAP6VSYWYT5gu1aKR0REcBcly80NkJfnjv7Iymrh\n0R/V+PI3kzrpWpIfuLdqW/H5gynMvhEnLb1d3Ow6kjZf3tbazcDzlb8bY3zASOCFVgtKRKSTy8mJ\nIzs7scbMmz17hpgypbRl560A4le/Qvroywn88D0AwZ16sOW2BZSdcHKLxiFbtfnkog6zgT8Cf4rm\noEBAE6NEqrKsVGbRUblFT2UWm9Yut1WrAgwZsu0y6uvW+RkyJImVK0s59dQWGHJaWEhy9iSS7lhe\ntan09H9QPHMOTtduNW5wrV1m7VWs5eVznMgWlWkLjDEzgbHAP6y1/4ni0PbzIkVE2jDHgb33hq++\nqn+fvfaCzz9v5oVEX38dLrxwayDbbw9LlsDppzfjRTutqP+S7Sa5MMYsAIYC51prH47ycCc/v5hg\nUD2EIxEI+MnISEZlFh2VW/RUZrFpzXJ74w0/p56a3Oh+OTnF9OnTDLGVlJA840YSF8zFF75/lfU/\nlaJb5+HsuFO9h+m9FptwuUWdXLSLZhFjzBTgMuBMa+1jsZwjGAxRoeFHUVGZxUblFj2VWWxao9x+\n+CGyavIffsDz2OI+eI/0UcOI+99nAIQytmPLTbMoPeMst5okguvpvdYy2nxyYYzZD5gI3AS8YYyp\nSk2ttetbLTARkU6oR4/Iarsj3S8i5eWkzJ1Dym2z8VW4nUXLjjmOgrmLCO26m3fXEc+0+eQCOA13\n9daJ4R9w238cINBaQYmIdFQNDTHNygrSs2eoxiiR2jIzQ/Tu7U2HzsD/PiN95FDiP3zfjS0llS1T\np1Ny4cXN3KlDmqLNJxfW2pnAzNaOQ0SkM2hsiKnPB1OmlDJkSNI2o0UA/H6HyZNLm37fDwZJXrqI\n1Bk34CstBaAs6wgK5i0mlNmriSeX5qYxOSIiAriJxZAhSdvUSlQOMc3Jcb+PDhhQwYoVJWRm1uy7\nkJkZYsWKkibPc+H/+iu6DDyFtOyJ+EpLcRIT2ZJ9E5sfy1Fi0U60+ZoLERFpfo4D2dnbzl1RKRTy\nMW1aIv37u7UXAwZU0L9/Bbm5Adavd5tPevdu4gydoRBJK1eQNm0SvqIiAMr/eDAFC5cT3Mc04cTS\n0pRciIgIubmBBvtRAKxd62fNmgBZWW5/Cp8P+vTxpm+F/4fvSR89goTVLwPgxMVRNG4CRVdcCfHx\nnlxDWo6SCxERIS8vsiqHSPeLmOOQ+K/7Sbt+Av6CfAAq9vs9BQuXUXHAQd5eS1qMkgsREWmVIaa+\n9etJHz+axGeeAsDx+ykeOYbC8ddCYqJn15GWp+RCRERafIhpwhOPkX71WPy//QZARa89KViwlIrD\nentyfmldGi0iIiJVQ0z9/rprJrwaYurb+BvpQy9iu0s
883j5lwPYwuZzIkM5hrf5mp3VDiAi5U6FTgjxGnu7dk1BUigogMcfD3MNDR6MXXIJHwyaycScEzEztQOISLlUodsQcnNDEigqJwfmzFnPID7/PExE98EHcMABYXH7PfZYz5OJiKRfsm0IFbqEkNJBX0uWhAVr9twzdCEdPDgMMFMyEJEKokInhJQM+nKHl14KU07cfTecdRbMnh1+V6nQt09EKpmMfKKZ2Z1mNsvMppnZK2ZWLx3X2eBBX19/DW3bwoknQoMG8OGHoXGgfv2UxyoikmmZ+or7LrCbu+8BfAn0SsdF1nvQ199/w403wm67wfjxcN99MHEi7L9/OsIUEckKGRmH4O4jYp5OAE5M17VKPV30iBFw4YWhdHDyyXDPPbDNNukKT0Qka2RDJfhZwFuJdppZVzObaGYT58+fn74ofvwRTjoJ/v3vUJwYMQKee07JQEQqjbQlBDMbaWbT4/ycEHNMb2AVkHBkgLsPcvdW7t6qYcOGqQ901Sq4996wjOXrr4fFaz7/HI46KvXXEhHJYmmrMnL3I4vbb2adgeOAIzxTgyHGjw9jCqZNgzZtoH9/2GGHjIQiIpJpmepldAxwFXC8uy8t8wAWLICzzw7rEvz2W+hWOny4koGIVGqZakPoD9QB3jWzKWY2sEyuWlAAjz4appx46im44oowyKxDh9BuICJSiWWql9FOZX7RqVPhggvgo4/g4INhwIDQrVRERIDs6GWUfjffDC1bwldfwRNPhOUslQxERNZSORLC9tvDOeeEKSc6d1b1kIhIHJVjgZxSj04TEal8KkcJQURESqSEICIigBKCiIhElBBERARQQhARkYgSgoiIAEoIIiISUUIQEREALFMzT68PM5sP5JfxZRsAC8r4mtlO92Rduifr0j1ZV6buSY67l7igTLlKCJlgZhPdvVWm48gmuifr0j1Zl+7JurL9nqjKSEREACUEERGJKCGUbFCmA8hCuifr0j1Zl+7JurL6nqgNQUREAJUQREQkooQgIiKAEkJSzOxOM5tlZtPM7BUzq5fpmDLNzDqa2QwzKzCzrO1GVxbM7Bgzm21mX5vZ1ZmOJ9PM7DEz+9XMpmc6lmxhZo3MbLSZzYz+31yc6ZjiUUJIzrvAbu6+B/Al0CvD8WSD6UAHYFymA8kkM6sKPAi0AZoBp5pZs8xGlXFPAMdkOogsswq4zN2bAvsBF2bj34kSQhLcfYS7r4qeTgC2y2Q82cDdZ7r77EzHkQX2Ab5292/dfQXwHHBChmPKKHcfB/yW6Tiyibv/7O6To8dLgJnAtpmNal1KCKV3FvBWpoOQrLEt8H3M8x/Iwv/okj3MLBfYE/g4s5Gsa6NMB5AtzGwksFWcXb3d/bXomN6Eol9eWcaWKcncE8HibFNfbonLzGoDLwE93X1xpuMpSgkh4u5HFrffzDoDxwFHeCUZvFHSPREglAgaxTzfDvgpQ7FIFjOzaoRkkOfuL2c6nnhUZZQEMzsGuAo43t2XZjoeySqfAjub2fZmVh04BRiW4Zgky5iZAYOBme5+T6bjSUQJITn9gTrAu2Y2xcwGZjqgTDOz9mb2A7A/MNzM3sl0TJkQdTboDrxDaCh8wd1nZDaqzDKzIcBHQBMz+8HMzs50TFngQOB04PDoM2SKmbXNdFBFaeoKEREBVEIQEZGIEoKIiABKCCIiElFCEBERQAlBREQiSgjliJltZ2avmdlXZvatmfU3sxppuE5rMzsg5vn5ZnZG9PgJMzsx1deMudaQaFbZS5I4tkUqu+6ZWRcz67+B5xgTb/bXaPvEmOetzGzMhlxrPWJL6v2Z2RwzaxA9/rCEY68pYf+bZlbPzHJLO/tpcX+Hkh5KCOVENLDlZeBVd98Z2BmoCdyRhsu1Bv75j+juA939qTRcZy1mthVwgLvv4e73JvGSFkDG+nKbWWlH+m9hZm3K6Fop4e4HlHBI3IRgQRV3b+vui9bz8q3JwN9hZaaEUH4cDvzt7o8DuPtq4BL
gDDOrXfTbn5m9YWato8cPmdnEaB72G2OOmWNmN5rZZDP73Mx2jSbeOh+4JBo8c7CZ9TGzy4sGZGYtzWysmU0ys3fMbOtoew8z+yL6pv9cnNdtbGaPR9f8zMwOi3aNIHxoTjGzg4u8pqOZTTezqWY2LhoVfBNwcnT8yWa2j5l9GJ3zQzNrEr22i5m9bGZvR6WrO2LOe6aZfWlmYwmDhwq3tzOzj6NzjTSzLaPtfcxskJmNAJ4ys5pm9lz0Xp8nJOlE7gSuTfZ+RHEPNbPXgRHRN+axZvZCFPNtZtbJzD6JXrtjcbEnYmb1zWxEdPzDxMzPZGZ/Rr+3ju77lOjf4WAzuw2oGW3Li0oBM81sADAZaGQxpQ1gIzN7MrpXL5pZrejcsSWSVhZKU7kU83dooXQ4wdasUbJZtH2Mmd0e3ZMvi/4dSQncXT/l4AfoAdwbZ/tnhG/KXYD+MdvfAFpHjzePflcFxgB7RM/nABdFj7sBj0aP+wCXx5zrn+eEue5PBKoBHwINo+0nA49Fj38CakSP68WJ+TLg8ejxrsBcYGMgF5ie4P1/Dmwbe84477kusFH0+EjgpZjjvgU2ja6TT5h/aOvo2g2B6sD4wvMBm7Fm4OY5wN0x92ISUDN6fmnM+96DMPlhqzjxjwFaAaOAw6LHY0q4H10IcyUV/vu1BhZFcdcAfgRujPZdDNxXQuxr3a+Y2B4Aro8eH0uYnK9B9PzPmBh7x/wd1YndHz3OBQqA/WK2zQEaRPscODDa/hhr/qbmxFwv9r70IfHf4TTg0OjxTTHvfUzM+20LjMz0/93y9KPJ7coPI/4smvFm2yzqJDPrSpjMcGvCQi7Ton2Fk2xNIix4k6wmwG6E6TwgfEj8HO2bBuSZ2avAq3FeexDQD8DdZ5lZPrALUNzsj+OBJ8zshZiYi9oUeNLMdibcq2ox+95z9z8AzOwLIIfwQTXG3edH25+P4oAwSd3zUamnOvBdzLmGufuy6PEhhA9U3H2amU2jeDcTSglXxWxLdD8A3nX32LUFPnX3n6N4vyGUqiAkzMKSVnGxx3MI0b+9uw83s9/jHPMp8JiFCdpedfcpCc6V7+4TEuz73t3HR4+fIXzJuauE2NZhZpsSvhSMjTY9CQyNOST2bzq3tOevzFRlVH7MIHx7+oeZ1QW2BGYTvpnG/ntuHB2zPXA5YZbWPYDhhfsiy6Pfqynd7LcGzHD3FtHP7u5+dLTvWMIqYi2BSbZu/XcySWwt7n4+4YO0ETDFzOrHOex/wGh33w1oR/z3CWu/10Rzt/QjfJveHTivyLn+KhpeUm8CcPdR0bn2i9lc3P0oeq3Y91EQ87yANe+puNgThlbszrDozSGEUsnTlrhxt2i8xV2j8Hns324ysZZkff+mKz0lhPLjPaCWrentUxW4m/Affxmh2N3CzKqYWSPCSl4QqlH+Av6I6pKTadRcQpjMrzizgYZmtn8UTzUza25mVYBG7j4auBKoB9Qu8tpxQKfodbsAjaPzJWRmO7r7x+5+PbCAkBiKxrkp4QMLQvVIST4GWkd16NWAjgnO1bmYc8S+l90I1UYl6Uu4N/HOkdT9KEGysce7fhtCldNazCwH+NXdHyHM2rlXtGtldO+S0bjw7wU4FfggejyH8OUB4P9ijo/7dxiV9H6PaR84HRhb9DgpPSWEcsJDpWh74EQz+wpYCBS4e9/okPGEqoHPCcXwwuX6phLaGWYQ6m3HU7LXgfYWp3E3Jp4VhLaE281sKjCF0COkKvCMmX0eXfdeX7eXyQCganTM80AXd19O8e6MGk6nEz7ApgKjgWZRnCcTelzdambjoziKFVW99CHMzDmS6J5F+gBDzex9QgJK5CGgdlRVdCXwSRLXfROYH7Npfe5HcfqQXOyFbgQOMbPJwNGENoyiWhNKZp8RPrTvj7YPAqaZWTKLRs0EOkf3anPCvSu8/v1RvKtjji/u77Az4W9iGqEN7aYkri8l0Gyn5ZSF/tlDgA7uPinT8Yh
I+aeEICIigKqMREQkooQgIiKAEoKIiESUEEREBFBCEBGRiBKCiIgA8P9Z5I23mLI0HwAAAABJRU5ErkJggg==\n", "text/plain": [ - "" + "
" ] }, "metadata": {}, @@ -554,27 +577,27 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Notice that these points nearly fall along the straight line. This indicates that the redistuals have a distribution which is approximately Normal. \n", + "Notice that these points nearly fall along the straight line. This indicates that the residuals have a distribution which is approximately Normal. \n", "\n", "You will now make one last diagnostic plot for this regression model, known as a **residual plot**. A plot of residuals vs. predicted values (scores) shows if there is structure in the residuals. For an ideal regression model the variance or dispersion of the residuals should not change with the values of the predicted values. It has been said that the ideal residual plot should look like a 'fuzzy caterpillar' with no change vs. the predicted value. \n", "\n", "Any structure in this plot with change in predicted values indicates that the model fit changes with the predicted value. For example, if the residuals increase with predicted values the model can be said to predict only the smaller label values well. The opposite situation indicates that only large label values are well predicted. Changes in the mid-range indicate that there is some nonlinear change with predicted values. In other words, in any of these cases the model is not accurately computing the predicted values. \n", "\n", - "Execute the code in the cell below to display and examine the resdiual plot for the regression model. " + "Execute the code in the cell below to display and examine the residual plot for the regression model. 
" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { - "scrolled": true + "scrolled": false }, "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAhsAAAGJCAYAAAAjYfFoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAAPYQAAD2EBqD+naQAAIABJREFUeJzt3XmcXXV5+PFPmCSsBiEsAXFBsE9wARGouIJaEaylat1t\nq2DdteLSnxviXhVwAxfUVhR3qGtB0KqItmJlj1Z4kCAikJBkWAIJkMxkfn98z2VuJjOTmcm9c+7y\neb9e88rMuXfOfe6TM/c857udOSMjI0iSJLXLVnUHIEmSepvFhiRJaiuLDUmS1FYWG5Ikqa0sNiRJ\nUltZbEiSpLay2JAkSW1lsSFJktrKYkOSJLXV3LoDkFohIn4OPHHM5hHgTuBq4BOZ+bUWv+a7gRMy\nc2CS5xwGnA8cnpm/aPHrbwDek5nva+V+e0VEXAf8LDOPrX6eVr4i4mXAfpn5lhbE8iXgsMzce0v3\n1QmvI02XLRvqFSPApcCjgUOrr8cDLweGgK9ExJEtfs0vAI+ZYmyafWPzfijwb9P4/eOBnVsYy2wc\nB7P1OtK02LKhXrI6My8as+3CiDgPWAG8FDivVS+WmTcBN7Vqf2qvzPxN3TFI/cpiQ/3gbuAemq74\nImIO8FbgZcD9gT8Bp2bmp5qe82Dg48DjgG2BK4D3Z+a51ePvoXSjbNX0O68E3lTt83+B04E5TY9v\n8jvV9o2a+CPigcD7gacAuwK3UgqlN2bmLeO9yYh4A/Aq4EHAIPB94G2Zecc4z71f9Z7/OTM/07R9\nIbAM+JfM/GREPBV4H/BwYD3wC+CtmZnjxTBBXO+mFHpvAE4C9gKWVLFdUD2n0d30KuAdwH2Bv8vM\nn0bEE6pcHEL5v/xP4C2ZuarpNfYHPkppvVgFvHOcOMbmeBHwEeBIyv/vpVVMv46IPwIPAF4aES8B\n9s7M6yPi/sCJwBHANsCFVSyXN73OfSnHzdHVpi+wmVbkiLgKWJKZzxuz/XLgj5n5rIjYCvgX4O+B\nfYANlGPynZn58wn2u0nX0QTH7aQ5rv5e3g+8CNiTUmR/s9rP0GTvTQK7UdRb5kTEQNPX1hERwJeA\nHYAzmp57GvCeatszgDOBT0TEO+HeD9dzgO2AF1NOHIPA96siBMY0WUfE64DPUj6ojwZ+DXyejZu1\nN9vMHRHbAhcAAbwaeCrwCeCFwAcm+J0XUk6cp1JOhO8F/gE4ZbznZ+aNwM+BF4x5qHGy+3pE7A18\nD/gNJUfHVjGdM1n8E9gV+GIVz3OANcCPqiKh2QmUYu21wK8i4onATyhjb55LKVgOB34WEVsDRMSe\nlHzdh5Kjd1FysedEwUTE9sCvgMOAtwDPAtYCP46IfYBnAjdX7/VQYFlViF0IHAi8hpK7rYBfVMdZ\n47j5EaWAeSPwEkqxOjbPY30VeHoVVyPG/YD9GT1uP0Lp2vks8DTgnyjdPGdFxDab2X+zscftZnMM\nvI1SCL6Hcjx+hlL4bFLUSeOxZUO95DDK1XezEcpV9HOaWiQeQvmgfmtmnlw97ycRMQK8IyI+A8yn\nnFjfm5k/qn7vN8C7ga0Z3/HAN5oGFP4kInYEXjnN9/EXlFaHf8zMP1XbLoiIQykngfE8Ebi2qZXi\nlxFxJ5OPOfgK8O8RsVdm3lBtewHwX5m5MiKeTLl6/1BmLgeIiBuAv42I7TNzzTTe07bAKzLz69V+\nzgeupZzEXtT0vE9n5ncaP0TEh4ArM/MZTdt+DVxJKX4+SzmpDwBHZeat1XOuphR7EzmG0nJxYGb+\ntvqd/wEuowyw/GJE3AOsbHTNRcSbgJ2AQxv5iohzgasorT/PB55OaR14
Wmb+V/WcnwHXbSY/X6UU\niM8EGgOZX0hp0Tq7+nkR8PYxLVH3AP9BKUpm2k00lRw/Ebg4MxuFzy8jYi1w2wxfU33GYkO95BLg\nFZRuiz2BDwLzgOdl5h+anvfk6t+zI6J5Jsl/UgqGJ2TmDyLi98C/VQNLfwScO9HMhIhYDOzG6Imh\n4UzKFeGUZeYVwGERMSci9gUeAjwU2I9yUh3P+cArI+JS4LvADzPzG5t5qW9TrlCfD3y06iJ4PKMn\n/19Tup8ujoizgHOBn2fmxdN5P5UhSrN74z3eHRE/BI4a87wrGt9ULTyPBk4c8/90HeVE+FTKifDx\nwIWNQqPa/28i4vpJ4nkcpXvit80xUXI8kScDl1NaOZrjOY/RnD0BuKdRaFT7XVu917GzpWh6znVV\nsfMCRouNFwBnZub66jn/ABARu1AK4YcAf1M9d6ICeFLTyPH5wIcj4hfAD4BzmoseaXPsRlEvuSMz\nL8vMSzPzbMoH5c6UFobmK/yFlILk95SWkMbX/1JaQhrN739F6YI5gnLleXNEfLNqrRhrp+rfVWO2\nL5vJG6muolcACfw7pdVmDU3jP5pl5pmUK+E7KN0IF0XEtRHx3IleIzPvpHSTvLDa9HxKU/r3q8f/\nRDlB/poytuVcYHlEvH8Gb2l5Zm4Ys20FG7e8NKYqN+xE+Yx6Kxv/P60DHgbsUT1vZzbNO0ye+4XV\n60/HQkqXythYXg0sqLoydgLGG1MzlePgK8BTI2KniDgY2LfaBkBEHFy1rq2gFDivAoarh8c9LqZg\nSjnOzBMpXVvbAh8G/i8ifhsRh8/wddVnLDbUszJzBeUD8v5sPHbhNsqJ7XDg4DFfh1CapcnM5Zn5\nuszck9JP/xHg7xh/3ETjZLf7mO0Lx/w8Avf27VN9v33zEyLiRcDJlObtXTNzz8w8mrJeyGTv91uZ\neVj1ms+tYvpqNRByIl8BDqzGKTwf+HZ1hd/Y58WZ+RzKCf0plBaed0TE300WyzjG5gFKriY74a+m\n5OtjjP//dEz1vFVsmveJXrPhNso4ko1ExGOqVqqJfucC4KBxYvlLygl6FbBL8//vFGJpOJMy6PNZ\nlP+LP2bmr6q47kMp9m6nrP1xn8w8lDIAeXPGtobt0PT9VHNMZn42Mw+hdOe8lNKa8u2IsIVcm2Wx\noZ6Wmd+mXAW+sBpxD2VGBZQT+aWNL8oJ6wPAwog4NCKWR8RB1X6WZOYJwG+BB47zOn8A/kw5yTc7\nmo0HhK6u/t2radsT2NjjgFsz82ONmScRsQOlu2Dcv9mqxeU7VSx3VO/7A5Su0gkHSgI/pgyEfAPw\nKDa+kn5DRFwXEfMyc6ia8fBKylX0JjnYjG2rmS2NfW9LGd/wk4l+oWp5uRRYPOb/6feUMRKHV0/9\nKfDYiGi0dBARDwUezMR+CTy4GoTZ+J1tgO9QxinAaKtBQ2PQ7h/GxPMS4GVVy81PKTl/ZtN+51Fa\nxyaVmbdTBqT+LWUQ7VebHl5MKVhOGTMT6OnVvxN9lq9m42MNynHUeM3Jcvx+qhxHxP9ExCeq31lV\njd34FGXW0ILNvTfJilT94DhKkXBKRDwqM38XEV8DvlDNuLiY8mH+QWAppQVhPqXb4isR8V5gOaVb\n5gDKtMbxvBX4WkR8HjgLeCybjtc4h3IV+YWIOIkySPEERosQKAP9XhURJ1PGkdyPMmNid8qAwfH8\nDPhstc8fUloi3l29lysm+B0yc0NEfBN4PXBjZp4/Zp8fBr4XEZ+inHxfxejUyMb4gX2A3483xbbJ\nHOBLEXE8sJIyk2E7Ss6bnzPWO4BzIuKrlLEMcym5OIRScECZqXMsZSbJuynjdD5AGW8ykdOBfwZ+\nUP3OKspxMo9yEoXSknFgNVvjN5T/t78Hflr93wxSxlW8rPpdMvNnEfFjylif3ammF1NaUW6eJJ6G\nr1DG0mxFU+FH6U5bDbwzIoYpXR3P
qV4bYKPWsSZnAy+IiP8FrqG0SOwz5jmT5fi91XMuAN4cETdT\nZvHsBbyZMoZn3KnYUjNbNtRLxp1SmplXA5+kjNh/dbX5pZR1GV5Jafl4O/B14IjMHMnMeyhXo/9H\nOZmdR2mleEVmNp8E7n3NzPwm5eRzKGXcw9MpA1abY/kDZUrqAykngtdTZsbc1PScL1NOpM+lFA7v\noUxTfSWwc2OaJU1TGDPz85ST2pGUQuA04HfV+xl7hT7WVyifBRst514NnvwbypTSr1NOgjsBT20a\ncPvXlJPPgZt5jRFK7t8FfIMyzfRxmXntmOdspBpo+TTKye0s4MuU7oqnZLVIV3WyezylUDydUhR8\nik2LrOZ83UlpUfo1ZbrwtyjFzuGZ2RhYejKly+A84FGZuYxSQP6RMmjyB5TuhmMz89Sm13kWo7NL\nvklp8frcZvLT8ENKQfmbzLymKQ+rKcffHEp3yxlVTp5AGafT3DrWnMc3UY6Hkyj5u4NSFN9rKjmm\nDJz+IKVb5dwqN+dSCh5ps+aMjHT+yrZVf/KnKc3Lg8CnmqYsSupgMYV7yEjqbR3fshGjiyvdDDyS\n0ox7fERsbpEcSZLUATq+2KD0U18GvCYzl2bmeZRBWI+f/NckdZDOb0KV1DZd0Y3SLCIeR1kb4FXV\niHtJktTBumo2SkRcR1kz4WzKFDVJktThuqEbpdmzKaPjD6TMEJAkSR2u67pRAKrVC78K3CencHvj\nkZGRkTlzZrqaryRJfW2LT6Ad340SEbsBj8nM7zdt/j1l0aUFjH8fgo3MmTOH1avvYnh47K0Z+sfA\nwFYsWLCteTAP9zIXhXkYZS4K8zCqkYst1fHFBrA38J3qNtiNmxkdTLn185RXrhse3sDQUH8fNGAe\nGszDKHNRmIdR5qIwD63TDcXGRZTlpL9Y3Qlzb+BExr8ZliRJ6jAdP0C0urnR31LuU/Er4PPAJzLz\nU5P+oiRJ6gjd0LJBZi7HNfglSepKHd+yIUmSupvFhiRJaiuLDUmS1FYWG5Ikqa0sNiRJUltZbEiS\npLay2JAkSW1lsSFJktrKYkOSJLWVxYYkSWoriw1JktRWXXFvFM2+9UPDLFk6yPJb1rJo5+3Yf5+F\nzJs7UHdYkqQuZLGhTawfGub0c69i2eDae7ddcvVKjjlqsQWHJGna7EbRJpYsHdyo0ABYNriWJUsH\na4pIktTNLDa0ieW3rJ3WdkmSJmOxoU0s2nm7aW2XJGkyFhvaxP77LGSPhRsXFnssLINEJUmaLgeI\nahPz5g5wzFGLnY0iSWoJiw2Na97cAQ6K3eoOQ5LUA+xGkSRJbWWxIUmS2spiQ5IktZXFhiRJaiuL\nDUmS1FYWG5Ikqa0sNiRJUlu5zoYktcH6oWEXxpMqFhuS1GLrh4Y5/dyrNrp78iVXr+SYoxZbcKgv\n2Y0iSS22ZOngRoUGwLLBtSxZOlhTRFK9bNmQVJte7WpYfsvaaW2Xep3FhqRa9HJXw6Kdt5vWdqnX\n2Y0iqRa93NWw/z4L2WPhxoXFHgtLy43Uj2zZkFSLXu5qmDd3gGOOWtyTXUTSTFhsSKpFr3c1zJs7\nwEGxW91hSB3BbhRJtbCrQeoftmxIqoVdDVL/sNiQVBu7GqT+YDeKJElqK4sNSZLUVhYbkiSprbpi\nzEZE7AmcAjwJWAucCbw9M9fVGpgkSdqsrig2gG8Dg8DjgIXA6cAQ8NY6g5IkSZvX8cVGRATwl8Du\nmbmq2nYCcBIWG5IkdbxuGLOxHDiyUWhU5gA71hSPJEmaho5v2cjM24H/avwcEXOA1wE/qS0oSZI0\nZR1fbIzjJOCRwMHT+aWBgW5oxGmfxvufLA/rhoa54ppBlg+uYdHC7Tlg34XM77HVHKeSh35hLgrz\nMMpcFOZhVKtyMGdkZKQlO5oNEfER4I3A8zLze9P41e55kzVZt36YU8+8nBtW3HHvtr12uw+vf94j\n
mT+vtwoOSdK0zNnSHXRNy0ZEnAq8EnjxNAsNAFavvovh4Q2tD6xLDAxsxYIF206Yh4uuWsF1y27f\naNt1y27n5xdfzyGLe2c56c3loZ+Yi8I8jDIXhXkY1cjFluqKYiMi3g28Anh+Zn53JvsYHt7A0FB/\nHzQwcR5uWnkn4zVy3bTyTob23WUWIptdHg+jzEVhHkaZi8I8tE7HFxsRsR9wPPCvwK8iYvfGY5l5\nc22B9ZhFO283re2SJE1VN4x+OZoS5/HATdXXsupftcj++yxkj4UbFxZ7LCy3/JYkaUt0fMtGZn4E\n+EjdcfS6eXMHOOaoxSxZOsjyW9ayaOdSaMzrsdkokqTZ1/HFhmbPvLkDHBSzMxh0/dCwhY0k9QmL\nDc269UPDnH7uVSwbXHvvtkuuXskxRy224JCkHtQNYzbUY5YsHdyo0ABYNriWJUsHa4pIktROFhua\ndctvWTut7ZKk7maxoVnnNFtJ6i8WG5p1TrOVpP7iAFHNOqfZSlJ/sdhQLWZzmq0kqV52o0iSpLay\n2JAkSW1lsSFJktrKYkOSJLWVxYYkSWoriw1JktRWFhuSJKmtXGdDkmbZ+qFhF7VTX7HYkKRZtH5o\nmNPPvWqjOx9fcvVKjjlqsQWHepbdKJI0i5YsHdyo0ABYNriWJUsHa4pIaj+LDUmaRctvWTut7VIv\nsNiQpFm0aOftprVd6gUWG5I0i/bfZyF7LNy4sNhjYRkkKvUqB4hK0iyaN3eAY45a7GwU9RWLDUma\nZfPmDnBQ7FZ3GNKssRtFkiS1lcWGJElqK4sNSZLUVhYbkiSprSw2JElSW1lsSJKktrLYkCRJbWWx\nIUmS2spiQ5IktZXFhiRJaiuLDUmS1FYWG5Ikqa0sNiRJUltZbEiSpLay2JAkSW1lsSFJktpqbt0B\nTEdEbA1cDLw2M39RdzySJGnzuqZloyo0vgE8tO5YJEnS1HVFy0ZE7Ad8ve44JPWudUPDXJYrWX7L\nWhbtvB3777OQeXMH6g5rytYPDbNk6WDXxq/e1hXFBnAY8FPgeGBtzbFI6jHr1g/zxbOv5MZVa+7d\ndsnVKznmqMVdccJePzTM6edexbLB0Y/Hbopfva8rio3MPK3xfUTUGYqkHnTxlTdz0+CajbYtG1zL\nkqWDHBS71RTV1C1ZOrhRoQHdFb96X9eM2ZCkdrlx5Z3jbl9+S3c0pE4UZ7fEr97XFS0brTAw0N91\nVeP9mwfz0GAuioGBrbjfrjswhzkwZ2Sjx/bcdQfmzu38/Oy56w7MuXrluNunE7/HRGEeRrUqB31T\nbCxYsG3dIXQE81CYh1HmAg7ebz4X/nYZN6y4495te+12Hw4/+AHMn9f5Yx4OP/gB/PbaW1oWv8dE\nYR5aZ87IyMjmn9VBImIDcPg019kYWb36LoaHN7QrrI43MLAVCxZsi3kwDw3momjkYdUtd3JprmT5\n4FoWLdyOA/ZdyPwuGly5bmiYK64Z3KL4PSYK8zCqysWcLd1P37RsDA9vYGiovw8aMA8N5mGUuSgG\n5szhwH13gX1Ht3VTXraidfF7TBTmoXW6sUOqu5piJEnqc13XspGZ3dOuKUmSurJlQ5IkdZGua9mQ\nJtKPyzX343uW1H0sNtQT+nG55n58z5K6k90o6gmTLdfcq/rxPau7rR8a5pJcwTkXXscluYL1Q8N1\nh6RZYsuGekI/Ltfcj+9Z3cuWuP5my4Z6wqKdt5vW9l7Qj+9Z3cuWuP5msaGesP8+C9lj4cYn2T0W\nlgGTvaof37O6V6ta4uyK6U52o6gnzJs7wDFHLe6rmRmbe8/OVFEnaUVLnF0x3ctiQz1j3twBDord\n6g5jVk30nv1QVqfZf5+FXHL1yo2Oyem2xE3WFdNvf/vdxmJD6kF+KKvTtKL10UHR3ctiQ13FroGp\n8UNZnWhLWx8dFN29LDbUNewamDo/lNWLWtEVo3pYbKhr2DUwdX
4oqxf140DwXmGxoa5h18DU+aGs\nXtWPA8F7gcWGuoZdA9Pjh7KkTuGiXuoaLmIlSd3Jlg11DbsGJKk7WWyoq9g1IEndx24USZLUVrZs\nqCe5+JckdQ6LDfWcyRb/mjvXxjypH3kBUq8pFxsR8cWpPjczj51ZOOpls/XHPtniX49+2KKWv56k\nzubqw/WbTsvG3m2LQj1vNv/YXfxLUjNXH67flIuNzHxSOwNRb5vNP3YX/5LUzAuQ+s14zEZEzAV2\nBxqXpXOArYFDMvNrLYhNPWQ2/9i9L4ikZl6A1G9GxUZEHAGcAew6zsN3ARYb2shs/rG7+JekZl6A\n1G+mLRv/ClwKnAKcBbwYeCDwPuCY1oSmXjLbf+wu/iWpwQuQ+s202HgYcGxmLomIy4E1mXlqRNwJ\nvAX4XssiVE/wj11SnbwAqddMi41h4Pbq+2uAhwM/BX4GfLQFcakH+ccuSf1ppisc/Q44uvr+SuDx\n1fd7bXFEkiSpp8y0ZePDwH9ExDrgG8B7I+IcYH9KC4ckSRIww5aNzPwe8JfArzPzz8CRwBDwfeCV\nrQtPkiR1uxmvs5GZlzZ9fwFwQUsikiRtEe8Dok4z03U2Jr1PivdGkaR6eB+QwoKrs8y0ZWPsfVLm\nAvsAO1LGcEiSauB9QCy4OtGMio3x7pMSEXOAU4E7tjQoSdLMeB8QC65ONNOpr5vIzBHg48DLWrVP\ntdf6oWEuyRWcc+F1XJIrWD80XHdI6mIeT53B+4BYcHWiGQ8QncC+lJuxqcPZzKhW8njqHN4HxIKr\nE7VygOgC4KmU6a/qcDYzqpU8njqHtwaw4OpErRogCrCO0o3icuVdwGZGtZLHU2fp91sDWHB1npYN\nEG2niNga+AzwbGAt8NHM/NhsxtBrbGZUK3k8qdP0e8HVaaZcbETEA6b63My8fmbhTOhk4FHA4cCD\ngDMi4rrM/E6LX6dv2MyoVvJ4kjSZ6bRsXAeMTPG5LWuriojtKDNcnpaZVwBXRMSJwOsAi40ZsplR\nreTxJGky0yk2mrtODgBOAN4P/ApYDxwCvLva1koHUOK8sGnbfwPvaPHr9B2bGdVKHk+SJjLlYqO6\n/wkAEfEJ4OWZ+d2mp1weEcuAk4DPtS5E9gBWZeZQ07abgW0iYmFmDrbwtSRJ43D5b22Jmc5GCeD/\nxtl+DTDlsR1TtB1wz5htjZ+nvKbHwEDL1i/rSo33bx7MQ4O5KMzDqIlysW5omC+fl9w0uObebZf9\nYRXHPmM/5ndhwbFuaJgrrhlk+eAaFi3cngP2XbjR+/CYGNWqHMy02FgCvCEiXletHEpEzKV0bfym\nJZGNuptNi4rGz1OeV7dgwbYtC6ibmYfCPIwyF0W/5GHd+mEuvvJmblx5J/fbdQcO3m935s/buGAY\nm4tfLbmJFbfdxdymE8+K2+5i6bI7eez+e85K3K2ybv0wXzzzcm5YUd1Z4w+r+O21t/D65z1ys3nQ\nzM202PgX4EfAkRFxGWXZ84OB7YEntyi2hhuBXSJiq8zcUG1bBNyVmbdNdSerV9/F8PCGzT+xRw0M\nbMWCBduaB/NwL3NR9FMe1g0N88Wzr9yoheKCS/58bwvFRLm45vpbGBonN9dcfyv73X/HWYm9VS66\nagXXLbt9o23XLbudn198PYcsLmOO+umY2JxGLrbUTNfZ+GVEPAx4BfBwyiyVLwGfzcxlWxzVxi6n\nDEA9lDIYFeAJwEXT2cnw8AaGhvr7oAHz0GAeRpmLoh/ycFmu5MZVazbaduOqNVyWKzca3Ds2F7vd\nd1tGxpmLuNt9t+m6nN208s5x38tNK+9kaN9dNtrWD8fEbJnxvVEy84/A21sYy0Svc1dEnAGcFhHH\nAnsBbwZe0u7XlqReMtOVXntpHRUXoKvHdBb1+hnw7My8rfp+QpnZ6q6UN1FWEP0ZcDvwrsz0HiyS\nNA0zPdH20joqvVQ4dZPptG
z8CRhu+n7WZOZdwDHVlyRpBrbkRDvTdVQ6bcpsLxVO3WTOyHidV71n\n5NZb1/R139vcuVux007bYx46Jw91fwh3Ui7q1Mo81P1/OhWTxdjqY2L90DCnn3vVJsXNMUct7ri8\nNPNvY1SVizlbvJ+Z/mJEPBa4OjNXRcQ/AM8H/gf4cGM6rNQvpnuSGe9D+JKrV3b8h7Am1i3/p7O5\n0uuSpYMb5QNg2eBaliwddLXZPjOjYiMiXkkZQ/HUiFhFmYnyU+CNwHzgva0KUOp0MznJbMmHcDdc\nPfcjT6ybmumAVPWemS4Ndhzw+sz8GfAC4HeZeQTwD8BLWxSb1BUmO8lMZKYfwo3C5uwL/8TFuZKz\nL/wTp597FeuHhif9vc1ZPzTMJbmCcy68jktyxRbvrx95Yt2UMz/UMNNulL2B/6y+fypwbvX9lZQF\nt6S+MZOTzEw/hNtx9byuS5r/p6qulh9PrJty5ocaZlpsrAD2jIj1wIHA26rtBwDLWxGY1C1mcpKZ\n6YdwO66er7imd5r/6xw34Yl1U878UMNMi41vAF8D1gB/Bn4eEc8HTgX+vUWxSV1hJieZzX0IT3R1\n3o6r5+WDa8bf3oXN/3WOm/DEOr7ZHJCqzjXTYuPtwA3Ag4FPZ+ZwROwGnAa8p0WxSV1hpieZiT6E\nJ7s6b8fV86KF24+/vQub/+seN+GJVRrfTO+NsoHSitG87dQJni71vFaeZDZ3dd7qq+cD9l3IRVfe\n3BPN/46bkDrTlqyzcRTl7q+LgcdQVve8JjO/2qLYpL60uavzVl89z++h5n/HTUidaabrbDwV+C7w\nTUqhMQDMA75U3Qr+jNaFqH7T7+tI1HF13onN/zM5Dhw3IXWmmbZsvBd4W2Z+IiL+DiAz3xkRt1Na\nOyw2NCPdsgpjO3l1vmXHQScWTlK/m2mx8QjKAl5jnYUDRPtKq1shXIXRq3PwONDsGvs5dmDsWndI\nPWemxcbtwJ7A0jHbHwbcskURqWu0oxWi7tkEnaLfr849DjRbxvscu+wPq3jz3x9cY1S9Z6bLlX8N\n+ERE7A+MADtExJHAp4BvtSo4dbaZLNO9Oc4mEHgczKbmpeovumoF69b311L1432O3TS4houvvLmm\niHrTTFs2jgfuD1xe/XwZMAc4G3hnC+JSF2jH1afjFQQeB7Nl7FX9nKtX8ttrb+Hvj3gIW7HFdxXv\nChN9Xt248k72u/+OsxxN75rpOhvrgRdFxAnAIyktJL8D/gR8iHL3V/W4dlx9Ol5B4HEwW8a7qr9h\nxR1ccc0gB+67S01Rza6JPq/ut+sOsxxJb5tysRER2wAnU+7yup4y4+TtmXlN9fgRlJaNB2Cx0Rfa\ndfXZ7+MV6tYpU489DtpvwtbJwbWw7ywHU5PxPsf2XLg9B++3O2vuvLvGyHrLdFo2TgJeDnwVuAd4\nNbA6Ij6czZXEAAAV0UlEQVQEnFL9fA3w5FYHqc7k1Wfvcepxf5mwdXJh/4yNGe9z7MDYlfnzBhj/\nrkGaiekUG0cDb8jM0wAi4hzgk5SxG/9EafU4ITPvaXmU6lheffYWp5z2l/Gu6vfa7T4csG9/jY0Z\n+zk2d+5M505oItMpNnYHftz083nAg4BnA3+VmT9vXViS2mn90DCXXbOK29euZ8ft5vHwB+3EvLkD\nTjntM2Ov6vfcdQcOP/gBrLnzboaGNtQdnnrIdIqN+cCdjR+qO73eRWnt+HmrA5PUHo2ukuW3rGXu\nwFYMDW/goitv5pijFjvltA81X9XPnbuV3Qdqi1a0Ff2mBfuQNEsm6yrZf5+F7DGmv94pp5K21HSn\nvo6Ms822NqmLTNZV0o5b2EvSdIuNU6quk4atgRMj4o7mJ2XmsVscmaS22FxXiYN+JbXadIqNXwCL\nxmz7H2CX6ktSF2jMQGhu4ejVrpJOWTNE6ndTLjYy8/A2xiFpljRmIPzuuls3mY3SS1wzROoc
M703\niqQuNm/uAIcs3o2ddtqeW29d05PTHF0zROocrlwiqSe5ZojUOWzZkNSTZmPNEMeESFNjsSGpJ7X7\nNvWOCZGmzmJDajGvdjdVR07afaNAx4RIU2exITXZ0pOiV7ubqjMn7VwzxDEh0tRZbEiVVpwUvdrd\nVK/mxPvISFPnbBSpMtlJcaq82t1Ur+bE+8hIU2fLhjpCJ4xzaMVJ0avdTfVqTto9JkTqJRYbql2n\njHNoxUmx3TMgulEv58T7yEhTY7Gh2nVKn34rTope7W7KnEiy2FDtOqVPv1UnRa92N2VOpP5msaG2\nmeo4jE7q0/ekKEmt11XFRkT8CPhaZp5Rdyya3HTGYfRyn74kqUuKjYiYA5wC/BXwtZrD0RRMZxyG\nffqS1Ns6vtiIiD2BrwJ7A7fVHI6maLrjMOy+kKTe1Q2Lej0KuB44CFhdcyyaok4ahyFJqlfHt2xk\n5tnA2QARUXM0mirHYUiSGmovNiJiG+B+Ezy8LDNbMv9xYKAbGnHap/H+ZysPc+duxT/9zUO54ppB\nlg+uZdHC7Thg34XMr3kcxmznoZOZi8I8jDIXhXkY1aoc1F5sAI8GzgdGxnnsWcAPWvEiCxZs24rd\ndL3ZzsMRuy6Y1debKo+HUeaiMA+jzEVhHlqn9mIjMy9gFsaOrF59F8PDG9r9Mh1rYGArFizY1jyY\nh3uZi8I8jDIXhXkY1cjFlqq92Jgtw8MbGBrq74MGzEODeRhlLgrzMMpcFOahdeyQkiRJbdVtxcZ4\n4zokSVIH66pulMx8cN0xSJKk6em2lg1JktRlLDYkSVJbWWxIkqS2stiQJEltZbEhSZLaymJDkiS1\nlcWGJElqK4sNSZLUVhYbkiSprSw2JElSW1lsSJKktrLYkCRJbWWxIUmS2spiQ5IktZXFhiRJaiuL\nDUmS1FYWG5Ikqa3m1h2AJEmdav3QMEuWDrL8lrUs2nk79t9nIfPmDtQdVtex2JAkaRzrhoY5/dyr\nWDa49t5tl1y9kmOOWmzBMU12o0iSNI4rrhncqNAAWDa4liVLB2uKqHtZbEiSNI7lg2vG337L2nG3\na2IWG5IkjWPRwu3H377zdrMcSfez2JAkaRwH7LuQPRZuXFjssbAMEtX0OEBUkqRxzJ87wDFHLXY2\nSgtYbEiSNIF5cwc4KHarO4yuZzeKJElqK1s2NCUubCNJmimLDW3Wehe2kSRtAbtRtFlLlrqwjSRp\n5iw2tFkTLWDjwjaSpKmw2NBmTbSAjQvbSJKmwmJDm7X/Pi5sI6nzrR8a5pJcwTkXXscluYL1Q8N1\nh6SKA0S1WfNc2EZSh3Mge2ez2NCUuLCNpE422UB2P7vqZzeKJKnrOZC9s1lsSJK6ngPZO5vFhiSp\n6zmQvbM5ZkNSV3DJfE3GgeydzWJDUsdzpoGmwoHsnavji42I2BH4KPAMSrfPOcBxmXl7rYFJmjXO\nNJC6WzeM2fgc8AjgSOAIYD/g87VGJGlWOdNA6m4dXWxExHbAs4HXZublmXk5cBzwrIiYX290kmaL\nMw2k7tbRxQawgdJ9ckXTtjnAALBDLRFJmnXONJC6W0eP2cjMu4Efj9n8BmBJZt5SQ0iSauBMA6m7\n1V5sRMQ2wP0meHhZZq5teu7rgOcAT5uN2CR1DmcaSN2r9mIDeDRwPjAyzmPPAn4AEBGvAT4JvCEz\nfzrdFxkY6PQeo/ZqvH/zYB4azEVhHkaZi8I8jGpVDuaMjIx3ju8sEfEW4ETgzZn58RnsovPfpCRJ\nnWnOFu+g04uNiHgJ8EXK2hqnznA3I6tX38Xw8IYWRtZdBga2YsGCbTEP5qHBXBTmYZS5KMzDqCoX\nW1xsdEI3yoQiYifgVODLwJkRsXvTwyszc8pHwfDwBoaG+vugAfPQYB5GmYvCPIwyF4V5aJ1O75A6\nAtgeeAlwU/W1rPp3rxrjkiRJU9TRLRuZ+S3gW3XHIUmS
Zq7TWzYkSVKXs9iQJEltZbEhSZLaymJD\nkiS1lcWGJElqK4sNSZLUVhYbkiSprSw2JElSW1lsSJKktrLYkCRJbWWxIUmS2spiQ5IktZXFhiRJ\naiuLDUmS1FYWG5Ikqa0sNiRJUltZbEiSpLay2JAkSW1lsSFJktrKYkOSJLWVxYYkSWoriw1JktRW\nFhuSJKmtLDYkSVJbWWxIkqS2stiQJEltZbEhSZLaymJDkiS1lcWGJElqK4sNSZLUVhYbkiSprSw2\nJElSW1lsSJKktrLYkCRJbWWxIUmS2spiQ5IktZXFhiRJaiuLDUmS1FYWG5Ikqa0sNiRJUlvNrTuA\nzYmIXYHPAE8F1gJnAO/IzA21BiZJkqak44sN4GvABuDRwC7A14HbgA/XGZQkSZqaji42ImI+sBx4\nT2ZeC2RE/Afw+HojkyRJU9XRxUZmrgP+sfFzRDwMOBo4rbagJEnStHTNANGI+DnwW+BWyhgOSZLU\nBWpv2YiIbYD7TfDwssxcW33/emAn4FPAN4G/nc7rDAx0TV3VFo33bx7MQ4O5KMzDKHNRmIdRrcrB\nnJGRkZbsaKYi4jDgfGC8QJ6VmT8Y8/yDgIuAB2Xm9bMQoiRJ2gK1FxuTiYj7AEdl5plN27YF1gAH\nZ+altQUnSZKmpNPbiLYDvhkRj27adjAwBFxdT0iSJGk6OrplAyAizgIeBLwcuA/wBeDszHxLnXFJ\nkqSp6fSWDYBjgSuAHwPfBv4TeFutEUmSpCnr+JYNSZLU3bqhZUOSJHUxiw1JktRWFhuSJKmtLDYk\nSVJbWWxIkqS2qv3eKLMhInYEPgo8g1JgnQMcl5m31xrYLImIrSk3r3s2sBb4aGZ+rN6oZl9E7Amc\nAjyJkoczgbdXdxfuSxFxDnBzZh5bdyx1iIj5wMeBFwL3AF/MzHfWG9Xsi4i9gM8CTwQGgU9m5ifr\njWp2VZ+TFwOvzcxfVNseRFnb6THAdcAbM/O/6opxNkyQh0Mp59D9gRuAkzPz36ez335p2fgc8Ajg\nSOAIYD/g87VGNLtOBh4FHA68Bnh3RDy71ojq8W1gG+BxwAuAvwHeX2tENYqIFwBH1R1HzU4BngI8\nFXgR8PKIeHm9IdXiLOAOyufEccAHI2JaN7vsZtUJ9hvAQ8c89D3gJuAg4KvAd6vCrCeNl4eI2B34\nIfAz4JHAe4BTI2Janx0937IREdtRrugfm5mXV9uOA34REfN7/aq2ev8vA56WmVcAV0TEicDrgO/U\nGtwsiogA/hLYPTNXVdtOAE4C3lpnbHWIiJ2AE4Hf1B1LXaocHAs8OTMvqbadDDyacjXbFyLivpT3\n/LLMXAosjYjzKEXY92sNbhZExH7A18fZ/mTgwcChmXk38OGIeArlmHnf7EbZfhPlAXgm5Q7s76p+\nXhoRT6IU5+dOdf/90LKxgdJ9ckXTtjnAALBDLRHNrgMoReWFTdv+m/Lh0k+WA0c2Co3KHGDHmuKp\n28nAGcCVdQdSo8cDt2Xmfzc2ZOaJmflPNcZUh7soN7c8JiLmVoX544B+udHlYcBPKV0lc5q2Pxq4\ntCo0Gv67el4vmigP5wLHjPP8aX129nzLRnWg/HjM5jcASzLzlhpCmm17AKsyc6hp283ANhGxMDMH\na4prVlXjc+7ta42IOZTWnZ/UFlRNqiu2J1C6Fk+rOZw6PRi4LiL+AXgHMB84HfhgZvbN0sqZeU9E\nvA74FKULZQA4PTO/VGtgsyQz7/0bKHXWvfagdKE0uxnoyW6UifKQmdcD1zc9thulG/qE6ey/J4qN\niNgGuN8EDy/LzLVNz30d8BzgabMRWwfYjjLwrVnj561nOZZOchKl//HgugOZTVWf7GnAa6qTTN0h\n1WkH4C+AVwAvpZxcPk+5yv94fWHVYj/gB5QWr0dQ+uR/kpnfqDesWk302dm3n5vVufbblCJsWuMe\ne6LYoDR3nQ+MdzXy
LMofERHxGuCTwBsy86ezF16t7mbTP47Gz2vpQxHxEeCfgedlZr91I7wHuCgz\n+65FZxxDlDtJvzAzbwCIiAcCr6aPio1qHMLLgL0y8x7gsmoQ5PGUwYL96m5g5zHbtqZ/Pze3p5xL\n9wUeN6Z7abN6otjIzAvYzPiTiHgLZUDcmzPzU7MSWGe4EdglIrbKzA3VtkXAXZl5W41x1SIiTgVe\nCbw4M79Xdzw1eD6we0TcUf28NUBEPCczF9QXVi2WAXc3Co1KAvevKZ66PAr4Q1VoNFxG6VrqZzey\n6eyURZTjpq9ExH2A8yhdj0/KzGunu49+GCBKRLwE+AilRaNvrlgqlwPrgUObtj0BuKiecOoTEe+m\nNJk/PzPPqjuemhxGaSY/oPr6AWXGwQF1BlWTX1PGLu3btO2hlPUU+slNwL4R0XzxuR/wx5ri6RS/\nBh5VdT02PL7a3jeq8W3fBR4EPDEzr5rJfnqiZWMy1fS2U4EvA2dWc4YbVjZd7fekzLwrIs4ATouI\nYymDm94MvKTeyGZXNa3reOBfgV81HweZeXNtgc2yzPxz889VC8dIZvbdiSUzr64WNftS1cW6B2Ua\ndM9Na9yM/6S0+v5bRHwQWAy8vfrqZxcAf6YcH+8HjgYOoYzv6Sf/RFmj6W+A1U2fnesy89ap7qQf\nWjaOALannFxvqr6WVf/25KjicbwJuISyKMupwLsys+fnz49xNOV4P55NjwP1rxcD1wC/BL4EnJKZ\nn641olmWmaspa2rsQVl35aPA+zLz32oNrB73jvurLkT/ltJ1cjFlXYlnjul261UjjObi2ZSpsGcz\n+tl5E2Wg6JTNGRnpmxlekiSpBv3QsiFJkmpksSFJktrKYkOSJLWVxYYkSWoriw1JktRWFhuSJKmt\nLDYkSVJbWWxIkqS2stiQJElt1fP3RpF6QURcBzygadMIcCfl7pzvysxftvj1DgPOBx6UmddHxPnA\nHzPz2Cn87nbASzPzM1vw+g+k3Ajs8Mz8xUz3M8n+N3p/rd6/pI3ZsiF1hxHgJMp9GhYBewKPAW4H\nzouIdtznp/leBs8C3jDF33tL9dXK128H79UgzRJbNqTusSYzVzT9fHNEvAq4kVIMnNquF87M26bx\n9FZdxMxp0X4k1cxiQ+puw9W/dwNExB+B/wCeDuwK/F1m/jIi/h/wSkqrSAInZ+bXGzuJiCdQWk72\nrx4/vflFxnajRMQhwL8ChwJrgO8Abwb+H3BC9ZxhYO+qG+YY4F+AB1G6Rz4HnJqZI9VzHwacAjya\nckfJDzNBy0NEbA8sB96SmZ9r2n4CcGxmPigi7lu9n6OA3YBbge8D/5yZd4+zz026icZ5z/sBJwNP\nBO6g3EX5zZl5c/X4vpSC7zGUgutXVYy/G+99SP3EbhSpS0XE/YBPUcZu/LDpodcCrwOOBH4dEf9K\nKTReCzwc+CTwmapVhIjYG/gRcAnwSOB9VAXDBK+7N+VEewPwl5RWlSOAT1NO8B8F/kwpbG6IiFcA\nJwLvBh4KHA+8FfhQtb8FwE8pBcHBwKuBd030+pm5BjiLcsvvZi8Cvlx9/yXgAOCZwL7AccA/Aq+Y\naL+TiYg9gV9QCrFHAX8NLAAujIhtq6d9i5KTR1HyMkwpwqS+Z8uG1D3eERH/Un0/F5gPXAk8JzNv\nbHreDzPzfLh3sOZxwAsy87zq8T9WBcP/A06jnICXAa+rWhqujogHAB+bII5XAKuAl2Xmhup1XgY8\nNjPXRsSdwHBmrqweOx54f2aeVf3+dRGxI6XgOQF4IdAYVHoncFVEHMfkJ+ovAT+LiPtn5p+rlpaH\nVNsBfgxckJn/V/18fUT8M/CISfY5mVcDf87MNzU2RMQLgJXAc4EzgAdTirbrM3Ooas1ZPMPXk3qK\nxYbUPU6jdDVAuWq+JTPvGOd5f2j6/qHANsDXI6K5W2IAmB8RW1NaOy5rdGlUfjVJHA
8HLmkUGgCZ\neQFwwdgnRsQuwF7AhyLig00PbUUplvau9nd1VWg0v/6EYzYy8xfVDJ0XAR8B/h74n8z8Y/WUzwJH\nVyf8hwAPo3ThXDnJ+5rMgcDDI2JsvrcG9qu+fwel1ei1EfFz4DzgGzN8PamnWGxI3eOWzLx2Cs+7\nq+n7RlfpcyldAGOto4yNGNulun6S/U/22FiN/R5H6SoZ688zeP2GLwMvjoiTgOdRTvZExBzgHEqh\n9XXgm8ClwBemETds/Pm4FaXr6NVsWgTdBpCZn42IsyjjZZ5C6Y56V0Qc0GjlkfqVYzak3nYVMAQ8\nMDOvbXwBz6AMXhwBLgcOjojmk+shk+zz98CjqpM6ABHxrIj4Y0TMp2lgZzV7ZiWwz5jXPwRotHRc\nDvxFROw85vU3NzX1y5SC4lXADpRxHFDGnRxJ6V56R2Z+A7iWMnZjotaSdZQxGI33MwfYp+nx31Fa\nMG5oeg+3UloyHhERu0bEqcDWmXlGZr6EMmZkEXDYZt6H1PNs2ZB6WGaujojTgA9UXQC/Ap5E6Xpo\nnOw/Sxk8+sVqMOm+lMGcE/k08HrgtIj4OGW2x4nAf2XmumrMxk4R8RDKzJOPVK//Z+Bcykn4M8B3\nM3N9RHwTeCfwjWpMyk7AJ6bw3q6vuis+BHynqRtmOaVl5PkRsQrYhdLqsTul26OhufC4EHhjRDwN\nuAZ4I7Bj0+OfoYxV+VpEfKD63ZMpXUC/o7Ru/DXw4Ih4B2W2ykuBeygDb6W+ZsuG1B2mugDVeM87\njnLyfh+lVeJtwPGZ+QGAzFwGPBm4P+XEeBLw/oleoHr+EZTBj5dSuiq+TylAAL5NOeFfARyYmR8D\n3kQpaH4PfJwy/uTV1f7WVq+/DvhvSovFR6b4fk+ntGp8aUx8LwGOrl7vTMoskY9TZrs0NOfqo9V7\nOJNSeNxB03iLzLyO0kJxnyrG8yndVU/KzMHMHKZMs90A/AT4LaUr5elN40ikvjVnZMRF9CRJUvvY\nsiFJktrKYkOSJLWVxYYkSWoriw1JktRWFhuSJKmtLDYkSVJbWWxIkqS2stiQJEltZbEhSZLaymJD\nkiS1lcWGJElqq/8P3EJvL6an/wgAAAAASUVORK5CYII=\n", + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYQAAAEWCAYAAABmE+CbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzt3XuUXHWZ7vHvQ9NAIoHEk0SFBGIUiYajgEFxwBgRERUBXaJGxxsOyXiOAsow4g2dqEdd4HhDx0RFUBm8IEiWqFyEGHUIY0BQQuKIEUwU7CABIx1MJ77nj/1rUmn6Ut1dVfv2fNbq1VW7qvZ+d132u3/XrYjAzMxst7wDMDOzYnBCMDMzwAnBzMwSJwQzMwOcEMzMLHFCMDMzwAnBxkjSGkkLhnhsgaSNLdrOCkn/1Ip1FV3jvkp6naRrOrDNWZJC0u4tXm/LvgPWOU4IFSfpLklbJf1V0r2SLpK093jXGxFzI2JFC0K0QUTEJRFx3EjPk/RBSV/vRExWfU4I9fCyiNgbOBQ4DHh3zvFUXqvPuM06wQmhRiLiXuBqssQAgKQ9JZ0v6feS/iTpC5ImpMemSvqepAck3S/pJ5J2S4/dJenYdHtCKnlslnQHcETjdlOVxJMb7l8k6cPp9pS0jU3p9d+TNGOw+CU9WdKPJT0o6T5J3xzieT+U9LYBy26T9AplPimpJ63nl5IOaeb9S/txuqT1afvnNbwfb5L0s7Tu+4EPpuWnSlqb9u1qSQc2rO+FktalOC4A1PDYmyT9tOH+XEnXps/hT5LeI+l44D3Aq1MJ8Lb03H0lfVnSPZL+IOnDkrrSY13p875P0nrgpcPs7zmSLhuw7NOSPpNuvznt25b0niwe4b0b9DuQ7p8g6db0XfsvSU9veOxdaT+2SPq1pBcMtR0bHyeEGkkH2hcDdzYs/jjwFLIk8WRgf+Dc9NhZwEZgGvA4soPPYHOdfAB4Uvp7EfDGUYS1G/AV4EDgAGArcMEQz/0QcA0wBZgBfHaI5/0nsLD/jqSnpfVfBRwHzCfb58nAq4E/jyLelwPzgMOBk4BTGx57NrAemA58RNLJZO/ZK8jew58Al6aYpgLfAd4HTAV+Cxw12AYlTQKuA34I7Ef2Of0oIn4I/D/gmxGxd0Q8I73kYmB7et5haZ/722FOA05Iy+cBrxxmXy8FXiJpnxRHF/AqsvcXoCetax/gzcAnJR0+zPoGlV5zIbAY+F/AUmB5Olk5GHgbcERETCL7ft012m1Yc5wQ6uG7krYAG8h+xB8AkCSyA8Q7IuL+iNhCdoB5TXpdH/AE4MCI6IuIn8Tgk1+9CvhIWscG4DPNBhYRf46I70REb9r+R4DnDfH0PrID+34R8XBE/HSI510BHNpwNv464PKI+FtaxyRgDqCIWBsR9zQbL/DxtJ+/Bz5FQ+IB/hgRn42I7RGxlewA99G0je1k721/XC8B7oiIyyKiL63r3iG2eQJwb0R8Iu33loi4abAnSnocWdI/MyIeioge4JPs/ExfBXwqIjZExP3AR4fa0Yi4G7gFODktOgbojYhV6fGrIuK3kfkxWbJ+7tBv3ZBOA5ZGxE0RsSMiLgb+BhwJ7AD2BJ4mqTsi7oqI345hG9YEJ4R6ODmdXS0gOxBOTcunAROBm1NR/QGys9Bp6fHzyEoT16QqgXOGWP9+ZMmm393NBiZpoqSlku6W9BdgJTC5v4pjgH8lq1b5b2W9nE4d5DmkxHIVOw+CrwEuSY9dT1YC+RzwJ0nL+s+AmzRwP/cb4jHIktenG97b+1P8+zPgPUuJduDr+80kK0E040CgG7inYbtLyUotDNwuI39WjaWt17KzdICkF0talaqxHiBLclMHWUczMZ/VH29a10yyxH8ncCZZFVyPpG9I2m+Yddk4OCHUSDqLuwg4Py26j6yKZm5ETE5/+6YGaNKZ6FkRMRt4GfDOIepv7yH7Afc7YMDjvWSJp9/jG26fBRwMPDs
i9iGrzoGG+vSG+O+NiNMiYj+ys+/PN9ZLD3ApsFDSc4AJwA0N6/lMRDwTmEtWdXT2EOsYzMD9/GNjiAOeuwFY3PDeTo6ICRHxXwx4z1JpbSaD20BWHTeYwbb5N2Bqwzb3iYi56fGRPquBvg0sSNWNLyclBEl7klV5nQ88LiImA99nkM8tGe47sIGshNn4Pk2MiEsBIuI/I+JossQRZNWc1gZOCPXzKeCFkg6NiL8DXySr+50OIGl/SS9Kt09Q1pAr4C9kxfcdg6zzW8C7lTUQzwDePuDxW4HXpgbN49m1SmgSWVJ6QNJjSdVZg5F0inY2OG8mOzgMFg9kB6cDgSVkdex/T+s4QtKzJXUDDwEPD7OOwZyd9nMmcAYwaMN28gWy92Vu2va+kk5Jj10FzFXW0L07cDq7HiQbfQ94vKQzU736JEnPTo/9CZil1Lidqr+uAT4haR9Ju0l6kqT+9/xbwOmSZkiaAgxV6iOtbxOwgqyd53cRsTY9tAdZVc4mYLukF5O1VQxluO/AF4F/Tp+LJD1G0kvTfh4s6ZiUgB4m+66M5vOyUXBCqJn0A/8q8P606F1k1UKrUpXNdWRn7AAHpft/BW4EPj/E2IN/I6t6+B3ZwehrAx4/g6yE8QBZff53Gx77FNkZ/H3AKrIqq6EcAdwk6a/AcuCMiPjdEPv5N+By4FgaqjnIGkC/SJZQ7iZrUD4fQFnPnR8Ms32AK4GbyQ5wVwFfHuqJEXEF2dnsN9J7eztZ/T4RcR9wCvCxFMNBwM+GWM8W4IVk7+G9wG+A56eHv53+/1nSLen2G8gO2Hek/byMrC2ItO9XA7eRtQ9cPsL+Qvb+7fI+pphOJ0swm8mqk5YPs44hvwMRsZqsHeGCtK47gTelh/cke4/uS/s+nayh3tpAvkCOWXMkBXBQqtc2qxyXEMzMDHBCMDOzxFVGZmYGuIRgZmZJqSbgmjp1asyaNSvvMMzMSuXmm2++LyKmjfS8UiWEWbNmsXr16rzDMDMrFUlNzR7gKiMzMwOcEMzMLHFCMDMzwAnBzMwSJwQzMwNK1suo1Vas62HpyvVs2NzLzCkTWTx/NgvmTB/5hWZmFVTbEsKKdT2cu3wNPVseZvKEbnq2PMy5y9ewYl1P3qGZmeWitglh6cr1dHeJiXvsjpT97+4SS1euzzs0M7Nc1DYhbNjcy4TuXa/SOKG7i42be3OKyMwsX7VNCDOnTGRr364XXtrat4MZUyYO8Qozs2qrbUJYPH82fTuC3m3bicj+9+0IFs+fnXdoZma5qG1CWDBnOktOnMv0SXvx4NY+pk/aiyUnznUvIzOrrVp3O10wZ7oTgJlZUtsSgpmZ7coJwczMACcEMzNLnBDMzAxwQjAzs6TWvYzM6s4TPFojlxDMasoTPNpATghmNeUJHm0gVxmZjVJVqlk2bO5l8oTuXZZ5gsd6cwnBbBSqVM3iCR5tICcEs1GoUjWLJ3i0gZwQzEahStfR8ASPNpDbEMxGYeaUifRseZiJe+z86ZS5msUTPFojlxDMRsHVLFZlTghmo+BqFqsyVxmZjZKrWayqXEIwMzPACcHMzBInBDMzA3JMCJJmSrpB0lpJaySdkVcsZmaWb6PyduCsiLhF0iTgZknXRsQdOcZkZlZbuSWEiLgHuCfd3iJpLbA/0LaEUJVJyczM2qEQbQiSZgGHATcN8tgiSaslrd60adOYt1GlScnMzNoh94QgaW/gO8CZEfGXgY9HxLKImBcR86ZNmzbm7VRpUjIzs3bINSFI6iZLBpdExOXt3FaVJiUzM2uHPHsZCfgysDYi/r3d2/Pc72Zmw8uzhHAU8HrgGEm3pr+XtGtjnpTMzGx4efYy+imgTm1vwZzpLCFrS9i4uZcZbexl5N5MZlZGtZrcrhOTkvX3Zuru0i69mZak7ZuZFVXuvYyqxr2ZzKysnBBazL2ZzKysnBBazL2ZzKysnBBazL2ZzKysnBBazJdYNLOyqlUvo07xJRbNrIxcQjA
zM8AJwczMEicEMzMDnBDMzCxxQjAzM8C9jMxsEJ6gsZ5cQjCzXfhys/XlhGBmu/AEjfXlhGBmu/AEjfXlhGBmu/AEjfXlhGBmu/AEjfXlhGBmu/AEjfXlbqdm9iieoLGeXEIwMzPACcHMzBInBDMzA5wQzMwscUIwMzPACcHMzBInBDMzA5wQzMwscUIwMzPACcHMzBInBDMzA5wQzMwscUIwMzPACcHMzBJPf21WMivW9bB05Xo2bO5l5pSJLJ4/u9BTVZct3jrLtYQg6UJJPZJuzzMOs7JYsa6Hc5evoWfLw0ye0E3Ploc5d/kaVqzryTu0QZUt3rrLu8roIuD4nGMwK42lK9fT3SUm7rE7Uva/u0ssXbk+79AGVbZ46y7XhBARK4H784zBrEw2bO5lQnfXLssmdHexcXNvThENr2zx1l3eJYQRSVokabWk1Zs2bco7HLNczZwyka19O3ZZtrVvBzOmTMwpouGVLd66K3xCiIhlETEvIuZNmzYt73DMcrV4/mz6dgS927YTkf3v2xEsnj8779AGVbZ4667wCcHMdlowZzpLTpzL9El78eDWPqZP2oslJ84tbK+dssVbd+52alYyC+ZML9UBtWzx1lne3U4vBW4EDpa0UdJb8ozHzKzOci0hRMTCPLdvxeCBS2bF4Cqjgqv6wbJ/4FJ3l3YZuLQEKrWfZmXghFBgdThYNg5cApi4x+70btvO0pXrK7OPVkxVP9kaC/cyKrA6jPL0wCXLg6fUGJwTQoHV4WDpgUuWhzqcbI3FsFVGkrYAMdhDQETEPm2JyoDsYNmz5eFHqlOgegfLxfNnc+7yNfRu286E7i629u3wwCVruw2be5k8oXuXZaM92apildOwJYSImBQR+wzyN8nJoP3qMMpzuIFLK9b1sHDZKo7++PUsXLaq9sV5a53xlkyrWuU0qkZlSdOBvfrvR8TvWx6RPWLBnOksISvebtzcy4yKnIUMNNjApTo0qFt+xlsyrWpniKYSgqQTgU8A+wE9wIHAWmBu+0Krj+GKnnUd5VnVH5wVw3hPtlpR5VREzZYQPgQcCVwXEYdJej7gQWUt4DPhwVX1B2fFMZ6Traq27zXby6gvIv4M7CZpt4i4ATi0jXHVhns7DM69j6zIqtq+12xCeEDS3sBK4BJJnwa2ty+s+qhD19KxqOoPzqqhqrO4NltldBLwMPAO4HXAvsCSdgVVJ1Uteo5XXRrUrbyq2L7XVEKIiIca7l7cplhqyf3wh1bFH5xZkTXby6hxgNoeQDfwkMcijJ/PhM2sKJotIUxqvC/pZOBZbYmohnwmbGZFMKbZTiPiu5LOaXUwNrwqDpU3q5Oi/4abrTJ6RcPd3YB5DD7HUW21+4P2eAWzcivDb7jZbqcva/h7EbCFrOeR0Zl5TTxewazcyvAbbrYN4c3tDqTMOjHNgkfumpVbGX7DI01//VmGqRqKiNNbHlEJdeKD9ngFs3Irw294pCqj1cDNZDOcHg78Jv0dCuwY5nW10olpFjxy16zcyvAbHul6CBdHxMXAQcDzI+KzEfFZ4AV4LqNHdOKDrupQebO6KMNvWBEjdxaS9GvgORFxf7o/BVgVEQe3Ob5dzJs3L1avXt3JTTatv5eRB5eZWdFIujki5o30vGbHIXwM+IWkG9L95wEfHGNsleTBZWZWds32MvqKpB8Az06LzomIe9sXlpmZddqwbQiS5qT/h5NdLW1D+tsvLTMzs4oYqYTwTmAR2eUzBwrgmJZHZGaVVfSpG+pu2IQQEYvS/+d3Jhwzq6oyTN3QDmVKgk1NXSHpFEmT0u33Sbpc0mHtDc3MqqQMUze0WiemtWmlZnsZvT8ivi3paLK5jM4HvsDORubCKlN2tnz4O9IZZZi6odU6Ma1NKzU7uV3/MNyXAv8REVeSXSin0MqWna3z/B3pnE6M6C+asl0zvdmE8AdJS4FXAd+XtOcoXpubOhZRbXT8HemcMkzd0GplS4LNHtRfBVwNHB8RDwC
PBc5uW1QtUrbsbJ3n70jnlGHqhlYrWxJsdmBar6Qe4Giyye22p/+FVobZBS1f/o50Vt1G9JftmunNXjHtA2RXSTsY+ArQDXwdOKp9oY3f4vmzOXf5Gnq3bWdCdxdb+3YUOjtb5/k7Yu1WpiTYbC+jlwOHAbcARMQf+7uhjoek44FPA13AlyLiY+NdZ6OyZWfrPH9Hysu9w1qv2YSwLSJCUgBIesx4NyypC/gc8EJgI/BzScsj4o7xrrtRmbKz5cPfkfKp6yC3dms2IXwr9TKaLOk04FTgS+Pc9rOAOyNiPYCkb5Bdp7mlCcHMqqds/fsHU8QSTrONyudLeiHwF7J2hHMj4tpxbnt/sony+m1kkIFukhaRzafEAQccMM5NmllRjeYAWfZBbkUt4TQ9liAiro2IsyPiX4DrJb1unNvWYJsZZLvLImJeRMybNm3aODdpZkU02gGCZevfP1BRx7+MNP31PpLeLekCSccp8zZgPdnYhPHYCMxsuD8D+OM412lmJTTaA2TZ+vcPVNTxLyOVEL5GVkX0K+CfgGuAU4CTIuKkcW7758BBkp4oaQ/gNcDyca7TzEpotAfIsg9yK2oJZ6Q2hNkR8b8BJH0JuA84ICK2jHfDEbE9lTauJut2emFErBnves1seEVszBzLAMGx9A4ryr4XdfzLSAmhr/9GROyQ9LtWJIOGdX4f+H6r1mfjN5YfTFF+ZDayojZmduIAWaR9L+r4F0U8qh1354PSDuCh/rvABKA33Y6I2KftETaYN29erF69upObrJXGH0zjj3K4ovhYXtP/OieRzlu4bNWjzsR7t21n+qS9uHTRkTlGtvM70a4DZJH3vd0k3RwR80Z63khXTOsa7nGrlrH07R7La1p9pubk0rwid9ds9wDBIu97URR+CmvrnLH0fBjLa1rZ5a7s1zNYsa6HhctWcfTHr2fhslVtj7uojZmdUOd9b5YTgj1iLD+Y4V4z1MGulV3uitqfuxl5JLOyd9ccjzrve7OcEOwRY/nBDPWa58x+7JAHu1aeqRW1P3cz8khmZe+uOR513vdmNTuXkdXAWHo+DPWa4doWWtmjpMzXM8irTrvOk/nVed+b4YRguxjLD2aw17zvytuHPNi1sstdUftzN6PMycyqyQmhg+rUG2akg12rztSK1p97NJ9xmZOZVdOw4xCKpszjEMbaX7+s6ra/MPZxHEVJZlZdLRmHYK07q6/C/O2jUbQz904Yy2fsOm2D4tQeOCEMo5UDqOo4KKZuB7s6fsY2fkWaUsPdTofRym6BHhRTff6MW6fTA/byVKSxNE4Iw2hlH3cPiqk+f8atUfbR56NVpLE0TgjDaOUZnwfFdEaeZ5b+jFujSGfMnVCkkqXbEIbR6m6BdatT77Qi1MX6Mx6/urXFFKn7sUsIw/AZX7nU7cyyqop0xtwJRTrOuIQwAp/xFdNg3fTqdmZZVUU6Y+6UohxnXEKw0hmq0XHSnrvX6syyqop0xlw3LiFY6Qw1ACwiHunlU5czy6oqyhlz3TghWOkMVTX04NY+PnTSIaUdHV2U0apWX04IVjrDTZxX1jPLIvSQMnMbgpVOFQeAuYeUFYETgpVOFRsdizRa1erLVUZWSmWtGhqKL5ZjReASglkBtLsarE6TxdnYuYRQcePtuVLnni+d3Pd2Xj/CDdbWLF8xrcLGe9WyOl71rF+V9n3hslWPqo7q3bad6ZP24tJFR+YYmXVKs1dMc5VRhY2350qde75Uad/dYG3NcpVRm+VZ5TLeuX3qPDdQlfbdDdbWLJcQ2ijvC32Md9bIus062ahK+17FcRvWHk4IbZR3tcN4DwR1PpBUad+rOG7D2sNVRmPUTFVQ3tUO4+250s6eL0VXtX2v2rgNaw/3MhqDZnuguHeHmRWBexm1UbNVQVWqdjCz6nNCGINmu/G57tbMyiSXNgRJpwAfBJ4KPCsi8q8HGoXRdONz3a2ZlUVeJYTbgVcAK3Pa/ri4KsjMqiiXEkJErAW
QlMfmx61qPVDMzKAE3U4lLQIWARxwwAE5R7OTq4LMrGralhAkXQc8fpCH3hsRVza7nohYBiyDrNtpi8IzM7MB2pYQIuLYdq3bzMxaz91OzcwMyCkhSHq5pI3Ac4CrJF2dRxxmZrZTXr2MrgCuyGPbZmY2OFcZmZkZ4IRgZmaJE4KZmQFOCGZmljghmJkZ4IRgZmaJE4KZmQFOCGZmljghmJkZ4IRgZmaJE4KZmQFOCGZmljghmJkZUIJLaJqZ1dWKdT0sXbmeDZt7mdmBa7e7hGBmVkAr1vVw7vI19Gx5mMkTuunZ8jDnLl/DinU9bdumE4KZWQEtXbme7i4xcY/dkbL/3V1i6cr1bdumE4KZWQFt2NzLhO6uXZZN6O5i4+betm3TCcHMrIBmTpnI1r4duyzb2reDGVMmtm2bTghmZgW0eP5s+nYEvdu2E5H979sRLJ4/u23bdEIwMyugBXOms+TEuUyftBcPbu1j+qS9WHLi3Lb2MnK306TT3bvMzEayYM70jh6HXEIgn+5dZmZF44RAPt27zMyKxgmBfLp3mZkVjRMC+XTvMjMrGicE8uneZWbVtGJdDwuXreLoj1/PwmWrStUW6YRAPt27zKx6yt5Bxd1Ok0537zKz6mnsoAIwcY/d6d22naUr15fi+OISgplZi5S9g4oTgplZi5S9g4oTglmblLlx0cam7B1UnBDM2qDsjYs2NmXvoOJGZbM2KHvjoo1dmTuouIRg1gZlb1y0enJCMGuDsjcuWj3lkhAknSdpnaRfSrpC0uQ84jBrl7I3Llo95VVCuBY4JCKeDvwP8O6c4jBri7I3Llo95dKoHBHXNNxdBbwyjzjM2qnMjYtWT0VoQzgV+MFQD0paJGm1pNWbNm3qYFhmZvXSthKCpOuAxw/y0Hsj4sr0nPcC24FLhlpPRCwDlgHMmzcv2hCqmZnRxoQQEccO97ikNwInAC+ICB/ozcxylksbgqTjgXcBz4sId8w2MyuAvNoQLgAmAddKulXSF3KKw8zMEpWptkbSJuDuJp46FbivzeEUjfe5Puq4397n8TkwIqaN9KRSJYRmSVodEfPyjqOTvM/1Ucf99j53RhG6nZqZWQE4IZiZGVDdhLAs7wBy4H2ujzrut/e5AyrZhmBmZqNX1RKCmZmNkhOCmZkBFUsIko6X9GtJd0o6J+942k3STEk3SForaY2kM/KOqZMkdUn6haTv5R1LJ0iaLOmydC2RtZKek3dM7SbpHem7fbukSyXtlXdM7SDpQkk9km5vWPZYSddK+k36P6XdcVQmIUjqAj4HvBh4GrBQ0tPyjarttgNnRcRTgSOB/1uDfW50BrA27yA66NPADyNiDvAMKr7vkvYHTgfmRcQhQBfwmnyjapuLgOMHLDsH+FFEHAT8KN1vq8okBOBZwJ0RsT4itgHfAE7KOaa2ioh7IuKWdHsL2QFi/3yj6gxJM4CXAl/KO5ZOkLQPMB/4MkBEbIuIB/KNqiN2ByZI2h2YCPwx53jaIiJWAvcPWHwScHG6fTFwcrvjqFJC2B/Y0HB/IzU5OAJImgUcBtyUbyQd8yngX4G/5x1Ih8wGNgFfSdVkX5L0mLyDaqeI+ANwPvB74B7gwQEX16q6x0XEPZCd/AFtv9pSlRKCBllWiz61kvYGvgOcGRF/yTuedpN0AtATETfnHUsH7Q4cDvxHRBwGPEQHqhDylOrMTwKeCOwHPEbSP+YbVbVVKSFsBGY23J9BRYuXjSR1kyWDSyLi8rzj6ZCjgBMl3UVWNXiMpK/nG1LbbQQ2RkR/CfAysgRRZccCv4uITRHRB1wO/EPOMXXSnyQ9ASD972n3BquUEH4OHCTpiZL2IGt8Wp5zTG0lSWR1ymsj4t/zjqdTIuLdETEjImaRfc7XR0Slzxwj4l5gg6SD06IXAHfkGFIn/B44UtLE9F1/ARVvSB9gOfDGdPuNwJXt3mAuF8hph4jYLultwNVkvREujIg1OYfVbkcBrwd+JenWtOw
9EfH9HGOy9nk7cEk64VkPvDnneNoqIm6SdBlwC1mPul9Q0SksJF0KLACmStoIfAD4GPAtSW8hS46ntD0OT11hZmZQrSojMzMbBycEMzMDnBDMzCxxQjAzM8AJwczMEicEKwVJOyTdmma9/LakieNY14L+GVIlnTjczLhphtH/M4ZtfFDSv4w1xlavx6wZTghWFlsj4tA06+U24J8bH1Rm1N/niFgeER8b5imTgVEnBLMyckKwMvoJ8GRJs9J1AT5PNnhppqTjJN0o6ZZUktgbHrlWxjpJPwVe0b8iSW+SdEG6/ThJV0i6Lf39A9ngoCel0sl56XlnS/q5pF9K+reGdb03XY/jOuBgBpC0r6S7+hNXGoG7QVK3pNPSOm+T9J3BSkCSVkial25PTVN39F8X4ryGmBan5U+QtLKhZPXcVrz5Vl1OCFYqaRrkFwO/SosOBr7aMOHb+4BjI+JwYDXwznRRlS8CLwOeCzx+iNV/BvhxRDyDbJ6gNWQTyP02lU7OlnQccBDZdOuHAs+UNF/SM8mm0TiMLOEcMXDlEfEgcBvwvLToZcDV/fP0RMQRadtrgbeM4m15C9lMoEek7Z4m6YnAa9P6DyW7fsKtw6zDrDpTV1jlTWiYnuMnZHM47QfcHRGr0vIjyS6O9LNs6hv2AG4E5pBNkvYbgDQR3qJBtnEM8AaAiNgBPDjIVaqOS3+/SPf3JksQk4ArIqI3bWOoebS+CbwauIEsgXw+LT9E0ofJqqj2JpuCpVnHAU+X9Mp0f98U08+BC9MEiN+NCCcEG5YTgpXF1nSm+4h00H+ocRFwbUQsHPC8Q2ndVOgCPhoRSwds48wmt7Ec+KikxwLPBK5Pyy8CTo6I2yS9iWxem4G2s7NU33gpSQFvj4hHJRFJ88kuJPQ1SedFxFebiNFqylVGViWrgKMkPRkeqaN/CrAOeKKkJ6XnLRzi9T8C3ppe26XsKmVbyM7++10NnNrQNrG/pOnASuDlkiZImkRWHfQoEfFX4L/JLof5vVQSIW3jnnQ2/7oh4ruLLIkAvLJh+dXAW9NrkfQUSY+RdCDZdSO+SFaiqvp02TZOLiFYZUTEpnR2famkPdPi90XE/0haBFwl6T7gp8AgxBEhAAAAmUlEQVQhg6ziDGBZml1yB/DWiLhR0s+UXfz8B6kd4anAjamE8lfgHyPiFknfJKunv5usWmso3wS+za6lgPeTXe3ubrL2kUmPfhnnk81++Xp2liwgu4zoLOAWZUFtIrvc4gLgbEl9Kc43DBOTmWc7NTOzjKuMzMwMcEIwM7PECcHMzAAnBDMzS5wQzMwMcEIwM7PECcHMzAD4/9aLQZgGTzOLAAAAAElFTkSuQmCC\n", "text/plain": [ - "" + "
" ] }, "metadata": {}, @@ -600,7 +623,7 @@ "collapsed": true }, "source": [ - "This residual plot looks fairly well behaved. The dispursion is reasonably constant over the range of the predicted value. " + "This residual plot looks fairly well behaved. The dispersion is reasonably constant over the range of the predicted value. " ] }, { @@ -613,16 +636,14 @@ "\n", "1. Simulated a dataset. In a typical regression problem, detailed data exploration would be performed.\n", "2. Prepared the data. In this case preparation included splitting the data into training and test subsets and scaling the features. \n", - "3. Constructed the regression model using training data with Scikit-Learn.\n", + "3. Constructed the regression model using training data with scikit-learn.\n", "4. Evaluated the results of the model using the test data. In this case the residuals were found to be reasonably small and well behaved. " ] }, { "cell_type": "code", "execution_count": null, - "metadata": { - "collapsed": true - }, + "metadata": {}, "outputs": [], "source": [] } diff --git a/Module4/img/ROC_AUC.JPG b/Module4/img/ROC_AUC.JPG new file mode 100644 index 0000000..82f47fc Binary files /dev/null and b/Module4/img/ROC_AUC.JPG differ diff --git a/Module5/.ipynb_checkpoints/Bias-Variance-Trade-Off-checkpoint.ipynb b/Module5/.ipynb_checkpoints/Bias-Variance-Trade-Off-checkpoint.ipynb new file mode 100644 index 0000000..18effdf --- /dev/null +++ b/Module5/.ipynb_checkpoints/Bias-Variance-Trade-Off-checkpoint.ipynb @@ -0,0 +1,499 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Regularization and the bias-variance trade-off\n", + "\n", + "**Overfitting** is a constant danger with machine learning models. Overfit models fit the training data well. However, an overfit model will not **generalize**. A model that generalizes is a model which exhibits good performance on data cases beyond the ones used in training. 
Models that generalize will be useful in production. \n", + "\n", + "As a general rule, an overfit model has learned the training data too well. The overfitting likely involved learning noise present in the training data. The noise in the data is random and uninformative. When a new data case is presented to such a model it may produce unexpected results since the random noise will be different. \n", + "\n", + "So, what is one to do to prevent overfitting of machine learning models? The most widely used set of tools for preventing overfitting are known as **regularization methods**. Regularization methods take a number of forms, but all have the same goal: to prevent overfitting of machine learning models. \n", + "\n", + "Regularization is not free, however. While regularization reduces the **variance** in the model results, it introduces **bias**. Whereas an overfit model exhibits low bias, its variance is high. The high variance leads to unpredictable results when the model is exposed to new data cases. On the other hand, the stronger the regularization of a model, the lower the variance, but the greater the bias. This all means that when applying regularization you will need to contend with the **bias-variance trade-off**. \n", + "\n", + "To better understand the bias-variance trade-off, consider the following examples of extreme model cases:\n", + "\n", + "- If the prediction for all cases is just the mean (or median), the estimate for all cases is the same, so the variance of the estimates is zero. However, there is likely considerable bias in these estimates. \n", + "- On the other hand, consider what happens when the data are fit with a kNN model with k=1. This model fits the training data perfectly, since there is effectively one parameter per training data point, so the bias is low. However, the model will exhibit considerable variance when applied to test data. 
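These two extremes can be illustrated with a short simulation. The sketch below uses a small synthetic dataset (not the lab's data; the sine-shaped true function, noise level, and sample size are arbitrary illustrative choices) and compares a constant-mean predictor with a 1-nearest-neighbor regressor:

```python
# Sketch: two extreme models on synthetic data (illustrative assumptions only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.RandomState(42)
x = rng.uniform(0.0, 10.0, 200).reshape(-1, 1)
y = np.sin(x).ravel() + rng.normal(scale=0.5, size=200)
x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=50, random_state=42)

# Extreme 1: predict the training mean for every case.
# The predictions never vary, so the variance is zero, but the bias is large.
mse_mean = mean_squared_error(y_test, np.full(len(y_test), y_train.mean()))

# Extreme 2: 1-nearest-neighbor memorizes the training data (zero training
# error), but its predictions track the noise in the training sample, so the
# variance on new data is high.
knn = KNeighborsRegressor(n_neighbors=1).fit(x_train, y_train)
mse_train = mean_squared_error(y_train, knn.predict(x_train))
mse_test = mean_squared_error(y_test, knn.predict(x_test))

print(mse_train)  # exactly 0.0 -- the training data is memorized
print(mse_test)   # larger than the training error on new cases
print(mse_mean)   # large error from the high-bias constant model
```

Neither extreme is useful in practice; both the constant model and the 1-NN model trade one kind of error for the other rather than balancing them.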
\n", + "\n", + "In either case, these extreme models will not generalize well and will exhibit large errors on any independent test data. Any practical model must come to terms with the trade-off between bias and variance to make accurate predictions. \n", + "\n", + "To better understand this trade-off you can consider the example of the mean square error, which can be decomposed into its components. The mean square error can be written as:\n", + "\n", + "$$\\Delta x = E \\Big[ \\big(Y - \\hat{f}(X) \\big)^2 \\Big] = \\frac{1}{N} \\sum_{i=1}^N \\big(y_i - \\hat{f}(x_i) \\big)^2 $$\n", + "\n", + "Where,\n", + "$Y = $ the label vector. \n", + "$X = $ the feature matrix. \n", + "$\\hat{f}(x) = $ the trained model. \n", + "$f(x) = $ the true function generating the data, so that $Y = f(X) + \\epsilon$ with noise variance $\\sigma^2$. \n", + "\n", + "Expanding the representation of the mean square error:\n", + "\n", + "$$\\Delta x = \\big( E[ \\hat{f}(X)] - f(X) \\big)^2 + E \\big[ ( \\hat{f}(X) - E[ \\hat{f}(X)])^2 \\big] + \\sigma^2\\\\\n", + "\\Delta x = Bias^2 + Variance + Irreducible\\ Error$$\n", + "\n", + "Study this relationship. Notice that as regularization reduces variance, bias increases. The irreducible error will remain unchanged. Regularization parameters are chosen to minimize $\\Delta x$. In many cases, this will prove challenging. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load a data set\n", + "\n", + "With the above bit of theory in mind, it is time to try an example. In this example you will compute and compare linear regression models using different levels and types of regularization. \n", + "\n", + "Execute the code in the cell below to load the packages required for the rest of this notebook."
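As an aside, the decomposition above can be checked numerically by simulation. This standalone sketch (synthetic data, independent of the lab's dataset and of the cells that follow; the true function, noise level, and straight-line model are arbitrary choices) repeatedly fits a deliberately simple model to noisy draws from a known true function and compares the measured mean square error with $Bias^2 + Variance + \sigma^2$:

```python
# Sketch: verify MSE = Bias^2 + Variance + sigma^2 by Monte Carlo simulation.
import numpy as np

rng = np.random.RandomState(1)
sigma = 0.5
x = np.linspace(0.0, 1.0, 50)
f_true = np.sin(2.0 * np.pi * x)   # the true function f(x)

# Fit a (biased) straight-line model on many independent training sets,
# predicting at the same x points each time.
preds = []
for _ in range(2000):
    y = f_true + rng.normal(scale=sigma, size=x.size)
    slope, intercept = np.polyfit(x, y, 1)
    preds.append(slope * x + intercept)
preds = np.array(preds)

bias2 = np.mean((preds.mean(axis=0) - f_true) ** 2)   # squared bias
variance = np.mean(preds.var(axis=0))                 # model variance

# Measured MSE: each fitted model is scored on a fresh noisy realization.
y_new = f_true + rng.normal(scale=sigma, size=preds.shape)
mse = np.mean((y_new - preds) ** 2)

print(mse)                            # measured mean square error
print(bias2 + variance + sigma ** 2)  # the decomposition; nearly equal
```

The two printed values agree to within Monte Carlo error, which is the content of the decomposition above.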
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import scipy.stats as ss\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The data are available in a pre-processed form. The preprocessing includes the following:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizing numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Auto_Data_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Auto_Data_Labels.csv'))\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that there are 195 cases and a total of 45 features. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Split the dataset\n", + "\n", + "With the feature array loaded, you must now create randomly sampled training and test data sets. The code in the cell below uses the `train_test_split` function from the `sklearn.model_selection` module to Bernoulli sample the cases in the original dataset into the two subsets. Since this data set is small, only 40 cases will be included in the test dataset. Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(9988)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 40)\n", + "\n", + "x_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "x_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A first linear regression model\n", + "\n", + "To create a baseline for comparison, you will first create a model using all 45 features and no regularization. In the terminology used before, this model has high variance and low bias. In other words, this model is overfit. \n", + "\n", + "The code in the cell below should be familiar. In summary, it performs the following processing:\n", + "1. Define and train the linear regression model using the training features and labels.\n", + "2. Score the model using the test feature set. \n", + "3. Compute and display key performance metrics for the model using the test feature set. \n", + "4. Plot a histogram of the residuals of the model using the test partition.\n", + "5. Plot a Q-Q Normal plot of the residuals of the model using the test partition.\n", + "6. Plot the residuals of the model vs. the predicted values using the test partition. \n", + "\n", + "Execute this code and examine the results for the linear regression model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "## define and fit the linear regression model\n", + "lin_mod = linear_model.LinearRegression()\n", + "lin_mod.fit(x_train, y_train)\n", + "\n", + "def print_metrics(y_true, y_predicted):\n", + " ## First compute R^2\n", + " r2 = sklm.r2_score(y_true, y_predicted)\n", + " \n", + " ## Print the usual metrics and the R^2 value\n", + " print('Mean Square Error = ' + str(sklm.mean_squared_error(y_true, y_predicted)))\n", + " print('Root Mean Square Error = ' + str(math.sqrt(sklm.mean_squared_error(y_true, y_predicted))))\n", + " print('Mean Absolute Error = ' + str(sklm.mean_absolute_error(y_true, y_predicted)))\n", + " print('Median Absolute Error = ' + str(sklm.median_absolute_error(y_true, y_predicted)))\n", + " print('R^2 = ' + str(r2))\n", + " \n", + "def resid_plot(y_test, y_score):\n", + " ## first compute the vector of residuals \n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now make the residual plot\n", + " sns.regplot(y_score, resids, fit_reg=False)\n", + " plt.title('Residuals vs. predicted values')\n", + " plt.xlabel('Predicted values')\n", + " plt.ylabel('Residual')\n", + " plt.show()\n", + "\n", + "def hist_resids(y_test, y_score):\n", + " ## first compute the vector of residuals \n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now plot the histogram of the residuals\n", + " sns.distplot(resids)\n", + " plt.title('Histogram of residuals')\n", + " plt.xlabel('Residual value')\n", + " plt.ylabel('count')\n", + " plt.show()\n", + " \n", + "def resid_qq(y_test, y_score):\n", + " ## first compute the vector of residuals \n", + " resids = np.subtract(y_test, y_score)\n", + " ## now make the Q-Q Normal plot of the residuals\n", + " ss.probplot(resids.flatten(), plot = plt)\n", + " plt.title('Q-Q Normal plot of residuals')\n", + " plt.xlabel('Quantiles of standard Normal')\n", + " plt.ylabel('Quantiles of residuals')\n", + " plt.show()\n", + " \n", + "\n", + "y_score = lin_mod.predict(x_test) \n", + "print_metrics(y_test, y_score) \n", + "hist_resids(y_test, y_score) \n", + "resid_qq(y_test, y_score) \n", + "resid_plot(y_test, y_score) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Overall these results are reasonably good. The error metrics are relatively small. Further, the distribution of the residuals is a bit skewed, but otherwise well behaved. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply l2 regularization\n", + "\n", + "Now, you will apply **l2 regularization** to constrain the model parameters. Constraining the model parameters prevents overfitting of the model. This method is also known as **Ridge Regression**. \n", + "\n", + "But, how does this work? l2 regularization applies a **penalty** proportional to the **l2** or **Euclidean norm** of the model weights to the loss function. For linear regression using squared error as the metric, the total **loss function** is the sum of the squared error and the regularization term. The total loss function can then be written as follows: \n", + "\n", + "$$J(\\beta) = ||A \\beta + b||^2 + \\lambda ||\\beta||^2$$\n", + "\n", + "Where the penalty term on the model coefficients, $\\beta_i$, is written:\n", + "\n", + "$$\\lambda || \\beta||^2 = \\lambda \\big(\\beta_1^2 + \\beta_2^2 + \\ldots + \\beta_n^2 \\big) = \\lambda \\sum_{i=1}^n \\beta_i^2$$\n", + "\n", + "We call $||\\beta||$ the **l2 norm** of the coefficients: square each coefficient, sum the squares, and raise the sum to the power of $\\frac{1}{2}$. The penalty uses the square of this norm, $||\\beta||^2$, which is simply the sum of the squared coefficients. \n", + "\n", + "You can think of this penalty as constraining the l2 or Euclidean norm of the model weight vector. 
The value of $\\lambda$ determines how much the norm of the coefficient vector constrains the solution. You can see a geometric interpretation of the l2 penalty constraint in the figure below. \n", + "\n", + "\"Drawing\"\n", + "
**Geometric view of l2 regularization**\n", + "\n", + "Notice that for a constant value of the l2 norm, the values of the model parameters $B_1$ and $B_2$ are related. The Euclidean or l2 norm of the coefficients is shown as the dotted circle. The constant value of the l2 norm is a constant value of the penalty. Along this circle the coefficients change in relation to each other to maintain a constant l2 norm. For example, if $B_1$ is maximized then $B_2 \\sim 0$, or vice versa. It is important to note that l2 regularization is a **soft constraint**. Coefficients are driven close to, but likely not exactly to, zero. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With this bit of theory in mind, it is time to try an example of l2 regularization. The code in the cell below computes the linear regression model over a grid of 100 l2 penalty values. In a bit more detail, this code performs the following processing:\n", + "1. A grid of 100 l2 penalty parameter values is created.\n", + "2. A function is called which loops over the list of l2 penalty parameters. For each iteration, a penalized l2 regression model is computed along with the root mean squared error, which is saved in a list. The `Ridge` function from the scikit-learn `linear_model` module is used to define the l2 regularized linear regression model. The value of the regularization parameter which achieved the lowest RMSE is saved. \n", + "3. A function is called which creates two plots: one of RMSE vs. the regularization parameter, and one of the values of the regression coefficients vs. the regularization parameter.\n", + "\n", + "Execute this code and examine the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_regularization(l, train_RMSE, test_RMSE, coefs, min_idx, title): \n", + " plt.plot(l, test_RMSE, color = 'red', label = 'Test RMSE')\n", + " plt.plot(l, train_RMSE, label = 'Train RMSE') \n", + " plt.axvline(min_idx, color = 'black', linestyle = '--')\n", + " plt.legend()\n", + " plt.xlabel('Regularization parameter')\n", + " plt.ylabel('Root Mean Square Error')\n", + " plt.title(title)\n", + " plt.show()\n", + " \n", + " plt.plot(l, coefs)\n", + " plt.axvline(min_idx, color = 'black', linestyle = '--')\n", + " plt.title('Model coefficient values \\n vs. regularization parameter')\n", + " plt.xlabel('Regularization parameter')\n", + " plt.ylabel('Model coefficient value')\n", + " plt.show()\n", + "\n", + "def test_regularization_l2(x_train, y_train, x_test, y_test, l2):\n", + " train_RMSE = []\n", + " test_RMSE = []\n", + " coefs = []\n", + " for reg in l2:\n", + " lin_mod = linear_model.Ridge(alpha = reg)\n", + " lin_mod.fit(x_train, y_train)\n", + " coefs.append(lin_mod.coef_)\n", + " y_score_train = lin_mod.predict(x_train)\n", + " ## take the square root of the MSE so the saved metric really is RMSE\n", + " train_RMSE.append(math.sqrt(sklm.mean_squared_error(y_train, y_score_train)))\n", + " y_score = lin_mod.predict(x_test)\n", + " test_RMSE.append(math.sqrt(sklm.mean_squared_error(y_test, y_score)))\n", + " min_idx = np.argmin(test_RMSE)\n", + " min_l2 = l2[min_idx]\n", + " min_RMSE = test_RMSE[min_idx]\n", + " \n", + " title = 'Train and test root mean square error \\n vs. 
regularization parameter'\n", + " plot_regularization(l2, train_RMSE, test_RMSE, coefs, min_l2, title)\n", + " return min_l2, min_RMSE\n", + " \n", + "l2 = [x for x in range(1,101)]\n", + "out_l2 = test_regularization_l2(x_train, y_train, x_test, y_test, l2)\n", + "print(out_l2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results.\n", + "\n", + "The top plot shows the training and test RMSE vs the regularization parameter. The point with the minimum RMSE is shown with a dotted line. Notice that there is a minimum where the l2 parameter has a value of 14.0. To the left of the minimum variance dominates bias. To the right of the minimum bias dominates variance. In this case, the changes in RMSE are not dramatic until the bias grows significantly. \n", + "\n", + "The bottom plot shows the value of the 45 model coefficients vs. the regularization parameter. At the left the regularization penalty is small and the coefficient values show a wide range of values, giving a high variance model. To the right the coefficient values become more tightly clustered, giving a more constrained and higher bias model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, you will evaluate the model using the best l2 regularization parameter discovered above. The code in the cell below computes the regression model with the training data and computes and displays the results using the test data. \n", + "\n", + "Execute the code, and then answer **Question 1** on the course page." 
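As a side note, the shrinking effect just described can be measured directly: the l2 norm of the ridge coefficient vector decreases as the penalty grows. This is a standalone sketch on synthetic data (the matrix sizes and alpha values are arbitrary illustrative choices), separate from the lab's own cells:

```python
# Sketch: the l2 norm of the ridge coefficients shrinks as alpha increases.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 10))
beta_true = rng.normal(size=10)
y = X @ beta_true + rng.normal(scale=0.5, size=100)

# Fit ridge models with increasing penalty and record the coefficient norms.
norms = [np.linalg.norm(Ridge(alpha=a).fit(X, y).coef_)
         for a in [0.01, 1.0, 100.0]]
print(norms)  # the norm decreases as the penalty strengthens
```

The monotone decrease of the norm is exactly the increasing constraint on the weight vector described in the geometric picture above.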
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lin_mod_l2 = linear_model.Ridge(alpha = out_l2[0])\n", + "lin_mod_l2.fit(x_train, y_train)\n", + "y_score_l2 = lin_mod_l2.predict(x_test)\n", + "\n", + "print_metrics(y_test, y_score_l2)\n", + "hist_resids(y_test, y_score_l2) \n", + "resid_qq(y_test, y_score_l2) \n", + "resid_plot(y_test, y_score_l2) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the error metrics achieved to those of the un-regularized model. The error metrics for the regularized model are somewhat better. This fact indicates that the regularized model generalizes better than the unregularized model. Notice also that the residuals are a bit closer to Normally distributed than for the unregularized model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply l1 regularization\n", + "\n", + "Regularization can be performed using norms other than l2. The **l1 regularization** or **Lasso** method limits the sum of the absolute values of the model coefficients. The l1 norm is sometimes known as the **Manhattan norm**, since distances are measured as if you were traveling on a rectangular grid of streets. This is in contrast to the l2 norm that measures distance 'as the crow flies'. \n", + "\n", + "We can compute the l1 norm of the model coefficients as follows:\n", + "\n", + "$$||\\beta||^1 = |\\beta_1| + |\\beta_2| + \\ldots + |\\beta_n| = \\sum_{i=1}^n |\\beta_i|$$\n", + "\n", + "where $|\\beta_i|$ is the absolute value of $\\beta_i$. \n", + "\n", + "Notice that the l1 norm is simply the sum of the absolute values of the coefficients; no squaring or square root is involved.\n", + "\n", + "As with l2 regularization, for l1 regularization, a penalty term is multiplied by the l1 norm of the model coefficients. 
A penalty multiplier, $\\lambda$, determines how much the norm of the coefficient vector constrains values of the weights. The complete loss function is the sum of the squared errors plus the penalty term which becomes: \n", + "\n", + "$$J(\\beta) = ||A \\beta + b||^2 + \\lambda ||\\beta||^1$$\n", + "\n", + "You can see a geometric interpretation of the l1 norm penalty in the figure below. \n", + "\n", + "\"Drawing\"\n", + "
**Geometric view of L1 regularization**\n", + "\n", + "The l1 norm is constrained by the sum of the absolute values of the coefficients. This fact means that values of one parameter highly constrain another parameter. The dotted line in the figure above looks as though someone has pulled a rope or lasso around pegs on the axes. This behavior leads to the name lasso for l1 regularization. \n", + "\n", + "Notice in the figure above that if $B_1 = 0$ then $B_2$ has a value at the limit, or vice versa. In other words, using an l1 norm constraint forces some weight values to zero to allow other coefficients to take non-zero values. Thus, you can think of the l1 norm constraint **knocking** some weights **out** of the model altogether. In contrast to l2 regularization, l1 regularization does drive some coefficients to exactly zero." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below computes l1 regularized or lasso regression over a grid of regularization values. The structure of the code is nearly the same as for the l2 example. The `Lasso` function from the scikit-learn `linear_model` module is used to define the l1 regularized linear regression model.\n", + "\n", + "Execute the code and examine the results."
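Before running the lab's grid search below, the 'knocking out' behavior can be seen directly on a small synthetic problem. In this standalone sketch (the data, sizes, and alpha values are illustrative choices, and only the first five features are truly informative), the l1 penalty zeroes out coefficients while the l2 penalty only shrinks them:

```python
# Sketch: l1 (Lasso) produces exact zeros; l2 (Ridge) only shrinks coefficients.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.RandomState(0)
X = rng.normal(size=(200, 20))
beta_true = np.zeros(20)
beta_true[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]   # only 5 informative features
y = X @ beta_true + rng.normal(scale=0.5, size=200)

lasso_coefs = Lasso(alpha=0.1).fit(X, y).coef_
ridge_coefs = Ridge(alpha=10.0).fit(X, y).coef_

print(int(np.sum(lasso_coefs == 0.0)))  # several coefficients are exactly zero
print(int(np.sum(ridge_coefs == 0.0)))  # ridge coefficients are small but nonzero
```

This sparsity is why l1 regularization is often used as an implicit feature-selection method.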
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def test_regularization_l1(x_train, y_train, x_test, y_test, l1):\n", + " train_RMSE = []\n", + " test_RMSE = []\n", + " coefs = []\n", + " for reg in l1:\n", + " lin_mod = linear_model.Lasso(alpha = reg)\n", + " lin_mod.fit(x_train, y_train)\n", + " coefs.append(lin_mod.coef_)\n", + " y_score_train = lin_mod.predict(x_train)\n", + " ## take the square root of the MSE so the saved metric really is RMSE\n", + " train_RMSE.append(math.sqrt(sklm.mean_squared_error(y_train, y_score_train)))\n", + " y_score = lin_mod.predict(x_test)\n", + " test_RMSE.append(math.sqrt(sklm.mean_squared_error(y_test, y_score)))\n", + " min_idx = np.argmin(test_RMSE)\n", + " min_l1 = l1[min_idx]\n", + " min_RMSE = test_RMSE[min_idx]\n", + " \n", + " title = 'Train and test root mean square error \\n vs. regularization parameter'\n", + " plot_regularization(l1, train_RMSE, test_RMSE, coefs, min_l1, title)\n", + " return min_l1, min_RMSE\n", + " \n", + "l1 = [x/5000 for x in range(1,101)]\n", + "out_l1 = test_regularization_l1(x_train, y_train, x_test, y_test, l1)\n", + "print(out_l1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As before, the top plot shows the training and test RMSE vs. the regularization parameter. The point with the minimum RMSE is shown with a dotted line. Notice that there is a minimum where the l1 parameter has a value of 0.0044. To the left of the minimum, variance dominates bias. To the right of the minimum, bias dominates variance. Notice that the curve of RMSE has some kinks or sharp bends. This is in contrast to the smooth curve produced by l2 regularization. \n", + "\n", + "The bottom plot shows the value of the 45 model coefficients vs. the regularization parameter. At the left the regularization penalty is small and the coefficient values show a wide range of values, giving a high variance model. To the right the coefficient values become more tightly clustered, giving a more constrained and higher bias model. 
In addition, many of the parameters are being driven to zero as the regularization parameter increases. The parameters being driven to zero account for the kinks in the RMSE curve. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below computes the l1 regularized regression model with the training data and computes and displays the results using the test data. Execute this code, evaluate the results, and answer **Question 2** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lin_mod_l1 = linear_model.Lasso(alpha = out_l1[0])\n", + "lin_mod_l1.fit(x_train, y_train)\n", + "y_score_l1 = lin_mod_l1.predict(x_test)\n", + "\n", + "print_metrics(y_test, y_score_l1) \n", + "hist_resids(y_test, y_score_l1) \n", + "resid_qq(y_test, y_score_l1) \n", + "resid_plot(y_test, y_score_l1) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the performance metrics of the l1 regularized model to the metrics for the un-regularized model and the l2 regularized model. In this case the error metrics are in between the two previous models. The residuals are closer to those of the unregularized model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have explored the basics of regularization. Regularization can prevent machine learning models from being overfit. Regularization is required to help machine learning models generalize when placed in production. Selection of regularization strength involves consideration of the bias-variance trade-off. \n", + "\n", + "L2 and l1 regularization constrain model coefficients to prevent overfitting. L2 regularization constrains model coefficients using a Euclidean norm. L2 regularization can drive some coefficients toward zero, but usually not exactly to zero. 
On the other hand, l1 regularization can drive model coefficients to zero. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/.ipynb_checkpoints/Classificaton-checkpoint.ipynb b/Module5/.ipynb_checkpoints/Classificaton-checkpoint.ipynb new file mode 100644 index 0000000..fa8b89d --- /dev/null +++ b/Module5/.ipynb_checkpoints/Classificaton-checkpoint.ipynb @@ -0,0 +1,1566 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Applications of Classification\n", + "\n", + "In this lab you will perform **two-class classification** using **logistic regression**. A classifier is a machine learning model that separates the **label** into categories or **classes**. In other words, classification models are **supervised** machine learning models which predict a categorical label. \n", + "\n", + "In this case bank customer data is used to determine if a particular person is a good or bad credit risk. Thus, the customer's credit risk is the class you must predict. In this case, the cost to the bank of issuing a loan to a bad risk customer is five times that of denying a loan to a good customer. This fact will become important when evaluating the performance of the model. \n", + "\n", + "Logistic regression is a linear model but with a nonlinear response. The response is binary, $\\{ 0,1 \\}$, or positive and negative. The response is the prediction of the category. 
\n", + "\n", + "In this lab you will learn the following: \n", + "- Preparing data for classification models using Scikit-learn. \n", + "- Constructing a classification model using Scikit-learn.\n", + "- Evaluating the performance of the classification model. \n", + "- Using techniques such as reweighting the labels and changing the decision threshold to change the trade-off between false positive and false negative error rates. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Basics of logistic regression\n", + "\n", + "In this section some basic properties of the logistic regression model are presented. \n", + "\n", + "First, execute the code in the cell below to load the packages required to run this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import math\n", + "from sklearn import preprocessing\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Logistic regression is widely used as a classification model. Logistic regression is a linear model, with a binary response, `{False, True}` or `{0, 1}`. You can think of this response as having a Binomial distribution. For linear regression the response is just, well, linear. Logistic regression is a linear regression model with a nonlinear output. The response of the linear model is transformed or 'squashed' to values close to 0 and 1 using a **sigmoidal function**, also known as the **logistic function**. The result of this transformation is a response which can be interpreted as the probability of membership in each of the two classes.
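To make the 'squashing' concrete before looking at the formula, here is a minimal sketch of how a sigmoid output is turned into a class by thresholding at 0.5 (the linear scores below are invented for illustration):

```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real-valued score into the interval (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical outputs of a linear model for five cases
linear_scores = np.array([-3.0, -0.5, 0.0, 1.2, 4.0])

# Squash the scores, then assign a class by thresholding at 0.5
probabilities = sigmoid(linear_scores)
classes = (probabilities >= 0.5).astype(int)

print(np.round(probabilities, 3))  # small for negative scores, large for positive
print(classes)                     # [0 0 1 1 1]
```

Scikit-Learn's `LogisticRegression` performs this thresholding internally when you call `predict`, while `predict_proba` returns the squashed values themselves.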
\n", + "\n", + "The sigmodial or logistic function can be expressed as follows:\n", + "\n", + "$$f(x) = \\frac{1}{1 + e^{-\\kappa(x - x_0)}} \\\\\n", + "\\kappa = steepness$$\n", + "\n", + "Execute the code in the cell below to compute and plot an example of the logistic function." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5,0,'Value of output from linear regression')" + ] + }, + "execution_count": 2, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYUAAAEWCAYAAACJ0YulAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xe8FNX5x/HPQ5OIIgpYaGIBS4qKWIii2GKJNbFgNCpqlBhb7BolBn9GDComUTSoiCKKmlhQsQuWKAooIqAYxHatgA0Eaff5/XHOvSyXvXv3wh1my/f9es1rZ3bKPrs7u8+cMzPnmLsjIiIC0CjtAEREpHAoKYiISDUlBRERqaakICIi1ZQURESkmpKCiIhUU1IoYGZ2jJk9tZLrTjWzXiux3mFm9rGZzTOz7VbmtVfGqrzXPLb9ezP7Ir6n1km8xupkZp3NzM2sSSnHYWaXmNmtGdPL7Zsru4/n8bqPm9nxDb3dYmG6T6FhmNkHwMnu/kwKrz0MqHD3SxtgW+8B57j7w6scWO2v0Rl4H2jq7kuSep34Wk2B74Cd3f3NBtrmB6T0XcfX78xq+vwKKY4k9k0zuxzY3N2PbahtFjuVFKSmjYGpaQfRgDYAmrMS78kC/UYKR6ntm4XJ3TU0wAB8AOxdy7zfATOAr4BRQLuMeb8ApgPfAoOB5wlHoQAnAC/FcQMGAV/GZScDPwFOARYDi4B5wCM14wEaA5cA7wFzgYlAxxoxrhHXd+B74L34vBOOpKqWGwb8XxzvBVQA58a4PgP6ZCz7I+Ba4MMY80vxuY/idufFoUfme43r/hwYH9cbD/w8Y95Y4Argv/H9PAW0yfK5d43vpeq1nstz21fGbS/IfO9x/nCgMs6bB1wA3AGcG+e3j693WpzePH7vVte+kCX+2j6/zvE1msTl+gBvx89iJnBqxjbaAI8C38TXfBFoFOddCHwS15sO7FVIcQCXA3dR+775AXns48DfgY8JJcaJQM/4/H6E383iuP03M/aBqt9gI+DS+N6/BO4E1onzqt7/8YR9ejbwp7T/i1b5vyztAEploJakAOwZd5Zucef+J/BCnNcm7qi/ApoAZ8UdNFtS2Dfu0K0ICWIrYKM4bxjxjzpbPMD5wFvAFnHdbYDWtbyPmkmgrqSwBOgPNAUOAOYD68b5N8YfWPv4o/15/AyqfkxNMrab+V7XA74Gfhs/l6PjdOs4f2z88Xcl/DmNBQbU8n6We608t/0R8OM4v2ld3zVwIsuS8W9ibPdmzHu4rn2hltjz+vyAXwKbxe929/gddIvzrgJujt9PU6BnXG4Lwh9lu4zPabNCioOYFHLsi9XfAzn2ceBYoHX8Ps8FPgeaZ3uNjH3g5Izvb
wawKbAW8AAwvMa+dQthP9wGWAhslfb/0aoMKhon7xhgqLu/7u4LgYuBHrE+9gBgqrs/4KFO9h+EHTabxcDawJaEo8633f2zPGM4GbjU3ad78Ka7z1mF91Qzrv7uvtjdRxOOuLaI1S4nAme5+yfuvtTdX46fQV1+CfzP3Ye7+xJ3vwd4BzgoY5nb3f1dd18A3Adsm2e8+Wx7mLtPjfMX57HN54Ge8T3vBvwN2CXO2z3Oh9z7wnLq8/m5+2Pu/l78bp8nlJx6xtmLgY2AjeN39KKHf7SlhD/2rc2sqbt/4O7vFWoceah1H3f3u9x9Tvw+r42vt0We2z0GuM7dZ7r7PMJ31rvGyfW/uPsCD+es3iQkh6KlpJC8doSiJwBxx5pDOOpqRzhKqprnhOqYFbj7c8ANhKO2L8xsiJm1zDOGjoSj1yTM8eVPMs4nHFG1IdTlr8zrLveZRR8SPrMqmcmz6jUbatsfUw/xT2weITH1JFSTfGpmW7B8Uqh1X4hX2syLw83U4/Mzs/3NbJyZfWVm3xAONtrE2QMJR7pPmdlMM7sovvYM4GzCkfKXZjbSzNpl2XyhxFGXWvdxMzvXzN42s29jXOtkxFWXmvvLh4QSxwYZz63svliQlBSS9ynhBBkAZtaCUJT9hFAH3yFjnmVO1+Tu/3D37QlVG10JRWYIRdhcPiYU61fGfGDNjOkN81xvNvBDLa9bV7zLfWZRJ8Jntqry2XZd8WWb/zxwONDM3T+J08cB6wKTsr125r7g7n9197Xi0Jfcnx8Z21gD+A9wDbCBu7cCRhOqUHD3ue5+rrtvSigNnWNme8V5d7v7rjEmB67O8hKFEkddsu7jZtaTcM7iSEK1ZivCeRGLi9R3X+xEqDL9YiViLApKCg2rqZk1zxiaAHcDfcxs2/jD+Svwqrt/ADwG/NTMDo3L/oFa/nTNbAcz2yleYvk94Ye6NM7+glDnWZtbgSvMrEu8ouZn9bhefxLwGzNrbGb7EY586+TulcBQ4DozaxfX7xE/g1mEk7W1xTwa6GpmvzGzJmZ2FLA14Qh8VTXEtrN93s8DpwMvxOmxwBmE8yRV31OufWE5dXx+mZoRqkNmAUvMbH/CxQsAmNmBZrZ5POD4jrDPLDWzLcxsz7i9HwgnzpfW2HbBxJGH2vbxtQl/4rOAJmbWD8gsYX8BdM5xldk9wB/NbBMzW4vwnd3rKV4KnDQlhYY1mrBTVw2Xu/uzwGWEo6jPCEczvQHcfTZwBKEOeg7hz2kC4WRVTS0JJ7S+JhRh5xCOygBuI9TJfmNmD2VZ9zpCvftThB/kbYQTY/k4i3Bk9w2hfjXb9mtzHuHk33jCFSdXE644mU+8wifGvHPmSrEu+EDCScE5hCt8Doyf1yppoG1fBVwaYz8vPvc84Q+oKim8RChhVU2Ta1+oRdbPr8b7mQucSfh+vyac6B6VsUgX4BlC9dYrwGB3H0v4Ax9AKAl8DqxPuHqnkOPIpbZ9/EngceBdwu/mB5avHrw/Ps4xs9ezbHco4YqzFwj3ZPxASPYlSzevFZB4tFIBHOPuY9KOR0TKj0oKKTOzfc2sVSw+X0Ko6xyXclgiUqaUFNLXg3DVxGxCNc2h8TJLEZHVTtVHIiJSTSUFERGplmrTuyujTZs23rlz57TDEBEpKhMnTpzt7m3rWq7okkLnzp2ZMGFC2mGIiBQVM6t5J39Wqj4SEZFqSgoiIlJNSUFERKopKYiISDUlBRERqZZYUjCzoWb2pZlNqWW+mdk/zGyGmU02s25JxSIiIvlJsqQwjNAHam32J7Sc2IXQz/BNCcYiIiJ5SOw+BXd/wbJ0M5jhEODO2NvYuNgo3EaefxeTIgXj7LPD4/XXpxuHFIglS2DRojAsXAiLF4fnli5d8bG28czndt0VWufbBcqqSfPmtfYs3655RXxuhaRgZqcQShN06tRptQQnUh+TJtW9jBQId5g7F776Cr7+Ogzffgvz5sH339c9zJ+/7M8+848/87Gys
mFjfvHFkBhWgzSTgmV5LmvrfO4+BBgC0L17d7XgJyLL+/57+OQT+Owz+PzzFYevvlqWBL75Jhx912WNNaBFixWH9dYL89ZYA5o1W/Ex23NNm0KTJmFo3DgMVeP5PNelS/KfYZRmUqggdLZdpQOhP1QRkeW5Q0UFvP02zJwJH3wA77+/7HHWrBXXadIENtwQNtgA2rSBTTeFddddflhvvfC4zjqw1lrL//k3KbpWgBpEmu96FHC6mY0EdgK+1fkEEWH27FAfN3UqTJkSHqdOhe++W7ZM06aw8cawySZw6KHQuTN07AgbbRSGDTcMf/aNdNV9fSWWFMzsHqAX0MbMKoA/A00B3P1mQn/GBwAzgPlAn6RiEZECtWQJvP46vPIKvPpqGGbOXDa/dWv4yU/g2GPhxz8Ow+abhz9+/eEnIsmrj46uY74Df0jq9UWkALnDW2/Bc8/Bs8/C88+Hk74A7dvDTjvBqafC9tuHZLD++mDZTj9KUsqz0kxEVp/Fi8PVMw89BA8/DB99FJ7v0gV+8xvYY49wZU379unGKYCSgogkwR1eegmGDYMHHwxX/TRvDvvuC3/+M+y9N+jy8oKkpCAiDeejj+DOO0MyeO+9cEXPYYeF4Re/CFf1SEFTUhCRVTduHFx7LTzwQLhxa889Q4ngV79SIigySgoisnIqK8N5gmuvhZdfhlat4IILwoli9aNetJQURKR+3OHxx+Hii2Hy5HCvwN//DieeGKqLpKgpKYhI/saNg4suCpeSbropjBgBRx5Ztnf/liLd/SEidfv8c+jdG3r0CE1N3HBDePzNb5QQSoy+TRGpnTvcdhucfz4sWBBOHp93nqqJSpiSgohk9+67cMopoapo991hyBDo2jXtqCRhqj4SkeW5wy23wDbbhIbphgwJzVIoIZQFlRREZJl586Bv33ACee+9w41oG22UdlSyGqmkICLBlCmwww5wzz3Qvz888YQSQhlSSUFE4N57oU8faNkSnnkmNFInZUklBZFy5h7uSO7dOzRXPWmSEkKZU1IQKVdLl8LZZ4dLTI84Ap5+OvRYJmVNSUGkHC1YAEcdBf/4R0gMI0eGpq2l7Omcgki5mTsXDjwwdHxz3XXwxz+mHZEUECUFkXIydy7sv39ow2jECDg6Z6+5UoaUFETKxfffL0sI99wTziOI1KCkIFIOFi4MvZ+98kq4/PTww9OOSAqUkoJIqVu6NLRm+vTTcPvtSgiSk64+Eill7nDmmaGbzOuvhxNOSDsiKXBKCiKl7NprYfDgcC/CWWelHY0UASUFkVL14IOhH4QjjoCrr047GikSSgoipeitt+C3v4Udd4Q77oBG+qlLfrSniJSaOXPgkENC43YPPgg/+lHaEUkR0dVHIqWksjKUED75BF54Adq1SzsiKTJKCiKlZNAgePxxuOEG2GmntKORIqTqI5FS8dprcNFF4Sa1005LOxopUkoKIqXg229Dnwjt28Ntt4FZ2hFJkUo0KZjZfmY23cxmmNlFWeZ3MrMxZvaGmU02swOSjEekJLnD734HH30U2jRad920I5IillhSMLPGwI3A/sDWwNFmtnWNxS4F7nP37YDewOCk4hEpWbfcAvffD1deCT16pB2NFLkkSwo7AjPcfaa7LwJGAofUWMaBlnF8HeDTBOMRKT0zZ4b+EPbZJ9yoJrKKkkwK7YGPM6Yr4nOZLgeONbMKYDRwRrYNmdkpZjbBzCbMmjUriVhFik9VtVGTJjB0qG5QkwaR5F6U7UyX15g+Ghjm7h2AA4DhZrZCTO4+xN27u3v3tm3bJhCqSBG69VZ47jkYOBA6dEg7GikRSSaFCqBjxnQHVqweOgm4D8DdXwGaA20SjEmkNFRUhEbu9tgjlBZEGkiSSWE80MXMNjGzZoQTyaNqLPMRsBeAmW1FSAqqHxLJxR369oUlS8JJZl1+Kg0osTua3X2JmZ0OPAk0Boa6+1Qz6w9McPdRwLnALWb2R0LV0gnuXrOKSUQy3X03P
PZYuHt5s83SjkZKTKLNXLj7aMIJ5Mzn+mWMTwN2STIGkZIya1boF6FHDzgj63UZIqtElyuIFJM//SncvXzrrdC4cdrRSAlSUhApFhMnhmRw5pmwdc37QEUahpKCSDGorAzVReuvD3/+c9rRSAlT09kixWDECHjlFbj99tB5jkhCVFIQKXRz58IFF4SuNY87Lu1opMSppCBS6P7v/+Dzz+Hhh9WUhSROe5hIIZs+PdyP0KdPKCmIJExJQaSQXXghNG8OV12VdiRSJpQURArVyy+HKqMLL4QNNkg7GikTSgoihcg99Le84YZw9tlpRyNlRCeaRQrRY4/Biy/C4MHQokXa0UgZUUlBpNAsXQoXXwybbw4nn5x2NFJmVFIQKTQjRsCUKTByJDRtmnY0UmZUUhApJAsXQr9+0K0bHHFE2tFIGVJJQaSQ3HQTfPhh6DxHN6pJCrTXiRSKefPgyithr71gn33SjkbKlJKCSKG46SaYPRuuuCLtSKSMKSmIFILvv4eBA0MJoUePtKORMqakIFIIbr45dLWpvhIkZUoKImmbPx/+9rdwLmEXdVku6VJSEEnbv/4FX36pUoIUBCUFkTQtWBBKCXvsAT17ph2NSO33KZjZPwGvbb67n5lIRCLlZMiQ0IHOyJFpRyIC5C4pTAAmAs2BbsD/4rAtsDT50ERK3A8/wNVXQ69esPvuaUcjAuQoKbj7HQBmdgKwh7svjtM3A0+tluhEStntt8Nnn4W2jkQKRD7nFNoBa2dMrxWfE5GVtWRJuC9h551DSUGkQOTT9tEA4A0zGxOndwcuTywikXJw//3w/vuh/2WztKMRqVZnUnD3283scWAnwonni9z988QjEylV7jBgAGy1FRx0UNrRiCwn31ZSdwSqrpdz4JFkwhEpA088AZMnw7BhaglVCk6de6SZDQDOAqbF4UwzuyrpwERK1oAB0LEjHH102pGIrCCfw5QDgH3cfai7DwX2A36Zz8bNbD8zm25mM8zsolqWOdLMppnZVDO7O//QRYrQK6/ACy/AuedCs2ZpRyOygnyrj1oBX8XxdfJZwcwaAzcC+wAVwHgzG+Xu0zKW6QJcDOzi7l+b2fp5Ry5SjK6+GtZbT30vS8HKJylcxbKrjwzYjfBHXpcdgRnuPhPAzEYChxCqoKr8DrjR3b8GcPcv6xG7SHGZNg0efhguvxxatEg7GpGs8rn66B4zGwvsQEgKF+Z59VF74OOM6QrCFUyZugKY2X+BxsDl7v5EzQ2Z2SnAKQCdOnXK46VFCtA118Caa8Lpp6cdiUit8r30YQdCCaFnHM9Htouva7al1AToAvQCjgZuNbNWK6zkPsTdu7t797Zt2+b58iIF5PPPw53LJ54IrVunHY1IrZK8+qgC6Jgx3QH4NMsyD7v7Ynd/H5hOSBIipWXwYFi8GM46K+1IRHJK8uqj8UAXM9vEzJoBvYFRNZZ5CNgDwMzaEKqTZuYbvEhRWLAgJIWDD4bNN087GpGc8q0+yqzSyevqI3dfApwOPAm8Ddzn7lPNrL+ZHRwXexKYY2bTgDHA+e4+J8+YRIrD8OEwZw6cc07akYjUKcmrj3D30cDoGs/1yxh34Jw4iJSeykq4/nro1k2d6EhRSPLqIxF58kl4+2246y41fCdFId/qo0bAbOBroKuZ7ZZcSCIl5LrroH17OOKItCMRyUudJQUzuxo4CpgKVManHXghwbhEit/kyfDMM6GtIzVpIUUin3MKhwJbuPvCpIMRKSnXXx9uVjvllLQjEclbPtVHM4GmSQciUlKqblbr0wfWXTftaETyVmtJwcz+Sagmmg9MMrNngerSgrufmXx4IkVKN6tJkcpVfTQhPk5kxZvORKQ2CxbATTeFXtW66AZ9KS61JgV3v2N1BiJSMu66C2bP1s1qUpRyVR/d5+5HmtlbrNiQHe7+s0QjEylGlZUwaFC4WW03XbktxSdX9VFVZeiBqyMQkZJQdbPa8OG6WU2KUq7qo8/i44erL
xyRInfdddCuHRx5ZNqRiKyUXNVHc1lWbVR1yONx3N29ZcKxiRSX7+fBhGfgqqt0s5oUrVwlhbVXZyAiRa+iQjerSdHL545mzGxXoIu73x77PVg7dopTNM4+GyZNSjsKKVWT3qiE7zakV7s34FfrpR2OlKhttw03yicpn57X/gxcyLLmspsBdyUZlEjRWRjv6+zQId04RFZRPiWFw4DtgNcB3P1TMyu6qqWks6uUsQUL6NXydWjdkrGv/jTtaERWST5tHy2KneE4gJm1SDYkkSJz112wZDF06Fj3siIFLp+kcJ+Z/QtoZWa/A54Bbk02LJEiUXWz2lprQau8eqoVKWj59Lx2jZntA3wHbAH0c/enE49MpBhU3ay2ZUeWXbktUrzy6WRnf3d/HHg647m+7n5zopGJFINBg8LNauu3TTsSkQaRT/XRZWa2Z9WEmV0IHJJcSCJF4q234Omn4fTTwfLt2VaksOWzJx8M/NXMeprZlcCO8TmR8jZoULhZ7dRT045EpMHkc05htpkdTDjBPBE4PF6NJFK+qnpWO/lkWE83q0npyKftI4uPzYBNgcPNTG0fSXkbPBgWLVLPalJy1PaRSH3Nnx+SwkEHQdeuaUcj0qBylRS2dPd3zKxbtvnu/npyYYkUsGHDYM4cOP/8tCMRaXC5zimcC/wOuDbLPAf2zPK8SGlbujT0mbDjjrDrrmlHI9LgclUf/S4+7rH6whEpcA8/DO+9BwMGqGc1KUm5qo9+lWtFd3+g4cMRKWDuMHAgbLIJHHZY2tGIJCJX9dFBOeY5oKQg5eXll2HcOPjnP6Fx47SjEUlEruqjPqszEJGCd8014Z6EPvppSOlK9N58M9vPzKab2QwzuyjHcoebmZtZ9yTjEVlp774bziecdhq0UOvxUroSSwpm1hi4Edgf2Bo42sy2zrLc2sCZwKtJxSKyyq67Dpo1C+0ciZSwJEsKOwIz3H2muy8CRpK9Ib0rgL8BPyQYi8jK+/JLuOMOOO442GCDtKMRSVQ+TWdnuwrpW+Atd/8yx6rtgY8zpiuAnWpsezugo7s/ambn5YjhFOAUgE6dOtUVskjDGjwYfvgBzjkn7UhEEpdPH80nAT2AMXG6FzAO6Gpm/d19eC3rZbuIu7ohPTNrBAwCTqgrAHcfAgwB6N69uxrjk9Vn/ny44YbQpMWWW6YdjUji8kkKlcBW7v4FgJltANxEOOp/AagtKVQAmZ3WdgA+zZheG/gJMNbCTUAbAqPM7GB3n1CfNyGSmDvuCE1anFdrQVakpORzTqFzVUKIvgS6uvtXwOIc640HupjZJmbWDOgNjKqa6e7funsbd+/s7p0JpQ8lBCkcmU1a9OyZdjQiq0U+JYUXzexR4P44fTjwgpm1AL6pbSV3X2JmpwNPAo2Boe4+1cz6AxPcfVRt64oUhAcegBkz4L771KSFlA2rq78cC3U7vwJ2JZwneAn4T1od7XTv3t0nTFBhQhLmDtttF04wT51a5x3MvXqFx7FjE49MZKWY2UR3r/NesHx6XnMzewlYRDhR/Jp6XpOS99hj8OabcPvtatJCykqd5xTM7EjgNUK10ZHAq2Z2eNKBiaTGHa68EjbeGI45Ju1oRFarfM4p/AnYoeqeBDNrS+iv+d9JBiaSmjFjQsN3gwdD06ZpRyOyWuVz9VGjGjepzclzPZHidOWVsNFGavhOylI+JYUnzOxJ4J44fRQwOrmQRFI0bhw891xoEbV587SjEVnt8jnRfL6Z/RrYhXD10RB3fzDxyETScOWVoXnsU09NOxKRVORTUsDd/wP8J+FYRNL15pvw6KPQvz+stVba0YikIld3nHPJaKsocxbhStWWiUUlkoa//AVatlTz2FLWcvW8tvbqDEQkVRMnwoMPwuWXw7rrph2NSGp0FZEIQL9+4VzC2WenHYlIqpQURF55BUaPhvPPh3XWSTsakVQpKYhcdhmsvz6ccUbakYikLq+rj0RK1pgx8OyzMGgQtGiRdjQiq
VNJQcqXeygltG8PffumHY1IQVBJQcrXk0/Cf/8b2jjS3csigEoKUq4qK+HSS0NLqCedlHY0IgVDJQUpT3ffHe5NuPNOaNYs7WhECoZKClJ+FiyASy6B7bdXfwkiNaikIOVn0CD4+GMYPhwa6bhIJJN+EVJevvgCrroKDj0Udt897WhECo6SgpSXfv3ghx/g6qvTjkSkICkpSPmYMgVuvRVOOw26dk07GpGCpKQg5eP880PT2P36pR2JSMHSiWYpD489Bk88EbrZbN067WhECpZKClL65s+HP/wBtt5ajd6J1EElBSl9/fvDhx/C88/rRjWROqikIKVtyhS49lro0wd22y3taEQKnpKClK7KSvj970PHOX/7W9rRiBQFVR9J6Ro2DF56CYYOhTZt0o5GpCiopCClafbscAlqz55w/PFpRyNSNBJNCma2n5lNN7MZZnZRlvnnmNk0M5tsZs+a2cZJxiNl5Kyz4Lvv4Kab1L6RSD0k9msxs8bAjcD+wNbA0Wa2dY3F3gC6u/vPgH8DqviVVXfvvaFp7H794Mc/TjsakaKS5CHUjsAMd5/p7ouAkcAhmQu4+xh3nx8nxwEdEoxHysEnn4STyzvtBBdfnHY0IkUnyaTQHvg4Y7oiPlebk4DHs80ws1PMbIKZTZg1a1YDhiglxR1OPBEWLgzNYjfRdRQi9ZVkUrAsz3nWBc2OBboDA7PNd/ch7t7d3bu3bdu2AUOUkjJ4MDz1VGjKokuXtKMRKUpJHkpVAB0zpjsAn9ZcyMz2Bv4E7O7uCxOMR0rZ9OnhaqP99oO+fdOORqRoJVlSGA90MbNNzKwZ0BsYlbmAmW0H/As42N2/TDAWKWULF8Kxx8KPfhTuSbBshVQRyUdiJQV3X2JmpwNPAo2Boe4+1cz6AxPcfRShumgt4H4LP+SP3P3gpGKSEnXeeTBhAjzwAGy0UdrRiBS1RM/EuftoYHSN5/pljO+d5OtLGbj3XrjhBvjjH+Gww9KORqTo6a4eKV7TpsHJJ8POO8OAAWlHI1ISlBSkOH31FRx8MLRoAffdpyaxRRqILuSW4rNkCfTuDR99BGPGQMeOda8jInlRUpDi4g5nnglPPw233AK77JJ2RCIlRdVHUlyuuSY0cnfBBeF8gog0KCUFKR733BOSwVFHwVVXpR2NSElSUpDi8NhjcNxxoUvNYcPUHLZIQvTLksI3diwcfjhssw088gg0b552RCIlS0lBCtvYsfDLX8Kmm8ITT0DLlmlHJFLSlBSkcD33HBxwAHTuHMbVz7JI4pQUpDA99FBICJttFu5F2GCDtCMSKQtKClJ4hg6FX/8att02VB+tv37aEYmUDSUFKRyVlXDppXDSSbD33vDMM9C6ddpRiZQV3dEshWHevNCV5v33h6Rw003QtGnaUYmUHSUFSd/UqeGS0+nT4eqrQw9q6ihHJBVKCpKuYcPgtNNg7bVDe0Z77ZV2RCJlTecUJB3ffx+qi/r0gZ12gkmTlBBECoCSgqx+Tz0FP/1pKCVcdlk4oaxuNEUKgpKCrD6zZ4f2i/bdN5xEHjsW+veHxo3TjkxEIiUFSd7SpaFUsNVWoaXTSy+FN98MjduJSEFRUpDkuMPo0bDdduHcwWabweuvwxVXqFE7kQKlpCDJeOUV6NUrNGY3fz6MHAkvvxzOJYhIwVJSkIZTWQmPPgp77AE//3m47+DGG2HatNAxjvpAECl4uk9BVt38+TBiBFyqFjdsAAANMUlEQVR3HbzzDnToAAMHQt++sNZaaUcnIvWgpCArxz1UEQ0bBvfeC999B926heRwxBFqokKkSCkpSP7cYcqU0Kz18OHwv/9BixYhCfTpAz17qnkKkSKnpCC5LV4M48bBww+HZPDee+GPf7fd4JJLQptFqiISKRlKCrK8pUth8mR49tnQ29kLL4QmKZo1C81ZX3ghHHQQbLhh2pGKSAKUFMqZO3zyCYwfD6++Cq+9FsbnzQvzt9wSjj8+tEm0997qH1mkD
CgplIPKSqiogBkzwnmAt95aNnz9dVimSZPQ09nxx8POO8Oee0K7dunGLSKrnZJCKVi0CD77LBz1f/ppePzgg5AE3nsPZs6EhQuXLb/22uEmsiOPDI/duoW7jnWXsUjZSzQpmNl+wN+BxsCt7j6gxvw1gDuB7YE5wFHu/kGSMRW0yspQfz93bqjC+eqrMMyZk32YPTskgVmzVtzWmmvC5puH9oYOPDCMVw0dO+oqIRHJKrGkYGaNgRuBfYAKYLyZjXL3aRmLnQR87e6bm1lv4GrgqEQCcoclS8KwdGkYqsZre6xrmUWLwhF4fYb585f96VcNVdPz5+d+D40awbrrhn6LW7eGTp1CVU+7dtC+fRiqxtdbT3/8IlJvSZYUdgRmuPtMADMbCRwCZCaFQ4DL4/i/gRvMzNzdGzyagQPDlTOrU5MmsMYayw9rrhku4Vx7bWjTJozXNmQmgNatoVUrNRUhIolKMim0Bz7OmK4AdqptGXdfYmbfAq2B2ZkLmdkpwCkAnTp1WrloevYMrXM2aRLa76/tMde8zGUaN17xD7/moH4Cysa226YdgUjDSDIpZKu7qFkCyGcZ3H0IMASge/fuK1eK6NEjDCIJuP76tCMQaRhJ1kVUAB0zpjsAn9a2jJk1AdYBvkowJhERySHJpDAe6GJmm5hZM6A3MKrGMqOA4+P44cBziZxPEBGRvCRWfRTPEZwOPEm4JHWou081s/7ABHcfBdwGDDezGYQSQu+k4hERkbolep+Cu48GRtd4rl/G+A/AEUnGICIi+dP1jSIiUk1JQUREqikpiIhINSUFERGpZsV2BaiZzQI+TDuOGtpQ4y7sAldM8SrW5BRTvMUUKxRmvBu7e9u6Fiq6pFCIzGyCu3dPO458FVO8ijU5xRRvMcUKxRdvJlUfiYhINSUFERGppqTQMIakHUA9FVO8ijU5xRRvMcUKxRdvNZ1TEBGRaiopiIhINSUFERGppqTQgMzsDDObbmZTzexvacdTFzM7z8zczNqkHUsuZjbQzN4xs8lm9qCZtUo7pprMbL/43c8ws4vSjicXM+toZmPM7O24r56Vdkx1MbPGZvaGmT2adiy5mFkrM/t33F/fNrOi69lLSaGBmNkehD6nf+buPwauSTmknMysI7AP8FHaseThaeAn7v4z4F3g4pTjWY6ZNQZuBPYHtgaONrOt040qpyXAue6+FbAz8IcCjxfgLODttIPIw9+BJ9x9S2AbiiPm5SgpNJzfAwPcfSGAu3+Zcjx1GQRcQJbuTwuNuz/l7kvi5DhCL36FZEdghrvPdPdFwEjCAUJBcvfP3P31OD6X8MfVPt2oamdmHYBfAremHUsuZtYS2I3QTwzuvsjdv0k3qvpTUmg4XYGeZvaqmT1vZjukHVBtzOxg4BN3fzPtWFbCicDjaQdRQ3vg44zpCgr4TzaTmXUGtgNeTTeSnK4nHMBUph1IHTYFZgG3x6quW82sRdpB1VeineyUGjN7Btgwy6w/ET7LdQnF8R2A+8xs07S6F60j1kuAX6zeiHLLFa+7PxyX+ROh6mPE6owtD5bluYIvgZnZWsB/gLPd/bu048nGzA4EvnT3iWbWK+146tAE6Aac4e6vmtnfgYuAy9INq36UFOrB3feubZ6Z/R54ICaB18ysktAo1qzVFV+m2mI1s58CmwBvmhmEqpjXzWxHd/98NYa4nFyfLYCZHQ8cCOxVgP14VwAdM6Y7AJ+mFEtezKwpISGMcPcH0o4nh12Ag83sAKA50NLM7nL3Y1OOK5sKoMLdq0pd/yYkhaKi6qOG8xCwJ4CZdQWaUXitJOLub7n7+u7e2d07E3bkbmkmhLqY2X7AhcDB7j4/7XiyGA90MbNNzKwZoa/xUSnHVCsLRwO3AW+7+3Vpx5OLu1/s7h3ivtobeK5AEwLxN/SxmW0Rn9oLmJZiSCtFJYWGMxQYamZTgEXA8QV4RFusbgDWAJ6OpZtx7t433ZCWcfclZnY68CTQGBjq7lNTD
iuXXYDfAm+Z2aT43CWxT3VZNWcAI+LBwUygT8rx1JuauRARkWqqPhIRkWpKCiIiUk1JQUREqikpiIhINSUFERGppqRQYsxsrJntW+O5s81scB3rzUs4rraxCZA3zKznKm6rl5n9fBW3cUmOeUfEFi7HrMpr5BHDCWZ2Qxzva2bHJfl6xcDMDi70VmZLnZJC6bmHcJNPpt7x+TTtBbzj7tu5+4uruK1ewColBUJTH7U5CTjN3ffIfNLMEruvx91vdvc7k9q+BbX+3mNLr6uy/VVav4q7j3L3AQ2xLVlJ7q6hhAagNaFpjTXidGdC89gGrAU8C7wOvAUckrHevPjYC3g04/kbgBPi+PbA88BEwo1aG2V5/Y3ja0yOj52AbWMMs4BJwI9qrLMX8EaMaWhG7B8AbeJ4d2BsfD+fA5/EbfUEhgE3Ay8SmtY+MK5zAnBDxus8Gt/fAGBpXH9EjVj6AfOA6cDAuI37gUeA5+LnOBCYEuM9KuNzex64L8YwADgGeC0ut1mWz6o6PuBy4Lw4Pha4Oq77LtAzPt84vvb4+PmeGp/P+r3Gz+ptYHD8fDeu8fofxPf7EuHAYTPgifj9vghsGZfbjNA67XigP8vvK2OAu4Fp8bljY9yTgH/FmBvH76jqM/tjXPZMwh2/k4GRWT6TFfal+Pww4B/Ay4QbxA5P+3dXSkPqAWhI4EuFxzL+GC4CBsbxJkDLON4GmMGyGxhzJgWgafwRto3PH0W4c7fmaz9CuJsbQoumD8Xx6h97jeWbE1oY7Rqn7yQ00Fb1p7VcUojjlxP/QOP0sPhn1gjoQmi6o3nN1yQmhcz3W8vnNxbonhF3BbBenP41oX+HxsAGhGS3UfzcvonjaxCS1l/iOmcB12d5ncw/wOr3FF//2jh+APBMHD8FuDSOrwFMILRjlfV7JSSFSmDnWt7nB8AFGdPPAl3i+E6EJiWqPrej43hflt9Xvgc2idNbxe+/aZweDBxHOJh4OuN1WsXHT1l2ANAqy2dS2740jJCoGxH6r5iR9m+ulAZVH5WmzCqkzKojA/5qZpOBZwjNO2+Q5za3AH5CaGpiEnAp2fs16EE4cgQYDuyax3bfd/d34/QdhDbp6+s+d6909/8Rjh63XIlt1OZpd/8qju8K3OPuS939C0LpoKqZ9PEe+ipYCLwHPBWff4vwB10fVY3UTcxY9xfAcfHzf5VQKuxC7u/1Q3cfl+N17oXqFlN/Dtwft/8vQoKD8J3eH8fvrrH+a+7+fhzfi5AAxsdt7EVoTnomsKmZ/TO2Y1XVIutkQpMQxxJav60p1770UPy+p5H/Pix5UNtHpekh4Doz60aoqnk9Pn8M0BbY3t0Xm9kHhCPqTEtY/lxT1XwDprp7fbsXrKsdlWzNTmeLpWacdb2OU/t7qa/vM8ZzxbswY7wyY7qS+v/WqtZdmrGuEZplfjJzQTM7gdq/18zYs6ma3wj4xt23rWecNT+bO9x9hZ7xzGwbYF/gD8CRhCP/XxIOAA4GLjOzH9fxWpnfceZnnes7kXpSSaEEufs8QhXEUJY/wbwOoW36xbH70I2zrP4hsLWZrWFm6xCO9iDUsbet6nPWzJrW8iN+mWWllGMI9dW5vAN0NrPN4/RvCUffEKo3to/jv85YZy6wdo3tHGFmjcxsM8LR6fS4/rbx+Y6EHtKqLI7NR9fXC8BRsc/gtoQ/tddWYjsr40ng91Vxm1nX2IlLPt9rTh76U3jfzI6I27b4Rw7hfELV51/zIoZMzwKHm9n6cRvrmdnGFvoAb+Tu/yH0LdAtnvTu6O5jCB3otCKcG8lU331JGoBKCqXrHkIVROaPeATwiJlNIJwIfKfmSu7+sZndRyja/49wghJ3X2RmhwP/iMmiCaFHrJqtgZ5JaC32fMKJ5ZytRLr7D2bWh1Bt0YRwMvPmOPsvwG3x8tHMnsEeAf5tZocQWqWEkASeJ1Ql9I3b/S/wPqH6ZgrhRGyVIcBkM3vd3Y/JFWMNDxKqNd4kH
Lle4O6fm1lDVlfV5lZCVdLrsfnrWcCh5PG95ukY4CYzu5RwDmkk4X2eDdxlZucSzld9m21ld58W130q/ukvJpQMFhB6I6s6CL2YcE7mrrgvGTDI3b+JreBWqde+JA1DraRK0TOzYYST4/9OO5ZSZGZrAgvc3c2sN+Gkc8H2QS2rRiUFEanL9sANsXTyDeF8gJQolRRERKSaTjSLiEg1JQUREammpCAiItWUFEREpJqSgoiIVPt/CTO9yykpL1kAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "xseq = np.arange(-7, 7, 0.1)\n", + "\n", + "logistic = [math.exp(v)/(1 + math.exp(v)) for v in xseq]\n", + "\n", + "plt.plot(xseq, logistic, color = 'red')\n", + "plt.plot([-7,7], [0.5,0.5], color = 'blue')\n", + "plt.plot([0,0], [0,1], color = 'blue')\n", + "plt.title('Logistic function for two-class classification')\n", + "plt.ylabel('log likelihood')\n", + "plt.xlabel('Value of output from linear regression')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's make this a bit more concrete with a simple example. Say we have a linear model:\n", + "\n", + "$$\\hat{y} = \\beta_0 + \\beta_1\\ x$$\n", + "\n", + "Now, depending on the value of $\\hat{y}$ we want to classify the output from a logistic regression model as either `0` or `1`. We can use substitute the linear model into the logistic function as follows:\n", + "\n", + "$$F(\\hat{y}) = \\frac{1}{1 + e^{-\\kappa(\\beta_0 + \\beta_1\\ x)}} $$\n", + "\n", + "In this way we transform the continious output of the linear model defined on $-\\infty \\le \\hat{y} \\le \\infty$ to a binary response, $0 \\le F(\\hat{y}) \\le 1$." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load and prepare the data set\n", + "\n", + "As a first step you must load the dataset. The code in the cell below loads the dataset and assigns human-readable names to the columns. Execute this code and examine the result." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(999, 21)\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
checking_account_statusloan_duration_mocredit_historypurposeloan_amountsavings_account_balancetime_employed_yrspayment_pcnt_incomegender_statusother_signators...propertyage_yrsother_credit_outstandinghome_ownershipnumber_loansjob_categorydependentstelephoneforeign_workerbad_credit
0A1248A32A435951A61A732A92A101...A12122A143A1521A1731A191A2012
1A1412A34A462096A61A742A93A101...A12149A143A1521A1722A191A2011
2A1142A32A427882A61A742A93A103...A12245A143A1531A1732A191A2011
3A1124A33A404870A61A733A93A101...A12453A143A1532A1732A191A2012
4A1436A32A469055A65A732A93A101...A12435A143A1531A1722A192A2011
\n", + "

5 rows × 21 columns

\n", + "
" + ], + "text/plain": [ + " checking_account_status loan_duration_mo credit_history purpose \\\n", + "0 A12 48 A32 A43 \n", + "1 A14 12 A34 A46 \n", + "2 A11 42 A32 A42 \n", + "3 A11 24 A33 A40 \n", + "4 A14 36 A32 A46 \n", + "\n", + " loan_amount savings_account_balance time_employed_yrs payment_pcnt_income \\\n", + "0 5951 A61 A73 2 \n", + "1 2096 A61 A74 2 \n", + "2 7882 A61 A74 2 \n", + "3 4870 A61 A73 3 \n", + "4 9055 A65 A73 2 \n", + "\n", + " gender_status other_signators ... property age_yrs \\\n", + "0 A92 A101 ... A121 22 \n", + "1 A93 A101 ... A121 49 \n", + "2 A93 A103 ... A122 45 \n", + "3 A93 A101 ... A124 53 \n", + "4 A93 A101 ... A124 35 \n", + "\n", + " other_credit_outstanding home_ownership number_loans job_category \\\n", + "0 A143 A152 1 A173 \n", + "1 A143 A152 1 A172 \n", + "2 A143 A153 1 A173 \n", + "3 A143 A153 2 A173 \n", + "4 A143 A153 1 A172 \n", + "\n", + " dependents telephone foreign_worker bad_credit \n", + "0 1 A191 A201 2 \n", + "1 2 A191 A201 1 \n", + "2 2 A191 A201 1 \n", + "3 2 A191 A201 2 \n", + "4 2 A192 A201 1 \n", + "\n", + "[5 rows x 21 columns]" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "credit = pd.read_csv('German_Credit_UCI.csv')\n", + "credit.columns = ['checking_account_status', 'loan_duration_mo', 'credit_history', \n", + " 'purpose', 'loan_amount', 'savings_account_balance', \n", + " 'time_employed_yrs', 'payment_pcnt_income','gender_status', \n", + " 'other_signators', 'time_in_residence', 'property', 'age_yrs',\n", + " 'other_credit_outstanding', 'home_ownership', 'number_loans', \n", + " 'job_category', 'dependents', 'telephone', 'foreign_worker', \n", + " 'bad_credit']\n", + "print(credit.shape)\n", + "credit.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are 20 features plus a label column. These features represent information a bank might have on its customers. 
However, the categorical features are coded in a way that makes them hard to understand. Further, the label is coded as $\\{ 1,2 \\}$, which is a bit awkward. \n", + "\n", + "The code in the cell below uses a list of dictionaries to recode the categorical features with human-readable text. The final dictionary in the list recodes good and bad credit as a binary variable, $\\{ 0,1 \\}$. Execute this code and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
checking_account_statusloan_duration_mocredit_historypurposeloan_amountsavings_account_balancetime_employed_yrspayment_pcnt_incomegender_statusother_signators...propertyage_yrsother_credit_outstandinghome_ownershipnumber_loansjob_categorydependentstelephoneforeign_workerbad_credit
00 - 200 DM48current loans paidradio/television5951< 100 DM1 - 4 years2female-divorced/separated/marriednone...real estate22noneown1skilled1noneyes1
1none12critical account - other non-bank loanseducation2096< 100 DM4 - 7 years2male-singlenone...real estate49noneown1unskilled-resident2noneyes0
2< 0 DM42current loans paidfurniture/equipment7882< 100 DM4 - 7 years2male-singleguarantor...building society savings/life insurance45nonefor free1skilled2noneyes0
3< 0 DM24past payment delayscar (new)4870< 100 DM1 - 4 years3male-singlenone...unknown-none53nonefor free2skilled2noneyes1
4none36current loans paideducation9055unknown/none1 - 4 years2male-singlenone...unknown-none35nonefor free1unskilled-resident2yesyes0
\n", + "

5 rows × 21 columns

\n", + "
" + ], + "text/plain": [ + " checking_account_status loan_duration_mo \\\n", + "0 0 - 200 DM 48 \n", + "1 none 12 \n", + "2 < 0 DM 42 \n", + "3 < 0 DM 24 \n", + "4 none 36 \n", + "\n", + " credit_history purpose loan_amount \\\n", + "0 current loans paid radio/television 5951 \n", + "1 critical account - other non-bank loans education 2096 \n", + "2 current loans paid furniture/equipment 7882 \n", + "3 past payment delays car (new) 4870 \n", + "4 current loans paid education 9055 \n", + "\n", + " savings_account_balance time_employed_yrs payment_pcnt_income \\\n", + "0 < 100 DM 1 - 4 years 2 \n", + "1 < 100 DM 4 - 7 years 2 \n", + "2 < 100 DM 4 - 7 years 2 \n", + "3 < 100 DM 1 - 4 years 3 \n", + "4 unknown/none 1 - 4 years 2 \n", + "\n", + " gender_status other_signators ... \\\n", + "0 female-divorced/separated/married none ... \n", + "1 male-single none ... \n", + "2 male-single guarantor ... \n", + "3 male-single none ... \n", + "4 male-single none ... \n", + "\n", + " property age_yrs other_credit_outstanding \\\n", + "0 real estate 22 none \n", + "1 real estate 49 none \n", + "2 building society savings/life insurance 45 none \n", + "3 unknown-none 53 none \n", + "4 unknown-none 35 none \n", + "\n", + " home_ownership number_loans job_category dependents telephone \\\n", + "0 own 1 skilled 1 none \n", + "1 own 1 unskilled-resident 2 none \n", + "2 for free 1 skilled 2 none \n", + "3 for free 2 skilled 2 none \n", + "4 for free 1 unskilled-resident 2 yes \n", + "\n", + " foreign_worker bad_credit \n", + "0 yes 1 \n", + "1 yes 0 \n", + "2 yes 0 \n", + "3 yes 1 \n", + "4 yes 0 \n", + "\n", + "[5 rows x 21 columns]" + ] + }, + "execution_count": 4, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "code_list = [['checking_account_status', \n", + " {'A11' : '< 0 DM', \n", + " 'A12' : '0 - 200 DM', \n", + " 'A13' : '> 200 DM or salary assignment', \n", + " 'A14' : 'none'}],\n", + " ['credit_history',\n", + " {'A30' : 'no credit - paid', 
\n", + " 'A31' : 'all loans at bank paid', \n", + " 'A32' : 'current loans paid', \n", + " 'A33' : 'past payment delays', \n", + " 'A34' : 'critical account - other non-bank loans'}],\n", + " ['purpose',\n", + " {'A40' : 'car (new)', \n", + " 'A41' : 'car (used)',\n", + " 'A42' : 'furniture/equipment',\n", + " 'A43' : 'radio/television', \n", + " 'A44' : 'domestic appliances', \n", + " 'A45' : 'repairs', \n", + " 'A46' : 'education', \n", + " 'A47' : 'vacation',\n", + " 'A48' : 'retraining',\n", + " 'A49' : 'business', \n", + " 'A410' : 'other' }],\n", + " ['savings_account_balance',\n", + " {'A61' : '< 100 DM', \n", + " 'A62' : '100 - 500 DM', \n", + " 'A63' : '500 - 1000 DM', \n", + " 'A64' : '>= 1000 DM',\n", + " 'A65' : 'unknown/none' }],\n", + " ['time_employed_yrs',\n", + " {'A71' : 'unemployed',\n", + " 'A72' : '< 1 year', \n", + " 'A73' : '1 - 4 years', \n", + " 'A74' : '4 - 7 years', \n", + " 'A75' : '>= 7 years'}],\n", + " ['gender_status',\n", + " {'A91' : 'male-divorced/separated', \n", + " 'A92' : 'female-divorced/separated/married',\n", + " 'A93' : 'male-single', \n", + " 'A94' : 'male-married/widowed', \n", + " 'A95' : 'female-single'}],\n", + " ['other_signators',\n", + " {'A101' : 'none', \n", + " 'A102' : 'co-applicant', \n", + " 'A103' : 'guarantor'}],\n", + " ['property',\n", + " {'A121' : 'real estate',\n", + " 'A122' : 'building society savings/life insurance', \n", + " 'A123' : 'car or other',\n", + " 'A124' : 'unknown-none' }],\n", + " ['other_credit_outstanding',\n", + " {'A141' : 'bank', \n", + " 'A142' : 'stores', \n", + " 'A143' : 'none'}],\n", + " ['home_ownership',\n", + " {'A151' : 'rent', \n", + " 'A152' : 'own', \n", + " 'A153' : 'for free'}],\n", + " ['job_category',\n", + " {'A171' : 'unemployed-unskilled-non-resident', \n", + " 'A172' : 'unskilled-resident', \n", + " 'A173' : 'skilled',\n", + " 'A174' : 'highly skilled'}],\n", + " ['telephone', \n", + " {'A191' : 'none', \n", + " 'A192' : 'yes'}],\n", + " ['foreign_worker',\n", + 
" {'A201' : 'yes', \n", + " 'A202' : 'no'}],\n", + " ['bad_credit',\n", + " {2 : 1,\n", + " 1 : 0}]]\n", + "\n", + "for col_dic in code_list:\n", + " col = col_dic[0]\n", + " dic = col_dic[1]\n", + " credit[col] = [dic[x] for x in credit[col]]\n", + " \n", + "credit.head() " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The categorical features now have meaningful coding. Additionally, the label is now coded as a binary variable. \n", + "\n", + "There is one other aspect of this data set which you should be aware of. The label has significant **class imbalance**. Class imbalance means that there are unequal numbers of cases for the categories of the label. \n", + "\n", + "To examine the class imbalance in these data, execute the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " credit_history\n", + "bad_credit \n", + "0 699\n", + "1 300\n" + ] + } + ], + "source": [ + "credit_counts = credit[['credit_history', 'bad_credit']].groupby('bad_credit').count()\n", + "print(credit_counts)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that only 30% of the cases have bad credit. This is not surprising, since a bank would typically retain customers with good credit. However, this imbalance will bias the training of any model. \n", + "\n", + "There is one last step in preparing the data. As discovered in the data exploration, some of the numeric features are closer to log-Normally distributed than Normally distributed. The code in the cell below applies a log transformation to these variables. Execute this code.
" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "credit[['log_loan_duration_mo', 'log_loan_amount', 'log_age_yrs']] = credit[['loan_duration_mo', 'loan_amount', 'age_yrs']].applymap(math.log)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prepare data for Scikit-Learn model\n", + "\n", + "With the data prepared, it is time to create the numpy arrays required for the Scikit-Learn model. \n", + "\n", + "The code in the cell below creates a numpy array of the label values. Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "labels = np.array(credit['bad_credit'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you need to create the numpy feature array or **model matrix**. As a first step the array for the numeric features is created as follows:\n", + "1. The numpy array is created.\n", + "2. The values of the numeric features are Z-score scaled using the `scale` function from the Scikit-Learn `preprocessing` package. \n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(credit[['log_loan_duration_mo', 'log_loan_amount', \n", + " 'payment_pcnt_income', 'log_age_yrs']])\n", + "Features = preprocessing.scale(Features)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, the categorical variables need to be recoded as binary dummy variables. As discussed in another lesson this is a three-step process:\n", + "\n", + "1. Encode the categorical string variables as integers.\n", + "2. Transform the integer-coded variables to dummy variables. \n", + "3. Append each dummy-variable-coded categorical variable to the model matrix. \n", + "\n", + "Execute the code in the cell below to perform this processing and examine the results. 
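The three steps above can also be collapsed into a single call with `pandas.get_dummies`; this is a minimal sketch on an invented `purpose` column, not the lab's `LabelEncoder`/`OneHotEncoder` pipeline:

```python
import numpy as np
import pandas as pd

purpose = pd.Series(['car (new)', 'education', 'car (new)', 'business'])

# One 0/1 dummy column per category, in sorted category order.
dummies = pd.get_dummies(purpose)
print(dummies.columns.tolist())  # ['business', 'car (new)', 'education']

# The dummy block is appended to the numeric model matrix column-wise.
numeric = np.zeros((4, 2))  # stand-in for the scaled numeric features
model_matrix = np.concatenate([numeric, dummies.to_numpy(dtype=float)], axis=1)
print(model_matrix.shape)  # (4, 5)
```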
" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(999, 35)\n", + "[[ 1.70862514 1.16203714 -0.86919627 -1.44689656 1. 0.\n", + " 0. 0. 0. 0. 1. 0.\n", + " 0. 0. 0. 0. 0. 0.\n", + " 0. 0. 1. 0. 0. 1.\n", + " 0. 0. 0. 0. 1. 0.\n", + " 0. 0. 0. 1. 0. ]\n", + " [-0.67649717 -0.18248204 -0.86919627 1.23153776 0. 0.\n", + " 0. 1. 0. 1. 0. 0.\n", + " 0. 0. 0. 0. 0. 1.\n", + " 0. 0. 0. 0. 0. 0.\n", + " 0. 0. 1. 0. 0. 1.\n", + " 0. 0. 0. 1. 0. ]]\n" + ] + } + ], + "source": [ + "def encode_string(cat_features):\n", + " ## First encode the strings to numeric categories\n", + " enc = preprocessing.LabelEncoder()\n", + " enc.fit(cat_features)\n", + " enc_cat_features = enc.transform(cat_features)\n", + " ## Now, apply one hot encoding\n", + " ohe = preprocessing.OneHotEncoder()\n", + " encoded = ohe.fit(enc_cat_features.reshape(-1,1))\n", + " return encoded.transform(enc_cat_features.reshape(-1,1)).toarray()\n", + "\n", + "categorical_columns = ['checking_account_status', 'credit_history', \n", + " 'purpose', 'gender_status', 'time_in_residence', \n", + " 'property']\n", + "\n", + "for col in categorical_columns:\n", + " temp = encode_string(credit[col])\n", + " Features = np.concatenate([Features, temp], axis = 1)\n", + "\n", + "print(Features.shape)\n", + "print(Features[:2, :]) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the dummy variables, the model matrix has grown to 35 columns: 4 numeric features and 31 dummy variables. \n", + "\n", + "There is just one more step before you can build a model. You must split the cases into training and test data sets. This step is critical. If machine learning models are tested on the training data, the results will be both biased and overly optimistic.\n", + "\n", + "The code in the cell below performs the following processing:\n", + "1. 
An index vector is randomly split into training and test subsets using the `train_test_split` function from the `model_selection` package of Scikit-Learn. \n", + "2. The first element of the resulting list contains the indices of the training cases. \n", + "3. The second element of the resulting list contains the indices of the test cases. \n", + "\n", + "Execute the code. " + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(9988)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "x_train = Features[indx[0],:]\n", + "y_train = np.ravel(labels[indx[0]])\n", + "x_test = Features[indx[1],:]\n", + "y_test = np.ravel(labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Construct the logistic regression model\n", + "\n", + "Now, it is time to compute the logistic regression model. The code in the cell below does the following:\n", + "1. Define a logistic regression model object using the `LogisticRegression` class from the Scikit-Learn `linear_model` package.\n", + "2. Fit the linear model using the numpy arrays of the features and the labels for the training data set.\n", + "\n", + "Execute this code. 
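`train_test_split` can also split the feature and label arrays directly, which avoids the index bookkeeping used in the lab; a minimal sketch with invented arrays:

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(20, dtype=float).reshape(10, 2)  # 10 cases, 2 features
y = np.array([0, 1] * 5)

# Splitting X and y in one call keeps the rows of the two arrays aligned.
x_train, x_test, y_train, y_test = train_test_split(
    X, y, test_size=3, random_state=9988)
print(x_train.shape, x_test.shape)  # (7, 2) (3, 2)
```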
" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "LogisticRegression(C=1.0, class_weight=None, dual=False, fit_intercept=True,\n", + " intercept_scaling=1, max_iter=100, multi_class='ovr', n_jobs=1,\n", + " penalty='l2', random_state=None, solver='liblinear', tol=0.0001,\n", + " verbose=0, warm_start=False)" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "logistic_mod = linear_model.LogisticRegression() \n", + "logistic_mod.fit(x_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The model has been computed. Notice that the configuration of the model object has been printed. In this case, only default settings are shown, since no arguments were given to create the model object. \n", + "\n", + "Now, print and examine the model coeffients by executing the code in the cell below. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "print(logistic_mod.intercept_)\n", + "print(logistic_mod.coef_)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "First of all, notice that model coefficients look just as they would for an regression model. This is expected as previously explained. Additionally, nearly all the coefficients have the same magnitude indicating this model is likely to be overfit, given the number of features. \n", + "\n", + "Recall that the logistic regression model outputs probabiities for each class. The class with the highest probability is taken as the score (prediction). Execute the code and the cell below to compute and display a sample of these class probabilities for the test feature set. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[0.70165964 0.29834036]\n", + " [0.51877533 0.48122467]\n", + " [0.85695626 0.14304374]\n", + " [0.33488103 0.66511897]\n", + " [0.371275 0.628725 ]\n", + " [0.96379958 0.03620042]\n", + " [0.36304 0.63696 ]\n", + " [0.79883847 0.20116153]\n", + " [0.86723409 0.13276591]\n", + " [0.94423876 0.05576124]\n", + " [0.97084881 0.02915119]\n", + " [0.78799525 0.21200475]\n", + " [0.90902877 0.09097123]\n", + " [0.82560321 0.17439679]\n", + " [0.97022299 0.02977701]]\n" + ] + } + ], + "source": [ + "probabilities = logistic_mod.predict_proba(x_test)\n", + "print(probabilities[:15,:])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The first column is the probability of a score of $0$ and the second column is the probability of a score of $1$. Notice that for most, but not all, cases the probability of a score of $0$ is higher than the probability of a score of $1$. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Score and evaluate the classification model\n", + "\n", + "Now that the class probabilities have been computed, these values must be transformed into actual class scores. Recall that the class probabilities for two-class logistic regression are computed by applying the sigmoid or logistic transformation to the output of the linear model. A simple choice is to set the threshold between the two classes at $0.5$. The code in the cell below applies this initial threshold to the probability of a score of $1$ for the test data. A few examples along with the known labels are then displayed. Execute this code and examine the result." 
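The thresholding step can be sketched on a small made-up probability matrix (column 0 is the probability of class 0, column 1 of class 1); the values are invented, not the model's output:

```python
import numpy as np

probabilities = np.array([[0.70, 0.30],
                          [0.52, 0.48],
                          [0.33, 0.67],
                          [0.96, 0.04]])

def score_model(probs, threshold):
    # Score a case as class 1 when its class-1 probability exceeds the threshold.
    return np.array([1 if p > threshold else 0 for p in probs[:, 1]])

print(score_model(probabilities, 0.5))   # [0 0 1 0]
print(score_model(probabilities, 0.25))  # [1 1 1 0]  lower threshold, more 1s
```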
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[0 0 0 1 1 0 1 0 0 0 0 0 0 0 0]\n", + "[1 0 0 1 1 0 0 0 1 0 0 0 0 0 0]\n" + ] + } + ], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "scores = score_model(probabilities, 0.5)\n", + "print(np.array(scores[:15]))\n", + "print(y_test[:15])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Some of the positive ($1$) predictions agree with the test labels in the second row, but several do not." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "How can you quantify the performance of the model given all the results of the test data? In general, you must **always use multiple metrics to evaluate the performance of any machine learning model**, including classifiers. \n", + "\n", + "For classifiers there are a number of metrics commonly used. The **confusion matrix** lays out the correctly and incorrectly classified cases in a tabular format. There are various metrics derived from the values in the confusion matrix. Some of the common metrics are briefly reviewed below. \n", + "\n", + "**Confusion matrix**\n", + "\n", + "As already stated, the confusion matrix lays out correctly and incorrectly classified cases. For the binary (two-class) case the confusion matrix is organized as follows:\n", + "\n", + "| | Scored Positive | Scored Negative| \n", + "|------|:------:|:------:| \n", + "|**Actual Positive** | True Positive | False Negative |\n", + "|**Actual Negative**| False Positive | True Negative | \n", + "\n", + "Here the four elements in the matrix are defined as: \n", + "**True Positive** or **TP** are cases with positive labels which have been correctly classified as positive. 
\n", + "**True Negative** or **TN** are cases with negative labels which have been correctly classified as negative. \n", + "**False Positive** or **FP** are cases with negative labels which have been incorrectly classified as positive. \n", + "**False Negative** or **FN** are cases with positive labels which have been incorrectly classified as negative.\n", + "\n", + "When creating a confusion matrix it is important to understand and maintain a convention for which label value is considered positive and which value is considered negative. The usual convention is to call the $1$ case positive and the $0$ case negative. \n", + "\n", + "Notice that there is an ambiguity in which case is considered positive and which is considered negative when the confusion matrix is computed. Whenever you examine a confusion matrix it is a good idea to spend a moment and decide which case is which. This step will help you relate the results to the problem at hand. \n", + "\n", + "**Accuracy**\n", + "\n", + "Accuracy is a simple and often misused metric. In simple terms, accuracy is the fraction of cases correctly classified. For a two-class classifier accuracy is written as:\n", + "\n", + "$$Accuracy = \\frac{TP+TN}{TP+FP+TN+FN}$$\n", + "\n", + "Accuracy can be quite misleading. For example, say a classifier is used to detect fraudulent accounts and the rate of fraud is less than 1%. A naive model would simply classify all accounts as not fraudulent. This model has accuracy exceeding 0.99. This sounds impressive, but is clearly useless. \n", + "\n", + "**Precision**\n", + "\n", + "Precision is the fraction of the cases the classifier scores as a given label value that actually have that label value. We can express precision by the following relationship:\n", + "\n", + "$$Precision = \\frac{M_{i,i}}{\\sum_j M_{j,i}}$$\n", + "\n", + "In other words, the precision statistic is the number of correctly classified cases for the label value divided by all the cases in the column. 
Thus, precision is sensitive to the number of cases correctly classified for a given score value. \n", + "\n", + "**Recall** \n", + "\n", + "Recall is the fraction of cases of a label value correctly classified out of all cases that actually have that label value. We can express recall by the following relationship:\n", + "\n", + "$$Recall = \\frac{M_{i,i}}{\\sum_j M_{i,j}}$$\n", + "\n", + "In other words, the recall statistic is the number of correctly classified cases for the label value divided by all the cases in the row. Thus, recall is sensitive to the number of cases correctly classified for a given true label value. \n", + "\n", + "**F1**\n", + "\n", + "The F1 statistic is the harmonic mean of precision and recall. We can express F1 by the following relationship:\n", + "\n", + "$$F1 = 2 * \\frac{precision * recall}{precision + recall}$$\n", + "\n", + "In other words, F1 is a single summary metric of overall model performance. \n", + "\n", + "**ROC** and **AUC**\n", + "\n", + "The receiver operating characteristic or ROC is a curve that displays the relationship between the true positive rate on the vertical axis and false positive rate on the horizontal axis. The ROC curve shows the tradeoff between true positive rate and false positive rate. An example is illustrated below. \n", + "\n", + "In principle, you can pick the desired operating point for a classifier on this curve. Operating towards the left favors a low false positive rate at the expense of the true positive rate. Operating towards the right favors a high true positive rate at the expense of a higher false positive rate. \n", + "\n", + "\"Drawing\"\n", + "
**ROC curve with values of AUC for balanced two-class problem**
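The precision, recall, F1, and accuracy definitions above can be checked by hand on a small confusion matrix with invented counts (rows are actual classes, columns are scored classes):

```python
import numpy as np

# Invented confusion matrix: rows = actual class, columns = scored class.
conf = np.array([[177, 28],
                 [ 55, 40]])

# Per-class precision: diagonal over column sums; recall: diagonal over row sums.
precision = np.diag(conf) / conf.sum(axis=0)
recall = np.diag(conf) / conf.sum(axis=1)
f1 = 2.0 * precision * recall / (precision + recall)
accuracy = np.trace(conf) / conf.sum()

print(np.round(precision, 2))  # [0.76 0.59]
print(np.round(recall, 2))     # [0.86 0.42]
print(np.round(f1, 2))         # [0.81 0.49]
print(round(accuracy, 2))      # 0.72
```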
\n", + "\n", + "The AUC is the area or integral under the ROC curve, and it provides a single measure of the overall performance of the classifier. But, how can you interpret a specific AUC value? The higher the AUC, the lower the increase in false positive rate required to achieve a required true positive rate. For an ideal classifier the AUC is 1.0: a true positive rate of 1.0 is achieved with a false positive rate of 0. This behavior means that AUC is useful for comparing classifiers. The classifier with the higher AUC is generally the better one. \n", + "\n", + "For balanced cases, random guessing gives an AUC of 0.5. A balanced case has equal numbers of positive and negative cases. So Bernoulli sampling (random guessing) with a probability $p$ for the positive class will produce a ROC curve that runs diagonally from $0.0,0.0$ to $1.0,1.0$. The area under this triangular region is 0.5. It is often said that a classifier with an AUC of greater than 0.5 is better than random guessing. But, **for unbalanced cases this statement is not true in general**. \n", + "\n", + "****\n", + "**Note:** The term receiver operating characteristic may seem a bit odd in the machine learning context. This term arose in the early days of radar engineering as a metric to measure the tradeoff between a radar receiver correctly detecting a target, say an aircraft, and producing a positive response from noise, such as flying birds or clouds. A radar receiver would be adjusted to the desired operating point along its ROC curve. \n", + "****" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below implements a function that computes and displays the aforementioned classifier performance metrics. The metrics are computed using the `precision_recall_fscore_support` and `accuracy_score` functions from the `metrics` package of Scikit-Learn. The confusion matrix is computed using the `confusion_matrix` function from this same package. 
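The ROC/AUC computation can be sketched on invented labels and scores with the same `roc_curve` and `auc` functions (toy data, not the lab's test set):

```python
import numpy as np
from sklearn.metrics import auc, roc_curve

# Invented labels and class-1 probabilities.
y_true = np.array([0, 0, 1, 1, 0, 1])
y_score = np.array([0.10, 0.40, 0.35, 0.80, 0.20, 0.70])

fpr, tpr, thresholds = roc_curve(y_true, y_score)
roc_auc = auc(fpr, tpr)
# 8 of the 9 (negative, positive) pairs rank the positive case higher,
# so the AUC is 8/9.
print(round(roc_auc, 3))  # 0.889
```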
Execute this code and examine the results for the logistic regression model. " + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 177 28\n", + "True negative 55 40\n", + "\n", + "Accuracy 0.72\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.76 0.59\n", + "Recall 0.86 0.42\n", + "F1 0.81 0.49\n" + ] + } + ], + "source": [ + "def print_metrics(labels, scores):\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('True positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('True negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1])\n", + " print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1])\n", + " print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1])\n", + " print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1])\n", + "\n", + "\n", + " \n", + "print_metrics(y_test, scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results:\n", + "1. The confusion matrix shows the following characteristics: a) most of the positive cases are correctly classified, 177 vs. 28; however, b) most negative cases are scored incorrectly, only 40 correct vs. 55 incorrect. \n", + "2. The overall accuracy is 0.72. However, as just observed, this is **extremely misleading!** In fact, the negative cases are poorly classified, and it is these bad-credit customers the bank cares most about. This is not an unusual case. 
Accuracy figures should always be regarded with healthy skepticism.\n", + "3. The class imbalance is confirmed. Of the 300 test cases 205 are positive and 95 are negative. \n", + "4. The precision, recall and F1 all show that positive cases are classified reasonably well, but the negative cases are not. As already mentioned, it is these negative cases that are of greatest importance to the bank. \n", + "\n", + "Finally, the code in the cell below computes and displays the ROC curve and AUC. The `roc_curve` and `auc` functions from the Scikit-Learn `metrics` package are used to compute these values. Execute this code and examine the result." + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "data": { + "
Ii23aDce9uQ8KW8bwXv+xTg6+h+V2AVcAshKXSJvrB2SuMYJJ77r+i5GwCbAz2j/TcCngaeT9r3KMp9sfPbRPF9dHzXAx4HhkTrtoi++I6O1v0lOgaVJYpvgFNS/P83i/Z9XxT7boQv3V2i9XsAHaN9NQM+Bs4rF/fr0bFJJM8TomOwHnBhFEODaN3FhM/YToBF+9u8/DGIHrcH5gN7EhLMyYTPa/2kz+6HhESzQdKyxOf5PeDE6P5GQMdy73m9pH31o+wz2YiQFC8EGkSP94z7b7UQbrEHoFsN/+PCH9ZSwq87B94ENo3WGeELM/nXbCfKfjneC9xawWv+PvqySW559AVGRveT/yiN8Auvc/T4dOCt6P6ewNxyr30Z8GB0/2pgTIr31iR6TztXsK4bsDK635XwZd8waf1TwN/TOAZdgV8SX4SVxNEOWJT0eBRVJ4pBSesOBT6J7p8EvJe0zgiJtrJEsZKolVfJ+sSXZpOkZeOAPpVsfx4wtFzc+1XxGVsE7BbdnwF0r2S78onibuCf5baZAXRJ+uyeWsHnOZEoxgD/ALao5D1Xlij6ApMy+XdXrDf1D+a3Hu7+hpl1AZ4g/GpdDGxJ+FU80cwS2xrh1x2EX3LDK3i97YD1ga+TnleH8IW2Fnd3MxtC+OMcAxxH6C5JvM42ZrY46Sl1Cd1JCb95zSSLgDXAH4BPyq37A6Gb5ddt3f2npMdfEFo1VR0DgAXuvvzXlWYbArcSktFm0eJGZlbX3VeniDfZN0n3fyb8IiaK6df3HB2/0hSvs5DwXmu0PzPbkdDSKiEch/UIrbxka/0fmNmFQP8oVgc2JnymIHxmPksjHgj//yeb2Z+TltWLXrfCfZdzGnAN8ImZzQH+4e4vpbHf6sQo1aDB7ALg7qMJv2ZvihZ9R+gGau3um0a3TTwMfEP4I92hgpeaR2hRbJH0vI3dvXUlux4M9DKz7QitiGeTXmdO0mts6u6N3P3Q5LBTvJ+fCN0Px1Swujeh9ZSwmZk1THrcFPgqjWNQUQwXErpW9nT3jQndaxASTMqY0/A1oaUUXjBkryaVb84bhG6wmrqbkGRbRu/lcsreR8Kv78fM9iGMG/QGNnP3TQndk4nnVPaZqcg84Lpy//8buvvgivZdnrt/6u59CV2f/wKeif6Pqzr+1YlRqkGJonDcBhxoZu3cfQ2h7/pWM9sKwMwam9nB0bb3A6eY2f5mVidat7O7f0040+hmM9s4WrdD1GL5DXefRBj4HQSMcPdEC2Ic8KOZXWJmG5hZXTNrY2Z/rMb7uZTwq/RcM2tkZpuZ2bWE7qN/lNv2H2ZWL/qyOxx4Oo1jUJFGhOSy2Mx+B1xVbv23hPGWmngZ2NXMekRn+pwNbJ1i+6uAvczs32a2dRR/CzN7zMw2TWN/jQhjIkvNbGfgzDS2X0X4/1zPzK4ktCgSBgH/NLOWFrQ1s82jdeWPy33AGWa2Z7RtQzM7zMzSOlvLzE4wsy2j/8PEZ2p1FNsaKv8/eAnY2szOM7P60edmz3T2KakpURQId18APELon4fw63AWMNbMfiT8Qt0p2nYcYVD4VsKvxtGE7gIIfen1gOmELqBnSN0FMhg4gND1lYhlNXAEoY9/DuHX/SDCGVXpvp93gIMJg79fE7qUdgf2dvdPkzb9JorzK8Lg8RnunuiuqvQYVOI2wsDwd8BY4NVy6/9DaEEtMrPb030v0fv5jtBCupHQrdSKcGbPikq2/4yQFJsB08zsB0KLbQJhXKoqFxG6A5cQvrifrGL7EYQzymYSjvVy1u4euoUw/vMaIQHdTzhWEMacHjazxWbW290nEMas7iD838wijCWkqxvhPS8lHPM+7r7c3X8mnH32brSvjslPcvclhBM0jiB8Lj4F9q3GfqUSiTNWRPJOdCXvY+6eqgsnJ5lZHcLpuce7+8i44xFJRS0KkSwxs4PNbFMzq0/ZmMHYmMMSqVLGEoWZPWBm881saiXrzcxuN7NZUWmC9pmKR
SRHdCKclfMdoXukh7svizckkaplrOvJzDoTzvN/xN3bVLD+UODPhHPN9yRcLKaBJxGRHJOxFoW7jyFcpVqZ7oQk4u4+FtjUzNI5b1xERLIozgvuGrP2WRWl0bKvy29oZgOAAQANGzbcY+edd85KgCIiVfpxBqxeBnU3qHrbOMxfAUtXMXG1f+fuW9bkJeJMFOUv/oFKLqhx94HAQICSkhKfMGFCJuMSEUnfG13DvweMijOKtSWGFMzg7rth/nzs6qu/qOnLxXnWUynhkvuEJoRz4UVEctusgSFBvNEVFn0YdzRr+/JL6N4dnogubTrzTLiq/LWj1RNnohgGnBSd/dQR+CG6MlhEJLd9/kRZgtisHTQ7Lt54ILQi7rsPWrWCN96ApUtr7aUz1vVkZoMJFTq3iIqfXUUoOIe730MoSnco4arNnwlXCouIVGzWwPAFnQsWfRgSRK50N332GZx+OowcCfvuGxLGDrVX9ipjiSIq6pVqfWLiFBGRqiV+xW/WLu5IcqcVkTBlCkycCAMHQv/+YWyiFqnMuIjEL53WQq79io/b1KnwwQdw0knQowfMng2bb17182pAJTxEJH7Jff6VybVf8XH55Re4+mpo3x6uuAKWR1OqZChJgFoUIpIr1Fqo2vvvw2mnwbRpcMIJcOut0KBBxnerRCEikg++/BL22Qd+/3t46SU47LCs7VpdTyIiuWzmzPBv48bw5JOhNZHFJAFqUYhIJqV7SmuunM2USxYvhr/+FQYNglGjoHNnOOqoWEJRi0JEMiedQWrQQHV5w4ZB69Zw//1w8cXwx+rMIlz71KIQkdTW5UI3ndJaff37hwSx667wwgtQUhJ3REoUIlKFdbnQTS2F9CQX8Sspge22g0sugXr14o0rokQhUoyq00pQqyCz5s2DM86APn3gxBPD/RyjMQqRYpTu2AGoVZApa9aEEuCtW4fB6hUr4o6oUmpRiBQrtRLi8+mnYSxizBg44IBQo6l587ijqpQShUghqqprSaejxmv6dPjoI3jgAejXr9aL+NU2JQqRQlTVALS6k7Jv8mT48EM4+eQwsdDs2bDZZnFHlRYlCpFCkmhJaAA6d6xYAddeCzfcAH/4Axx7bKjPlCdJAjSYLVJYkpOEWgzxe+892H33kCiOOw4mTcpKEb/aphaFSL5LHo9QSyJ3fPkldOkCW28Nw4fDIYfEHVGNqUUhku9ycf7mYvbxx+Hfxo3hqadCEb88ThKgFoVIYVArIn6LFsGFF8KDD4bTXvfZJ8w8VwDUohDJV7MGwhtd079wTjJn6FBo1QoeeQQuuyz2In61TS0KkXylgevccOqpoRXRrh28/HKYorTAKFGI5Kp0L5pTl1P2JRfx69gRWraEiy6C9dePN64MUdeTSK6qqh6TWhLx+OKLMDj96KPh8YABobupQJMEqEUhkj3VnddBLYbckijid+mloUVxzDFxR5Q1alGIZEt1KraCWgy5ZMaMcE3EOefAXnvB1Klw2mlxR5U1alGIZJNaCPlpxoxwPcRDD8FJJ+V8Eb/apkQhIlKRSZNCEb9TToEjjwxF/DbdNO6oYqGuJxGRZMuXw+WXh2shrr46PIaiTRKgRCEiUubdd8P1ENdfH7qYPvwwL4v41TZ1PYmIQCjit+++oUbTiBFw0EFxR5Qz1KIQyYZZA2H+6LijkIpMnx7+bdwYnn0WpkxRkihHiUIkGxLXT+h019zx/fdhGtLWrUMRP4AjjoCNNoo1rFykrieRTCk/T8RWXaDFgHhjkuDZZ+Hss2HhQrjiCujQIe6IcppaFCKZonkiclO/ftCrV+hqGj8+zD6nAeuU1KIQySRdYJcbkov47bUX7LJLmDtiPX0FpiOjR8nMugH/AeoCg9z9hnLrmwIPA5tG21zq7sMzGZNItVW3RlNColaTxGvOnFC474QT4OSTw32plox1PZlZXeBO4BCgFdDXzFqV2+xvwFPuvjvQB7grU/GI1Fh1azQlqLspXqtXw+23Q
5s2MHZsWatCqi2TLYoOwCx3nw1gZkOA7sD0pG0c2Di6vwnwVQbjkWJV0xZBgqq45p+PPw5F+957L5QEv+ceaNo07qjyViYHsxsD85Iel0bLkl0NnGBmpcBw4M8VvZCZDTCzCWY2YcGCBZmIVQpZTVsECWoZ5J9Zs0Ihv0cfDbPOKUmsk0y2KCoqr1i+7dcXeMjdbzazTsCjZtbG3des9ST3gcBAgJKSErUfpWrlT01Vi6DwTZwIkyeHqUmPOCKMTWy8cdXPkyplskVRCmyb9LgJv+1aOg14CsDd3wMaAFtkMCYpFjo1tXgsWxYmE9pzT/jnP8uK+ClJ1JpMtijGAy3NrDnwJWGwuvxf61xgf+AhM9uFkCjUtyTBuowtqBVRHMaMgf794dNPw5jETTfpmogMyFiLwt1XAecAI4CPCWc3TTOza8zsyGizC4HTzWwyMBjo565TEySyLmMLakUUvi+/hP33h1Wr4I03YNCgoi4FnkkZvY4iuiZieLllVybdnw78KZMxSJ5Tq0DKmzIFdt01XFk9dGio+NqwYdxRFTSV8JDcpGqrUt5338GJJ0LbtmVF/A4/XEkiC3T9uuQmVVuVBHd4+mk45xxYtAiuuioMXEvWKFFI/CoatFa1VUk4+eRwPURJCbz5Zuh2kqxSopD4JQatk+siaTC6uCUX8evSJXQ3nXeeivjFREddcoMGrSVh9mw4/fRQxO+UU8JprxIrDWaLSG5YvRpuuy10LY0fD3X09ZQr1KIQkfhNnx5Kb7z/Phx2WCji16RJ3FFJRClb4jNrILzRdd0K9klhmDMHPvsMnngCXnxRSSLHqEUh8UkexNbAdfEZPx4+/DCMRxx2WBibaNQo7qikAmpRSLwSg9g6DbZ4/PwzXHQRdOwI119fVsRPSSJnKVGISPaMGhVOdb355tCSmDRJRfzygLqepHZVp+Kr5pQuLqWlcOCBsN128NZboUaT5AW1KKR2Vafiq8YmisPkyeHfJk3ghRfgo4+UJPKMWhSybsq3IDQPhCQsWAB/+QsMHhy6nLp0gUMPjTsqqQG1KGTdlG9BqJUg7iE5tGoFzzwD//gHdOoUd1SyDtJqUZhZPaCpu8/KcDySj9SCkGQnngiPPx4qvN5/P7RuHXdEso6qTBRmdhhwC1APaG5m7YCr3P2oTAcnMUtnYFoD0gKwZk0o4GcWxh/22APOPRfq1o07MqkF6XQ9XQPsCSwGcPcPgRaZDEpyRDoD0+pqklmzwpSkDz4YHp92Gpx/vpJEAUmn62mluy82s+Rlmte6kCVaEhqYllRWrQpF/P7+d6hfX1VeC1g6ieJjM+sN1DGz5sBfgLGZDUtipdIaUpWpU0MJ8AkToHt3uOsu2GabuKOSDEknUZwDXAmsAZ4DRgCXZTIoyQFqSUgqc+fCF1/AkCHQu3cYm5CClU6iONjdLwEuSSwws6MJSUNEisX774eL5wYMCNdDzJ4NG20Ud1SSBekMZv+tgmVX1HYgIpKjfvoJLrggXAtx442wYkVYriRRNCptUZjZwUA3oLGZ3ZK0amNCN5SIFLq33grF+2bPhjPPhBtuCAPXUlRSdT3NB6YCy4FpScuXAJdmMigRyQGlpXDwwdC8OYweDZ07xx2RxKTSROHuk4BJZva4uy/PYkwiEqdJk2D33UMRvxdfDDWaNtgg7qgkRumMUTQ2syFm9pGZzUzcMh6ZiGTXt9/CscdC+/ahBQHQrZuShKSVKB4CHgQMOAR4ChiSwZhEJJvc4bHHQhG/55+Ha6+FvfaKOyrJIekkig3dfQSAu3/m7n8DVExepFAcd1wo5LfTTmEO6yuugPXXjzsqySHpXEexwkL9js/M7AzgS2CrzIYlGZeq4J8K/RW+5CJ+Bx0UTn09+2zVZ5IKpdOiOB/YCDgX+BNwOnBqJoOSLEhV8E+lOwrbzJmhwusDD4THp5yiSq+SUpUtCnd/P7q7BDgRwMyaZDIoyRKV6Sguq1bBLbfAVVdBgwYapJa0p
UwUZvZHoDHwjrt/Z2atCaU89gOULPJFRd1M6l4qLh99BKeeChMnwlFHwZ13wh/+EHdUkicq7Xoys+uBx4HjgVfN7ApgJDAZ2DE74UmtqKibSd1LxaW0FObNg6efhmefVZKQaknVougO7Obuy8zsd8BX0eMZ6b64mXUD/gPUBQa5+w0VbNMbuJowx8Vkd9e3V22aNRDmj4atuqibqdj873+hJXHGGWVF/Bo2jDsqyUOpBrOXu/syAHf/HvikmkmiLnAn4dqLVkBfM2tVbpuWhJLlf3L31sB51YxfqpLoclLroXgsXQp/+QvsvTfcfHNZET8lCamhVC2K7c0sUUrcgGZJj3H3o6t47Q7ALHefDWBmQwitlOlJ25wO3Onui6LXnF/N+CUdW3WBFgPijkKy4bXXQhnwuXPD6a7/7/+piJ+ss1SJome5x3dU87UbA/OSHpcS5t5OtiOAmb1L6J662t1fLf9CZjYAGADQtGnTaoYhUiTmzYPDDoMddoAxY0KLQqQWpCoK+OY6vnZFU16Vn2t7PaAl0JVwFtXbZtbG3ReXi2UgMBCgpKRE83WLJJs4EfbYA7bdFoYPh332Cae/itSSdC64q6lSYNukx00IA+Llt3nB3Ve6+xxgBiFxiEhVvvkGjjkGSkrKivgdeKCShNS6TCaK8UBLM2tuZvWAPsCwcts8T1Q3ysy2IHRFzc5gTCL5zx0efjgU8XvxxTAOoSJ+kkHp1HoCwMzqu/uKdLd391Vmdg4wgjD+8IC7TzOza4AJ7j4sWneQmU0HVgMXu/vC6r2FIpOqRlNFdGFd4enTB556Cv70Jxg0CHbeOe6IpMCZe+oufzPrANwPbOLuTc1sN6C/u/85GwGWV1JS4hMmTIhj17nhja7V//JvdpzOesp3yUX8Hn4YliyBs86COpnsFJBCYmYT3b2kJs9Np0VxO3A4oZsId59sZioznk3JrYhEktDFc8Xjk0+gf3/o1y/8e/LJcUckRSadnyN13P2LcstWZyIYqURyCQ6V3igeK1eG8YfddoPp02GjjeKOSIpUOi2KeVH3k0dXW/8Z0FSomaZWRHH78MNQ/vvDD6FXL/jvf2HrreOOSopUOi2KM4ELgKbAt0DHaJlkkloRxe2bb8Lt2WdDIT8lCYlROi2KVe7eJ+ORyG+pFVFc3nknFPE76yzo1g0++ww23DDuqETSalGMN7PhZnaymTXKeEQixWbJEjjnnHBF9W23lRXxU5KQHFFlonD3HYBrgT2AKWb2vJmphSFSG0aMgDZt4K67QsXXDz5QET/JOWmdhO3u/3P3c4H2wI+ECY0kE2YNLLtWQgrbvHlw+OGh5fDOO6E1oTObJAdVmSjMbCMzO97MXgTGAQsA1QvIlMQgtgawC5M7jBsX7m+7LbzyCkyapBIcktPSGcyeCrwI3Ojub2c4nuKm2egK29dfhzkihg6FUaOgSxc44IC4oxKpUjqJYnt3X5PxSESz0RUqd3joIbjgAli+HP71r1CnSSRPVJoozOxmd78QeNbMflMQKo0Z7qQmNBtd4endG555JpzVNGgQ7Lhj3BGJVEuqFsWT0b/VndlOqitxFbYqvRaO1atDAb86deCII2C//eD//k9F/CQvVfqpdfdoxI1d3P3N5BuwS3bCKxIawC4sH38cWg/33x8en3QSnHmmkoTkrXQ+uadWsOy02g6k6CWuwla3U/5auRKuvRbatYMZM2CTTeKOSKRWpBqjOJYwK11zM3suaVUjYHHFzxIpUpMmhTLgH30Exx4Lt98OW20Vd1QitSLVGMU4YCFhrus7k5YvASZlMiiRvPPtt/Ddd/D889C9e9zRiNSqShOFu88B5gBvZC8ckTwyZgxMmRKujejWDWbNgg02iDsqkVpX6RiFmY2O/l1kZt8n3RaZ2ffZC1Ekx/z4Y6jw2qVL6GJKFPFTkpAClWowOzHd6RbAlkm3xGNZV6rrlH+GD4fWreHee8MFdCriJ0Ug1emxiauxtwXquvtqo
BPwf0DDLMRW+HRabH6ZNy+MP2yyCfzvf3DzzdBQfwpS+NIp4fE88Ecz2wF4BHgZeAI4PJOBFSxNcZpf3OH996Fjx1DE77XXQvmNevXijkwka9K5jmKNu68EjgZuc/c/A40zG1YB0xSn+eOrr6BHD+jUCUaPDsv23VdJQopOWlOhmtkxwIlAj2jZ+pkLqQioFZHb3MNV1RddFAaqb7pJRfykqKWTKE4FziKUGZ9tZs2BwZkNSyRGvXrBc8+Fs5oGDYIWLeKOSCRWVSYKd59qZucCLcxsZ2CWu1+X+dAKjAr/5bbkIn49esBBB8Hpp6s+kwjpzXC3DzALuB94AJhpZmqHV5fOcMpdU6eGrqVEEb8TT1SlV5Ek6XQ93Qoc6u7TAcxsF+BRoCSTgRUkjU3kll9+geuvh+uuC6e8brZZ3BGJ5KR0EkW9RJIAcPePzUynfUh+mzgxFPGbOhWOOw5uuw221HWkIhVJJ1F8YGb3EloRAMejooCS7xYuhMWL4cUX4XBdEiSSSjqJ4gzgXOCvgAFjgP9mMiiRjBg5MhTxO/fcMFj96afQoEHcUYnkvJSJwsx2BXYAhrr7jdkJSaSW/fAD/PWvMHAg7LxzGKiuX19JQiRNqarHXk4o33E88LqZVTTTnaSSKPqnwn/xefFFaNUqXA9x0UVhbEJF/ESqJVWL4nigrbv/ZGZbAsMJp8dKupJPidVpsdk3bx707BlaEc8/D3/8Y9wRieSlVIlihbv/BODuC8xMJ5XXhE6JzS53eO892GuvsiJ+e+2l+kwi6yDVl//2ZvZcdBsK7JD0+LkUz/uVmXUzsxlmNsvMLk2xXS8zczMrnGszZg2E+aPjjqK4lJbCkUeGi+cSRfy6dlWSEFlHqVoUPcs9vqM6L2xmdQlzbR8IlALjzWxY8jUZ0XaNCGdVvV+d1895iVLi6m7KvDVr4L774OKLYdUquOUW2HvvuKMSKRip5sx+cx1fuwOhLtRsADMbAnQHppfb7p/AjcBF67i/3JBc02mrLtBiQNwRFb6ePcMYxH77hYSx/fZxRyRSUDI57tAYmJf0uJRy81iY2e7Atu7+UqoXMrMBZjbBzCYsWLCg9iOtTarplB2rVoWWBIREcd998MYbShIiGZDJRGEVLPNfV4bB8VuBC6t6IXcf6O4l7l6yZT6UWUgMYKs1kRkffRQmE7rvvvD4hBOgf/9Q/VVEal3aicLMqnvyeSlhvu2EJsBXSY8bAW2AUWb2OdARGFZQA9pSu1asgKuugj32gC++UG0mkSxJp8x4BzObAnwaPd7NzNIp4TEeaGlmzaMign2AYYmV7v6Du2/h7s3cvRkwFjjS3SfU5I3EShfWZd748dC+PVxzDfTtCx9/DEcfHXdUIkUhnRbF7cDhwEIAd58M7FvVk9x9FXAOMAL4GHjK3aeZ2TVmdmTNQ85Bmgc78xYtgqVLYfhweOQR2HzzuCMSKRrpFAWs4+5f2Nr9v6vTeXF3H064ojt52ZWVbNs1ndfMWbqwrva99VYo4veXv4QifjNnqvyGSAzSaVHMM7MOgJtZXTM7D5iZ4bikmC1eHKYh3X9/uPfeMDYBShIiMUknUZwJXAA0Bb4lDDqfmcmgpIi98EIo4vfAA6Hiq4r4icSuyq4nd59PGIgWyay5c+GYY2CXXWDYMCjRCXAiuaDKRGFm95F0/UOCu+siAVl37vDOO7DPPtC0abhormNH1WcSySHpdD29AbwZ3d4FtgJWZDIoKRJz58Jhh0HnzmVF/Dp3VpIQyTHpdD09mfzYzB4FXs9YRFL41qyBe+6BSy4JLYrbb1cRP5Ecls7pseU1B7ar7UCkiBx9dBi0PvDAMD1ps2ZxRyQiKaQzRrGIsjGKOsD3QKVzSxSV5Eqxm7WLO5rctmonq2lrAAAU6klEQVQV1KkTbsceC927Q79+qs8kkgdSJgoLV9ntBnwZLVrj7r8Z2C5aqhSbnsmT4dRTw7URZ5wRSnCISN5IOZgdJYWh7r46u
ilJlKdKsZVbvhz+9rdwmmtpKWy9ddwRiUgNpHPW0zgza5/xSKSwjBsHu+8O110Hxx8fivj16BF3VCJSA5V2PZnZelFhv72B083sM+AnwjwT7u5KHlK5H3+EZcvg1Vfh4IPjjkZE1kGqMYpxQHtAPwMlPa+9BtOmwfnnwwEHwIwZKr8hUgBSJQoDcPfPshSL5KtFi+CCC+Chh6B1azjrrJAglCRECkKqRLGlmV1Q2Up3vyUD8Ui+ee45OPtsWLAALrsMrrxSCUKkwKRKFHWBjah47muRUIKjTx9o0yZMKLT77nFHJCIZkCpRfO3u12QtklyUuKCuMsV4oZ07jBkDXbqEIn5vvQV77gnrrx93ZCKSIalOj1VLInmK04oU24V2X3wBhxwCXbuWFfHbe28lCZECl6pFsX/Wosg15UtzFPsUp2vWwF13waVR5Zb//jeUBReRolBponD377MZSE5RaY619egBL74Yroe4917YTjUhRYpJTarHFrZZA2H+aNiqS3G3JFauhLp1QxG/vn2hVy848UQV8RMpQumU8CguicHrYm5JfPABdOgQ5oyAkChOOklJQqRIKVFUZKsuxVnkb9mycC1Ehw7wzTew7bZxRyQiOUBdTxKMHQsnnwwzZ4aS4DfdBJttFndUIpIDlCgk+OmnMC7x+uuhTpOISESJopi9+moo4nfhhbD//vDJJ1CvXtxRiUiO0RhFMVq4MHQzHXIIPPww/PJLWK4kISIVUKJImDUQ3uia+krsfOcOzzwDrVrBE0+E2efGj1eCEJGU1PWUUAwX2c2dC8cdB23bhrkjdtst7ohEJA8oURR6uQ53GDkS9tsvXFE9alQ4/XU9/deLSHrU9VTILYk5c+Cgg8JAdaKI3157KUmISLXoGwMKryWxejXccQdcfnkow3H33SriJyI1VlyJoqL5JQpxTonu3eHll+HQQ0MZDl1hLSLroLi6niqaX6JQupxWrgzlwCEU73vsMXjpJSUJEVlnGW1RmFk34D+EaVUHufsN5dZfAPQHVgELgFPd/YuMBFPIVWEnTIDTToMBA8L81cceG3dEIlJAMtaiMLO6wJ3AIUAroK+ZtSq32SSgxN3bAs8AN2YqnoKsCrtsGVxySZiKdMECzRMhIhmRya6nDsAsd5/t7r8AQ4DuyRu4+0h3/zl6OBZoksF4Cqsq7HvvhesgbrwxFPGbPh0OPzzuqESkAGWy66kxMC/pcSmwZ4rtTwNeqWiFmQ0ABgA0bdq0tuLLb8uWhTGJN94Ip7+KiGRIJhNFRbPceIUbmp0AlABdKlrv7gOBgQAlJSUVvkZRGD48FPG7+OJwAd3HH8P668cdlYgUuEx2PZUCyafcNAG+Kr+RmR0AXAEc6e4rajWCRP2mfK/h9N13cMIJcNhh8PjjZUX8lCREJAsymSjGAy3NrLmZ1QP6AMOSNzCz3YF7CUlifq1HkHw6bD6eBusOQ4bALrvAU0/BVVfBuHEq4iciWZWxrid3X2Vm5wAjCKfHPuDu08zsGmCCuw8D/g1sBDxtYT7mue5+5DrvvFDqN82dG8qB77Yb3H8/7Lpr3BGJSBHK6HUU7j4cGF5u2ZVJ9zMzlVo+129yhzffDLPMbbddqNH0xz+GUhwiIjEo3CuzEy2JfDod9rPPwhlMBx5YVsSvY0clCRGJVeEminyyejXcckvoWpo4Ee69V0X8RCRnFFdRwFx1xBHwyivhgrm774Ymmb3uUESkOpQo4vLLL2FeiDp1oF+/UMivTx+wii4/ERGJj7qe4jBuHOyxB9x1V3jcuzf07askISI5qXASRT5cXPfzz3DhhdCpEyxaBDvsEHdEIiJVKpxEkesX173zThisvuUWOP30UIrjkEPijkpEpEqFNUaRyxfXrVwZTnMdORK6do07GhGRtBVWosg1L74YCvf99a+w776hFPh6OuQikl8Kp+splyxYAMcdB0ceCYMHlxXxU5IQkTykRFGb3OGJJ0IRv2eegWuugfffVxE/E
clr+olbm+bOhVNOgd13D0X8WreOOyIRkXWWn4kiUR02WaIIYLatWQOvvw4HHxyK+L39drhGQvWZRKRA5GfXU/KpsAlxnBL76adhprlu3WDMmLCsQwclCREpKPnZooB4T4VdtQpuvRWuvBLq1w/dTCriJyIFKn8TRZwOPxxGjIDu3UMZjm22iTsikZy0cuVKSktLWb58edyhFI0GDRrQpEkT1q/FqZKVKNK1YkWYo7pOHejfH049FY45RvWZRFIoLS2lUaNGNGvWDNPfSsa5OwsXLqS0tJTmzZvX2uvm5xhFto0dC+3bw513hse9eoVCfvrgi6S0fPlyNt98cyWJLDEzNt9881pvweVfovhxRvaK/v30E5x/Puy1FyxZAi1bZme/IgVESSK7MnG886/rafUy2GzvzJ/h9PbbcPLJMGcOnHUWXH89bLxxZvcpIpKD8q9FUXeD7MyFvWpVGJMYPTp0OSlJiOStoUOHYmZ88sknvy4bNWoUhx9++Frb9evXj2eeeQYIA/GXXnopLVu2pE2bNnTo0IFXXnllnWO5/vrradGiBTvttBMjRoyocJt99tmHdu3a0a5dO7bZZht69OixVtzt2rWjdevWdOnSZZ3jSUf+tSgy6fnnQxG/yy4LRfymTVN9JpECMHjwYPbee2+GDBnC1VdfndZz/v73v/P1118zdepU6tevz7fffsvo0aPXKY7p06czZMgQpk2bxldffcUBBxzAzJkzqVvu2qu333771/s9e/ake/fuACxevJizzjqLV199laZNmzJ//vx1iidd+hYE+PZb+POf4emnw6D1hReG+kxKEiK1Z+J5tT++uFk72OO2lJssXbqUd999l5EjR3LkkUemlSh+/vln7rvvPubMmUP9+vUB+P3vf0/v3r3XKdwXXniBPn36UL9+fZo3b06LFi0YN24cnTp1qnD7JUuW8NZbb/Hggw8C8MQTT3D00UfTtGlTALbaaqt1iidd+df1VJvc4dFHoVUreOEFuO66cIaTiviJFIznn3+ebt26seOOO/K73/2ODz74oMrnzJo1i6ZNm7JxGl3O559//q/dRMm3G2644Tfbfvnll2y77ba/Pm7SpAlffvllpa89dOhQ9t9//1/jmDlzJosWLaJr167ssccePPLII1XGVxuK+yfz3LnhmoiSknB19c47xx2RSOGq4pd/pgwePJjzzjsPgD59+jB48GDat29f6dlB1T1r6NZbb017W3ev1v4GDx5M//79f328atUqJk6cyJtvvsmyZcvo1KkTHTt2ZMcdd6xWzNVVfIlizZpwVfUhh4Qifu++G6q9qj6TSMFZuHAhb731FlOnTsXMWL16NWbGjTfeyOabb86iRYvW2v77779niy22oEWLFsydO5clS5bQqFGjlPs4//zzGTly5G+W9+nTh0svvXStZU2aNGHevHm/Pi4tLWWbSio7LFy4kHHjxjF06NC1nr/FFlvQsGFDGjZsSOfOnZk8eXLGEwXunle3PVpu5DU2Y4b7Pvu4g/uoUTV/HRFJy/Tp02Pd/z333OMDBgxYa1nnzp19zJgxvnz5cm/WrNmvMX7++efetGlTX7x4sbu7X3zxxd6vXz9fsWKFu7t/9dVX/uijj65TPFOnTvW2bdv68uXLffbs2d68eXNftWpVhdvefffdftJJJ621bPr06b7ffvv5ypUr/aeffvLWrVv7lClTfvPcio47MMFr+L1bHGMUq1bBv/4FbdvClCnw4IPQuXPcUYlIhg0ePJijjjpqrWU9e/bkiSeeoH79+jz22GOccsoptGvXjl69ejFo0CA22WQTAK699lq23HJLWrVqRZs2bejRowdbbrnlOsXTunVrevfuTatWrejWrRt33nnnr2c8HXrooXz11Ve/bjtkyBD69u271vN32WUXunXrRtu2benQoQP9+/enTZs26xRTOswr6DPLZSU7NvIJM5dU70kHHwyvvQZHHx2uidh668wEJyJr+fjjj9lll13iD
qPoVHTczWyiu5fU5PUKd4xi+fJwwVzdujBgQLj17Bl3VCIieacwu57efRfatSsr4tezp5KEiEgNFVaiWLoUzj03TCK0fDmoySsSu3zr3s53mTjehZMoRo+GNm3gjjvgnHNg6lQ48MC4oxIpag0aNGDhwoVKFlni0XwUDRo0qNXXLawxig03DFVf//SnuCMREcJ5/6WlpSxYsCDuUIpGYoa72pTfZz099xx88glcfnl4vHq1LpwTEanAupz1lNGuJzPrZmYzzGyWmV1awfr6ZvZktP59M2uW1gt/802YZa5nTxg6FH75JSxXkhARqXUZSxRmVhe4EzgEaAX0NbNW5TY7DVjk7i2AW4F/VfnCSy0MUr/0UphM6H//UxE/EZEMymSLogMwy91nu/svwBCge7ltugMPR/efAfa3qipyfbM0DFpPngyXXhqulRARkYzJ5GB2Y2Be0uNSYM/KtnH3VWb2A7A58F3yRmY2AEhMabfC3nlnqiq9ArAF5Y5VEdOxKKNjUUbHosxONX1iJhNFRS2D8iPn6WyDuw8EBgKY2YSaDsgUGh2LMjoWZXQsyuhYlDGzCTV9bia7nkqBbZMeNwG+qmwbM1sP2AT4PoMxiYhINWUyUYwHWppZczOrB/QBhpXbZhhwcnS/F/CW59v5uiIiBS5jXU/RmMM5wAigLvCAu08zs2sIddGHAfcDj5rZLEJLok8aLz0wUzHnIR2LMjoWZXQsyuhYlKnxsci7C+5ERCS7CqfWk4iIZIQShYiIpJSziSJj5T/yUBrH4gIzm25mH5nZm2a2XRxxZkNVxyJpu15m5mZWsKdGpnMszKx39NmYZmZPZDvGbEnjb6SpmY00s0nR38mhccSZaWb2gJnNN7Oplaw3M7s9Ok4fmVn7tF64ppNtZ/JGGPz+DNgeqAdMBlqV2+Ys4J7ofh/gybjjjvFY7AtsGN0/s5iPRbRdI2AMMBYoiTvuGD8XLYFJwGbR463ijjvGYzEQODO63wr4PO64M3QsOgPtgamVrD8UeIVwDVtH4P10XjdXWxSZKf+Rn6o8Fu4+0t1/jh6OJVyzUojS+VwA/BO4EViezeCyLJ1jcTpwp7svAnD3+VmOMVvSORYObBzd34TfXtNVENx9DKmvResOPOLBWGBTM/tDVa+bq4miovIfjSvbxt1XAYnyH4UmnWOR7DTCL4ZCVOWxMLPdgW3d/aVsBhaDdD4XOwI7mtm7ZjbWzLplLbrsSudYXA2cYGalwHDgz9kJLedU9/sEyN2Ji2qt/EcBSPt9mtkJQAnQJaMRxSflsTCzOoQqxP2yFVCM0vlcrEfofupKaGW+bWZt3H1xhmPLtnSORV/gIXe/2cw6Ea7fauPuazIfXk6p0fdmrrYoVP6jTDrHAjM7ALgCONLdV2Qptmyr6lg0AtoAo8zsc0If7LACHdBO92/kBXdf6e5zgBmExFFo0jkWpwFPAbj7e0ADQsHAYpPW90l5uZooVP6jTJXHIupuuZeQJAq1HxqqOBbu/oO7b+Huzdy9GWG85kh3r3ExtByWzt/I84QTHTCzLQhdUbOzGmV2pHMs5gL7A5jZLoREUYzzsw4DTorOfuoI/ODuX1f1pJzsevLMlf/IO2kei38DGwFPR+P5c939yNiCzpA0j0VRSPNYjAAOMrPpwGrgYndfGF/UmZHmsbgQuM/Mzid0tfQrxB+WZjaY0NW4RTQecxWwPoC730MYnzkUmAX8DJyS1usW4LESEZFalKtdTyIikiOUKEREJCUlChERSUmJQkREUlKiEBGRlJQoJOeY2Woz+zDp1izFts0qq5RZzX2OiqqPTo5KXuxUg9c4w8xOiu73M7NtktYNMrNWtRzneDNrl8ZzzjOzDdd131K8lCgkFy1z93ZJt8+ztN/j3X03QrHJf1f3ye5+j7s/Ej3sB2yTtK6/u0+vlSjL4ryL9OI8D1CikBpTopC8ELUc3jazD6LbXhVs09rMxkWtkI/MrGW0/ISk5feaWd0qd
jcGaBE9d/9oDoMpUa3/+tHyG6xsDpCbomVXm9lFZtaLUHPr8WifG0QtgRIzO9PMbkyKuZ+Z/beGcb5HUkE3M7vbzCZYmHviH9GycwkJa6SZjYyWHWRm70XH8Wkz26iK/UiRU6KQXLRBUrfT0GjZfOBAd28PHAvcXsHzzgD+4+7tCF/UpVG5hmOBP0XLVwPHV7H/I4ApZtYAeAg41t13JVQyONPMfgccBbR297bAtclPdvdngAmEX/7t3H1Z0upngKOTHh8LPFnDOLsRynQkXOHuJUBboIuZtXX32wm1fPZ1932jUh5/Aw6IjuUE4IIq9iNFLidLeEjRWxZ9WSZbH7gj6pNfTahbVN57wBVm1gR4zt0/NbP9gT2A8VF5kw0ISacij5vZMuBzQhnqnYA57j4zWv8wcDZwB2Gui0Fm9jKQdklzd19gZrOjOjufRvt4N3rd6sTZkFCuInmGst5mNoDwd/0HwgQ9H5V7bsdo+bvRfuoRjptIpZQoJF+cD3wL7EZoCf9mUiJ3f8LM3gcOA0aYWX9CWeWH3f2yNPZxfHIBQTOrcH6TqLZQB0KRuT7AOcB+1XgvTwK9gU+Aoe7uFr61046TMIvbDcCdwNFm1hy4CPijuy8ys4cIhe/KM+B1d+9bjXilyKnrSfLFJsDX0fwBJxJ+Ta/FzLYHZkfdLcMIXTBvAr3MbKtom99Z+nOKfwI0M7MW0eMTgdFRn/4m7j6cMFBc0ZlHSwhlzyvyHNCDMEfCk9GyasXp7isJXUgdo26rjYGfgB/M7PfAIZXEMhb4U+I9mdmGZlZR60zkV0oUki/uAk42s7GEbqefKtjmWGCqmX0I7EyY8nE64Qv1NTP7CHid0C1TJXdfTqiu+bSZTQHWAPcQvnRfil5vNKG1U95DwD2Jwexyr7sImA5s5+7jomXVjjMa+7gZuMjdJxPmx54GPEDozkoYCLxiZiPdfQHhjKzB0X7GEo6VSKVUPVZERFJSi0JERFJSohARkZSUKEREJCUlChERSUmJQkREUlKiEBGRlJQoREQkpf8P6mATofqZfPMAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## and threshold along with the AUC\n", + " fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n", + " auc = sklm.auc(fpr, tpr)\n", + " \n", + " ## Plot the result\n", + " plt.title('Receiver Operating Characteristic')\n", + " plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n", + " plt.legend(loc = 'lower right')\n", + " plt.plot([0, 1], [0, 1],'r--')\n", + " plt.xlim([0, 1])\n", + " plt.ylim([0, 1])\n", + " plt.ylabel('True Positive Rate')\n", + " plt.xlabel('False Positive Rate')\n", + " plt.show()\n", + " \n", + "plot_auc(y_test, probabilities) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The ROC curve is above the diagonal red-dotted line and the 
AUC is 0.76. But, given the class imbalance of two positive cases for each negative case, how good is this? \n",
+    "\n",
+    "One point of comparison is a naive 'classifier' that sets all cases to positive. The code in the cell below contains such a classifier. This is not really a classifier at all, just code that sets the output to a fixed probability of 1.0 for positive cases and 0.0 for negative cases. The ROC curve and AUC are then computed and displayed. Run this code, and examine the result. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 16,
+   "metadata": {
+    "scrolled": true
+   },
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "                 Confusion matrix\n",
+      "                 Score positive    Score negative\n",
+      "True positive       205                 0\n",
+      "True negative        95                 0\n",
+      "\n",
+      "Accuracy        0.68\n",
+      " \n",
+      "           Positive      Negative\n",
+      "Num case     205.00         95.00\n",
+      "Precision      0.68          0.00\n",
+      "Recall         1.00          0.00\n",
+      "F1             0.81          0.00\n"
+     ]
+    },
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "C:\\Users\\StevePC2\\Anaconda3\\lib\\site-packages\\sklearn\\metrics\\classification.py:1135: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples.\n",
+      "  'precision', 'predicted', average, warn_for)\n"
+     ]
+    },
+    {
+     "data": {
+      "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xd8FNX6x/HPAwooIBawICAgSBEQMSJYsSOiKCKiqISqXgvYfvZruXptKFe9KiIq2LBdCyKIDcSCAhaqIE0lgtJ7TfL8/pgJrDFlE7KZTfJ9v177Ymfm7Myzh80+e86ZOWPujoiISG7KRR2AiIgkNyUKERHJkxKFiIjkSYlCRETypEQhIiJ5UqIQEZE8KVFI3Mysu5l9FHUcycTM1ptZ/QiOW9fM3Mx2Ke5jJ4KZzTSzdoV4nT6TxUCJooQys1/MbFP4RfWHmQ0zsyqJPKa7v+LupyXyGLHM7Ggz+8zM1pnZGjN738yaFtfxc4hnvJn1iV3n7lXcfUGCjneImb1pZsvD9z/NzK4zs/KJOF5hhQmrwc7sw90Pdffx+Rznb8mxuD+TZZUSRcl2lrtXAVoChwO3RBxPoeT0q9jM2gIfAe8BNYF6wFTgq0T8gk+2X+ZmdjDwLbAIaO7u1YDzgRSgahEfK7L3nmz1Lrlwdz1K4AP4BTglZvkh4IOY5YrAQOA34E9gMLBbzPZOwI/AWmA+0D5cXw14DlgC/A7cC5QPt6UCX4bPBwMDs8X0HnBd+Lwm8D9gGbAQuCam3F3AW8DL4fH75PD+vgCeymH9GODF8Hk7IA24FVge1kn3eOog5rU3AX8ALwF7AaPCmFeFz2uF5e8DMoDNwHrgv+F6BxqEz4cBTwIfAOsIvugPjonnNGAOsAZ4Cvg8p/celn059v8zh+11w2P3CN/fcuC2mO2tgYnA6vD/8r9AhZjtDlwJzAUWhuseI0hMa4HvgONiypcP63l++N6+A2oDE8J9bQjr5YKwfEeCz9dq4GugRbbP7k3ANGALsAsxn+cw9ilhHH8Cj4brfwuPtT58tCXmMxmWORT4GFgZvvbWqP9WS8Mj8gD0KOR/3F//sGoB04HHYrb/BxgJ7E3wC/R94P5wW+vwy+pUglblgUDjcNu7wDNAZWBfYBJwWbht+x8lcHz4pWLh8l7AJoIEUS78IvknUAGoDywATg/L3gVsA84Jy+6W7b3tTvClfGIO77snsCR83g5IBx4lSAonhF9YjeKog6zXPhi+djdgH+C88PhVgTeBd2OOPZ5sX+z8PVGsDOt3F+AV4LVwW/Xwi69zuK1/WAe5JYo/gJ55/P/XDY/9bBj7YQRfuk3C7UcAbcJj1QV+AgZki/vjsG6ykufFYR3sAlwfxlAp3HYjwWesEWDh8fbJXgfhcitgKXAUQYLpQfB5rRjz2f2RINHsFrMu6/M8EbgkfF4FaJPtPe8Sc6xUdnwmqxIkxeuBSuHyUVH/rZaGR+QB6FHI/7jgD2s9wa87Bz4F9gy3GcEXZuyv2bbs+OX4DDAoh33uF37ZxLY8LgTGhc9j/yiN4Bfe8eFyX+Cz8PlRwG/Z9n0L8EL4/C5gQh7vrVb4nhrnsK09sC183o7gy75yzPY3gDviqIN2wNasL8Jc4mgJrIpZHk/+iWJozLYOwOzw+aXAxJhtRpBoc0sU2whbeblsz/rSrBWzbhLQLZfyA4B3ssV9Uj6fsVXAYeHzOUCnXMplTxRPA//KVmYOcELMZ7dXDp/nrEQxAbgbqJ7Le84tUVwI/JDIv7uy+lD/YMl2jrt/YmYnAK8S/GpdDdQg+FX8nZlllTWCX3cQ/JIbncP+DgJ2BZbEvK4cwRfaX7i7m9lrBH+cE4CLCLpLsvZT08xWx7ykPEF3Upa/7TPGKiATOACYnW3bAQTdLNvLuvuGmOVfCVo1+dUBwDJ337x9o9nuwCCCZLRXuLqqmZV394w84o31R8zzjQS/iAlj2v6ew/pLy2M
/Kwjea6GOZ2aHELS0UgjqYReCVl6sv/wfmNn1QJ8wVgf2IPhMQfCZmR9HPBD8//cws6tj1lUI95vjsbPpDdwDzDazhcDd7j4qjuMWJEYpAA1mlwLu/jnBr9mB4arlBN1Ah7r7nuGjmgcD3xD8kR6cw64WEbQoqse8bg93PzSXQ48AupjZQQStiP/F7GdhzD72dPeq7t4hNuw83s8Ggu6H83PY3JWg9ZRlLzOrHLNcB1gcRx3kFMP1BF0rR7n7HgTdaxAkmDxjjsMSgpZSsMMge9XKvTifEHSDFdbTBEm2YfhebmXH+8iy/f2Y2XEE4wZdgb3cfU+C7sms1+T2mcnJIuC+bP//u7v7iJyOnZ27z3X3Cwm6Ph8E3gr/j/Or/4LEKAWgRFF6/Ac41cxaunsmQd/1IDPbF8DMDjSz08OyzwE9zexkMysXbmvs7ksIzjR6xMz2CLcdHLZY/sbdfyAY+B0KjHX3rBbEJGCtmd1kZruZWXkza2ZmRxbg/dxM8Kv0GjOramZ7mdm9BN1Hd2cre7eZVQi/7DoCb8ZRBzmpSpBcVpvZ3sCd2bb/STDeUhgfAM3N7JzwTJ8rgf3zKH8ncLSZPWxm+4fxNzCzl81szziOV5VgTGS9mTUGroijfDrB/+cuZvZPghZFlqHAv8ysoQVamNk+4bbs9fIscLmZHRWWrWxmZ5pZXGdrmdnFZlYj/D/M+kxlhLFlkvv/wShgfzMbYGYVw8/NUfEcU/KmRFFKuPsy4EWC/nkIfh3OA74xs7UEv1AbhWUnEQwKDyL41fg5QXcBBH3pFYBZBF1Ab5F3F8gI4BSCrq+sWDKAswj6+BcS/LofSnBGVbzv50vgdILB3yUEXUqHA8e6+9yYon+EcS4mGDy+3N2zuqtyrYNc/IdgYHg58A3wYbbtjxG0oFaZ2ePxvpfw/SwnaCE9RNCt1JTgzJ4tuZSfT5AU6wIzzWwNQYttCsG4VH5uIOgOXEfwxf16PuXHEpxR9jNBXW/mr91DjxKM/3xEkICeI6grCMachpvZajPr6u5TCMas/kvwfzOPYCwhXu0J3vN6gjrv5u6b3X0jwdlnX4XHahP7IndfR3CCxlkEn4u5wIkFOK7kIuuMFZESJ7yS92V3z6sLJymZWTmC03O7u/u4qOMRyYtaFCLFxMxON7M9zawiO8YMvok4LJF8JSxRmNnzZrbUzGbkst3M7HEzmxdOTdAqUbGIJIm2BGflLCfoHjnH3TdFG5JI/hLW9WRmxxOc5/+iuzfLYXsH4GqCc82PIrhYTANPIiJJJmEtCnefQHCVam46ESQRd/dvgD3NLJ7zxkVEpBhFecHdgfz1rIq0cN2S7AXNrB/QD6By5cpHNG7cuFgCFBEp0Twd5s+G9Vv4LoPl7l6jMLuJMlFkv/gHcrmgxt2HAEMAUlJSfMqUKYmMS0SkZMvMhF9fg+/7w+htULEd9vT4Xwu7uyjPekojuOQ+Sy2Cc+FFRKSw5k2G42rCwO5QpT48MhWe2rkzsKNMFCOBS8Ozn9oAa8Irg0VEpKAyM+C+S6BFa5jyJ+zbBU79Gvb827lEBZawriczG0EwQ2f1cPKzOwkmnMPdBxNMSteB4KrNjQRXCouISEFN/RRSu8CPq6HlnjDsf3DYSUW2+4QlinBSr7y2Z904RURECiMzHeY8Bi/dCnO2wr2XwC3DoFzRdhZpmnERkZLo67fh7Rug1UI462zofx/U3vluppwoUYiIlCSb1sF1HeHZCbBnOej7IhxyMVhOJ5IWDc31JCJSUox9DhrXgMET4JR68OMsaHRJQpMEKFGIiCS/9A0wph+c2Qc2pMPwO+HDBVArr1nzi466nkREktlXw2HFPbB+Afz7dOg9FPYp3pn1lShERJLR0l+h3+kwcg7cVwt6jYf9crzZZMKp60lEJNk8fxs0qR8kiUtbw1VTI0sSoBaFiEjy2LwUurSBDxZC3Urw+jNwyqVRR6UWhYhI5DIzYcFLMKoJ7PcbXHkS/LQ
8KZIEqEUhIhKtOd9Cj7Pg8GVwTht49Dmo1jTqqP5CLQoRkShkpMPdF8LhbeDHZVDzAjj1y6RLEqAWhYhI8fvhI+jRFaavgVZ7w/B3oNnxUUeVK7UoRESKS2Y6zHoIXu0IC9bCAz1h8rKkThKgFoWISPH44g149//giF/hrHPh2vugZpOoo4qLEoWISCJtXAsDzoTnv4S9ysFlr0DDCxM+P1NRUteTiEiijB4STOL37JdwWn2YOhsOuahEJQlQohARKXrb1sOYPnD2ZbApA166B0bPh5oNo46sUNT1JCJSlCY8B6vuhQ2/wAMdoNezsHfNqKPaKUoUIiJF4c+F0Pt0+GAuPFAben4B+x4bdVRFQl1PIiI769mboHEDGDMXerWFq6aVmiQBalGIiBTepj+gS1sY/QvU3w3efhZO7B51VEVOLQoRkYLKzIT5w+GDplAzDfqfBrOWl8okAWpRiIgUzE9fQY9OcMQKOPdoGPgcVGscdVQJpRaFiEg8MtLhn12h1bEwbQXUughO/aLUJwlQi0JEJH/fjYHUbjBjLaTsA8PfhaalZ7A6P2pRiIjkJnMbzLwfRpwNC9fBQ33g26VlKkmAWhQiIjkbPwLeuxmO/A06dYHr74UDGkUdVSSUKEREYm1YDdd0gGETYe9y8I8R0LBb1FFFSl1PIiJZRj0FjfaD5yfCGQ1hxrwynyRAiUJEBLatgw9S4ZwrYVsmvPpvGPUz7Fcv6siSgrqeRKRsGz8EVt8HGxfBQx2h57Ow1/5RR5VUlChEpGxaMh/6nB5M/31/Hej9JdQ4OuqokpK6nkSk7Bl8AzQ9BMbOhz7HwNVTlSTyoBaFiJQdm5ZA5zbw4W/QYDd47wU4/oKoo0p6alGISOmXmQnznodRTaHWYrj2DJi5UkkiTglNFGbW3szmmNk8M7s5h+11zGycmf1gZtPMrEMi4xGRMmjGBDiyBgzsDXs2h0dmwqOjoUKlqCMrMRKWKMysPPAkcAbQFLjQzJpmK3Y78Ia7Hw50A55KVDwiUsZs2wq3nQcpJ8CslVDvEjhlPOxxSNSRlTiJHKNoDcxz9wUAZvYa0AmYFVPGgT3C59WAxQmMR0TKikmjIPVC+Gk9tK4Bw9+Dxm2jjqrESmTX04HAopjltHBdrLuAi80sDRgNXJ3Tjsysn5lNMbMpy5YtS0SsIlIaZG6DGffBm+fCog3w6OUw8Q8liZ2UyERhOazzbMsXAsPcvRbQAXjJzP4Wk7sPcfcUd0+pUaNGAkIVkRLvs5fhuvow7Xbo1Bnmz4Vrn4ZyOmdnZyWy6ykNqB2zXIu/dy31BtoDuPtEM6sEVAeWJjAuESlN1q0MJvF78VuoXh6ufB0ado06qlIlkal2MtDQzOqZWQWCweqR2cr8BpwMYGZNgEqA+pZEJD7vPQGN94dh38KZh8D0+UoSCZCwROHu6cBVwFjgJ4Kzm2aa2T1mdnZY7Hqgr5lNBUYAqe6evXtKROSvtq2FDy6F866BDIc3HoKRc2Dfg6KOrFSykva9nJKS4lOmTIk6DBGJyidPwfoHYGMaLD0beg6BavtGHVXSM7Pv3D2lMK/VFB4iUjKkzYE+Z8DYhfDAQdD7a6jeJuqoygSdDiAiyS0zE568Fg5tAp8shMuPh6unKUkUI7UoRCR5bVwM5x4FH6VBw93hhRfhmPOijqrMUYtCRJJPZibMfRY+aAp1/4AbOsKMFUoSEVGLQkSSy/Tx0KMzHLkKzj8BBg6Fqg2ijqpMU4tCRJLDtq1wyzlw5IkwexUcnAonf6YkkQTUohCR6H0zEnpeBLM3QJt94cVR0PDIqKOSkFoUIhKdjK0w/R54uzMs3giPXQVfLVGSSDJqUYhIND4eDiNvhbaLodNFcOO9UKNe1FFJDpQoRKR4rV0OV7aHV76DfctD/7eggc5mSmbqehKR4vP2f6DRAfDyd3BOE5i+QEmiBFCiEJHE27oG3u8OXa8Nlt96BN6eBTXqRBuXxEVdTyKSWB8/Aes
fhM1LYFBn6PEM7FE96qikAJQoRCQxFv0Evc6AT36F++tBn4lQvXXUUUkhqOtJRIpWZiY8fhU0OxTG/Qr/OBH6T1OSKMHialGEd6ir4+7zEhyPiJRkG9OgUxv45HdoXBmefwXadoo6KtlJ+bYozOxMYDrwcbjc0szeSXRgIlKCZKTDz4NhVFOovxRu6gTTVipJlBLxtCjuAY4CxgG4+49mpslXRCTw46eQ2gVar4YLToJHnoUq9aOOSopQPGMU29x9dbZ1Jev+qSJS9LZuhhs7QutTYO5qaNQbTvpESaIUiqdF8ZOZdQXKmVk9oD/wTWLDEpGk9vXbkHoJzN0Ix+wPw0fBwUdEHZUkSDwtiquAI4BM4G1gM0GyEJGyJmMLTLsT3j0f/twET/SHCb8rSZRy8bQoTnf3m4CbslaYWWeCpCEiZcXY52DUHdB2CZxzMdx0L+xzUNRRSTGIp0Vxew7rbivqQEQkSa1ZCt2PgDP6wFtLoc27cPRLShJlSK4tCjM7HWgPHGhmj8Zs2oOgG0pESru3BsJVt8Cf6XDeofDMh7BPraijkmKWV9fTUmAGwZjEzJj164CbExmUiERs62oY+w/oNgL22xXefRw6XR11VBKRXBOFu/8A/GBmr7j75mKMSUSi9OEg2PQwbP4T/tMlmMSv6t5RRyURimcw+0Azuw9oClTKWunuhyQsKhEpfr/MgN4d4LNF8EB96PMt7JMSdVSSBOIZzB4GvAAYcAbwBvBaAmMSkeKUmQmDroDmLWDCIrj6FBgwXUlCtosnUezu7mMB3H2+u98OnJjYsESkWGz4DU6pBdcNhtqV4av34fGPoeLuUUcmSSSerqctZmbAfDO7HPgd2DexYYlIQmWkw/wh8ONN0GgrtO0Md42AXStEHZkkoXgSxbVAFeAa4D6gGtArkUGJSAJ9PxZSu0KbtdDtFHh4CFSpF3VUksTyTRTu/m34dB1wCYCZ6URqkZJm62a4pTM8MQYqGPTsBycOBrOoI5Mkl2eiMLMjgQOBL919uZkdSjCVx0mAkoVISfHlm9AzFeZthOMOgGGjoX7LqKOSEiLXwWwzux94BegOfGhmtxHck2IqoFNjRUqCjM0w9XYY2Q2WbYanrofPf1eSkALJq0XRCTjM3TeZ2d7A4nB5Trw7N7P2wGNAeWCouz+QQ5muwF0E97iY6u4XFSB+EcnNB8/A6DvhmD/h3B5w879g79pRRyUlUF6JYrO7bwJw95VmNruASaI88CRwKpAGTDazke4+K6ZMQ+AW4Bh3X2VmOptKZGet+gOuaA9vTIX9d4EbRkK9s6KOSkqwvBJFfTPLmkrcgLoxy7h753z23RqY5+4LAMzsNYJWyqyYMn2BJ919VbjPpQWMX0Rijbgf+t8ByzPg/Obw9Iewd82oo5ISLq9EcV625f8WcN8HAotiltMI7r0d6xAAM/uKoHvqLnf/MPuOzKwf0A+gTp06BQxDpAzYugrGXAaXvAn7V4D3Hoez/hF1VFJK5DUp4Kc7ue+czrnLfq/tXYCGQDuCs6i+MLNm2e/R7e5DgCEAKSkpul+3SKwPHoYtj8KWZfDfbnDJ01B5z6ijklIknik8CisNiB05q0UwIJ69zHvuvs3dFwJzCBKHiORn4TQ4sRZ0/D+YVwVOnwyXj1CSkCKXyEQxGWhoZvXMrALQDRiZrcy7hPNGmVl1gq6oBQmMSaTky8yEh/tC85bw1e/Q/zToPxX2PjzqyKSUimcKDwDMrKK7b4m3vLunm9lVwFiC8Yfn3X2mmd0DTHH3keG208xsFpAB3OjuKwr2FkTKkA2/Qsc2MP4POLQqvPAaHNkh6qiklDP3vLv8zaw18BxQzd3rmNlhQB93j+R2VykpKT5lypQoDi0SnYx0mPs0TLsFPt8Ge58dTOJXPu7felLGmdl37l6ouePj+ZQ9DnQk6CbC3aeamaYZFykuk0dDz27Qdh1cdDo89AxUPijqqKQMiWeMopy7/5ptXUYighGRGFs
2BuMPR58Jv6yHZldAuzFKElLs4mlRLAq7nzy82vpq4OfEhiVSxn3+GvTqBQs2QbsD4fkxUK951FFJGRVPi+IK4DqgDvAn0CZcJyJFLWMz/HgLjOkOK7fAMzfCuDQlCYlUPC2KdHfvlvBIRMq695+C0XfDcUvhnJ7BJH57Hhh1VCJxJYrJZjYHeB14293XJTgmkbJl5WK4vD28OR1q7gI3jYK6Z0Ydlch2+XY9ufvBwL3AEcB0M3vXzNTCECkKr94HjevAW9PhwpYwY5GShCSduK7Mdvev3f0aoBWwluCGRiJSWFtWwLtd4NLboWJ5eP8ZePUH2Gv/qCMT+Zt8u57MrArB9ODdgCbAe8DRCY5LpHTKzIRRD0L6f2DLSnjyIrj4KahcLerIRHIVzxjFDOB94CF3/yLB8YiUXgt+hNQO8MUSePAQ6PsR7HVY1FGJ5CueRFHf3TMTHolIaZWZCQ/3gX8Ng60O150BA96GCpWijkwkLrkmCjN7xN2vB/5nZn+bECqOO9yJyPqFcGYbmLAUmu0Bw9+AVqdHHZVIgeTVong9/Legd7YTkW1bYe6TMP12aJYJJ3WD21/SJH5SIuV1h7tJ4dMm7v6XZBFOH76zd8ATKZ0mvQ+pF8HR66H7GeEkfrXzf51Ikorn9NheOazrXdSBiJR4WzbC1afAMWfDog3Q8ipo94GShJR4eY1RXEBwSmw9M3s7ZlNVYHXOrxIpo8a/Ar36wMLNcFJteGEM1Dk06qhEikReHaaTgBUE97p+Mmb9OuCHRAYlUmKkb4Lpd8GHD8Mag6G3QO9/Rx2VSJHKa4xiIbAQ+KT4whEpQd59HD68F45fBuf2gVvugWoHRB2VSJHLq+vpc3c/wcxWAbGnxxrg7r53wqMTSUYr0uCy0+F/s6DmrnDLaDjojKijEkmYvAazs253Wh2oEfPIWhYpe168GxrVhbdnQfcjYFaakoSUerkmipirsWsD5d09A2gLXAZULobYRJLH5uXwzrnQ6y6ovAuMGQovT4Fq+0YdmUjCxXN67LsEt0E9GHiRYGLAVxMalUiyyMyEd+6FD5rC5lEw+FKYvQxO1xniUnbEc5loprtvM7POwH/c/XEz01lPUvrN/w56dISv/oAHG0G/T2FP3ZJUyp54WhTpZnY+cAkwKly3a+JCEolYZib8+1JokQKT/oAbO8KAH5UkpMyKp0XRC/gHwTTjC8ysHjAisWGJRGTdfOjQFr5cBi2qwfD/QcuTo45KJFLx3Ap1BnANMMXMGgOL3P2+hEcmUpy2bYWZA2F0c2i5Du7pDt8vV5IQIb473B0HvAT8TnANxf5mdom7f5Xo4ESKxdfvQO9L4JgNcHFHePBp2L1W1FGJJI14up4GAR3cfRaAmTUhSBwpiQxMJOE2rYfrz4Ih46GyQav+cMIgMIs6MpGkEk+iqJCVJADc/Sczq5DAmEQS79Ph0Psy+HULnHoQPDcGajeJOiqRpBTPWU/fm9kzZnZs+HgaTQooJVX6Rvj+BvikJ6xPhxfugI9+UZIQyUM8LYrLCQaz/49gjGIC8EQigxJJiP89Ch/+G05cAedeBrfcDXvsF3VUIkkvz0RhZs2Bg4F33P2h4glJpIgt+w36nQ7vzoZau8IdY6HOaVFHJVJi5Nr1ZGa3Ekzf0R342MxyutOdSHJ74Q5oXA/emw2XpMDMxUoSIgWUV4uiO9DC3TeYWQ1gNPB88YQlspM2L4PRfaDvSKhdCV4bDKf2iDoqkRIpr0Sxxd03ALj7MjOLZ+BbJFqZmfDOPcB/YdtaGJIKFz4Bu1WJOjKREiuvRFE/5l7ZBhwce+9sd++c387NrD3wGFAeGOruD+RSrgvwJnCku0+JN3iRv/h5EqSeBROXwkNNoO/nsKfuWy2ys/JKFOdlW/5vQXZsZuUJ7rV9KpAGTDazkbHXZITlqhKcVfVtQfYvsl1GOvy7B9z/KmQAN3WCAW/ArrrcR6Qo5HXP7E93ct+tgXnuvgDAzF4DOgG
zspX7F/AQcMNOHk/KorVzg0n8vloBh+8Fw9+G5u2ijkqkVEnkuMOBwKKY5bRw3XZmdjhQ291HkQcz62dmU8xsyrJly4o+Uil5tm6GmQ/BmBZw+Ea471KYslxJQiQB4rngrrBymjDHt28MBscHAan57cjdhwBDAFJSUjyf4lLaffkW9OoBx22EHp3gwadg95pRRyVSasXdojCzigXcdxrB/baz1AIWxyxXBZoB483sF6ANMNLMNNmg5GzjWrj8eGh3Pvy5CVpfB8e9oyQhkmD5Jgoza21m04G54fJhZhbPFB6TgYZmVi+cRLAbMDJro7uvcffq7l7X3esC3wBn66wnydHHz0OTfeGZL+DUevDTHLjsEc30KlIM4mlRPA50BFYAuPtU4MT8XuTu6cBVwFjgJ+ANd59pZveY2dmFD1nKlPQN8N218Glv2JQOL94NYxZAzYZRRyZSZsQzRlHO3X+1v/5yy4hn5+4+muCK7th1/8ylbLt49illyFsD4cP74aSVcN4/4Pa7oEqNqKMSKXPiSRSLzKw14OG1EVcDPyc2LCnTlv4KfU+DkT9D7Qpw58dQ+5SooxIps+LperoCuA6oA/xJMOh8RSKDkjLsuVugcX14/2dIPQpmLlGSEIlYvi0Kd19KMBAtkjib/oTRveGyD6BOJXhzCJx8SdRRiQhxJAoze5aY6x+yuHu/hEQkZUtmJrz5Tyj/NKSvh6G9odtjUKly1JGJSCieMYpPYp5XAs7lr1dcixTO7InBBXOTlsHDTaHvW1BNtyQVSTbxdD29HrtsZi8BHycsIin9MtLhnovhodeDtupt50H/VzWJn0iSKswUHvWAg4o6ECkj1v4M7Y+GiSvgiL1h+Htw6LFRRyUieYhnjGIVO8YoygErgZsTGZSUQls3w8+DYMbdcGR5OKcX3PAslNP9sESSXZ6JwoKr7A4Dfg9XZbq7JuWTgvnidejVE47bBKnnwgNPwm4HRB2ViMQpz59zYVJ4x90zwoeShMRvw2roewyc2A2WbYG2/wfHv60kIVLCxNPun2RmrRIeiZQuY4dC4/1g6Ndw+sHw08/Q98GooxKRQsg1UZhZVrfUsQTJYo6ZfW9mP5jZ98UTnpQ429ZDNKdSAAAUbElEQVTDlGvg876wJQNeuRc+mAcHHBx1ZCJSSHmNUUwCWgHnFFMsUtKNuB8+eRhOXg2dr4Jb74Qq1aOOSkR2Ul6JwgDcfX4xxSIl1R8LoM/pQcuhTgW46xOofVLUUYlIEckrUdQws+ty2+jujyYgHilpnrkRbn4U1mZCr7bw+GiovGfUUYlIEcorUZQHqpDzva+lrNv0B3zQC64cA3V3g3eeg3YXRh2ViCRAXoliibvfU2yRSMmQmQmv3wq7DoH0jfDCZdD1Uai4e9SRiUiC5DtGIbLdrC8h9RyYvAIGNoe+b8IejaKOSkQSLK/rKE4utigkuWWkwx3nQ6vjYNqK4PmA75UkRMqIXFsU7r6yOAORJLVmdjCJ3zer4Mh9gkn8mhwTdVQiUow0I5vkbMtGmH4vjDkMjtoKD/eFb5YqSYiUQYWZZlxKu3GvQJ++cPwm6H0+3P8E7LZf1FGJSETUopAd1q0MroU45WJYuQWOuwWOfUNJQqSMU4tCAqOfgX5Xw+/boGNDGDoW9qsXdVQikgTUoijrtq2DyVfBl5dDusPrD8L7PytJiMh2ShRl2Sv/gn61Ye5TcF5/mP8ndP2/qKMSkSSjrqeyaPFc6N0ePlwAdSvCvePhwOOjjkpEkpRaFGVJZiY8fT00bQQfL4C+x8LMpUoSIpIntSjKik1L4P1UuPojqL87vP8CHNc16qhEpARQoijtMjPh1Zug4rPgW2D4P+D8R6BCpagjE5ESQomiNJvxOfToDN+vhIEtwkn8Dok6KhEpYTRGURpt2wq3ngsp7eCnlXD3hTDgOyUJESkUtShKmzWz4LRjYNJqOKoGDH8fGh0VdVQiUoKpRVFabN4AU++GMYfDMenw6BX
w9R9KEiKy0xKaKMysvZnNMbN5ZnZzDtuvM7NZZjbNzD41s4MSGU+p9cmL0LQ6DLoLaneGf8+Ha5+CcvodICI7L2HfJGZWHngSOANoClxoZk2zFfsBSHH3FsBbwEOJiqdUWrcSUo+C03rA6q3Q7nY4ZgRU2jfqyESkFEnkT87WwDx3X+DuW4HXgE6xBdx9nLtvDBe/AWolMJ7S5YOnoPH+MHwSnNUIZi+A1H9FHZWIlEKJTBQHAotiltPCdbnpDYzJaYOZ9TOzKWY2ZdmyZUUYYgm0bS1MugK+vhIyHd54GN6bDfuq105EEiORicJyWOc5FjS7GEgBHs5pu7sPcfcUd0+pUaNGEYZYwrx4F/SpA/OHwHnXwYJlcP4NUUclIqVcIk+PTQNqxyzXAhZnL2RmpwC3ASe4+5YExlNypc0JJvH76BeoVwnu/xxqHht1VCJSRiSyRTEZaGhm9cysAtANGBlbwMwOB54Bznb3pQmMpWTKzIQnroFDm8Cnv8DlJ8DMZUoSIlKsEtaicPd0M7sKGAuUB55395lmdg8wxd1HEnQ1VQHeNDOA39z97ETFVKJs/D2YxO+6T6B+ZXh+OBxzXtRRiUgZZO45DhskrZSUFJ8yZUrUYSROZia8dD3s/jxkbgPvA+cNhF0rRB2ZiJRgZvadu6cU5rWawiOZTP0MUs+DH1fDwJbQ702o2iDqqESkjNOlu8lg21a4qRO0PhnmrIZ/dYcBk5UkRCQpqEURtdUz4LRjYfIaaLsvDB8FDY+MOioRke3UoojKpvUw9U74sBUclwmPXQVfLlGSEJGkoxZFFD56AfpdAe22QN+L4JzHoFL1qKMSEcmRWhTFae1yuDgF2veCddvg5DvhmFeUJEQkqalFUVxGPg6XXQ9/pMO5TeCZD6FGnaijEhHJl1oUibZ1DXzbD77tD+UM/jcI3p6lJCEiJYYSRSK9cAf0qQ0LnoMuN8L85dB5QNRRiYgUiLqeEmHRT9CrPXzyG9SvBA9+CQe0jToqEZFCUYuiKGVmwmNXwqGHwrjf4MqTYMYyJQkRKdHUoigqGxbB+z3ghnHQoDI8/wq07ZT/60REkpwSxc7KSIfh10GVYWAZ8MoAOPdBTeInIqWGEsXO+OFjSD0fpq2BR1tB3zehSv2ooxIRKVIaoyiMrZvhhjPhqNNg3hq4PxX6T1aSEJFSSS2Kglo1DU49Dr5bC8fuD8M+gINbRR2VSFLatm0baWlpbN68OepQyoxKlSpRq1Ytdt111yLbpxJFvDauhZ8fhp8egBN3h14D4PJHoJwaZSK5SUtLo2rVqtStW5fwLpaSQO7OihUrSEtLo169ekW2XyWKeIx5Fi67Ck7cCpddAp0HQcV9oo5KJOlt3rxZSaIYmRn77LMPy5YtK9L96udwXtYshYsOhzP7waYMOO0eOPpFJQmRAlCSKF6JqG+1KHLz7iC4/P/gz3To0gwGj4F9akUdlYhIsVOLIrutq+Gb3jD5OtjF4N3H4c3pShIiJdg777yDmTF79uzt68aPH0/Hjh3/Ui41NZW33noLCAbib775Zho2bEizZs1o3bo1Y8aM2elY7r//fho0aECjRo0YO3ZsjmVSU1OpV68eLVu2pGXLlvz4449AMAZxzTXX0KBBA1q0aMH333+/0/HEQy2KWENvgfFPwpkb4fyb4Y5boVLVqKMSkZ00YsQIjj32WF577TXuuuuuuF5zxx13sGTJEmbMmEHFihX5888/+fzzz3cqjlmzZvHaa68xc+ZMFi9ezCmnnMLPP/9M+fLl/1b24YcfpkuXLn9ZN2bMGObOncvcuXP59ttvueKKK/j22293KqZ4KFEA/DIDep0B49Lg4N1g4Jewf5uooxIpXb4bAKt+LNp97tUSjvhPnkXWr1/PV199xbhx4zj77LPjShQbN27k2WefZeHChVSsWBGA/fbbj65du+5UuO+99x7dunWjYsWK1KtXjwYNGjBp0iTato1vPrj33nuPSy+9FDOjTZs
2rF69miVLlnDAAQfsVFz5KdtdT5mZ8Mhl0LwFfJEG15wKM5crSYiUIu+++y7t27fnkEMOYe+9946ru2bevHnUqVOHPfbYI9+y11577fYuotjHAw888Leyv//+O7Vr196+XKtWLX7//fcc93vbbbfRokULrr32WrZs2VLg1xelstui2PAbjLwUbv4cDqkKw0bAkWdGHZVI6ZXPL/9EGTFiBAMGBPeB6datGyNGjKBVq1a5nh1U0LOGBg0aFHdZd4/rePfffz/7778/W7dupV+/fjz44IP885//jPv1Ra3sJYqMdHj+Gqj2EpR3GHEDdLpPk/iJlEIrVqzgs88+Y8aMGZgZGRkZmBkPPfQQ++yzD6tWrfpL+ZUrV1K9enUaNGjAb7/9xrp166haNe9xymuvvZZx48b9bX23bt24+eab/7KuVq1aLFq0aPtyWloaNWvW/Ntrs7qSKlasSM+ePRk4cGCBXl/k3L1EPY444ggvtO/GuDfbwx3cHz3Cfd3Cwu9LRPI1a9asSI8/ePBg79ev31/WHX/88T5hwgTfvHmz161bd3uMv/zyi9epU8dXr17t7u433nijp6am+pYtW9zdffHixf7SSy/tVDwzZszwFi1a+ObNm33BggVer149T09P/1u5xYsXu7t7Zmam9+/f32+66SZ3dx81apS3b9/eMzMzfeLEiX7kkUfmeJyc6h2Y4oX83i0bYxRbN8N1Z0CbM2DhOniwN/SfBFXqRh2ZiCTQiBEjOPfcc/+y7rzzzuPVV1+lYsWKvPzyy/Ts2ZOWLVvSpUsXhg4dSrVq1QC49957qVGjBk2bNqVZs2acc8451KhRY6fiOfTQQ+natStNmzalffv2PPnkk9vPeOrQoQOLFy8GoHv37jRv3pzmzZuzfPlybr/99u1l6tevT4MGDejbty9PPfXUTsUTL/Mc+rySWUpKik+ZMiX+F6z6EU45Hr5fB8fXhGFjoF6LxAUoItv99NNPNGnSJOowypyc6t3MvnP3lMLsr/S2KDashu9vgQ9T4KRy8PQN8PnvShIiIgVUOgezRz0Nlw+Ak7bCFT2g86NQce+ooxIRKZFKV4ti1R/Q7TA4+x+wNQM6/BvaDlOSEIlQSeveLukSUd+lJ1G8PRAa1YI3pkHXFjBnEXS7JeqoRMq0SpUqsWLFCiWLYuLh/SgqVapUpPst+V1PW1bCD9fDD8OgUgUY+QR0vCLqqESE4Lz/tLS0Ir8/guQu6w53RalkJ4pnboQJg6HjJuhyazCJX4XKUUclIqFdd921SO+0JtFIaNeTmbU3szlmNs/Mbs5he0Uzez3c/q2Z1Y1rxwunQrsD4fKBMDkTTpoIh92nJCEikgAJSxRmVh54EjgDaApcaGZNsxXrDaxy9wbAIODBfHf8x6/Q/HD4ejEMOB2mL4P9jizi6EVEJEsiWxStgXnuvsDdtwKvAZ2ylekEDA+fvwWcbPnNcLV4OdSrChNHw6APoeLuRR23iIjESOQYxYHAopjlNOCo3Mq4e7qZrQH2AZbHFjKzfkC/cHGLzVg7g5QOCQm6hKlOtroqw1QXO6gudlBd7NCosC9MZKLIqWWQ/Ry5eMrg7kOAIQBmNqWwl6GXNqqLHVQXO6gudlBd7GBmBZj76K8S2fWUBtSOWa4FLM6tjJntAlQDViYwJhERKaBEJorJQEMzq2dmFYBuwMhsZUYCPcLnXYDPXFfmiIgklYR1PYVjDlcBY4HywPPuPtPM7iGYF30k8BzwkpnNI2hJdItj10MSFXMJpLrYQXWxg+piB9XFDoWuixI3zbiIiBSv0jPXk4iIJIQShYiI5ClpE0XCpv8ogeKoi+vMbJaZTTOzT83soCjiLA751UVMuS5m5mZWak+NjKcuzKxr+NmYaWavFneMxSWOv5E6ZjbOzH4I/05K5YVYZva8mS01sxm5bDczezysp2lm1iquHRf2ZtuJfBAMfs8H6gMVgKlA02xl/gEMDp93A16POu4I6+JEYPfw+RV
luS7CclWBCcA3QErUcUf4uWgI/ADsFS7vG3XcEdbFEOCK8HlT4Jeo405QXRwPtAJm5LK9AzCG4Bq2NsC38ew3WVsUiZn+o2TKty7cfZy7bwwXvyG4ZqU0iudzAfAv4CFgc3EGV8ziqYu+wJPuvgrA3ZcWc4zFJZ66cGCP8Hk1/n5NV6ng7hPI+1q0TsCLHvgG2NPMDshvv8maKHKa/uPA3Mq4ezqQNf1HaRNPXcTqTfCLoTTKty7M7HCgtruPKs7AIhDP5+IQ4BAz+8rMvjGz9sUWXfGKpy7uAi42szRgNHB18YSWdAr6fQIk7/0oimz6j1Ig7vdpZhcDKcAJCY0oOnnWhZmVI5iFOLW4AopQPJ+LXQi6n9oRtDK/MLNm7r46wbEVt3jq4kJgmLs/YmZtCa7faubumYkPL6kU6nszWVsUmv5jh3jqAjM7BbgNONvdtxRTbMUtv7qoCjQDxpvZLwR9sCNL6YB2vH8j77n7NndfCMwhSBylTTx10Rt4A8DdJwKVCCYMLGvi+j7JLlkThab/2CHfugi7W54hSBKltR8a8qkLd1/j7tXdva671yUYrznb3Qs9GVoSi+dv5F2CEx0ws+oEXVELijXK4hFPXfwGnAxgZk0IEkVZvD/rSODS8OynNsAad1+S34uSsuvJEzf9R4kTZ108DFQB3gzH839z97MjCzpB4qyLMiHOuhgLnGZms4AM4EZ3XxFd1IkRZ11cDzxrZtcSdLWklsYflmY2gqCrsXo4HnMnsCuAuw8mGJ/pAMwDNgI949pvKawrEREpQsna9SQiIklCiUJERPKkRCEiInlSohARkTwpUYiISJ6UKCTpmFmGmf0Y86ibR9m6uc2UWcBjjg9nH50aTnnRqBD7uNzMLg2fp5pZzZhtQ82saRHHOdnMWsbxmgFmtvvOHlvKLiUKSUab3L1lzOOXYjpud3c/jGCyyYcL+mJ3H+zuL4aLqUDNmG193H1WkUS5I86niC/OAYAShRSaEoWUCGHL4Qsz+z58HJ1DmUPNbFLYCplmZg3D9RfHrH/GzMrnc7gJQIPwtSeH9zCYHs71XzFc/4DtuAfIwHDdXWZ2g5l1IZhz65XwmLuFLYEUM7vCzB6KiTnVzJ4oZJwTiZnQzcyeNrMpFtx74u5w3TUECWucmY0L151mZhPDenzTzKrkcxwp45QoJBntFtPt9E64bilwqru3Ai4AHs/hdZcDj7l7S4Iv6rRwuoYLgGPC9RlA93yOfxYw3cwqAcOAC9y9OcFMBleY2d7AucCh7t4CuDf2xe7+FjCF4Jd/S3ffFLP5LaBzzPIFwOuFjLM9wTQdWW5z9xSgBXCCmbVw98cJ5vI50d1PDKfyuB04JazLKcB1+RxHyriknMJDyrxN4ZdlrF2B/4Z98hkE8xZlNxG4zcxqAW+7+1wzOxk4ApgcTm+yG0HSyckrZrYJ+IVgGupGwEJ3/zncPhy4Evgvwb0uhprZB0DcU5q7+zIzWxDOszM3PMZX4X4LEmdlgukqYu9Q1tXM+hH8XR9AcIOeadle2yZc/1V4nAoE9SaSKyUKKSmuBf4EDiNoCf/tpkTu/qqZfQucCYw1sz4E0yoPd/db4jhG99gJBM0sx/ubhHMLtSaYZK4bcBVwUgHey+tAV2A28I67uwXf2nHHSXAXtweAJ4HOZlYPuAE40t1XmdkwgonvsjPgY3e/sADxShmnricpKaoBS8L7B1xC8Gv6L8ysPrAg7G4ZSdAF8ynQxcz2DcvsbfHfU3w2UNfMGoTLlwCfh3361dx9NMFAcU5nHq0jmPY8J28D5xDcI+H1cF2B4nT3bQRdSG3Cbqs9gA3AGjPbDzgjl1i+AY7Jek9mtruZ5dQ6E9lOiUJKiqeAHmb2DUG304YcylwAzDCzH4HGBLd8nEXwhfqRmU0DPibolsmXu28mmF3zTTObDmQCgwm+dEeF+/ucoLWT3TBgcNZgdrb9rgJmAQe5+6RwXYHjDMc+HgFucPepBPfHngk
8T9CdlWUIMMbMxrn7MoIzskaEx/mGoK5EcqXZY0VEJE9qUYiISJ6UKEREJE9KFCIikiclChERyZMShYiI5EmJQkRE8qREISIiefp/ytPlHcqphJkAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "probs_positive = np.concatenate((np.ones((probabilities.shape[0], 1)), \n", + " np.zeros((probabilities.shape[0], 1))),\n", + " axis = 1)\n", + "scores_positive = score_model(probs_positive, 0.5)\n", + "print_metrics(y_test, scores_positive) \n", + "plot_auc(y_test, probs_positive) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the accuracy from this 'classifier' is still 0.68. This reflects the class imbalance. The ROC curve is directly along the diagonal giving an AUC of 0.5. The logistic regression classifier is definitely better than this!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Compute a weighted model\n", + "\n", + "Recall that a falsely classifing a bad credit risk customer as good costs the bank five times more than classifing a good credit risk customer as bad. Given this situation the results of the first model are not that good. There are two reasons for this:\n", + "\n", + "1. The class imbalance in the label has biased the training of the model. As you observed from the accuracy of the naive 'classifier' is not that different from the logistic regression model. \n", + "2. Nothing has been done to weight the results toward correctly classifing the bad credit risk customers at the expense of the good credit risk customers.\n", + "\n", + "One approach to these problems is to weight the classes when computing the logistic regression model. The code in the cell below adds a `class_weitght` argument to the call to the `LogisticRegression` function. In this case weights are chosen as $0.1, 0.9$ since there is an approximately two to one class imbalance and the asymmetry in the costs of errors to the bank. 
Execute this code."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 17,
+   "metadata": {
+    "scrolled": true
+   },
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "LogisticRegression(C=1.0, class_weight={0: 0.1, 1: 0.9}, dual=False,\n",
+       "          fit_intercept=True, intercept_scaling=1, max_iter=100,\n",
+       "          multi_class='ovr', n_jobs=1, penalty='l2', random_state=None,\n",
+       "          solver='liblinear', tol=0.0001, verbose=0, warm_start=False)"
+      ]
+     },
+     "execution_count": 17,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "logistic_mod = linear_model.LogisticRegression(class_weight = {0:0.1, 1:0.9}) \n",
+    "logistic_mod.fit(x_train, y_train)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Next, execute the code in the cell below to compute and display the class probabilities for each case. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 18,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "[[0.67991032 0.32008968]\n",
+      " [0.49165574 0.50834426]\n",
+      " [0.84404653 0.15595347]\n",
+      " [0.31600472 0.68399528]\n",
+      " [0.34917118 0.65082882]\n",
+      " [0.96015149 0.03984851]\n",
+      " [0.34313761 0.65686239]\n",
+      " [0.77929504 0.22070496]\n",
+      " [0.85492161 0.14507839]\n",
+      " [0.93792393 0.06207607]\n",
+      " [0.96763624 0.03236376]\n",
+      " [0.77111192 0.22888808]\n",
+      " [0.89779186 0.10220814]\n",
+      " [0.80622461 0.19377539]\n",
+      " [0.96692379 0.03307621]]\n"
+     ]
+    }
+   ],
+   "source": [
+    "probabilities = logistic_mod.predict_proba(x_test)\n",
+    "print(probabilities[:15,:])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "By eyeball, the above probabilities are not terribly different from those of the unweighted model. \n",
+    "\n",
+    "To find out whether there is any significant difference from the unweighted model, compute the scores and the metrics and display them by executing the code in the cell below. 
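\n",
+    "\n",
+    "For reference, the `class_weight` argument works by scaling each training case's contribution to the log-loss by the weight of its class. With $w_0 = 0.1$ and $w_1 = 0.9$, the weights quoted above, the criterion being minimized is (a sketch of the weighting scheme, not code from this lab):\n",
+    "\n",
+    "$$J = -\\sum_i w_{y_i} \\left[ y_i \\log(p_i) + (1 - y_i) \\log(1 - p_i) \\right]$$\n",
+    "\n",
+    "so an error on a class 1 case costs nine times as much as an error on a class 0 case. 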
" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 174 31\n", + "True negative 50 45\n", + "\n", + "Accuracy 0.73\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.78 0.59\n", + "Recall 0.85 0.47\n", + "F1 0.81 0.53\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3XeYFFX2//H3AQUUMazKuooICipBRB0RXAXMGGEFEUygIl/zGlddd9V19afrrnGNiDmAESOKiaCugCCigIIICgMqiKCggITz++PWMM0409MTuqvD5/U8/dBdXV11uujp0/feqnPN3REREalInbgDEBGR7KZEISIiSSlRiIhIUkoUIiKSlBKFiIgkpUQhIiJJKVFIyszsRDN7I+44somZLTOzHWPYbzMzczPbINP7Tgczm2pmXavxOn0mM0CJIkeZ2Vdmtjz6ovrWzB42s03SuU93f8LdD03nPhKZ2b5m9o6ZLTWzH83sZTNrnan9lxPPKDMbkLjM3Tdx91lp2t/OZvaMmX0fvf9PzOwiM6ubjv1VV5SwWtRkG+7ext1HVbKf3yTHTH8mC5USRW472t03AdoDewBXxBxPtZT3q9jMOgFvAC8C2wLNgcnA++n4BZ9tv8zNbCdgHDAX2M3dNwOOA4qARrW8r9jee7Ydd6mAu+uWgzfgK+DghMc3Aa8mPK4P/AeYA3wH3AtslPB8d+Bj4CfgS6BbtHwz4AHgG2AecB1QN3quP/BedP9e4D9lYnoRuCi6vy3wHLAQmA2cn7DeNcCzwOPR/geU8/7eBe4uZ/lrwKPR/a5AMfBX4PvomJyYyjFIeO1lwLfAY8AWwCtRzIuj+02i9a8H1gArgGXAndFyB1pE9x8G7gJeBZYSvuh3SojnUGA68CNwNzC6vPcerft44v9nOc83i/bdL3p/3wNXJjzfAfgAWBL9X94J1Et43oFzgC+A2dGy2wmJ6SdgIrB/wvp1o+P8ZfTeJgLbA2Oibf0cHZfjo/WPIny+lgD/A9qV+exeBnwCrAQ2IOHzHMU+IYrjO+CWaPmcaF/LolsnEj6T0TptgDeBH6LX/jXuv9V8uMUegG7V/I9b/w+rCfApcHvC87cBLwG/I/wCfRm4IXquQ/RldQihVbkdsGv03AvAfUBDoDEwHvi/6Ll1f5RA5+hLxaLHWwDLCQmiTvRFchVQD9gRmAUcFq17DbAK6BGtu1GZ97Yx4Uv5gHLe96nAN9H9rsBq4BZCUugSfWHtksIxKHntv6LXbgRsCfSM9t8IeAZ4IWHfoyjzxc5vE8UP0fHdAHgCGBo9t1X0xXds9Nyfo2NQUaL4Fjg1yf9/s2jf90ex70740m0VPb8X0DHaVzPgM+CCMnG/GR2bkuR5UnQMNgAujmJoED13KeEztgtg0f62LHsMosd7AguAfQgJph/h81o/4bP7MSHRbJSwrOTz/AFwcnR/E6Bjmfe8QcK++lP6mWxESIoXA
w2ix/vE/beaD7fYA9Ctmv9x4Q9rGeHXnQNvA5tHzxnhCzPx12wnSn853gfcWs42fx992SS2PPoCI6P7iX+URviF1zl6fAbwTnR/H2BOmW1fATwU3b8GGJPkvTWJ3tOu5TzXDVgV3e9K+LJvmPD808DfUzgGXYFfS74IK4ijPbA44fEoKk8UgxOeOwL4PLp/CvBBwnNGSLQVJYpVRK28Cp4v+dJskrBsPNCngvUvAIaVifvASj5ji4Hdo/vTge4VrFc2UdwD/LPMOtOBLgmf3dPK+TyXJIoxwD+ArSp4zxUlir7ApHT+3RXqTf2Dua2Hu79lZl2AJwm/WpcAWxN+FU80s5J1jfDrDsIvueHlbG8HYEPgm4TX1SF8oa3H3d3MhhL+OMcAJxC6S0q2s62ZLUl4SV1Cd1KJ32wzwWJgLfAH4PMyz/2B0M2ybl13/znh8deEVk1lxwBgobuvWPek2cbArYRktEW0uJGZ1XX3NUniTfRtwv1fCL+IiWJa956j41ecZDuLCO+1Wvszs50JLa0iwnHYgNDKS7Te/4GZXQwMiGJ1YFPCZwrCZ+bLFOKB8P/fz8zOS1hWL9puufsu43TgWuBzM5sN/MPdX0lhv1WJUapAg9l5wN1HE37N/ida9D2hG6iNu28e3TbzMPAN4Y90p3I2NZfQotgq4XWbunubCnY9BOhlZjsQWhHPJWxndsI2Nnf3Ru5+RGLYSd7Pz4Tuh+PKebo3ofVUYgsza5jwuCkwP4VjUF4MFxO6VvZx900J3WsQEkzSmFPwDaGlFDYYsleTilfnLUI3WHXdQ0iyLaP38ldK30eJde/HzPYnjBv0BrZw980J3ZMlr6noM1OeucD1Zf7/N3b3IeXtuyx3/8Ld+xK6Pv8FPBv9H1d2/KsSo1SBEkX+uA04xMzau/taQt/1rWbWGMDMtjOzw6J1HwBONbODzKxO9Nyu7v4N4Uyjm81s0+i5naIWy2+4+yTCwO9gYIS7l7QgxgM/mdllZraRmdU1s7ZmtncV3s/lhF+l55tZIzPbwsyuI3Qf/aPMuv8ws3rRl91RwDMpHIPyNCIklyVm9jvg6jLPf0cYb6mOV4HdzKxHdKbPOcA2Sda/GtjXzP5tZttE8bcws8fNbPMU9teIMCayzMx2Bc5KYf3VhP/PDczsKkKLosRg4J9m1tKCdma2ZfRc2eNyP3Cmme0TrdvQzI40s5TO1jKzk8xs6+j/sOQztSaKbS0V/x+8AmxjZheYWf3oc7NPKvuU5JQo8oS7LwQeJfTPQ/h1OBMYa2Y/EX6h7hKtO54wKHwr4VfjaEJ3AYS+9HrANEIX0LMk7wIZAhxM6PoqiWUNcDShj3824df9YMIZVam+n/eAwwiDv98QupT2APZz9y8SVv02inM+YfD4THcv6a6q8BhU4DbCwPD3wFjg9TLP305oQS02sztSfS/R+/me0EK6idCt1JpwZs/KCtb/kpAUmwFTzexHQottAmFcqjKXELoDlxK+uJ+qZP0RhDPKZhCO9QrW7x66hTD+8wYhAT1AOFYQxpweMbMlZtbb3ScQxqzuJPzfzCSMJaSqG+E9LyMc8z7uvsLdfyGcffZ+tK+OiS9y96WEEzSOJnwuvgAOqMJ+pQIlZ6yI5JzoSt7H3T1ZF05WMrM6hNNzT3T3kXHHI5KMWhQiGWJmh5nZ5mZWn9Ixg7ExhyVSqbQlCjN70MwWmNmUCp43M7vDzGZGpQn2TFcsIlmiE+GsnO8J3SM93H15vCGJVC5tXU9m1plwnv+j7t62nOePAM4jnGu+D+FiMQ08iYhkmbS1KNx9DOEq1Yp0JyQRd/exwOZmlsp54yIikkFxXnC3HeufVVEcLfum7IpmNhAYCNCwYcO9dt1114wEKCJSqZ+mw5rlUHejyteNw4KVsGw1E9f49+6+dXU2EWeiKHvxD1RwQY27DwIGARQVFfmECRPSGZeISOre6hr+PXhUnFGsr2RIwQzuuQcWLMCuuebr6m4uzkRRT
LjkvkQTwrnwIiLZaeYg+OrJ9Zct/hi2aB9PPOWZNw/OOguOPx5OPDHcB7jmmmpvMs7TY18CTonOfuoI/BhdGSwikp2+ejIkhkRbtIdmJ8QTTyJ3uP9+aN0a3noLli2rtU2nrUVhZkMIFTq3ioqfXU0oOIe730soSncE4arNXwhXCouIlK+8X/OZVtJ6yKZuJoAvv4QzzoCRI+GAA0LC2Kn2yl6lLVFERb2SPV8ycYqISOVKfs3H2c2TLa2Hsj79FCZOhEGDYMCAMDZRi1RmXETil0prIVt/zcdlyhT46CM45RTo0QNmzYItt6z8ddWgEh4iEr/y+v7LytZf85n2669hYHrPPeHKK2FFNKVKmpIEqEUhItlCrYXKjRsHp58OU6fCSSfBrbdCgwZp360ShYhILpg3D/bfH37/e3jlFTjyyIztWl1PIiLZbMaM8O9228FTT4XWRAaTBKhFISLplOoprXGfzZSNliyBv/wFBg+GUaOgc2f4059iCUUtChFJn1QGqUED1WW99BK0aQMPPACXXgp7V2UW4dqnFoWIJFeTC910SmvVDRgQEsRuu8GLL0JRUdwRKVGISCVqcqGbWgqpSSziV1QEO+wAl10G9erFG1dEiUKkEFWllaBWQXrNnQtnngl9+sDJJ4f7WUZjFCKFKNWxA1CrIF3Wrg0lwNu0CYPVK1fGHVGF1KIQKVRqJcTniy/CWMSYMXDwwaFGU/PmcUdVISUKkXyVrHtJp6PGa9o0+OQTePBB6N+/1ov41TYlCpF8lWwQWt1JmTd5Mnz8MfTrB927hyJ+W2wRd1QpUaIQyUczB8GC0dC4i7qX4rZyJVx3Hdx4I/zhD2HmuQYNciZJgAazRfJTSZeTWg3x+uAD2GOPkChOOAEmTcpIEb/aphaFSK6qbAyicRdoMTCzMUmpefOgSxfYZhsYPhwOPzzuiKpNLQqRXJXsFFeNQcTns8/Cv9ttB08/HYr45XCSALUoRHKbTnHNHosXw8UXw0MPhdNe998/zDyXB5QoRERqatgwOPtsWLgQrrgi9iJ+tU2JQkSkJk47LbQi2reHV18NU5TmGSUKkWxSnRpMknmJRfw6doSWLeGSS2DDDeONK000mC2STVSDKft9/XUYnH7ssfB44MDQ3ZSnSQLUohDJnFRaC6rUmr1KivhdfnloURx3XNwRZYxaFCKZkkprQa2E7DR9ergm4txzYd99YcoUOP30uKPKGLUoRDJJrYXcNH16uB7i4YfhlFOyvohfbVOiEBEpz6RJoYjfqafCMceEIn6bbx53VLFQ15OISKIVK+Cvfw3XQlxzTXgMBZskQIlCJDNKqrlKdnv//XA9xA03hC6mjz/OySJ+tU1dTyKZoGqu2W/ePDjggFCjacQIOPTQuCPKGkoUIulUckqsqrlmr2nToHXrkCCeey4ki002iTuqrKKuJ5F0SpxlTq2J7PLDD2Ea0jZtQhE/gKOPVpIoh1oUIrUt8cI6XUCXnZ57Ds45BxYtgiuvhA4d4o4oq6lFIVLbEi+sU0si+/TvD716ha6mDz8Ms89pwDoptShE0kGtiOySWMRv332hVaswd8QG+gpMRVqPkpl1A24H6gKD3f3GMs83BR4BNo/Wudzdh6czJpEqqUo11xKq6ppdZs8OhftOOgn69Qv3pUrS1vVkZnWBu4DDgdZAXzNrXWa1vwFPu/seQB/g7nTFI1ItVanmWkLdTdlhzRq44w5o2xbGji1tVUiVpbNF0QGY6e6zAMxsKNAdmJawjgObRvc3A+anMR4pRNVpESTSYHRu+uyzULTvgw9CSfB774WmTeOOKmelczB7O2BuwuPiaFmia4CTzKwYGA6cV96GzGygmU0wswkLFy5MR6ySr6rTIkik1kFumjkzFPJ77LEw65ySRI2ks0VRXnnFsm2/vsDD7n6zmXUCHjOztu6+dr0XuQ8CBgEUFRWp/SiVS7zQTS2CwjBxIkyeHKYmPfroMDax6aaVv04qlc4WRTGwf
cLjJvy2a+l04GkAd/8AaABslcaYpFDoQrfCsXx5mExon33gn/8sLeKnJFFr0tmi+BBoaWbNgXmEweqyf7FzgIOAh82sFSFRqG9JNLYgqRkzBgYMgC++CGMS//mProlIg7S1KNx9NXAuMAL4jHB201Qzu9bMjolWuxg4w8wmA0OA/u46NUHQ2IJUbt48OOggWL0a3noLBg8u6FLg6ZTW6yiiayKGl1l2VcL9acAf0xmD5DC1CKQ8n34Ku+0WrqweNiwU8WvYMO6o8ppKeIhIbvj+ezj5ZGjXrrSI31FHKUlkgK5fF5Hs5g7PPAPnnguLF8PVV4eBa8kYJQrJDmUHr1UGQ0r06xeuhygqgrffDt1OklFKFJIdEk9nBQ1GF7rEIn5duoTupgsuUBG/mOioS/bQ4LUAzJoFZ5wRividemo47VVipcFsEckOa9bAbbeFrqUPP4Q6+nrKFmpRiEj8pk0LpTfGjYMjjwxF/Jo0iTsqiShlS/xmDoIFo+OOQuI0ezZ8+SU8+SS8/LKSRJZRi0LiV3K2kwavC8uHH8LHH4fxiCOPDGMTjRrFHZWUQy0Kic/MQfBW13C2U+Mu0EIzjxWEX36BSy6Bjh3hhhtKi/gpSWQtJQqJjyq8Fp5Ro8KprjffHFoSkyapiF8OUNeT1I6azC2tU2ILQ3ExHHII7LADvPNOqNEkOUEtCqkdmltaKjJ5cvi3SRN48UX45BMliRyjFoXUjGaSk4osXAh//jMMGRK6nLp0gSOOiDsqqQYlCqkZjTNIWe4wdCicfz78+CP84x/QqVPcUUkNpJQozKwe0NTdZ6Y5HslFaklIopNPhieeCBVeH3gA2rSJOyKpoUoThZkdCdwC1AOam1l74Gp3/1O6g5MsUNkgtaq8CsDataGAn1kYf9hrr9CiqFs37sikFqQymH0tsA+wBMDdPwZapDMoySKVDVKry0lmzgxTkj70UHh8+ulw4YVKEnkkla6nVe6+xMwSl2le63yW2IrQILVUZPXqUMTv73+H+vVV5TWPpdKi+MzMegN1zKy5md0GjE1zXBKnxFaEWgxSnilTwgD1pZfCYYeFon4nnRR3VJImqbQozgWuAtYCzwMjgCvSGZRkAbUiJJk5c+Drr8PZTb17h7EJyVupJIrD3P0y4LKSBWZ2LCFpiEihGDcuXDw3cGC4HmLWLNhkk7ijkgxIpevpb+Usu7K2AxGRLPXzz3DRRaGr6aabYOXKsFxJomBU2KIws8OAbsB2ZnZLwlObErqhRCTfvfNOKN43axacdRbceGMYuJaCkqzraQEwBVgBTE1YvhS4PJ1BiUgWKC4OA9XNm8Po0dC5c9wRSUwqTBTuPgmYZGZPuPuKDMYkInGaNAn22CMU8Xv55VCjaaON4o5KYpTKGMV2ZjbUzD4xsxklt7RHJiKZ9d13cPzxsOeeoQUB0K2bkoSklCgeBh4CDDgceBoYmsaYJC6JM85J4XCHxx+H1q3hhRfguutg333jjkqySCqJYmN3HwHg7l+6+98AFZPPR6oEW5hOOCEU8ttllzCH9ZVXwoYbxh2VZJFUrqNYaaF+x5dmdiYwD2ic3rAkrSoq9KdyHYUjsYjfoYeGU1/POUf1maRcqbQoLgQ2Ac4H/gicAZyWzqAkzSoq9KeWRGGYMSNUeH3wwfD41FNV6VWSqrRF4e7jortLgZMBzKxJOoOSDFDLofCsXg233AJXXw0NGmiQWlKWtEVhZnubWQ8z2yp63MbMHkVFAXOTBqsL1yefQMeOcNllcPjhoYjfCWo9SmoqTBRmdgPwBHAi8LqZXQmMBCYDO2cmPKlVGqwuXMXFMHcuPPMMPPcc/OEPcUckOSRZ11N3YHd3X25mvwPmR4+np7pxM+sG3A7UBQa7+43lrNMbuIYwx8Vkd9c3WG0rGbzWYHVh+d//QkvizDNLi/g1bBh3VJKDknU9rXD35QDu/gPweRWTRF3gLsK1F62BvmbWusw6LQkly//o7m2AC6oYv6RCL
YnCsmwZ/PnPsN9+cPPNpUX8lCSkmpK1KHY0s5JS4gY0S3iMux9bybY7ADPdfRaAmQ0ltFKmJaxzBnCXuy+OtrmgivFLZWYOggWjoXEXtSQKwRtvhDLgc+aE013/3/9TET+psWSJomeZx3dWcdvbAXMTHhcT5t5OtDOAmb1P6J66xt1fL7shMxsIDARo2rRpFcMocCXXS6glkf/mzoUjj4SddoIxY0KLQqQWJCsK+HYNt13elFdl59reAGgJdAWaAO+aWVt3X1ImlkHAIICioiLN111VjbtAi4FxRyHpMnEi7LUXbL89DB8O++8fTn8VqSWpXHBXXcXA9gmPmxAGxMuu86K7r3L32cB0QuIQkcp8+y0cdxwUFZUW8TvkECUJqXXpTBQfAi3NrLmZ1QP6AC+VWecForpR0bUaOwOz0hiTSO5zh0ceCUX8Xn45jEOoiJ+kUSq1ngAws/ruvjLV9d19tZmdC4wgjD886O5TzexaYIK7vxQ9d6iZTQPWAJe6+6KqvYUCU1GdpoqUnO0k+aNPH3j6afjjH2HwYNh117gjkjxn7sm7/M2sA/AAsJm7NzWz3YEB7n5eJgIsq6ioyCdMmBDHrrNDyZXVVfnyb3aCxihyXWIRv0cegaVL4eyzoU46OwUkn5jZRHcvqs5rU2lR3AEcRegmwt0nm5nKjGdSYitCF80Vns8/hwEDoH//8G+/fnFHJAUmlZ8jddz96zLL1qQjGKlAYrVXXTRXOFatCuMPu+8eajNtskncEUmBSqVFMTfqfvLoauvzAE2Fmm5qRRS2jz8O5b8//hh69YL//he22SbuqKRApdKiOAu4CGgKfAd0jJZJOqkVUdi+/TbcnnsuFPJTkpAYpdKiWO3ufdIeifyWWhGF5b33QhG/s8+Gbt3gyy9h443jjkokpRbFh2Y23Mz6mVmjtEdU6DRnROFZuhTOPTdcUX3bbaVF/JQkJEtUmijcfSfgOmAv4FMze8HM1MJIF1V6LSwjRkDbtnD33aHi60cfqYifZJ2UTsJ29/+5+/nAnsBPhAmNpLaVVHot6XLStQ/5be5cOOqo0HJ4773QmtCZTZKFKk0UZraJmZ1oZi8D44GFgOoFpIMqveY/dxg/Ptzffnt47TWYNEklOCSrpdKimEI40+kmd2/h7he7+7g0x1W4VOk1f33zDfTsCfvsU1rE7+CDVcRPsl4qZz3t6O5r0x6JSL5yh4cfhosughUr4F//CnWaRHJEhYnCzG5294uB58zsNwWhUpjhTkQAeveGZ58NZzUNHgw77xx3RCJVkqxF8VT0b1VntpOqKrkKW5Ve88eaNaGAX506cPTRcOCB8H//pyJ+kpMq/NS6ezTiRit3fzvxBrTKTHgFQqfE5pfPPguthwceCI9POQXOOktJQnJWKp/c08pZdnptB1LwdEps7lu1Cq67Dtq3h+nTYbPN4o5IpFYkG6M4njArXXMzez7hqUbAkvJfJVKgJk0KZcA/+QSOPx7uuAMaN447KpFakWyMYjywiDDX9V0Jy5cCk9IZlEjO+e47+P57eOEF6N497mhEalWFicLdZwOzgbcyF45IDhkzBj79FM45JxTxmzkTNtoo7qhEal2FYxRmNjr6d7GZ/ZBwW2xmP2QuRJEs89NPocJrly6hi6mkiJ+ShOSpZIPZJdOdbgVsnXAreSw1pUqxuWf4cGjTBu67L1xApyJ+UgCSnR5bcjX29kBdd18DdAL+D2iYgdjyn06LzS1z54bxh802g//9D26+GRrqT0HyXyolPF4A9jaznYBHgVeBJ4Gj0hlY3tIUp7nFHcaNg44dQxG/N94I5Tfq1Ys7MpGMSeU6irXuvgo4FrjN3c8DtktvWHlMU5zmjvnzoUcP6NSptIjfAQcoSUjBSWkqVDM7DjgZ6BEt2zB9IRUAtSKym3u4qvqSS8JA9X/+oyJ+UtBSSRSnAWcTyozPMrPmwJD0hiUSo1694Pnnw1lNgwdDixZxRyQSq0oThbtPMbPzg
RZmtisw092vT39oeUaF/7JbYhG/Hj3g0EPhjDNUn0mE1Ga42x+YCTwAPAjMMDO1w6tKZzhlrylTQtdSSRG/k09WpVeRBKl0Pd0KHOHu0wDMrBXwGFCUzsDyksYmssuvv8INN8D114dTXrfYIu6IRLJSKomiXkmSAHD3z8xMp31Ibps4MRTxmzIFTjgBbrsNttZ1pCLlSSVRfGRm9xFaEQAnoqKAkusWLYIlS+Dll+EoXRIkkkwqieJM4HzgL4ABY4D/pjMokbQYOTIU8Tv//DBY/cUX0KBB3FGJZL2kicLMdgN2Aoa5+02ZCUmklv34I/zlLzBoEOy6axiorl9fSUIkRcmqx/6VUL7jROBNMytvpjupSEnBv5KbCv/F4+WXoXXrcD3EJZeEsQkV8ROpkmQtihOBdu7+s5ltDQwnnB4rqSh7zYROi828uXOhZ8/QinjhBdh777gjEslJyRLFSnf/GcDdF5qZTiqvKp0Om3nu8MEHsO++pUX89t1X9ZlEaiDZl/+OZvZ8dBsG7JTw+Pkkr1vHzLqZ2XQzm2lmlydZr5eZuZnp2gypvuJiOOaYcPFcSRG/rl2VJERqKFmLomeZx3dWZcNmVpcw1/YhQDHwoZm9lHhNRrReI8JZVeOqsn2Rddauhfvvh0svhdWr4ZZbYL/94o5KJG8kmzP77RpuuwOhLtQsADMbCnQHppVZ75/ATcAlNdyfFKqePcMYxIEHhoSx445xRySSV9I57rAdMDfhcTFl5rEwsz2A7d39lWQbMrOBZjbBzCYsXLiw9iOV3LN6dWhJQEgU998Pb72lJCGSBulMFFbOMl/3ZBgcvxW4uLINufsgdy9y96Kts73MgubBTr9PPgmTCd1/f3h80kkwYECo/ioitS7lRGFmVT35vJgw33aJJsD8hMeNgLbAKDP7CugIvJTzA9qqEps+K1fC1VfDXnvB11+rNpNIhlRawsPMOhBKjG8GNDWz3YEB0ZSoyXwItIwmOpoH9AHWfXO6+4/AVgn7GQVc4u4Tqvomso5Oi619H34YivhNmxbKgN96K2y5ZdxRiRSEVFoUdwBHAYsA3H0ycEBlL3L31cC5wAjgM+Bpd59qZtea2THVD1kK0uLFsGwZDB8Ojz6qJCGSQakUBazj7l/b+v2/a1LZuLsPJ1zRnbjsqgrW7ZrKNqWAvPNOKOL35z+HIn4zZqj8hkgMUmlRzI26n9zM6prZBcCMNMeVWxLrOmkQu+aWLAnTkB50ENx3XxibACUJkZikkijOAi4CmgLfEQadz0pnUDmnZAAbNIhdUy++GIr4PfhgqPiqIn4isau068ndFxAGoiUZDWDX3Jw5cNxx0KoVvPQSFOX2CXAi+SKVs57uJ+H6hxLuPjAtEUlhcYf33oP994emTcNFcx07qj6TSBZJpevpLeDt6PY+0BhYmc6gcoYurquZOXPgyCOhc+fSIn6dOytJiGSZVLqenkp8bGaPAW+mLaJcoovrqmftWrj3XrjsstCiuOMOFfETyWKpnB5bVnNgh9oOJGdpbKLqjj02DFofckiYnrRZs7gjEpEkUhmjWEzpGEUd4AegwrklRMq1ejXUqRNuxx8P3buHK61Vn0kk6yVNFBaustudUILeVFT2AAAU4UlEQVQDYK27/2ZgWySpyZPhtNPCtRFnngl9+8YdkYhUQdLB7CgpDHP3NdFNSUJSt2IF/O1v4TTX4mLYZpu4IxKRakjlrKfxZrZn2iOR/DJ+POyxB1x/PZx4Inz2GfToEXdUIlINFXY9mdkGUWG//YAzzOxL4GfCPBPu7koeUrGffoLly+H11+Gww+KORkRqINkYxXhgT0A/AyU1b7wBU6fChRfCwQfD9OkqvyGSB5IlCgNw9y8zFIvkqsWL4aKL4OGHoU0bOPvskCCUJETyQrJEsbWZXVTRk+5+SxrikVzz/PNwzjmwcCFccQVcdZUShEieSZYo6gKbUP7c1yKhBEefPtC2bZhQaI894o5IRNIgWaL4x
t2vzVgk2WjmoFCmoyIl5TsKiTuMGQNduoQifu+8A/vsAxtuGHdkIpImyU6PVUsicZ6J8hRajaevv4bDD4euXUuL+O23n5KESJ5L1qI4KGNRZJuSlkRJi6HQazmtXQt33w2XR5Vb/vvfUBZcRApChYnC3X/IZCBZRVVh19ejB7z8crge4r77YAfVhBQpJNWpHlsYCr0lsWoV1K0bivj17Qu9esHJJ6uIn0gBSqWEhxSajz6CDh3CnBEQEsUppyhJiBQoJQoptXx5uBaiQwf49lvYfvu4IxKRLKBEUdbMQbBgdNxRZN7YsdC+Pdx4I/TrB9OmwdFHxx2ViGQBjVGUVXLdRKENYv/8cxiXePPNUKdJRCSiRFGexl2gxcC4o0i/118PRfwuvhgOOgg+/xzq1Ys7KhHJMup6KkSLFoXupcMPh0cegV9/DcuVJESkHGpRlC3Tkc9lOdzhuedCEb8ffgizz/3tb0oQIpKUEkXixXWQ3xfZzZkDJ5wA7dqFuSN23z3uiEQkByhRQH5fXOcOI0fCgQeGK6pHjQqnv26g/3oRSY3GKPLZ7Nlw6KFhoLqkiN+++ypJiEiVKFHkozVr4PbbwzwR48bBPfeoiJ+IVFth/bQsb36JfBy87t4dXn0VjjgilOHQFdYiUgOF1aIob36JfBm8XrUqlAOHULzv8cfhlVeUJESkxtLaojCzbsDthGlVB7v7jWWevwgYAKwGFgKnufvXaQmmpDRH4y75N3A9YQKcfjoMHBhOfT3++LgjEpE8krYWhZnVBe4CDgdaA33NrHWZ1SYBRe7eDngWuCld8eRlaY7ly+Gyy8JUpAsXap4IEUmLdHY9dQBmuvssd/8VGAp0T1zB3Ue6+y/Rw7FAkzTGk1+lOT74IFwHcdNNcNppoYjfUUfFHZWI5KF0dj1tB8xNeFwM7JNk/dOB18p7wswGAgMBmjZtWlvx5bbly8OYxFtvhdNfRUTSJJ2JorxZbrzcFc1OAoqALuU97+6DgEEARUVF5W6jIAwfHor4XXppuIDus89gww3jjkpE8lw6u56KgcRTbpoA88uuZGYHA1cCx7j7ylqNYOYgeKtruJU92ymXfP89nHQSHHkkPPFEaRE/JQkRyYB0JooPgZZm1tzM6gF9gJcSVzCzPYD7CEliQa1HkHg6bC6eBusOQ4dCq1bw9NNw9dUwfryK+IlIRqWt68ndV5vZucAIwumxD7r7VDO7Fpjg7i8B/wY2AZ6xMB/zHHc/psY7L7mwruRiulw9HXbOnFAOfPfd4YEHYLfd4o5IRApQWq+jcPfhwPAyy65KuJ+eqdQSk0QutiLefjvMMrfDDqFG0957Q926cUcmIgUqf6/MLmlJ5NLpsF9+Gc5gOuSQ0iJ+HTsqSYhIrPI3UeSSNWvglltC19LEiXDffSriJyJZo7CKAmaro4+G114LF8zdcw80Se91hyIiVaFEEZdffw3zQtSpA/37h0J+ffqAlXf5iYhIfNT1FIfx42GvveDuu8Pj3r2hb18lCRHJSvmTKHLh4rpffoGLL4ZOnWDxYthpp7gjEhGpVP4kimy/uO6998Jg9S23wBlnhFIchx8ed1QiIpXKrzGKbL64btWqcJrryJHQtWvc0YiIpCy/EkW2efnlULjvL3+BAw4IpcA30CEXkdySP11P2WThQjjhBDjmGBgypLSIn5KEiOQgJYra5A5PPhmK+D37LFx7LYwbpyJ+IpLT9BO3Ns2ZA6eeCnvsEYr4tWkTd0QiIjWWm4mipDpsopIigJm2di28+SYcdlgo4vfuu+EaCdVnEpE8kZtdT4mnwpaI45TYL74IM8116wZjxoRlHTooSYhIXsnNFgXEeyrs6tVw661w1VVQv37oZlIRPxHJU7mbKOJ01FEwYgR07x7KcGy7bdwRiWSlVatWUVxczIoVK+IOpWA0aNCAJk2asGEtTpWsRJGqlSvDHNV16sCAAXDaaXDccarPJJJEcXExjRo1olmzZpj+VtLO3Vm0a
BHFxcU0b9681rabm2MUmTZ2LOy5J9x1V3jcq1co5KcPvkhSK1asYMstt1SSyBAzY8stt6z1FlzuJYqfpmeu6N/PP8OFF8K++8LSpdCyZWb2K5JHlCQyKx3HO/e6ntYshy32S/8ZTu++C/36wezZcPbZcMMNsOmm6d2niEgWyr0WRd2NMjMX9urVYUxi9OjQ5aQkIZKzhg0bhpnx+eefr1s2atQojjrqqPXW69+/P88++ywQBuIvv/xyWrZsSdu2benQoQOvvfZajWO54YYbaNGiBbvssgsjRowod53999+f9u3b0759e7bddlt69OixXtzt27enTZs2dOnSpcbxpCL3WhTp9MILoYjfFVeEIn5Tp6o+k0geGDJkCPvttx9Dhw7lmmuuSek1f//73/nmm2+YMmUK9evX57vvvmP06NE1imPatGkMHTqUqVOnMn/+fA4++GBmzJhB3TLXXr377rvr7vfs2ZPu3bsDsGTJEs4++2xef/11mjZtyoIFC2oUT6r0LQjw3Xdw3nnwzDNh0Prii0N9JiUJkdoz8YLaH1/coj3sdVvSVZYtW8b777/PyJEjOeaYY1JKFL/88gv3338/s2fPpn79+gD8/ve/p3fv3jUK98UXX6RPnz7Ur1+f5s2b06JFC8aPH0+nTp3KXX/p0qW88847PPTQQwA8+eSTHHvssTRt2hSAxo0b1yieVOVe11NtcofHHoPWreHFF+H668MZTiriJ5I3XnjhBbp168bOO+/M7373Oz766KNKXzNz5kyaNm3Kpil0OV944YXruokSbzfeeONv1p03bx7bb7/9usdNmjRh3rx5FW572LBhHHTQQevimDFjBosXL6Zr167stddePProo5XGVxsK+yfznDnhmoiionB19a67xh2RSP6q5Jd/ugwZMoQLLrgAgD59+jBkyBD23HPPCs8OqupZQ7feemvK67p7lfY3ZMgQBgwYsO7x6tWrmThxIm+//TbLly+nU6dOdOzYkZ133rlKMVdV4SWKtWvDVdWHHx6K+L3/fqj2qvpMInln0aJFvPPOO0yZMgUzY82aNZgZN910E1tuuSWLFy9eb/0ffviBrbbaihYtWjBnzhyWLl1Ko0aNku7jwgsvZOTIkb9Z3qdPHy6//PL1ljVp0oS5c+eue1xcXMy2FVR2WLRoEePHj2fYsGHrvX6rrbaiYcOGNGzYkM6dOzN58uS0JwrcPadue7XcxKtt+nT3/fd3B/dRo6q/HRFJybRp02Ld/7333usDBw5cb1nnzp19zJgxvmLFCm/WrNm6GL/66itv2rSpL1myxN3dL730Uu/fv7+vXLnS3d3nz5/vjz32WI3imTJlirdr185XrFjhs2bN8ubNm/vq1avLXfeee+7xU045Zb1l06ZN8wMPPNBXrVrlP//8s7dp08Y//fTT37y2vOMOTPBqfu8WxhjF6tXwr39Bu3bw6afw0EPQuXPcUYlImg0ZMoQ//elP6y3r2bMnTz75JPXr1+fxxx/n1FNPpX379vTq1YvBgwez2WabAXDdddex9dZb07p1a9q2bUuPHj3YeuutaxRPmzZt6N27N61bt6Zbt27cdddd6854OuKII5g/f/66dYcOHUrfvn3Xe32rVq3o1q0b7dq1o0OHDgwYMIC2bdvWKKZUmJfTZ5bNinZu5BNmLK3aiw47DN54A449NlwTsc026QlORNbz2Wef0apVq7jDKDjlHXczm+juRdXZXv6OUaxYES6Yq1sXBg4Mt549445KRCTn5GfX0/vvQ/v2pUX8evZUkhARqab8ShTLlsH554dJhFasADV5RWKXa93buS4dxzt/EsXo0dC2Ldx5J5x7LkyZAoccEndUIgWtQYMGLFq0SMkiQzyaj6JBgwa1ut38GqPYeONQ9fWPf4w7EhEhnPdfXFzMwoUL4w6lYJTMcFebcvusp+efh88/h7/+NTxes0YXzomIlKMmZz2ltevJzLqZ2XQzm2lml5fzfH0zeyp6fpyZN
Utpw99+G2aZ69kThg2DX38Ny5UkRERqXdoShZnVBe4CDgdaA33NrHWZ1U4HFrt7C+BW4F+VbniZhUHqV14Jkwn9738q4icikkbpbFF0AGa6+yx3/xUYCnQvs0534JHo/rPAQVZZRa5vl4VB68mT4fLLw7USIiKSNukczN4OmJvwuBjYp6J13H21mf0IbAl8n7iSmQ0ESqa0W2nvvTdFlV4B2Ioyx6qA6ViU0rEopWNRapfqvjCdiaK8lkHZkfNU1sHdBwGDAMxsQnUHZPKNjkUpHYtSOhaldCxKmdmE6r42nV1PxcD2CY+bAPMrWsfMNgA2A35IY0wiIlJF6UwUHwItzay5mdUD+gAvlVnnJaBfdL8X8I7n2vm6IiJ5Lm1dT9GYw7nACKAu8KC7TzWzawl10V8CHgAeM7OZhJZEnxQ2PShdMecgHYtSOhaldCxK6ViUqvaxyLkL7kREJLPyp9aTiIikhRKFiIgklbWJIm3lP3JQCsfiIjObZmafmNnbZrZDHHFmQmXHImG9XmbmZpa3p0amcizMrHf02ZhqZk9mOsZMSeFvpKmZjTSzSdHfyRFxxJluZvagmS0wsykVPG9mdkd0nD4xsz1T2nB1J9tO540w+P0lsCNQD5gMtC6zztnAvdH9PsBTcccd47E4ANg4un9WIR+LaL1GwBhgLFAUd9wxfi5aApOALaLHjeOOO8ZjMQg4K7rfGvgq7rjTdCw6A3sCUyp4/gjgNcI1bB2BcalsN1tbFOkp/5GbKj0W7j7S3X+JHo4lXLOSj1L5XAD8E7gJWJHJ4DIslWNxBnCXuy8GcPcFGY4xU1I5Fg5sGt3fjN9e05UX3H0Mya9F6w486sFYYHMz+0Nl283WRFFe+Y/tKlrH3VcDJeU/8k0qxyLR6YRfDPmo0mNhZnsA27v7K5kMLAapfC52BnY2s/fNbKyZdctYdJmVyrG4BjjJzIqB4cB5mQkt61T1+wTI3omLaq38Rx5I+X2a2UlAEdAlrRHFJ+mxMLM6hCrE/TMVUIxS+VxsQOh+6kpoZb5rZm3dfUmaY8u0VI5FX+Bhd7/ZzDoRrt9q6+5r0x9eVqnW92a2tihU/qNUKscCMzsYuBI4xt1XZii2TKvsWDQC2gKjzOwrQh/sS3k6oJ3q38iL7r7K3WcD0wmJI9+kcixOB54GcPcPgAaEgoGFJqXvk7KyNVGo/EepSo9F1N1yHyFJ5Gs/NFRyLNz9R3ffyt2buXszwnjNMe5e7WJoWSyVv5EXCCc6YGZbEbqiZmU0ysxI5VjMAQ4CMLNWhERRiPOzvgScEp391BH40d2/qexFWdn15Okr/5FzUjwW/wY2AZ6JxvPnuPsxsQWdJikei4KQ4rEYARxqZtOANcCl7r4ovqjTI8VjcTFwv5ldSOhq6Z+PPyzNbAihq3GraDzmamBDAHe/lzA+cwQwE/gFODWl7ebhsRIRkVqUrV1PIiKSJZQoREQkKSUKERFJSolCRESSUqIQEZGklCgk65jZGjP7OOHWLMm6zSqqlFnFfY6Kqo9Ojkpe7FKNbZxpZqdE9/ub2bYJzw02s9a1HOeHZtY+hddcYGYb13TfUriUKCQbLXf39gm3rzK03xPdfXdCscl/V/XF7n6vuz8aPewPbJvw3AB3n1YrUZbGeTepxXkBoEQh1aZEITkhajm8a2YfRbd9y1mnjZmNj1ohn5hZy2j5SQnL7zOzupXsbgzQInrtQdEcBp9Gtf7rR8tvtNI5QP4TLbvGzC4xs16EmltPRPvcKGoJFJnZWWZ2U0LM/c3sv9WM8wMSCrqZ2T1mNsHC3BP/iJadT0hYI81sZLTsUDP7IDqOz5jZJpXsRwqcEoVko40Sup2GRcsWAIe4+57A8cAd5bzuTOB2d29P+KIujso1HA/8MVq+Bjixkv0fDXxqZg2Ah4Hj3X03QiWDs8zsd8CfgDbu3g64LvHF7v4sMIHwy7+9uy9PePpZ4NiEx8cDT1Uzzm6EMh0lrnT3IqAd0
MXM2rn7HYRaPge4+wFRKY+/AQdHx3ICcFEl+5ECl5UlPKTgLY++LBNtCNwZ9cmvIdQtKusD4EozawI87+5fmNlBwF7Ah1F5k40ISac8T5jZcuArQhnqXYDZ7j4jev4R4BzgTsJcF4PN7FUg5ZLm7r7QzGZFdXa+iPbxfrTdqsTZkFCuInGGst5mNpDwd/0HwgQ9n5R5bcdo+fvRfuoRjptIhZQoJFdcCHwH7E5oCf9mUiJ3f9LMxgFHAiPMbAChrPIj7n5FCvs4MbGAoJmVO79JVFuoA6HIXB/gXODAKryXp4DewOfAMHd3C9/aKcdJmMXtRuAu4Fgzaw5cAuzt7ovN7GFC4buyDHjT3ftWIV4pcOp6klyxGfBNNH/AyYRf0+sxsx2BWVF3y0uELpi3gV5m1jha53eW+pzinwPNzKxF9PhkYHTUp7+Zuw8nDBSXd+bRUkLZ8/I8D/QgzJHwVLSsSnG6+ypCF1LHqNtqU+Bn4Ecz+z1weAWxjAX+WPKezGxjMyuvdSayjhKF5Iq7gX5mNpbQ7fRzOescD0wxs4+BXQlTPk4jfKG+YWafAG8SumUq5e4rCNU1nzGzT4G1wL2EL91Xou2NJrR2ynoYuLdkMLvMdhcD04Ad3H18tKzKcUZjHzcDl7j7ZML82FOBBwndWSUGAa+Z2Uh3X0g4I2tItJ+xhGMlUiFVjxURkaTUohARkaSUKEREJCklChERSUqJQkREklKiEBGRpJQoREQkKSUKERFJ6v8DUPcPW57/QYoAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "scores = score_model(probabilities, 0.5)\n", + "print_metrics(y_test, scores) \n", + "plot_auc(y_test, probabilities) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The accuracy is essentially unchanged with respect to the unweighted model. The precision, recall and F1 are all slightly worse for the positive cases, but better for the negative cases. Reweighting the labels has moved the results in the desired direction, at least a bit.\n", + "\n", + "Notice also, the ROC curve and AUC are essentially unchanged. The trade-off between true positive and false positive is simiar to the unweighted model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Find a better threshold\n", + "\n", + "There is another way to tip the model scoring toward correctly identifing the bad credit cases. The scoring threshold can be adjusted. Untill now the scores have been computed from the probabilities using a threshold of 0.5. However, there is no reason to think this is the correct choice. 
Recall that the score is determined by setting the threshold along the sigmoidal or logistic function. It is possible to favor either positive or negative cases by changing the threshold along this curve. \n", + "\n", + "The code in the cell below contains a function for scoring and evaluating the model for a given threshold value. The `for` loop iterates over the list of five candidate threshold values. Execute this code and examine how changing the threshold value changes the scoring for the model. " + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\n", + "For threshold = 0.45\n", + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 166 39\n", + "True negative 41 54\n", + "\n", + "Accuracy 0.73\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.80 0.58\n", + "Recall 0.81 0.57\n", + "F1 0.81 0.57\n", + "\n", + "For threshold = 0.4\n", + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 158 47\n", + "True negative 40 55\n", + "\n", + "Accuracy 0.71\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.80 0.54\n", + "Recall 0.77 0.58\n", + "F1 0.78 0.56\n", + "\n", + "For threshold = 0.35\n", + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 150 55\n", + "True negative 31 64\n", + "\n", + "Accuracy 0.71\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.83 0.54\n", + "Recall 0.73 0.67\n", + "F1 0.78 0.60\n", + "\n", + "For threshold = 0.3\n", + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 134 71\n", + "True negative 27 68\n", + "\n", + "Accuracy 0.67\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.83 0.49\n", + "Recall 0.65 0.72\n", + "F1 0.73 0.58\n", + "\n", + "For threshold = 0.25\n", + " Confusion 
matrix\n", + " Score positive Score negative\n", + "True positive 127 78\n", + "True negative 21 74\n", + "\n", + "Accuracy 0.67\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.86 0.49\n", + "Recall 0.62 0.78\n", + "F1 0.72 0.60\n" + ] + } + ], + "source": [ + "def test_threshold(probs, labels, threshold):\n", + " scores = score_model(probs, threshold)\n", + " print('')\n", + " print('For threshold = ' + str(threshold))\n", + " print_metrics(labels, scores)\n", + "\n", + "thresholds = [0.45, 0.40, 0.35, 0.3, 0.25]\n", + "for t in thresholds:\n", + " test_threshold(probabilities, y_test, t)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As the threshold is decreased the number of correctly classified negative cases (bad credit customers) increases at the expense of correctly classifying positive cases (good credit customers). At the same time, accuracy decreases. However, as you have observed, accuracy is not a particularly useful metric here. \n", + "\n", + "Exactly which threshold to pick is a business decision. Notice that with a threshold value of 0.25 the number of false negatives (misclassified good credit customers) is about four times that of false positives (misclassified bad credit customers). " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lesson you have done the following:\n", + "1. Prepared the credit risk data set for modeling with Scikit-Learn. These steps included scaling the numeric features and dummy variable coding the categorical features. The result is a numpy array of features and a numpy array of the label values. \n", + "2. Computed a logistic regression model. \n", + "3. Evaluated the performance of the model using multiple metrics. It is clear that accuracy is not a particularly useful metric here. The naive 'classifier' produced accuracy that was only somewhat worse as a result of the class imbalance. 
The confusion matrix and the precision, recall and F1 statistics gave meaningful measures of model performance, especially when considered together. \n", + "4. Reweighted the labels and changed the decision threshold for the reweighted model. These steps helped overcome both the class imbalance problem and the asymmetric cost of misclassification to the bank. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/.ipynb_checkpoints/CrossValidation-checkpoint.ipynb b/Module5/.ipynb_checkpoints/CrossValidation-checkpoint.ipynb new file mode 100644 index 0000000..5abc9c6 --- /dev/null +++ b/Module5/.ipynb_checkpoints/CrossValidation-checkpoint.ipynb @@ -0,0 +1,523 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction to Cross Validation and Model Selection\n", + "\n", + "In a previous lab you created a model with l2 or ridge regularization and l1 or lasso regularization. In both cases, an apparently optimum value of the regularization parameter was found. This process is an example of **model selection**. The goal of model selection is to find the best performing model for the problem at hand. Model selection is a very general term and can apply to at least the following common cases:\n", + "- Selection of optimal model **hyperparameters**. Hyperparameters are parameters which determine the characteristics of a model. Hyperparameters are distinct from the model parameters. 
For example, for the case of l2 regularized regression, the degree of regularization is determined by a hyperparameter, which is distinct from the regression coefficients or parameters. \n", + "- **Feature selection** is the process of determining which features should be used in a model. \n", + "- Comparing different model types is an obvious case of model selection. \n", + "\n", + "If you are thinking that the model selection process is closely related to model training, you are correct. Model selection is a component of model training. However, one must be careful, as applying a poor model selection method can lead to an over-fit model!\n", + "\n", + "## Overview of k-fold cross validation\n", + "\n", + "The questions remain: how good are the hyperparameter estimates previously obtained for the l2 and l1 regularization parameters, and are there better ways to estimate these parameters? The answer to both questions is to use **resampling methods**. Resampling methods repeat a calculation multiple times using randomly selected subsets of the complete dataset. In fact, resampling methods are generally the best approach to model selection problems. \n", + "\n", + "\n", + "**K-fold cross validation** is a widely used resampling method. In cross validation a dataset is divided into **k folds**. Each fold contains $\frac{1}{k}$ of the cases and is created by **random sampling without replacement** from the full data set. A computation is performed on $k-1$ folds of the full dataset. The $k^{th}$ fold is **held back** and is used for testing the result. The computation is performed $k$ times and model parameters are averaged (mean taken) over the results of the $k$ folds. For each iteration, $k-1$ folds are used for training and the $k^{th}$ fold is used for testing. \n", + "\n", + "4-fold cross validation is illustrated in the figure below. To ensure the data are randomly sampled, the data set is randomly shuffled at the start of the procedure. 
The random samples can then be efficiently sub-sampled as shown in the figure. The model is trained and tested four times. For each iteration the model is trained with three folds of the data and tested with the fold shown in the dark shading. \n", + "\n", + "\"Drawing\"\n", + "
**Resampling scheme for 4-fold cross validation**
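The fold construction illustrated above can be sketched with scikit-learn's `KFold`. This is a minimal example on a toy eight-case array (not this lab's credit data); `shuffle=True` mirrors the random shuffle described above:

```python
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(8).reshape(8, 1)  # eight toy cases

# shuffle=True randomly reorders the cases before the folds are cut.
kf = KFold(n_splits=4, shuffle=True, random_state=0)

for train_idx, test_idx in kf.split(X):
    # Each iteration trains on k-1 = 3 folds (6 cases) and holds
    # back one fold (2 cases) for testing.
    print(len(train_idx), len(test_idx))  # prints "6 2" on each iteration
```

Every case lands in exactly one held-back fold across the four iterations, so the folds partition the data rather than sample it with replacement.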
 \n", + "\n", + "## Introduction to nested cross validation\n", + "\n", + "Unfortunately, simple cross validation alone does not provide an unbiased approach to model selection. The problem is that evaluating model performance with simple cross validation uses the same data samples as the model selection process. This situation will lead to model over-fitting, wherein the model selection is learned from the evaluation data. The result is usually unrealistically optimistic model performance estimates.\n", + "\n", + "To obtain unbiased estimates of expected model performance while performing model selection, it is necessary to use **nested cross validation**. As the name implies, nested cross validation is performed through a pair of nested CV loops. The outer loop uses a set of folds to perform model evaluation. The inner loop performs model selection using another randomly sampled set of folds not used for evaluation by the outer loop. This algorithm allows model selection and evaluation to proceed with randomly sampled subsets of the full data set, thereby avoiding model selection bias. \n", + "\n", + "## Cross validation and computational efficiency\n", + "\n", + "As you may have surmised, cross validation can be computationally intensive. Processing each fold of a cross validation requires fitting and evaluating the model. It is desirable to compute a reasonable number of folds. Since the results are averaged over the folds, a small number of folds can lead to significant variability in the final result. However, with large data sets or complex models, the number of folds must be limited in order to complete the cross validation process in a reasonable amount of time. It is, therefore, necessary to trade off accuracy of the cross validation result with the practical consideration of the required computational resources. \n", + "\n", + "As mentioned earlier, other resampling methods exist. 
For example, leave-one-out resampling has the same number of folds as data cases. Such methods provide optimal unbiased estimates of model performance. Unfortunately, as you might expect, such methods are computationally intensive and are only suitable for small datasets. In practice k-fold cross validation is a reasonable way to explore the bias-variance trade-off with reasonable computational resources. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Load Features and Labels\n", + "\n", + "With the above theory in mind, you will now try an example. \n", + "\n", + "As a first step, execute the code in the cell below to load the packages required for this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, load the preprocessed files containing the features and the labels. The preprocessing includes the following:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizing numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Construct the logistic regression model\n", + "\n", + "To create a baseline for comparison you will now create a logistic regression model without cross validation. This model uses a fixed set of hyperparameters. You will compare the performance of this model with the cross validation results computed subsequently. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, execute the code in the cell below to create training and testing splits of the dataset. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, it is time to compute the logistic regression model. The code in the cell below does the following:\n", + "1. Define a logistic regression model object using the `LogisticRegression` method from the scikit-learn `linear_model` package.\n", + "2. Fit the linear model using the numpy arrays of the features and the labels for the training data set.\n", + "\n", + "Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "logistic_mod = linear_model.LogisticRegression(C = 1.0, class_weight = {0:0.45, 1:0.55}) \n", + "logistic_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below computes and displays metrics and the ROC curve for the model using the test data subset. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + "\n", + "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## 
and threshold along with the AUC\n", + " fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n", + " auc = sklm.auc(fpr, tpr)\n", + " \n", + " ## Plot the result\n", + " plt.title('Receiver Operating Characteristic')\n", + " plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n", + " plt.legend(loc = 'lower right')\n", + " plt.plot([0, 1], [0, 1],'r--')\n", + " plt.xlim([0, 1])\n", + " plt.ylim([0, 1])\n", + " plt.ylabel('True Positive Rate')\n", + " plt.xlabel('False Positive Rate')\n", + " plt.show() \n", + "\n", + "probabilities = logistic_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.3) \n", + "plot_auc(y_test, probabilities)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results look promising, with most of the metrics having reasonable values. The question now is, how will these performance estimates hold up under cross validation?" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Cross validate model\n", + "\n", + "To compute a better estimate of model performance, you can perform simple cross validation. The code in the cell below performs the following processing:\n", + "1. Creates a list of the metrics to be computed for each fold. \n", + "2. Defines a logistic regression model object.\n", + "3. Performs 10-fold cross validation using the `cross_validate` function from the scikit-learn `model_selection` package.\n", + "\n", + "Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Labels = Labels.reshape(Labels.shape[0],)\n", + "scoring = ['precision_macro', 'recall_macro', 'roc_auc']\n", + "logistic_mod = linear_model.LogisticRegression(C = 1.0, class_weight = {0:0.45, 1:0.55}) \n", + "scores = ms.cross_validate(logistic_mod, Features, Labels, scoring=scoring,\n", + " cv=10, return_train_score=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below displays the performance metrics along with the mean and standard deviation, computed for each fold to the cross validation. The 'macro' versions of precision and recall are used. These macro versions average over the positive and negative cases. \n", + "\n", + "Execute this code, examine the result, and answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def print_format(f,x,y,z):\n", + " print('Fold %2d %4.3f %4.3f %4.3f' % (f, x, y, z))\n", + "\n", + "def print_cv(scores):\n", + " fold = [x + 1 for x in range(len(scores['test_precision_macro']))]\n", + " print(' Precision Recall AUC')\n", + " [print_format(f,x,y,z) for f,x,y,z in zip(fold, scores['test_precision_macro'], \n", + " scores['test_recall_macro'],\n", + " scores['test_roc_auc'])]\n", + " print('-' * 40)\n", + " print('Mean %4.3f %4.3f %4.3f' % \n", + " (np.mean(scores['test_precision_macro']), np.mean(scores['test_recall_macro']), np.mean(scores['test_roc_auc']))) \n", + " print('Std %4.3f %4.3f %4.3f' % \n", + " (np.std(scores['test_precision_macro']), np.std(scores['test_recall_macro']), np.std(scores['test_roc_auc'])))\n", + "\n", + "print_cv(scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that there is considerable variability in each of the performance metrics from fold to fold. 
Even so, the standard deviations are at least an order of magnitude smaller than the means. It is clear that **any one fold does not provide a representative value of the performance metrics**. The latter is a key point as to why cross validation is important when evaluating a machine learning model. \n",
+ "\n",
+ "Compare the performance metric values to the values obtained for the baseline model you created above. In general the metrics obtained by cross validation are lower. However, the metrics obtained for the baseline model are within 1 standard deviation of the average metrics from cross validation. "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Optimize hyperparameters with nested cross validation\n",
+ "\n",
+ "Given the variability observed in cross validation, it should be clear that performing model selection from a single training and evaluation is an uncertain proposition at best. Fortunately, the nested cross validation approach provides a better way to perform model selection. However, there is no guarantee that a model selection process will, in fact, improve a model. In some cases, it may turn out that model selection has minimal impact. \n",
+ "\n",
+ "To start the nested cross validation process it is necessary to define the randomly sampled folds for the inner and outer loops. The code in the cell below uses the `KFold` function from the scikit-learn `model_selection` package to define fold selection objects. Notice that the `shuffle = True` argument is used in both cases. This argument specifies that a random shuffle is performed before the folds are created, ensuring that the sampling of the folds for the inside and outside loops is independent. Notice that by creating these independent fold objects there is no need to actually create nested loops for this process. \n",
+ "\n",
+ "Execute this code."
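The fold-to-fold spread described above is easy to reproduce outside of this lab. The following is a minimal, NumPy-only sketch (using hypothetical synthetic data and a deliberately trivial threshold classifier, not the lab's logistic regression model) that scores ten folds and reports the mean and standard deviation:

```python
import numpy as np

# Hypothetical illustration of fold-to-fold variability in cross validation.
rng = np.random.RandomState(123)
n = 500
x = rng.normal(size=n)
y = (x + rng.normal(size=n) > 0).astype(int)   # noisy synthetic labels

# Shuffle the indices, then split into 10 folds.
folds = np.array_split(rng.permutation(n), 10)

# "Model": predict class 1 whenever the feature is positive.
scores = np.array([np.mean((x[f] > 0).astype(int) == y[f]) for f in folds])

print('Accuracy by fold:', np.round(scores, 3))
print('Mean %4.3f  Std %4.3f' % (scores.mean(), scores.std()))
```

Even though the folds are identically distributed, the per-fold accuracies spread around the mean; no single fold is a trustworthy estimate on its own, which is the key point made above.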
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "nr.seed(123)\n",
+ "inside = ms.KFold(n_splits=10, shuffle = True)\n",
+ "nr.seed(321)\n",
+ "outside = ms.KFold(n_splits=10, shuffle = True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "An important decision in model selection searches is the choice of performance metric used to find the best model. For classification problems scikit-learn uses accuracy as the default metric. However, as you have seen previously, accuracy is not necessarily the best metric, particularly when there is a class imbalance, as is the case here. There are a number of alternatives which one could choose for such a situation. In this case AUC will be used. \n",
+ "\n",
+ "The code below uses the `inside` k-fold object to execute the inside loop of the nested cross validation. Specifically, the steps are:\n",
+ "1. Define a dictionary with the grid of parameter values to search over. In this case there is only one parameter, `C`, with a list of values to try. In a more general case, the dictionary can contain values from multiple parameters, creating a multi-dimensional grid that the cross validation process will iterate over. In this case there are 5 hyperparameter values in the grid and 10-fold cross validation is being used. Thus, the model will be trained and evaluated 50 times. \n",
+ "2. The logistic regression model object is defined. \n",
+ "3. The cross validation search over the parameter grid is performed using the `GridSearchCV` function from the scikit-learn `model_selection` package. Notice that the cross validation folds are computed using the `inside` k-fold object.\n",
+ "\n",
+ "\n",
+ "****\n",
+ "**Note:** Somewhat confusingly, the scikit-learn `LogisticRegression` function uses a regularization parameter `C` which is the inverse of the usual l2 regularization parameter $\\lambda$. 
Thus, the smaller the parameter the stronger the regularization. \n",
+ "****\n",
+ "\n",
+ "Execute this code."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "nr.seed(3456)\n",
+ "## Define the dictionary for the grid search and the model object to search on\n",
+ "param_grid = {\"C\": [0.1, 1, 10, 100, 1000]}\n",
+ "## Define the logistic regression model\n",
+ "logistic_mod = linear_model.LogisticRegression(class_weight = {0:0.45, 1:0.55}) \n",
+ "\n",
+ "## Perform the grid search over the parameters\n",
+ "clf = ms.GridSearchCV(estimator = logistic_mod, param_grid = param_grid, \n",
+ " cv = inside, # Use the inside folds\n",
+ " scoring = 'roc_auc',\n",
+ " return_train_score = True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The cross validated grid search object, `clf`, has been created. \n",
+ "\n",
+ "The code in the cell below fits the cross validated model using the `fit` method. The AUC for each hyperparameter and fold is displayed as an array. Finally, the hyperparameter of the model with the best average AUC is displayed. Execute this code and examine the results."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "## Fit the cross validated grid search over the data \n",
+ "clf.fit(Features, Labels)\n",
+ "keys = list(clf.cv_results_.keys())\n",
+ "for key in keys[6:16]:\n",
+ " print(clf.cv_results_[key])\n",
+ "## And print the best parameter value\n",
+ "clf.best_estimator_.C"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The array of AUC metrics has dimensions 10 folds x 5 hyperparameter values. As you might expect by now, there is considerable variation in the AUC from fold to fold for each hyperparameter value, or column. \n",
+ "\n",
+ "Evidently, the optimal hyperparameter value is 0.1. 
\n", + "\n", + "To help understand this behavior a bit more, the code in the cell below does the following:\n", + "1. Compute and display the mean and standard deviation of the AUC for each hyperparameter value.\n", + "2. Plot the AUC values for each fold vs. the hyperparameter values. The mean AUC for each hyperparameter value is shown with a red +. \n", + "\n", + "Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_cv(clf, params_grid, param = 'C'):\n", + " params = [x for x in params_grid[param]]\n", + " \n", + " keys = list(clf.cv_results_.keys()) \n", + " grid = np.array([clf.cv_results_[key] for key in keys[6:16]])\n", + " means = np.mean(grid, axis = 0)\n", + " stds = np.std(grid, axis = 0)\n", + " print('Performance metrics by parameter')\n", + " print('Parameter Mean performance STD performance')\n", + " for x,y,z in zip(params, means, stds):\n", + " print('%8.2f %6.5f %6.5f' % (x,y,z))\n", + " \n", + " params = [math.log10(x) for x in params]\n", + " \n", + " plt.scatter(params * grid.shape[0], grid.flatten())\n", + " p = plt.scatter(params, means, color = 'red', marker = '+', s = 300)\n", + " plt.plot(params, np.transpose(grid))\n", + " plt.title('Performance metric vs. log parameter value\\n from cross validation')\n", + " plt.xlabel('Log hyperparameter value')\n", + " plt.ylabel('Performance metric')\n", + " \n", + "plot_cv(clf, param_grid) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are a number of points to notice here:\n", + "1. The mean AUC for each value of the hyperparameter are all within 1 standard deviation of each other. This result indicates that model performance is not sensitive to the choice of hyperparamter. \n", + "2. Graphically you can see that there is a noticeable variation in the AUC from metric to metric, regardless of hyperparameter. 
Keep in mind that **this variation is simply a result of random sampling of the data!**\n",
+ "\n",
+ "Finally, it is time to execute the outer loop of the nested cross validation to evaluate the performance of the 'best' model selected by the inner loop. In this case, 'best' is quite approximate, since as already noted, the differences in performance between the models are not significant. \n",
+ "\n",
+ "The code in the cell below executes the outer loop of the nested cross validation using the `cross_val_score` function from the scikit-learn `model_selection` package. The folds are determined by the `outside` k-fold object. The mean and standard deviation of the AUC are printed along with the value estimated for each fold. Execute this code and examine the result. \n",
+ "\n",
+ "Then, answer **Question 2** on the course page."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "nr.seed(498)\n",
+ "cv_estimate = ms.cross_val_score(clf, Features, Labels, \n",
+ " cv = outside) # Use the outside folds\n",
+ "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n",
+ "\n",
+ "print('STD of the metric = %4.3f' % np.std(cv_estimate))\n",
+ "print('Outcomes by cv fold')\n",
+ "for i, x in enumerate(cv_estimate):\n",
+ " print('Fold %2d %4.3f' % (i+1, x))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As expected, there is considerable variation in AUC across the folds. The mean AUC is a bit lower than estimated for the inner loop of the nested cross validation and the baseline model. However, all of these values are within 1 standard deviation of each other, and thus these differences cannot be considered significant. "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, you will build and test a model using the estimated optimal hyperparameters. Then, answer **Question 3** on the course page. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "logistic_mod = linear_model.LogisticRegression(C = 0.1, class_weight = {0:0.45, 1:0.55}) \n", + "logistic_mod.fit(X_train, y_train)\n", + "probabilities = logistic_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.3) \n", + "plot_auc(y_test, probabilities)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have performed by simple cross validation and nested cross validation. Key points and observations are:\n", + "1. Model selection should be done using a resampling procedure such as nested cross validation. The nested sampling structure is required to prevent bias in model selection wherein the model selected learns the best hyperparameters for the samples used, rather than a model that generalizes well. \n", + "2. There is significant variation in model performance from fold to fold in cross validation. This variation arises from the sampling of the data alone and is not a property of any particular model.\n", + "3. Given the expected sampling variation in cross validation, there is generally considerable uncertainty as to which model is best when performing model selection. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/.ipynb_checkpoints/DimensionalityReduction-checkpoint.ipynb b/Module5/.ipynb_checkpoints/DimensionalityReduction-checkpoint.ipynb new file mode 100644 index 0000000..2b761ef --- /dev/null +++ b/Module5/.ipynb_checkpoints/DimensionalityReduction-checkpoint.ipynb @@ -0,0 +1,857 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Dimensionality reduction with principle components\n", + "\n", + "**Principle component analysis**, or **PCA**, is an alternative to regularization and stright-forward feature elimination. PCA is particularly useful for problems with very large numbers of features compared to the number of training cases. For example, when faced with a problem with many thousands of features and perhaps a few thousand cases, PCA can be a good choice to **reduce the dimensionality** of the feature space. \n", + "\n", + "PCA is one of a family of transformation methods that reduce dimensionality. PCA is the focus here, since it is the most widely used of these methods. \n", + "\n", + "The basic idea of PCA is rather simple: Find a linear transformation of the feature space which **projects the majority of the variance** onto a few orthogonal dimensions in the transformed space. The PCA transformation maps the data values to a new coordinate system defined by the principle components. 
Assuming the highest variance directions, or **components**, are the most informative, low variance components can be eliminated from the space with little loss of information. \n",
+ "\n",
+ "The projection along which the greatest variance occurs is called the **first principal component**. The next projection, orthogonal to the first, with the greatest variance is called the **second principal component**. Subsequent components are all mutually orthogonal, with decreasing variance along the projected direction. \n",
+ "\n",
+ "Widely used PCA algorithms compute the components sequentially, starting with the first principal component. This means that it is computationally efficient to compute the first several components from a very large number of features. Thus, PCA can make problems with very large numbers of features computationally tractable. \n",
+ "\n",
+ "****\n",
+ "**Note:** It may help your understanding to realize that the principal components are scaled versions of the **eigenvectors** of the covariance matrix of the features. The scale of each dimension is given by the **eigenvalues**. When normalized, the eigenvalues give the fraction of the variance explained by each component. \n",
+ "****"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## A simple example\n",
+ "\n",
+ "To cement the concepts of PCA you will now work through a simple example. This example is restricted to 2-d data so that the results are easy to visualize. \n",
+ "\n",
+ "As a first step, execute the code in the cell below to load the packages required for the rest of this notebook."
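To make the note above concrete, here is a minimal NumPy-only sketch (an assumed illustration, not the scikit-learn implementation used later in this notebook) that computes principal components as eigenvectors of the sample covariance matrix and checks that the normalized eigenvalues, the variance explained ratios, sum to 1:

```python
import numpy as np

# PCA via eigendecomposition of the covariance matrix (illustrative sketch).
rng = np.random.RandomState(42)
cov = np.array([[1.0, 0.6], [0.6, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], cov, size=200)

Xc = X - X.mean(axis=0)               # center the data
C = np.cov(Xc, rowvar=False)          # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # eigh returns eigenvalues in ascending order

order = np.argsort(eigvals)[::-1]     # reorder so variance is decreasing
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()   # fraction of variance explained
print('Variance explained ratios:', np.round(explained, 3))
```

With this covariance matrix the population eigenvalues are 1.6 and 0.4, so the first ratio should come out near 0.8, mirroring the behavior of the 2-d example that follows.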
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 19,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import pandas as pd\n",
+ "import sklearn.model_selection as ms\n",
+ "from sklearn import linear_model\n",
+ "import sklearn.metrics as sklm\n",
+ "import sklearn.decomposition as skde\n",
+ "import numpy as np\n",
+ "import numpy.random as nr\n",
+ "import matplotlib.pyplot as plt\n",
+ "import math\n",
+ "\n",
+ "%matplotlib inline"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The code in the cell below simulates data from a bivariate Normal distribution. The distribution is deliberately centered on $\\{ 0,0 \\}$ with unit variance on each dimension. There is considerable correlation between the two dimensions, leading to the covariance matrix:\n",
+ "\n",
+ "$$cov(X) = \\begin{bmatrix}\n",
+ " 1.0 & 0.6 \\\\\n",
+ " 0.6 & 1.0\n",
+ " \\end{bmatrix}$$\n",
+ "\n",
+ "Given this covariance matrix, 100 draws from the distribution are computed using the `multivariate_normal` function from the Numpy `random` package. Execute this code:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": 20,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "text/plain": [
+ "(100, 2)"
+ ]
+ },
+ "execution_count": 20,
+ "metadata": {},
+ "output_type": "execute_result"
+ }
+ ],
+ "source": [
+ "nr.seed(124)\n",
+ "cov = np.array([[1.0, 0.6], [0.6, 1.0]])\n",
+ "mean = np.array([0.0, 0.0])\n",
+ "\n",
+ "sample = nr.multivariate_normal(mean, cov, 100)\n",
+ "sample.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To get a feel for this data, execute the code in the cell below to display a plot and examine the result. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5,1,'Sample data')" + ] + }, + "execution_count": 21, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYQAAAEWCAYAAABmE+CbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xu0ZGV55/Hvrw9HaARsDK2GA9hIGFQQ7OF4hUQgKnhBW8DbrJnRZGYxmRmWwWBHCDMBTRjJdDR4yUxkBqMRVFQuoogNBIRIRD1tN3easCAKDaOtoblIC3155o/ah66ursuuqn2v32etszhn165d7z592M/ez/u+z6uIwMzMbEHZDTAzs2pwQDAzM8ABwczMEg4IZmYGOCCYmVnCAcHMzAAHBLORSTpb0oUZHevzkv48i2OZjcoBwWpH0pGS/lHSo5L+RdJNkl5RdruKIum7kv5j2e2w5tmp7AaYDUPSHsC3gP8MfBV4FvDbwFNltsusCfyEYHXzrwAi4ssRsSUiNkbE1RFxK4CkAyRdJ+mXkn4h6SJJi+bfLOmfJS2XdKukX0m6QNLzJV0l6XFJ10raM9l3iaSQdLKkhyQ9LOm0Xg2T9OrkyWWDpFskHdVn36WSfpx85sXALm2v7SnpW5LWS3ok+X6f5LVzaAXAz0h6QtJnku2flPSApMckrZL02+P8km0yOSBY3dwDbJH0BUlvmr94txHwMWBv4CXAvsDZHfucCLyBVnA5HrgK+BNgL1r/T3ygY/+jgQOBNwKnS3p9Z6MkzQBXAn8OPBf4EHCJpMVd9n0WcDnwxWTfryVtmrcA+FvghcB+wEbgMwARcSbwD8ApEbFbRJySvOdHwMuT430J+JqkXTAbggOC1UpEPAYcCQTwf4D1kq6Q9Pzk9Xsj4pqIeCoi1gOfAF7XcZhPR8TPImIdrYvrDyJidUQ8BVwGLO3Y/yMR8auIuI3Whfq9XZr2b4FvR8S3I2JrRFwDzAFv7rLvq4Fp4LyI2BQRX6d1QZ8/x19GxCUR8WREPA6c0+UcOn8vFybv2xwRHwd2Bg7q9x6zTg4IVjsRcVdEvD8i9gEOofU0cB6ApOdJ+oqkdZIeAy6kdeff7mdt32/s8vNuHfs/0Pb9T5LP6/RC4J1JumiDpA20Atdvdtl3b2BdbF9Z8ifz30jaVdJnJf0kOYcbgUWSproca/49p0m6K+lo3wA8hx3P26wvBwSrtYi4G/g8rcAArXRRAIdGxB607tw15sfs2/b9fsBDXfZ5APhiRCxq+3p2RJzbZd+HgRlJ7e3ar+3702jd3b8qOYffSbbP779dieKkv+DDwLuAPSNiEfAo45+3TRgHBKsVSS9O7obnO1n3pZXCuTnZZXfgCWBDktdfnsHH/vfkrv1g4PeAi7vscyFwvKRjJU1J2kXSUfPt7PB9YDPwAUk7SToBeGXb67vTelLZIOm5wFkd7/8Z8KKO/TcD64GdJP0psMcI52kTzgHB6uZx4FXADyT9ilYguJ3WXTXAR4B/TesO+Urg0gw+8wbgXuDvgb+MiKs7d4iIB4C30+qcXk/riWE5Xf4fi4ingROA9wOPAO/uaOd5wELgF7TO7zsdh/gkcFIyAulTwEpaHeP30Eo9/Zrt01xmqcgL5Jh1J2kJcD8wHRGby22NWf78hGBmZoADgpmZJZw
yMjMzwE8IZmaWqFVxu7322iuWLFlSdjPMzGpl1apVv4iIHcqodKpVQFiyZAlzc3NlN8PMrFYk/WTwXk4ZmZlZwgHBzMwABwQzM0s4IJiZGeCAYGZmCQcEMzMDajbs1Mya4/LV61ixci0PbdjI3osWsvzYg1i2dKbsZk00BwQzK9zlq9dxxqW3sXHTFgDWbdjIGZfeBuCgUCKnjMyscCtWrn0mGMzbuGkLK1auLalFBg4IZlaChzZsHGq7FcMBwcwKt/eihUNtt2I4IJhZ4ZYfexALp6e227Zweorlxx5UUosM3KlsZiWY7zj2KKNqcUAws1IsWzrjAFAxDghmBcpz7H2TxvU36VzqxAHBrCB5jr1v0rj+Jp1L3bhT2awgeY69b9K4/iadS904IJgVJM+x900a19+kc6kbBwSzguQ59r5J4/qbdC5144BgVpA8x943aVx/k86lbtypbFaQPMfeN2lcf5POpW4UEWW3IbXZ2dmYm5sruxlmZrUiaVVEzA7azykjMzMDSgwIkvaVdL2kuyTdIekPy2qLmZmV24ewGTgtIn4saXdglaRrIuLOEttkZjaxSgsIEfEw8HDy/eOS7gJmAAcEs4pwCYnJUolRRpKWAEuBH3R57WTgZID99tuv0HaZTTKXkJg8pXcqS9oNuAQ4NSIe63w9Is6PiNmImF28eHHxDTSbUC4hMXlKDQiSpmkFg4si4tIy22Jm23MJiclT5igjARcAd0XEJ8pqh5l15xISk6fMJ4QjgH8HHCNpTfL15hLbY2ZtXEJi8pQ5yuh7gMr6fDPrzyUkylf0KK9KjDIys2ryMpflKWOUV+mjjMzMbEdljPJyQDAzq6AyRnk5IJiZVVAZo7wcEMzMKqiMUV7uVDYrWJPrAzX53IpWxigvBwSzAjW5PlCTz60sRY/ycsrIrEB1qg90+ep1HHHudex/+pUcce51XL56Xd/963Ru1p2fEMxy1p5G6bVgbdXqA41yt+/aR/XnJwSzHM1fWNf1CQZQbH2gNHf+o9ztu/ZR/TkgmOWo24W1U5H1gToD1Pydf2dQGOVu37WP6s8BwSxH/S6gAmYWLeRjJ7yssI7DtHf+o9ztL1s6w8dOeBkzixaWcm42PvchmOVo70ULWdclKMwsWshNpx9TeHvS3vkvP/ag7foQIN3dvmsf1ZufEMxyVLU0Sto7f9/tTyY/IZjlqGolpI9+8WIuvPmnXbd38t3+5HFAMMtZlS6s19+9fqjtNlmcMjKbIJ4rYP04IJhNEM8VsH4cEKwRhi2zMKmq1slt1eI+BKs9F1VLr2qd3FYtDghWe/0mW/lCt6MqdXJbtThlZLXnjlKzbPgJwWqv12zgqnWUNnHxmCae0yRzQLDaG7XMQpGa0M/RefE/+sWLuWTVulqfk23PKSOrvTqUWaj74jHdqqRedPNPa31OtiM/IVgjVL2jNM9+jiLSNt0CWl0W+7H0/IRgVoC8JoSlXd9gXMNc5KvWd2PpOSCYFSCvCWFFpaJ6XeTV8XPV+m5sOA4IZgXIq5+jqCG33QIawMLpBey563Rl+25sOKX2IUj6HPBW4OcRcUiZbTHLWx79HEUNuZ1v99lX3MGGjZue2f7kpq0E4q/e/XIHggYo+wnh88BxJbfBrLZGSUWNWvdp2dIZnr3zjveQGzdt4bSv3lJK/SjXsMpWqU8IEXGjpCVltsGszoatTTTufIheqagtEYXPQWjC3I6qqfywU0knAycD7LfffiW3xqx6hklFjVv3qVeKatjjZME1rLJXdspooIg4PyJmI2J28eIdl/kza5ph0yDD7D9uJ3SvzuVhj5MF17DKXuUDgtkkGXZeQbf9T714DUs/enXX94w7H2J+tNSUOgecDnecLHixn+w5IJhVyLDzCrrtD/DIk5u6BpIs5kMsWzrDx991WOkL7Xixn+yVGhAkfRn4PnCQpAcl/Ycy22NWtmHTIP3SI90CSVbzIapQP6oKbWiaskcZvbfMzzermmHnFfT
r5IXuASOr+RBVqB9VhTY0iVNGZhUybBpkUCev8+k2jMoPOzVrus5qpScePsP1d69PNa+g1wxicD7dhqeIXkVsq2d2djbm5ubKboZZZjonV0HrQj5KLtyrl1kvklZFxOyg/fyEYFaiLCdXjZpPdyCxeQ4IZiUqe3KVyz9YO3cqm5Wo7MlVdV/a07LlgGBWorInV5X9hGLV4pSRFc45622GrVbay6i/0yzWU/C/Z3P0DAiS9gDOAPYBroqIL7W99r8i4r8U0D5rGOesdzTu5KpxfqfLjz2o6yintE8o/vdsln4po7+ltWTqJcB7JF0iaefktVfn3jJrJOesszfO73Tc8g/+92yWfimjAyLixOT7yyWdCVwn6W0FtMsaKquctdMU2/QqXdGvpEW7cZ5Q3AfRLP0Cws6SFkTEVoCIOEfSg8CNwG6FtM4aJ6ucdd3TFFkGtCmJLV0mmPYqUZ1l+4pa09mK0S9l9E3gmPYNEfEF4DTg6TwbZc2Vxaiauqcphl3zYJBuwaDf9izbV/YoKctWz4AQEX8cEdd22f6diDgw32ZZU2VRsrjuaYqsA9pMj7vxXtsHGaZ9LkHdLB52aoUbd1RNFdIU46R8sg5o444UGrd9LkHdHJ6YZrVTdppi3JTPoNnJw66pnPVdetmzp608DghWO2WnKcZN+fRaw+DJpzfz3y6/baRgs2zpDMuPPYi9Fy3koQ0bWbFy7ch9EmUHXCtPqpSRpNcCS9r3j4i/y6lNZgOVmaYYN+XTaw2DR57cxEU3/5TOruA01U+zHHmV1expq5+BAUHSF4EDgDXA/G1RAA4INpEW7TrNI09u2mH7MCmVZUtnWLFy7Q6L2vQaFzQo2IxbRrtbn8hNpx8z8H3WLGmeEGaBl0adVtIxy8nlq9fxxK8377B9ekpDp1SG6UQeFGzGeWppwrwOy0aaPoTbgRfk3RCzOlixci2btu54b/TsZ+009MVz0a7TqfZLk78fpyO47vM6LDtpAsJewJ2SVkq6Yv4r74aZVVGvO+5HN+6YQhqk1zP3rtMLhu4wH6cjuO7zOiw7aVJGZ+fdCLOq6sytZ9F/MK9XENm4aSt3nn7MM5/9wYvXsGLl2r4du+N0BFdhXodVw8CAEBE3SHo+8Ipk0w8j4uf5NsusfN1y69MLxPSU2LRl2+39qEMy+12IR8nrjzryKuuJbeNw0cJyDUwZSXoX8EPgncC7gB9IOinvhpmVrVtufdPW4NnP2imTORD90jxF5vXLntcxL+saTza8NCmjM4FXzD8VSFoMXAt8Pc+GmZWtX3/BmrPeOPbx+6V5PnjxmqHalEVbyr4TH3forI0vTUBY0JEi+iWe4WwToIjceq8L8STm9d25Xb40F/bvJCOM3i/p/cCVwLfzbZZV1bB1duqszBIOk1g+wjWUyjcwIETEcuB84FDgMOD8iPhw3g2z6pm0HG+ZufX2z4bWYjfz6ZOm/r4nMQhWjcqcgCzpOOCTwBTwfyPi3H77z87OxtzcXCFtsx0dce51XdMYM4sWTlSZgyJHwnSONoLWRbKpaw54lFE+JK2KiNlB+/XsQ5D0vYg4UtLjbF9iRUBExB5jNnAK+GvgDcCDwI8kXRERd45zXMuPc7zFl3mYtI7WKnRuT7J+K6Ydmfx394jYo+1r93GDQeKVwL0RcV9EPA18BXh7Bse1nDjHW3yZBwdhK1KaeQgHSNo5+f4oSR+QtCiDz54BHmj7+cFkW+fnnyxpTtLc+vXrM/hYG9Uk53jnO9O7pcyg9aSQR2e7g7AVKc0oo0uALZJ+C7gA2B/4UgafrS7bdujQiIjzI2I2ImYXL16cwcfaqNJ0sjZxFFJ7Z3o/7Z3ty792SybnPslB2IqXZh7C1ojYLOkdwHkR8WlJqzP47AeBfdt+3gd4KIPjWo765XibWka5W5pokE1bg7OvuGPs8/ZiNVakNAFhk6T3Au8Djk+2pavb29+PgAMl7Q+sA94D/JsMjmslaWoHaL98/UyPCWT
ADovfjModrVaUNCmj3wNeA5wTEfcnF/ALx/3giNgMnAKsBO4CvhoRd4x7XCtPUztAe+XrJ224rTVfmmqndwIfaPv5fqDvfIG0IuLbeNZzpQ0zLjxNuYU6jjM/+sWLd1jruD2Pv2ePkth7plwAx6wq0owyOkLSNZLukXSfpPsl3VdE46xcw85MHtQBWseZzpevXsclq9btMBHnxMO3pXHOOv5gpqe2HyMxPSXOOv7g4ho6giYOALDxpEkZXQB8AjiS1poIs2xbG8EabNgx94NGIdVxqcZubQ7g+ru3DYFetnSGFScdtt15rzjpsEo/+dQxOFv+0nQqPxoRV+XeEqucUfoE+nWA1rGPIW2b69bx2ys4ZzEyyuorTUC4XtIK4FLgqfmNEfHj3FpllZB1Cea6lHRu7+dYILGlS72vqrV5WL0C3YaNm7h89ToHhQmVJmX0Klppov8BfDz5+ss8G2XVkPWkqDpMsupMpXQLBlVr8yj6BbQqp/AsX2lGGR1dREOserKeFJX3JKv2O/vnLJxGgg1Pbhrqc3pNQptKnhTay1C3n1PdLD/2IE4teFU2q76B5a8lPZ/W08HeEfEmSS8FXhMRFxTRwHYuf229dCsT3S5tyej9T79yx/opbcdoUhnqpR+9uutwWc+vaJ605a/TpIw+T2vy2N7Jz/cAp47eNKuiug9BHFReIu1opl6plPkng1GOWVVnHX9w5VN4Vqw0AWGviPgqsBWemWE8XGEXq7QmDEFMk+ZIs0+vfo5ufQlpj1lVZa4IZ9WUZpTRryT9BkklUkmvBh7NtVVWqDxrEKWdmTzuDOZeI5g69xmkVz/HipVrux5/0a7THHHudbWaed2ubsNlLV9pAsIfAVcAB0i6CVgMnJRrq6xQec0PSFv9NIsqqcuPPahvH8L0AvHk05vZ//QrB164e10kO48/PSWe+PXmZ/LwTanuapNrYMoomW/wOuC1wH8CDo6IW/NumBUnr0VY0s5MzmIGc2f6Y9HCafbcdfqZ7xE88uSmkVNiy5bOcOLhM0ypVaJiSmKnBWLT1u1TSfOTu8zqaOATQrL28ZuBJcn+b5RERHwi57ZZQbrdXWfRuZj2ySOrJ5Red/ZHnHvdDqWoh02Jzdc0mu9L2BLBxk3d+xU8ucvqKk2n8jeB9wO/Aeze9mUNkVfnYtonj7yXicwi4Ay7SE6dRx/Z5ErTh7BPRByae0usVHl0LqZ98sjrCWVeFiUzhn1aqfPoI5tcaQLCVZLeGBFX594aa5S0M5Pb91u3YePA2cDDjkgaNuDMH3++LfMzlLsNPV0g2Nolc1T3Wkc2mdIEhJuByyQtADbRKgcfEbFHri2zRkj75DG/z6DRRqOMSBqmZEbn8dv7DDotnJ7ixMNnuGTVusyfbuq4kJDVX5qA8HFaS2jeFoPqXJiNIc18iFHnTKQNTIP6CqYktkZsd5GefeFzM714ZzEM12wUaQLCPwG31zkY+G6rHtJ0/ua9psKg42yN4P5z37Ldtiz6XwaV3M5qoqBZP2kCwsPAdyVdxfbrIdRi2KnvtkY3TiAd5b3PWTi9w/BQgF2mtw2Gy3tNhUEznvPoG+iVpurkjmrLW5php/cDfw88ixoOO63jso1VME59o1HfK3XfvnHT1mfeO+yaCsMW7et2/DSf00uaz087pNUd1Za3NOshfKSIhuSljss2VsE49Y1GfW+3Usztx2xPzYzSQTxsB3T7KKOZEVKNaT8/zd+iq5BaEXoGBEnnRcSpkr4JO5aIj4i35dqyjNRl2caq6ZU2GVRADkYLwpevXtcavpbimON0ELc/HfYKKlnNyUgbGHv9jXbrwDbLU78nhC8m/631cpl5T3pqql7j7qd65XXajBKEV6xc2zMYDHpvL70C0Pydet79SmkDY6+/UZeitqL17EOIiFXJf28A7gTujIgb5r+KauC4XPN9NL06NnttbzfK2sn9nh5GDeBlL3aTtiSH/0atKvqljAScBZxCazLaAkmbgU9
HxEcLal8mXPN9eDM97vJnxlhToN+/Qb+0yagXx1533r06cLPuVxrm6dR/o1YF/VJGpwJHAK+IiPsBJL0I+N+SPhgRf1VEA60c46bahr3A5ZE2GXaxm1H7lXoNsR0lMJqVSb3mm0laDbwhIn7RsX0xcHVELC2gfduZnZ2Nubm5oj+2sQbNFSh6Ql9Rn9c5+gdGDz5ZHsssL5JWRcTswP36BITbI+KQYV/LkwNCdib9QpZV8Dni3Ot6ptZuOv2YLJpqNra0AaFfyujpEV8bSNI7gbOBlwCvjAhf5QuW5zrKdZBVzt7zXKxJ+gWEwyQ91mW7gF3G/NzbgROAz455HBuRL2TZ8DwXa5J+w06nImKPLl+7R8T0OB8aEXdFhGtHlCjvVcomxShDbM2qKk0to1JJOlnSnKS59evXl92cxvCFLBueQ2BN0rNTeewDS9cCL+jy0pkR8Y1kn+8CH0rbh+BO5Wy5LLjZZMiiU3ksEfH6vI5t2fBkqMnmGwLrlFtAMCuTL3b9eZ0Q66aUPgRJ75D0IK2lOa+UtLKMdlgzjbOWw6TwOiHWTSlPCBFxGXBZGZ9tzTfuHItJeLrwsGPrpvKjjMyGNc7FblKeLjzs2LpxQLDKGnb5y3njXOwmJZXiYcfWjQOCVdI4d+rjXOwmJZXi+RPWjUcZWSWN0w8wTtnpSSpF4WHH1skBwSpp3Dv1US92XnLVJplTRlZJZXV6OpVik8xPCBOkTsMpy7xTdyrFJpUDwoRIMzO1SgHDy0+aFc8BYUIM6qStYikD36mbFcsBYUIM6qTNegW1Kj1tjKoJ52A2DAeECTFoOGWW4++r+LQxrCacg9mwPMpoQgyarJXlqJ4mzPZtwjmYDctPCBNiUCftoFE9w6RPsp7tW0bqZlJmLJu1c0CYIP06afsFjGHTJ1nO9i0rdTNJM5bN5jllZM9YtnSGm04/hvvPfQs3nX7MdkFimPRJloXTykrduPibTSI/IdhAw6ZPspxDUFbqxvMgbBI5INhAo6RPsppDUGbqxvMgbNI4ZWQDlZk+cerGrDh+QihYHSc7jZs+GeecJzl1U8e/Fas3RUTZbUhtdnY25ubmym7GyDpHzEDrbrfJ1TQn8Zyz4N+bZUnSqoiYHbSfU0YFmsTJTpN4zlnw783K4IBQoEmc7DSJ55wF/96sDA4IBSpr0ZcyTeI5Z8G/NyuDA0KBJnHEzCSecxb8e7MyeJRRgSZxxMwknnMW/HuzMniUkZlZw3mUkZmZDcUBwczMAAcEMzNLlBIQJK2QdLekWyVdJmlRGe0wM7NtynpCuAY4JCIOBe4BziipHWZmliglIETE1RGxOfnxZmCfMtphZmbbVKEP4feBq3q9KOlkSXOS5tavX19gs8zMJktuE9MkXQu8oMtLZ0bEN5J9zgQ2Axf1Ok5EnA+cD615CDk01czMyDEgRMTr+70u6X3AW4HfjTrNjqs519g3s15KKV0h6Tjgw8DrIuLJMtowiTpr7K/bsJEzLr0NwEHBzErrQ/gMsDtwjaQ1kv6mpHZMFNfYN7N+SnlCiIjfKuNzJ51r7JtZP652OkH2XrSQdV0u/nWpse/+D7N8VWHYqRWkzjX25/s/1m3YSLCt/+Py1evKbppZYzggTJBlS2f42AkvY2bRQgTMLFpYm0Xb3f9hlj+njCbMsqUztQgAndz/YZY/PyFYLXiNYbP8OSBYLdS5/8OsLpwyslrwGsNm+XNAsNqoa/+HWV04ZWRmZoADgpmZJRwQzMwMcEAwM7OEA4KZmQEOCGZmlnBAMDMzwAHBzMwSDghmZgY4IJiZWcKlK6xwXvnMrJocEKxQ8yufzS92M7/yGeCgYFYyp4ysUF75zKy6HBCsUF75zKy6HBCsUF75zKy6HBCsUF75zKy63KlshfLKZ2bV5YBghfPKZ2bV5JSRmZkBDghmZpZwQDAzM6CkgCDpzyTdKmmNpKsl7V1GO8zMbJuynhBWRMShEfFy4FvAn5bUDjMzS5QSECL
isbYfnw1EGe0wM7NtSht2Kukc4N8DjwJH99nvZODk5MenJN1eQPPKshfwi7IbkTOfY/01/fygeef4wjQ7KSKfm3NJ1wIv6PLSmRHxjbb9zgB2iYizUhxzLiJmM2xmpTT9/MDn2ARNPz+YjHPsJrcnhIh4fcpdvwRcCQwMCGZmlp+yRhkd2Pbj24C7y2iHmZltU1YfwrmSDgK2Aj8B/iDl+87Pr0mV0PTzA59jEzT9/GAyznEHufUhmJlZvXimspmZAQ4IZmaWqFVAmISSF5JWSLo7Oc/LJC0qu01Zk/ROSXdI2iqpMUP7JB0naa2keyWdXnZ7sibpc5J+3uS5QJL2lXS9pLuSv9E/LLtNRapVQGAySl5cAxwSEYcC9wBnlNyePNwOnADcWHZDsiJpCvhr4E3AS4H3Snppua3K3OeB48puRM42A6dFxEuAVwP/tYH/jj3VKiBMQsmLiLg6IjYnP94M7FNme/IQEXdFxNqy25GxVwL3RsR9EfE08BXg7SW3KVMRcSPwL2W3I08R8XBE/Dj5/nHgLmBiVnOq3YppaUteNMTvAxeX3QhLZQZ4oO3nB4FXldQWy4CkJcBS4AfltqQ4lQsIg0peRMSZwJlJyYtTqOEM5zRlPSSdSevx9aIi25aVtKVLGkRdtjXuCXZSSNoNuAQ4tSMz0WiVCwiTUPJi0DlKeh/wVuB3o6YTRYb4d2yKB4F9237eB3iopLbYGCRN0woGF0XEpWW3p0i16kOYhJIXko4DPgy8LSKeLLs9ltqPgAMl7S/pWcB7gCtKbpMNSZKAC4C7IuITZbenaLWaqSzpEmC7khcRsa7cVmVL0r3AzsAvk003R0Ta0h61IOkdwKeBxcAGYE1EHFtuq8Yn6c3AecAU8LmIOKfkJmVK0peBo2iVhv4ZcFZEXFBqozIm6UjgH4DbaF1nAP4kIr5dXquKU6uAYGZm+alVysjMzPLjgGBmZoADgpmZJRwQzMwMcEAwM7OEA4I1iqQtSTXcOyTdIumPJC1IXpuV9KmS2vWPGR2nkZVirRo87NQaRdITEbFb8v3zaM1ovykiajejvRtJL6E1Pv6zwIciYq7kJlmD+AnBGisifg6cDJyilqMkfQtA0tmSvpCsq/HPkk6Q9D8l3SbpO0n5AiQdLukGSaskrZT0m8n270r6C0k/lHSPpN9Oth+cbFuTrGlxYLL9ieS/Sta8uD35rHcn249Kjvn1ZD2Mi5JZs53n1MRKsVYRDgjWaBFxH62/8+d1efkA4C20ylRfCFwfES8DNgJvSYLCp4GTIuJw4HNA++zjnSLilcCpbKup9QfAJ5M1O2Zp1ThqdwLwcuAw4PXAivkgQ6uy5qm01lN4EXDEqOdtNorKFbczy0G3SqQAV0XEJkm30So38Z1k+23AElplUg4Brklu1qeAh9veP18J+JZAAAABLklEQVT4bFWyP8D3aVXj3Qe4NCL+qeMzjwS+HBFbgJ9JugF4BfAY8MOIeBBA0prkmN8b9mTNRuUnBGs0SS8CtgA/7/LyUwARsRXY1FZZdiutmyUBd0TEy5Ovl0XEGzvfnxx/p+RYX6JVeHEjsFLSMZ1N6tPcp9q+f+aYZkVxQLDGkrQY+BvgMyOWEV8LLJb0muR405IOHvCZLwLui4hP0ap2emjHLjcC75Y0lbTvd4AfjtA2s8w5IFjTLJwfdgpcC1wNfGSUAyVLYZ4E/IWkW4A1wGsHvO3dwO1JyufFwN91vH4ZcCtwC3Ad8McR8f/StknSOyQ9CLwGuFLSyrTvNRvEw07NzAzwE4KZmSUcEMzMDHBAMDOzhAOCmZkBDghmZpZwQDAzM8ABwczMEv8fhUj4EgYfpDoAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + 
"plt.scatter(sample[:,0], sample[:,1])\n", + "plt.xlabel('Dimension 1')\n", + "plt.ylabel('Dimension 2')\n", + "plt.title('Sample data')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that the data have a roughtly eliptical pattern. The correlation between the two dimensions is also visible. \n", + "\n", + "With the simulated data set created, it is time to compute the PCA model. The code in the cell below does the following:\n", + "1. Define a PCA model object using the `PCA` function from the Scikit Learn `decomposition` package.\n", + "2. Fit the PCA model to the sample data.\n", + "3. Display the ratio of the **variance explained** by each of the components, where, for a matrix X, this ratio is given by:\n", + "\n", + "$$VE(X) = \\frac{Var_{X-component}(X)}{Var_{X-total}(X)}$$\n", + "\n", + "Notice that by construction:\n", + "\n", + "$$VE(X) = \\sum_{i=1}^N VE_i(X) = 1.0$$\n", + "\n", + "In other words, the sum of the variance explained for each component must add to the total variance or 1.0 for standardized data. \n", + "\n", + "Execute this code and examine the result." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[0.84530942 0.15469058]\n" + ] + } + ], + "source": [ + "pca_model = skde.PCA()\n", + "pca_fit = pca_model.fit(sample)\n", + "print(pca_fit.explained_variance_ratio_)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the explained variance of the first component is many times larger than for the second component. This is exactly the desired result indicating the first principle component explains the majority of the variance of the sample data. \n", + "\n", + "The code in the cell below computes and prints the scaled components. Mathematically, the scaled components are the eigenvectors scalled by the eigenvalues. 
Execute this code: " + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[ 0.51185182 0.67272262]\n", + " [-0.12310741 0.09366825]]\n" + ] + } + ], + "source": [ + "comps = pca_fit.components_\n", + "for i in range(2):\n", + "    comps[:,i] = comps[:,i] * pca_fit.explained_variance_ratio_\n", + "print(comps)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the two vectors have their origins at $[0, 0]$, are of quite different magnitudes, and are pointing in different directions. \n", + "\n", + "To better understand how the projections of the components relate to the data, the code in the cell below plots the data along with the principal components. Execute this code: " + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5,1,'Sample data')" + ] + }, + "execution_count": 24, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAAYQAAAEWCAYAAABmE+CbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3XucXWV97/HPN8MAiYKDEgUGMEhpUBDMYVQwtAJaQRGIgCLnoGJ7ymm9RpEa5FXBttTUVESxPZUWixWwWAPhJgYwIIpymRjkFkBeIJKBI8EarhFy+Z0/9hqys2df1r6uvdb+vl+veWVm77XXetYkWb+1fs/z/B5FBGZmZtOyboCZmfUHBwQzMwMcEMzMLOGAYGZmgAOCmZklHBDMzAxwQDBrmaQzJF3QoX2dL+nvOrEvs1Y5IFjuSDpQ0k8lPSnpvyXdJOmNWberVyTdIOl/Z90OK54tsm6AWTMkbQtcCfwl8F1gS+CPgOezbJdZEfgJwfLmDwEi4jsRsSEi1kbENRFxB4Ck3SUtk/RbSU9IulDSyOSHJf1K0imS7pD0rKTzJL1K0tWSnpZ0naTtkm1nSQpJJ0l6VNJjkk6u1TBJ+ydPLmsk/ULSQXW2nSPp58kxLwa2LntvO0lXSlot6XfJ9zsn751JKQB+XdIzkr6evP5VSY9IekrSckl/1M4v2QaTA4Llzf3ABknfkvTOyYt3GQFfBHYCXgvsApxRsc0xwJ9QCi5HAFcDnwO2p/R/4hMV2x8M7AG8A1gg6e2VjZI0ClwF/B3wcuAzwGJJM6tsuyWwBPh2su1/JW2aNA34d+DVwK7AWuDrABFxGvBj4GMR8dKI+FjymduANyT7uwj4L0lbY9YEBwTLlYh4CjgQCOBfgdWSLpf0quT9ByLi2oh4PiJWA2cBb63YzTkR8ZuImKB0cb0lIlZExPPApcCciu2/EBHPRsSdlC7Ux1dp2gnA9yPi+xGxMSKuBcaBd1XZdn9gGDg7ItZFxPcoXdAnz/G3EbE4Ip6LiKeBM6ucQ+Xv5YLkc+sj4svAVsDsep8xq+SAYLkTESsj4sSI2BnYm9LTwNkAkl4p6T8lTUh6CriA0p1/ud+Ufb+2ys8vrdj+kbLvH06OV+nVwHuTdNEaSWsoBa4dq2y7EzARm1eWfHjyG0kzJH1D0sPJOdwIjEgaqrKvyc+cLGll0tG+BngZU8/brC4HBMu1iLgXOJ9SYIBSuiiAfSJiW0p37mrzMLuUfb8r8GiVbR4Bvh0RI2VfL4mIhVW2fQwYlVTerl3Lvj+Z0t39m5Nz+OPk9cntNytRnPQXfBZ4H7BdRIwAT9L+eduAcUCwXJG0Z3I3PNnJugulFM7NySbbAM8Aa5K8/ikdOOxfJ3ftewEfBi6uss0FwBGSDpU0JGlrSQdNtrPCz4D1wCckbSHpaOBNZe9vQ+lJZY2klwOnV3z+N8BrKrZfD6wGtpD0eWDbFs7TBpwDguXN08CbgVskPUspENxF6a4a4AvA/6B0h3wVcEkHjvkj4AHgh8A/RsQ1lRtExCPAUZQ6p1dTemI4hSr/xyLiBeBo4ETgd8BxFe08G5gOPEHp/H5QsYuvAscmI5C+Biyl1DF+P6XU0+/ZPM1lloq8QI5ZdZJmAQ8BwxGxPtvWmHWfnxDMzAxwQDAzs4RTRmZmBvgJwczMErkqbrf99tvHrFmzsm6GmVmuLF++/ImImFJGpVKuAsKsWbMYHx/PuhlmZrki6eHGWzllZGZmCQcEMzMDHBDMzCzhgGBmZoADgpmZJRwQzMwMyNmwUzMrjiUrJli09D4eXbOWnUamc8qhs5k3ZzTrZg00BwQz67klKyY49ZI7WbtuAwATa9Zy6iV3AjgoZMgpIzPruUVL73sxGExau24Di5bel1GLDBwQzCwDj65Z29Tr1hsOCGbWczuNTG/qdesNBwQz67lTDp3N9OGhzV6bPjzEKYfOzqhFBu5UNrMMTHYce5R
Rf3FAMLNMzJsz6gDQZxwQzHqom2PvizSuv0jnkicOCGY90s2x90Ua11+kc8kbdyqb9Ug3x94XaVx/kc4lbxwQzHqkm2PvizSuv0jnkjcOCGY90s2x90Ua11+kc8kbBwSzHunm2Psijesv0rnkjTuVzXqkm2PvizSuv0jnkjeKiKzbkNrY2FiMj49n3Qwzs1yRtDwixhpt55SRmZkBGQYESbtIul7SSkl3S/pkVm0xM7Ns+xDWAydHxM8lbQMsl3RtRNyTYZvMzAZWZgEhIh4DHku+f1rSSmAUcEAw6xMuITFY+mKUkaRZwBzglirvnQScBLDrrrv2tF1mg8wlJAZP5p3Kkl4KLAbmR8RTle9HxLkRMRYRYzNnzux9A80GlEtIDJ5MA4KkYUrB4MKIuCTLtpjZ5lxCYvBkOcpIwHnAyog4K6t2mFl1LiExeLJ8QpgLfAA4RNLtyde7MmyPmZVxCYnBk+Uoo58Ayur4ZlafS0hkr9ejvPpilJGZ9Scvc5mdLEZ5ZT7KyMzMpspilJcDgplZH8pilJcDgplZH8pilJcDgplZH8pilJc7lc16rMj1gYp8br2WxSgvBwSzHipyfaAin1tWej3Kyykjsx7KU32gJSsmmLtwGbstuIq5C5exZMVE3e3zdG5WnZ8QzLqsPI1Sa8HafqsP1Mrdvmsf5Z+fEMy6aPLCOlEnGEBv6wOlufNv5W7ftY/yzwHBrIuqXVgr9bI+UGWAmrzzrwwKrdztu/ZR/jkgmHVRvQuogNGR6Xzx6Nf3rOMw7Z1/K3f78+aM8sWjX8/oyPRMzs3a5z4Esy7aaWQ6E1WCwujIdG5acEjP25P2zv+UQ2dv1ocA6e72Xfso3/yEYNZF/ZZGSXvn77v9weQnBLMu6rcS0gfvOZMLbv511dcr+W5/8DggmHVZP11Yr793dVOv22BxyshsgHiugNXjgGA2QDxXwOpxQLBCaLbMwqDqt05u6y/uQ7Dcc1G19Pqtk9v6iwOC5V69yVa+0E3VT53c1l+cMrLcc0epWWf4CcFyr9Zs4H7rKC3i4jFFPKdB5oBguddqmYVeKkI/R+XF/+A9Z7J4+USuz8k255SR5V4eyizkffGYalVSL7z517k+J5vKTwhWCP3eUdrNfo5epG2qBbS8LPZj6fkJwawHujUhLO36Bu1q5iLfb303lp4DglkPdGtCWK9SUbUu8qr4ud/6bqw5DghmPdCtfo5eDbmtFtAApg9PY7sZw33bd2PNybQPQdI3gXcDj0fE3lm2xazbutHP0asht5PtPuPyu1mzdt2Lrz+3biOB+Mpxb3AgKICsnxDOBw7LuA1mudVKKqrVuk/z5ozykq2m3kOuXbeBk7/7i0zqR7mGVWdl+oQQETdKmpVlG8zyrNnaRO3Oh6iVitoQ0fM5CEWY29Fv+n7YqaSTgJMAdt1114xbY9Z/mklFtVv3qVaKqtn9dIJrWHVe1imjhiLi3IgYi4ixmTOnLvNnVjTNpkGa2b7dTuhancvN7qcTXMOq8/o+IJgNkmbnFVTbfv7FtzPnb66p+pl250NMjpYaUuWA0+b20wle7KfzHBDM+kiz8wqqbQ/wu+fWVQ0knZgPMW/OKF9+376ZL7TjxX46L9OAIOk7wM+A2ZJWSfqzLNtjlrVm0yD10iPVAkmn5kP0Q/2ofmhD0WQ9yuj4LI9v1m+anVdQr5MXqgeMTs2H6If6Uf3QhiJxysisjzSbBmnUyet8ujWj74edmhVdZbXSY/Yb5fp7V6eaV1BrBjE4n27NU0StIrb9Z2xsLMbHx7NuhlnHVE6ugtKFvKlceARIXr3MapK0PCLGGm3nJwSzDLU9uerpp+G44+CDH2Te+9/fUgBwILFJDghmGWprctVjj8Hhh8OKFfDDH8KOO8Jb39rU8V3+wcq5U9ksQy1Prlq5Eg44oBQMAF54AebNg7vvbur4eV/a0zrLAcEsQy1Nrvrxj+Etb4GHH9789TVr4Igj4PnnUx/f5R+snFN
G1nPOWW/SbLVSvvtd+MAHSk8EFX4/vBUfn3MC93zlptS/006sp+C/z+KoGRAkbQucCuwMXB0RF5W9988R8ZEetM8KxjnrqVJNroqAr3wFTj656tu/nfEy/uyYz3P7TrOhid/pKYfOrjrKKe1wVf99Fku9lNG/U1oydTHwfkmLJW2VvLd/11tmheScdQs2bID582sGg0deMcrRJywqBYNE2t9pu+Uf/PdZLPVSRrtHxDHJ90sknQYsk3RkD9plBdWpnPXApCnWroUTToBLLqn+/v77c9QbP85/z3jZlLfqlbQo1075B/dBFEu9gLCVpGkRsREgIs6UtAq4EXhpT1pnhdOpnHXe0xSpAtoTT8BRR8FPf1p9J0cdBRddxJN/e0MppVShVonqjrWP3q3pbL1RL2V0BXBI+QsR8S3gZGBqj5ZZCp0oWZz3NEWqNQ8efBDmzq0dDD76UVi8GGbMYEONagO1Xu9I+xIuQV0sNQNCRPxVRFxX5fUfRMQe3W2WFVUnShbnPU3RMKDddltpjsH991ffwZe+BOecA0OlC/FojbvxWq+33b4yLkFdLB52aj3XbsnifkhTtNOHUTegXXllqRTFc89N3WDLLeH88+H4zavGtztSqKn2VeES1MXhiWmWO1mnKZpd5rJSrcD1l/ddV+oXqBYMRkbgmmumBAPo/F26l6YcXA4IljtZpyna7cOYEtAi+MyN/8FfLTkbNm6csv2j287kh/+6uG6donlzRjnl0NnsNDKdR9esZdHS+1IHqIbtw/0CgyJVykjSW4BZ5dtHxH90qU1mDWWZpmi3D6N8DYNnn3mOhVd/jWPuvr7qtve8cjdOPPYMhh+At9XZZydHXjU9e9oKo2FAkPRtYHfgdmDytigABwQbSCMzhvndc+umvN5MSmXenFH++bLlfP6/zuDAh39RdZsbZ83hI/NO5ZmtZqAGwabdMtrV+kRuWnBIw89ZsaR5QhgDXhd5WknHrEuWrJjgmd+vn/L68JCaS6msWsXX/vmT7Ln6V1Xf/t7eb2PBYR9n/VDpv2ijYNPOU0sR5nVYZ6TpQ7gL2KHbDTHLg0VL72Pdxqn3Ri/Zcov0F88774QDDqgZDL76luP5zLvmvxgM0uTv2+kIzvu8DuucNAFhe+AeSUslXT751e2GmfWjWnfcT66dmkKqatkyOPBAWLVqylvrNY3PHvZxvnHIBxjdbkZTHebtdATnfV6HdU6alNEZ3W6EWb+qzK231X9w4YXw4Q/Duqmff3Z4az561AJu2H0MrdvIPQsOefHYn7r4dhYtva9ux247HcH9MK/D+kPDgBARP5L0KuCNyUu3RsTj3W2WWfaq5daHp4nhIbFuw6a0UcM78QhYuBA+97mqb69+yQgfPvYM7trhD4DShbiVvH6rI686PbGtHQNTtLBPNUwZSXofcCvwXuB9wC2Sju12w8yyVi23vm5j8JItt0g/B2L9evjIR2oGgwdfsTPvOeEfXwwGkxfiXub1s57XMandCX/WvjQpo9OAN04+FUiaCVwHfK+bDTPLWrUc+oLrv8mqkR34uyu+UiolUc+zz5ZmFl9xRfX3585l5d/+C3HL46jijvhTF9+euk2d0A/lJ9odOmvtSxMQplWkiH6LZzjbAKjMre/+20f489uWMBQbYY8r4a//Gj70IRgenvrhxx8vrW98663Vd37ssfDtb3P41ltz+MGNj13+elG5czt7aS7sP0hGGJ0o6UTgKuD73W2W9aslKyaYu3AZuy24irkLlxX6cb5y5M4nb/pOKRgA/PrX8Od/DnvuCd/6Vik1NOmXvyxVK60VDD71Kbj4Yth669THhuKXj3ANpew1DAgRcQpwLrAPsC9wbkR8ttsNs/4zaDne8tz67NW/4t33/njqRg8+CCeeCHvtBRddBDfdVAoGDz44dVuptC7yWWfBtPr/9cqPDaXFbibTJ0X9fQ9iEOw3ynICsqTDgK8CQ8C/RcTCetuPjY3F+Ph4T9pmU81duKx
qGmN0ZHrxyxycdVbNNY03I1VdvYyttoILLiilippQOdoIShfJoq454FFG3SFpeUSMNdquZh+CpJ9ExIGSnqZUu+jFt4CIiG3bbOAQ8E/AnwCrgNskXR4R97SzX+uegc7xfvrTcPDBPDb/s+x447W1t6sWDF7+crjsstKEtCYNWkdrP3RuD7J6K6YdmPy5TURsW/a1TbvBIPEm4IGIeDAiXgD+EziqA/u1Lhn4HO+cORz7zgUc+cGzuGG3/dJ9ZtasUhqphWAAAx6ErefSzEPYXdJWyfcHSfqEpJEOHHsUeKTs51XJa5XHP0nSuKTx1atXd+Cw1qpBzvFOdqZPrFnLHTv+ISe+7wsc/b8W8ZNX71v7Q3vuCT/7WenPFg18ELaeSjPKaDGwQdIfAOcBuwEXdeDYqvLalOftiDg3IsYiYmzmzJkdOKy1Ks0EpiKOQirvTC/3851fywnvP5Pjjv8ivxrZcbP3npgxwhX/dhns0F5dyEEOwtZ7aeYhbIyI9ZLeA5wdEedIWtGBY68Cdin7eWfg0Q7s17qoXo63qGWUq+Xxy92y6+s56KRz+drlX+LIZCTSiceezqrrfsURc/+wrWN7sRrrpTQBYZ2k44EPAUckr1WZidO024A9JO0GTADvB/5nB/ZrGSlqB2i9fP3o5AQyiU8eeQpXzT6QVS97JXfvuAekrYDagDtarVfSpIw+DBwAnBkRDyUX8AvaPXBErAc+BiwFVgLfjYi7292vZaeoHaC18vWVw21D01i659xSMDDLoTTVTu8BPlH280NA3fkCaUXE9/Gs577WzLjwNOUW8jjO/OA9Z3Lhzb/erIOrPI+/XY2S2NvN6MSDtFnvpBllNFfStZLul/SgpIckVZmGaUXT7MzkRh2geZzpvGTFBIuXT0yZiHPMfpvSOKcfsRfDQ5uPkRgeEqcfsVfvGtqCIg4AsPakSRmdB5wFHEhpTYQxNq2NYAXWbAnmRqOQ8rhUY7U2B3D9vZuGQM+bM8qiY/fd7LwXHbtvXz/55DE4W/el6VR+MiKu7npLrO+00idQrwM0j30Maduct47fWsH5jMvvztV5WGelCQjXS1oEXAI8P/liRPy8a62yvtDpEsx5Kelc3s8xTWJDlXIU/dbmZtUKdGvWrmPJigkHhQGVJmX0Zkppor8Hvpx8/WM3G2X9odOTovIwyaoylVItGPRbm1tRL6D1cwrPuivNKKMqy3fYIOj0pKhuT7Iqv7N/2fRhJFjz3LqmjlNrEtpQ8qRQXoa6/Jzy5pRDZzO/x6uyWf9rWP5a0qsoPR3sFBHvlPQ64ICIOK8XDSzn8tdWS7Uy0eXSlozebcFVU+unlO2jSGWo5/zNNVWHyw5EOfMBk7b8dZqU0fmUJo/tlPx8PzC/9aZZP8r7EMRG5SXSjmaqlUqZfDJoZZ/96vQj9ur7FJ71VpqAsH1EfBfYCC/OMK79P89ypwhDENOkOdJsU6ufo1pfQtp99qs0xQptsKQZZfSspFeQVCKVtD/wZFdbZT3VzRpEaWcmtzuDudYIpsptGqnVz7Fo6X1V9z8yY5i5C5flauZ1ubwNl7XuShMQPg1cDuwu6SZgJtDcOoDW17o1PyBt9dNOVEk95dDZdfsQhqeJ515Yz24Lrmp44a51kazc//CQeOb361/MwxeluqsNroYpo2S+wVuBtwD/B9grIu7odsOsd7q1CEvamcmdmMFcmf4YmT7MdjOGX/wewe+eW9dySmzenFGO2W+UIZVKVAxJbDFNrNu4eSppcnKXWR41fEJI1j5+FzAr2f4dkoiIs7rcNuuRanfXnehcTPvk0aknlFp39nMXLmNNRSnqZlNikzWNJvsSNkSwdl31fgVP7rK8StOpfAVwIvAKYJuyLyuIbnUupn3y6PYykZ0IOI1GMVXb3ixv0vQh7BwR+3S9JZapbnQupn3y6NYTyqROlMxo9mklz6OPbHClCQhXS3pHRFzT9dZYoaSdmVy+3cSatQ1nAzc7IqnZgDO
5/8m2TM5Qrjb0dJpgY5XMUd5rHdlgShMQbgYulTQNWEepHHxExLZdbZkVQtonj8ltGo02amVEUjMlMyr3X95nUGn68BDH7DfK4uUTHX+6yeNCQpZ/aQLClyktoXlnNKpzYdaGNPMhWp0zkTYwNeorGJLYGLHZRXrs1S/v6MW7E8NwzVqRJiD8Ergrz8HAd1v5kKbzt9trKjTaz8YIHlp4+GavdaL/pVHJ7U5NFDSrJ01AeAy4QdLVbL4eQi6Gnfpuq3XtBNJWPvuy6cNThocCbD28aTBct9dUaDTjuRt9A7XSVJXcUW3dlmbY6UPAD4EtyeGw0zwu29gP2qlv1Opnpeqvr1238cXPNrumQrNF+6rtP81xaklz/LRDWt1Rbd2WZj2EL/SiId2Sx2Ub+0E79Y1a/Wy1Uszl+yxPzbTSQdxsB3T5KKPRFlKNaY+f5t+iq5BaL9QMCJLOjoj5kq6AqSXiI+LIrrasQ/KybGO/qZU2aVRADloLwktWTJSGr6XYZzsdxOVPh7WCSqfmZKQNjLX+jVbrwDbrpnpPCN9O/sz1cpndnvRUVLXG3Q/VyuuUaSUIL1p6X81g0OiztdQKQJN36t3uV0obGGv9G3Upauu1mn0IEbE8+fNHwD3APRHxo8mvXjWwXa753ppaHZu1Xi/XytrJ9Z4eWg3gWS92k7Ykh/+NWr+olzIScDrwMUqT0aZJWg+cExF/06P2dYRrvjdvtMZd/mgbawrU+zuolzZp9eJY6867Vgdup/uVmnk69b9R6wf1UkbzgbnAGyPiIQBJrwH+r6RPRcRXetFAy0a7qbZmL3DdSJs0u9hNq/1KtYbYthIYzbKkWvPNJK0A/iQinqh4fSZwTUTM6UH7NjM2Nhbj4+O9PmxhNZor0OsJfb06XuXoH2g9+HRyX2bdIml5RIw13K5OQLgrIvZu9r1uckDonEG/kHUq+MxduKxmau2mBYd0oqlmbUsbEOqljF5o8b2GJL0XOAN4LfCmiPBVvse6uY5yHnQqZ+95LlYk9QLCvpKeqvK6gK3bPO5dwNHAN9rcj7XIF7LO8DwXK5J6w06HImLbKl/bRMRwOweNiJUR4doRGer2KmWDopUhtmb9Kk0to0xJOknSuKTx1atXZ92cwvCFrDM8h8CKpGancts7lq4Ddqjy1mkRcVmyzQ3AZ9L2IbhTubNcFtxsMHSiU7ktEfH2bu3bOsOToQabbwisUtcCglmWfLGrz+uEWDWZ9CFIeo+kVZSW5rxK0tIs2mHF1M5aDoPC64RYNZk8IUTEpcClWRzbiq/dORaD8HThYcdWTd+PMjJrVjsXu0F5uvCwY6vGAcH6VrPLX05q52I3KKkUDzu2ahwQrC+1c6fezsVuUFIpnj9h1XiUkfWldvoB2ik7PUilKDzs2Co5IFhfavdOvdWLnZdctUHmlJH1paw6PZ1KsUHmJ4QBkqfhlFneqTuVYoPKAWFApJmZ2k8Bw8tPmvWeA8KAaNRJ24+lDHynbtZbDggDolEnbadXUOunp41WFeEczJrhgDAgGg2n7OT4+3582mhWEc7BrFkeZTQgGk3W6uSoniLM9i3COZg1y08IA6JRJ22jUT3NpE86Pds3i9TNoMxYNivngDBA6nXS1gsYzaZPOjnbN6vUzSDNWDab5JSRvWjenFFuWnAIDy08nJsWHLJZkGgmfdLJwmlZpW5c/M0GkZ8QrKFm0yednEOQVerG8yBsEDkgWEOtpE86NYcgy9SN50HYoHHKyBrKMn3i1I1Z7/gJocfyONmp3fRJO+c8yKmbPP5bsXxTRGTdhtTGxsZifHw862a0rHLEDJTudotcTXMQz7kT/HuzTpK0PCLGGm3nlFEPDeJkp0E8507w782y4IDQQ4M42WkQz7kT/HuzLDgg9FBWi75kaRDPuRP8e7MsOCD00CCOmBnEc+4E/94sCx5l1EODOGJmEM+5E/x7syx4lJGZWcF5lJGZmTXFAcHMzAAHBDMzS2QSECQtknS
vpDskXSppJIt2mJnZJlk9IVwL7B0R+wD3A6dm1A4zM0tkEhAi4pqIWJ/8eDOwcxbtMDOzTfqhD+FPgatrvSnpJEnjksZXr17dw2aZmQ2Wrk1Mk3QdsEOVt06LiMuSbU4D1gMX1tpPRJwLnAuleQhdaKqZmdHFgBARb6/3vqQPAe8G3hZ5mh2Xc66xb2a1ZFK6QtJhwGeBt0bEc1m0YRBV1tifWLOWUy+5E8BBwcwy60P4OrANcK2k2yX9S0btGCiusW9m9WTyhBARf5DFcQeda+ybWT2udjpAdhqZzkSVi39eauy7/8Osu/ph2Kn1SJ5r7E/2f0ysWUuwqf9jyYqJrJtmVhgOCANk3pxRvnj06xkdmY6A0ZHpuVm03f0fZt3nlNGAmTdnNBcBoJL7P8y6z08IlgteY9is+xwQLBfy3P9hlhdOGVkueI1hs+5zQLDcyGv/h1leOGVkZmaAA4KZmSUcEMzMDHBAMDOzhAOCmZkBDghmZpZwQDAzM8ABwczMEg4IZmYGOCCYmVnCpSus57zymVl/ckCwnppc+WxysZvJlc8ABwWzjDllZD3llc/M+pcDgvWUVz4z618OCNZTXvnMrH85IFhPeeUzs/7lTmXrKa98Zta/HBCs57zymVl/csrIzMwABwQzM0s4IJiZGZBRQJD0t5LukHS7pGsk7ZRFO8zMbJOsnhAWRcQ+EfEG4Erg8xm1w8zMEpkEhIh4quzHlwCRRTvMzGyTzIadSjoT+CDwJHBwne1OAk5Kfnxe0l09aF5WtgeeyLoRXeZzzL+inx8U7xxfnWYjRXTn5lzSdcAOVd46LSIuK9vuVGDriDg9xT7HI2Ksg83sK0U/P/A5FkHRzw8G4xyr6doTQkS8PeWmFwFXAQ0DgpmZdU9Wo4z2KPvxSODeLNphZmabZNWHsFDSbGAj8DDwFyk/d273mtQXin5+4HMsgqKfHwzGOU7RtT4EMzPLF89UNjMzwAHBzMwSuQoIg1DyQtIiSfcm53mppJGs29Rpkt4r6W5JGyUVZmifpMMk3SfpAUkLsm5Pp0n6pqTHizwXSNIukq6XtDL5N/rJrNvUS7kKCAxGyYtrgb0jYh/gfuDUjNvTDXcBRwM3Zt2Q38NCAAAEMElEQVSQTpE0BPwT8E7gdcDxkl6Xbas67nzgsKwb0WXrgZMj4rXA/sBHC/j3WFOuAsIglLyIiGsiYn3y483Azlm2pxsiYmVE3Jd1OzrsTcADEfFgRLwA/CdwVMZt6qiIuBH476zb0U0R8VhE/Dz5/mlgJTAwqznlbsW0tCUvCuJPgYuzboSlMgo8UvbzKuDNGbXFOkDSLGAOcEu2LemdvgsIjUpeRMRpwGlJyYuPkcMZzmnKekg6jdLj64W9bFunpC1dUiCq8lrhnmAHhaSXAouB+RWZiULru4AwCCUvGp2jpA8B7wbeFjmdKNLE32NRrAJ2Kft5Z+DRjNpibZA0TCkYXBgRl2Tdnl7KVR/CIJS8kHQY8FngyIh4Luv2WGq3AXtI2k3SlsD7gcszbpM1SZKA84CVEXFW1u3ptVzNVJa0GNis5EVETGTbqs6S9ACwFfDb5KWbIyJtaY9ckPQe4BxgJrAGuD0iDs22Ve2T9C7gbGAI+GZEnJlxkzpK0neAgyiVhv4NcHpEnJdpozpM0oHAj4E7KV1nAD4XEd/PrlW9k6uAYGZm3ZOrlJGZmXWPA4KZmQEOCGZmlnBAMDMzwAHBzMwSDghWKJI2JNVw75b0C0mfljQteW9M0tcyatdPO7SfQlaKtf7gYadWKJKeiYiXJt+/ktKM9psiIncz2quR9FpK4+O/AXwmIsYzbpIViJ8QrLAi4nHgJOBjKjlI0pUAks6Q9K1kXY1fSTpa0pck3SnpB0n5AiTtJ+lHkpZLWippx+T1GyT9g6RbJd0v6Y+S1/dKXrs9WdNij+T1Z5I/lax5cVdyrOOS1w9K9vm9ZD2MC5NZs5XnVMRKsdYnHBCs0CLiQUr/zl9Z5e3
dgcMplam+ALg+Il4PrAUOT4LCOcCxEbEf8E2gfPbxFhHxJmA+m2pq/QXw1WTNjjFKNY7KHQ28AdgXeDuwaDLIUKqsOZ/SegqvAea2et5mrei74nZmXVCtEinA1RGxTtKdlMpN/CB5/U5gFqUyKXsD1yY360PAY2Wfnyx8tjzZHuBnlKrx7gxcEhG/rDjmgcB3ImID8BtJPwLeCDwF3BoRqwAk3Z7s8yfNnqxZq/yEYIUm6TXABuDxKm8/DxARG4F1ZZVlN1K6WRJwd0S8Ifl6fUS8o/Lzyf63SPZ1EaXCi2uBpZIOqWxSneY+X/b9i/s06xUHBCssSTOBfwG+3mIZ8fuAmZIOSPY3LGmvBsd8DfBgRHyNUrXTfSo2uRE4TtJQ0r4/Bm5toW1mHeeAYEUzfXLYKXAdcA3whVZ2lCyFeSzwD5J+AdwOvKXBx44D7kpSPnsC/1Hx/qXAHcAvgGXAX0XE/0vbJknvkbQKOAC4StLStJ81a8TDTs3MDPATgpmZJRwQzMwMcEAwM7OEA4KZmQEOCGZmlnBAMDMzwAHBzMwS/x8BU3ANqxgVDgAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.scatter(sample[:,0], sample[:,1])\n", + "# plt.plot([0.0, comps[0,0]], [0.0,comps[0,1]], color = 'red', linewidth = 5)\n", + "# plt.plot([0.0, comps[1,0]], [0.0,comps[1,1]], color = 'red', linewidth = 5)\n", + "plt.plot([0.0, comps[0,0]], [0.0,comps[0,1]], color = 'red', linewidth = 5)\n", + "plt.plot([0.0, comps[1,0]], [0.0,comps[1,1]], color = 'red', linewidth = 5)\n", + "\n", + "plt.xlabel('Dimension 1')\n", + "plt.ylabel('Dimension 2')\n", + "plt.title('Sample data')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice the the fist principle component (the long red line) is along the direction of greatest variance of the data. This is as expected. The short red line is along the direction of the second principle component. The lengths of these lines is the variance in the directions of the projection. \n", + "\n", + "The ultimate goal of PCA is to transform data to a coordinate system with the highest variance directions along the axes. The code in the cell below uses the `transform` method on the PCA object to perform this operation and then plots the result. 
Execute this code: " + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5,1,'Sample data')" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY0AAAEWCAYAAACaBstRAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3X20XHV97/H3hxA0BTGgUeEEIVIaBEFyOaA2ahFBUCukUXlwWbHVxfJWVkvFtOFiFV11EZtqUeGuyvUJQUQeY5SH8BDUSkU4MUCIGOCiSA5ciEgAJUISvveP2RMmk5k5e87sPXvvmc9rrazM7Nln5jsze/b3t3+PigjMzMzS2K7oAMzMrDqcNMzMLDUnDTMzS81Jw8zMUnPSMDOz1Jw0zMwsNScNs5xJOlPShRk91zcl/WsWz2U2GU4aNrAkvVHSf0t6QtLvJN0s6ZCi4+oXST+U9OGi47DBsn3RAZjlQdLOwA+A/wlcAuwAvAl4psi4zKrOVxo2qP4MICK+ExGbI2JDRFwXEXcCSNpb0nJJj0n6raRvS5pe/2NJv5a0QNKdkv4g6WuSXi7pGklPSbpB0i7JvntJCkknS3pI0sOSTmsXmKTXJ1dA6yXdIemwDvvOkfTz5DW/C7yw4bFdJP1A0jpJjye3ZyaPfZZakjxH0u8lnZNs/6KkByU9KWmFpDf18iHb8HHSsEF1D7BZ0vmS3l4/wTcQcBawO/BqYA/gzKZ93g0cSS0BvQu4BvhfwEup/Xb+vmn/twD7AG8DFko6ojkoSSPAVcC/ArsCHwculzSjxb47AEuAC5J9L01iqtsO+AawJ/BKYANwDkBEnAH8F3BKROwUEackf3MbcFDyfBcBl0p6IWYpOWnYQIqIJ4E3AgH8H2CdpKWSXp48fl9EXB8Rz0TEOuALwF80Pc2XI+KRiBindgL+WUSsjIhngCuBOU37fzoi/hARq6idzE9sEdr7gasj4uqIeC4irgfGgHe02Pf1wFTg7IjYGBGXUTvp19/jYxFxeUQ8HRFPAZ9t8R6aP5cLk7/bFBGfB14AzO70N2aNnDRsYEXE3RHxwYiYCbyG2lXF2QCSXibpYknjkp4ELqR2BdHokYbbG1rc36lp/wcbbj+QvF6zPYH3JlVT6yWtp5bcdmux7+7AeGw9q+gD9RuS/kTSVyQ9kLyHHwPTJU1p8Vz1vzlN0t1J54D1wIvZ9n2bteWkYUMhIn4JfJNa8oBa1VQAB0bEztSuANTjy+zRcPuVwEMt9nkQuCAipjf82zEiFrXY92FgRFJjXK9suH0atauE1yXv4c3J9vr+W01hnbRf/DNwHLBLREwHnqD3921DxEnDBpKkfZNSdb1heA9q1UW3JLu8CPg9sD5pZ1iQwcv+S1L63x/4G+C7Lfa5EHiXpKMkTZH0QkmH1eNs8lNgE/D3kraXNB84tOHxF1G74lkvaVfgU01//wjwqqb9NwHrgO0lfRLYeRLv04aYk4YNqqeA1wE/k/QHasniLmqlc4BPA/+DWkn7KuCKDF7zR8B9wI3Av0fEdc07RMSDwLHUGtTXUbvyWECL32JEPAvMBz4IPA4c3xTn2cA04LfU3t+1TU/xReA9Sc+qLwHLqDXm30OtmuuPbF2lZjYheREms95I2gv4FTA1IjYVG41ZvnylYWZmqRWaNCQdLWmNpP
skLWzx+EckrZJ0u6SfSNqviDjNzKymsOqppFvgPdQGT62l1v/8xIj4RcM+Oyf97ZF0DPB3EXF0EfGamVmxVxqHAvdFxP1Jg9/F1BoIt6gnjMSONHUhNDOz/ipywsIRtu65sZZab5etSPoo8DFqE84d3uqJJJ0MnAyw4447HrzvvvtmHqyZ2SBbsWLFbyNim+lsmhWZNFoNKNrmSiIizgXOlfQ+4BPASS32OQ84D2B0dDTGxsYyDtXMbLBJemDivYqtnlrL1iNoZ9J6BG3dxcC8XCMyM7OOikwatwH7SJqVzOZ5ArC0cQdJ+zTcfSdwbx/jMzOzJoVVT0XEJkmnUBulOgX4ekSslvQZYCwilgKnJNNLb6Q2InabqikzM+ufQlfui4irgaubtn2y4fY/9D0oMzNryyPCzcwsNScNMzNLzUnDzMxSc9IwM7PUnDTMzCw1Jw0zM0vNScPMzFIrdJzGMFuycpzFy9bw0PoN7D59GguOms28OSNFh2Vm1pGTRgGWrBzn9CtWsWHjZgDG12/g9CtWAThxmFmpuXqqAIuXrdmSMOo2bNzM4mVrCorIzCwdJ40CPLR+Q1fbzczKwkmjALtPn9bVdjOzsnDSKMCCo2YzbeqUrbZNmzqFBUfNLigiM7N03BBegHpjt3tPmVnVOGkUZN6cEScJM6scV0+ZmVlqThpmZpaak4aZmaXmpGFmZqk5aZiZWWpOGmZmlpqThpmZpeakYWZmqTlpmJlZak4aZmaWmpOGmZml5qRhZmapOWmYmVlqThpmZpaak4aZmaVWaNKQdLSkNZLuk7SwxeMfk/QLSXdKulHSnkXEaWZmNYUlDUlTgHOBtwP7ASdK2q9pt5XAaEQcCFwG/Ft/ozQzs0ZFXmkcCtwXEfdHxLPAxcCxjTtExE0R8XRy9xZgZp9jNDOzBkUmjRHgwYb7a5Nt7XwIuKbVA5JOljQmaWzdunUZhmhmZo2KTBpqsS1a7ii9HxgFFrd6PCLOi4jRiBidMWNGhiGamVmj7Qt87bXAHg33ZwIPNe8k6QjgDOAvIuKZPsVmZmYtFHmlcRuwj6RZknYATgCWNu4gaQ7wFeCYiHi0gBjNzKxBYVcaEbFJ0inAMmAK8PWIWC3pM8BYRCylVh21E3CpJIDfRMQxRcVsVhVLVo6zeNkaHlq/gd2nT2PBUbOZN6dTk6FZOkVWTxERVwNXN237ZMPtI/oelFnFLVk5zulXrGLDxs0AjK/fwOlXrAIYisThhJkvjwg3GzCLl63ZkjDqNmzczOJlawqKqH/qCXN8/QaC5xPmkpXjRYc2MJw0zAbMQ+s3dLV9kAxzwuwXJw2zAbP79GldbR8kw5ww+8VJw2zALDhqNtOmTtlq27SpU1hw1OyCIuqfYU6Y/eKkYTZg5s0Z4az5BzAyfRoCRqZP46z5BwxFY/AwJ8x+KbT3lJnlY96ckaFIEs3q79m9p/LjpGHWBXfnLL9hTZj94qRhltKwj38wA7dpmKXm7pxmThpmqbk7p5mThllq7s5p5qRhFbRk5ThzFy1n1sKrmLtoed+miHB3TjM3hFvFFNkY7e6cZk4aVjGdGqP7cfJ2d04bdq6eskpxY7RZsZw0rFLcGG1WLCcNqxQ3RpsVy20aVilujDYrlpOGFWoyczm5Mdry4HnF0nHSsMJ4LicrCx+L6blNwzLT7aA7z+VkZeFjMT1faVgmJlNSc/fZwVeVKh8fi+k5aVgmJjPobvfp0xhv8aN099nB0K4gMfbA77jpl+tKlUh8LKbn6inLxGRKau4+O9jaFSS+fctvGF+/geD5RNKv+cPa8bGYnq80LBOTKal10322CtUcVYixn9oVGKLpfj+ngWnHXbnTc9KwTCw4avZWVRGQrqSWpvtsFXq2VCHGfmtXkGilDG0H7sqdjqunLBPz5oxw1vwDGJk+DQEj06dx1vwDMvkRVqFnSxVi7LdWVT5qs6/bDqrDVxqWmbxKalXo2VKFGPutVZXPW/adweUrxr
u+IrXycNKw0qtCz5YqxFiEVgWJ0T13ddtBhTlpWOlNtr2kn6oQY1m47aDanDSs9KrQs6UKMZplQRHNHeD6+OLS0cAXgSnAVyNiUdPjbwbOBg4EToiIyyZ6ztHR0RgbG8sjXDOzgSVpRUSMTrRfYb2nJE0BzgXeDuwHnChpv6bdfgN8ELiov9GZmVkrRVZPHQrcFxH3A0i6GDgW+EV9h4j4dfLYc0UEaGZmWytynMYI8GDD/bXJtq5JOlnSmKSxdevWZRKcmZltq+2VhqSdgdOBmcA1EXFRw2P/OyL+rsfXbjXOZ1INLBFxHnAe1No0egmqmaeGKBd/H2bF6lQ99Q3gXuBy4G8lvRt4X0Q8A7w+g9deC+zRcH8m8FAGz5sZTw1RU5YTtb8Ps+J1Shp7R8S7k9tLJJ0BLJd0TEavfRuwj6RZwDhwAvC+jJ47E5OZ7rvsukkAS1aOc+bS1azfsHHLtiJP1IP4fVRBWQoNVg6d2jReIGnL4xHxWWpVQD8GXtLrC0fEJuAUYBlwN3BJRKyW9Jl6YpJ0iKS1wHuBr0ha3evrdmPQpoaol9TTTEtd37cxYdQVNafSoH0fVdDNMWPDoVPS+D5weOOGiDgfOA14NosXj4irI+LPImLvJCkREZ+MiKXJ7dsiYmZE7BgRL4mI/bN43bTaTQFR1akhuplUr9W+jYo4UQ/a91EFnojRmrVNGhHxTxFxQ4vt10bEPvmGVQ6DtjBLNyX1iZJCESfqQfs+qsBXd9bM04h0MGhTQ3QzqV6ntRCyOFFPpp580L6PKvBEjNas0GlE8uBpRNpr7n0EtQTQat2LVvsC7PInU/nUu/bv6UTdTRxWLH9XwyPtNCK+0hgi3ZTU8yzVuxdUdfjqzpqlutKQ9OfAXjQkmYj4Vn5hTZ6vNMpv1sKrWo7iFPCrRe/sdzhmRoZXGpIuAPYGbgfqxcMASpk0rPxcT25WXWmqp0aB/WLQGj+sMF6wyKy60iSNu4BXAA/nHIsNCdeTW9l41Ht6aZLGS4FfSLoVeKa+MSKymk7EhpCX/LSyKNOcZlVIXmmSxpl5B2HVl8XBXoUfjA2esvTmK1Py6mTCpBERP5L0cuCQZNOtEfFovmFZlU6gWRzsVfnB2OApy6j3siSviUy4CJOk44BbqU0aeBzwM0nvyTuwYVa1SeKymJ/IcxxZUcoyp1lZktdE0qzcdwZwSEScFBEfoLZM67/kG9Zwq9oJNIuDvSo/GBs8ZZnTrCzJayJpksZ2TdVRj6X8O5ukqp1AszjYq/KDscEzb84IZ80/gJHp0xAwMn1aIdOklCV5TSRNQ/i1kpYB30nuHw9cnV9IVrXBb1mMu/DYDesk7za+MvTmq0pX9DQN4QuSpV7nUpvp4byIuDL3yIZY1U6gWRzsVfnB5KFKnR6KMEydJMqQvCbiWW5LatBPJIP+/tIq0yyyZf1O5i5a3vLKe2T6NG5eeHiLv7DJ6HnuKUk/iYg3SnoKtppfTkBExM4ZxGltVKHEMVmDXHKc6MTb/PjTz24qRTfLsn0njZ9Tu2JtWdv4Bl3bpBERb0z+f1H/wrF2yloKnIyq9Efv1kQn3laPt5P1CXGi46dM30m7tVyalbWNb9ClmeV2b2BtRDwj6TDgQOBbEbE+7+CspmylwF5VrXdYWhOdeCdad71RlifENMdPr99JloWaNJ9Tmdv4Bl2arrOXA5sl/SnwNWAWcFGuUdlWqjZuo5UlK8eZu2g5sxZexXZSy32qXnKc6MSb9gSc9QkxzfHTS5fnrAejdvqciuwSazVputw+FxGbJP0VcHZEfFnSyrwDs+dVvWTeXNLd3KLzRS8nyrJU3U3UVbrd49OnTWXHF2yfW/xpjp9eeuxlXbXV7nPq1PBdlmNgGKRJGhslnQicBLwr2TY1v5CsWdXGbdTVf8jt6u7r1xu9/MjLVHU30Ym33eNnHtPbmusTSXP89NLlOetCTbcJrEzHQDeqmujSJI2/AT4CfDYifiVpFnBhvm
FZo6qN24B0jZkBnH38QT39UNqVck+75A6gvyeNiU68RY1FSXv8TLbHXtaFmm4/pzI14qdV1UQHHqdRGVUrlbTrW9+s17727dYbh+LGO5RRnsdP0WNNqrjmfBnHnmS5Rvhcamtq7JnsXx+n8apeg7T0qjZuI23VRON+kzmxtSvlQvlLm/2U5/FT9Gj+KlbfVrmdMk311NeAfwRWAOn6C9rQ63Qyb94PJn+53qrqpVEVfoTdKuNVZ5GFmipW31Yx0dWl6XL7RERcExGPRsRj9X+5R2aV1mrGzmaNP+zJdiuuz1A6ZUC78TYrYq2Vxu7ScxctL926LmWZpbYbVZnRtpU0Vxo3SVoMXMHWa4T/PLeorPJaVVm8Zd8Z3PTLdS1LyL1crtefo2qlzcnod6NvVRpsq1Z9W3SVXi/SJI3XJf83NpAE4JnC+qSM1RFpdPND7vVyPe8fYavvIM/Xa6ffdeFV7JnUTtl+R1VLdHVppkZ/S14vLulo4IvAFOCrEbGo6fEXAN8CDqa2+NPxEfHrvOIpo6qU9HqVRb10Xj/CVt/BgsvugICNz8WWbf34XvpdF96uXSpNe1WZDMvvqB/SrBH+cklfk3RNcn8/SR/q9YUlTQHOBd4O7AecKGm/pt0+BDweEX8K/AfwuV5ft2oGYQqRNCZTL92vuvZW38HGzbElYdT143vpd114u7aidtvLalh+R/2Qpnrqm8A3qK0VDnAP8F1qvap6cShwX0TcDyDpYuBY4BcN+xxLrbsvwGXAOZIUgza4pIMqd83rVjdXCv0sOWax1nkvmqtV3n3wSNu2oay1mvKl0/ayGqbfUd7SJI2XRsQlkk4HSOahyqLr7QjwYMP9tTzffrLNPsnrPgG8BPhtBq9fCVXumpenXuva29Vvt9qetvswZP+9tEqOl68Y71vvoJEO80BViX9H2UnT5fYPkl5CshCTpNcDT2Tw2q2ub5uLL2n2QdLJksYkja1bty6D0Mqjyl3z8tRLybFdt9VPLFnVcvtb9p2xzXcwdYqYut3Wh2ce30vR1SrdHH9l7prr31F20lxpfAxYCuwt6WZgBvCeDF57LbBHw/2ZwENt9lkraXvgxcDvmp8oIs4DzoPaNCIZxFYaVe6al6deSo7tTsTf+dmD21S7bNi4mZt+uY6z5h/Qt95TZVq1Lu3xV/aGZv+OspNq7qnkhD2bWsl/TURs7PmFa895D/BWYBy4DXhfRKxu2OejwAER8RFJJwDzI+K4Ts87qHNPpVG2LoV56mW+o07zVbXSzzmM0q5aB7UqorJ8x2WcS8m6k+XcU1OAdwB7Jfu/TRIR8YVeAkzaKE4BllHrcvv1iFgt6TPAWEQspdbYfoGk+6hdYZzQy2sOsrKX9LLWS8mx3VXKFKllA2+rq5clK8f59PdX8/jTtfLT9GlTM5nivJvV/cr0HbuheXikqZ76PvBHYBXwXJYvHhFXA1c3bftkw+0/Au/N8jUnowol+EEahJXWZMdltBsT8u6DR7h8xfiEY0WWrBxnwWV3sHHz8wlm/YaNLLi09+nYuz3JluU7dkPz8EiTNGZGxIG5R1JSVSnBD0JJr1/JudNVyuieu04Yw+Jla7ZKGHUbn4ueT+CdVq1r18bR7Xecx+dcxUkDbXLSJI1rJL0tIq7LPZoSqkoJvuolvX4n5+bEUe+NlObqpdNJutck3enk224VxG6+47w+5ywamqtwRW/pksYtwJWStgM28vx6GjvnGllJVKUEX/WSXpUm4us0bqPXJD3RyXey33GnpXez+px7mcalKlf0li5pfB54A7BqmEZi11WlBF/1LoVVmohvwVGzt2nTAJi6nTJJ0u1OvpP9jtP0yCq6EFSVK3pLlzTuBe4axoQB1SrBZzVhXxHVBP1OzllMxZ5H76k0r93ta6TpkVV0Iajd5z6+fgOzFl5VuULQIEuTNB4GfphMWNi4nkZPXW6rouol+G4VVU3Q7+ScxVTseX0eWSftiRJhGQpBnar8Gkfng6
uripYmafwq+bdD8m/oVHXe+8koqpqg38m5rFeQeSTtTifksgwQnGjZXnB1VVmkGhFeJcM8IjwL7UZL93tUdNrk0UupvIy9dfIYWd3L6Pk8NX/+jSs7tjsr9fM4HDY9jwiXdHZEnCrp+7SYJDAijukxRiuhohv+uylpt9r31O/ezqe/v5pPvev59oV2yaGMV5B5dAgoYxXrRLP3tkueRbe9WOfqqQuS//+9H4HYxPpRMi662qab6rF2DbyPP71xS6IBtjk5LbjsDs5cuponNmwsxQm0UV5Ju9cEmfWxN9H3XPRxaO21TRoRsSL5/0eSZiS3B2ve8QrpVwN10aXSbkranUrfjdOHt1p1b/2GWq+nsjWw9nKyzKtQkfWxt2TleNs2lvp3WvRxaO11qp4S8CngFGpVidtJ2gR8OSI+06f4LNHPBuoiq226KWlPtDhS2iqdMjWwZjUWI8tkmOWxV4+zncbvuYzVh9Z5EaZTgbnAIRHxkojYhdrKenMl/WNforMtqjIyvVfdLJbTat9Gu0+flrpap0yf47w5I9y88HB+teid3Lzw8FQnzjwXa8ry2Os0ZsTVT9XQqU3jA8CREbFladWIuF/S+4HrgP/IOzh7XtEN1BPJqmqkm5J2fduZS1dvqW6qazwBpVmfoqjPMavPLc9CRZbHXqd4iu7NZel0ShpTGxNGXUSskzQ1x5ishTI3DGZdNdJNtUR934lOvu3mXYJa3WsRn2OWn1uehYosj71Os/g6YVRDp+qpZyf5mOVg3pwRzpp/ACPTpyFqP7KylMyKXscaOlfp1B9rteA81PqTF/E5Zvm55bkGdpbHntfqrr5OVxqvlfRki+0CXphTPNZBr7OI5tUTpSrtLZ1KuUXI8nPLu7dRVo3S7hVVfZ263LZvYbRKybu7btnbW+rKVsWX9edWld5GVYnTWutUPWUDIu/qo6pUOZStiq8qn5tZozQTFlrF5V19VKUqhzKVcqv0uZnVOWkMgX5UH5XpZFwl/tysalw9NQRcDWJpLFk5ztxFy5m18CrmLlrOkpXjRYdkJeQrjSHgahCbiNfotrScNIaEq0GsE6/RbWm5esrMKjPWxornKw2zAZdmYGdVxtpY8XylYTbA6m0V48kSqvW2iuZGbneWsLR8pWGVUcY1vcsubVtFt50l/F0MLycNy1VWJ5du1w73Ca2mm7aKtJ0l3NNquLl6ynKTtmokjbRToWT5moOgXZtEL20VZZjV2IrjpGG5yfLkkrbE7BPa1vJoq3BPq+FWSNKQtKuk6yXdm/y/S5v9rpW0XtIP+h2j9S7Lk0vaErNPaFvLY5LGPK5erDqKutJYCNwYEfsANyb3W1kM/HXforJMZXlySVti9gltW5NZc7wT97QabkUljWOB85Pb5wPzWu0UETcCT/UrKMtWlieXtCVmn9DyV7Yp5q2/FBH9f1FpfURMb7j/eES0q6I6DPh4RPxlh+c7GTgZ4JWvfOXBDzzwQMYR22QV0ZMpzWu6h5XZ1iStiIjRCffLK2lIugF4RYuHzgDOzzJpNBodHY2xsbFJRGzDornLKMDU7cROL9ye9U9vdBKxoZQ2aeQ2TiMijmj3mKRHJO0WEQ9L2g14NK84zJq16mG18bng8ac3Ah53YNZJUW0aS4GTktsnAd8rKA4bQml6Ug1zN12zTopKGouAIyXdCxyZ3EfSqKSv1neS9F/ApcBbJa2VdFQh0dpASduTali76Zp1Usg0IhHxGPDWFtvHgA833H9TP+Oy9KrckLzgqNnbtGm0MszddKuiysdhVXnuKeta1eceap6c78XTpvKHZzexcfPznULcTbf8qn4cVpWThnVtEFZ5a56czyXW6hmE47CKnDSsa4M4VUfRy+E6aXVvEI/DKnDSKEiVTxJe5S1brmaZHB+HxfAstwWo+vTdnqojO0tWjnPaJXd4Zt5J8HFYDCeNAlR9+m7PPZSNeuFhc5tZGVzN0pmPw2K4eqoAg1AXW3QbwCBoVXho5G
qWifk47D9faRTA03cbdC4kuJrFyspJowCuizVoX0iYIrmaxUrLSSOxZOU4cxctZ9bCq5i7aHmujdKuizVoX3j4/HGv9bFgpeU2DYrp8ui6WGsemV61rtc2nJw08MhSK44LD1Y1rp5iMHozmZn1g5MG7s1kZpaWkwbuzWRmlpbbNHCDpJlZWk4aCTdImplNzEnDzKzC+j1jtpOGmbVU5en7h0URY8zcEG5m26j69P3DoogZs500zGwbVZ++f1gUMcbMScPMtuEBr9VQxBgzJw0z24YHvFZDEWPMnDTMbBse8FoNRcyY7d5TZrYND3itjn6PMXPSMLOWPODVWnH1lJmZpeakYWZmqTlpmJlZak4aZmaWmpOGmZmlVkjSkLSrpOsl3Zv8v0uLfQ6S9FNJqyXdKen4ImI1M7PnFXWlsRC4MSL2AW5M7jd7GvhAROwPHA2cLWl6H2M0M7MmRSWNY4Hzk9vnA/Oad4iIeyLi3uT2Q8CjwIy+RWhmZtsoKmm8PCIeBkj+f1mnnSUdCuwA/N82j58saUzS2Lp16zIP1szManIbES7pBuAVLR46o8vn2Q24ADgpIp5rtU9EnAecBzA6Ohpdhmo5KeMiPmWMyaxKcksaEXFEu8ckPSJpt4h4OEkKj7bZb2fgKuATEXFLTqFaDopYUayKMZlVTVHVU0uBk5LbJwHfa95B0g7AlcC3IuLSPsZmGSjjIj5ljMmsaopKGouAIyXdCxyZ3EfSqKSvJvscB7wZ+KCk25N/BxUTrnWrjIv4lDEms6opZJbbiHgMeGuL7WPAh5PbFwIX9jk0y8ju06cx3uJkXOQiPmWMyaxqPCLcclHGRXzKGJNZ1Xg9DctFGRfxKWNMZlWjiMHqoTo6OhpjY2NFh2FmVimSVkTE6ET7uXrKzMxSc9IwM7PUnDTMzCw1Jw0zM0vNScPMzFJz0jAzs9ScNMzMLDUnDTMzS81Jw8zMUnPSMDOz1Jw0zMwstYGbe0rSOuCBAkN4KfDbAl+/SMP63v2+h8ugvu89I2LGRDsNXNIomqSxNJN+DaJhfe9+38NlWN93naunzMwsNScNMzNLzUkje+cVHUCBhvW9+30Pl2F934DbNMzMrAu+0jAzs9ScNMzMLDUnjRxIWizpl5LulHSlpOlFx9QPkt4rabWk5yQNfJdESUdLWiPpPkkLi46nXyR9XdKjku4qOpZ+krSHpJsk3Z0c5/9QdExFcNLIx/XAayLiQOAe4PSC4+mXu4D5wI+LDiRvkqYA5wJvB/YDTpS0X7FR9c03gaOLDqIAm4DTIuLVwOuBjw7Rd76Fk0YOIuK6iNiU3L0FmFlkPP0SEXdHxJqi4+iTQ4H7IuL+iHgWuBg4tuCY+iIifgz8rug4+i0iHo6Inye3nwLuBkaKjar/nDTy97fANUUHYZkbAR7CwKodAAADhElEQVRsuL+WITyBDCtJewFzgJ8VG0n/bV90AFUl6QbgFS0eOiMivpfscwa1S9pv9zO2PKV530NCLba5//oQkLQTcDlwakQ8WXQ8/eakMUkRcUSnxyWdBPwl8NYYoMEwE73vIbIW2KPh/kzgoYJisT6RNJVawvh2RFxRdDxFcPVUDiQdDfwzcExEPF10PJaL24B9JM2StANwArC04JgsR5IEfA24OyK+UHQ8RXHSyMc5wIuA6yXdLuk/iw6oHyT9laS1wBuAqyQtKzqmvCQdHU4BllFrEL0kIlYXG1V/SPoO8FNgtqS1kj5UdEx9Mhf4a+Dw5Hd9u6R3FB1Uv3kaETMzS81XGmZmlpqThpmZpeakYWZmqTlpmJlZak4aZmaWmpOGDR1Jm5Pukqsl3SHpY5K2Sx4blfSlguL674yeZ6hmG7b+cpdbGzqSfh8ROyW3XwZcBNwcEZ8qNrJsSHo18BzwFeDjETFWcEg2QHylYUMtIh4FTgZOUc1hkn4AIOlMSedLuk7SryXNl/RvklZJujaZUgJJB0v6kaQVkpZJ2i3Z/kNJn5N0q6R7JL
0p2b5/su32ZM2VfZLtv0/+V7Imy13Jax2fbD8sec7LkvVavp2MUm5+T8M027D1mZOGDb2IuJ/ab+FlLR7eG3gntWnPLwRuiogDgA3AO5PE8WXgPRFxMPB14LMNf799RBwKnArUr2Q+AnwxIg4CRqnNY9VoPnAQ8FrgCGBxPRFRm1n1VGpreLyK2ihls77xhIVmNa1mrQW4JiI2SloFTAGuTbavAvYCZgOvoTZlDMk+Dzf8fX1SuxXJ/lCbguMMSTOBKyLi3qbXfCPwnYjYDDwi6UfAIcCTwK0RsRZA0u3Jc/6k2zdrNlm+0rChJ+lVwGbg0RYPPwMQEc8BGxtmLH6OWqFLwOqIOCj5d0BEvK3575Pn3z55rouAY6hdrSyTdHhzSB3Cfabh9pbnNOsXJw0bapJmAP8JnDPJKezXADMkvSF5vqmS9p/gNV8F3B8RX6I2M+6BTbv8GDhe0pQkvjcDt04iNrPMOWnYMJpW73IL3ABcB3x6Mk+ULPX6HuBzku4Abgf+fII/Ox64K6le2hf4VtPjVwJ3AncAy4F/ioj/lzamYZpt2PrPXW7NzCw1X2mYmVlqThpmZpaak4aZmaXmpGFmZqk5aZiZWWpOGmZmlpqThpmZpfb/AeaTYtFDWhWJAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "trans = pca_fit.transform(sample)\n", + "plt.scatter(trans[:,0], trans[:,1])\n", + "plt.xlabel('Dimension 1')\n", + "plt.ylabel('Dimension 2')\n", + "plt.title('Sample data')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the scale along these two coordinates are quite different. The first principle component is along the horizontal axis. The range of values on this direction is in the range of about $\\{ -2.5,2.5 \\}$. The range of values on the vertical axis or second principle component are only about $\\{ -0.2, 0.3 \\}$. It is clear that most of the variance is along the direction of the fist principle component. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load Features and Labels\n", + "\n", + "Keeping the foregoing simple example in mind, it is time to apply PCA to some real data. \n", + "\n", + "The code in the cell below loads the dataset which has had the the following preprocessing:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizating numeric variables. 
\n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example: " + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(999, 35)\n", + "(999, 1)\n" + ] + } + ], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are 35 features in this data set. The numeric features have been Z-score scaled so they are zero centered (mean removed) and have unit variance (divided by the standard deviation). \n", + "\n", + "****\n", + "**Note:** Before performing PCA, all features must have zero mean and unit variance. Failure to do so will result in biased computation of the components and scales. In this case, the data set has already been scaled, but ordinarily scaling is a key step. \n", + "****\n", + "\n", + "Now, run the code in the cell below to split the data set into test and training subsets:" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "x_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "x_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Compute principal components\n", + "\n", + "The code in the cell below computes the principal components for the training feature subset. 
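As an aside, the reason for splitting before fitting is that the components should be computed from the training rows only, with the identical transformation then reused on the held-out rows. A minimal NumPy sketch of that pattern (illustrative names and random data, not the notebook's credit data set):

```python
# Sketch, not from the notebook: derive a PCA-style projection from the
# training subset only, then apply the same projection to the test subset.
import numpy as np

rng = np.random.default_rng(1115)
data = rng.normal(size=(100, 5))
train, test = data[:70], data[70:]

# Components come from the training covariance only...
center = train.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(train - center, rowvar=False))
order = np.argsort(evals)[::-1]          # sort by descending variance
components = evecs[:, order]             # orthonormal column vectors

# ...and the identical centering and rotation are reused on the test rows
train_proj = (train - center) @ components
test_proj = (test - center) @ components
print(train_proj.shape, test_proj.shape)
```

scikit-learn's `fit` and `transform` methods on the `PCA` object encapsulate this same separation, which is why the model is fit on `x_train` below.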
Execute this code:" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "PCA(copy=True, iterated_power='auto', n_components=None, random_state=None,\n", + "  svd_solver='auto', tol=0.0, whiten=False)" + ] + }, + "execution_count": 28, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pca_mod = skde.PCA()\n", + "pca_comps = pca_mod.fit(x_train)\n", + "pca_comps" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to print the variance explained for each component and the sum of the variance explained:" + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[2.10371317e-01 1.41889525e-01 1.30416485e-01 5.29255602e-02\n", + " 4.58877065e-02 4.54231655e-02 4.01315388e-02 3.75523632e-02\n", + " 3.27679657e-02 3.15220744e-02 2.78790155e-02 2.68014992e-02\n", + " 2.43754088e-02 2.14515760e-02 1.74960773e-02 1.66840809e-02\n", + " 1.59279550e-02 1.48172414e-02 1.25291687e-02 1.03074323e-02\n", + " 9.25893998e-03 7.31621592e-03 6.92297009e-03 6.02021439e-03\n", + " 5.17958679e-03 3.83751243e-03 1.73470316e-03 1.34224772e-03\n", + " 1.23045274e-03 4.14884269e-32 3.16981128e-33 2.84951011e-33\n", + " 1.20251621e-33 1.20251621e-33 1.20251621e-33]\n", + "1.0\n" + ] + } + ], + "source": [ + "print(pca_comps.explained_variance_ratio_)\n", + "print(np.sum(pca_comps.explained_variance_ratio_))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These numbers are a bit abstract. However, you can see that the variance ratios are in descending order and that the sum is 1.0. \n", + "\n", + "Execute the code in the cell below to create a plot of the explained variance vs. 
the component: " + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAD8CAYAAACb4nSYAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAHh9JREFUeJzt3XtwXOWd5vHvr9XqVreurZZ8U+viGxdzB2FgIQzJEGJmd3CygQRmmWGqmJDMhpndymZqyW5VkmFmqpJM7eayw4YwCRvIJiHAJBPXhCkWyJUkgGVsQ4wB3y3ZsizrZsm6S+/+0UemLSSrZcs63aefT1VXnz79nu4fB/vp4/e85z3mnENERApDyO8CRERk8Sj0RUQKiEJfRKSAKPRFRAqIQl9EpIAo9EVECohCX0SkgCj0RUQKiEJfRKSAhP0uYLqamhrX1NTkdxkiInlly5Ytx5xztXO1y7nQb2pqoqWlxe8yRETyipkdyKadundERAqIQl9EpIAo9EVECohCX0SkgCj0RUQKiEJfRKSAKPRFRApIYEK/b3CMrz6/i+2tvX6XIiKSs3Lu4qwzZSH48vNvEy0OcVl9ld/liIjkpMAc6VeUFFNdGuFA1wm/SxERyVmBCX2Ahuo4B7oG/S5DRCRnBSr0m5IKfRGR0wlU6DcmSzncN8TI+ITfpYiI5KSAhX4c56C1e8jvUkREclJWoW9mG8zsLTPbbWYPzPD+p8zsDTN7zcxeMLPGjPfuMbNd3uOehSx+usZkKYBO5oqIzGLO0DezIuAh4FZgHXCXma2b1mwr0OycuxR4GviSt2018DngGmA98DkzSyxc+adqSsYB1K8vIjKLbI701wO7nXN7nXOjwBPAxswGzrmfOeemkvYlIOUtfwB4zjnX7ZzrAZ4DNixM6e9WXRqhLBrWkb6IyCyyCf06oDXjdZu3bjb3Av96htueFTOjMRnnQLeO9EVEZpLNFbk2wzo3Y0Ozu4Fm4Pfms62Z3QfcB9DQ0JBFSbNrTMbZ2d5/Vp8hIhJU2RzptwH1Ga9TwOHpjczsZuC/A7c550bms61z7hHnXLNzrrm2ds77+p5WY7KU1u5Bxicmz+pzRESCKJvQ3wysNbOVZhYB7gQ2ZTYwsyuAb5AO/KMZbz0L3GJmCe8E7i3eunOmKRlnfNLR3jd8Lr9GRCQvzRn6zrlx4H7SYb0TeNI5t8PMHjSz27xmfw+UAU+Z2TYz2+Rt2w38Dekfjs3Ag966c6ahOj1sc79O5oqIvEtWs2w6554Bnpm27rMZyzefZttHgUfPtMD5aqp5Z9jme9Yu1reKiOSHQF2RC7C0vIRoOKRhmyIiMwhc6IdCRkN1nP26QEtE5F0CF/qQHsFzUKEvIvIuAQ39OAe6T+DcjJcTiIgUrECGflMyzvDYJEf7R+ZuLCJSQAIZ+lOzbe4/ppO5IiKZAhr6mm1TRGQmgQz9uqoY4ZBxoFtH+iIimQIZ+uGiEKlETMM2RUSmCWToAzRo2KaIyLsENvSbknH2d2nYpohIpsCGfmOylP7hcXoGx/wuRUQkZwQ39KunRvDoZK6IyJTAhn7mbJsiIpIW2NBPJeKYKfRFRDIFNvRLiotYXlGi7h0RkQyBDX1In8zVHbRERN4R8NCPc7Bb3TsiIlMCHvqlHBsYZWBk3O9SRERyQqBDvympYZsiIpkCHfoNmm1TROQUgQ79k/Pq60hfRAQIeOiXRcPUlEU08ZqIiCfQoQ8atikikqkAQj+uPn0REU/wQ7+6lPa+YYbHJvwuRUTEd4EP/amJ11p1kZaISPBDf2oEj7p4REQKIfS9efV1MldEpABCvypeTEVJWEf6I
iIUQOibGY3JUg6oT19EJPihD1PDNtW9IyJSEKHflCzlUM8QYxOTfpciIuKrggj9hmSc8UnH4d4hv0sREfFVQYR+08mJ19SvLyKFrUBCPz1s86D69UWkwBVE6NeWR4kVF+lIX0QKXkGEfnrYpiZeExHJKvTNbIOZvWVmu83sgRnev9HMXjWzcTO7fdp7E2a2zXtsWqjC56uhWsM2RUTCczUwsyLgIeD9QBuw2cw2OefeyGh2EPhT4NMzfMSQc+7yBaj1rDTVlPLztzuZnHSEQuZ3OSIivsjmSH89sNs5t9c5Nwo8AWzMbOCc2++cew3I2YHwjck4o+OTdPQP+12KiIhvsgn9OqA143Wbty5bJWbWYmYvmdkH51XdAmqs9oZtHlO/vogUrmxCf6a+EDeP72hwzjUDfwR8xcxWv+sLzO7zfhhaOjs75/HR2WucGrbZrX59ESlc2YR+G1Cf8ToFHM72C5xzh73nvcDPgStmaPOIc67ZOddcW1ub7UfPy4qqGMVFpmGbIlLQsgn9zcBaM1tpZhHgTiCrUThmljCzqLdcA1wPvHH6rc6NopBRn9AIHhEpbHOGvnNuHLgfeBbYCTzpnNthZg+a2W0AZna1mbUBdwDfMLMd3uYXAi1mth34GfCFaaN+FpXG6otIoZtzyCaAc+4Z4Jlp6z6bsbyZdLfP9O1+A1xyljUumMZkKZv39+Ccw0zDNkWk8BTEFblTGpNxBkbG6T4x6ncpIiK+KKjQ12ybIlLoCir0V9WmQ//ZHUd8rkRExB8FFfqNyVLuWl/PI7/cy4+3HfK7HBGRRVdQoQ/w17ddzDUrq/mrp19jW2uv3+WIiCyqggv9SDjE1+++iqUVUT72eAvtfbqFoogUjoILfYDq0gjfuudqhkYn+NjjLQyNTvhdkojIoijI0Ac4b2k5X7vrcnYcPs6nn9rO5OR8phMSEclPBRv6AO+7YCmfufUCfvJ6O199YZff5YiInHNZXZEbZB97zyp2dQzw1Rd2sXZpGf/u0hV+lyQics4U9JE+pO+f+7cfupjmxgSffmo7r7f1+V2SiMg5U/ChDxANF/HwH19FsjTKnz2+mY7juruWiASTQt9TUxblm/c00z88zn2PtzA8phE9IhI8Cv0MFy6v4Eu3X8r2tj5e2HnU73JERBacQn+am85fAsDBbk3KJiLBo9CfpiwapipeTFuPQl9EgkehP4NUIsahXk3PICLBo9CfQV1VjLYehb6IBI9CfwapRJxDPUM4p6kZRCRYFPozqKuKMTQ2odsqikjgKPRnkErEANSvLyKBo9CfQd1U6KtfX0QCRqE/g1QiDqCTuSISOAr9GVTGiimPhtW9IyKBo9CfRV0ipgu0RCRwFPqzSCU0Vl9EgkehP4upsfoiIkGi0J9FXVWM/pFx+obG/C5FRGTBKPRnMTVWX/36IhIkCv1ZaKy+iASRQn8WGqsvIkGk0J9FIl5MrLhIY/VFJFAU+rMwM2/Ypvr0RSQ4FPqnUaebqYhIwCj0T0M3UxGRoFHon0YqEad3cIyBkXG/SxERWRAK/dPQsE0RCRqF/mm8czMVncwVkWDIKvTNbIOZvWVmu83sgRnev9HMXjWzcTO7fdp795jZLu9xz0IVvhhSVVNX5epIX0SCYc7QN7Mi4CHgVmAdcJeZrZvW7CDwp8D3pm1bDXwOuAZYD3zOzBJnX/biqCmLEgmH1L0jIoGRzZH+emC3c26vc24UeALYmNnAObffOfcaMDlt2w8Azznnup1zPcBzwIYFqHtRhEKmETwiEijZhH4d0Jrxus1bl42z2TYnpBIx2jRWX0QCIpvQtxnWuSw/P6ttzew+M2sxs5bOzs4sP3px1FXFOKSrckUkILIJ/TagPuN1Cjic5ednta1z7hHnXLNzrrm2tjbLj14cqUSMYwOjDI9N+F2KiMhZyyb0NwNrzWylmUWAO4FNWX7+s8AtZpbwTuDe4q3LG3UJjeARkeCYM/Sdc+PA/aTDeifwpHNuh
5k9aGa3AZjZ1WbWBtwBfMPMdnjbdgN/Q/qHYzPwoLcub0xNsaw5eEQkCMLZNHLOPQM8M23dZzOWN5Puuplp20eBR8+iRl/VVemqXBEJDl2RO4elFSWEQ6YplkUkEBT6cygKGcurStS9IyKBoNDPQqoqrhO5IhIICv0s1CVi6tMXkUBQ6GchlYjR0T/M6Pj0WSZERPKLQj8LdVUxnIP2Ph3ti0h+U+hnYWqsvvr1RSTfKfSzkNIdtEQkIBT6WVhWWULI0Fh9Ecl7Cv0sFBeFWFZRoimWRSTvKfSzVJfQzVREJP8p9LOUSsTVpy8ieU+hn6W6qhhHjg8zPqGx+iKSvxT6WUolYkxMOo4cH/a7FBGRM6bQz5JupiIiQaDQz9LJm6ko9EUkjyn0s7S8sgTQkb6I5DeFfpZKiotYUh7lUK8u0BKR/KXQnweN1ReRfKfQn4dUIq47aIlIXlPoz0NdVYzDvUNMTjq/SxEROSMK/XlIJWKMTTiO9o/4XYqIyBlR6M/D1Fh9ncwVkXyl0J+Hel2gJSJ5TqE/DyuqFPoikt8U+vMQj4RJlkYU+iKStxT681SXiGnYpojkLYX+PKUSMd02UUTylkJ/nuqqYhzqGcI5jdUXkfyj0J+nuqoYI+OTHBsY9bsUEZF5U+jP08kpltWvLyJ5SKE/T+/cTEX9+iKSfxT683TyqlwN2xSRPKTQn6eKkmIqSsIaqy8ieUmhfwY0xbKI5CuF/hmo01h9EclTCv0zkEporL6I5CeF/hmoq4pxYnSC3sExv0sREZmXrELfzDaY2VtmttvMHpjh/aiZ/cB7/2Uza/LWN5nZkJlt8x4PL2z5/pgaq7+nc8DnSkRE5mfO0DezIuAh4FZgHXCXma2b1uxeoMc5twb4MvDFjPf2OOcu9x6fWKC6fXVZfSWlkSI+9ngLL+zs8LscEZGsZXOkvx7Y7Zzb65wbBZ4ANk5rsxF4zFt+Gvh9M7OFKzO3LK+MsekvbmBZZYx7H2vhb//lDUbHJ/0uS0RkTtmEfh3QmvG6zVs3Yxvn3DjQByS991aa2VYz+4WZvecs680Zq2vL+NF//Df8yXWNfPPFfdzx8G842KURPSKS27IJ/ZmO2KcPW5mtTTvQ4Jy7AvgU8D0zq3jXF5jdZ2YtZtbS2dmZRUm5oaS4iAc3XszDd1/J3mMn+Ldf+xX/8tphv8sSEZlVNqHfBtRnvE4B05PtZBszCwOVQLdzbsQ51wXgnNsC7AHOm/4FzrlHnHPNzrnm2tra+f9X+GzDxct55i/fw5qlZdz/va185oevMzw24XdZIiLvkk3obwbWmtlKM4sAdwKbprXZBNzjLd8O/NQ558ys1jsRjJmtAtYCexem9NxSXx3nyY9fx8d/bxXff+UgG//h1+zq6Pe7LBGRU4TnauCcGzez+4FngSLgUefcDjN7EGhxzm0CvgV8x8x2A92kfxgAbgQeNLNxYAL4hHOu+1z8h+SC4qIQn7n1Qq5bleS/PLmdDV/9FYl4hNJoEaWRcPo5Gj65HI+EqS2P8tGr66kpi/pdvogUAMu1q0qbm5tdS0uL32WctY7jwzz2m/30DI5xYmScwdFxBkbGGRydSD+PTHBiZJz+kXHKomH+/KbV3HvDSkqKi/wuXUTykJltcc41z9lOoe+v3UcH+MK/vsnzOztYXlnCX33gfD54eR2hUGBHvIrIOZBt6GsaBp+tWVLGN+9p5vsfu5aasiifenI7tz30Ir/Zc8zv0kQkgBT6OeK61Ul+/Mnr+cpHL6d7YJQ/+seX+bPHNrP7qE4Gi8jCUfdODhoem+DRX+/jf/9sD0NjE3ykOcUfXLKcq5uq1ecvIjNSn34AdA2M8LUXdvH9V1oZnZgkGg6xfmU1N6yp4Ya1NVy4rEJ9/yICKPQDZXB0nJf3dfOrt4/x4u5O3u5Iz+5ZUxbh+jU13LCmhhvPq2VpRYnPlYqIX7IN/TnH6Yv/4pEw7z1/Ce89fwkAR/qGeXH3MV7c1cmLu
4/x422HKQoZH7qijr943xoak6U+VywiuUpH+nluctLx5pF+nt7SxndfPsD4pOPfX1HH/Qp/kYKi7p0CdPT4MA//Yu/J8P/wlXXc/961NCTjfpcmIueYQr+AHT0+zNd/sYfvvnyQyUnHh69Mcf/71lBfrfAXCSqFvtBxfJiv/3wP33slHf63XrKcC5aVU18dp6E6Tn0iRnVphADf70akYCj05aQjfcN8/ee7+cnr7RwbGD3lvdJIEfXV8fQjEeeCZeV84OJlVMaKfapWRM6EQl9mdGJknLaeIQ52D9LaPUhrj/fcPURrzyCDoxNEwyFuvXgZH2mu59pVSV0LIJIHNGRTZlQaDXP+snLOX1b+rvecc7x+qI+nWtr48bZD/PO2w6QSMW6/KsXtV6VIJXROQCTf6UhfZjQ8NsGzO47wVEsbv/Ymf7t+dQ13NKf4wEXLNB2ESI5R944smNbuQf7p1TaeamnjUO8QIYMl5SUsryphRWWM5ZUlLK+KsSLjuaYsqm4hkUWk0JcFNznp+O3eLl7a28Xh3mHa+4Y40jfM4b4hhscmT2lbXhLmmpXVXLsqyXWrk5onSOQcU5++LLhQyLh+TQ3Xr6k5Zb1zjt7BMQ73DdHu/Ri80d7PS3u7eH7nUQCq4sVcs7Ka61YluW51DectLdNQUREfKPTlrJkZidIIidIIF62oPOW99r4hfrunK/3Y28WzOzoASJamJ4u7ed1Sbjq/looSDREVWQzq3pFF1do9ePIH4Jdvd9J1YpTiIuPaVUluvnApN69bSl1VzO8yRfKO+vQl501MOrYe7OG5nR0890YHeztPALBueQXvX7eU969bykUrKtQNJJIFhb7knT2dAzz/RgfP7+xgy4EeJh00JuPcfU0jdzSnqIpH/C5RJGcp9CWvdQ2M8MKbR3mqpZXN+3uIhkP84WUr+JPrGrk0VeV3eSI5R6EvgbGz/Tj/96UD/GjrIQZHJ7gsVcnd1zbyh5et0EViIh6FvgRO//AYP3z1EN956QC7jw5QFS/mI831/P4FS6gpj1JTGqUiFtY5AClICn0JLOccL+3t5jsv7efZHR1MTL7zZ7i4yKgujZAsjZIsi5AsjZAsi3rr0sNKq6ce8QiVsWJdNCaBoIuzJLDMjOtWp6/0Pdo/zFtH+ukaGOXYwAhdJ0bpGhih+8QoxwZG2d91gmP9owyNTcz4WSGDqnj6R2BJeZQVVTFWVMWoqyo5ubyiMkYsom4kCQaFvuS1JeUlLCkvmbPd0OgEPYOjdJ9IPzKXpx4dx4d5cdcxOvqHmf4P4OrSCMsrS7hweQXrm6q5emU1Tcm4upIk7yj0pSDEIkXEIukj97mMTUxypG+Y9r5hDvcOcah3iMO9Q7T1DPHTN4/y9JY2AGrLo6xvqmb9ymqubqrmgmXl6iqSnKfQF5mmuCh08m5i0znn2NM5wMv7utm8r5tX9nXzk9fbAagoCdPclP4BuLopwcV1lRpdJDlHoS8yD2bGmiXlrFlSzn+4phGAtp5BXtnXzeb93by8r5ufvpmeZC5SFOLSVKX3Q5DgqsaELjAT32n0jsgC6xoYYcuBHloO9LB5fze/O9TH2ET679naJWU0NyVYXVvG8soYyypLWF5ZwpLyKOGikM+VSz7T6B0RnyTLotxy0TJuuWgZkL4L2fbWXloO9NCyv5ufvNbO8eHxU7aZujHN1I/AssoSzl9aziWpSs5bWk6xfhBkgSj0Rc6xkuIirlmV5JpVSSB9XuD40Djtx4do7xs+edK4vXeII8eH2XV0gF+83cngaHqYaTQcYt2KCi6tq+SSVBWXpipZXVtGkU4ayxlQ6IssMjOjMl5MZbyYC5ZVzNhmctJxsHuQ7W29vN7Wx2uH+nhqSxuP/fYAAPFIEeuWV5AsixCPhCkpLiIeST8yl2ORMDWlEWrLo9SWR6mMFWuYaYFT6IvkoFDIaKoppammlI2X1wHpqaj3HRtge2sfrx/qY8fhPvYdO8HQ2ARDoxMMj
k4wNDbxrmsMMkWKQtSWR6kpj1JbFj35Y1BbFqGmLEqyLEpNWfoq5ooSTWkRRAp9kTxRFHpn5NCHr0rN2MY5x8j4ZPpHYGyCwZFxjg2M0jkwQmd/+nG0f5jO/hHaegbZerCH7sHRGX8oIuEQNd40FrXlURqq4zQm4zQlS2lMxkkl4kTCOteQbxT6IgFiZpQUp7t4Et66tUtPv834xCTdg6Mc6x+l68RIejoL74dianqLI33DvLKvm4GRd05AhwzqErGTPwJNyVIuq6/iEl2fkNOyCn0z2wB8FSgCvumc+8K096PA48BVQBfwUefcfu+9zwD3AhPAXzrnnl2w6kXkrIWLQllNZ+Gco+vEKAe6TrD/2GD6uSv9vGnb4ZMjksIhY92KCq5sSHBFQxVXNiRIJWLqKsoRc4a+mRUBDwHvB9qAzWa2yTn3Rkaze4Ee59waM7sT+CLwUTNbB9wJXASsAJ43s/OcczPPfiUiOcvMqCmLUlMW5arG6lPec87ROTDCtoO9bG3t5dUDPfxgcyvf/s1+AGrKolzRkP5XQGWsmFhxUXpqjOKpE87pR7w4TElxiFDIKDJLP59chiJLv9YPyJnL5kh/PbDbObcXwMyeADYCmaG/Efi8t/w08A+W/r+yEXjCOTcC7DOz3d7n/XZhyheRXGBmLCkvOeX6hPGJSd480s/W1l62Huhha2svz73RsSDfFwmHSFXFSFXHqU/EqK+O01Adpz4Rp746plFKp5FN6NcBrRmv24BrZmvjnBs3sz4g6a1/adq2dWdcrYjkjXBRiIvrKrm4rpI/vjY9ZcXw2AQnRsZPjjgaGvNGHWUsD49NMOkcE5PpR3qZU9YNjU3Q1jNIa/cQr7X10js4dsp3l0fDLKmIEsqz4L9geQX/664rzul3ZBP6M+216ef6Z2uTzbaY2X3AfQANDQ1ZlCQi+WjqJPNC6x8eo7V7iNaeQVq704/OgZEF/55zrT4x9yywZyub0G8D6jNep4DDs7RpM7MwUAl0Z7ktzrlHgEcgPfdOtsWLiACUlxSzbkUx61bMfLGbvCObQbabgbVmttLMIqRPzG6a1mYTcI+3fDvwU5eeyW0TcKeZRc1sJbAWeGVhShcRkfma80jf66O/H3iW9JDNR51zO8zsQaDFObcJ+BbwHe9EbTfpHwa8dk+SPuk7DnxSI3dERPyjqZVFRAIg26mVdQ21iEgBUeiLiBQQhb6ISAFR6IuIFBCFvohIAcm50Ttm1gkcmOXtGuDYIpZztvKtXlDNi0U1n3v5Vi+cXc2NzrnauRrlXOifjpm1ZDMkKVfkW72gmheLaj738q1eWJya1b0jIlJAFPoiIgUk30L/Eb8LmKd8qxdU82JRzedevtULi1BzXvXpi4jI2cm3I30RETkLeRH6ZrbBzN4ys91m9oDf9WTDzPab2etmts3McnIGOTN71MyOmtnvMtZVm9lzZrbLe074WeN0s9T8eTM75O3rbWb2B37WmMnM6s3sZ2a208x2mNl/8tbn7H4+Tc25vJ9LzOwVM9vu1fzX3vqVZvayt59/4E0P77vT1PttM9uXsY8vX/Avd87l9IP0dM57gFVABNgOrPO7rizq3g/U+F3HHDXeCFwJ/C5j3ZeAB7zlB4Av+l1nFjV/Hvi037XNUu9y4EpvuRx4G1iXy/v5NDXn8n42oMxbLgZeBq4FngTu9NY/DPy537XOUe+3gdvP5Xfnw5H+yRuzO+dGgakbs8tZcs79kvT9DzJtBB7zlh8DPrioRc1hlppzlnOu3Tn3qrfcD+wkfZ/onN3Pp6k5Z7m0Ae9lsfdwwPuAp731ObOfT1PvOZcPoT/Tjdlz+g+gxwH/z8y2ePcAzhdLnXPtkP7LDyzxuZ5s3W9mr3ndPznTVZLJzJqAK0gf1eXFfp5WM+TwfjazIjPbBhwFniPdQ9DrnBv3muRUdkyv1zk3tY//ztvHXzaz6EJ/bz6EflY3V89B1zvnrgRuBT5pZjf6XVCAfR1YDVwOt
AP/w99y3s3MyoB/Av6zc+643/VkY4aac3o/O+cmnHOXk74X93rgwpmaLW5Vs5ter5ldDHwGuAC4GqgG/utCf28+hH5WN1fPNc65w97zUeBHpP8Q5oMOM1sO4D0f9bmeOTnnOry/QJPAP5Jj+9rMikmH53edcz/0Vuf0fp6p5lzfz1Occ73Az0n3kVeZ2dRtYXMyOzLq3eB1rTnn3AjwfzgH+zgfQj+bG7PnFDMrNbPyqWXgFuB3p98qZ2Te5P4e4Mc+1pKVqfD0fIgc2tdmZqTvIb3TOfc/M97K2f08W805vp9rzazKW44BN5M+F/Ez4HavWc7s51nqfTPjQMBIn39Y8H2cFxdneUPDvsI7N2b/O59LOi0zW0X66B7SN5//Xi7WbGbfB24iPbNfB/A54J9Jj3hoAA4CdzjncubE6Sw130S6y8GRHjX18an+cr+Z2Q3Ar4DXgUlv9X8j3Ueek/v5NDXfRe7u50tJn6gtIn0w+6Rz7kHv7+ITpLtKtgJ3e0fRvjpNvT8Fakl3a28DPpFxwndhvjsfQl9ERBZGPnTviIjIAlHoi4gUEIW+iEgBUeiLiBQQhb6ISAFR6IuIFBCFvohIAVHoi4gUkP8P4DLv2e3hZhEAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "def plot_explained(mod):\n", + " comps = mod.explained_variance_ratio_\n", + " x = range(len(comps))\n", + " x = [y + 1 for y in x] \n", + " plt.plot(x,comps)\n", + "\n", + "plot_explained(pca_comps)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This curve is often referred to as a **scree plot**. Notice that the explained variance decreases rapidly until the 5th component and then slowly, thereafter. The first few components explain a large fraction of the variance and therefore contain much of the explanatory information in the data. The components with small explained variance are unlikely to contain much explanatory information. Often the inflection point or 'knee' in the scree curve is used to choose the number of components selected. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now it is time to create a PCA model with a reduced number of components. The code in the cell below trains and fits a PCA model with 5 components, and then transforms the features using that model. Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(699, 5)" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pca_mod_5 = skde.PCA(n_components = 5)\n", + "pca_mod_5.fit(x_train)\n", + "Comps = pca_mod_5.transform(x_train)\n", + "Comps.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Compute and evaluate a logistic regression model\n", + "\n", + "Next, you will compute and evaluate a logistic regression model using the features transformed by the first 5 principal components. Execute the code in the cell below to define and fit a logistic regression model, and print the model coefficients. " + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[-0.87078519]\n", + "[[ 0.33553175 -0.0568012 -0.44387225 1.10317819 -0.53369548]]\n" + ] + } + ], + "source": [ + "## Define and fit the logistic regression model, weighting class 1 more heavily\n", + "log_mod_5 = linear_model.LogisticRegression(C = 10.0, class_weight = {0:0.1, 1:0.9}) \n", + "log_mod_5.fit(Comps, y_train)\n", + "print(log_mod_5.intercept_)\n", + "print(log_mod_5.coef_)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that there are now only 5 regression coefficients, one for each component. This is in contrast to the 35 features in the raw data. \n", + "\n", + "Next, evaluate this model using the code below. Notice that the test features are transformed using the same PCA transformation used for the training data. 
Execute this code and examine the results:" + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 132 85\n", + "True negative 23 60\n", + "\n", + "Accuracy 0.64\n", + " \n", + " Positive Negative\n", + "Num case 217.00 83.00\n", + "Precision 0.85 0.41\n", + "Recall 0.61 0.72\n", + "F1 0.71 0.53\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xm81PP+wPHXu6hQkuRaKkXRJsmRLKlulkrUFS0iIZF9vbhc0o+Lrmu71hY7ZY24JaJF0UaLFnEq6hSVFEWl5f374/M9nWnMzPmec+Y731nez8djHs185zsz7/k2Z97z+Xy+n/dHVBVjjDEmnnJhB2CMMSa9WaIwxhiTkCUKY4wxCVmiMMYYk5AlCmOMMQlZojDGGJOQJQrjm4j0EpEPw44jnYjIRhE5NITXrSMiKiK7pfq1gyAi80WkTSkeZ5/JFLBEkaFE5DsR2eR9Uf0oIs+LSOUgX1NVX1HV04J8jUgicoKIfCIiG0TkFxF5T0Qaper1Y8QzQUT6Rm5T1cqquiSg1ztcRN4QkZ+89z9XRG4QkfJBvF5peQmrXlmeQ1Ubq+qEYl7nT8kx1Z/JXGWJIrOdqaqVgWbA0cBtIcdTKrF+FYvI8cCHwLvAQUBdYA4wJYhf8On2y1xEDgOmAcuBI1W1KnAukAdUSfJrhfbe0+24mzhU1S4ZeAG+A06JuD0I+F/E7YrAg8AyYBXwNLBHxP2dgdnAr8BioL23vSowDPgBWAHcA5T37usDTPauPw08GBXTu8AN3vWDgLeANcBS4JqI/QYAbwIve6/fN8b7+xR4Msb2McCL3vU2QAHwD+An75j08nMMIh57C/Aj8BJQDXjfi3mdd72mt/+9wHZgM7AReNzbrkA97/rzwBPA/4ANuC/6wyLiOQ1YBPwCPAlMjPXevX1fjvz/jHF/He+1L/Te30/A7RH3twA+B9Z7/5ePAxUi7lfgSuBbYKm37VFcYvoV+AJoFbF/ee84L/be2xdALWCS91y/ecelu7d/J9znaz3wGdA06rN7CzAX2ALsRsTn2Yt9phfHKuAhb/sy77U2epfjifhMevs0Bj4CfvYe+4+w/1az4RJ6AHYp5X/crn9YNYGvgEcj7n8EGAXsi/sF+h5wn3dfC+/L6lRcq/JgoIF33zvAM8BewP7AdOAy776df5TAyd6Xini3qwGbcAminPdFcidQATgUWAKc7u07ANgKdPH23SPqve2J+1JuG+N9XwT84F1vA2wDHsIlhdbeF9YRPo5B4WMf8B67B1Ad6Oq9fhXgDeCdiNeeQNQXO39OFD97x3c34BVghHffft4X39nefdd6xyBeovgRuCjB/38d77WHeLEfhfvSbejdfwzQ0nutOsBC4LqouD/yjk1h8jzfOwa7ATd6MVTy7rsZ9xk7AhDv9apHHwPvdnNgNXAcLsFciPu8Voz47M7
GJZo9IrYVfp4/By7wrlcGWka9590iXqsPRZ/JKrikeCNQybt9XNh/q9lwCT0Au5TyP879YW3E/bpT4GNgH+8+wX1hRv6aPZ6iX47PAA/HeM6/eF82kS2PnsB473rkH6XgfuGd7N2+FPjEu34csCzquW8DnvOuDwAmJXhvNb331CDGfe2Brd71Nrgv+70i7n8d+KePY9AG+KPwizBOHM2AdRG3J1B8ohgacV9H4Gvvem/g84j7BJdo4yWKrXitvDj3F35p1ozYNh3oEWf/64CRUXH/tZjP2DrgKO/6IqBznP2iE8VTwP9F7bMIaB3x2b04xue5MFFMAu4G9ovznuMlip7ArCD/7nL1Yv2Dma2Lqo4TkdbAq7hfreuBGrhfxV+ISOG+gvt1B+6X3OgYz3cIsDvwQ8TjyuG+0HahqioiI3B/nJOA83DdJYXPc5CIrI94SHlcd1KhPz1nhHXADuBA4Ouo+w7EdbPs3FdVf4u4/T2uVVPcMQBYo6qbd94psifwMC4ZVfM2VxGR8qq6PUG8kX6MuP477hcxXkw737N3/AoSPM9a3Hst1euJyOG4llYe7jjshmvlRdrl/0BEbgT6erEqsDfuMwXuM7PYRzzg/v8vFJGrI7ZV8J435mtHuQQYCHwtIkuBu1X1fR+vW5IYTQnYYHYWUNWJuF+zD3qbfsJ1AzVW1X28S1V1A9/g/kgPi/FUy3Etiv0iHre3qjaO89LDgXNE5BBcK+KtiOdZGvEc+6hqFVXtGBl2gvfzG6774dwYd3fDtZ4KVRORvSJu1wZW+jgGsWK4Ede1cpyq7o3rXgOXYBLG7MMPuJaSe0KXvWrG351xuG6w0noKl2Tre+/lHxS9j0I734+ItMKNG3QDqqnqPrjuycLHxPvMxLIcuDfq/39PVR0e67Wjqeq3qtoT1/X5APCm939c3PEvSYymBCxRZI9HgFNFpJmq7sD1XT8sIvsDiMjBInK6t+8w4CIRaSci5bz7GqjqD7gzjf4jInt79x3mtVj+RFVn4QZ+hwJjVbWwBTEd+FVEbhGRPUSkvIg0EZFjS/B+bsX9Kr1GRKqISDURuQfXfXR31L53i0gF78uuE/CGj2MQSxVcclkvIvsCd0Xdvwo33lIa/wOOFJEu3pk+VwIHJNj/LuAEEfm3iBzgxV9PRF4WkX18vF4V3JjIRhFpAPT3sf823P/nbiJyJ65FUWgo8H8iUl+cpiJS3bsv+rgMAS4XkeO8ffcSkTNExNfZWiJyvojU8P4PCz9T273YdhD//+B94AARuU5EKnqfm+P8vKZJzBJFllDVNcCLuP55cL8O84GpIvIr7hfqEd6+03GDwg/jfjVOxHUXgOtLrwAswHUBvUniLpDhwCm4rq/CWLYDZ+L6+Jfift0PxZ1R5ff9TAZOxw3+/oDrUjoaOElVv43Y9UcvzpW4wePLVbWwuyruMYjjEdzA8E/AVOCDqPsfxbWg1onIY37fi/d+fsK1kAbhupUa4c7s2RJn/8W4pFgHmC8iv+BabDNx41LFuQnXHbgB98X9WjH7j8WdUfYN7lhvZtfuoYdw4z8f4hLQMNyxAjfm9IKIrBeRbqo6Ezdm9Tju/yYfN5bgV3vce96IO+Y9VHWzqv6OO/tsivdaLSMfpKobcCdonIn7XHwLtC3B65o4Cs9YMSbjeDN5X1bVRF04aUlEyuFOz+2lquPDjseYRKxFYUyKiMjpIrKPiFSkaMxgashhGVOswBKFiDwrIqtFZF6c+0VEHhORfK80QfOgYjEmTRyPOyvnJ1z3SBdV3RRuSMYUL7CuJxE5GXee/4uq2iTG/R2Bq3Hnmh+HmyxmA0/GGJNmAmtRqOok3CzVeDrjkoiq6lRgHxHxc964McaYFApzwt3B7HpWRYG37YfoHUWkH9APYK+99jqmQYMGKQnQGGNC9+si2L4Jyu9R/L6xrN4CG7fxxXb9SVVrlOYpwkwU0ZN/IM6EGlUdDAwGyMvL05kzZwYZlzHGpI9xbdy
/p0zw/5jCIQUReOopWL0aGTDg+9KGEGaiKMBNuS9UE3cuvDHGhCd/MHz3avH7pcq62VCtmf/9V6yA/v2he3fo1ctdBxgwoNQhhHl67Cigt3f2U0vgF29msDHGhOe7V92Xc7qo1gzqnFf8fqowZAg0agTjxsHGjUkLIbAWhYgMx1Xo3M8rfnYXruAcqvo0rihdR9yszd9xM4WNMSZ81ZqVrKsnbIsXw6WXwvjx0LatSxiHJa/sVWCJwivqlej+woVTjDEmGKXpRippV086+Oor+OILGDwY+vZ1YxNJZGXGjTHZq7AbqSRf/H67esI2bx58+SX07g1dusCSJVC9evGPKwVLFMaY7BHdgihMEpnUjVScP/6Af/3LXf7yF+jWDSpVCixJgNV6MsZkk+iB6ExpHfg1bRo0bw533+3Oapo1yyWJgFmLwhiTXbKtBVFoxQpo1cq1It5/H844I2UvbYnCGJN54g1SZ+JAdHG++QYOPxwOPhheew3atYO99y7+cUlkXU/GmMwTb65DNnU1rV8P/fpBgwYwaZLb9re/pTxJgLUojMle6TbDOJmycZA60qhRbkb1jz/CzTfDsSVZRTj5rEVhTLZKtxnGyZRNLYdofftC587uLKZp0+CBB2CPUhYETBJrURiTzbL5V3c2iSzil5cHhxwCt9wCFSqEG5fHEoUx2SSyuykbB3az0fLlcPnl0KMHXHCBu55mrOvJmGwS2d2Uzd0z2WDHDlcCvHFjmDABtmwJO6K4rEVhTLax7qb09+23bixi0iQ45RRXo6lu3bCjistaFMZkg/zBboGbbB28zjYLFsDcufDss/Dhh2mdJMBaFMZkh8jid9bdlJ7mzIHZs+HCC91ZTUuWQLVqYUfliyUKYzJJcTOSrcsp/WzZAvfcA/ffDwce6Go0VaqUMUkCrOvJmMySCzOSs8nnn8PRR7tEcd55KSvil2zWojAm01jLITOsWAGtW8MBB8Do0dChQ9gRlZolCmPSnc2NyCwLF0LDhq6I3+uvuyJ+VaqEHVWZWNeTMenO5kZkhnXr4OKLoVEj+PRTt61Ll4xPEmAtCmMyg3U3pbeRI+GKK2DNGrjtttCL+CWbJQpjwuSnwqt1N6W3iy+G556DZs3gf/9zK9BlGUsUxoQpcv5DPNbdlH4ii/i1bAn168NNN8Huu4cbV0AsURgTtEStBpv/kHm+/x4uu8yd7tq7t1tcKMvZYLYxQUu0LoS1FjLHjh3wxBPQpAlMngxbt4YdUcpYi8KYVLBWQ2ZbtMgV8Zs8GU47DZ55BurUCTuqlLFEYUxJlXSJURuMznyLFsH8+fD88667SSTsiFLKup6MKamSLjFq3UuZadYsdzYTwFlnuSJ+F16Yc0kCrEVhzK5KcrqqdSVlp82bYeBAGDTIza7u2dPVZ9pnn7AjC421KIyJ5Ke1YC2E7DVlipsPcd99rotp9uyMLOKXbNaiMCaatRZy04oV0Lata0WMHesGrQ1gLQpjHFshLnctWOD+PfhgeOst+OorSxJRLFEYA7ZCXC76+Wfo0wcaN3ZrVwOceSZUrhxqWOnIup6MKWRdTrnjrbfgyith7Vq4/XZo0SLsiNKaJQpjTG7p0wdeeMEV7/vgAzd4bRKyRGGMyX6RRfxOOMEtLHTjjbCbfQX6EegYhYi0F5FFIpIvIrfGuL+2iIwXkVkiMldEOgYZjzEx5Q+G1RPDjsIEZelSNzj94ovudr9+cMstliRKILBEISLlgSeADkAjoKeINIra7Q7gdVU9GugBPBlUPMbEVTjBzgaxs8v27fDYY66I39SpRa0KU2JBtihaAPmqukRV/wBGAJ2j9lFgb+96VWBlgPEYE9/+raFe9peLzhkLF0KrVnDttdC6tavT1KdP2FFlrCDbXgcDyyNuFwDHRe0zAPhQRK4G9gJOifVEItIP6AdQu3btpAdqclBkqQ4r2pd98vNdIb+XXoJevXKyPlMyBdmiiPU/E9326wk8r6o1gY7ASyLyp5h
UdbCq5qlqXo0aNQII1eScyFIdNnciO3zxBTz7rLt+5plubOL88y1JJEGQLYoCoFbE7Zr8uWvpEqA9gKp+LiKVgP2A1QHGZYxj8yayw6ZNcPfd8OCDUKuWW3muUiXYe+/iH2t8CbJFMQOoLyJ1RaQCbrB6VNQ+y4B2ACLSEKgErAkwJmNMNpk0CY46Ch54wI1BzJplRfwCEFiLQlW3ichVwFigPPCsqs4XkYHATFUdBdwIDBGR63HdUn1U7dQEY4wPK1ZAu3auFTFunLtuAhHoicSqOhoYHbXtzojrC4ATg4zBmJ1sADs7fPUVHHmkK+I3cqSr+LrXXmFHldWsKKDJHTaAndl++gkuuACaNi0q4tepkyWJFLCpiSa32AB25lGFN96Aq66CdevgrrvguOgz7U2QLFGY7BW9rKl1N2WmCy908yHy8uDjj123k0kpSxQme0WuMQHW3ZRJIov4tW7tupuuu87qM4XEjrrJbtbVlHmWLIFLL3WT5S66CC65JOyIcp4NZhtj0sP27fDII65racYMKGdfT+nCWhTGmPAtWAAXXwzTpsEZZ8DTT0PNmmFHZTyWKExmix6wjmSD15lj6VJYvBhefRV69LD6TGnG2nYms0XOjYhmg9fpbcYMGDLEXT/jDDc20bOnJYk0ZC0Kk/lswDqz/P473HknPPwwHHKIm0RXqRJUqRJ2ZCYOSxQmM8TrYrLupcwyYQL07eu6mS67zBXzsyJ+ac+6nkxmiNfFZN1LmaOgAE491V3/5BM3YF21argxGV+sRWEyh3UxZaY5c1wp8Jo14d13oU0b2HPPsKMyJWAtCmNMMNascYsINWsGEye6bR07WpLIQNaiMMYklyqMGAHXXAO//OJWnzv++LCjMmXgK1F4K9TVVtX8gOMxxmS6Cy6AV15xFV6HDYPGjcOOyJRRsV1PInIG8BXwkXe7mYiMDDowY0wG2bGjqJBf27bw0EMwZYoliSzhZ4xiIHAcsB5AVWcD9YIMypid8gfDuDbxJ9WZ8OXnu2VIn3vO3b7kErj+eihfPty4TNL4SRRbVXV91DZb19qkRmSpcDsNNr1s2wYPPuiK+M2aBRUqhB2RCYifMYqFItINKCcidYFrganBhmVMBDstNv3Mm+dKgM+cCZ07w5NPwkEHhR2VCYifFsVVwDHADuBtYDMuWRhjctWyZfD99+7sppEjLUlkOT8titNV9RbglsINInI2LmkYY3LFtGlu8ly/fm4+xJIlULly2FGZFPDTorgjxrbbkx2IMSZN/fYb3HCDmwsxaBBs2eK2W5LIGXFbFCJyOtAeOFhEHoq4a29cN5QxwYgsAGhF/8L1ySduWdIlS6B/f7j/fqhYMeyoTIol6npaDczDjUnMj9i+Abg1yKBMjos808nOdgpPQQGcfjrUretKcJx8ctgRmZDETRSqOguYJSKvqOrmFMZkjJ3pFKZZs+Doo10Rv/feg9atYY89wo7KhMjPGMXBIjJCROaKyDeFl8AjM8ak1qpV0L07NG9eVMSvfXtLEsZXongeeA4QoAPwOjAiwJiMMamkCi+/DI0awTvvwD33wAknhB2VSSN+EsWeqjoWQFUXq+odQNtgwzI5ycp1hOO881whvyOOgNmz4fbbYffdw47KpBE/8yi2iIgAi0XkcmAFsH+wYZmcZOU6UmfHDhBxl9NOc6e+Xnml1WcyMflJFNcDlYFrgHuBqsDFQQZlcpgNYgfvm2/cKa+9e7sCfhddFHZEJs0VmyhUdZp3dQNwAYCI1AwyKJNhIuc9lIXNmQjWtm2u/Pddd0GlSjZIbXxLOEYhIseKSBcR2c+73VhEXsSKAppIhV1GZWVdTsGZOxdatoRbboEOHWDBAjc2YYwPiWZm3wd0BeYAd3iLFV0LPABcnprwTForbEkUtgSsyyh9FRTA8uXwxhvQtasbmzDGp0RdT52Bo1R1k4jsC6z0bi/y++Qi0h54FCgPDFXV+2Ps0w0YgFvjYo6q2s+cTGGDz+n
ts89cS+Lyy4uK+O21V9hRmQyUKFFsVtVNAKr6s4h8XcIkUR54AjgVKABmiMgoVV0QsU994DbgRFVdJyJ2NlWmsZZE+tm40Z3i+t//wmGHucHqihUtSZhSS5QoDhWRwlLiAtSJuI2qnl3Mc7cA8lV1CYCIjMC1UhZE7HMp8ISqrvOec3UJ4zepFD1obYPP6efDD10Z8GXL3Omu//qXFfEzZZYoUXSNuv14CZ/7YGB5xO0C3NrbkQ4HEJEpuO6pAar6QfQTiUg/oB9A7dq1SxiGSZrIriawLqd0s3w5nHGGa0VMmgQnnRR2RCZLJCoK+HEZnzvWaFn0Wtu7AfWBNkBN4FMRaRK9RreqDgYGA+Tl5dl63WGyrqb088UXcMwxUKsWjB4NrVq501+NSRI/JTxKqwCoFXG7Jm5APHqfd1V1q6ouBRbhEocxpjg//gjnngt5eUVF/E491ZKESbogE8UMoL6I1BWRCkAPYFTUPu/g1Y3y5mocDiwJMCZjMp8qvPCCK+L33ntuHMKK+JkA+SnhAYCIVFTVLX73V9VtInIVMBY3/vCsqs4XkYHATFUd5d13mogsALYDN6vq2pK9BWNyTI8e8PrrcOKJMHQoNGgQdkQmyxWbKESkBTAMV+OptogcBfRV1auLe6yqjgZGR227M+K6Ajd4F2NMPJFF/Dp2dOMQV1wB5YLsFDDG8fMpewzoBKwFUNU5WJlxY1Ln66/dMqTDhrnbF14IV11lScKkjJ9PWjlV/T5q2/YggjHGRNi61Y0/HHWUq81UuXLYEZkc5WeMYrnX/aTebOurAVsK1ZggzZ7tZlTPng3nnONmWR9wQNhRmRzlJ1H0x3U/1QZWAeO8bcaYoPz4o7u89RacXVwRBGOC5SdRbFPVHoFHYkyumzzZFfG74gpo3x4WL4Y99ww7KmN8jVHMEJHRInKhiFQJPCJjcs2GDW5wulUreOQR2OKdhW5JwqQJPyvcHSYiJ+AmzN0tIrOBEao6IvDoTDjirVhnRQCTb+xYV8Rv+XK49lq45x4r4mfSjq/z61T1M1W9BmgO/Aq8EmhUJlzxVqyzIoDJtXw5dOrkWg6TJ7vWhJ3ZZNKQnwl3lXHlwXsADYF3AasXkO2s+F8wVGHGDGjRwhXxGzPGVXm1+kwmjfkZzJ4HvAcMUtVPA47HpEq87iWwLqag/PCDWyNi5EiYMAFat4ZTTgk7KmOK5SdRHKqqOwKPxKRW9NoSkayLKblU4fnn4YYbYPNmeOABV6fJmAwRN1GIyH9U9UbgLRH50xoQPla4M+nOupdSo1s3ePNNd1bT0KFw+OFhR2RMiSRqUbzm/VvSle2MMdu3uwJ+5crBmWfCX/8Kl11m9ZlMRor7qVXV6d7Vhqr6ceQFN6htjIll4ULXeigs4te7N/Tvb0nCZCw/n9yLY2y7JNmBmBTKHwyrJ4YdRfbZutXNg2jWDBYtgqpVw47ImKRINEbRHXdKbF0ReTvirirA+tiPMhmh8GwnG7BOnlmzoE8fV4Kje3d47DHYf/+wozImKRKNUUzHrUFRE3giYvsGYFaQQZkARJ4Ou2427N8a6vULN6ZssmoV/PQTvPMOdO4cdjTGJFXcRKGqS4GluGqxJtNFng5rp78mx6RJ8NVXbm5E+/aQnw977BF2VMYkXaKup4mq2lpE1gGRp8cKbhXTfQOPziSXnQ6bHL/+CrfeCk895U517dvX1WeyJGGyVKLB7MLlTvcDakRcCm8bk3tGj4bGjeGZZ9wEui+/tCJ+JuslOj22cDZ2LaC8qm4HjgcuA/ZKQWzGpJfly934Q9Wq8Nln8J//wF72p2Cyn5/TY9/BLYN6GPAibg5FnCJBxmQZVZg61V2vVQs+/NC1Io47Lty4jEkhP4lih6puBc4GHlHVq4GDgw3LJE3+YBjXJnbZcJPYypXQpQscfzxM9OadtG0LFSqEG5cxKeYnUWwTkXOBC4D3vW27BxeSSarIs53sTCd/VF1NpkaNXAviwQetiJ/
JaX6qx14MXIErM75EROoCw4MNy/iSqFR4ocIkYWc7+XfOOfD2264M+NChUK9e2BEZEyo/S6HOE5FrgHoi0gDIV9V7gw/NFCtRqfBC1pLwJ7KIX5cucNppcOmlVp/JGPytcNcKeAlYgZtDcYCIXKCqU4IOzvhgrYWymzfPzYW45BKXHC64IOyIjEkrfrqeHgY6quoCABFpiEsceUEGZuKILsVhK9GV3h9/wH33wb33ulNeq1ULOyJj0pKfdnWFwiQBoKoLATvtIyyF3U1g3Upl8cUXcMwxMGAAnHsuLFjgxiaMMX/ip0XxpYg8g2tFAPTCigKGy7qbym7tWli/Ht57Dzp1CjsaY9Kan0RxOXAN8HfcGMUk4L9BBmVMIMaPd0X8rrnGDVZ/+y1UqhR2VMakvYSJQkSOBA4DRqrqoNSEZEyS/fIL/P3vMHgwNGjgliStWNGShDE+xR2jEJF/4Mp39AI+EpFYK90Zk97ee89NnBs6FG66yY1NWBE/Y0okUYuiF9BUVX8TkRrAaODZ1IRlTBIsXw5du7pWxDvvwLHHhh2RMRkp0VlPW1T1NwBVXVPMvsakB1VX2RWKivjNnGlJwpgySNSiODRirWwBDotcO1tVzy7uyUWkPfAoUB4Yqqr3x9nvHOAN4FhVnek3+KwXq0SHzZ2Ir6AA+veH99+HCRNcCY42bcKOypiMlyhRdI26/XhJnlhEyuPW2j4VKABmiMioyDkZ3n5VcGdVTSvJ8+eEWCU6bO7En+3YAUOGwM03w7Zt8NBDcNJJYUdlTNZItGb2x2V87ha4ulBLAERkBNAZWBC13/8Bg4Cbyvh62cnmTBSva1c3BvHXv7qEceihYUdkTFYJctzhYGB5xO0CotaxEJGjgVqq+j4JiEg/EZkpIjPXrFmT/EhN5tm2zbUkwCWKIUNg3DhLEsYEIMhEITG26c47Rcrh6kjdWNwTqepgVc1T1bwaNWy57pw3d65bTGjIEHf7/PNdUT+J9ZEzxpSVn5nZAIhIRVXdUoLnLsCtt12oJrAy4nYVoAkwQdwf+AHAKBE5K+cGtOOtK2ED17vasgX+9S93qVYN7EeDMSlRbItCRFqIyFfAt97to0TETwmPGUB9EakrIhWAHsCowjtV9RdV3U9V66hqHWAqkHtJAnYt9BfJBq6LzJgBzZvDwIHQsycsXAhnF3vinTEmCfy0KB4DOuFmaaOqc0SkbXEPUtVtInIVMBZ3euyzqjpfRAYCM1V1VOJnyDE2aJ3YunWwcSOMHg0dOoQdjTE5xU+iKKeq38uu/b/b/Ty5qo7GzeiO3HZnnH3b+HlOk0M++cQV8bv2WlfE75tvrPyGMSHwM5i9XERaACoi5UXkOuCbgOMyuWz9erfSXLt28MwzbmwCLEkYExI/LYr+uO6n2sAqYJy3zZSGzbZO7N133ezqVatcxdcBAyxBGBOyYhOFqq7GDUSbZLDZ1vEtW+ZWm2vYEEaNgjxbbdeYdFBsohCRIUTMfyikqv0CiSgX2MB1EVWYPBlatYLatd2kuZYtoYKttmtMuvDT9TQu4nol4G/sOuPaxGJzI4q3bBlcfjmMGVNUxO/kk8OOyhgTxU/X02uRt0XkJeCjwCLKFrG6mMC6mcCV3nj6abjlFteieOwxK+JnTBrzPTM7Ql3gkGQHkpU53fGkAAAVGklEQVSsiym2s892g9annuqWJ61TJ+yIjDEJ+BmjWEfRGEU54Gfg1iCDMllo2zYoV85duneHzp2hTx+rz2RMBkiYKMTNsjsKWOFt2qGqfxrYNiahOXPg4ovd3IjLL3clOIwxGSPhhDsvKYxU1e3exZKE8W/zZrjjDneaa0EBHHBA2BEZY0rBz8zs6SLSPPBITHaZPh2OPhruvRd69XJF/Lp0CTsqY0wpxO16EpHdVHUbcBJwqYgsBn7DrTOhqmrJw8T366+waRN88AGcfnrY0RhjyiDRGMV0oDlgPwPjiTdXAnJzvsSHH8L8+XD99XDKKbBokZX
fMCYLJOp6EgBVXRzrkqL40lu8dSQgt+ZLrFsHF13kWg7DhlkRP2OyTKIWRQ0RuSHenar6UADxZJ5cnyvx9ttw5ZWwZg3cdhvceaclCGOyTKJEUR6oTOy1r41xJTh69IAmTdyCQkcfHXZExpgAJEoUP6jqwJRFYjKDKkya5Ooy1a7tFhc67jjYffewIzPGBKTYMQpjdvr+e7cMaZs2MHGi23bSSZYkjMlyiRJFu5RFYdLbjh3w+OPQuLErCf7f/7qy4MaYnBC360lVf05lICaNdekC773nzmp65hk4xGpCGpNL/MzMNtHyB8O4NvFPjc0GW7e6lgS42kwvvODWjbAkYUzOsURRGpFrTWTjXIkvv4QWLdyaEeASRe/eVunVmBxVmvUoDGTn/IlNm2DgQPj3v6FGDahVK+yIjDFpwBKFcaZOhQsvhG++cSXBH3wQqlULOypjTBqwRGGc335z4xIffeTqNBljjMcSRXFiFf7LloJ/H3zgivjdeCO0awdffw0VKoQdlTEmzdhgdnFiFf7L9EHstWtdN1OHDu5spj/+cNstSRhjYrAWRSL5g2H1RNi/dXYMXKvCW2+5In4//+xWn7vjDksQxpiELFEkUtjllMmth0jLlsF550HTpm7tiKOOCjsiY0wGsK6n4uzfGur1CzuK0lN1hfvATZabMMGd4WRJwhjjkyWKbLZ0KZx2mhuoLizid8IJsJs1JI0x/lmiyEbbt8Ojj7p1IqZNg6eesiJ+xphSs5+W2ahzZ/jf/6BjR1eGw2ZYG2PKwBJFtti6FcqXh3Ll4IILXH2m886z+kzGmDILtOtJRNqLyCIRyReRW2Pcf4OILBCRuSLysYhYadLSmDkT8vJcFxNA9+7Qq5clCWNMUgSWKESkPPAE0AFoBPQUkUZRu80C8lS1KfAmMCioeEokU8qIb9oEt9ziliJds8ZKgBtjAhFki6IFkK+qS1T1D2AE0DlyB1Udr6q/ezenAjUDjMe/TCgj/vnn7hTXQYNcEb8FC6BTp7CjMsZkoSDHKA4GlkfcLgCOS7D/JcCYWHeISD+gH0Dt2rWTFV9i6V5GfNMmt7DQuHHu9FdjjAlIkC2KWB3kGnNHkfOBPODfse5X1cGqmqeqeTVq1EhiiDEUlu1IR6NHu7UiAP76V1i40JKEMSZwQSaKAiDyvMyawMronUTkFOB24CxV3RJgPP6kY9mOn36C88+HM86AV14pKuK3++7hxmWMyQlBJooZQH0RqSsiFYAewKjIHUTkaOAZXJJYHWAsJZMuZTtUYcQIaNgQXn8d7roLpk+3In7GmJQKLFGo6jbgKmAssBB4XVXni8hAETnL2+3fQGXgDRGZLSKj4jxd8NLxTKdly1w58Lp14YsvYMAASxLGmJQLdMKdqo4GRkdtuzPievospZYuZzqpwscfu1XmDjnE1Wg69lg3mc4YY0JgtZ6gaAC78EynsLqdFi92g9OnnlpUxK9lS0sSxphQWaKA8Aewt2+Hhx6CI490XUzPPGNF/IwxacNqPRUKcwD7zDNhzBg3Ye6pp6Bmesw7NMYYyPUWRZgD2H/84SbMAfTpA6++CqNGWZIwxqSd3E4UYQ1gT58OxxwDTz7pbnfr5qq9WhE/Y0wayq2up/zBReMRUJQkUlWq4/ff4Z//hEcegQMPhMMOS83rGmNMGeRWi6KwBVEolS2JyZPdYPVDD8Gll8L8+dChQ2pe2xhjyiC3WhQQXrG/woWFxo+HNm1S//rGGFNKuZcoUum991zhvr//Hdq2daXAd7NDbozJLLnV9ZQqa9a4ZUjPOguGDy8q4mdJwhiTgSxRJJOqO821YUN4800YOBCmTbP6TMaYjGY/cZNp2TK46CI4+mgYNgwaNw47ImOMKTNrUZTVjh0wdqy7fsgh8OmnMGWKJQljTNbI7hZFvHkTyfLtt+5U14kT3eXkk6FFi+Q9vzHGpIHsblEENW9i2za3JGnTpjB7tutmsiJ+xpgsld0tCghm3kSnTq6
7qXNnV4bjoIOS+/zGZImtW7dSUFDA5s2bww4lZ1SqVImaNWuyexKXSs6+RBHZ3ZTMrqYtW9wa1eXKQd++cPHFcO65Vp/JmAQKCgqoUqUKderUQexvJXCqytq1aykoKKBu3bpJe97s63qK7G5KVlfT1KnQvDk88YS7fc45rpCfffCNSWjz5s1Ur17dkkSKiAjVq1dPegsuc1sU0QPVhZJZ6O+33+COO+DRR1357/r1y/6cxuQYSxKpFcTxztwWRfRAdaFktSI+/dQV8XvkEejfH+bNg/bty/68xhiTYTI3UUBRyyH6koyV6rZtc2MSEye6Lqe99y77cxpjQjFy5EhEhK+//nrntgkTJtCpU6dd9uvTpw9vvvkm4Abib731VurXr0+TJk1o0aIFY8aMKXMs9913H/Xq1eOII45gbOEcrCitWrWiWbNmNGvWjIMOOoguXboA8Morr9C0aVOaNm3KCSecwJw5c8ocjx+Z2fWUPxhWT3TLlybTO++4In633eaK+M2fb/WZjMkCw4cP56STTmLEiBEMGDDA12P++c9/8sMPPzBv3jwqVqzIqlWrmDhxYpniWLBgASNGjGD+/PmsXLmSU045hW+++Yby5cvvst+nn36683rXrl3p3LkzAHXr1mXixIlUq1aNMWPG0K9fP6ZNm1ammPzIzG/BwrGJZK0lsWoVXH01vPGGG7S+8UZXn8mShDHJ88V1yV92uFozOOaRhLts3LiRKVOmMH78eM466yxfieL3339nyJAhLF26lIoVKwLwl7/8hW7dupUp3HfffZcePXpQsWJF6tatS7169Zg+fTrHH398zP03bNjAJ598wnPPPQfACSecsPO+li1bUlBQUKZ4/Mrcrqf9W5e9i0kVXnoJGjWCd9+Fe+91ZzhZET9jssY777xD+/btOfzww9l333358ssvi31Mfn4+tWvXZm8fXc7XX3/9zm6iyMv999//p31XrFhBrVq1dt6uWbMmK1asiPvcI0eOpF27djHjGDZsGB1StPhZbv9kXrbMzYnIy3Ozqxs0CDsiY7JXMb/8gzJ8+HCuu+46AHr06MHw4cNp3rx53LODSnrW0MMPP+x7X1Ut0esNHz6cvn37/mn7+PHjGTZsGJMnT/b92mWRe4misIhfhw6uiN+UKa7aa1QfoTEm861du5ZPPvmEefPmISJs374dEWHQoEFUr16ddevW7bL/zz//zH777Ue9evVYtmwZGzZsoEqVKglf4/rrr2f8+PF/2t6jRw9uvfXWXbbVrFmT5cuX77xdUFDAQXEqO6xdu5bp06czcuTIXbbPnTuXvn37MmbMGKpXr54wtqRR1Yy6HFO/surrVVU/aq0ltmiRaqtWqqA6YULJH2+MKZEFCxaE+vpPP/209uvXb5dtJ598sk6aNEk3b96sderU2Rnjd999p7Vr19b169erqurNN9+sffr00S1btqiq6sqVK/Wll14qUzzz5s3Tpk2b6ubNm3XJkiVat25d3bZtW8x9n3rqKe3du/cu277//ns97LDDdMqUKQlfJ9ZxB2ZqKb93M2+MYvumks+V2LYNHnjAFfH76it47jlX6dUYk9WGDx/O3/72t122de3alVdffZWKFSvy8ssvc9FFF9GsWTPOOecchg4dStWqVQG45557qFGjBo0aNaJJkyZ06dKFGjVqlCmexo0b061bNxo1akT79u154okndp7x1LFjR1auXLlz3xEjRtCzZ89dHj9w4EDWrl3LFVdcQbNmzcjLyytTPH6JxugzS2d5h1fRmd9sKNmDTj8dPvwQzj7bzYk44IBggjPG7GLhwoU0bNgw7DByTqzjLiJfqGqpMkv2jlFs3uwmzJUvD/36uUvXrmFHZYwxGSfzup78mDIFmjUrKuLXtaslCWOMKaXsShQbN8I117hFhDZvBmvyGhO6TOveznRBHO/sSRQTJ0KTJvD443DVVa6I36mnhh2VMTmtUqVKrF271pJFiqi3HkWlSpWS+rzZNUa
x556u6uuJJ4YdiTEGN2+goKCANWvWhB1Kzihc4S6ZMvusp7ffhq+/hn/8w93evt0mzhljTAxlOesp0K4nEWkvIotEJF9Ebo1xf0URec27f5qI1Cn2SXfbE3780a0y17UrjBwJf/zh7rMkYYwxSRdYohCR8sATQAegEdBTRBpF7XYJsE5V6wEPAw8U+8Sb9nSD1O+/D/fdB599ZkX8jDEmQEG2KFoA+aq6RFX/AEYAnaP26Qy84F1/E2gnxVXk+v57N2g9Zw7cequbK2GMMSYwQQ5mHwwsj7hdABwXbx9V3SYivwDVgZ8idxKRfkBhTfEtMnnyPKv0CsB+RB2rHGbHoogdiyJ2LIocUdoHBpkoYrUMokfO/eyDqg4GBgOIyMzSDshkGzsWRexYFLFjUcSORRERmVnaxwbZ9VQA1Iq4XRNYGW8fEdkNqAr8HGBMxhhjSijIRDEDqC8idUWkAtADGBW1zyjgQu/6OcAnmmnn6xpjTJYLrOvJG3O4ChgLlAeeVdX5IjIQVxd9FDAMeElE8nEtiR4+nnpwUDFnIDsWRexYFLFjUcSORZFSH4uMm3BnjDEmtbKn1pMxxphAWKIwxhiTUNomikDKf2QoH8fiBhFZICJzReRjETkkjDhTobhjEbHfOSKiIpK1p0b6ORYi0s37bMwXkVdTHWOq+PgbqS0i40Vklvd30jGMOIMmIs+KyGoRmRfnfhGRx7zjNFdEmvt64tIuth3kBTf4vRg4FKgAzAEaRe1zBfC0d70H8FrYcYd4LNoCe3rX++fysfD2qwJMAqYCeWHHHeLnoj4wC6jm3d4/7LhDPBaDgf7e9UbAd2HHHdCxOBloDsyLc39HYAxuDltLYJqf503XFkUw5T8yU7HHQlXHq+rv3s2puDkr2cjP5wLg/4BBwOZUBpdifo7FpcATqroOQFVXpzjGVPFzLBTY27telT/P6coKqjqJxHPROgMvqjMV2EdEDizuedM1UcQq/3FwvH1UdRtQWP4j2/g5FpEuwf1iyEbFHgsRORqoparvpzKwEPj5XBwOHC4iU0Rkqoi0T1l0qeXnWAwAzheRAmA0cHVqQks7Jf0+AdJ34aKklf/IAr7fp4icD+QBrQONKDwJj4WIlMNVIe6TqoBC5OdzsRuu+6kNrpX5qYg0UdX1AceWan6ORU/geVX9j4gcj5u/1URVdwQfXlop1fdmurYorPxHET/HAhE5BbgdOEtVt6QotlQr7lhUAZoAE0TkO1wf7KgsHdD2+zfyrqpuVdWlwCJc4sg2fo7FJcDrAKr6OVAJVzAw1/j6PomWronCyn8UKfZYeN0tz+CSRLb2Q0Mxx0JVf1HV/VS1jqrWwY3XnKWqpS6Glsb8/I28gzvRARHZD9cVtSSlUaaGn2OxDGgHICINcYkiF9dnHQX09s5+agn8oqo/FPegtOx60uDKf2Qcn8fi30Bl4A1vPH+Zqp4VWtAB8XkscoLPYzEWOE1EFgDbgZtVdW14UQfD57G4ERgiItfjulr6ZOMPSxEZjutq3M8bj7kL2B1AVZ/Gjc90BPKB34GLfD1vFh4rY4wxSZSuXU/GGGPShCUKY4wxCVmiMMYYk5AlCmOMMQlZojDGGJOQJQqTdkRku4jMjrjUSbBvnXiVMkv4mhO86qNzvJIXR5TiOS4Xkd7e9T4iclDEfUNFpFGS45whIs18POY6EdmzrK9tcpclCpOONqlqs4jLdyl63V6qehSu2OS/S/pgVX1aVV/0bvYBDoq4r6+qLkhKlEVxPom/OK8DLFGYUrNEYTKC13L4VES+9C4nxNinsYhM91ohc0Wkvrf9/Ijtz4hI+WJebhJQz3tsO28Ng6+8Wv8Vve33S9EaIA962waIyE0icg6u5tYr3mvu4bUE8kSkv4gMioi5j4j8t5Rxfk5EQTcReUpEZopbe+Jub9s1uIQ1XkTGe9tOE5HPveP4hohULuZ1TI6zRGHS0R4R3U4jvW2rgVNVtTnQHXgsxuMuBx5V1Wa
4L+oCr1xDd+BEb/t2oFcxr38m8JWIVAKeB7qr6pG4Sgb9RWRf4G9AY1VtCtwT+WBVfROYifvl30xVN0Xc/SZwdsTt7sBrpYyzPa5MR6HbVTUPaAq0FpGmqvoYrpZPW1Vt65XyuAM4xTuWM4Ebinkdk+PSsoSHyXmbvC/LSLsDj3t98ttxdYuifQ7cLiI1gbdV9VsRaQccA8zwypvsgUs6sbwiIpuA73BlqI8AlqrqN979LwBXAo/j1roYKiL/A3yXNFfVNSKyxKuz8633GlO85y1JnHvhylVErlDWTUT64f6uD8Qt0DM36rEtve1TvNepgDtuxsRlicJkiuuBVcBRuJbwnxYlUtVXRWQacAYwVkT64soqv6Cqt/l4jV6RBQRFJOb6Jl5toRa4InM9gKuAv5bgvbwGdAO+Bkaqqor71vYdJ24Vt/uBJ4CzRaQucBNwrKquE5HncYXvognwkar2LEG8JsdZ15PJFFWBH7z1Ay7A/ZrehYgcCizxultG4bpgPgbOEZH9vX32Ff9rin8N1BGRet7tC4CJXp9+VVUdjRsojnXm0QZc2fNY3ga64NZIeM3bVqI4VXUrrguppddttTfwG/CLiPwF6BAnlqnAiYXvSUT2FJFYrTNjdrJEYTLFk8CFIjIV1+30W4x9ugPzRGQ20AC35OMC3BfqhyIyF/gI1y1TLFXdjKuu+YaIfAXsAJ7Gfem+7z3fRFxrJ9rzwNOFg9lRz7sOWAAcoqrTvW0ljtMb+/gPcJOqzsGtjz0feBbXnVVoMDBGRMar6hrcGVnDvdeZijtWxsRl1WONMcYkZC0KY4wxCVmiMMYYk5AlCmOMMQlZojDGGJOQJQpjjDEJWaIwxhiTkCUKY4wxCf0/1V8VF1CcAd8AAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "def print_metrics(labels, scores):\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('True positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('True negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1])\n", + " print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1])\n", + " print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1])\n", + " print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1])\n", + " \n", + "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## and threshold along with the AUC\n", + " fpr, 
tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n", + "    auc = sklm.auc(fpr, tpr)\n", + "    \n", + "    ## Plot the result\n", + "    plt.title('Receiver Operating Characteristic')\n", + "    plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n", + "    plt.legend(loc = 'lower right')\n", + "    plt.plot([0, 1], [0, 1],'r--')\n", + "    plt.xlim([0, 1])\n", + "    plt.ylim([0, 1])\n", + "    plt.ylabel('True Positive Rate')\n", + "    plt.xlabel('False Positive Rate')\n", + "    plt.show()\n", + "\n", + "def score_model(probs, threshold):\n", + "    return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "probabilities = log_mod_5.predict_proba(pca_mod_5.transform(x_test))\n", + "scores = score_model(probabilities, 0.3)\n", + "print_metrics(y_test, scores)\n", + "plot_auc(y_test, probabilities)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For the most part, these results look good. The question remains: was the correct number of principal components used? " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Add more components to the model\n", + "\n", + "Now you will compute and evaluate a logistic regression model using the first 10 principal components. You will compare this model to the one created with 5 principal components. Execute the code below to transform the training features using the first 10 principal components. 
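As an aside, the probability-thresholding performed by the `score_model` helper above can be illustrated in isolation. This is a minimal sketch with synthetic predicted probabilities (the array values are invented for illustration, not taken from the lab's data):

```python
import numpy as np

def score_model(probs, threshold):
    # Assign the positive class (1) when the class-1 probability exceeds the threshold
    return np.array([1 if x > threshold else 0 for x in probs[:, 1]])

# Synthetic predicted probabilities for 4 cases: columns are P(class 0), P(class 1)
probs = np.array([[0.9, 0.1],
                  [0.6, 0.4],
                  [0.5, 0.5],
                  [0.2, 0.8]])

# Lowering the threshold from the conventional 0.5 to 0.3 labels more cases positive
print(score_model(probs, 0.5))  # [0 0 0 1]
print(score_model(probs, 0.3))  # [0 1 1 1]
```

Lowering the threshold trades precision for recall, which is why the lab scores with a threshold of 0.3 rather than 0.5 on these imbalanced labels.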
" + ] + }, + { + "cell_type": "code", + "execution_count": 38, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(699, 10)" + ] + }, + "execution_count": 38, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pca_mod_10 = skde.PCA(n_components = 10)\n", + "pca_mod_10.fit(x_train)\n", + "Comps_10 = pca_mod_10.transform(x_train)\n", + "Comps_10.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to define and fit a logistic regression model using the 10 components of the transformed features. " + ] + }, + { + "cell_type": "code", + "execution_count": 39, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "LogisticRegression(C=100, class_weight=None, dual=False, fit_intercept=True,\n", + "          intercept_scaling=1, max_iter=100, multi_class='ovr', n_jobs=1,\n", + "          penalty='l2', random_state=None, solver='liblinear', tol=0.0001,\n", + "          verbose=0, warm_start=False)" + ] + }, + "execution_count": 39, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "## Define and fit the logistic regression model\n", + "log_mod_10 = linear_model.LogisticRegression(C = 100) \n", + "log_mod_10.fit(Comps_10, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below scores the logistic regression model and displays performance metrics, the ROC curve, and the AUC. Execute this code and examine the result. 
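The fit-then-transform pattern used for the 10-component model can be sketched end to end. This is a minimal, self-contained example on synthetic data (the shapes, seed, and labeling rule are invented and stand in for the lab's dataset):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the lab's data: 200 cases, 35 features, binary label
rng = np.random.RandomState(0)
x_train = rng.normal(size=(200, 35))
y_train = (x_train[:, 0] + 0.5 * x_train[:, 1] > 0).astype(int)

# Fit PCA on the training features only, then project onto 10 components
pca_mod_10 = PCA(n_components=10)
comps_10 = pca_mod_10.fit_transform(x_train)
print(comps_10.shape)  # (200, 10)

# Fit the logistic regression model on the transformed features
log_mod_10 = LogisticRegression(C=100, solver='liblinear')
log_mod_10.fit(comps_10, y_train)

# New data must pass through the same fitted PCA before prediction
x_new = rng.normal(size=(5, 35))
preds = log_mod_10.predict(pca_mod_10.transform(x_new))
print(preds.shape)  # (5,)
```

The key design point is that the PCA projection is part of the fitted model: test cases are transformed with the PCA fit on the training data, never refit on test data.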
" + ] + }, + { + "cell_type": "code", + "execution_count": 40, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "                 Confusion matrix\n", + "                 Score positive    Score negative\n", + "True positive      145                72\n", + "True negative       19                64\n", + "\n", + "Accuracy  0.70\n", + " \n", + "           Positive      Negative\n", + "Num case     217.00         83.00\n", + "Precision      0.88          0.47\n", + "Recall         0.67          0.77\n", + "F1             0.76          0.58\n" + ] + }, + { + "data": { + "image/png": "[base64-encoded PNG data for the ROC curve image omitted]\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "probabilities = log_mod_10.predict_proba(pca_mod_10.transform(x_test))\n", + "scores = score_model(probabilities, 0.3)\n", + "print_metrics(y_test, scores)\n", + "plot_auc(y_test, probabilities)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All of the metrics have improved compared to the 5 component model. Apparently there is useful information in the first 10 components. \n", + "\n", + "But is this difference really significant? To find out, you will now perform cross validation on the result. Ideally, the fitting of the PCA model should be part of the cross validation process. However, at the risk of a small bias, this step is omitted for the sake of simplicity. Execute the code in the cell below to perform the cross validation and display the result. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 42, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "         Precision     Recall      AUC\n", + "Fold  1    0.686        0.652      0.769\n", + "Fold  2    0.716        0.676      0.786\n", + "Fold  3    0.570        0.557      0.748\n", + "Fold  4    0.690        0.633      0.746\n", + "Fold  5    0.672        0.626      0.782\n", + "Fold  6    0.770        0.698      0.818\n", + "Fold  7    0.650        0.640      0.691\n", + "Fold  8    0.781        0.714      0.816\n", + "Fold  9    0.705        0.650      0.767\n", + "Fold 10    0.676        0.672      0.717\n", + "----------------------------------------\n", + "Mean       0.692        0.652      0.764\n", + "Std        0.057        0.041      0.038\n" + ] + } + ], + "source": [ + "def print_format(f,x,y,z):\n", + "    print('Fold %1d    %4.3f        %4.3f      %4.3f' % (f, x, y, z))\n", + "\n", + "def print_cv(scores):\n", + "    fold = [x + 1 for x in range(len(scores['test_precision_macro']))]\n", + "    print('         Precision     Recall      AUC')\n", + "    [print_format(f,x,y,z) for f,x,y,z in zip(fold, scores['test_precision_macro'], \n", + "                                              scores['test_recall_macro'],\n", + "                                              scores['test_roc_auc'])]\n", + "    print('-' * 40)\n", + "    print('Mean       %4.3f        %4.3f      %4.3f' % \n", + "          (np.mean(scores['test_precision_macro']), np.mean(scores['test_recall_macro']), np.mean(scores['test_roc_auc'])))\n", + "    print('Std        %4.3f        %4.3f      %4.3f' % \n", + "          (np.std(scores['test_precision_macro']), np.std(scores['test_recall_macro']), np.std(scores['test_roc_auc'])))\n", + "\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "scoring = ['precision_macro', 'recall_macro', 'roc_auc']\n", + "\n", + "pca_mod = skde.PCA(n_components = 10)\n", + "pca_mod.fit(Features)\n", + "Comps = pca_mod.transform(Features)\n", + "scores = ms.cross_validate(log_mod_10, Comps, Labels, scoring=scoring,\n", + "                           cv=10, return_train_score=False)\n", + "print_cv(scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the AUC and its standard deviation obtained above to the AUC of the 5 component model. The difference does appear to be significant. 
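As a standalone illustration of how `cross_validate` reports one score per fold for each requested metric, here is a minimal sketch on synthetic data (the dataset and shapes are invented, not the lab's credit data):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

# Synthetic binary classification problem standing in for the lab's data
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

scoring = ['precision_macro', 'recall_macro', 'roc_auc']
model = LogisticRegression(C=100, solver='liblinear')

# 10-fold cross validation; each metric appears as a 'test_<name>' array
scores = cross_validate(model, X, y, scoring=scoring, cv=10,
                        return_train_score=False)

for name in scoring:
    vals = scores['test_' + name]
    print('%-16s mean %4.3f  std %4.3f' % (name, np.mean(vals), np.std(vals)))
```

Reporting the per-fold standard deviation alongside the mean, as the lab does, is what makes it possible to judge whether a difference between two models is larger than fold-to-fold noise.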
This difference supports the hypothesis that the first 10 components all contain useful information. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have applied principal component analysis to dimensionality reduction for supervised machine learning. The first components computed contain most of the available information. When faced with a large number of features, PCA is an effective way to make supervised machine learning models tractable. \n", + "\n", + "Specifically in this lab you have:\n", + "1. Computed PCA models with different numbers of components.\n", + "2. Compared logistic regression models with different numbers of components. In this case, using 10 components produced a significantly better model. Using 10 components is a useful reduction in dimensionality compared to the original 35 features. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/.ipynb_checkpoints/FeatureSelection-checkpoint.ipynb b/Module5/.ipynb_checkpoints/FeatureSelection-checkpoint.ipynb new file mode 100644 index 0000000..2885b0b --- /dev/null +++ b/Module5/.ipynb_checkpoints/FeatureSelection-checkpoint.ipynb @@ -0,0 +1,710 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Feature Selection\n", + "\n", + "**Feature selection** can be an important part of model selection. 
In supervised learning, including features in a model which do not provide information on the label is useless at best and may prevent generalization at worst.\n", + "\n", + "Feature selection can involve the application of several methods. Two important methods include:\n", + "1. Eliminating features with **low variance** and **zero variance**. Zero variance features are comprised of a single repeated value. Low variance features arise when most values are the same and there are few unique values. One way low variance features can arise is from dummy variables for categories with very few members. The dummy variable will be mostly 0s with very few 1s. \n", + "2. Training machine learning models with features that are **uninformative** can create a variety of problems. An uninformative feature does not significantly improve model performance. In many cases, the noise in the uninformative features will increase the variance of the model predictions. In other words, uninformative features are likely to reduce the ability of the machine learning model to generalize. \n", + "\n", + "****\n", + "**Note:** the second case of feature selection involves applying a selection statistic or hypothesis test multiple times. For a large number of features, this process is very likely to lead to false positive and false negative results. This outcome is known as the **multiple comparisons problem** in statistics.\n", + "\n", + "To understand this problem, consider the decision to keep a feature in a model as a hypothesis test. Any hypothesis test has some probability of both a false positive result and a false negative result. Consider a case where there are 40 uninformative features which are excluded from the model with 95% confidence. There will be an approximately 5% chance of accepting a feature which should be rejected. In this case we would expect about 2 uninformative features to be accepted because of these errors. 
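The arithmetic above can be checked with a small simulation. This sketch (invented data and seed, hypothetical 5% significance test) screens 40 purely random features against a random label and counts how many pass:

```python
import numpy as np
from scipy import stats

# 40 uninformative features and an unrelated binary label
rng = np.random.RandomState(42)
n_cases, n_features = 1000, 40
X = rng.normal(size=(n_cases, n_features))
y = rng.randint(0, 2, size=n_cases)

# "Select" any feature whose two-sample t-test is significant at the 5% level
false_positives = 0
for j in range(n_features):
    _, p = stats.ttest_ind(X[y == 0, j], X[y == 1, j])
    if p < 0.05:
        false_positives += 1

# Expected number of spurious selections is 0.05 * 40 = 2, even though
# every feature is pure noise
print(false_positives)
```

Runs with different seeds will select roughly 0 to 5 noise features, clustering around 2, which is exactly the multiple comparisons problem described above.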
\n", + "\n", + "You may well ask, if testing features for importance can fail with large numbers of features, what is the alternative? The most general and scalable alternative is to use regularization methods. Consider applying regularization methods to a linear model. In this case, the machine learning algorithm learns which features should be weighted highly and which should not. \n", + "****" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load the dataset\n", + "\n", + "You will now apply the aforementioned principles to the bank credit data set. \n", + "\n", + "As a first step, run the code in the cell below to load the required packages. " + ] + }, + { + "cell_type": "code", + "execution_count": 214, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "from sklearn import preprocessing\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "from sklearn import feature_selection as fs\n", + "from sklearn import metrics\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import scipy.stats as ss\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, load the preprocessed files containing the features and the labels. The preprocessing includes the following:\n", + "1. Clean missing values.\n", + "2. Aggregate categories of certain categorical variables. \n", + "3. Encode categorical variables as binary dummy variables.\n", + "4. Standardize numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 215, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(999, 35)\n", + "(999, 1)\n" + ] + } + ], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Eliminate low variance features\n", + "\n", + "As a first step in selecting features from this dataset you will remove features with low variance. The `VarianceThreshold` function from the Scikit Learn `feature_selection` package removes features whose variance falls below a specified threshold. For a Boolean (dummy) feature that takes one value with probability $p$, the variance is:\n", + "\n", + "$$Var(x) = p(1-p)$$\n", + "\n", + "In this case an 80%, or $p=0.8$, threshold is used. \n", + "\n", + "The `fit_transform` method applies the threshold to the variance of each feature and removes features with variance below the threshold. The `get_support` method returns a logical (`True`/`False`) mask showing which features are retained. \n", + "\n", + "Execute the code and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": 216, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "(999, 35)\n", + "[ True  True  True  True  True  True False  True False  True  True False\n", + " False False  True False False False False False  True False False  True\n", + " False False  True False  True False  True  True  True  True False]\n", + "(999, 18)\n" + ] + } + ], + "source": [ + "print(Features.shape)\n", + "\n", + "## Define the variance threshold and fit the threshold to the feature array. 
\n", + "sel = fs.VarianceThreshold(threshold=(.8 * (1 - .8)))\n", + "Features_reduced = sel.fit_transform(Features)\n", + "\n", + "## Print the support and shape for the transformed features\n", + "print(sel.get_support())\n", + "print(Features_reduced.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The number of features has been reduced from 35 to 18. Apparently, there are 17 low variance features in the original array. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Select k best features\n", + "\n", + "The low variance features have been eliminated. But the question remains: are all the remaining features informative? There are a number of methods used to determine the importance of features. Many machine learning models have specialized methods to determine feature importance specifically intended for those models. \n", + "\n", + "In this example, you will use a fairly general and robust method based on cross validation. The algorithm is straightforward: features are recursively removed, and cross validation is used to find the change in model performance, if any, to determine whether a feature should be deleted altogether. \n", + "\n", + "The code in the cell below performs the following processing:\n", + "1. Create the folds for the cross validation for feature selection. These folds should be independent of any other cross validation performed. \n", + "2. The logistic regression model is defined. \n", + "3. The `RFECV` function from the Scikit Learn `feature_selection` package is used to determine which features to retain using a cross validation method. Notice that AUC is used as the model selection metric as the labels are imbalanced. In this case the default metric, accuracy, is a poor choice. \n", + "4. The RFECV feature selector is fit to the data. \n", + "\n", + "Execute this code and examine the results."
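The four steps listed above can be sketched on synthetic data. This is a minimal, self-contained example; the dataset shape, seeds, and feature counts are invented and only mimic the 18 features kept above:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import make_scorer, roc_auc_score
from sklearn.model_selection import KFold

# Synthetic data: 10 informative + 8 noise features
X, y = make_classification(n_samples=300, n_features=18, n_informative=10,
                           n_redundant=0, random_state=0)

# 1. Independent folds for the feature-selection cross validation
feature_folds = KFold(n_splits=10, shuffle=True, random_state=988)

# 2. The estimator whose coefficients drive the recursive elimination
logistic_mod = LogisticRegression(C=10, solver='liblinear')

# 3-4. RFECV drops the weakest feature at each step and keeps the subset
# with the best cross-validated AUC
selector = RFECV(estimator=logistic_mod, cv=feature_folds,
                 scoring=make_scorer(roc_auc_score))
selector.fit(X, y)

print(selector.support_)   # boolean mask of retained features
print(selector.ranking_)   # rank 1 = retained; higher = eliminated earlier
```

The exact number of retained features depends on the random data, but the `support_` and `ranking_` attributes behave just as in the lab's cell below.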
+ ] + }, + { + "cell_type": "code", + "execution_count": 217, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([ True, False, True, True, True, True, True, True, True,\n", + " True, True, False, True, True, False, True, True, True])" + ] + }, + "execution_count": 217, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "## Reshape the Label array\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "\n", + "## Set folds for nested cross validation\n", + "nr.seed(988)\n", + "feature_folds = ms.KFold(n_splits=10, shuffle = True)\n", + "\n", + "## Define the model\n", + "logistic_mod = linear_model.LogisticRegression(C = 10, class_weight = {0:0.1, 0:0.9}) \n", + "\n", + "## Perform feature selection by CV with high variance features only\n", + "nr.seed(6677)\n", + "selector = fs.RFECV(estimator = logistic_mod, cv = feature_folds,\n", + " scoring = sklm.make_scorer(sklm.roc_auc_score))\n", + "selector = selector.fit(Features_reduced, Labels)\n", + "selector.support_ " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From the support you can see that some features are selected (True) and eliminated (False). \n", + "\n", + "Execute the code below to see the relative ranking of the features." + ] + }, + { + "cell_type": "code", + "execution_count": 218, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([1, 4, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1, 3, 1, 1, 1])" + ] + }, + "execution_count": 218, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "selector.ranking_" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the features which have been selected are shown with a rank of 1. The features eliminated are shown with higher numbers. \n", + "\n", + "The code in the cell below as uses the `transform` method applies the selector to the feature array. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 219, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "(999, 15)" + ] + }, + "execution_count": 219, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "Features_reduced = selector.transform(Features_reduced)\n", + "Features_reduced.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The features have been reduced from the 18 high variance features to 15. Three features have been found to be unimportant. \n", + "\n", + "The code in the cell below creates a plot of AUC (the metric) vs. the number of features. Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": 220, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5,0,'Number of features')" + ] + }, + "execution_count": 220, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEWCAYAAACXGLsWAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3Xl8VPW9//HXmyTsgQAJyL6D4AJKBBFXREutSu3ivlEV217b2lZb7e3PWnvbe7v3trXXui9VEbUVtCoqoK0ISpBNSEBWCVsCYQkEyPb5/XFOdBiyAZnMZPJ5Ph7zyMw533PO55yZzGe+33PO9yszwznnnKtNi3gH4JxzLvF5snDOOVcnTxbOOefq5MnCOedcnTxZOOecq5MnC+ecc3XyZOGSlqR+kkxSarxjORKS7pX0tzhu/78kbZe0tYb535C0TdJeSV0aOz4XH54skpSk9ZJKJWVGTV8cfoH2i1Nc/SVVSvpL1PRqv9glPS7pvyJed5f0iKQtkool5Un6qaR2jbUPyUxSb+D7wHAzO66a+WnA74ALzay9me04hm01yWTeXHmySG7rgKuqXkg6CWgTv3AAuB7YCVwpqdWRLCipMzCPYB/Gmlk6cAGQAQxs6ECTwVF8EfcFdphZQQ3zuwGtgeXHFFgDUMC/wxqJH+jk9hTBl3OVG4AnIwtIaiXpN5I+CZsWHpDUJpzXSdIrkgol7Qyf94pY9m1JP5M0N/yV/0Z0TaYa1wM/BsqAS45wf74HFAPXmtl6ADPbaGbfMbOltSz3NUmbw9rI98PYj5NUEtmMImlUuK9p0SsIm4amSXoy3NflkrIj5pukQRGvP60RSTpXUr6kH0gqCOP4oqSLJK2SVCTpR1GbbC3puXBbH0oaEbHuHpJeDGNdJ+nbUXG+IOlvkvYAN1azLx3D/SiUtEHSjyW1kDQBeBPoETYxPR613BBgZfhyl6TZ4fTjJb0Z7sdKSZdHLPMFSYsk7ZG0UdK9Eav8V8S69koaq6gmuOjaR/iZ+7mk
uUAJMCDcn6ra5iYFzWgpYflBkt6RtFtB09pz0cfD1Y8ni+Q2H+ggaVj4z3MFEN0W/ktgCDASGAT0BO4J57UAHiP4tdkH2A/8OWr5q4HJQFegJXBHTcFIOgvoBUwFpnFoIquPCcDfzazyCJc7DxgMXAjcJWmCmW0F3gYujyh3LTDVzMpqWM+lBLFnADM4/FjU5jiCX+RVx/ehcHujgLOAeyQNiCg/CXge6Aw8A7wkKS38Jf0ysCRc1/nA7ZI+F7XsC2GcT1cTy5+AjsAA4ByC92Gymb0FfB7YHDYx3Ri5kJmtAk4IX2aY2fiw+e/NMMauBDXZv0iqKrcvXH8G8AXgG5K+GM47O2Jd7c1sXi3HL9J1wBQgHdgAPAGUE3x+TyF4n28Oy/4MeAPoRPDZ+1M9t+GieLJIflW1iwuAPGBT1QxJAm4BvmtmRWZWDPwCuBLAzHaY2YtmVhLO+znBl0ukx8xslZntJ0gAI2uJ5QbgNTPbSfDl8nlJXY9gX7oAW46gfJWfmtk+M1tGkPyqmuaeIPjCJkymVxEcr5q8a2avmllFWG5ELWWjlQE/DxPRVCAT+F8zKzaz5QTNOidHlF9oZi+E5X9HkGhOB04DsszsPjMrNbO1BInnyohl55nZS2ZWGb4vn4r40XB3uO31wG8JvoCPxsXAejN7zMzKzexD4EXgKwBm9raZLQtjWQo8y+GfoSP1uJktN7NygmT6eeD28D0uAH7PZ8ejjODHTg8zO2Bm7x7jtpstP7GU/J4iqO73J6oJCsgC2gILg7wBgICqKnxbgn+8iQS/zADSJaWEX5gAkVfMlADtqwsibNr6KuEvPjObJ+kTgprJHwh+GQKkRTyvel31S38H0L3OPT7cxojnG4CTwufTgQfCX/RDgN1m9kEt64ne19aSUsMvrbrsiDhmVV/g2yLm7+fQY/dpzGZWKSkf6AEYQTPRroiyKcC/q1u2GpkENcANEdM2ENRSjkZfYExUPKmESVfSGOB/gBPD7bYiqDEdi8j960vwGdkS8RluEVHmBwS1iw8k7QR+a2aPHuP2myWvWSQ5M9tAcKL7IuDvUbO3E3xJnWBmGeGjo5lVfWl9HxgKjDGzDnzWbCCO3GVAB4Imiq0KLsvsyWdNUVsIkkK/qOX689kX21vAZTryk5q9I573ATYDmNkBgtrQNQS/rGurVdSlhCDxVjnsSqIj9GnM4f72Ioh7I7Au4v3KMLN0M7soYtnaupLezme/tqv0IaLGeYQ2Au9ExdPezL4Rzn+GoMmut5l1BB7gs89PdXHuo+7jGLncRuAgkBmx/Q5mdgKAmW01s1vMrAdwK8Hnb1A163R18GTRPNwEjDezfZETw7b/h4DfVzUHSeoZ0f6dTpBMdim4EuknxxDDDcCjBL/qR4aPccBISSeFv7pfBH4uqUvYPn8VMBx4LVzH7wgSzhOS+kbE+ztJJ1Oz/yepbdiOPhmIPMn5JMFJ4Es5/HzOkVgMXC0pRdJEjr2pZZSkL4Undm8n+EKcD3wA7JH0Q0ltwu2dKOm0+qw0PM7TCI5zengcv8fR7/srwBBJ14XvWZqk0yQNC+enA0VmdkDSaIKaZJVCoJLg3EmVxcDZkvpI6gjcXcf+bCE4J/FbSR3CE/UDJZ0DIOmr+uyijJ0EiaaihtW5WniyaAbMbI2Z5dQw+4fAamB+ePXMWwS1CQiah9oQ/BqdD7x+NNuXVHUi9g/hL72qx8JwnTeERb8JFAFLgQLgNuALZrYt3I8i4AyCX8bvSyoGZgG7w32oyTvh/FnAb8zsjaoZZjaX4Avrw6orrI7Sdwiu7tpFUFN56RjWBUET2RUEX3DXAV8ys7Lwy/4SgmS7juC9eZjghHV9fYvgF/xa4F2CX/9H1TQTnsu6kOAcwWaCprpfEjQ3QfCe3he+V/cQJKqqZUsIzoPNlbRL0ulm9iZBMl8KLCRIRnW5nqCJawXB8XqBz5orTyP4rOwlqOF8x8zWHc2+NnfywY9ccxdeAvqM
mT0c71icS1SeLFyzFjbfvEnQpl4c73icS1TeDOWaLUlPEDS73e6Jwrnaec3COedcnbxm4Zxzrk5Jc1NeZmam9evXL95hOOdck7Jw4cLtZpZVV7mkSRb9+vUjJ6emq0Odc85VR9KGukt5M5Rzzrl6iGmykDQx7LJ4taS7aihzuaQVCrp8fiZi+q/CabmS/qiIjl+cc841rpg1Q4W9W95P0NtpPrBA0gwzWxFRZjDB7fzjzGxnRJcTZxB0BVHVhcO7BN0nvB2reJ1zztUsljWL0cBqM1trZqUE3TJPiipzC3B/2GU1EaNzGUGXzFW9VKZxaA+dzjnnGlEsk0VPDu1KOJ/Du0EeQtAJ2VxJ88MO2AgHQZlD0BPpFmCmmeVGb0DSFEk5knIKCwtjshPOOedimyyqO8cQfQdgKsEIZucSDDzzsKSMsAvhYQTdMvcExks6O2pZzOxBM8s2s+ysrDqv/HLOOXeUYpks8jl0HIGq/vijy0wPe9NcRzC+72CCsQ/mm9leM9tL0EX16TGM1TnnXC1ieZ/FAmCwpP4EA6tcyaF92UPQjfNVwOOSMgmapdYS9G9/i6T/JqihnEPQXbZzzlWrstLYV1pO8YHgsfdgGXsOVL0uo/hAOQLGDcrkhB4d8Assj0zMkoWZlUu6DZhJMOzjo2a2XNJ9QI6ZzQjnXShpBcGAJHea2Q5JLwDjgWUETVevm9nLsYrVORd/ZkbxwXJ2l5Sxe3/w2BU+37W/9JAv/ejnew6UsfdgOfXt6q57x9aMP74rE4Z1Y+zALrROS4ntziWBpOlIMDs72/wObucSh5mxNH83BcUHwy/+0mqSQBl7wnl7DpRTUVnz91FaikhvnUZ661Tat0olvXXqp687hH8jp30277Pnew+W83ZeIW/lbuPd1dspKa2gTVoKZw7OZMKwrpx3fFe6prduxKMUf5IWmll2neU8WTjnGlre1j3cM305H6wrOmS6BB3bpNGxTRoZbdLo0CaNjLYtyaia1jac9unrlp+Wb53WokGbjg6UVTB/7Q5m5RYwK3cbm3cfAGBE7wwmHN+V8cO6Mrx7wzZXlZSWs7FoP58UlbBhxz5KKyoZkNmOAVnt6dulLa1SG7+G48nCOdfodu8v4/dvruKp+Rvo0DqV710whJG9OwVf+G3TSG+VSosWiXeuwMzI3VLMrNxtvJVXwJKNuwDo0bE144d15fxh3Rg7oO7mqspKY1vxAT7ZUcInRSVsLAr+Bo/9bN97sMZlWwh6dWrLgKx2DMhsH/zNasfArPZ0TW8Vs3Msniycc42mstJ44cN8fvlaHjtLSrlmTF++f+EQMtq2jHdoR6Wg+ABz8gqYlVvAvz/ezv6yCtq2TOHMQZlMGNaN4T06sHnX/sMSwsad+yktr/x0PS0EPTLa0KdzW/p0bkvv8G/VIzVFrNu+j7WF+1hbuJc14fN12/dyoOyz9bRvlUr/zHaHJZIBme1p0/LYaiOeLJxzjWJp/i7umb6cxRt3MapvJ3566Qmc2LNjvMNqMAfKKpi3dgezcrcxK7eALWFzVZX0Vqn06dK22oTQs1Mb0lKO/A6Fykpjy54DrC3c+2kiWRsmkk279h9StkfH1pwxKJPffHXEUe2fJwvnXEwV7Svl1zNXMnXBJ3Rp14ofXXQ8l53SM6kvSTUzVmzZw4YdJfTqFNQYOrZJa9R93l9aEdRGtn+WSDq3a8U9lww/qvXVN1kkzXgWzjVlK7cW06ltGl07JP6VOBWVxrMffMJv3lhJ8YFybhrXn29PGEyH1mnxDi3mJHFCj46c0CN+Nac2LVMY3qMDw3t0aNTterJwLs5e/2gr//HMh7QQfOGk7kwe158RvTPiHVa1Fm4o4p7py1m+eQ9jB3Thp5NOYEi39HiH5RqBJwvn4mjOygK+9eyHnNyrIyN7Z/B8Tj4vLd7MqL6dmDyuHxNPOI7Uo2jzbmgFxQf45WsrefHDfLp3bM2frz6FL5zUPambnNyhPFk4FyfvrdnO
159ayNDj0nl88mg6tknjexcM4fmcfJ6Yt57bnllE946tuW5sX646rQ+d2jX+lUVlFZU8OW8Df3hzFQfKK/jmuQP5j/MG0a6Vf3U0N36C27k4WLihiOse+YBendowdcpYOkclgopKY05eAY+9t465q3fQOq0Fl53Si8nj+jVas897a7Zz74zlrNq2l3OGZPGTS4YzIKt9o2zbNR6/Gsq5BLUsfzdXPzSfzPRWPHfr6XV2L5G3dQ+Pz13PPxZt4mB5JWcNzmTyuH6cO6Rrg97gVrSvlFXbilm1rZh3P97OGyu20btzG+65+AQmDOvqTU5JypOFcwkob+sernxwPu1bpTLt1rH0yGhT72WL9pXy7Aef8OS89Wzbc5D+me248Yx+fHlUL9ofQbPQ3oPlQVLYWszKMDms2raXwuLP7i7OaJvG5DP6c+s5A7yTvSTnycK5BLOmcC9X/HUeqS1aMO3WsfTp0vao1lNWUcmry7bw2Nz1LN64i/RWqVx+Wm9uPKMfvTt/ts4DZRWsKdzLqm3FrNxa9bf4kJu62qSlMKRbe4Z0S2focemf/o1l9xIusXiycC6BbCwq4asPzKO8spLnbh3LwAZq+//wk508Nnc9ry3bQqUZ44/vSlpKC1ZuK2b99n1UdeKaliIGZrX/LCF0C/726tQmIftqco3Hb8pzLkFs2b2fqx6az4HyCqZOOb3BEgXAqX06cWqfTmy9aBhPzV/PCwvzadcylSHd0rn45B4M7ZbO0OPa07dLu6PqdsK5Kl6zcC6GCooPcOVf51NYfJCnbxnDyb0S82Y713x5zcK5OCvaV8p1D3/A1j0HeOqm0Z4oXJPm9VLnYmD3/jKuf/R91u/Yx8PXZzOqb+d4h+TcMfFk4VwD23ewnMmPfcDKrcU8cN0ozhiUGe+QnDtm3gzlXAM6UFbBTU8sYEn+bu6/+lTOG9o13iE51yC8ZuFcAzlYXsGtTy3k/XVF/O7yEUw88bh4h+Rcg/Fk4VwDKKuo5FvPLOKdVYX88ksnM2lkz3iH5FyDimmykDRR0kpJqyXdVUOZyyWtkLRc0jMR0/tIekNSbji/Xyxjde5oVVQa35u2hDdWbOOnl57A5af1jndIzjW4mJ2zkJQC3A9cAOQDCyTNMLMVEWUGA3cD48xsp6TIBt4ngZ+b2ZuS2gOVOJdgKiuNu15cystLNnP354/nhjP6xTsk52Iilie4RwOrzWwtgKSpwCRgRUSZW4D7zWwngJkVhGWHA6lm9mY4fW8M43SuXsyMwuKDbNxZwidFJWws2s+Hn+zk7ZWF3D5hMLeeMzDeIToXM7FMFj2BjRGv84ExUWWGAEiaC6QA95rZ6+H0XZL+DvQH3gLuMrOKyIUlTQGmAPTp0ycW++CameIDZWws2s/GnSVsLAofO/fzSVEJ+TtLOFB2aAW3a3orvnfBEL41flCcInauccQyWVTXO1l03yKpwGDgXKAX8G9JJ4bTzwJOAT4BngNuBB45ZGVmDwIPQtDdR8OF7pLdhh37mLt6x6e1hPyi4O/OkrJDyqW3SqV357YMzGrHeUOz6N25Lb07taV357b06tTGu+92zUYsk0U+EHmmrxewuZoy882sDFgnaSVB8sgHFkU0Yb0EnE5UsnDuaMzO28ZtzyyipLSCtBTRM6MNvTu35fMndadPmAz6dG5L785t6Ngmzbvqdo7YJosFwGBJ/YFNwJXA1VFlXgKuAh6XlEnQ/LQW2AV0kpRlZoXAeMB7CXTH7Ml567l3xnKG9+jA/155Cv26tCPFu+h2rk4xSxZmVi7pNmAmwfmIR81suaT7gBwzmxHOu1DSCqACuNPMdgBIugOYpeBn3ULgoVjF6pJfRaXxi1dzeeTddUwY1pU/XnUKbVt6BwbO1Zd3Ue6S3v7SCm5/bhEzl2/jxjP68f8uHu61CedC3kW5cwTjSdzyRA5LN+3mJ5cMZ/K4/vEOybkmyZOFS1qrthUz+bEFFO0r5cHrsrlgeLd4h+Rck+XJ
wiWluau38/W/LaR1WgrTbh3LSb06xjsk55o0TxYu6UzL2ciP/r6MAVnteGzyaHpmtIl3SM41eZ4sXNIwM377xir+PGc1Zw3O5P5rTqVD67R4h+VcUvBk4ZLCwfIK7nx+KTOWbObK03rzsy+eSFqK98DvXEPxZOGavJ37SpnyVA4L1u/kBxOH8o1zBvpd1841ME8Wrklbv30fkx9fwKZd+/nTVadwyYge8Q7JuaTkycI1WQvWFzHlyeBGzGduHkN2v85xjsi55OXJwjVJM5Zs5o5pS+jZqQ2P3Xga/TLbxTsk55KaJwvXpJgZf3l7Db+euZLR/Trz1+tG0aldy3iH5VzS82Thmgwz457py3lq/gYmjezBr75yMq1SfTwJ5xqDJwvXJJgZ//XPXJ6av4FbzurPjy4a5lc8OdeI/EJ01yT87s1VPPLuOm48o58nCufiwJOFS3j3z1nNn2av5ors3txz8XBPFM7FgScLl9AefXcdv565kkkje/CLL51ECx+Hwrm48GThEtbUDz7hvldW8LkTuvHbr47wAYuciyNPFq5O8RhN8aVFm7j7H8s4d2gWf7zqFFK9nyfn4sr/A12N9h0s5wcvLOG0n8/iH4vyGy1pvP7RFr7//BJO79+FB64d5ZfHOpcAPFm4an20aTeX/Oldnl+YT8c2qXz3uSXc/EQOW3cfiOl25+QV8K1nFzGiV0ceviGb1mmeKJxLBJ4s3CHMjEffXceX/vIeJaUVPHPz6bzx3XP48ReGMXfNdi74/TtMy9kYk1rGe+HodkOPS+exyaNp18pvA3IuUcQ0WUiaKGmlpNWS7qqhzOWSVkhaLumZqHkdJG2S9OdYxukCO/Ye5KYncrjvlRWcPSSLV79zFmMHdiGlhbj5rAG89p2zGda9Az94YSk3PBb09NpQFm4o4uYnc+jbpS1Pfm0MHdv4oEXOJRLFqh1aUgqwCrgAyAcWAFeZ2YqIMoOBacB4M9spqauZFUTM/18gCygys9tq2152drbl5OTEYE+ah/dWb+f25xaza38Z/3nRMK4f27fa+xkqK42n5m/gl6/n0ULi7ouO5+rRfY7p3odl+bu5+qH5ZKa34rlbT6dreutj2RXn3BGQtNDMsusqF8uaxWhgtZmtNbNSYCowKarMLcD9ZrYTICpRjAK6AW/EMMZmr6yikl+9nsc1j7xPeutUXvrmOG44o1+NX/4tWogbzujHzNvP5uReHfnPf3zENQ+/z8aikqPaft7WPVz36Pt0aJPG0zeP8UThXIKKZbLoCWyMeJ0fTos0BBgiaa6k+ZImAkhqAfwWuLO2DUiaIilHUk5hYWEDht48bCwq4fK/zuMvb6/hiuzevPytMxneo0O9lu3duS1P3zyGX1x2Ekvzd/O5P/yLJ95bT2Vl/Wuqawv3cu3DH9AqtQXP3nI6PTLaHO2uOOdiLJZnEKv7aRr9TZIKDAbOBXoB/5Z0InAt8KqZbaytecPMHgQehKAZqgFibjZeWbqZu19cBnDUI8xJ4uoxfThnaBZ3/30ZP5mxnH8u28KvvnxyneNLbCwq4ZqH38fMePrmsfTp0vao9sM51zhimSzygd4Rr3sBm6spM9/MyoB1klYSJI+xwFmSvgm0B1pK2mtm1Z4kd/VXUlrOfS+vYOqCjZzSJ4M/XnkKvTsf2xd1z4w2PDH5NJ5fmM/PXlnBxP/9F3dcOJTJ4/pXe9f1lt37ufrh+ZSUVjB1yukM6tr+mLbvnIu9WJ7gTiU4wX0+sIngBPfVZrY8osxEgpPeN0jKBBYBI81sR0SZG4FsP8F97FZs3sO3nv2Qtdv38c1zB3L7hCGkNfCd0Vt3H+A//7GMWXkFnNong199ZcQhyaCw+CBXPDiPgj0HefrmMYzondGg23fOHZm4n+A2s3LgNmAmkAtMM7Plku6TdGlYbCawQ9IKYA5wZ2SicA3DzHjivfV88S9zKT5Qzt9uGsOdnzu+wRMFwHEdW/PwDdn8/ooRrCncx0V//DcP
vLOG8opKdpWUct0j77Nl1wEem3yaJwrnmpCY1Swam9csqrdzXyl3vrCUt3K3cd7QLH7z1RF0ad+qUbZdUHyA//fSR8xcvo0RvTpSabByWzGP3nAaZw7ObJQYnHO1q2/Nwm+RTWLz1uzgu88tpmhfKfdcPJzJ42q+JDYWuqa35oFrR/HPZVu4Z/py9uwv48HrR3micK4J8mSRpN5bvZ1rHnmf/l3a8fANZ3Biz45xiUMSF5/cgzMHZbJ970EGdU2PSxzOuWPjySJJ/X3RJtJbpfLyt85MiD6WMtq2JKNty3iH4Zw7St6RYBKqrDTm5BVwztCuCZEonHNNnyeLJLQkfxc79pUyYVjXeIfinEsSniyS0Oy8AloIzhmSFe9QnHNJwpNFEpqVW0B2385+jsA512A8WSSZLbv3s2LLHsZ7E5RzrgF5skgys/OCXt7PP96ThXOu4XiySDKzcwvo3bmNd87nnGtQniySyP7SCt5dvZ3zj+/WqHdqO+eSnyeLJPLemu0cLK/kfD9f4ZxrYJ4sksisvALatUxhdP/O8Q7FOZdkPFkkCTNjdm4BZw3OolVqSrzDcc4lGU8WSWLFlj1s3XPAL5l1zsWEJ4skMTs3uGT2vKGeLJxzDc+TRZKYlVfAiN4ZZKU3zsBGzrnmxZNFEigsPsiS/F1+I55zLmY8WSSBOSsLMMMvmXXOxYwniyQwO7eA4zq0Znj3DvEOxTmXpDxZNHEHyyv498eFjB/W1e/ads7FjCeLJu6DdUXsK63w8xXOuZiKabKQNFHSSkmrJd1VQ5nLJa2QtFzSM+G0kZLmhdOWSroilnE2ZbNyC2iV2oIzBmbGOxTnXBKL2QDNklKA+4ELgHxggaQZZrYiosxg4G5gnJntlFT187gEuN7MPpbUA1goaaaZ7YpVvE2RmTErbxvjBmXSpqXfte2ci51Y1ixGA6vNbK2ZlQJTgUlRZW4B7jeznQBmVhD+XWVmH4fPNwMFgI8RGmVN4V42Fu1nvDdBOediLJbJoiewMeJ1fjgt0hBgiKS5kuZLmhi9EkmjgZbAmmrmTZGUIymnsLCwAUNvGmaFd217snDOxVosk0V1l+ZY1OtUYDBwLnAV8LCkjE9XIHUHngImm1nlYSsze9DMss0sOyur+VU8ZuUWMLx7B3pktIl3KM65JBfLZJEP9I543QvYXE2Z6WZWZmbrgJUEyQNJHYB/Aj82s/kxjLNJ2lVSSs6GIr8RzznXKGpMFpI+J+kr1Uy/RtIF9Vj3AmCwpP6SWgJXAjOiyrwEnBeuN5OgWWptWP4fwJNm9nz9dqV5eWdVIZXmTVDOucZRW83ip8A71UyfBdxX14rNrBy4DZgJ5ALTzGy5pPskXRoWmwnskLQCmAPcaWY7gMuBs4EbJS0OHyPrvVfNwKzcArq0a8mIXhl1F3bOuWNU26Wzbc3ssLPGZrZVUrv6rNzMXgVejZp2T8RzA74XPiLL/A34W3220RyVV1Ty9soCLjzhOFq08Lu2nXOxV1vNorWkw5KJpDTAz6jG0cINO9lzoNzv2nbONZraksXfgYciaxHh8wfCeS5OZucVkJYizhzsd2075xpHbcnix8A2YIOkhZI+BNYDheE8Fyez8go4fUAX0lunxTsU51wzUeM5i/AE9V2SfgoMCievNrP9jRKZq9aGHftYXbCXa8b0iXcozrlmpMZkIelLUZMMyJC02MyKYxuWq4nfte2ci4faroa6pJppnYGTJd1kZrNjFJOrxey8AgZ1bU/fLvW6IM055xpEbc1Qk6ubLqkvMA0YE6ugXPWKD5Tx/rodfG1c/3iH4pxrZo64uw8z2wD4mdU4ePfj7ZRVmDdBOeca3REnC0nHAwdjEIurw6y8Ajq2SWNU307xDsU518zUdoL7ZQ7vJbYz0B24NpZBucNVVhpz8go4d2gWqSk+Gq5zrnHVdoL7N1GvDSgiSBjXAvNiFZQ73JL8XezYV+pNUM65uKjtBPennQiGnfhdTdDB3zrg
xdiH5iLNyi0gpYU4Z0jzG7fDORd/tTVDDSHoVvwqYAfwHCAzO6+RYnMRZuUVMKpvJzLatox3KM65ZqgUHoQbAAAUcElEQVS2xu884HzgEjM708z+BFQ0Tlgu0uZd+8ndssc7DnTOxU1tyeLLwFZgjqSHJJ1P9UOluhibnRfcte2j4jnn4qXGZGFm/zCzK4DjgbeB7wLdJP2fpAsbKT5HkCz6dG7LwKz28Q7FOddM1XkNppntM7OnzexignG0FwN3xTwyB8D+0grmrt7O+cO6InnFzjkXH0d0wb6ZFZnZX81sfKwCcod6b812DpZXcv7x3eIdinOuGfO7uxLcrLwC2rVMYXT/zvEOxTnXjHmySGBmxuzcAs4ekkXLVH+rnHPx499ACWz55j1s3XPA79p2zsVdTJOFpImSVkpaLanak+KSLpe0QtJySc9ETL9B0sfh44ZYxpmoZucVIMG5Qz1ZOOfiq7a+oY6JpBTgfuACIB9YIGmGma2IKDMYuBsYZ2Y7JXUNp3cGfgJkE/RJtTBcdmes4k1Es/IKGNErg6z0VvEOxTnXzMWyZjGaYMzutWZWCkwFJkWVuQW4vyoJmFlBOP1zwJvh1Vc7gTeBiTGMNeEUFh9kycZdfte2cy4hxDJZ9AQ2RrzOD6dFGgIMkTRX0nxJE49gWSRNkZQjKaewsLABQ4+/OSur7tr2S2adc/EXy2RR3R1k0eNjpAKDgXMJOix8WFJGPZfFzB40s2wzy87KSq7eWGfnFtC9Y2uGdU+PdyjOORfTZJEP9I543QvYXE2Z6WZWZmbrgJUEyaM+yyatg+UV/PvjQsYf73dtO+cSQyyTxQJgsKT+kloSdHc+I6rMS8B5AJIyCZql1gIzgQsldZLUCbgwnNYsvL+2iH2lFd5xoHMuYcTsaigzK5d0G8GXfArwqJktl3QfkGNmM/gsKawg6P78TjPbASDpZwQJB+A+MyuKVayJZnZeAa3TWnDGwMx4h+Kcc0AMkwWAmb0KvBo17Z6I5wZ8L3xEL/so8Ggs40tEZsasvG2MG5hJ67SUeIfjnHOA38GdcD4u2MvGov2M9yYo51wC8WSRYF5ZugUJJvgls865BOLJIoGYGTMWb+KMgV3o1qF1vMNxzrlPebJIIEvyd7N+RwmTRhx2/6FzzsWVJ4sEMn3xJlqmtmDiScfFOxTnnDuEJ4sEUV5RyctLtjB+aFc6tE6LdzjOOXcITxYJYt7aHWzfe5AvntIj3qE459xhPFkkiJcWbSa9daqPXeGcS0ieLBLAgbIKZi7fyudPPM5vxHPOJSRPFglgVm4Bew+WM2mkXwXlnEtMniwSwPTFm+ia3orTB3SJdyjOOVctTxZxtrukjLdXFnLJiB6ktPDuyJ1zicmTRZy99tEWSisqmTTSr4JyziUuTxZxNn3xZgZktuOknh3jHYpzztXIk0Ucbd19gPnrdnDpyB4+Ip5zLqF5soijl5dsxgy/Cso5l/A8WcTRS4s3MaJXR/pntot3KM45VytPFnGyuqCY5Zv3eK3COdckeLKIk+mLN9NCcPGI7vEOxTnn6uTJIg7MjOmLNzNuUCZd032QI+dc4vNkEQeLNu7ik6ISLh3h91Y455oGTxZxMGPx5mCQoxN9kCPnXNMQ02QhaaKklZJWS7qrmvk3SiqUtDh83Bwx71eSlkvKlfRHJcmNCOUVlbyydDMThnUl3Qc5cs41EamxWrGkFOB+4AIgH1ggaYaZrYgq+pyZ3Ra17BnAOODkcNK7wDnA27GKt7HMXbOD7XtL/Soo51yTEsuaxWhgtZmtNbNSYCowqZ7LGtAaaAm0AtKAbTGJspFNX7QpHOQoK96hOOdcvcUyWfQENka8zg+nRfuypKWSXpDUG8DM5gFzgC3hY6aZ5UYvKGmKpBxJOYWFhQ2/Bw1sf2kwyNFFJ3anVaoPcuScazpimSyqO8dgUa9fBvqZ2cnAW8ATAJIGAcOAXgQJZryksw9bmdmDZpZtZtlZWYn/S/2t
3G3sK61gko+z7ZxrYmKZLPKB3hGvewGbIwuY2Q4zOxi+fAgYFT6/DJhvZnvNbC/wGnB6DGNtFNMXb6Zbh1aM6e+DHDnnmpZYJosFwGBJ/SW1BK4EZkQWkBR5+/KlQFVT0yfAOZJSJaURnNw+rBmqKdlVUso7qwq41Ac5cs41QTG7GsrMyiXdBswEUoBHzWy5pPuAHDObAXxb0qVAOVAE3Bgu/gIwHlhG0HT1upm9HKtYG8Ory7ZSVmF+FZRzrkmKWbIAMLNXgVejpt0T8fxu4O5qlqsAbo1lbI1t+uJNDMxqxwk9OsQ7FOecO2J+B3cj2LxrP++vK2LSyJ4+yJFzrknyZNEIXl4SnNf3cbadc02VJ4tG8NLizYzsnUHfLj7IkXOuafJkEWOrthWTu2UPX/RahXOuCfNkEWPTF28ipYX4wsmeLJxzTZcnixiKHOQoK71VvMNxzrmj5skihj78ZCf5O/czyQc5cs41cZ4sYmj64s20Sm3BhSd0i3cozjl3TDxZxEhZRSWvLN3ChOHdfJAj51yT58kiRt5dvZ2ifaXeBOWcSwqeLGJk+qJNdGyTxrlDu8Y7FOecO2aeLGKgpLScN1Zs46KTjqNlqh9i51zT599kMfBWbgElpRXew6xzLml4soiB6Ys20b1ja0b36xzvUJxzrkF4smhgO/eV8s6qQi4d0YMWPsiRcy5JeLJoYP9ctoXySuNS7wvKOZdEPFk0sBmLNzO4a3uGd/dBjpxzycOTRQPK31nCB+uLmDSyhw9y5JxLKp4sGtDLS7YA+FVQzrmk48miAU1fvIlT+2TQu3PbeIfinHMNypNFA5mTV0De1mKvVTjnklJMk4WkiZJWSlot6a5q5t8oqVDS4vBxc8S8PpLekJQraYWkfrGM9Vis276P70xdxLDuHbg8u3e8w3HOuQaXGqsVS0oB7gcuAPKBBZJmmNmKqKLPmdlt1aziSeDnZvampPZAZaxiPRZ7D5Yz5ckcUlqIB68bRZuWKfEOyTnnGlwsaxajgdVmttbMSoGpwKT6LChpOJBqZm8CmNleMyuJXahHp7LS+P60xawp3Mufrz7Vz1U455JWLJNFT2BjxOv8cFq0L0taKukFSVVtOEOAXZL+LmmRpF+HNZVDSJoiKUdSTmFhYcPvQR3un7Oamcu38aOLhjFuUGajb9855xpLLJNFdTcaWNTrl4F+ZnYy8BbwRDg9FTgLuAM4DRgA3HjYysweNLNsM8vOyspqqLjr5a0V2/jdW6u47JSe3HRm/0bdtnPONbZYJot8IPJsby9gc2QBM9thZgfDlw8BoyKWXRQ2YZUDLwGnxjDWI7K6YC/ffW4xJ/TowH9/6SS/Ac85l/RimSwWAIMl9ZfUErgSmBFZQFL3iJeXArkRy3aSVFVdGA9EnxiPiz0HypjyVA4tU1vw1+uyaZ3mJ7Sdc8kvZldDmVm5pNuAmUAK8KiZLZd0H5BjZjOAb0u6FCgHigibmsysQtIdwCwFP9sXEtQ84qqy0vju1MV8sqOEp28eQ8+MNvEOyTnnGoXMok8jNE3Z2dmWk5MT02387o2V/HH2au6bdALXj+0X020551xjkLTQzLLrKud3cNfT6x9t4Y+zV3N5di+uO71vvMNxzrlG5cmiHlZtK+b705YwsncG90060U9oO+eaHU8WddhdUsaUJ3No2yqVB64d5Se0nXPNkieLWlRUGt+euohNu/bzf9ecynEdW8c7JOeci4uYXQ2VDH7zxkreWVXILy47iex+neMdjnPOxY3XLGrwytLN/N/ba7h6TB+uHtMn3uE451xcebKoxorNe7jz+aVk9+3EvZecEO9wnHMu7jxZRNm5r5QpT+XQoU0qf7n2VFqm+iFyzjk/ZxGhvKKS2579kII9B5n29bF0TfcT2s45B54sDvE/r+Uxd/UOfvWVkxnZOyPe4TjnXMLwNpbQS4s28fC767hhbF8fGtU556J4sgA+2rSbH764lNH9O/Pji4fHOxznnEs4
zT5ZbN97kClP5tClXUv+cs2ppKU0+0PinHOHafbnLFIkhnXvwO0ThpDZvlW8w3HOuYTU7JNFp3YteeTG0+IdhnPOJTRvc3HOOVcnTxbOOefq5MnCOedcnTxZOOecq5MnC+ecc3XyZOGcc65Oniycc87VyZOFc865OsnM4h1Dg5BUCGyIdxz1lAlsj3cQR6CpxQsec2NpajE3tXgh9jH3NbOsugolTbJoSiTlmFl2vOOor6YWL3jMjaWpxdzU4oXEidmboZxzztXJk4Vzzrk6ebKIjwfjHcARamrxgsfcWJpazE0tXkiQmP2chXPOuTp5zcI551ydPFk455yrkyeLGJDUW9IcSbmSlkv6TjVlzpW0W9Li8HFPPGKNimm9pGVhPDnVzJekP0paLWmppFPjEWdEPEMjjt9iSXsk3R5VJu7HWdKjkgokfRQxrbOkNyV9HP7tVMOyN4RlPpZ0Qxzj/bWkvPB9/4ekjBqWrfUz1Mgx3ytpU8R7f1ENy06UtDL8XN8V55ifi4h3vaTFNSzb+MfZzPzRwA+gO3Bq+DwdWAUMjypzLvBKvGONimk9kFnL/IuA1wABpwPvxzvmiNhSgK0ENxgl1HEGzgZOBT6KmPYr4K7w+V3AL6tZrjOwNvzbKXzeKU7xXgikhs9/WV289fkMNXLM9wJ31ONzswYYALQElkT/rzZmzFHzfwvckyjH2WsWMWBmW8zsw/B5MZAL9IxvVA1iEvCkBeYDGZK6xzuo0PnAGjNLuLv4zexfQFHU5EnAE+HzJ4AvVrPo54A3zazIzHYCbwITYxZoqLp4zewNMysPX84HesU6jiNRwzGuj9HAajNba2alwFSC9ybmaotZkoDLgWcbI5b68GQRY5L6AacA71cze6ykJZJek3RCowZWPQPekLRQ0pRq5vcENka8zidxkuCV1PyPlWjHGaCbmW2B4McF0LWaMol6vL9GUMOsTl2focZ2W9h09mgNTX2JeozPAraZ2cc1zG/04+zJIoYktQdeBG43sz1Rsz8kaDIZAfwJeKmx46vGODM7Ffg88B+Szo6ar2qWifu115JaApcCz1czOxGPc30l3PGW9J9AOfB0DUXq+gw1pv8DBgIjgS0EzTrREu4Yh66i9lpFox9nTxYxIimNIFE8bWZ/j55vZnvMbG/4/FUgTVJmI4cZHdPm8G8B8A+CKnqkfKB3xOtewObGia5Wnwc+NLNt0TMS8TiHtlU14YV/C6opk1DHOzzBfjFwjYUN59Hq8RlqNGa2zcwqzKwSeKiGWBLqGANISgW+BDxXU5l4HGdPFjEQtjc+AuSa2e9qKHNcWA5Jowneix2NF+Vh8bSTlF71nOCE5kdRxWYA14dXRZ0O7K5qSomzGn+FJdpxjjADqLq66QZgejVlZgIXSuoUNqFcGE5rdJImAj8ELjWzkhrK1Ocz1GiizqddVkMsC4DBkvqHNdQrCd6beJoA5JlZfnUz43acG/NsenN5AGcSVGWXAovDx0XA14Gvh2VuA5YTXH0xHzgjzjEPCGNZEsb1n+H0yJgF3E9w9cgyIDsBjnVbgi//jhHTEuo4EySyLUAZwS/Zm4AuwCzg4/Bv57BsNvBwxLJfA1aHj8lxjHc1Qdt+1ef5gbBsD+DV2j5DcYz5qfBzupQgAXSPjjl8fRHBFYtr4h1zOP3xqs9vRNm4H2fv7sM551ydvBnKOedcnTxZOOecq5MnC+ecc3XyZOGcc65Oniycc87VyZOFaxIkmaTfRry+Q9K9DbTuxyV9pSHWVcd2vqqgJ+I51cz7tYIein99FOsdWVOPqs41FE8Wrqk4CHwpQe6+/pSklCMofhPwTTM7r5p5txL0VHznUYQxkuBegXoLb6z0/39Xb/5hcU1FOcFYxN+NnhFdM5C0N/x7rqR3JE2TtErS/0i6RtIH4VgAAyNWM0HSv8NyF4fLp4S/+BeEndHdGrHeOZKeIbjpKzqeq8L1fyTpl+G0ewhu1nwguvYgaQbQDnhf0hWS
siS9GG53gaRxYbnRkt6TtCj8OzS86/g+4AoFYxtcoWAchzsi1v+RpH7hI1fSXwj6zOot6UJJ8yR9KOn5sD8zwmO1Itzv3xzpm+WSUGPdregPfxzLA9gLdCDox78jcAdwbzjvceArkWXDv+cCuwjGF2kFbAJ+Gs77DvCHiOVfJ/jxNJjgbtrWwBTgx2GZVkAO0D9c7z6gfzVx9gA+AbKAVGA28MVw3tvUcNd7Vczh82eAM8PnfQi6jSHc/6oxJSYAL4bPbwT+HLH8vUSM40DQFUS/8FEJnB5OzwT+BbQLX/8QuIdg/IyV8OlNuxnxfv/9Ef9Hat3pxLnEYGZ7JD0JfBvYX8/FFljYf5WkNcAb4fRlQGRz0DQLOpz7WNJa4HiCPndOjqi1dCRIJqXAB2a2rprtnQa8bWaF4TafJhjk5kh6u50ADA+7tALoEPYF1BF4QtJggu5k0o5gnVU2WDAWCQQDWA0H5obbagnMA/YAB4CHJf0TeOUotuOSjCcL19T8gaAJ5bGIaeWETaphp4EtI+YdjHheGfG6kkM//9H93hhBX1jfMrNDOu+TdC5BzaI61XV5faRaAGPN7JCEKOlPwBwzu0zBOClv17D8p8cj1DrieWTcIhhc6aroFYSdLp5P0LHebcD4I9sFl2z8nIVrUsysCJhGcLK4ynpgVPh8Ekf3i/urklqE5zEGEDTDzAS+oaC7eSQNCXv5rM37wDmSMsOT31cB7xxhLG8QfEETbndk+LQjQVMaBE1PVYoJhu+tsp5guE4UjJPev4btzAfGSRoUlm0b7mN7go4ZXwVuJziB7po5TxauKfotQXt7lYcIvqA/AMZQ86/+2qwk+FJ/jaDHzwPAw8AK4ENJHwF/pY7aeNjkdTcwh6BX0A/NrLrux2vzbSA7PLm8gqAXXQjG7f5vSXMJxo6uMoeg2WqxpCsIxlHpLGkx8A2CHlWri7WQIOk8K2kpQfI4niDxvBJOe4dqLipwzY/3Ouucc65OXrNwzjlXJ08Wzjnn6uTJwjnnXJ08WTjnnKuTJwvnnHN18mThnHOuTp4snHPO1en/A6+KnonFbEOvAAAAAElFTkSuQmCC\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.plot(range(1, len(selector.grid_scores_) + 1), selector.grid_scores_)\n", + "plt.title('Mean AUC by number of features')\n", + "plt.ylabel('AUC')\n", + "plt.xlabel('Number of features')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the change in AUC is not that great across a range of features around the 15 selected. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply nested cross validation to create model\n", + "\n", + "The next step is to use nested cross validation to optimize the model hyperparameter and test the model performance. The model is constructed using the features selected. 
\n", + "\n", + "As a first step, construct the inside and outside folds for the nested cross validaton by running the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": 221, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(123)\n", + "inside = ms.KFold(n_splits=10, shuffle = True)\n", + "nr.seed(321)\n", + "outside = ms.KFold(n_splits=10, shuffle = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below performs the grid search for the optimal model hyperparameter. As before, the scoring metric is AUC. " + ] + }, + { + "cell_type": "code", + "execution_count": 222, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "10" + ] + }, + "execution_count": 222, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "nr.seed(3456)\n", + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {\"C\": [0.1, 1, 10, 100, 1000]}\n", + "## Define the logistic regression model\n", + "logistic_mod = linear_model.LogisticRegression(class_weight = {0:0.1, 0:0.9}) \n", + "\n", + "## Perform the grid search over the parameters\n", + "clf = ms.GridSearchCV(estimator = logistic_mod, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = sklm.make_scorer(sklm.roc_auc_score),\n", + " return_train_score = True)\n", + "\n", + "## Fit the cross validated grid search over the data \n", + "clf.fit(Features_reduced, Labels)\n", + "\n", + "## And print the best parameter value\n", + "clf.best_estimator_.C" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The optimal value of the hyperparameter is 10. This parameter is larger than for the same model with all the features. Recalling that the parameter is the inverse of regularization strength, the smaller parameter means the model with fewer features requires less regularaization. 
\n", + "\n", + "To get a feel for the results of the cross validation execute the code in the cell below and observe the results. " + ] + }, + { + "cell_type": "code", + "execution_count": 223, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Performance metrics by parameter\n", + "Parameter Mean perforance STD performance\n", + " 0.10 0.65771 0.03956\n", + " 1.00 0.67252 0.04432\n", + " 10.00 0.67278 0.03846\n", + " 100.00 0.67196 0.03850\n", + " 1000.00 0.67196 0.03850\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZIAAAElCAYAAADOTWQ3AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzsnXmYHFW5/z9vb7NvWSbJTDLZZkIIO4SEXVAQFBUERBYRvCp6XXC5BkG5gggXFNefOxcBFSRCgAgINyA7SIBAgJAFsiczk0kyyexLr+/vj1PdU7P1dDIz6Z6Z83mefrrqnFOn3jrVXd8623tEVbFYLBaLZX/xpNsAi8VisYxsrJBYLBaLZVBYIbFYLBbLoLBCYrFYLJZBYYXEYrFYLIPCConFYrFYBoUVkhGOiNwkIvUiUpduW0YrInKpiDyZbjsAROQGEbkn3XZY+kdEVEQq023HgcQKyQFGRLaISIeItIrIThG5S0Ty9zOvacB/AfNUdfLQWjr6EZEZzp/elyydqt6rqh8+UHZZ9g0RuVtEbkq3HWMZKyTp4eOqmg8cDRwLXLevGTgPv+nAHlXdtZ/HWwbAltO+M9LKbKTZm4lYIUkjqloDPAEcCiAiRSLyJxHZISI1TrOV14m7QkReFpFfiMhe4DngKaDMqd3c7aT7hIisFpFGEXlORA6On8+pDX1XRN4B2kTE54QtEpF3RKTNOf8kEXlCRFpE5F8iUuLK4wERqRORJhF5QUQOccXdLSK/FZF/Ose+KiKzXfGHiMhTIrLXqY19zwn3iMg1IrJRRPaIyP0iMq6vMhORU0WkWkSuFpFdTlmdKyIfFZH3nby/50qfLO8XnO9GpwyP76Ocb3DCXhroOnrYeZxTTl5X2CedskdEFojIChFpdvL4eZKfSr8McL+PFpGVzr14QET+3t+bu+u6f+3c23Ui8iFX/OdEZK2T1yYR+ZIrLn5PviumifUuESkRkcdEZLeINDjbU13HPOf8vv/tlP2jIjJeRO51yuR1EZnhSj/XVebviciFTviVwKXA1fF8nPAyEXnQOf9mEbnKldcNIrJERO4RkWbgih5lkcq9e8Up8x0i8hsRCfRTrs+JyBd6lLP7t9TndY04VNV+DuAH2AKc7mxPA1YDP3L2lwJ/BPKAUuA14EtO3BVABPg64ANygFOBalfec4A24AzAD1wNbAACrnO/5Zw3xxW2HJgElAO7gDeBo4As4Bngetc5/gMocOJ+Cbzlirsb2AsscGy8F1jsxBUAOzBNcdnO/kIn7puODVOdfP8I3NdP+Z3qlMMPnGv8IrAb+JuT5yFAJzBroLyBGYACPlf+fZXzFcBLA11HH7ZuBM5w7T8AXONsvwJc5mznA8el+Pu5AbhnoPvtfLYC33DizgNCwE395Bu/7m856T
8NNAHjnPizgdmAAB8A2oGje9yTHztlnAOMB84Hcp0yegBY6jrfc46ts4EiYA3wPnC6U+5/Ae5y0uYB24HPOXFHA/XAIa7f3U2uvD3AG85vJADMAjYBZ7rKMAyc66TN2cd7dwxwnGPLDGAt8E1XWgUqXdf5hR7l/FIq1zWSPmk3YKx9MA/uVqDR+aP/zvnjTQKC7h81cDHwrLN9BbCtR16n0l1I/hu437XvAWqAU13n/o8+7LnUtf8g8HvX/tfdD4AexxY7f5oiZ/9u4A5X/EeBda5rWdlPPmuBD7n2pzh/dF8faU8FOgCvs1/g2LDQleYN4NyB8qZ/IelZzu4/f7/X0YetNwF3uuxsA6Y7+y8APwQm7OPv5wa6hKTf+w2c4myLK/4lkgtJbY/0r+GIXR/plwLfcN2TEJCdxO4jgQbX/nPA9137PwOecO1/HOclBSNqL/bI7484Lzj0FpKFfdzDa+kSphuAF/b33vWR9pvAw679VIUk6XWNpI9tG0wP56rqv9wBInIY5k1wh4jEgz2YN5Y47u2+KMOIEwCqGhOR7ZiaRrI8drq2O/rYz3ds9AI3A58CJgIxJ80EzNsrgHv0WHv8WEwtaGM/dk8HHhaRmCssihHXmj7S71HVqMu+vq4hft5kefdHsnJOdh09+RvwbxH5T0yN4E1Vjd+fzwM3AutEZDPwQ1V9LMV84yS731GgRp2nk8NAv5+e6bc650BEPgJcj6kFeTA1jVWutLtVtTO+IyK5wC+As4B402iBiHhd9y6l3x3mHi4UkUZXvA/4az/XMR3T5OtO7wVedO0PVBb93jsRmQP8HJiPKQcf5uVlX9nX68pYbB9J5rAdUyOZoKrFzqdQVQ9xpRnIVXMt5scJgBhFmkb3h/Fg3D1fApyDaX4owrzRg2nuGIjtmGaM/uI+4rruYlXNVtOHNFiS5d1fWSQro2TX0T0T1TWYh/FHMGX3N1fcelW9GNOE+WNgiYjkpZKvi2T3ewdQLq63EicuGT3TVwC1IpKFqan+FJikqsXA43S/7z3L7L+AgzA1xUJMDQlS+630ZDvwfI97mK+q/9nPubcDm3ukL1DVjyaxtxvJ7h3we2AdUOVc2/eSXFcbRmziuEdXDnRdIwYrJBmCqu4AngR+JiKFYjqJZ4vIB/Yhm/uBs0XkQyLix/yZg8C/h8jMAie/PZg/x//sw7GPAZNF5JsikiUiBSKy0In7A3CziEwHEJGJInLOENmcLO/dmFrVrH3IL9l19MXfgKswD9IH4oEi8hkRmaiqMUwzJ5haxL6Q7H6/4uT3NTGDKs7B9F0loxS4SkT8IvIp4GCMYAQwfR+7gYhTOxloOHQBplbRKGZww/X7eG1uHgPmiMhljm1+ETlWugYW7KT7PXwNaHY6/3NExCsih4rIsft43j7vHebamoFWEZkLJHvwvwWcJyK5YuaWfH4frmvEYIUks/gs5k+7BmgAlmDa9FNCVd8DPgP8GtNp93HMUOPQENn3F8xbWo1j4/J9sK0F0yn8cUzz13rgNCf6V8AjwJMi0uLkm+zhvC/0m7eqtmOa6l52RuAcN8jr6Iv7MH0Iz6hqvSv8LGC1iLQ6Nl4UbxpyRh+dnIIt/d5v556fh3lwNTrpHsMITX+8ClQ5ed0MXKCqe5xrvgojXA2YN/RHBjDvl5i+v3pMmf/fQNfTH875PwxchKmF1dHVsQ/wJ2Cecw+XOk1nH8f0y2x2bLgDU4veF/q7d9/BlEEL8L/A35Pk8QtM/9FO4M+YASipXteIQbo3iVosltGKiLwK/EFV7+oj7gpMp/BJB9wwy4jH1kgsllGKiHxARCY7TVuXA4cziJqBxdIfdtSWxTJ6OQjTHJWPGWl2gdMXZ7EMKbZpy2KxWCyDwjZtWSwWi2VQWCGxZBQicqKIrHdGLp2bbntGKmJ8qJ3ubH9PRO5IJe1+nOdkEXlvf+20jA5sH4kl07gR+I2q/irdhowWVHVf5vskRUQUMxFvg5P3i5i+GM
sYxtZILJnGdIwjy16IISN+s2Jdj1ssCTLiT2mxAIjIRswM5Uedpq0sxw33zSLyMsZ31ywxLsIfcVxvbxCRL7ryuEGMy/R7xLg8XyUic0TkWjFu57eLSL+zskVkmog8JMb9+B4R+Y0T3pd7eY+IXCciW528/yIiRU76bMeGPc5EuddFZJIrr02OfZtF5NI+7CgTswDaOFfYUWJWw/SL8XrwjJN/vRj368X9XFO3VRWdmdRbnWO/3yNtvy7SRSTudv9t5/58WhwX8q7jD3buWaMY9/afcMUlXWbAMnKxQmLJGFR1NrANZ+EvVY3Pwr4MuBLjmmIrZsZxNcah4AXA/4hr7QzMrOa/YpwFrgSWYX7r5Zimsz/2dX4xTikfc84xw0m/2JVkIcYdeSlm5vcVzuc0jADmA79x0l6OmUk9DeNS/ctAhxh/Wv8P4/+rADgB40ajZ1nUYtycnO8KvgRYoqphjG+nW5wyONg5zw19XVePa5yH8RV1mXPseIyL/ThRjCv5CcDxwIeArzg2xf1lHeHcn24zusW4aXkU4+qnFOM5+l4RcTd9XYzxelyCcSN/80A2WzIfKySWkcDdqrpaVSMYp3cnAd9V1U5VfQvj/uIyV/oXVXWZk/4BjKfiW50H8GJgRj9v7wswD9dFqtrm5P+SK75WVX+tqhFV7cAsqPRzVd2kqq0YV+UXOc1eYcxDulJVo6r6hqo2O/nEgENFJEdVd6hqn015GF9PF0PCIeNFThiqukFVn1LVoKruxnijTcUv2wXAY6r6giPU/02XF2ccO5c717gFI7qp+ns7DiOmtzpuWp7BCPPFrjQPqeprzr25F+PGxDLCsUJiGQm4XX6XAXsdP0VxttLdVX5Pd+T1fbidz6c304CtzkNuIDvitmx17W/FDGCZhKkRLQMWi0itiPxERPyq2oZZh+LLmCUD/inG8V9fLAGOF5EyjONAxXGFLiKlIrJYzEqazcA9mFrEQJS5r8OxZ09832kGfEzMCoHNGMecqeSbyNtxRBmn573pb5kBywjGCollJOCeNVsLjBORAldYBX2vW7KvbAcqknSk95y9282Nu2NHBNipqmFV/aGqzsM0X30M45QTp7Z0BsYh5zqM47/eJ1NtxDQTXYhp1rrPtV7ILY49hzuuzD9Dai7ad+ByJy9m3ZDxrvh9cZHek1pgWo8BEUN1bywZjBUSy4hCVbdj3KTf4nRoH47xcHtv8iNT4jXMg/ZWEclz8j8xSfr7gG+JyEwRyce8vf9dVSMicpqIHOb0uzRjmrqiIjJJzDrreRhPvK0kdx//N4wAnU/3NTEKnGMbRaQcWJTiNS4BPiYiJzmd6DfS/TkwkIv0ni7b3byKWX/jamdAwKmY/qrF/aS3jBKskFhGIhdjOsNrgYcxS5M+NdhMXe7HKzGd/tWYZqj+uBPThPUCxl15J6aDGUxfzhLMQ3kt8Dym+cmDWTekFrO+/QdwOrP74RGMa/edqvq2K/yHmDW+m4B/Ag+leI2rga9iRGkHxi18tSvJQC7SbwD+7IzKurBH3iHgE5jFoOoxy0h/VlXXpWKbZeRifW1ZLBaLZVDYGonFYrFYBoUVEovFYrEMCiskFovFYhkUVkgsFovFMijGhOO5CRMm6IwZM9JthsVisYwo3njjjXpVnThQujEhJDNmzGDFihXpNsNisVhGFCKydeBUtmnLYrFYLIPEConFYrFYBoUVEovFYrEMCiskFovFYhkUVkgsFovFMiiskFgsFotlUFghsVgsFsugsEJisVgslkFhhcRisVgsg8IKicVisVgGhRUSi8VisQwKKyQWi8ViGRRWSCwWi8UyKKyQWCwWi2VQWCGxWCwWy6CwQmKxWCyWQWGFxGKxWCyDYliFRETOEpH3RGSDiFzTR/wvROQt5/O+iDS64qKuuEdc4TNF5FURWS8ifxeRwHBeg8VisViSM2xCIiJe4LfAR4B5wMUiMs+dRlW/papHquqRwK+Bh1zRHfE4Vf2EK/
zHwC9UtQpoAD4/XNdgsVgsloEZzhrJAmCDqm5S1RCwGDgnSfqLgfuSZSgiAnwQWOIE/Rk4dwhstVgsFst+4hvGvMuB7a79amBhXwlFZDowE3jGFZwtIiuACHCrqi4FxgONqhpx5VneT55XAlcCVFRUDOIyLPvD0pU13LbsPWobOygrzmHRmQdx7lF93iqLxTLCGU4hkT7CtJ+0FwFLVDXqCqtQ1VoRmQU8IyKrgOZU81TV24HbAebPn9/feS3DwNKVNVz70Co6wuZ21jR2cO1DqwCsmFgso5DhbNqqBqa59qcCtf2kvYgezVqqWut8bwKeA44C6oFiEYkLYLI8LWnitmXv0RGOskDW8l3ffZzqWYmG27lt2XvpNs1isQwDw1kjeR2oEpGZQA1GLC7pmUhEDgJKgFdcYSVAu6oGRWQCcCLwE1VVEXkWuADT53I58I9hvAbLfrCjsY2vef/Bt3xL8IrynzxKp/pZ3jYPXl0PVWfAuFnpNtNisQwRwyYkqhoRka8BywAvcKeqrhaRG4EVqhof0nsxsFhV3c1PBwN/FJEYptZ0q6quceK+CywWkZuAlcCfhusaLPtB+17uzf05x8feZGn0BG4IX85hns2c6nmbM/xvwxNXwxPA+EqoPMOIyvQTwZ+dbsstFst+It2f36OT+fPn64oVK9JtxuinegXcfznR1p38KHI5d4dOI95VluP3cst5h3FuRSds+BesfxK2vASRTvDnwswPQNXpUPVhKLaDIyyWTEBE3lDV+QOlG86mLctYQRVe/SM8eR0UTsH7+Sc5ctckyvsbtTV+Niz8EoTajZisfxLWL4P3nzDxE+eamkrVh2HaceCzc04tlkzG1kgsg6OzGR75OqxZCnM+Ap/8PeSU7Hs+qlC/HjY85dRWXoZYGAL5MOtUIypVZ0Bh2VBfgcVi6QdbI7EMPztXw98vg4YtcPoP4YSrwLOfAwFFYOIc8zn+qxBsgc0vOLWVp2DdYybdpMO6msCmLgCv/QlbLOnG1kgs+8dbf4PHvg3ZhXDBXTDjxOE7lyrsWtslKtteAY1CdhHM/qARlcrTIb90+GywWMYgtkZiGR7CHfD4Ilj5V5hxMpz/JyiYNLznFIFJ88znpG9CZxNseq5LWFY/bNJNOdJpAvswlB8NHu/w2mWxWABbI7HsC3s2wv2Xw85VcPJ34LTvpf9hHYsZe+KiUv06aAxyxkHlh4yozP4Q5I1Pr50WywjE1kgsQ8uaR+AfXzXCcckDMOfD6bbI4PHAlCPM55RF0L4XNj5jRGXDU7DqAUBg6vyuJrApR+5/X47FYumFrZFYkhMJwb+uh+W/g/Jj4FN3j5x5HrEY1K7sGglW8yagkDexazLk7NP2b5SZxTIGSLVGYoXE0j9N1fDA56D6NVjwJfjwTSN7Tkfrbtj4tBGVDU9DZyOIF6Yt6Jq3MulQ0ydjsViskLixQrIfbPgXPPhFiIbgE7+GQ89Lt0VDSzQCNW84fStPQt07JrxgihGVyjPM/JXswnRaabGkFSskLqyQ7AOxKDz/Y3j+J1B6MFz4F5hQlW6rhp+Wui7XLRufhWAzeHxQcXzXZMiJc21txTKmsELiwgpJirTuhoe+YIbWHnEJnP0zCOSm26oDTzQM2191aiv/gl2rTXjRtK4msJmnQCAvvXZaLMOMFRIXVkhSYNtyeOAKM+rp7J/CUZfZt+84TdVmFNj6p4zIhtvAGzBei+PzVsbPtuVlGXVYIXFhhSQJqvDKb+Cp681orAv/AlMOT7dVmUskaGbWr3dGgtW/b8JLZnY1gc04Cfw56bXTYhkCrJC4sELSDx2NZm7Iusdg7sfg3N8ZtyOW1Nm72elbecr4Bot0gC/bNH3F562Mm5luKy2W/cJOSLQkZ8fbcP9nTbPNh282jhJt08y+M24mLPii+YQ7jNfi+Eiw9U+aNOOrumor008AXxZLV9ZwW39u9i22fAYg08rH1kjGGqrw5p/h8ashd7yZYFixMN1WjU
72bOwSlC0vmaHU/jx2jF/IH2pn8VTocGqZALgW/rIPS5aurOHah1bREY4mwmz5dHEgy8c2bbmwQuIQajMee99ZDLNOg/PvgLwJ6bZqbBBqc9ziP0XdikeYzG4Adug4Imr8lXm9QlmRXXK4tqmTaLT3c8mWj8FdPi3k8tHQLQCUF+fw8jUfHNJzZUTTloicBfwKs2b7Hap6a4/4XwCnObu5QKmqFovIkcDvgUIgCtysqn93jrkb+ADQ5Bx3haq+NZzXMSrY/b5pytq9Dk691vilSrfDxbFEIA8O+ggc9BGOf+mDzJYaTvO8xVzP9q40ETh/+tT02ZghvPJmdd8RtnyA7uXTrlmJ7drGjnSYAwyjkIiIF/gtcAZQDbwuIo+o6pp4GlX9liv914GjnN124LOqul5EyoA3RGSZqjY68YtUdclw2T7qWLUEHv0G+LLgMw8ar7iWtFFWnMuGxqlsiE41r0kO5cU5nP/JoX2jHIn8/L1nqOnjoWjLx9Bf+ZQVp2+k4HC6QF0AbFDVTaoaAhYD5yRJfzFwH4Cqvq+q653tWmAXMHEYbR2dRILwz+/Ag5+HSYfAl160IpIBLDrzIHL83WuDOX4vi848KE0WZRa2fJKTieUznE1b5YCr3k410GevrohMB2YCz/QRtwAIABtdwTeLyA+Ap4FrVDXYx3FXAlcCVFSMEG+1Q0nDVjPBsPZNOP5rcPoN4PUfsNNn2qiSTCJeDrZ8+saWT3IysXyGrbNdRD4FnKmqX3D2LwMWqOrX+0j7XWBqzzgRmQI8B1yuqstdYXUYcbkd2KiqNyazZcx1tr+/DB660izwdM5vYd4nDujpl66s4QcPrqIoouQB9SgdPg83n29H3VgsI4lM6GyvBqa59qcCtf2kvQj4qjtARAqBfwLXxUUEQFV3OJtBEbkL+M6QWTzSiUbg2ZvhpZ/DpMPgwj8b1x3DgIajRBqCRBs6iTQ6387+rO3NPK7d/VBFIkrj/RvZ9cpuvEVZrk+ga7sggHjHxlwWW2NLji2f5GRa+QynkLwOVInITKAGIxaX9EwkIgcBJcArrrAA8DDwF1V9oEf6Kaq6Q0QEOBd4d/guYQTRstP0hWx5EY7+LHzkJ4Ny0xELRbuJQ0+xiLWGux/gFbzFWfhKsnlZw9QRow6lFWUCQikeJqrwyYCX8I42OtftRcOx7nkIeAsCvQXGvV8YQLwje3XDnvMAaho7uPahVQD2YYktn4HIxPIZNiFR1YiIfA1Yhhn+e6eqrhaRG4EVqvqIk/RiYLF2b2O7EDgFGC8iVzhh8WG+94rIRECAt4AvD9c1jBi2vARL/gM6m+Hc38ORvfS6F7FghGhDkEhDp/lu7HTtdxJri3Q/wCf4irPxlmQRmDceb0lWYt9Xko2nIIB4TG3ip9c+TlSV0pzdHFm6ik5RtmGqqGeeMRcABQjHiHVGiHVGiXVE0Ph2MIJ2RIjVRaCmd9OrZHmRbB+ebB+ebC+ebB/ifHtyfCbek7k1m3+/vZFTysN9hD/LUeOGpwY5krDlkxx3+YSjPp7Z/gE6wlFuW/Ze2oTETkgcycRi8PIv4ZkfwbhZxuHipENMVGekqzbhiIW7ZhFr7ykUHnwlWXhLsrt/F2cbocj3p/xwnnnNo3yo4gXOq3qULG/vB4LFYhka2sK5XPWsmZ4nwOZbzx7S/DOhj8QyTKgqureeyNIbiG7ZQGTS94hO/RiRZUq04U0iDUG0s7tQiN+TEIfAtIKEWMSbozz5fmQIfG11dGzn+8f9npmF7/HO7nncs/ZCWsKmv6SsKIen/+sDgz5HT2KdEaItIaKNIaLNQecTJtrkbDeF0GC013GeXB/ewgCeoiy8habZzFsUMNtFAXyFAcQ/9JM2P/Sz56lt6mMewDCVz0jDlk9y+i2fNM4jsUKSgagqsfYI0cbuNYrE9942NASm2wmoBtm11xGKbAIzCvGVZHdrfvLkDY1QJLO5tnYx6zfcwoxC5Z61l/Ls9gWY9yQzzv
2bZxyG1zv0C2V588CfB0zuP00sGDWi0miEJSEyjUGie0OEtrT0rqXhiE1fAwNc257AvonNN884nEVL3ibscgPi9wrfPOPwYSmfkYYtn+T0Vz6jdR7JqGYwoyZUlVhbuI++ia5vDXV/g5Ysr6lFsJOs2At48zvwnXwJ3tnz8JZk48n1DatQJKOzcwdr113L3r0vUlJyPAfP/TGtJfB+Bo0q8WR58UzMxT+x/wdRLBQl2hxyxKarNhPfD21v7t13BEiOD19/gwOcbU9Wj79azxbl0d/CvG/Y8klOhpWP7SPZDwbyvqmqxFrDRhR6jHaKf/ccsSTZviR9FFmIpwN57Jvw7oNQeQacdzvkjhuya9ofVJW6uod5f/2NxGIRKiu/y9TySxEZ2aOqkqFhIzaRxqARnSZHdFz7vUa0AZLtTQjLE1v2sDkUYhdKE5p4BozLC/CT8+2iYlc/+A5720K9wm35GNzlE0F51fGzk06njVZI9oMTbzW+bsoR5uJlCh4m42GGz8f8kjwiDUGIdBcKT66vW59EfLRToo8iJ0nlcNda+PtlsHcjnPZ9OOnb4EnvwzoYqmfduu9TX/8vioqOYd7BPyE3d0ZabcoUNBLrqtk0O0LTFBefIHXVzZQgeMjckWWWkUEzykdpAWxn+4gj7mXzQ/i5EuPWuoEYdRHFPymX7LnjeomFJ3s/i/rtxfDYt4z32M/+w6y8l2Z27nqc9977AdFoG5WV11Ix7XMYH50WAPF58I3Lxjeub5fn5936DDsbOxiPUOwSk4n5Wdx5xbEHysyM5T/ufp3drb28HtnycXCXj7sBPKM720XkOGC1qrY4+wXAPFV9dbiNy1TKinOoaezgn4R5kQh1xOjAqVp+Zt7QnCTcCf/3XXjjbqg4AS64EwqnDE3e+2tSuIH33ruBnbseo6DgMObNu438vKq02jQSWXTmQVz70Cp2hqPsdBq2cvxe/vPsgwhMLUizdenngrMP6rPp2JaPob/yyfTO9t8DR7v22/oIG1PEHwR7wlH2uB4EQ3Yj926C+y+HunfgxG/CB/8bvOmtPO6uf5p1675PONzIrJnfYvr0L+Px2Art/pCJTvcyCVs+ycnE8hmwj0RE3lLVI3uEvaOqI6bXazgmJA6br5u1j8HSr5gGz0/+0SyGlEYikRbef/9H7Kh7kPz8ucw7+DYKCoao1mWxWDKaoewj2SQiV2FqIQBfATYNxrjRwLlHlQ/tG0A0DE//EP79a5hypHG4WDJj6PLfD/bsfYm1a68hGNzJjOlfYebMr+PxBNJqk8ViyTxSEZIvA/8PuA4zWvlpnHU+LENEc63xlbXtFZj/eTjzf8CfvrWpI5E2Nmz8MTU195KbO5v585dQVHhE2uyxWCyZzYBCoqq7SEyhtgw5G5+FB78A4Q447w44/FNpNaeh4TXWrv0uHZ3bqZj2eWbN+jZeb/pEzWKxZD79ComIXK2qPxGRX9PHvElVvWpYLRvtxGLw4k/h2f+BCXPg03+FiekbdRGNdrJx08/Yvv0ucrKncfTR91FSbIdaWiyWgUlWI1nrfI9Ct7lppm0PPPRF2Pg0HHYhfOwXkJWfNnOamt5izdpFtLdvorz8M1TOvhqfL2/gAy0Wi4UkQqKqj4qZZXaoqi46gDaNbra/ZtZSb9sNZ/8c5v8HpMlHViwWZPPmX7Nl6x/JypqOMZowAAAgAElEQVTEUUf+hXHjTkyLLRaLZeSStI9EVaMicsyBMmZUowqv/gGevA4Ky+HzT0LZUWkzp6VlDWvWLqK1dR1TplzAnKrr8PnsZC+LxbLvpDJqa6WIPAI8gJmMCICqPjRsVo02OpvhH1+FtY/AQR+Fc38HOSVpMSUWC7N16x/YvOU3+P0lHHH4/zJhwtA6erNYLGOLVIRkHLAHcD9tFLBCkgp1q+D+z0LDVjjjRjjhqrQ1ZbW2rWfNmkW0tKxi0qSPc9Cc6/H70yNoFotl9JCKkNyhqi+7A0QkpYZ0ETkL+BVmzf
Y7VPXWHvG/AE5zdnOBUlUtduIux8xdAbhJVf/shB8D3A3kAI8D39BMdWH85l/h8e9AdjFc/ijMSE//g2qUbdvvZNOmn+P15nPoob9hUml6Z8xbLJbRQypC8mt6+9XqK6wbTkf9b4EzgGrgdRF5RFXXxNOo6rdc6b8OHOVsjwOuB+Zjaj9vOMc2YGbYXwksxwjJWcATKVzHgSPUDo8vgrfuMd56z/8T5JemxZT29i2sWXs1TU1vMHHCGRw09yayAhPSYovFYhmdJJtHcjxwAjBRRL7tiirE1DAGYgGwQVU3OfktBs4B1vST/mKMeACcCTylqnudY58CzhKR54BCVX3FCf8LcC6ZJCT1G0xT1q7VcMoiOPVa8Bx4F+uqMapr7mHDhp/g8fiYd/BPmTz53LStomixWEYvyWokASDfSeMeztMMXJBC3uXAdtd+NbCwr4QiMh2YCTyT5Nhy51PdR3hfeV6J48qloqIiBXOHgNUPwz++bjz1XroEqs44MOftQUdHDWvXfZeGhlcYP+4U5h58C9lZSRY0t1gslkGQbB7J88DzInK3qm4VkTxVbesvfR/09erbX1/GRcASVY072O/v2JTzVNXbgdvBeP9Nbuq+4/b+W1Hk487yR5m96a9QPh8+dTcUTxvqUw6IqrJjxwO8v/5mQJl70M2UlX3a1kIsFsuwksp6rWUisgZnpruIHCEiv0vhuGrA/TSdCtT2k/Yi4L4Ujq12tlPJc9iIr9le09jBFOr5Rcf3mL3pr2yc9Rn43BNpEZFgcCdvv/MF1q67loKCQ1i44HHKyy+yImKxWIadVITkl5g+iz0Aqvo2kMp6r68DVSIyU0QCGLF4pGciETkIKAFecQUvAz4sIiUiUgJ8GFimqjuAFhE5TswT8rPAP1KwZUi5bdl7dISjnOJ5m39mfY85Us1XQlfx2drzwXdg3ayrKnV1j7D81Y/Q0LCcOVX/zdFH3UNOztSBD7ZYLJYhIKUl7lR1e48322h/aV3HRETkaxhR8AJ3qupqEbkRWKGqcVG5GFjsHsKrqntF5EcYMQK4Md7xDvwnXcN/nyANHe3xNdvH08xOLeEr4W+wScsQJ/xAEQrVs+69H7B79zKKCo9i3rzbyM2deUBtsFgsllSEZLuInACoU7O4ii6HjklR1ccxQ3TdYT/osX9DP8feCdzZR/gK4NBUzj9cxNdsfzh2Mo+FjifsFGNZcc4Bs2HXrmWse+86IpFWKmdfTUXFFzAjri0Wi+XAkkrT1peBr9I1YupIZ3/MsujMg8jxm4d2XESGdM32JITDTaxe/W1WvfsVsrOnsODYpUyf/iUrIhaLJW2ksrBVPXDpAbBlxBBfYndY1mxPQn39s6xd9z3C4b3MnPlNZkz/Mh6Pf1jPabFYLAMxoJCIyEzg68AMd3pV/cTwmZX5DPma7UmIRFpYv/5/qN1xP3l5VRxxxP9SWJDW1j2LxWJJkEofyVLgT8CjQGx4zbH0ZO/el1m79ho6g3VMn/5lZs28Co8nK91mWSwWS4JUhKRTVf/fsFti6UY02s6GjT+huvqv5ObOZP4x91NUlL71SywWi6U/UhGSX4nI9cCTQDAeqKpvDptVY5zGxhWsWXs1HR1bmTb1CmbP/g5e74EbEWaxWCz7QipCchhwGWY9knjTltJ9fRLLEBCNBtm0+eds2/YnsrPLOfqoeykpOS7dZlksFktSUhGSTwKzVDU03MaMZZqb32H1mkW0t2+gvOxiKiuvwefLT7dZFovFMiCpCMnbQDGwa5htGVG4nTYOZvhvLBZi85bfsnXr7wkEJnLkEXcxfnwqHmgsFoslM0hFSCYB60Tkdbr3kYzZ4b9xp40dYeMppqaxg2sfWgWwT2LS0rqONWsW0dq6hsmTP8mcqh/g9xcOi80Wi8UyXKQiJNcPnGRsEXfaOKf1fea2vk+HJ4cObw7/+MtbzG48mtzCYnILi8gtKiKnsAh/VnY3L7yxWIRt2/6XTZt/hc9XyOGH/Y
GJE9OzdonFYrEMllRmtj9/IAwZScSdNno0RlYsSHG4kZxoB4HmCE/+4aVe6X2BLHIKC8ktLCa/VMmb8xqenJ34wodTzOW0VBcTbXqf3KJicgoL8WdlH+hLslgslv0mJe+/lu4U5/ppaA9T5y+nPb+Ydn827b5s8vIDPP21hbQ3NdLe0kR7UxMdzU20NzfR0dxIJGc52VNXEot42PZsBXvfDwN39Mrfn5WdqM3kFjrfRcXkFhQ6YuMKLyzCFziwrustFovFjRWS/SDu8P7Mra9xxdruXux3PJqLJz8fT14eefn5FOTnES31sPPYd2kfV09B00ym7T2bY+aWokdmE/QKQYWgRglGI3SEgnQGO+loa6WjpZnWvXvZtWUTHc1NRCORPu0J5OSQW2hqM7lFxS6RKSa3sJAcJ8yEF+L1Wf9cFotl6EhJSEQkB6hQ1feG2Z4RQVNHGICXyw6jLm8cOZEgueFOciNBvn5cGdG2VmKtbURbW2is2MyeBdsgppQ8UED2CztojN7eZ75ZzgcAr9cRpFy8eflIXh6x/FxCOdmEsgKE/T6CHiHkETo1RjAaobOjk8amrezoaKezrZVYrG+PNll5eY6oGKExIlTUrRaUEKSCQjzeffcsfN3SVXzsqksB5dJLf8zFC6dx07mH7XM+o5WlK2uoOP9sQpEo//WVXx0Qp58jiaEaFTlaybTyScVp48eBnwIBYKaIHIlZaGrMjtqKr0dSXVBKdUFpIry8OIebrjHzNDs7a1m79lr2NmxhXMlJHHzwLWR/tAxVRTs7ibW2EmtrI9raZrbbzXe01YhQrM0Jd+Kira3Q3EJgRx0+Jzyvvb1fGxWIeD0EfV7C2VmE8/MI5+QYEeqIEAw1ENzbyG6UYDRCMBKmv4Xts3PzjMAUF7tqPI4IFRWTU2BEKLewiOyCAn7wyBruWb6Njzk5RlW5Z/k2ACsmdI36uysyuFF/o5VE+fx5EQAXXXKrLR8XmVg+4lqYsO8EIm9gZrE/p6pHOWHvqOrhB8C+IWH+/Pm6YsWKIcuv5/BfMOuR3HLeYZxzZBk76h7k/fd/BMSorLyW8rKLh2XtdI3FiLW3dwlOW1uXEDn7sTZHnNrausJbW4m2t3VPFwwS9noI+bwEfV5Crk+3fb/PfHsE+rmmmHqJ4GNiSyO+SIx1E2cS9XiIiZeTD56E+HzgGbvrpzz//m46w1GO3b4agNenHQJAtt/LB+ZMTKdpGYEtn+S4y8cbi/GDBUZQyotzePmaoXU4IiJvqOr8gdKl0rQVUdWm4XgQjlT6W4/kI/MCvLPqS9TXP01x8QLmHfxjcnIqhs0O8Xjw5ufjzR/8DHgNhYi2tRFrayfW1tq3ODm1pGhbK5GWVjpbW+hoa6Wzo53OUJCOcIhgNELII13ik+NjVuv2xHl2Ld/kGC/g8YDHg/T4Thrm9fYOG2G/zaz2ZrKAzmzz9ysKN5mIMDTu7LsfbCxhyyc57vLxRruar2sP8FLfblIRkndF5BLAKyJVmKV2/51K5iJyFvArzJrtd6jqrX2kuRC4AdMa87aqXiIipwG/cCWbC1ykqktF5G7gA4Dz6+IKVX0rFXuGE1/oaZa/+itisQ6qqq5j2tTLEUllAcrMQAIBM/qrpGRQ+agqh1z9DwLhTm5/+CYAfnT6F8mNBMmPBPnpxyqT1JraiTW5RCxJ011P2z35+YlBDt68vG77nvw8vPn5ePLi+/kmzJ0uPx9Pbq4RpmHmxFufoaaxg8UvXAPAjy/5BmDeKG8Z4jfKkYgtn+T0LB+OMF8HcqnvnqQiJF8Hvo+Z1f43YBlw00AHiVn79bfAGZglel8XkUdUdY0rTRVwLXCiqjaISCmAqj6LWdIXERkHbMB4H46zSFWXpGD7sOBu2sr3t/LxijvJaXuLoHceJx77K/LyZqXLtLQjIpx34mzuWb6NmJqawqqJlQB85rgKSvahj6Rb012bqw/J3UzX1netKbxrJ7FNmxL7GgwOfELAk5vbtw
jlukUnzwhWQoBMnDc/L3GcZGf325x52tyJiT6jnuEWs5R1vM0/zoFaynokkInlk8qExHaMkHx/H/NeAGxQ1U0AIrIYOAdY40rzReC3qtrgnKsvf14XAE84dmQE8Znth09YzRWH/I08fzsPrv8Yqxo+xosfGLsiEifRof43ARSvyH6N2hrSprtwuGtwQ1trt8EMidpQj8ENcWEK7dnmNOmZY4hGBz6hM+rOm5fnqgUZkanY2MwXYz6y/FFUheNr32Vnbgmvvq3oOYcOS3/aSCLedBy410soEqU8A0YlZRKZWD6pdLY/BXxKVRud/RJgsaqeOcBxFwBnqeoXnP3LgIWq+jVXmqXA+8CJmOavG1T1/3rk8wzwc1V9zNm/GzgeU0N6GrhGVXu9borIlcCVABUVFcds3bo16XXuCzOv+ScKLJy8grNmPM2f3v0M1a3lCLD51rOH7DwjnlNPNd/PPZdOK4YUVUWDwb77j9pa+6k1xZvvzH5t7R5yI2a4eE88+fn4y8rwl5eb7/h2ufn2lpSMHaEZhb+fIeUAlM9QdrZPiIsIgLsJaiAb+gjrqVo+oAo4FZgKvCgih7pEawpmPZRlrmOuBeoww5FvB74L3NjrRKq3O/HMnz8/uVruI/Hhv6/WHcPrO48ipt5E+Jgi/kPuj+efTy3dCHpQiAiSnY0nOxsmTEieuJ/rbmruJBSJMnf7OqJeL2+Wz8XjUQJeoTQWJbx1K+FNm2iPRon1eNGTnByXwJThL3NEpsxs+yZOOCD9PBaLm1SEJCYiFaq6DUBEptNbEPqiGpjm2p8K1PaRZrmqhoHNIvIeRlhed+IvBB524gFQ1R3OZlBE7gK+k4ItQ0q8jVu9HqJ4kKgmwi2WgagYl8Om3W0I4ItGTV9SzEPZ+Dwm5Gd1SxuNxQjffjvh2lrC1TXmu8Z8d777LtGGhm7pxe/HVzaFQHk5vrIyAvGajfPtmzQJ2Y8JpsPCGHwR2SdGUPmkIiTfB14SkbjzxlNwmowG4HWgSkRmAjXARcAlPdIsBS4G7haRCcAcYJMr/mJMDSSBiExR1R1i6vfnAu+mYMuQ8uy63QBEK/KIzCmCSAzpjHJvtJPGNVuZnOVncpafKVl+JgfMdmnAj98zypokBvqBjvWmiX6uewLw0soa3uwxs31BH23cXueTPXdun3nF2toI79iREJf4d6imhs7nnye6u777AT4f/kmTujefuZrO/JMmIdZ3m2UfSaWz/f9E5GjgOExz1bdUtX6Aw1DViIh8DdMs5QXuVNXVInIjsEJVH3HiPiwia4AoZjTWHgARmYGp0fT0PnyviEx0bHkL+HJKVzqEJLz/7gnie68JzfaiWV6CWR6WN7WyMxgh3LNJApgQ8DHFEZbEJ9AlOpOy/JT4vGOnDXwMc+5R5VBRDDCoSWSevDyyKivJqqzsMz4WDDoCU0u4tsb5NoLTtnw5kZ07u5zHAYjgiwtNT7EpK8NfNsU06w0F9kUkOSOofFJ12pgF7HXSzxMRVPWFgQ5S1ceBx3uE/cC1rcC3nU/PY7cAvV7RVDXtA8njfSSe5jCe5kSrm5lZet5CYqrsCUfYGQyzIximLhSmLmg+O4JhaoIhVjS3sTfce/RPtkeY1IfYTHHtTwr4yfHadnDLwHiyssiaOZOsmTP7jNdQiPDOnUZgetRqOlaupPmJJ3qNUvNOmJDol4k3ofldzWievLwDcWmWDCIVX1s/Bj4NrAbi0ygVGFBIRivxcdw9XaTEx3F7RJgY8DMx4OfQgv7zCcZi7IwLTCicEJ6doQg7giHeaWnnyfowHbHeXVIlPi+T4jWZHkIzOcvPlICf8QEfXlu7sSRBAgEC06YRmDatz3iNRIjs2tVNYOLbwTVraf3X02g43O0Yb3Fx301nzra3cOBVQJeurKFiW6Np+rv1mbQPb800Mq18UqmRnAsc1NcQ27FKfy5S9vVGZnk8VORkUZGT1W8aVaU5EnWExghMXTBMXShCXTDEjmCYta
2d7AqF6enr1yt01W56NKG5m9gKfBnS+WrJOMTnSzRzMb/3KFCNxYjU17tqM11NaMFNm2h98UW0s7PbMZ78/N5NZonvMh7b0s61D79rnVr2QyY6/UxFSDYBflzrtVvMDTsQN01EKPL7KPL7mJukxSASU3aHw9QFuwQmXrPZGYywoT3IS40tNEd6u5bP83r6rtkEujen7ctggUx7Y8o0Rkv5iMeDv7QUf2kpHHVUr3hVJdrQYISmpnetpv2114i1tXU7ZoYvwC9zisnNiqIK33rz7wDsXuml9vCyA3Jdmczud2r5cihKTiCa6N7qCEe5bdl7GS0k7cBbIvI0LjFR1auGzaqRQiZ1dnmEKVkBpmQFgNx+07VFo4majRGaMHXBkCNA4aSDBcb7fb2EJlHDccJKfF7+8VZtxr0xZRJLV9awaMnb/NVVPouWvA2MvvIREXzjxuEbN46cw3p7NlBVYs3N3ZrO/vD3lyhtb6C8aRcicOTu9Yn0bW1DN7F4pDLHGezj82i3eRg1Ge608RHnYxkF5Hm9zMr1Miu3/+a0mCp7w1EjMKGIM0gg5AhQmNpgiDeb29kT7u2JNcsjRNsjRI8q4ba8r1LY0kq4qogwcPW67byWN6RzQ0cki9dsI1hVyO2fuQyA8BxTPt9es3Xslo+3ACrmQsVc/qyzQOHM941v2GVzTjBpBC4/fkb6bMwQ/vzKlkT5ZIVCifB0docO6CJlNDDU65EkyKAaSToIxmLsSghN12CBP766JTEkWv3dR5eV5Nplfhvaw4AigPSY25uXZVe/bguaF5Qcx4VMh6/rpceWT/fyyenspPX1rj6oLUPsomnIXKQ4HnpvAeYBiQHkqmq9E45xsjwepmUHmJbdfQLbsiXr+qxmD8fCOyOJUCjEli1b+O+/vMhUTyMFntDAB1ksSQiql/s4Ot1mpNS0dRdwPWZ9kNOAz9G3Hy2LBRh4ePRYYs+ePaxfv57169ezZcsWotEoVT4PtdFC3g1PYVcs36mbQEGWjwe/ckKaLU4/5//u37QEezeb2vIxuMvHXZ8tzklfbT8VIclR1adFRFR1K3CDiLyIEReLpRdDNTx6JBIOh9m6dWtCPPbu3QvA+PHjOfbYY6mqquLtBh9/f2g1Ydf8IL9HuOGcIygtTcUf6ujmv845lkUPvG3Lpx/6LZ9PHJI2m1IRkk4xS/2td1ye1AD2blqScqCGR2cCDQ0NbNiwgfXr17N582bC4TA+n48ZM2awcOFCqqqqGDduXCL9bMDj9Y1JoU2FsfwikgqZWD6prEdyLLAWKAZ+BBQBP1HV5cNv3tBgO9stQ0kkEmHbtm2JWkd9vXE9V1xczJw5c6isrGTGjBkErPNDywhnyDrbVTXu0r0V0z8ydhhBbpwtw0tTU1Oi1rFp0yZCoRAej4cZM2ZwzDHHUFVVxfjx463DTcuYJJVRW/MxruSnu9Or6uHDaJfFklai0SjV1dWJWsfOnTsBKCws5LDDDqOqqoqZM2eSldX/fByLZayQSh/JvcAiYBX0cuc0uhlBbpwtg6elpSVR69i4cSPBYBCPx0NFRQWnn346VVVVlJaW2lqHxdKDVIRkt7N2iMUyqojFYtTU1CRqHTt2mMU38/PzmTdvHlVVVcyaNYvsoVp/w2IZpaQiJNeLyB1AT19bDw2bVRbLMNHW1sbGjRtZv349GzZsoKOjAxFh6tSpfPCDH6SqqorJkyfbWofFsg+kIiSfA+ZiPAC71yOxQmLJeGKxGDt27EjUOmpqagDIzc1NjLCaPXs2ubn9O7q0WCzJSUVIjlDV3m47LZYMpaOjo1uto81xU15eXs6pp55KVVUVU6ZMweOxq0xaLENBKkKyXETmqeqafc1cRM4CfoVZs/0OVb21jzQXAjdgajlvq+olTngU08EPsE1VP+GEzwQWA+OAN4HLVNU6LRrDqCp1dXWJjvLt27ejqmRnZ1NZWUlVVRWVlZXk2SVgLZZhIRUhOQm4XEQ2Y/pIBLPcet
LhvyLiBX4LnAFUA6+LyCNuQXIcQl4LnKiqDSLinjHfoapH9pH1j4FfqOpiEfkD8Hng9ylch2UU0dnZyaZNmxK1jpaWFgCmTJnCySefTGVlJVOnTrW1DovlAJCKkJy1n3kvADao6iYAEVkMnAO4azZfBH6rqg0AqrorWYZiekA/CFziBP0ZU5uxQjLKUVV2796d6OvYtm0bsViMrKwsZs+enah1FBQUpNtUi2XMkVRIHB9b/1TVQ/cj73Jgu2u/GljYI80c5zwvY5q/blDV/3PiskVkBRABblXVpcB4oFFVI648+3QwIyJXAlcCVFRU7If5KWDnjwwrwWCQzZs3J5qsmpqaAJg0aRLHH388VVVVTJs2Da/XrjlvsaSTpEKiqjEReVtEKlR12z7m3df4yZ6OvXxAFXAqMBV4UUQOVdVGoEJVa0VkFvCMiKwCmlPIM2777cDtYHxt7aPtljSgqt3crm/dupVoNEogEGDWrFmccsopVFZWUlRUlG5TLRaLi1SatqYAq0XkNaAtHhjv/E5CNTDNtT8VqO0jzXJVDQObReQ9jLC8rqq1znk2ichzwFHAg0CxiPicWklfeVoygKUra1LyThoOh9myZUtCPBoaGgCYMGECCxYsoKqqioqKCny+0bUy3nVLV3Hfq9uJquIV4eKF07jpXDs4Mo4tn+RkWvmk8u/84X7m/TpQ5YyyqgEuoqtvI85S4GLgbhGZgGnq2iQiJUC7qgad8BMxHodVRJ4FLsCM3Loc+Md+2mcZJpaurOm2XkJNYweLHngbMC6w9+7dm+gk37x5M5FIBJ/Px6xZsxJNViUlJem8hGHluqWruGd5VwU/qprYtw9LWz4DkYnlk9Ka7SIyCTjW2X1toE5x13EfBX6J6f+4U1VvFpEbgRWq+ojTef4zTId+FLjZGY11AvBHzARID/BLVf2Tk+csuob/rgQ+o6rBnud2M2xu5Mc40ViU1nArreFWWkIttIRaaA218o37X6Ej2op4OsETxqMwMeqnIhbgIH8usVYzr9WT58Ff6sc/yY9vvA/xjo3Z5L9/ZiP5kRwmdZZQGO6aCCkIx80el+TIscHyjXvRPlqsbfkY3OUTFeG1iFn13CvCxls+OqTnStWNfCrrkVwI3AY8h+n3OBlYpKpLhsDOA4IVkt7ENEZ7uN0IQNgIQDdBcLZbQ6290jSHmmkNtdIeae83/5xIDpPbJzO5fQqlnRPxqY8oUepz6tmVu4uduTtpC7T1e/yoQSE/WMKE1qlMaJvKhNZpTGiZRnbUzqS3DJ5Obzu/Luh6Adty69lDmv+QrUeCcSF/bLwWIiITgX8BI0ZIRhuqSkeko/sDv4cI9BSAnuLQGm7t863Pjc/jozBQSL4/n/xAPgWBAibkTCA/kE++P9/EOdt5vjzCe8M01zSz/I1NlEgYgFYNsCFaRHWsmLpYARtuGKhrbeSiqrQ1hti1tZnd21rYtbWZXVtb6Gw1ZeHxCOPK83imvZEdgRB13hgN3q674BVh9Y1npu8CMoRDfrCMaB8vuLZ8DN3Lp0tEvGn0D5eKkHh6NGXtwTQ3jWlS7Uzui2A02K0pqCXc0u0B36tW4AhDc6g5sR3VaNJzeMWbeMgXBArI9+dTnl+e2M4P5HcXCX+BiQt0pc/yZiV1Xtjc3GyG5q5az7pN6xJu1yNSwOvhQqpjRTRpNvEfe0muP+XyHQm0NQXZvdURjG0t7N7aQnuzcbIgHmHclDxmHj6B0ukFTJxeyPjyPHx+L6uWrmLZ8t6DIC86bhq+gB3K/KnjpnXrA4hjy8fQX/lcvHBaH6kPDKkIyf+JyDLgPmf/08Djw2dS5rN0ZQ3XPrSKTt2LJ3c3O6OdfO+pl/n37hJmlnq7PfD7EolwLJw0f0ESD/j4Q740t5RZxbMSwhB/2Lu/3ds5vpwh92AbjUa7uV2vq6sDoKCggEMOOSThdv2JNfUsWvI2Yddbpd8rXP/xQ4bUngNJR0uIXY5omNpGC22NpmtOBE
qm5FExbxwTpxdQOr2Q8VPz8ffz0It3iGbSqJtMwpZPcjKxfPrtIxGRrHgntoich3GVIsALqvrwgTNx8Ax1H8mJtz5DTWMHgfFPk1X6VK/4XF9uQgDib/jx7V5v/32kyfPn4ZHMqPS1trZ2W+yps7MTEaGioiLhx2rSpEm9RGswNbZ009kWNjWNbc0J8Wjd64znECiZlMvECiMYE6cXMGFqPoHs0TU82WKBoekjeQU4WkT+qqqXYd3GJ6ht7AAg3Hwk0Y4ZaDQHjWVDNJv1N30Sn2fkPlRisRi1tbWJWkdtrZmmk5eXx9y5cxO1jpycnKT5nHtU+YgQjmB7OFHD2LW1hd3bmmmu70zEF03MYcqsIiaeVmiaqKYVEMgZuffXYhkOkv0jAiJyOXCCUyPpxlhe2KqsOIeaxg40PJ5oeHwivLw4Z0SKSHt7eze36+3tZjTW1KlTOe200xKLPY10B4ihzgi7t7W4hKOZpl0difjCCdlMrCjkkJPLjWhUFJA1yvp1LJbhINlT78vApUAx8PEecWN6YatFZx7EtQ+toiPc1eGd4/ey6MyD0mhV6sRiMerq6hLCUV1djaqSm5tLZWVlYrGnkex2PRyMUr/dEYxtzeze2kLDziMw3uwAABf2SURBVPaEQ538cVmUVhQy9/gplE4voLSikOx8KxoWy/7Qr5Co6ksi8m+gWlVvPoA2ZTzxJpuR1AfQ2dnZrdbR2toKQFlZGaeccgpVVVWUlZWNyFpHJBSlvrrVNE05I6gadrQR7/7LKwowcXohVcdOMv0aFQXkFgbSa7TFMopIZULiK6p6/AGyZ1gYixMSVZVdu3Z1c7seX+zJ7XY9Pz8/3abuE9FwjPqa1oRg7Nrawt7aNtRxx5JT4Kd0RiGlrs7wvKKsNFttsYxMhnJC4pMicj7wkKbiT8WSNoLBYLfFnpqbjbPkyZMnc9JJJ1FVVUV5efmIcbsejcTYW9vWbZ7GnppWYlHzM8zO81M6o4AZh42ndLrpDM8rTj73xWKxDD2pCMm3gTwgKiIddK2QWDisllkGRFWpr6/v5nY9FosRCASYPXs2p556KpWVlRQWZv6tikVj7N3RbuZpOB3h9TWtxCJGNLJyfUysKODI0yucCX4FFIzLtqJhsWQAAwqJqtol5zKIUCjUze16Y2MjAKWlpRx33HGJxZ7S7XY92TySWExpqGtz5mqYfo367a1EwsaZYyDby8TpBRxx2rTEBL/CCVY0LJZMZcCnjeOh91Jgpqr+SESmAVNU9bVht84C0G2xpy1bthCNRvH7/cyaNYuTTjqJyspKiouL021mgvjM/45wFBTa93Ry199W0/jvXRR2xNi9vZVI0Ix482V5Ka0o4JD/3969R1dVnnkc//5yIYlAuKPcQlAPNy+ABNHaegEv6KxRWq1V61jaTh1nja1tp061Y1urbZddrjVdzoztDLUW20VbbavUKgjUymhtUQJEQCiKECAJcg83k3CS88wfeyccYnIScnI454Tns9ZZ2Zd37/2cN7Cf7P3u/b6XjWhp1+g3pAjleNJwLlt05s/WHxF05z4DeBg4DDzOsW7lXTeLRqNs3bq1JXns27cPgEGDBjFt2jQikQijR49O+1VHex5dvJGC+hgfa8hn/NFcCsK+tg6s20/fMf2Y+JFhLf1P9T/9NHI8aTiX1TpzJppuZhdIWg1gZvsl+bOT3Wz//v0tXZFs2bKFaDRKXl4epaWlTJ8+nUgkwsCBmT8Ww/73jzCpupEJ0QJiwIZeTWzPjbEzL8a+HGPzv81Id4jOuW7WmUQSlZRL+CpX2I18LKVRnQIaGxvZtm1by1XHnj17ABgwYABTpkwhEolQWlpKfn52vCS3t/ow5Ysq2bRyF+PIY1VBI28WRDkS91rKiP6Ju1VxzmWnziSS/wSeA4ZK+h7BMLcPpDSqHurAgQMtVx2bN2/m6NGj5ObmMnr0aKZOnUokEmHQoEFZ1ai8e9shyhdVsnn1bvILcrng6hJqTs
/nfxZtoC6uk+NsevPfOXdiOvPU1nxJK4GZBI/+zjazDZ3ZuaRZwGMEQ+0+YWaPtFHmZuBBgiuet8zsNkmTgR8DxRwbgvfpsPw84DLgQLiLOWZW0Zl4Trampiaqqqparjp27twJQHFxMeeddx6RSIQxY8ZQUJB9L8zt3HKQ8oVbqFy7l15FeZRdV8qkGaNauhnJLcrLqjf/nXNdl6gb+UKC/rbOBtYCPzWzxk7vOLgd9g5wFVAFrABuNbP1cWUiwDPAjLDtZaiZ7ZI0luBdlXclDQdWAhPMrDZMJC+cyFC/qXizvb3HWw8dOnRct+vNgz2VlJQQiUSIRCIMGTIkq6464tVsqqV8YSXb1++joHcek2aM4vwrRnrnhs71QN3xZvtTQBR4DbgWmAB8+QRiuBDYZGabw4B+DdwArI8r8wXgcTPbD9A8EqOZvdNcwMxqJO0ChgC1J3D8lIl/vFUYRw/sZv5z77FxWQN1B4K2jj59+jBx4sSWbtcLCwvTHHXXmRnV79RS/uIWqt+ppahvPhd//CzOvWyEj8PhnEuYSCaa2XkAkn4KnOh7IyOA7XHzVcD0VmXGhvt/neD214Nm9lJ8AUkXAr2A9+IWf0/St4CXgfuaB+A6WR5dvJG6aBPjcncxJa+KQjURM6g5WMyNM2a0dLuerVcdzcyM7ev3Ub6wkh3vHeC04l5cctPZnPOxEeQXZEc3K8651EuUSFqaSs2ssQsnxbY2aH0fLQ+IAJcDI4HXJJ1rZrUAkoYBvwA+Y2bNT4rdD7xPkFzmAl8HHvrQwaU7gTsBSkpKTjT2hJoHtjpivaiK9aeqqR81sX5EyeOxSy/t1mOlg5lRuXYv5Qsr2VV5kD4DCrj0lrFMuGQYefmeQJxzx0uUSCZJOhhOCygK5zvb11YVED8a/Uigpo0yy80sCmyRtJEgsayQVAy8CDxgZsubNzCzHeFkg6SfAV9r6+BmNpcg0VBWVtatnU02D2xVFetPVezYG+XZ/nirxYzNFbspX1TJnu2H6TuokMs/PY7xFw0jNz/7upd3zp0cicYjSfZPzxVARNIYoBq4BbitVZkFwK3APEmDCW51bQ5feHwO+LmZ/SZ+A0nDzGxH2HXLbGBdknGesGwf2Kq1WMx4b+UuyhdVsq/mCP2GFjHjjgmMnX46ubmeQJxziaWspTS8HXY3sJig/eNJM3tb0kNAuZk9H667WtJ6gsd87zWzvZJuBy4FBkmaE+6y+THf+eFLkQIqCJ4sO6mycWCrtsSaYryzYicrF22lducHDBjWm6s+N5Gzy073bkucc53W4cBWPcGpOLBVIk2NMTYuf5+VL1VycE89g0b0oey6Us6aMsQ7S3TOtejOga1cD9EYbWLD6ztYtWQrh/c1MKSkL9feFWHM+YM9gTjnuswTySkgerSJ9a/VsHrJVo4cOMoZZxZz+W3jKTlnYNY/ouycSz9PJD3Y0fpG1r1aTcXSbdQdijI80p+Zn53IyHEDPIE457qNJ5IeqKGukbWvVPHWy9upPxJl1IQBlF03huGRzBn8yjnXc3gi6UHqj0RZ86ftrHmlioYPGhl93iDKri3ljDP7pTs051wP5omkB6g7dJSKl7ezdlkV0fomxkwaTNl1pQwd3dE7o845lzxPJFnsyIEGKpZuY92r1TRGY5x9wVCmXlvK4JF90h2ac+4U4okkCx3eX8+qJdtY/+caYo0xIheeztRZpQwc1jvdoTnnTkGeSLLIwb11rFq8jQ1/qYEYjLvoDC6YNZr+Q09Ld2jOuVOYJ5IsULvrA1a9tJWNy9+HHJjwkeFccHUJxYOzu5NI51zP4Ikkg+1//wjliyp5982d5OTlcM5lI7jg6hL6DMjeQbKccz2PJ5IMtLf6MOULK9m0ahd5+TlMmjmKyVeV0Ltf9o3t7pzr+TyRZJDd2w6x4sUtbHlrD/mFuVxwzWgmzxxFUd9e6Q7NOefa5YkkA7y/5QDlCyvZunYvvYrymPZ3pZw/YxSFvfPTHZ
pzznXIE0ka1bxbS/nCLWzfsJ/C3vlMv/5MzrtiJAVF/mtxzmUPP2OdZGZG1cb9lL9YSc27tRT1zefiT5zFuZeOoFeh/zqcc9nHz1wniZmxbf0+yl/cwvubD9K7Xy8++skIEz82nPxeyY5q7Jxz6eOJJMXMjMo1eyhfWMmurYfoM6CAy24dy/iPDCMv3xOIcy77pTSRSJoFPEYwZvsTZvZIG2VuBh4EDHjLzG4Ll38GeCAs9l0zeypcPhWYBxQBC4F7LAPHC7aY8d7q3ZQvqmRv1WGKBxdyxe3jGXfRGeTm5aQ7POec6zYpSySScoHHgauAKmCFpOfNbH1cmQhwP3CJme2XNDRcPhD4NlBGkGBWhtvuB34M3AksJ0gks4BFqfoeJyoWMzaV76R80Vb27zhC/9NPY+acCYyddjo5uZ5AnHM9TyqvSC4ENpnZZgBJvwZuANbHlfkC8HiYIDCzXeHya4ClZrYv3HYpMEvSMqDYzP4aLv85MJs0JJIFq6t5dPFGamrrGN6/iK9dOZYJ0VzKF1VyYFcdA4f35urPn8NZU4eS4+OhO+d6sFQmkhHA9rj5KmB6qzJjASS9TnD760Eze6mdbUeEn6o2lp9UC1ZXc/+za6mLNpFjMGjnUTY8uZGamBg8qg+z/ulczpw0BHkCcc6dAlKZSNo6i7Zuy8gDIsDlwEjgNUnnJti2M/sMDi7dSXALjJKSks5F3EmPLt5IXbSJsUdzuKIun2LLYUdujIqh4ulvTPPx0J1zp5RUJpIqYFTc/Eigpo0yy80sCmyRtJEgsVQRJJf4bZeFy0d2sE8AzGwuMBegrKysWxvja2rrAMhBHMwxFhc2UJkXQ0fxJOKcO+WkMpGsACKSxgDVwC3Aba3KLABuBeZJGkxwq2sz8B7wfUkDwnJXA/eb2T5JhyRdBLwB3AH8Vwq/Q5uG9y+iuraOv+U38bf8ppbrpOH9vVv3Zq3bkO69Zhyzp5z0u5AZy+snMa+fxDKtflL2GJGZNQJ3A4uBDcAzZva2pIckXR8WWwzslbQeeAW418z2ho3sDxMkoxXAQ80N78A/A08AmwgSzklvaL/3mnEU5ecGCSRMIkX5udx7zbiTHUpGam5Dqq6tw4Dq2jruf3YtC1ZXpzu0jOD1k5jXT2KZWD/KwFcwul1ZWZmVl5d36z4z7S+CVGiKGYfqoxyqb+Rg+PNQfSMH66Ityw81NM8fK7Ou+gCNsZ7/78q5TDKifxGv3zejW/cpaaWZlXVUzt9s76LZU0ZkdOKIxYzDRxvDk3+Ug3WNx07+9VEOtkoOQZnj548cberwOIX5OfQtzKdvYR7F4c9ESeSemZHu/JpZ6bGX3213ndeP109H2quf5rbbdPBE0kWpvCIxMz442hT3V35w4j92NfDhpHAo/oqhPsrhhkY6utjslZtD38K8IAkUBUlgaN/C4+bjk0Rx3Hzzul5tvKV/ySN/orqNf9Qj+hfxlavGdksdZbPfrqzy+knA6yex9uonnW20nki6IP49Ejh2jxLghsnDqY/GPnSCP1jfOgEESaGtMocbGmnq4NZQbo6OuwroW5jHqIGntczHn/TbSgp9C/MoTFFfX/deM+64+gFvQ4rn9ZOY109imVg/3kbSBe39xZ2j4AQfbUpcpxL0Lfjwif3YCT9YV1zYev5YmaL83Ix+1PhUaENKhtdPYl4/iZ2s+ulsG4knki4Yc9+Lbb8FCdx12VktSaG4naTQu1eed5vinMt43tieQs3vkbQ2on8R9107Pg0ROedc+nh3tF3Q8h5JnHTfo3TOuXTxK5IuaL4X6fdwnXPOE0mXZfp7JM45d7L4rS3nnHNJ8UTinHMuKZ5InHPOJcUTiXPOuaR4InHOOZcUTyTOOeeS4onEOedcUjyROOecS4onEuecc0lJaSKRNEvSRkmbJN3Xxvo5knZLqgg//xguvyJuWYWkekmzw3XzJG2JWzc5ld/BOedcYinrIkVSLvA4cBVQBayQ9LyZrW9V9G
kzuzt+gZm9AkwO9zMQ2AQsiStyr5n9NlWxO+ec67xUXpFcCGwys81mdhT4NXBDF/ZzE7DIzD7o1uicc851i1QmkhHA9rj5qnBZazdKWiPpt5JGtbH+FuBXrZZ9L9zmh5IKuile55xzXZDKRNLWEICtBxb8A1BqZucDfwSeOm4H0jDgPGBx3OL7gfHANGAg8PU2Dy7dKalcUvnu3bu79g2cc851KJWJpAqIv8IYCdTEFzCzvWbWEM7+BJjaah83A8+ZWTRumx0WaAB+RnAL7UPMbK6ZlZlZ2ZAhQ5L8Ks4559qTykSyAohIGiOpF8EtqufjC4RXHM2uBza02settLqt1byNJAGzgXXdHLdzzrkTkLKntsysUdLdBLelcoEnzextSQ8B5Wb2PPAlSdcDjcA+YE7z9pJKCa5o/q/VrudLGkJw66wCuCtV38E551zHZNa62aLnKSsrs/Ly8nSH4ZxzWUXSSjMr66icv9nunHMuKZ5InHPOJeWUuLUlaTewNUW7HwzsSdG+ewKvn8S8fhLz+kks1fUz2sw6fOz1lEgkqSSpvDP3EE9VXj+Jef0k5vWTWKbUj9/acs45lxRPJM4555LiiSR5c9MdQIbz+knM6ycxr5/EMqJ+vI3EOedcUvyKxDnnXFI8kTjnnEuKJ5JuIOmTkt6WFJOU9kfxMkVHQy2fyiQ9KWmXJO90tA2SRkl6RdKG8P/WPemOKZNIKpT0pqS3wvr5Tjrj8UTSPdYBnwBeTXcgmSJuqOVrgYnArZImpjeqjDIPmJXuIDJYI/CvZjYBuAj4F//3c5wGYIaZTSIYlnyWpIvSFYwnkm5gZhvMbGO648gw3TXUco9kZq8S9Hjt2hCOO7QqnD5EMMREWyOsnpLCMZkOh7P54SdtT055InGp0tmhlp1LKBxSYgrwRnojySySciVVALuApWaWtvpJ2XgkPY2kPwJntLHq383s9yc7nizQmaGWnUtIUh/gd8CXzexguuPJJGbWBEyW1B94TtK5ZpaWNjdPJJ1kZlemO4Ys0+FQy84lIimfIInMN7Nn0x1PpjKzWknLCNrc0pJI/NaWS5UOh1p2rj3hUNo/BTaY2X+kO55MI2lIeCWCpCLgSuBv6YrHE0k3kPRxSVXAxcCLkhanO6Z0M7NGoHmo5Q3AM2b2dnqjyhySfgX8FRgnqUrS59MdU4a5BPgHYIakivBzXbqDyiDDgFckrSH4o22pmb2QrmC8ixTnnHNJ8SsS55xzSfFE4pxzLimeSJxzziXFE4lzzrmkeCJxzjmXFE8kLm0kHe641Antr9R70z1G0jfScMx5km462cd16eWJxLkukpR0zxBhL8mpcsKJJMXxuB7KE4nLKJJGS3pZ0prwZ0m4/CxJyyWtkPRQgquZXEk/CcdoWCKpKNx2VdwxIpJWhtOVkn4Qju3wpqSzw+VDJP0uPN4KSZeEyx+UNFfSEuDnkuZI+r2kl8KxV74dd5wFklaGsdwZt/xw+B3eAC6W9K3wGOvCfSsst0zSDyW9Go7LMU3Ss5LelfTduP3dHsZeIel/w878HgGKwmXz2yvXVjxx+50g6c24+dLwBTjai7nV77JS0uBwuizsxgNJvRWMx7JC0mpJ3it0tjMz//gnLR/gcBvL/gB8Jpz+HLAgnH4BuDWcvqudbUsJxrGYHM4/A9weTr8St/z7wBfD6UqCjjcB7gBeCKd/CXw0nC4h6KoD4EFgJVAUzs8BdgCDgCKCvo7KwnUDw5/NyweF8wbcHBf3wLjpXwB/H04vA34QTt9D0FfZMKCAoC+zQcCEsM7yw3I/Au5oXb8dlDsunlZ1WgGcGU5/HXigg5jnATfF1e3gcLoMWBZX/82/l/7AO0DvdP979E/XP35F4jLNxQQncQhOUB+NW/6bcPqXrTeKs8XMKsLplQTJBeAJ4LPhX+GfarWPX8X9bP6L/Ergv8Nuup8HiiX1Ddc9b2Z1cdsvNbO94bJn42L+kqS3gOUEHVhGwuVNBJ0RNr
tC0huS1gIzgHPi1jX3T7YWeNuCcToagM3hPmcCU4EVYawzgTPbqJdE5VrHE+8Z4OZw+lPA052IuSNXA/eFcSwDCgmStctS3vuvy3Qn2odPQ9x0E8HVAAQnym8DfwJWmtnedo7RPJ0DXNwqYRDewTnSQYwm6XKCZHSxmX0Q3tYpDNfXW9AFOJIKCa4Oysxsu6QH48rFf59Yq+8WI/j/K+ApM7ufxBKVa4mnDU8Dv5H0LMF4Su92IuZmjRy7fR6/XsCN5oPB9Rh+ReIyzV8IegoG+DTw53B6OXBjOH1L6406Ymb1BB1I/hj4WavVn4r7+ddweglBp5MASJqcYPdXSRqooBfW2cDrQD9gf5hExhMMF9uW5hPsHgVjb5zoE08vAzdJGhrGOVDS6HBdVEFX7B2Va5eZvUeQkL/JsauRzsZcSXAVBMd+dxD8Hr4Y1xY0paM4XGbzROLS6TQFPd82f74KfIngFtQagt5f7wnLfhn4atj4Oww40IXjzSe4eljSanlB2NB8D/CVcNmXgDIFjf7rCdpl2vNngttwFcDvzKwceAnIC7/HwwSJ8EPMrBb4CcGtqwUEPbl2mpmtBx4AloTHWkpQPwBzgTWS5ndQriNPA7cT3OY6kZi/Azwm6TWCZNTsYYKhYdcoeFz74U7G4TKU9/7rsoKk04A6MzNJtxA0vJ/Q0z6Svgb0M7Nvxi2rJLhFs6eLcc0Jt7+7o7LO9VTeRuKyxVSCxm8BtQRPdHWapOeAswgahp1z3civSJxzziXF20icc84lxROJc865pHgicc45lxRPJM4555LiicQ551xS/h+VYmRKpApf4wAAAABJRU5ErkJggg==\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "def plot_cv(clf, params_grid, param = 'C'):\n", + " params = [x for x in params_grid[param]]\n", + " \n", + " keys = list(clf.cv_results_.keys()) \n", + " grid = np.array([clf.cv_results_[key] for key in keys[6:16]])\n", + " means = np.mean(grid, axis = 0)\n", + " stds = np.std(grid, axis = 0)\n", + " print('Performance metrics by parameter')\n", + " print('Parameter Mean perforance STD performance')\n", + " for x,y,z in zip(params, means, stds):\n", + " print('%8.2f %6.5f %6.5f' % (x,y,z))\n", + " \n", + " params = [math.log10(x) for x in params]\n", + " \n", + " plt.scatter(params * grid.shape[0], grid.flatten())\n", + " p = plt.scatter(params, means, color = 'red', marker = '+', s = 300)\n", + " plt.plot(params, np.transpose(grid))\n", + " plt.title('Performance metric vs. 
log parameter value\\n from cross validation')\n", + "    plt.xlabel('Log hyperparameter value')\n", + "    plt.ylabel('Performance metric')\n", + "    \n", + "plot_cv(clf, param_grid) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the mean AUCs are within 1 standard deviation of each other. The AUC for the hyperparameter value of 10 is not significantly better than the other values tested. \n", + "\n", + "Now you will perform the outer loop of the nested cross validation by executing the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": 224, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Mean performance metric = 0.667\n", + "STD of the metric = 0.050\n", + "Outcomes by cv fold\n", + "fold 1 0.745\n", + "fold 2 0.701\n", + "fold 3 0.661\n", + "fold 4 0.687\n", + "fold 5 0.712\n", + "fold 6 0.657\n", + "fold 7 0.563\n", + "fold 8 0.695\n", + "fold 9 0.627\n", + "fold 10 0.625\n" + ] + } + ], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(clf, Features, Labels, \n", + " cv = outside) # Use the outside folds\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "\n", + "print('STD of the metric = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('fold ' + str(i+1) + ' %4.3f' % x)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The performance metric is not significantly different from that for the inner loop of the cross validation. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Test the model\n", + "\n", + "With the features selected and the optimal hyperparameters estimated, it is time to test the model. The code in the cell below does the following processing:\n", + "1. Split the reduced feature subset of the data into training and test subsets.\n", + "2. 
Define and fit a model using the optimal hyperparameter. \n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": 225, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "LogisticRegression(C=10, class_weight={0: 0.1, 1: 0.9}, dual=False,\n", + " fit_intercept=True, intercept_scaling=1, max_iter=100,\n", + " multi_class='ovr', n_jobs=1, penalty='l2', random_state=None,\n", + " solver='liblinear', tol=0.0001, verbose=0, warm_start=False)" + ] + }, + "execution_count": 225, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(9988)\n", + "indx = range(Features_reduced.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "x_train = Features_reduced[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "x_test = Features_reduced[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])\n", + "\n", + "## Define and fit the logistic regression model,\n", + "## with one class weight per label value\n", + "logistic_mod = linear_model.LogisticRegression(C = 10, class_weight = {0:0.1, 1:0.9}) \n", + "logistic_mod.fit(x_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to score the model and display a sample of the resulting probabilities. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 226, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[0.70177629 0.29822371]\n", + " [0.50919021 0.49080979]\n", + " [0.86044445 0.13955555]\n", + " [0.30472088 0.69527912]\n", + " [0.38551875 0.61448125]\n", + " [0.96476496 0.03523504]\n", + " [0.33723633 0.66276367]\n", + " [0.79788711 0.20211289]\n", + " [0.87142481 0.12857519]\n", + " [0.86314312 0.13685688]\n", + " [0.97290684 0.02709316]\n", + " [0.7790572 0.2209428 ]\n", + " [0.88528488 0.11471512]\n", + " [0.7970326 0.2029674 ]\n", + " [0.95768334 0.04231666]]\n" + ] + } + ], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "probabilities = logistic_mod.predict_proba(x_test)\n", + "print(probabilities[:15,:])\n", + "scores = score_model(probabilities, 0.3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the model scored, execute the code in the cell below to display performance metrics for the model." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 227, + "metadata": { + "scrolled": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " Confusion matrix\n", + " Score positive Score negative\n", + "True positive 134 71\n", + "True negative 30 65\n", + "\n", + "Accuracy 0.66\n", + " \n", + " Positive Negative\n", + "Num case 205.00 95.00\n", + "Precision 0.82 0.48\n", + "Recall 0.65 0.68\n", + "F1 0.73 0.56\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYoAAAEWCAYAAAB42tAoAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzt3XeYFFXWx/HvAQUMiAnXlSAoGAAVccSIYEYMsOIiKgoqsuY1rrjuKrrua1izYgBM6yoYUXRxMREURYKgEkQJCgMGQFBQQMJ5/7g10AwzPT3D9FSH3+d5+qG7qrr6dNHTp++9VeeauyMiIlKaanEHICIimU2JQkREklKiEBGRpJQoREQkKSUKERFJSolCRESSUqKQlJnZWWb2VtxxZBIzW2Zmu8Xwuo3MzM1ss6p+7XQwsylm1q4Cz9NnsgooUWQpM/vazJZHX1TfmdlTZrZ1Ol/T3Z919+PS+RqJzOxQM3vPzJaa2U9m9rqZNauq1y8hnhFm1jNxmbtv7e6z0vR6e5jZi2a2MHr/n5nZVWZWPR2vV1FRwmqyKftw9+buPqKM19koOVb1ZzJfKVFkt5PdfWugJbA/cH3M8VRISb+KzewQ4C3gNWAXoDHwKTA6Hb/gM+2XuZntDnwMzAX2cfc6wB+BAqB2Jb9WbO890467lMLddcvCG/A1cEzC4zuB/yY8rgncBcwBvgceBbZIWN8RmAT8DMwE2kfL6wCPA98C84BbgerRuh7AB9H9R4G7isX0GnBVdH8X4GVgATAbuDxhuz7AS8B/otfvWcL7ex94uITlbwL/ju63AwqBvwILo2NyVirHIOG51wHfAc8A2wFvRDEvju7Xj7b/J7AGWAEsAx6KljvQJLr/FNAX+C+wlPBFv3tCPMcB04GfgIeBkSW992jb/yT+f5awvlH02t2j97cQuCFhfWvgI2BJ9H/5EFAjYb0DlwBfAbOjZfcTEtPPwASgTcL21aPjPDN6bxOABsCoaF+/RMfl9Gj7kwifryXAh8C+xT671wGfASuBzUj4PEexj4/i+B64J1o+J3qtZdHtEBI+k9E2zYG3gR+j5/417r/VXLjFHoBuFfyP2/APqz7wOXB/wvr7gCHA9oRfoK8Dt0XrWkdfVscSWpX1gL2ida8CjwFbATsBY4E/RevW/VECR0RfKhY93g5YTkgQ1aIvkhuBGsBuwCzg+GjbPsAqoFO07RbF3tuWhC/lI0t43+cC30b32wGrgXsISaFt9IW1ZwrHoOi5d0TP3QLYAegcvX5t4EXg1YTXHkGxL3Y2ThQ/Rsd3M+BZYFC0bsfoi+/UaN2fo2NQWqL4Djg3yf9/o+i1+0ex70f40t07Wn8AcHD0Wo2AacAVxeJ+Ozo2RcmzW3QMNgOujmKoFa27lvAZ2xOw6PV2KH4MosetgB+AgwgJpjvh81oz4bM7iZBotkhYVvR5/gg4O7q/NXBwsfe8WcJr9WD9Z7I2ISleD
dSKHh8U999qLtxiD0C3Cv7HhT+sZYRfdw68C2wbrTPCF2bir9lDWP/L8THg3hL2+bvoyyax5XEGMDy6n/hHaYRfeEdEjy8A3ovuHwTMKbbv64Eno/t9gFFJ3lv96D3tVcK69sCq6H47wpf9VgnrXwD+nsIxaAf8VvRFWEocLYHFCY9HUHaiGJCwrgPwRXT/HOCjhHVGSLSlJYpVRK28UtYXfWnWT1g2FuhayvZXAIOLxX1UGZ+xxcB+0f3pQMdStiueKB4B/lFsm+lA24TP7nklfJ6LEsUo4GZgx1Lec2mJ4gxgYjr/7vL1pv7B7NbJ3d8xs7bAc4RfrUuAuoRfxRPMrGhbI/y6g/BLbmgJ+9sV2Bz4NuF51QhfaBtwdzezQYQ/zlHAmYTukqL97GJmSxKeUp3QnVRko30mWAysBX4PfFFs3e8J3SzrtnX3XxIef0No1ZR1DAAWuPuKdSvNtgTuJSSj7aLFtc2suruvSRJvou8S7v9K+EVMFNO69xwdv8Ik+1lEeK8Vej0z24PQ0iogHIfNCK28RBv8H5jZ1UDPKFYHtiF8piB8ZmamEA+E///uZnZZwrIa0X5LfO1izgduAb4ws9nAze7+RgqvW54YpRw0mJ0D3H0k4dfsXdGihYRuoObuvm10q+Nh4BvCH+nuJexqLqFFsWPC87Zx9+alvPRA4DQz25XQing5YT+zE/axrbvXdvcOiWEneT+/ELof/ljC6i6E1lOR7cxsq4THDYH5KRyDkmK4mtC1cpC7b0PoXoOQYJLGnIJvCS2lsMOQveqXvjnvELrBKuoRQpJtGr2Xv7L+fRRZ937MrA1h3KALsJ27b0vonix6TmmfmZLMBf5Z7P9/S3cfWNJrF+fuX7n7GYSuzzuAl6L/47KOf3lilHJQosgd9wHHmllLd19L6Lu+18x2AjCzemZ2fLTt48C5Zna0mVWL1u3l7t8SzjS628y2idbtHrVYNuLuEwkDvwOAYe5e1IIYC/xsZteZ2RZmVt3MWpjZgeV4P70Jv0ovN7PaZradmd1K6D66udi2N5tZjejL7iTgxRSOQUlqE5LLEjPbHrip2PrvCeMtFfFfYB8z6xSd6XMJsHOS7W8CDjWzf5nZzlH8TczsP2a2bQqvV5swJrLMzPYCLkph+9WE/8/NzOxGQouiyADgH2bW1IJ9zWyHaF3x49IfuNDMDoq23crMTjSzlM7WMrNuZlY3+j8s+kytiWJbS+n/B28AO5vZFWZWM/rcHJTKa0pyShQ5wt0XAP8m9M9D+HU4AxhjZj8TfqHuGW07ljAofC/hV+NIQncBhL70GsBUQhfQSyTvAhkIHEPo+iqKZQ1wMqGPfzbh1/0AwhlVqb6fD4DjCYO/3xK6lPYHDnf3rxI2/S6Kcz5h8PhCdy/qrir1GJTiPsLA8EJgDPC/YuvvJ7SgFpvZA6m+l+j9LCS0kO4kdCs1I5zZs7KU7WcSkmIjYIqZ/URosY0njEuV5RpCd+BSwhf382VsP4xwRtmXhGO9gg27h+4hjP+8RUhAjxOOFYQxp6fNbImZdXH38YQxq4cI/zczCGMJqWpPeM/LCMe8q7uvcPdfCWefjY5e6+DEJ7n7UsIJGicTPhdfAUeW43WlFEVnrIhknehK3v+4e7IunIxkZtUIp+ee5e7D445HJBm1KESqiJkdb2bbmllN1o8ZjIk5LJEypS1RmNkTZvaDmU0uZb2Z2QNmNiMqTdAqXbGIZIhDCGflLCR0j3Ry9+XxhiRStrR1PZnZEYTz/P/t7i1KWN8BuIxwrvlBhIvFNPAkIpJh0taicPdRhKtUS9ORkETc3ccA25pZKueNi4hIFYrzgrt6bHhWRWG07NviG5pZL6AXwFZbbXXAXnvtVSUBioiU6efpsGY5VN+i7G3j8MNKWLaaCWt8obvXrcgu4kwUxS/+gVIuqHH3fkA/gIKCAh8/fnw64xIRSd077cK/x4yIM4oNFQ0pmMEjj8APP2B9+nxT0d3FedZTI
eGS+yL1CefCi4hkvhn9QpJYPCnuSDY0bx507AjPRZc2XXQR3FT82tHyiTNRDAHOic5+Ohj4KboyWEQk8339XEgS27WERmfGHU1oRfTvD82awTvvwLJllbbrtHU9mdlAQoXOHaPiZzcRCs7h7o8SitJ1IFy1+SvhSmERkcwyo19ICsUVJYlM6HKaORMuuACGD4cjjwwJY/fKK3uVtkQRFfVKtr5o4hQRkcyV2HJIlCktCYDPP4cJE6BfP+jZM4xNVCKVGRcRKVJS6yGTWg6JJk+GTz6Bc86BTp1g1izYYYeyn1cBKuEhIlKkqPWQKJNaDgC//QZ9+kCrVnDDDbAimlIlTUkC1KIQEdlQJrYeinz8MZx/PkyZAt26wb33Qq1aaX9ZJQoRiU9pA8VxKWksIlPMmwdt2sDvfgdvvAEnnlhlL62uJxGJT0ldPXHKtG4mgC+/DP/WqwfPPx9aE1WYJEAtChGJWyZ39cRpyRL4y19gwAAYMQKOOAL+8IdYQlGiEBHJNEOGhCuqv/sOrr0WDizPLMKVT4lCRNar6jGDTB4TiEvPnvD447DPPvDaa1BQEHdEShQikqC0i8vSJRPHBOKQWMSvoAB23RWuuw5q1Ig3rogShUg+SLWlkKkXl+WyuXPhwguha1c4++xwP8PorCeRfJDq2UX6hV911q4NJcCbNw+D1StXxh1RqdSiEMkXailkjq++CmMRo0bBMceEGk2NG8cdVanUohDJdTP6wQ8j445CEk2dCp99Bk88AW+9ldFJAtSiEMl9RWMT6lKK16efwqRJ0L17mFho1izYbru4o0qJEoVItinvKayLJ8FObaFJr/TFJKVbuRJuvRVuvx1+/3s4/fRQnylLkgSo60kk+5S37IUGqOPz0Uew//4hUZx5JkycWCVF/CqbWhQicarIBW46hTU7zJsHbdvCzjvD0KFwwglxR1RhalGIxKkiRfHUQshs06aFf+vVgxdeCEX8sjhJgFoUIvFT6yA3LF4MV18NTz4ZTntt0ybMPJcDlChERDbV4MFw8cWwYAFcf33sRfwqmxKFiMimOO+80Ipo2RL++98wRWmOUaIQESmvxCJ+Bx8MTZvCNdfA5pvHG1eaKFGIiJTHN9/An/4UTnc95xzolfvXp+isJxGRVKxdC337QosW8MEHsGpV3BFVGbUoRETKMn16KOL3wQdw3HHw2GPQqFHcUVUZJQqRdCrrgjrN8JYdpk8P10M89VTobjKLO6Iqpa4nkXQq64I6XTyXuSZODGczAZxySiji17173iUJUItCJP10QV12WbECbrkF7rwzXF19xhmhPtO228YdWWzUohBJF80DkX1Gjw7XQ9x2W+himjQpK4v4VTa1KETSRfNAZJd58+DII0MrYtiwMGgtgFoUIumleSAy39Sp4d969eDll+Hzz5UkilGiEJH89OOP0KMHNG8eivgBnHwybL11rGFlInU9iVSW4qfC6tTXzPXyy3DJJbBoEdxwA7RuHXdEGU0tCpHKUvxUWJ36mpl69IDTTgtdTePGhdnnNGCdlFoUIpVJp8JmpsQifoceCnvvHeaO2ExfgalI61Eys/bA/UB1YIC7315sfUPgaWDbaJve7j40nTGJlKoi05ImUldTZpo9OxTu69YtXDCXB0X8Klvaup7MrDrQFzgBaAacYWbNim32N+AFd98f6Ao8nK54RMpUkWlJE6mrKbOsWQMPPBCK+I0Zs75VIeWWzhZFa2CGu88CMLNBQEdgasI2DmwT3a8DzE9jPCIlK2pJFLUI1HWU/aZNg/PPh48+CvNVP/ooNGwYd1RZK52Joh4wN+FxIXBQsW36AG+Z2WXAVsAxJe3IzHoBvQAa6j9bKltiklCLIDfMmBEK+T3zDJx1Vl7WZ6pM6TzrqaT/meJtvzOAp9y9PtABeMbMNorJ3fu5e4G7F9StWzcNoUreKiqzUdSS0MVx2WvCBHjiiXD/5JPD2ES3bkoSlSCdiaIQaJDwuD4bdy2dD7wA4
O4fAbWAHdMYk8iGVGYj+y1fDr17w0EHwT/+EYr6AWyzTfLnScrSmSjGAU3NrLGZ1SAMVg8pts0c4GgAM9ubkCgWpDEmkY2pzEb2GjUK9tsP7rgjXB8xcaKuiUiDtI1RuPtqM7sUGEY49fUJd59iZrcA4919CHA10N/MriR0S/Vw16kJIpKCefPg6KOhQQN4551wX9IirddRRNdEDC227MaE+1OBw9IZg4jkmM8/h332CVdWDx4cKr5utVXcUeU0lfAQkeywcCGcfTbsu+/6In4nnaQkUQV0/bqIZDZ3ePFFuPRSWLwYbropDFxLlVGikNxWVlkOld3IfN27h+shCgrg3XdDt5NUKSUKyW2JF9OVRBfZZabEIn5t24bupiuuUBG/mOioS25SWY7sNWsWXHBBuFju3HNDKQ6JlQazJTepLEf2WbMG7rsvdC2NGwfV9PWUKdSikNyllkT2mDoVzjsPPv4YTjwxFPGrXz/uqCSilC25ZUY/eKfdppULl6o3ezbMnAnPPQevv64kkWHUopDcoi6n7DFuHEyaFMYjTjwxjE3Urh13VFICJQrJTqWd9qrB68z3669w441w772w667hIrpatZQkMpi6niQ7lTYbnVoSmW3EiHCq6913h5aEivhlBbUoJHMlu1hOLYfsU1gIxx4bWhHvvRdqNElWUItCMleyOazVcsgen34a/q1fH157DT77TEkiy6hFIfHTeENuWrAA/vxnGDgwdDm1bQsdOsQdlVSAWhQSP4035Bb3kByaNYOXXoKbb4ZDDok7KtkEKbUoohnqGrr7jDTHI/lKLYfccfbZ8OyzocLr449D8+ZxRySbqMwWhZmdCHwOvB09bmlmg9MdmIhkkbVr1xfyO/JIuOceGD1aSSJHpNL1dAtwELAEwN0nAU3SGZSIZJEZM8I0pE8+GR6ffz5ceSVUrx5vXFJpUul6WuXuS8wscZnmtZaKKWngWnNCZKfVq0MRv7//HWrWVJXXHJZKi2KamXUBqplZYzO7DxiT5rgkV5U0cK1B6+wzeXIYoL72Wjj++FDUr1u3uKOSNEmlRXEpcCOwFngFGAZcn86gJAdpfojcMmcOfPMNDBoEXbqECYYkZ6WSKI539+uA64oWmNmphKQhkhoV68t+H38cLp7r1StcDzFrFmy9ddxRSRVIpevpbyUsu6GyA5E8UNSSaNIr7kikPH75Ba66KnQ13XknrFwZlitJ5I1SWxRmdjzQHqhnZvckrNqG0A0lIrnuvfdC8b5Zs+Cii+D228PAteSVZF1PPwCTgRXAlITlS4He6QxKRDJAYWEYqG7cGEaOhCOOiDsiiUmpicLdJwITzexZd19RhTFJNiurbpNkvokTYf/9QxG/118PNZq22CLuqCRGqYxR1DOzQWb2mZl9WXRLe2SSnVS3KXt9/z2cfjq0ahVaEADt2ytJSEpnPT0F3ArcBZwAnIvGKKRI8RaETn/NPu6hNtOf/wzLlsGtt8Khh8YdlWSQVFoUW7r7MAB3n+nufwNUTF6C4i0ItRyyz5lnhkJ+e+4Z5rC+4QbYfPO4o5IMkkqLYqWF+h0zzexCYB6wU3rDkqyiFkT2Wbs2XCRnBscdF059veQS1WeSEqXSorgS2Bq4HDgMuAA4L51BiUgaffllqPD6xBPh8bnnwuWXK0lIqcpsUbj7x9HdpcDZAGZWP51BiUgarF4dyn/fdBPUqqVBaklZ0kRhZgcC9YAP3H2hmTUnlPI4ClCyyCc67TW7ffYZnHceTJgAf/gD9O0Lv/993FFJlii168nMbgOeBc4C/mdmNwDDgU+BPaomPMkYOu01uxUWwty58OKL8PLLShJSLslaFB2B/dx9uZltD8yPHk9Pdedm1h64H6gODHD320vYpgvQhzDHxafurm+duCWbM0KD1tnjww9DS+LCC9cX8dtqq7ijkiyUbDB7hbsvB3D3H4EvypkkqgN9CddeNAPOMLNmxbZpSihZfpi7NweuKGf8kg6aMyK7L
VsWrok4/HC4++71RfyUJKSCkrUodjOzolLiBjRKeIy7n1rGvlsDM9x9FoCZDSK0UqYmbHMB0NfdF0f7/KGc8UtFlTbmAGo9ZLO33gplwOfMCae7/t//qYifbLJkiaJzsccPlXPf9YC5CY8LCXNvJ9oDwMxGE7qn+rj7/4rvyMx6Ab0AGjZsWM4wpESJ80MUp9ZDdpo7F048EXbfHUaNCi0KkUqQrCjgu5u475KmvCo+1/ZmQFOgHeEsqvfNrIW7LykWSz+gH0BBQYHm664sajXkhgkT4IADoEEDGDoU2rQJp7+KVJJUrsyuqEKgQcLj+oQB8eLbjHH3VcBsM5tOSBzj0hhX/kk2OC3Z67vv4LLL4KWXYMSIUOX12GPjjkpyUCpXZlfUOKCpmTU2sxpAV2BIsW1eJaobZWY7ErqiZqUxpvykwenc4g5PPw3NmoUy4P/3fyriJ2mVcovCzGq6+8pUt3f31WZ2KTCMMP7whLtPMbNbgPHuPiRad5yZTQXWANe6+6LyvQVJibqZckfXrvDCC3DYYTBgAOy1V9wRSY4rM1GYWWvgcaAO0NDM9gN6uvtlZT3X3YcCQ4stuzHhvgNXRTcRKU1iEb8OHcI4xMUXQ7V0dgqIBKl8yh4ATgIWAbj7p6jMuEjV+eKLMA3p44+Hx927w6WXKklIlUnlk1bN3b8ptmxNOoIRkQSrVoXxh/32g6lTYeut445I8lQqYxRzo+4nj662vgzQVKiZrLRZ5yR7TJoUyn9PmgSnnQYPPgg77xx3VJKnUmlRXEQYQ2gIfA8cHC2TTKVZ57Lfd9+F28svh0J+ShISo1RaFKvdvWvaI5HKpbOcss8HH4QifhdfDO3bw8yZsOWWcUclklKLYpyZDTWz7mZWO+0RScXN6AfvtCu5HLhkrqVLw+B0mzZw333ri/gpSUiGKDNRuPvuwK3AAcDnZvaqmamFkYkS6zepqyk7DBsGLVrAww+Hiq+ffKIifpJxUrrgzt0/BD40sz7AfYQJjQalMa78lKyiaypU9TW7zJ0LJ50ETZqEbiddXS0ZqswWhZltbWZnmdnrwFhgAaBPdDqUNotcqtSSyHzuMHZsuN+gAbz5JkycqCQhGS2VFsVk4HXgTnd/P83xiFoEuevbb8McEYMHry/id8wxcUclUqZUEsVu7r427ZGI5Cp3eOopuOoqWLEC7rgj1GkSyRKlJgozu9vdrwZeNrON5oBIYYY7EQHo0iWUAm/TJhTx22OPuCMSKZdkLYrno3/LO7OdiKxZEwr4VasGJ58MRx0Ff/qT6jNJVir1U+vu0Ygbe7v7u4k3YO+qCU8kC02bFloPRUX8zjkHLrpISUKyViqf3PNKWHZ+ZQcikvVWrYJbb4WWLWH6dKhTJ+6IRCpFsjGK0wmz0jU2s1cSVtUGlpT8LJE8NXEi9OgRSnCcfjo88ADstFPcUYlUimRjFGMJc1DUB/omLF8KTExnUHlDVV5zx/ffw8KF8Oqr0LFj3NGIVKpSE4W7zwZmA+9UXTh5JrHkBuiCuWwzahR8/nm4NqJ9e5gxA7bYIu6oRCpdsq6nke7e1swWA4mnxxphFtPt0x5dPtAFdtnn55+hd2945JFwqmvPnqE+k5KE5Khkg9lF053uCNRNuBU9Fsk/Q4dC8+bw2GPhAjoV8ZM8kOz02KKrsRsA1d19DXAI8CdgqyqITSSzzJ0bxh/q1IEPP4S774at9KcguS+V02NfJUyDujvwb8I1FJtQ4lQki7jDmDHhfoMG8NZboRVx0EHxxiVShVJJFGvdfRVwKnCfu18G1EtvWCIZYP586NQJDjkERo4My448EmrUiDcukSqWSqJYbWZ/BM4G3oiWbZ6+kPLEjH7ww8i4o5CSuIeaTM2ahRbEXXepiJ/ktVSqx54HXEwoMz7LzBoDA9MbVh4oun5Cp8NmntNOg1deCWXABwwIEwuJ5LEyE4W7Tzazy4EmZrYXMMPd/5n+0PLATm2hSa+4oxDYsIhfp05w3
HFwwQWqzyRCajPctQFmAI8DTwBfmpna4ZI7Jk8OXUtFRfzOPluVXkUSpPKXcC/Qwd0Pc/dDgROB+9MblkgV+O03uPlmaNUKZs6E7baLOyKRjJRKoqjh7lOLHrj7NECnfVTUjH7wTrtNmxtbNt2ECXDAAdCnD/zxjzB1ahibEJGNpDKY/YmZPQY8Ez0+CxUFrLjE+k4ayI7PokWwZAm8/jqcdFLc0YhktFQSxYXA5cBfCHWeRgEPpjOonFRUKbYoSai+U9UbPjwU8bv88jBY/dVXUKtW3FGJZLykicLM9gF2Bwa7+51VE1KOUksiPj/9BH/5C/TrB3vtFQaqa9ZUkhBJUaljFGb2V0L5jrOAt82spJnuJBVFF9cVtSR0SmzVef31cOHcgAFwzTVhbEJF/ETKJVmL4ixgX3f/xczqAkMJp8dKeeniunjMnQudO4dWxKuvwoEHxh2RSFZKdtbTSnf/BcDdF5SxrZRFF9dVDfdQ2RXWF/EbP15JQmQTJGtR7JYwV7YBuyfOne3up5a1czNrT7jmojowwN1vL2W704AXgQPdfXyqwWek4tObgqY4rSqFhXDRRfDGGzBiRCjB0a5d3FGJZL1kiaJzsccPlWfHZladMNf2sUAhMM7MhiRekxFtV5twVtXH5dl/xio+vSloADvd1q6F/v3h2mth9Wq45x44/PC4oxLJGcnmzH53E/fdmlAXahaAmQ0COgJTi233D+BO4JpNfL146fTX+HTuHMYgjjoqJIzddos7IpGcks5xh3rA3ITHhRSbx8LM9gcauPsbJGFmvcxsvJmNX7BgQeVHWhl0+mvVWr06tCQgJIr+/eGdd5QkRNIgnYnCSljm61aaVSPUkbq6rB25ez93L3D3grp1M3C6bp3+WrU++yxMJtS/f3jcrRv07Bmqv4pIpUs5UZhZeU8+LyTMt12kPjA/4XFtoAUwwsy+Bg4GhphZQTlfJ346/bVqrFwJN90UajR98w1k4o8GkRyUSpnx1mb2OfBV9Hg/M0ulhMc4oKmZNTazGkBXYEjRSnf/yd13dPdG7t4IGAOckrVnPen01/QaNy5Ueb3lFjjjDJg2DU4t88Q7EakEqbQoHgBOAhYBuPunwJFlPcndVwOXAsOAacAL7j7FzG4xs1MqHrLkpcWLYdkyGDoU/v1v2GGHuCMSyRupFAWs5u7f2Ib9v2tS2bm7DyVc0Z247MZStm2Xyj4lj7z3Xiji9+c/hyJ+X36p8hsiMUilRTHXzFoDbmbVzewK4Ms0xyX5bMmSMA3p0UfDY4+FsQlQkhCJSSqJ4iLgKqAh8D1h0PmidAYleey110IRvyeeCBVfVcRPJHZldj25+w+EgWiR9JozJ8w2t/feMGQIFGTfCXAiuajMRGFm/Um4/qGIu+sUH9l07vDBB9CmDTRsGC6aO/hgqKHZdkUyRSpdT+8A70a30cBOwMp0BiV5Ys4cOPFEOOIIGDkyLDviCCUJkQyTStfT84mPzewZ4O20RSS5b+1aePRRuO660KJ44AEV8RPJYKmcHltcY2DXyg5E8sipp4ZB62OPDdOTNmoUd0QikkQqYxSLWT9GUQ34EeidzqCyRvGKsVK61auhWrVwO/106NgRevRQfSajhzHxAAAUk0lEQVSRLJA0UVi4ym4/YF60aK27bzSwnbdUMTY1n34K550Xro248MJQgkNEskbSwewoKQx29zXRTUmiOFWMLd2KFfC3v4XTXAsLYeed445IRCoglbOexppZq7RHIrll7FjYf3/45z/hrLNCEb9OneKOSkQqoNSuJzPbLCrsdzhwgZnNBH4hzDPh7p5fyUNzYZfPzz/D8uXwv//B8cfHHY2IbIJkYxRjgVaAfgaC5sJOxVtvwZQpcOWVcMwxMH26ym+I5IBkicIA3H1mFcWSeRJbEZoLu3SLF8NVV8FTT0Hz5nDxxSFBKEmI5IRkiaKumV1V2kp3vycN8WSWxFaEWg8le+UVuOQSWLAArr8ebrxRCUIkx
yRLFNWBrSl57uv8oVZE6ebMga5doUWLMKHQ/vvHHZGIpEGyRPGtu99SZZFIdnCHUaOgbdtQxO+99+Cgg2DzzeOOTETSJNnpsfndkpCNffMNnHACtGu3vojf4YcrSYjkuGSJ4ugqi0Iy29q18NBDYaD6gw/gwQdDWXARyQuldj25+49VGYhksE6d4PXXw/UQjz0Gu6ompEg+SeXK7Pwzox+80y6c8ZSvVq0KLQkItZmefhrefFNJQiQPKVGUJN+L/X3yCbRuHeaMgJAozjlHlV5F8lRF5qPID/l4Wuzy5XDLLfCvf0HdutCgQdwRiUgGUKKQYMwY6N4dvvwylAS/6y7Ybru4oxKRDKBEIcEvv4RxibffDnWaREQiShT5XBX2f/8LRfyuvhqOPhq++AJq1Ig7KhHJMBrMLhq4TpTrg9iLFoVuphNOCGcz/fZbWK4kISIlUIsC8mfg2h1efjkU8fvxxzD73N/+pgQhIkkpUeSTOXPgzDNh333D3BH77Rd3RCKSBdT1lOvcQ+E+CBfLjRgRznBSkhCRFClR5LLZs+G448JAdVERv0MPhc3UkBSR1ClR5KI1a+D++8M8ER9/DI88oiJ+IlJh+f3TckY/+GEk7NQ27kgqV8eO8N//QocOoQyHrrAWkU2Q34mi6PqJXDgVdtUqqF4dqlWDs88O9ZnOPFP1mURkk6W168nM2pvZdDObYWa9S1h/lZlNNbPPzOxdM0tvadKiqrBFt8WTQmuiSa+0vmzajR8PBQWhiwng9NPhrLOUJESkUqQtUZhZdaAvcALQDDjDzJoV22wiUODu+wIvAXemKx5g44vrsv3CuuXL4brrwlSkCxaoBLiIpEU6u55aAzPcfRaAmQ0COgJTizZw9+EJ248BulV6FIklOopKc+TCxXUffRSurv7qK+jZM1R83XbbuKMSkRyUzq6nesDchMeF0bLSnA+8WdIKM+tlZuPNbPyCBQvKF0ViKyLbWxCJli8PEwu98w70768kISJpk84WRUkd5F7ihmbdgAKgxNOP3L0f0A+goKCgxH0klSutiKFDQxG/a6+Fo46CadNg883jjkpEclw6WxSFQOJ5mfWB+cU3MrNjgBuAU9x9ZRrjyV4LF0K3bnDiifDss+uL+ClJiEgVSGeiGAc0NbPGZlYD6AoMSdzAzPYHHiMkiR/SGEt2codBg2DvveGFF+Cmm2DsWBXxE5EqlbZE4e6rgUuBYcA04AV3n2Jmt5jZKdFm/wK2Bl40s0lmNqSU3VVM0QV12WrOnDBg3bgxTJgAffooSYhIlUvrBXfuPhQYWmzZjQn30zuVWjZeUOcO774bZpnbdddQo+nAA8PFdCIiMcjNWk9FF9Zl2wV1M2eGAn7HHru+iN/BBytJiEiscjNRFJ0Smy2nw65ZA/fcA/vsE7qYHntMRfxEJGPkbq2nbDol9uST4c034aSTQhmO+vXjjkhEZJ3cTRSZ7rffwrwQ1apBjx6hkF/XrqrPJCIZJze7njLd2LFwwAHw8MPhcZcuodqrkoSIZKDcShSJg9iZ6Ndf4eqr4ZBDYPFi2H33uCMSESlTbnU9ZfIg9gcfhGsiZs2CP/0J7rgD6tSJOyoRkTLlVqKAzB3ELppYaPhwaNcu7mhERFKWe4kik7z+eijc95e/wJFHwtSpYQBbRCSLZP8YReKsdZkyNrFgQZiG9JRTYODA9UX8lCREJAtlf6LIpPkm3OG550IRv5degltugY8/Vn0mEclqufETN1PGJebMgXPPhf33h8cfh+bN445IRGSTZX+LIm5r18KwYeH+rrvC++/D6NFKEiKSM5QoNsVXX4WZ5tq3h1GjwrLWrVXET0RyihJFRaxeDf/6F+y7L0yaFLqZVMRPRHJUboxRVLWTTgrdTR07hjIcu+wSd0QiGWnVqlUUFhayYsWKuEPJG7Vq1aJ+/fpsXolTJStRpGrlyjBHdbVq0LMnnHce/PGPqs8kkkRhYSG1a9emUaNGmP5W0s7dWbRoE
YWFhTRu3LjS9quup1SMGQOtWkHfvuHxaaeFQn764IsktWLFCnbYYQcliSpiZuywww6V3oJTokjml1/gyivh0ENh6VJo2jTuiESyjpJE1UrH8VbXU2nefz8U8Zs9Gy6+GG67DbbZJu6oRESqnFoUpVm9OoxJjBwZupyUJESy1uDBgzEzvvjii3XLRowYwUknnbTBdj169OCll14CwkB87969adq0KS1atKB169a8+eabmxzLbbfdRpMmTdhzzz0ZVnQNVjFt2rShZcuWtGzZkl122YVOnTptsH7cuHFUr159XazpphZFoldfDUX8rr8+FPGbMkX1mURywMCBAzn88MMZNGgQffr0Sek5f//73/n222+ZPHkyNWvW5Pvvv2fkyJGbFMfUqVMZNGgQU6ZMYf78+RxzzDF8+eWXVC927dX777+/7n7nzp3p2LHjusdr1qzhuuuu4/jjj9+kWMpD34IA338Pl10GL74YBq2vvjrUZ1KSEKk8E66o/MKd27WEA+5LusmyZcsYPXo0w4cP55RTTkkpUfz666/079+f2bNnU7NmTQB+97vf0aVLl00K97XXXqNr167UrFmTxo0b06RJE8aOHcshhxxS4vZLly7lvffe48knn1y37MEHH6Rz586MGzduk2Ipj/zuenKHZ56BZs3gtdfgn/8MZzipiJ9Iznj11Vdp3749e+yxB9tvvz2ffPJJmc+ZMWMGDRs2ZJsUupyvvPLKdd1Eibfbb799o23nzZtHgwYN1j2uX78+8+bNK3XfgwcP5uijj14Xx7x58xg8eDAXXnhhmXFVpvz+yTxnTrgmoqAgXF29115xRySSu8r45Z8uAwcO5IorrgCga9euDBw4kFatWpV6dlB5zxq69957U97W3cv1egMHDqRnz57rHl9xxRXccccdG3VVpVv+JYqiIn4nnBCK+I0eHaq9qj6TSM5ZtGgR7733HpMnT8bMWLNmDWbGnXfeyQ477MDixYs32P7HH39kxx13pEmTJsyZM4elS5dSu3btpK9x5ZVXMnz48I2Wd+3ald69e2+wrH79+sydO3fd48LCQnYppbLDokWLGDt2LIMHD163bPz48XTt2hWAhQsXMnToUDbbbLONBrsrnbtn1e2AAw7wDbzdNtxSMX26e5s27uA+YkRqzxGRCps6dWqsr//oo496r169Nlh2xBFH+KhRo3zFihXeqFGjdTF+/fXX3rBhQ1+yZIm7u1977bXeo0cPX7lypbu7z58/35955plNimfy5Mm+7777+ooVK3zWrFneuHFjX716dYnbPvLII37OOeeUuq/u3bv7iy++WOK6ko47MN4r+L2bH2MUq1fDHXeEIn6ffw5PPglHHBF3VCKSZgMHDuQPf/jDBss6d+7Mc889R82aNfnPf/7DueeeS8uWLTnttNMYMGAAderUAeDWW2+lbt26NGvWjBYtWtCpUyfq1q27SfE0b96cLl260KxZM9q3b0/fvn3XdSN16NCB+fPnr9t20KBBnHHGGZv0epXFvIQ+s0xWsEdtH//wAesXLJ5U9sRFxx8Pb70Fp54aronYeee0xykiMG3aNPbee++4w8g7JR13M5vg7gUV2V/2jVGsWb7h49KmP12xIlwwV7069OoVbp07V02MIiI5JPsSRfUtyp72dPRoOP/8UHrj8suVIERENkFujVEsWxYSQ5s2oUWhJq9I7LKtezvbpeN4506iGDkSWrSAhx6CSy+FyZPh2GPjjkokr9WqVYtFixYpWVQRj+ajqFWrVqXuN/u6npLZcstQ9fWww+KOREQI1w0UFhayYMGCuEPJG0Uz3FWm7Dzr6cul4cErr8AXX8Bf/xoer1mjC+dEREqwKWc9pbXryczam9l0M5thZr1LWF/TzJ6P1n9sZo1S2vF334VZ5jp3hsGD4bffwnIlCRGRSpe2RGFm1YG+wAlAM+AMM2tWbLPzgcXu3gS4F7ijzB0vszBI/cYbYTKhDz9UET8RkTRKZ4uiNTDD3
We5+2/AIKBjsW06Ak9H918CjrayKnJ9tywMWn/6KfTuHa6VEBGRtEnnYHY9YG7C40LgoNK2cffVZvYTsAOwMHEjM+sF9IoerrQPPpisSq8A7EixY5XHdCzW07FYT8divT0r+sR0JoqSWgbFR85T2QZ37wf0AzCz8RUdkMk1Ohbr6Visp2Oxno7FemY2vqLPTWfXUyHQIOFxfWB+aduY2WZAHeDHNMYkIiLllM5EMQ5oamaNzawG0BUYUmybIUD36P5pwHuebefriojkuLR1PUVjDpcCw4DqwBPuPsXMbiHURR8CPA48Y2YzCC2Jrinsul+6Ys5COhbr6Visp2Oxno7FehU+Fll3wZ2IiFSt3Kn1JCIiaaFEISIiSWVsokhb+Y8slMKxuMrMpprZZ2b2rpntGkecVaGsY5Gw3Wlm5maWs6dGpnIszKxL9NmYYmbPVXWMVSWFv5GGZjbczCZGfycd4ogz3czsCTP7wcwml7LezOyB6Dh9ZmatUtpxRSfbTueNMPg9E9gNqAF8CjQrts3FwKPR/a7A83HHHeOxOBLYMrp/UT4fi2i72sAoYAxQEHfcMX4umgITge2ixzvFHXeMx6IfcFF0vxnwddxxp+lYHAG0AiaXsr4D8CbhGraDgY9T2W+mtijSU/4jO5V5LNx9uLv/Gj0cQ7hmJRel8rkA+AdwJ7CiKoOrYqkciwuAvu6+GMDdf6jiGKtKKsfCgW2i+3XY+JqunODuo0h+LVpH4N8ejAG2NbPfl7XfTE0UJZX/qFfaNu6+Gigq/5FrUjkWic4n/GLIRWUeCzPbH2jg7m9UZWAxSOVzsQewh5mNNrMxZta+yqKrWqkciz5ANzMrBIYCl1VNaBmnvN8nQOZOXFRp5T9yQMrv08y6AQVA27RGFJ+kx8LMqhGqEPeoqoBilMrnYjNC91M7QivzfTNr4e5L0hxbVUvlWJwBPOXud5vZIYTrt1q4+9r0h5dRKvS9maktCpX/WC+VY4GZHQPcAJzi7iurKLaqVtaxqA20AEaY2deEPtghOTqgnerfyGvuvsrdZwPTCYkj16RyLM4HXgBw94+AWoSCgfkmpe+T4jI1Uaj8x3plHouou+UxQpLI1X5oKONYuPtP7r6juzdy90aE8ZpT3L3CxdAyWCp/I68STnTAzHYkdEXNqtIoq0Yqx2IOcDSAme1NSBT5OD/rEOCc6Oyng4Gf3P3bsp6UkV1Pnr7yH1knxWPxL2Br4MVoPH+Ou58SW9BpkuKxyAspHothwHFmNhVYA1zr7oviizo9UjwWVwP9zexKQldLj1z8YWlmAwldjTtG4zE3AZsDuPujhPGZDsAM4Ffg3JT2m4PHSkREKlGmdj2JiEiGUKIQEZGklChERCQpJQoREUlKiUJERJJSopCMY2ZrzGxSwq1Rkm0blVYps5yvOSKqPvppVPJizwrs40IzOye638PMdklYN8DMmlVynOPMrGUKz7nCzLbc1NeW/KVEIZloubu3TLh9XUWve5a770coNvmv8j7Z3R91939HD3sAuySs6+nuUyslyvVxPkxqcV4BKFFIhSlRSFaIWg7vm9kn0e3QErZpbmZjo1bIZ2bWNFreLWH5Y2ZWvYyXGwU0iZ57dDSHwedRrf+a0fLbbf0cIHdFy/qY2TVmdhqh5taz0WtuEbUECszsIjO7MyHmHmb2YAXj/IiEgm5m9oiZjbcw98TN0bLLCQlruJkNj5YdZ2YfRcfxRTPbuozXkTynRCGZaIuEbqfB0bIfgGPdvRVwOvBACc+7ELjf3VsSvqgLo3INpwOHRcvXAGeV8fonA5+bWS3gKeB0d9+HUMngIjPbHvgD0Nzd9wVuTXyyu78EjCf88m/p7ssTVr8EnJrw+HTg+QrG2Z5QpqPIDe5eAOwLtDWzfd39AUItnyPd/ciolMffgGOiYzkeuKqM15E8l5ElPCTvLY++LBNtDjwU9cmvIdQtKu4j4AYzqw+84u5fmdnRwAHAuKi8yRaEpFOSZ
81sOfA1oQz1nsBsd/8yWv80cAnwEGGuiwFm9l8g5ZLm7r7AzGZFdXa+il5jdLTf8sS5FaFcReIMZV3MrBfh7/r3hAl6Piv23IOj5aOj16lBOG4ipVKikGxxJfA9sB+hJbzRpETu/pyZfQycCAwzs56EsspPu/v1KbzGWYkFBM2sxPlNotpCrQlF5roClwJHleO9PA90Ab4ABru7W/jWTjlOwixutwN9gVPNrDFwDXCguy82s6cIhe+KM+Btdz+jHPFKnlPXk2SLOsC30fwBZxN+TW/AzHYDZkXdLUMIXTDvAqeZ2U7RNttb6nOKfwE0MrMm0eOzgZFRn34ddx9KGCgu6cyjpYSy5yV5BehEmCPh+WhZueJ091WELqSDo26rbYBfgJ/M7HfACaXEMgY4rOg9mdmWZlZS60xkHSUKyRYPA93NbAyh2+mXErY5HZhsZpOAvQhTPk4lfKG+ZWafAW8TumXK5O4rCNU1XzSzz4G1wKOEL903ov2NJLR2insKeLRoMLvYfhcDU4Fd3X1stKzccUZjH3cD17j7p4T5sacATxC6s4r0A940s+HuvoBwRtbA6HXGEI6VSKlUPVZERJJSi0JERJJSohARkaSUKEREJCklChERSUqJQkREklKiEBGRpJQoREQkqf8HkgPyvI3HT7IAAAAASUVORK5CYII=\n", + "text/plain": [ + "" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "def print_metrics(labels, scores):\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('True positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('True negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1])\n", + " print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1])\n", + " print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1])\n", + " print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1])\n", + "\n", + "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## and threshold along with the AUC\n", + " fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n", + " auc = sklm.auc(fpr, tpr)\n", + " \n", + " ## Plot the result\n", + " plt.title('Receiver Operating Characteristic')\n", + " plt.plot(fpr, tpr, color = 'orange', 
label = 'AUC = %0.2f' % auc)\n", + " plt.legend(loc = 'lower right')\n", + " plt.plot([0, 1], [0, 1],'r--')\n", + " plt.xlim([0, 1])\n", + " plt.ylim([0, 1])\n", + " plt.ylabel('True Positive Rate')\n", + " plt.xlabel('False Positive Rate')\n", + " plt.show()\n", + " \n", + "print_metrics(y_test, scores) \n", + "plot_auc(y_test, probabilities) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "At first glance, these performance metrics look quite good. Notice, however, that the AUC is much larger than that achieved with cross-validation. This indicates that these results are overly optimistic, a common situation when a single split is used to evaluate a model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have performed two types of feature selection:\n", + "1. Eliminating low-variance features, which by their nature cannot be highly informative since they contain a high fraction of the same value.\n", + "2. Using recursive feature elimination, a cross-validation technique for identifying uninformative features. \n", + "\n", + "With a reduced feature set, less regularization was required for the model. This is expected, since the most uninformative features have already been eliminated. It should be noted that for large numbers of features, these types of feature elimination algorithms should not be expected to give good generalization performance, owing to the multiple comparisons problem. In these cases, stronger regularization is a better approach. "
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/Auto_Data_Features.csv b/Module5/Auto_Data_Features.csv new file mode 100644 index 0000000..69d1d18 --- /dev/null +++ b/Module5/Auto_Data_Features.csv @@ -0,0 +1,196 @@ +0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44 +-1.683439128405708,-0.43850391155869184,-0.8397490803384212,-2.1172454168040913,-0.02101769012578214,0.2045987001229315,0.5185550398228566,-1.8202776060185226,-0.29493307013241693,-0.6851049824879971,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +-1.683439128405708,-0.43850391155869184,-0.8397490803384212,-2.1172454168040913,-0.02101769012578214,0.2045987001229315,0.5185550398228566,-1.8202776060185226,-0.29493307013241693,-0.6851049824879971,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +-0.7188028512711508,-0.24564625663014425,-0.1815478201248432,-0.6113626583476852,0.5044245630187714,1.3429929274160832,-2.394771141238803,0.7012021638506771,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.14773482140904332,0.18828346695909237,0.14755280998194586,0.18340879750430586,-0.4241752007203305,-0.03366985907796072,-0.5140162648572261,0.4777799057610003,-0.04812185789389084,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.0823357517728036,0.18828346695909237,0.2415815614410294,0.18340879750430586,0.5063352621211152,0.31049583754555027,-0.5140162648572261,0.4777799057610003,-0.541744282370943,-1.1549600994346305,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.14773482140904332,0.244533616313254,0.1945671857114843,-0.3185521219811608,-0.09935635332187921,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.1287208659526597,1.4820369021047775,2.5923003479180844,0.7690298702373548,0.5445492441679919,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.1287208659526597,1.4820369021047775,2.5923003479180844,0.7690298702373548,0.7547261454258133,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+1.1287208659526597,1.4820369021047775,2.5923003479180844,0.8526900234849311,1.0069384269351989,0.9723529464369175,-0.7352815444315295,0.4777799057610003,-0.46770091869938496,-1.311578471750175,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,-0.3133546527843883,-0.060144143433615405,0.6291876796100074,-1.4372680207219364,-0.3442953125801219,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,-0.3133546527843883,-0.060144143433615405,0.6291876796100074,-1.4372680207219364,-0.3442953125801219,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,0.28851556445391846,0.4693415436794784,-0.07148570570861922,-0.19248686850802735,-0.29493307013241693,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,0.39360401508282916,0.4693415436794784,-0.07148570570861922,-0.19248686850802735,-0.29493307013241693,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.752676215544274,1.1847146840899314,0.4766534400887349,0.7690298702373548,0.9477067547625402,0.4693415436794784,-0.07148570570861922,-0.19248686850802735,-0.29493307013241693,-0.8417233548035415,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.752676215544274,1.1847146840899314,0.4766534400887349,0.7690298702373548,1.2820790976727106,2.0842728893744145,1.0717182387586144,0.4458624403196187,-0.541744282370943,-1.4681968440657196,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.752676215544274,1.5704299939470312,0.9467971973841459,-0.06757166223842598,1.5686839630242853,2.0842728893744145,1.0717182387586144,0.4458624403196187,-0.541744282370943,-1.4681968440657196,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.8154110971331918,1.8275735338517627,2.357228469270379,1.0200103299800867,1.807521350817264,2.0842728893744145,1.0717182387586144,0.4458624403196187,-0.541744282370943,-1.6248152163812641,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.7161386632238267,-2.6644026788590343,-2.6262953580609816,-0.27672204535737116,-2.046358738610243,-1.463281214283314,-1.546587569537307,-0.7031663155701442,-0.17152746401315389,3.38697269771616,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.7188028512711508,-1.4751138067996454,-1.0748209589861235,-0.7786829648428408,-1.3088288851055243,-0.8808469584589108,-1.104057010388702,-0.44782659203908576,-0.14684634278930137,1.9774073468762596,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-1.242077473760981,-1.0748209589861235,-0.7786829648428408,-1.2419544165234901,-0.8808469584589108,-1.104057010388702,-0.44782659203908576,-0.14684634278930137,1.9774073468762596,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-1.3050074869008366,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.19374047311462117,1.820788974560715,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-1.3050074869008366,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-0.8235113131101912,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.131133868587548,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.0890984883359838,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.0890984883359838,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-0.7031372696625298,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.719976680726153,0.02756875451863454,-0.6046772016907158,2.4840630118127027,-0.04585677845625195,-0.4043098400571264,0.039146934078531676,0.6692846984092942,-0.4183386762516799,-0.21524986554136352,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.4814961737906454,1.104724368215191,0.9979631455671799,2.073653177830114,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.0104344765869135,-2.383151932088233,-0.9337778317975015,-1.2806438843283103,-1.616451440582881,-1.1985383707267672,-1.546587569537307,0.5096973712023833,-0.14684634278930137,3.700209442347249,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.0104344765869135,-2.383151932088233,-0.9337778317975015,-1.2806438843283103,-1.413917335734435,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.9492222084989965,-0.8867634560679597,-0.527702505100106,-1.379524751892246,-1.1455898020154578,-1.546587569537307,-0.575496453804615,-0.023440736670038324,1.9774073468762596,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.9492222084989965,-0.8867634560679597,-0.527702505100106,-1.1827227443508315,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.8496009905436326,-1.9492222084989965,-0.8867634560679597,-0.527702505100106,-1.1521515587133302,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.39180750308994533,-0.8724336351479285,-0.8867634560679597,0.26706895075188514,-1.0489738071867631,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,-1.3786849793353715,-0.9337778317975015,1.8566118624558674,-1.0222240197539496,-0.7220012523249827,-1.5097100229415907,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.39180750308994533,-0.5429684746449903,-0.32259094731346516,-0.2348919687335845,-0.6171558100570574,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.39180750308994533,-0.5429684746449903,-0.32259094731346516,-0.2348919687335845,-0.5158887576328344,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.39180750308994533,0.09185463949481859,-0.32259094731346516,0.09974864425672958,-0.4872282710976769,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,0.09185463949481859,-1.5919790920110761,0.09974864425672958,-0.3573007321382964,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,0.09185463949481859,-0.32259094731346516,0.09974864425672958,-0.17960571562032013,-0.060144143433615405,-0.661526451240095,1.0522942837058817,-0.29493307013241693,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,-0.4143967046926245,0.05352405852286232,-1.1969837310807312,-0.508245961223459,-0.08661842778927009,-0.661526451240095,1.0522942837058817,-0.2702519489085644,-0.058631493225819016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7515023860892718,-0.2858249347402587,-1.9210797221178653,-0.15123181548600523,-0.4241752007203305,-0.6690526836136733,-0.07148570570861922,-0.06481700674249814,-0.4183386762516799,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.47355634013524667,-0.1331459579218233,-0.32259094731346516,-1.0296634245855756,0.33437234291017043,-0.351361271345817,0.37104485343998767,-0.06481700674249814,-0.24557082768471186,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.305904119405,2.0365026600243574,1.7460415847863393,-0.44404235185252966,2.8794235472321534,1.9254271832404863,1.1085957853543307,2.9354247447474355,-0.5170631611470905,-1.6248152163812641,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.305904119405,2.0365026600243574,1.7460415847863393,-0.44404235185252966,2.8794235472321534,1.9254271832404863,1.1085957853543307,2.9354247447474355,-0.5170631611470905,-1.6248152163812641,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5074297044083699,1.4016795458845486,2.2161853420817503,-2.5355461830419816,2.657782451360269,4.202215637826789,0.7766978659928765,-1.5649378824874658,0.32209496046389824,-1.9380519610123532,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9476995949979956,-1.2179702668949137,-0.7927347046088762,0.09974864425672958,-1.278257699468023,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.9476995949979956,-1.2179702668949137,-0.7927347046088762,0.09974864425672958,-1.2591507084445845,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9476995949979956,-1.2179702668949137,-0.7927347046088762,0.09974864425672958,-1.2495972129328654,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9476995949979956,-0.5992186239991497,-0.7927347046088762,0.09974864425672958,-1.1731692488391123,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9476995949979956,-0.5992186239991497,-0.7927347046088762,0.09974864425672958,-1.163615753327393,-0.9337955271702202,-0.9196692774101148,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,-0.06757166223842598,-0.3324616438078266,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.28469416624923083,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,-0.06757166223842598,-0.3324616438078266,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.28469416624923083,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.2216410958718844,-1.039692664592839,0.2235346670571187,0.4458624403196187,3.08638053753539,1.6641706022451705,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.25603367971407337,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.9815729592711188,0.059711697006726565,0.10053843425240075,0.2252388741280955,0.21208760036016525,0.4428672593238237,1.588003891098654,-0.28823926483217355,-0.541744282370943,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9815729592711188,0.059711697006726565,0.10053843425240075,0.2252388741280955,0.2694085734304802,-0.8278983897476014,0.37104485343998767,1.2437990763541755,2.913612688968422,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.8154110971331918,1.3373936609083668,2.0751422148931282,1.1036704832276658,1.8266283418407026,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.8154110971331918,1.3373936609083668,2.0751422148931282,2.023932168951026,2.2756426308915025,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.2758687726342033,1.064178649759588,2.0751422148931282,0.4343892572470407,1.7884143597938258,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +2.7309980720405664,2.2775747286850443,2.7333434751067065,1.0200103299800867,2.3138566129383795,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+2.7309980720405664,2.2775747286850443,2.7333434751067065,1.1036704832276658,2.2565356398680643,1.369467211771738,0.48167749322713854,-0.4797440574804674,-0.46770091869938496,-1.4681968440657196,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.375457735680886,0.48560568497394074,2.169170966352212,-1.2806438843283103,2.151447189239154,1.369467211771738,0.48167749322713854,-0.4797440574804674,-0.46770091869938496,-1.4681968440657196,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +3.5975357447207625,2.7195401878963033,2.7333434751067065,1.1873306364752452,2.5622474962430775,2.137221458085724,1.735514077481523,0.3181925785540895,-0.541744282370943,-1.7814335886968087,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.142406445314397,2.0043597175362655,2.874386602295328,0.6435396403659859,2.2087681623094686,2.137221458085724,1.735514077481523,0.3181925785540895,-0.541744282370943,-1.7814335886968087,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.6218780762717923,0.33292670815550535,0.9938115731136843,0.39255918062325107,0.6706553849226847,1.8989528988848317,1.6617589842900886,-0.4159091265977028,-0.541744282370943,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.8496009905436326,-1.3626135080913244,-0.6987059531497927,-1.2806438843283103,-1.2247581246023957,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,1.820788974560715,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.6987059531497927,-1.2806438843283103,-1.175079947941456,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.6987059531497927,-1.2806438843283103,-1.0604380018008261,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.964049362407055,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-0.7910294283703461,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4245070379080663,-0.10100301543373127,-0.22856219585438162,-1.8662649570613563,-0.36112213034298407,0.33697012190120496,-0.5877713580486605,0.6692846984092942,-0.665149888490206,-0.37186823785690803,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.4245070379080663,-0.10100301543373127,-0.22856219585438162,-1.8662649570613563,-0.44137149264142495,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.058631493225819016,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.5235315540422097,1.104724368215191,0.9242080523757454,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.6916730750484669,1.104724368215191,0.9610855989714618,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.701226570560186,1.104724368215191,0.9610855989714618,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.3706756258547032,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.058631493225819016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.29424766176094996,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.058631493225819016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.29806905996563765,0.33697012190120496,-0.5877713580486605,0.6692846984092942,-0.665149888490206,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.29806905996563765,0.33697012190120496,-0.5877713580486605,0.6692846984092942,-0.665149888490206,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.2801683985703667,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.0355989134703565,-1.2779612237937312,-1.2515671967715691,0.7012021638506771,2.888931567744569,3.0737359530850714,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.2247581246023957,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.186544142555519,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.32600361285037316,-0.9807922075270432,-0.15123181548600523,-1.0222240197539496,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.1617050542250493,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.6956474514634258,-0.9807922075270432,-0.2348919687335845,-1.0145812233445743,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.1234910721781726,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.32600361285037316,-0.9807922075270432,-0.15123181548600523,-0.9973849314234798,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.62070424681679,-0.9527909913681574,-0.9807922075270432,-0.2348919687335845,-1.0527952053914509,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2773591312265229,-0.06886007294563926,-0.32259094731346516,0.3507291039994644,-0.44901428905080026,-0.16604128085623418,0.002269387482815258,0.7012021638506771,-0.4183386762516799,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2773591312265229,-0.06886007294563926,-0.32259094731346516,0.3507291039994644,-0.49104966930236454,-0.16604128085623418,0.002269387482815258,0.7012021638506771,-0.4183386762516799,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.24583342586340634,0.5981059836822594,0.2885959371705678,0.5180494104946199,1.0241347188562935,1.2900443587047739,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-1.311578471750175,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.24583342586340634,0.8311423167209238,0.2885959371705678,0.9363501767325103,1.4081852384274034,1.2900443587047739,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-1.311578471750175,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.24583342586340634,0.8311423167209238,0.2885959371705678,0.5180494104946199,0.9572602502742593,1.2900443587047739,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.24199540836108,-0.2858249347402587,0.9467971973841459,-1.7407747271899874,0.9782779404000415,1.5018386335500113,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.24199540836108,-0.2858249347402587,0.9467971973841459,-1.7407747271899874,1.108205479359422,2.560810007776199,0.37104485343998767,0.06285285502303108,-0.5911065248186482,-1.311578471750175,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.04963621695468259,0.34096244377752777,0.9467971973841459,-1.7407747271899874,1.108205479359422,1.5018386335500113,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,0.8808322861805061,-0.16604128085623418,0.48167749322713854,-0.19248686850802735,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,1.2190260272953641,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.502101328313724,1.9802525106701983,1.1818690760318513,2.023932168951026,1.2820790976727106,-0.16604128085623418,0.48167749322713854,-0.19248686850802735,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +2.502101328313724,1.9802525106701983,1.1818690760318513,2.023932168951026,1.6642189181414768,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,0.9859207368094168,-0.21898984956754355,0.48167749322713854,-3.384233412646255,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,1.3241144779242748,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.502101328313724,1.9802525106701983,1.1818690760318513,1.1873306364752452,1.3871675483016213,-0.21898984956754355,0.48167749322713854,-3.384233412646255,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +2.502101328313724,1.9802525106701983,1.1818690760318513,2.023932168951026,1.7693073687703875,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,0.9859207368094168,-0.16604128085623418,0.48167749322713854,-0.19248686850802735,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,1.3241144779242748,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.4884157489519863,0.999892764783404,1.1348547003023064,0.8945201001087207,1.0910091874383274,1.025301515148227,1.0348406921628963,-0.12865193762526275,-0.788555494609469,-1.1549600994346305,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-1.2247581246023957,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,1.820788974560715,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-0.8235113131101912,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.131133868587548,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-0.5590399458890352,-0.9807922075270432,-1.2806438843283103,-1.0890984883359838,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.8496009905436326,-0.5590399458890352,-0.9807922075270432,-1.2806438843283103,-0.7031372696625298,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.719976680726153,0.02756875451863454,-0.6046772016907158,2.4840630118127027,-0.04585677845625195,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.21524986554136352,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.49487106750705223,1.104724368215191,0.9610855989714618,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-0.43046817593666936,1.1348547003023064,-1.5316243440710422,0.418443103413299,1.0517757995038817,2.251799729821564,-0.44782659203908576,-0.17152746401315389,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.5362912217241647,-0.43046817593666936,-0.4166196987725487,-0.9460032713379963,0.3764077231617347,2.746129998265782,1.5142487979072212,-1.1180933663081134,-0.17152746401315389,-1.311578471750175,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-1.5362912217241647,-0.43046817593666936,-0.4166196987725487,-0.9460032713379963,0.3764077231617347,2.746129998265782,1.5142487979072212,-1.1180933663081134,-0.17152746401315389,-1.311578471750175,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.5362912217241647,-0.43046817593666936,-0.4166196987725487,-0.9460032713379963,0.46047848366486327,2.746129998265782,1.5142487979072212,-1.1180933663081134,-0.17152746401315389,-1.311578471750175,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.18915921113203926,0.1781244157672768,0.7766978659928765,-0.575496453804615,-0.2184215943384737,-0.6851049824879971,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.259855077918761,0.1781244157672768,0.7766978659928765,-0.575496453804615,-0.2208897064608589,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.282783467146887,0.1781244157672768,-2.9110567935788443,-3.7672429979428426,-0.2208897064608589,-0.6851049824879971,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.3802291213664224,0.1781244157672768,0.7766978659928765,-0.575496453804615,-0.2208897064608589,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.4757640764836139,1.5018386335500113,0.7766978659928765,-0.575496453804615,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.5502813414750233,1.5018386335500113,0.7766978659928765,-0.575496453804615,-0.29493307013241693,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3947564505794163,-1.168849710445207,-0.06757166223842598,-0.97254584309301,-0.9073212428145655,1.0717182387586144,-2.8416365001427564,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3143990943591874,-1.0748209589861235,-0.06757166223842598,-0.8387969059289418,-0.8014241053919468,1.0717182387586144,-1.947947467784052,-0.3689764338039749,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.9150000601798746,-1.3626135080913244,-0.9807922075270432,0.7690298702373548,-0.6095130136476821,-0.8014241053919468,1.0717182387586144,-1.947947467784052,-0.3689764338039749,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2773591312265229,-0.18136037165396018,-0.22856219585438162,-0.5695325817238956,-0.7910294283703461,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.17152746401315389,1.0376971129829926,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2773591312265229,-0.18136037165396018,-0.22856219585438162,-0.5695325817238956,-0.7050479687648736,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.17152746401315389,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2773591312265229,-0.18136037165396018,-0.22856219585438162,-0.5695325817238956,-0.418443103413299,-0.24546413392319824,1.0717182387586144,-1.947947467784052,-0.29493307013241693,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.31005866604464394,-0.18136037165396018,-0.22856219585438162,0.18340879750430586,-0.3324616438078266,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.29493307013241693,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.31005866604464394,-0.18136037165396018,-0.22856219585438162,0.18340879750430586,-0.09362425601484772,0.2045987001229315,1.0717182387586144,-1.947947467784052,-0.6157876460425007,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.31005866604464394,-0.06082433732361681,-0.22856219585438162,-0.3603821986049504,-0.5139780585304905,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.29493307013241693,0.4112236237208145,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.31005866604464394,-0.06082433732361681,-0.22856219585438162,-0.3603821986049504,-0.19871270664375842,-0.24546413392319824,1.0717182387586144,-1.947947467784052,-0.29493307013241693,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.3264084334537033,-0.05278860170159438,-0.22856219585438162,0.4343892572470407,-0.2655871752257925,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.29493307013241693,-0.37186823785690803,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.3264084334537033,-0.05278860170159438,-0.22856219585438162,0.4343892572470407,0.17387361831328862,0.2045987001229315,1.0717182387586144,-1.947947467784052,-0.6157876460425007,-0.37186823785690803,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.522605642362427,-1.2501132093830056,-1.0748209589861235,0.26706895075188514,-1.096741284745359,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,1.507552229929626,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-1.2501132093830056,-1.0748209589861235,0.26706895075188514,-0.9916528341164483,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-1.2501132093830056,-1.0748209589861235,0.26706895075188514,-1.039420311675044,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-0.3661822909604876,-1.0748209589861235,2.1912524754461815,-0.5330850495539289,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.8810787406674481,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.522605642362427,-0.3661822909604876,-1.0748209589861235,2.1912524754461815,-0.5139780585304905,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.522605642362427,-0.3661822909604876,-1.0748209589861235,2.1912524754461815,1.0527952053914509,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.3603821986049504,-0.9133141709203513,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.7244603683519035,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.859814596054724,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.3603821986049504,-0.542638545065648,-1.2514869394380765,-0.21899589209148818,0.3181925785540895,3.037018295087685,1.3509338576140817,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.542638545065648,-1.2514869394380765,-0.21899589209148818,0.3181925785540895,3.037018295087685,1.9774073468762596,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.3603821986049504,-0.8884750825898814,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,1.9774073468762596,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.8349755077242541,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.4112236237208145,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.8005829238820652,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.745172649914094,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.567841996036359,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.67829818133206,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.567841996036359,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.5617455360890863,0.23107298447858618,-0.32962853187863905,-0.5435789883632319,-0.1962085852370064,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.49487106750705223,0.23107298447858618,-0.32962853187863905,-0.5435789883632319,-0.1962085852370064,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,-0.03630328294453279,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,-0.04394607935390811,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,-0.015285592818750648,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,0.22928389228125973,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,0.29615836086329383,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.3603821986049504,0.7948508265750337,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.4343892572470407,-0.44519289084611263,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.567841996036359,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.4343892572470407,-0.15094522908516264,-0.8014241053919468,-0.21899589209148818,0.3181925785540895,3.037018295087685,0.7244603683519035,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5728287740446119,0.10792611073886346,0.2885959371705678,0.016088491009150316,-0.2770513698398555,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.4343892572470407,-0.2770513698398555,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.016088491009150316,-0.19298060933672692,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.6545776110899132,0.7427492248786723,0.8527684459250624,-0.7786829648428408,0.7967615256773776,1.528312917905666,-0.21899589209148818,0.3181925785540895,-0.2208897064608589,-0.8417233548035415,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.6545776110899132,0.7427492248786723,0.8527684459250624,-0.7786829648428408,0.8731894897711308,1.528312917905666,-0.21899589209148818,0.3181925785540895,-0.2208897064608589,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.9161738896348767,1.0882858566256577,0.2885959371705678,0.09974864425672958,1.0929198865406713,1.3959414961273926,-0.21899589209148818,0.3181925785540895,-0.24557082768471186,-0.8417233548035415,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9161738896348767,1.0882858566256577,0.2885959371705678,0.09974864425672958,1.131133868587548,1.3959414961273926,-0.21899589209148818,0.3181925785540895,-0.24557082768471186,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.5693883324984617,-1.3573840768606953,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.820788974560715,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.6687446858203409,-0.48373269312409045,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.5636562351914302,-1.3573840768606953,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.820788974560715,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.6630125885133094,-0.48373269312409045,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.542638545065648,-0.48373269312409045,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.45856778456251945,-0.9337955271702202,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.820788974560715,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.49487106750705223,-0.08661842778927009,-0.5140162648572261,0.4777799057610003,-0.04812185789389084,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-1.2018987956508667,-0.7927347046088762,0.7271997936135651,-0.5827632262148684,-0.351361271345817,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.21524986554136352,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 
+-0.7188028512711508,-0.6876117158414033,-0.8867634560679597,-1.0296634245855756,-0.6458162965922148,-0.351361271345817,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.24583342586340634,0.47756994935191605,0.4766534400887349,0.5180494104946199,0.19489130843907077,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.24583342586340634,0.47756994935191605,0.4766534400887349,0.5180494104946199,0.03821398204687662,-0.9337955271702202,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.1943154852985371,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.24583342586340634,0.7106062823905803,0.4766534400887349,0.5180494104946199,0.007642796409375324,-0.4043098400571264,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,0.9781802533563,0.6744767831273724,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.8834743548167557,1.1686432128458866,0.6176965672773569,1.5219712494655562,0.9075820736133198,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.37186823785690803,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,0.9781802533563,0.7184228624812805,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,1.5219712494655562,0.9228676664320704,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.21524986554136352,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,0.9781802533563,0.9285997637391019,1.5547872022613207,1.0717182387586144,-0.32015673027355657,-0.665149888490206,-1.311578471750175,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,1.5219712494655562,1.1425980632016108,1.5547872022613207,1.0717182387586144,-0.32015673027355657,-0.665149888490206,-1.311578471750175,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,0.7509047472211255,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.369926578950012,0.6853697169897756,0.9362425601484772,1.5018386335500113,1.6617589842900886,-0.32015673027355657,-0.3689764338039749,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,0.8655466933617555,0.8135072403029894,0.9242080523757454,-1.2138457626322596,-0.3442953125801219,-1.1549600994346305,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,1.257240009342241,0.07222727834465804,-1.1778121035801363,0.4777799057610003,3.160423901206948,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,0.961081648478947,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 diff --git a/Module5/Auto_Data_Labels.csv b/Module5/Auto_Data_Labels.csv new file mode 100644 index 0000000..41e5c6f --- /dev/null +++ b/Module5/Auto_Data_Labels.csv @@ -0,0 +1,196 @@ +0 +9.510074525452104 +9.711115659888671 +9.711115659888671 +9.543234787249512 +9.767094927630573 +9.632334782035558 +9.781884730776879 
+9.847974842605868 +10.08058716534893 +9.706864211031315 +9.736547097780475 +9.950848123895966 +9.9572652582166 +10.109077944602749 +10.333970423619581 +10.628980909135285 +10.51542467767053 +8.546946149565585 +8.747510946478448 +8.791029857045965 +8.625509334899697 +8.760453046315272 +8.981807323377534 +8.736971085254146 +8.808668062106715 +8.937087036176523 +9.054621796976763 +9.096163326913784 +9.469931564261438 +8.776321456449958 +8.832733591996416 +8.593969030218288 +8.784009071186633 +8.871926251117628 +8.894944460956886 +8.894944460956886 +8.973984926689743 +9.115480090952223 +9.087607606593886 +9.23941361946189 +9.468464892185638 +9.244258590179644 +8.82246957226897 +9.310004695089129 +10.381273322223919 +10.478695435231387 +10.491274217438248 +8.555451903533328 +8.715224041915372 +8.823942326585245 +8.809116258125274 +8.9085593751449 +9.087607606593886 +9.047233034106032 +9.2681374707024 +9.234545060673 +9.286838342948908 +9.327678864393416 +9.813562845008141 +9.81705782453774 +10.148470870454794 +10.248777937608223 +10.246225830735987 +10.360912399575003 +10.439512977323862 +10.464702061835247 +10.62035125971339 +10.72326738402944 +9.711297461543568 +8.592115117933497 +8.73052880173936 +8.805225202632306 +8.94754601503218 +9.20623194393164 +9.047703788498627 +9.44375103564617 +9.607033787697222 +9.581145019820722 +8.85209276347713 +9.010547069270189 +9.13550906135318 +9.13550906135318 +8.612321536507814 +8.867709208039386 +8.802221746402456 +8.831857935197906 +8.90231952852887 +8.895492631451633 +8.961750799330497 +8.92252495730139 +8.987071812848821 +9.017847259860732 +9.099297073182859 +9.164191715950203 +9.510370887608827 +9.57491403870827 +9.510370887608827 +9.752606521576492 +9.888323152016355 +9.820051594294094 +9.38429367909962 +9.487972108574462 +9.42867236629317 +9.536762272743895 +9.65374331942474 +9.735068900911164 +9.722864552377755 +9.74537068443899 +9.718963572186974 +9.795345393916424 +9.806425839692997 +8.625509334899697 +8.981807323377534 
+8.736971085254146 +8.808668062106715 +8.937087036176523 +9.096163326913784 +9.454383987398135 +9.999615579630348 +10.389856535868129 +10.434938994095775 +10.519429662187102 +9.380083146563278 +9.406729185981574 +9.618468597503831 +9.649240256170584 +9.806425839692997 +9.831991550831058 +8.540519016719735 +8.86120833720818 +8.936298185228436 +8.871505346165781 +8.958668737047434 +9.206332350578643 +9.130539301772627 +9.32892308780313 +8.917712757131387 +9.2299469016151 +8.98882050177807 +9.366831168735615 +8.584477938221834 +8.754318540250866 +8.77770959579525 +8.841881989497114 +8.974364841846596 +9.080003870248179 +8.844768827529695 +8.881558488638976 +8.974364841846596 +8.960339366492091 +8.953898535260459 +9.03097444300786 +9.133243321591216 +8.994420665751292 +9.016512874996026 +9.13755460225053 +9.16303909885817 +9.041803370152845 +9.173572647783969 +9.20923976653215 +9.323579767582693 +9.354354132115088 +9.77956697061665 +9.09918532261002 +9.277812087091196 +9.209139651399664 +9.296334565143043 +9.327945614050446 +9.71462464769875 +9.680218993408767 +9.660778845727078 +9.66459564425378 +8.958668737047434 +8.984066927653044 +8.986571625268054 +9.01127949117793 +9.047233034106032 +9.158520623246385 +9.209840246934501 +9.358329249689632 +9.20833836930551 +9.49514330367712 +9.535679435605061 +9.416541202560081 +9.468078568054892 +9.504128762859725 +9.679406061493943 +9.71202433782489 +9.821192309809298 +9.849559210510572 +9.731809155840653 +9.854559878912704 +9.975110296209095 +10.019936365179374 +10.026810768568128 diff --git a/Module5/Bias-Variance-Trade-Off.ipynb b/Module5/Bias-Variance-Trade-Off.ipynb new file mode 100644 index 0000000..18effdf --- /dev/null +++ b/Module5/Bias-Variance-Trade-Off.ipynb @@ -0,0 +1,499 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Regularization and the bias-variance trade-off\n", + "\n", + "**Overfitting** is a constant danger with machine learning models. 
Overfit models fit the training data well. However, an overfit model will not **generalize**. A model that generalizes is a model which exhibits good performance on data cases beyond the ones used in training. Models that generalize will be useful in production. \n",
+ "\n",
+ "As a general rule, an overfit model has learned the training data too well. The overfitting likely involved learning noise present in the training data. The noise in the data is random and uninformative. When a new data case is presented to such a model it may produce unexpected results, since the random noise will be different. \n",
+ "\n",
+ "So, what is one to do to prevent overfitting of machine learning models? The most widely used set of tools for preventing overfitting are known as **regularization methods**. Regularization methods take a number of forms, but all have the same goal: to prevent overfitting of machine learning models. \n",
+ "\n",
+ "Regularization is not free, however. While regularization reduces the **variance** in the model results, it introduces **bias**. Whereas an overfit model exhibits low bias, its variance is high. The high variance leads to unpredictable results when the model is exposed to new data cases. On the other hand, the stronger the regularization of a model, the lower the variance, but the greater the bias. This all means that when applying regularization you will need to contend with the **bias-variance trade-off**. \n",
+ "\n",
+ "To better understand the bias-variance trade-off, consider the following examples of extreme model cases:\n",
+ "\n",
+ "- If the prediction for all cases is just the mean (or median), the estimate is the same for all cases, so the variance of the estimates is minimized. However, there is likely considerable bias in these estimates. \n",
+ "- On the other hand, consider what happens when the data are fit with a kNN model with k=1. 
This model will fit the training data perfectly, since each training case is its own nearest neighbor. The bias will be low. However, the model will have considerable variance when applied to test data. \n",
+ "\n",
+ "In either case, these extreme models will not generalize well and will exhibit large errors on any independent test data. Any practical model must come to terms with the trade-off between bias and variance to make accurate predictions. \n",
+ "\n",
+ "To better understand this trade-off, consider the example of the mean square error, which can be decomposed into its components. The mean square error can be written as:\n",
+ "\n",
+ "$$\\Delta x = E \\Big[ \\big(Y - \\hat{f}(X) \\big)^2 \\Big] = \\frac{1}{N} \\sum_{i=1}^N \\big(y_i - \\hat{f}(x_i) \\big)^2 $$\n",
+ "\n",
+ "Where,\n",
+ "$Y = $ the label vector. \n",
+ "$X = $ the feature matrix. \n",
+ "$f(x) = $ the true function underlying the data. \n",
+ "$\\hat{f}(x) = $ the trained model. \n",
+ "\n",
+ "Expanding the representation of the mean square error:\n",
+ "\n",
+ "$$\\Delta x = \\big( E[ \\hat{f}(X)] - f(X) \\big)^2 + E \\big[ ( \\hat{f}(X) - E[ \\hat{f}(X)])^2 \\big] + \\sigma^2\\\\\n",
+ "\\Delta x = Bias^2 + Variance + Irreducible\\ Error$$\n",
+ "\n",
+ "Study this relationship. Notice that as regularization reduces variance, bias increases. The irreducible error will remain unchanged. Regularization parameters are chosen to minimize $\\Delta x$. In many cases, this will prove challenging. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Load a data set\n",
+ "\n",
+ "With the above bit of theory in mind, it is time to try an example. In this example you will compute and compare linear regression models using different levels and types of regularization. \n",
+ "\n",
+ "Begin by loading the packages required for the rest of this notebook."
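The two extreme cases described above can be checked with a short, self-contained simulation. This is an illustrative sketch, not part of the original lab: the sine target function, sample sizes, and noise level are arbitrary choices. Over many resampled training sets, predicting the training mean shows high bias and low variance, while a 1-nearest-neighbor predictor shows the reverse:

```python
import numpy as np

rng = np.random.default_rng(42)

def true_f(x):
    # Arbitrary smooth target function chosen for this illustration
    return np.sin(2.0 * np.pi * x)

n_train, n_reps, noise_sd = 30, 500, 0.3
x_test = np.linspace(0.0, 1.0, 50)

mean_preds = np.empty((n_reps, x_test.size))  # mean-predictor outputs
knn1_preds = np.empty((n_reps, x_test.size))  # 1-NN outputs

for r in range(n_reps):
    # Draw a fresh noisy training set each repetition
    x_tr = rng.uniform(0.0, 1.0, n_train)
    y_tr = true_f(x_tr) + rng.normal(0.0, noise_sd, n_train)
    # Extreme case 1: predict the training mean everywhere
    mean_preds[r, :] = y_tr.mean()
    # Extreme case 2: 1-nearest-neighbor prediction
    nearest = np.abs(x_tr[:, None] - x_test[None, :]).argmin(axis=0)
    knn1_preds[r, :] = y_tr[nearest]

def bias2_variance(preds):
    # Average squared bias and variance of the predictions over the test grid
    avg_pred = preds.mean(axis=0)  # estimate of E[f_hat(x)] at each test point
    bias2 = np.mean((avg_pred - true_f(x_test)) ** 2)
    variance = np.mean(preds.var(axis=0))
    return bias2, variance

b_mean, v_mean = bias2_variance(mean_preds)
b_knn, v_knn = bias2_variance(knn1_preds)
print('Mean predictor: bias^2 = {:.3f}, variance = {:.3f}'.format(b_mean, v_mean))
print('1-NN predictor: bias^2 = {:.3f}, variance = {:.3f}'.format(b_knn, v_knn))
```

With these settings the mean predictor's squared bias is substantially larger than its variance, and the 1-NN predictor's variance is dominated by the noise variance $\sigma^2$, matching the decomposition above.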
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "import pandas as pd\n",
+ "import sklearn.model_selection as ms\n",
+ "from sklearn import linear_model\n",
+ "import sklearn.metrics as sklm\n",
+ "import numpy as np\n",
+ "import numpy.random as nr\n",
+ "import matplotlib.pyplot as plt\n",
+ "import seaborn as sns\n",
+ "import scipy.stats as ss\n",
+ "import math\n",
+ "\n",
+ "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "The data are available in a pre-processed form. The preprocessing includes the following:\n",
+ "1. Cleaning missing values.\n",
+ "2. Aggregating categories of certain categorical variables. \n",
+ "3. Encoding categorical variables as binary dummy variables.\n",
+ "4. Standardizing numeric variables. \n",
+ "\n",
+ "Execute the code in the cell below to load the features and labels as numpy arrays for the example. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [
+ "Features = np.array(pd.read_csv('Auto_Data_Features.csv'))\n",
+ "Labels = np.array(pd.read_csv('Auto_Data_Labels.csv'))\n",
+ "print(Features.shape)\n",
+ "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "Notice that there are 195 cases and a total of 45 features. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Split the dataset\n",
+ "\n",
+ "With the feature array loaded, you must now create randomly sampled training and test data sets. The code in the cell below uses the `train_test_split` function from the `sklearn.model_selection` module to randomly split the cases in the original dataset into the two subsets. Since this data set is small, only 40 cases will be included in the test dataset. Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(9988)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 40)\n", + "\n", + "x_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "x_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A first linear regression model\n", + "\n", + "To create a baseline for comparison, you will first create a model using all 45 features and no regularization. In the terminology used before this model has high variance and low bias. In otherwords, this model is overfit. \n", + "\n", + "The code in the cell below should be familiar. In summary, it performs the following processing:\n", + "1. Define and train the linear regression model using the training features and labels.\n", + "2. Score the model using the test feature set. \n", + "3. Compute and display key performance metrics for the model using the test feature set. \n", + "4. Plot a histogram of the residuals of the model using the test partition.\n", + "5. Plot a Q-Q Normal plot of the residuals of the model using the test partition.\n", + "6. Plot the residuals of the model vs. the predicted values using the test partition. \n", + "\n", + "Execute this code and examine the results for the linear regression model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "## define and fit the linear regression model\n", + "lin_mod = linear_model.LinearRegression()\n", + "lin_mod.fit(x_train, y_train)\n", + "\n", + "def print_metrics(y_true, y_predicted):\n", + " ## First compute R^2 and the adjusted R^2\n", + " r2 = sklm.r2_score(y_true, y_predicted)\n", + " \n", + " ## Print the usual metrics and the R^2 values\n", + " print('Mean Square Error = ' + str(sklm.mean_squared_error(y_true, y_predicted)))\n", + " print('Root Mean Square Error = ' + str(math.sqrt(sklm.mean_squared_error(y_true, y_predicted))))\n", + " print('Mean Absolute Error = ' + str(sklm.mean_absolute_error(y_true, y_predicted)))\n", + " print('Median Absolute Error = ' + str(sklm.median_absolute_error(y_true, y_predicted)))\n", + " print('R^2 = ' + str(r2))\n", + " \n", + "def resid_plot(y_test, y_score):\n", + " ## first compute vector of residuals. \n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now make the residual plots\n", + " sns.regplot(y_score, resids, fit_reg=False)\n", + " plt.title('Residuals vs. predicted values')\n", + " plt.xlabel('Predicted values')\n", + " plt.ylabel('Residual')\n", + " plt.show()\n", + "\n", + "def hist_resids(y_test, y_score):\n", + " ## first compute vector of residuals. \n", + " resids = np.subtract(y_test.reshape(-1,1), y_score.reshape(-1,1))\n", + " ## now make the residual plots\n", + " sns.distplot(resids)\n", + " plt.title('Histogram of residuals')\n", + " plt.xlabel('Residual value')\n", + " plt.ylabel('count')\n", + " plt.show()\n", + " \n", + "def resid_qq(y_test, y_score):\n", + " ## first compute vector of residuals. \n", + " resids = np.subtract(y_test, y_score)\n", + " ## now make the residual plots\n", + " ss.probplot(resids.flatten(), plot = plt)\n", + " plt.title('Residuals vs. 
predicted values')\n", + " plt.xlabel('Predicted values')\n", + " plt.ylabel('Residual')\n", + " plt.show()\n", + " \n", + "\n", + "y_score = lin_mod.predict(x_test) \n", + "print_metrics(y_test, y_score) \n", + "hist_resids(y_test, y_score) \n", + "resid_qq(y_test, y_score) \n", + "resid_plot(y_test, y_score) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Overall these results are reasonably good. The error metrics are relatively small. Further, the distribution of the residuals is a bit skewed, but otherwise well behaved. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply l2 regularization\n", + "\n", + "Now, you will apply **l2 regularization** to constrains the model parameters. Constraining the model parameters prevent overfitting of the model. This method is also known as **Ridge Regression**. \n", + "\n", + "But, how does this work? l2 regularization applies a **penalty** proportional to the **l2** or **Eucldian norm** of the model weights to the loss function. For linear regression using squared error as the metric, the total **loss function** is the sum of the squared error and the regularization term. The total loss function can then be written as follows: \n", + "\n", + "$$J(\\beta) = ||A \\beta + b||^2 + \\lambda ||\\beta||^2$$\n", + "\n", + "Where the penalty term on the model coefficients, $\\beta_i$, is written:\n", + "\n", + "$$\\lambda || \\beta||^2 = \\lambda \\big(\\beta_1^2 + \\beta_2^2 + \\ldots + \\beta_n^2 \\big)^{\\frac{1}{2}} = \\lambda \\Big( \\sum_{i=1}^n \\beta_i^2 \\Big)^{\\frac{1}{2}}$$\n", + "\n", + "We call $||\\beta||^2$ the **l2 norm** of the coefficients, since we raise the weights of each coefficient to the power of 2, sum the squares and then raise the sum to the power of $\\frac{1}{2}$. \n", + "\n", + "You can think of this penalty as constraining the 12 or Euclidean norm of the model weight vector. 
The value of $\\lambda$ determines how much the norm of the coefficient vector constrains the solution. You can see a geometric interpretation of the l2 penalty constraint in the figure below. \n", + "\n", + "\"Drawing\"\n", + "
**Geometric view of l2 regularization**\n", + "\n", + "Notice that for a constant value of the l2 norm, the values of the model parameters $B_1$ and $B_2$ are related. The Euclidean or l2 norm of the coefficients is shown as the dotted circle. A constant value of the l2 norm corresponds to a constant value of the penalty. Along this circle the coefficients change in relation to each other to maintain a constant l2 norm. For example, if $B_1$ is maximized then $B_2 \\sim 0$, or vice versa. It is important to note that l2 regularization is a **soft constraint**. Coefficients are driven close to, but likely not exactly to, zero. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With this bit of theory in mind, it is time to try an example of l2 regularization. The code in the cell below computes the linear regression model over a grid of 100 l2 penalty values. In a bit more detail, this code performs the following processing:\n", + "1. A grid of 100 l2 penalty parameters is created.\n", + "2. A function is called which loops over the list of l2 penalty parameters. For each iteration, an l2 penalized regression model is computed along with the root mean squared error, which is saved in a list. The `Ridge` model from the scikit-learn `linear_model` package is used to define the l2 regularized linear regression model. The value of the regularization parameter which achieved the lowest RMSE is saved. \n", + "3. A function is called which creates two plots: one of RMSE vs. the regularization parameter, and one of the values of the regression coefficients vs. the regularization parameter.\n", + "\n", + "Execute this code and examine the results. 
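Before running the lab code, it can help to see the shrinkage effect of the l2 penalty in isolation. The sketch below is not part of the lab; it is a minimal NumPy-only illustration (the data, seed, and function name are invented for this example) that solves the closed-form ridge system $(X^T X + \lambda I)\beta = X^T y$ for several penalty values and shows that the l2 norm of the coefficient vector shrinks as $\lambda$ grows.

```python
import numpy as np

# Synthetic regression problem (hypothetical data for illustration only).
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 5))
beta_true = np.array([3.0, -2.0, 1.5, 0.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=100)

def ridge_coefficients(X, y, lam):
    """Closed-form minimizer of ||X b - y||^2 + lam * ||b||^2,
    i.e. b = (X'X + lam * I)^{-1} X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# The l2 norm of the fitted coefficient vector decreases as lambda grows.
norms = [np.linalg.norm(ridge_coefficients(X, y, lam)) for lam in (0.0, 10.0, 100.0)]
print(norms)  # a decreasing sequence
```

With $\lambda = 0$ this reduces to ordinary least squares; larger penalties pull the whole coefficient vector toward the origin, which is exactly the soft constraint pictured above.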
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_regularization(l, train_RMSE, test_RMSE, coefs, min_idx, title): \n", + " plt.plot(l, test_RMSE, color = 'red', label = 'Test RMSE')\n", + " plt.plot(l, train_RMSE, label = 'Train RMSE') \n", + " plt.axvline(min_idx, color = 'black', linestyle = '--')\n", + " plt.legend()\n", + " plt.xlabel('Regularization parameter')\n", + " plt.ylabel('Root Mean Square Error')\n", + " plt.title(title)\n", + " plt.show()\n", + " \n", + " plt.plot(l, coefs)\n", + " plt.axvline(min_idx, color = 'black', linestyle = '--')\n", + " plt.title('Model coefficient values \\n vs. regularizaton parameter')\n", + " plt.xlabel('Regularization parameter')\n", + " plt.ylabel('Model coefficient value')\n", + " plt.show()\n", + "\n", + "def test_regularization_l2(x_train, y_train, x_test, y_test, l2):\n", + " train_RMSE = []\n", + " test_RMSE = []\n", + " coefs = []\n", + " for reg in l2:\n", + " lin_mod = linear_model.Ridge(alpha = reg)\n", + " lin_mod.fit(x_train, y_train)\n", + " coefs.append(lin_mod.coef_)\n", + " y_score_train = lin_mod.predict(x_train)\n", + " train_RMSE.append(sklm.mean_squared_error(y_train, y_score_train))\n", + " y_score = lin_mod.predict(x_test)\n", + " test_RMSE.append(sklm.mean_squared_error(y_test, y_score))\n", + " min_idx = np.argmin(test_RMSE)\n", + " min_l2 = l2[min_idx]\n", + " min_RMSE = test_RMSE[min_idx]\n", + " \n", + " title = 'Train and test root mean square error \\n vs. 
regularization parameter'\n", + " plot_regularization(l2, train_RMSE, test_RMSE, coefs, min_l2, title)\n", + " return min_l2, min_RMSE\n", + " \n", + "l2 = [x for x in range(1,101)]\n", + "out_l2 = test_regularization_l2(x_train, y_train, x_test, y_test, l2)\n", + "print(out_l2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results.\n", + "\n", + "The top plot shows the training and test RMSE vs. the regularization parameter. The point with the minimum test RMSE is shown with a dotted line. Notice that there is a minimum where the l2 parameter has a value of 14.0. To the left of the minimum, variance dominates bias; to the right of the minimum, bias dominates variance. In this case, the changes in RMSE are not dramatic until the bias grows significantly. \n", + "\n", + "The bottom plot shows the values of the 45 model coefficients vs. the regularization parameter. At the left the regularization penalty is small and the coefficients show a wide range of values, giving a high variance model. To the right the coefficient values become more tightly clustered, giving a more constrained and higher bias model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, you will evaluate the model using the best l2 regularization parameter discovered above. The code in the cell below fits the regression model on the training data, then computes and displays the evaluation results on the test data. \n", + "\n", + "Execute the code, and then answer **Question 1** on the course page." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lin_mod_l2 = linear_model.Ridge(alpha = out_l2[0])\n", + "lin_mod_l2.fit(x_train, y_train)\n", + "y_score_l2 = lin_mod_l2.predict(x_test)\n", + "\n", + "print_metrics(y_test, y_score_l2)\n", + "hist_resids(y_test, y_score_l2) \n", + "resid_qq(y_test, y_score_l2) \n", + "resid_plot(y_test, y_score_l2) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the error metrics achieved to those of the un-regularized model. The error metrics for the regularized model are somewhat better. This indicates that the regularized model generalizes better than the unregularized model. Notice also that the residuals are a bit closer to Normally distributed than for the unregularized model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply l1 regularization\n", + "\n", + "Regularization can be performed using norms other than l2. The **l1 regularization** or **Lasso** method limits the sum of the absolute values of the model coefficients. The l1 norm is sometimes known as the **Manhattan norm**, since distances are measured as if you were traveling on a rectangular grid of streets. This is in contrast to the l2 norm, which measures distance 'as the crow flies'. \n", + "\n", + "We can compute the l1 norm of the model coefficients as follows:\n", + "\n", + "$$||\\beta||_1 = |\\beta_1| + |\\beta_2| + \\ldots + |\\beta_n| = \\sum_{i=1}^n |\\beta_i|$$\n", + "\n", + "where $|\\beta_i|$ is the absolute value of $\\beta_i$. \n", + "\n", + "Notice that the l1 norm is simply the sum of the absolute values of the coefficients.\n", + "\n", + "As with l2 regularization, for l1 regularization the penalty is proportional to the l1 norm of the model coefficients. 
A penalty multiplier, $\\lambda$, determines how much the norm of the coefficient vector constrains the values of the weights. The complete loss function is the sum of the squared errors plus the penalty term: \n", + "\n", + "$$J(\\beta) = ||A \\beta + b||^2 + \\lambda ||\\beta||_1$$\n", + "\n", + "You can see a geometric interpretation of the l1 norm penalty in the figure below. \n", + "\n", + "\"Drawing\"\n", + "
**Geometric view of l1 regularization**\n", + "\n", + "The l1 penalty constrains the sum of the absolute values of the coefficients. This means that the value of one parameter highly constrains the values of the others. The dotted line in the figure above looks as though someone has pulled a rope or lasso around pegs on the axes. This behavior leads to the name lasso for l1 regularization. \n", + "\n", + "Notice in the figure above that if $B_1 = 0$ then $B_2$ has a value at the limit, or vice versa. In other words, using an l1 norm constraint forces some weight values to zero to allow other coefficients to take non-zero values. Thus, you can think of the l1 norm constraint **knocking** some weights **out** of the model altogether. In contrast to l2 regularization, l1 regularization does drive some coefficients to exactly zero." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below computes l1 regularized or lasso regression over a grid of regularization values. The structure of the code is nearly the same as for the l2 example. The `Lasso` model from the scikit-learn `linear_model` package is used to define the l1 regularized linear regression model.\n", + "\n", + "Execute the code and examine the results." 
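To see why the l1 penalty produces exact zeros, it helps to look at the soft-thresholding operation at the heart of lasso solvers. The sketch below is not the lab's code or scikit-learn's implementation; it is a toy NumPy-only coordinate-descent solver (data, seed, and function names are invented for illustration) for the objective $\frac{1}{2}||y - X\beta||^2 + \lambda ||\beta||_1$, showing that coefficients whose true values are zero are driven exactly to zero.

```python
import numpy as np

def soft_threshold(rho, lam):
    # Soft-thresholding operator: returns exactly 0.0 when |rho| <= lam,
    # which is the source of exact zeros in lasso solutions.
    return np.sign(rho) * max(abs(rho) - lam, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iters=200):
    """Toy coordinate descent for (1/2)||y - X b||^2 + lam * ||b||_1."""
    n_samples, n_features = X.shape
    beta = np.zeros(n_features)
    for _ in range(n_iters):
        for j in range(n_features):
            # Partial residual: remove the contribution of all features except j.
            residual = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ residual
            beta[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j])
    return beta

# Synthetic data (hypothetical, for illustration): three of six coefficients are zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
beta_true = np.array([4.0, 0.0, -3.0, 0.0, 0.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=200)

beta_hat = lasso_coordinate_descent(X, y, lam=50.0)
print(beta_hat)
```

The surviving coefficients are shrunk slightly toward zero (by roughly lam divided by the column norm), while the weak coordinates are knocked out of the model entirely, matching the geometric picture above.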
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def test_regularization_l1(x_train, y_train, x_test, y_test, l1):\n", + " train_RMSE = []\n", + " test_RMSE = []\n", + " coefs = []\n", + " for reg in l1:\n", + " lin_mod = linear_model.Lasso(alpha = reg)\n", + " lin_mod.fit(x_train, y_train)\n", + " coefs.append(lin_mod.coef_)\n", + " y_score_train = lin_mod.predict(x_train)\n", + " ## take the square root of the MSE to get the RMSE\n", + " train_RMSE.append(np.sqrt(sklm.mean_squared_error(y_train, y_score_train)))\n", + " y_score = lin_mod.predict(x_test)\n", + " test_RMSE.append(np.sqrt(sklm.mean_squared_error(y_test, y_score)))\n", + " min_idx = np.argmin(test_RMSE)\n", + " min_l1 = l1[min_idx]\n", + " min_RMSE = test_RMSE[min_idx]\n", + " \n", + " title = 'Train and test root mean square error \\n vs. regularization parameter'\n", + " plot_regularization(l1, train_RMSE, test_RMSE, coefs, min_l1, title)\n", + " return min_l1, min_RMSE\n", + " \n", + "l1 = [x/5000 for x in range(1,101)]\n", + "out_l1 = test_regularization_l1(x_train, y_train, x_test, y_test, l1)\n", + "print(out_l1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As before, the top plot shows the training and test RMSE vs. the regularization parameter. The point with the minimum test RMSE is shown with a dotted line. Notice that there is a minimum where the l1 parameter has a value of 0.0044. To the left of the minimum, variance dominates bias; to the right of the minimum, bias dominates variance. Notice that the curve of RMSE has some kinks or sharp bends. This is in contrast to the smooth curve produced by l2 regularization. \n", + "\n", + "The bottom plot shows the values of the 45 model coefficients vs. the regularization parameter. At the left the regularization penalty is small and the coefficients show a wide range of values, giving a high variance model. To the right the coefficient values become more tightly clustered, giving a more constrained and higher bias model. 
In addition, many of the parameters are driven to zero as the regularization parameter increases. The parameters being driven to zero account for the kinks in the RMSE curve. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below fits the l1 regularized regression model on the training data, then computes and displays the results on the test data. Execute this code, evaluate the results, and answer **Question 2** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lin_mod_l1 = linear_model.Lasso(alpha = out_l1[0])\n", + "lin_mod_l1.fit(x_train, y_train)\n", + "y_score_l1 = lin_mod_l1.predict(x_test)\n", + "\n", + "print_metrics(y_test, y_score_l1) \n", + "hist_resids(y_test, y_score_l1) \n", + "resid_qq(y_test, y_score_l1) \n", + "resid_plot(y_test, y_score_l1) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the performance metrics of the l1 regularized model to the metrics for the un-regularized model and the l2 regularized model. In this case the error metrics are between those of the two previous models. The residuals are closer to those of the unregularized model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have explored the basics of regularization. Regularization can prevent machine learning models from being overfit. Regularization is required to help machine learning models generalize when placed in production. Selection of regularization strength involves consideration of the bias-variance trade-off. \n", + "\n", + "Both l2 and l1 regularization constrain model coefficients to prevent overfitting. l2 regularization constrains model coefficients using the Euclidean norm. l2 regularization can drive some coefficients toward zero, but usually not exactly to zero. 
On the other hand, l1 regularization can drive model coefficients to zero. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/Credit_Features.csv b/Module5/Credit_Features.csv new file mode 100644 index 0000000..329e4db --- /dev/null +++ b/Module5/Credit_Features.csv @@ -0,0 +1,1001 @@ +0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34 +-1.864869059639166,-0.9339010037591353,0.9184771676276685,2.2710059235929747,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.1630458092056901,-0.8701833340815202,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.18155879674684203,-0.8701833340815202,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.4789131410521794,1.5251480607350405,-0.8701833340815202,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.9047427423164252,0.024146916773074168,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+1.2140261421940395,1.703911049605318,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.20758784255072407,0.024146916773074168,1.4886197467137152,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.3626301604781133,-0.8701833340815202,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.30557459935806763,-0.8701833340815202,1.95785628118201,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.9976214799989517,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8020057724539522,0.024146916773074168,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.7467445133529922,0.024146916773074168,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.5563470132891891,-1.7645135849361147,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.9012510127304115,0.9184771676276685,1.902684530253145,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,-0.6987928175688173,-0.8701833340815202,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.8150060579794622,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.005776555397474003,0.9184771676276685,1.4886197467137152,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,1.5558400345857029,-0.8701833340815202,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,2.1274617402310874,0.9184771676276685,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.4530740122559791,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.15840750725674918,0.9184771676276685,1.157872651879297,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,0.1191762052780822,-0.8701833340815202,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.09536802942532471,-1.7645135849361147,1.157872651879297,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-0.37486777091334544,0.024146916773074168,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-0.1982648982260506,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7257055738935877,-1.7645135849361147,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-2.2346136944792545,0.9184771676276685,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-2.287087520950489,0.024146916773074168,0.712169575197047,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.5999820607810264,0.000983552835622907,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.0918105708507295,1.3416903575955137,0.024146916773074168,2.065537068931412,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.29927526453011033,0.024146916773074168,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.6575896042051788,-0.8701833340815202,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,1.1445088090554238,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8332257744712419,0.9184771676276685,1.7314770061469296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.6351825900567682,0.9184771676276685,-0.09278473996200068,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.5974680166535271,0.8715095869768755,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.197020791620573,-1.7645135849361147,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.17910213752862986,0.9184771676276685,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.8736084908949318,-0.8701833340815202,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-2.141286590336398,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,-0.0435273700892056,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.9460830378293954,0.024146916773074168,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.2166931946515063,-0.8701833340815202,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.2131575947761968,-1.7645135849361147,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.2039613247155165,0.9184771676276685,1.7895274247278445,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.8233070367656448,-0.7080097193124013,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,-0.06244376793067328,0.9184771676276685,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7465038113077511,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.8233070367656448,1.4135375803531633,-1.7645135849361147,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.19577621107936602,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.0435273700892056,0.9184771676276685,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.7196835879968371,1.1660735452706705,-1.7645135849361147,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8352661824504807,0.024146916773074168,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.43339005037194916,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,-0.1046006161335618,0.9184771676276685,1.7314770061469296,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-1.4503017405821597,-1.7645135849361147,-0.8885566148438414,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,1.2703891878031444,-0.8701833340815202,1.4250403970430767,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +1.2140261421940395,1.7746481456438878,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.2673434116480888,0.024146916773074168,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.2140261421940395,1.221875045798567,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.7098610376907004,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.5812546967791408,0.9184771676276685,1.2941289148926294,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.27261073577272804,0.9184771676276685,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,2.3035462985602226,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.3559653537222094,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.7196835879968371,0.9867436785831314,0.9184771676276685,1.157872651879297,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.13625823124290218,-0.8701833340815202,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.126113093929714,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.2140261421940395,-0.36419823131605006,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,-0.010269945504946504,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.5655407357021147,-1.7645135849361147,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.5999820607810264,-1.540611554529536,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.3705265054419635,-0.9394240185144892,0.024146916773074168,1.3602264200472212,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.4789131410521794,1.1636952089857369,-0.8701833340815202,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.2568729234011431,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.5905095136250601,0.9184771676276685,2.220812271924916,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.4789131410521794,0.6398389276596715,0.9184771676276685,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.8233070367656448,0.8782791589714488,-0.8701833340815202,1.3602264200472212,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.9107628400069734,1.7570173834616774,-0.8701833340815202,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.595876009995791,-0.8701833340815202,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.161312474665324,-1.7645135849361147,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.8862929316691417,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.5555249926108252,0.024146916773074168,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.41035040137585416,0.9184771676276685,1.7895274247278445,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.05350733076049297,0.024146916773074168,1.4250403970430767,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6905536020519931,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,-0.8020057724539522,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,2.1308376361876697,-1.7645135849361147,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.09077642592192822,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.0029553479656197,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.7552219523560137,0.9184771676276685,1.6723990933496,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6932941575756586,0.9184771676276685,1.5510106274611108,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.4274666579207524,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.521474559691811,0.9184771676276685,-1.7642799753681382,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.7793217579792584,0.9184771676276685,1.5510106274611108,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+1.9107628400069734,2.4329905252942026,0.024146916773074168,1.7895274247278445,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.2342611082653446,0.9184771676276685,1.95785628118201,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.10694877085967346,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,-0.04132005928022949,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.20399462102766036,1.38268745847696,0.024146916773074168,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.6395608388143988,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.049062247026092645,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.2258418618515272,0.024146916773074168,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.29524023815448264,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,0.016891376735312855,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,2.0600694894399676,-0.8701833340815202,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.268395504333086,-0.8701833340815202,0.46481087889052375,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,1.1902546855206053,-0.8701833340815202,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.4985559105628365,-1.7645135849361147,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.4089028087708907,-0.6923799912587492,-1.7645135849361147,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.7980319300763852,-1.7645135849361147,2.6026928508806457,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-1.2453441186561935,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.6572241362412526,-1.7645135849361147,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,-2.3417893563402172,0.9184771676276685,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,1.2282717341633516,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.5207266331292069,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.4666264793189246,0.024146916773074168,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,0.5075057842703586,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.4789131410521794,1.403874996819156,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.15961568115645694,-0.8701833340815202,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.0645089130574243,0.7386434343965812,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.025429148231241563,0.024146916773074168,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+0.2878338890420431,-0.35291393483100025,0.024146916773074168,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.6079245718904525,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.4008410049331007,0.024146916773074168,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-1.4535971832215155,0.9184771676276685,2.065537068931412,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.2918873417280138,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.1662809633907126,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.592843728434742,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.7121648599011403,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.335477709772362,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,0.4787373511257855,0.024146916773074168,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.6204389482909707,-1.7645135849361147,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.3512676640726533,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.14853305289879132,-0.8701833340815202,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.25231870427607594,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +2.0918105708507295,1.8502419194373931,-0.8701833340815202,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.8579265443504762,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.7196835879968371,1.6394279689830877,-0.8701833340815202,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4785853749281765,0.9184771676276685,2.220812271924916,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+-0.2903479331695421,0.15801445969508324,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.32101144461029796,-0.8701833340815202,0.8674447781905507,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.5782220978378143,-0.8701833340815202,-0.7625863844258073,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.884744657979629,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.7196835879968371,0.4478039952178397,0.024146916773074168,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.025819390968304514,-0.8701833340815202,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.06862371919994346,0.9184771676276685,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.50317705443334,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.3294392065544454,-1.7645135849361147,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-1.628249904034309,0.9184771676276685,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.0309145871324998,0.024146916773074168,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.544071429465936,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7522348097976082,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.9870846309824751,-0.2918873417280138,-1.7645135849361147,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1405488754873503,0.9184771676276685,-1.155724360007621,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.5047158993834735,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.3661489309039503,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8150060579794622,-0.8701833340815202,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.808989623532786,0.024146916773074168,1.157872651879297,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-2.5289616216937225,0.9184771676276685,0.9424550121005936,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.4835157693264826,-0.8701833340815202,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.30941846072708806,-1.7645135849361147,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.22429567877487686,0.024146916773074168,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.0661130852226384,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.8352661824504807,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,1.4277206335103227,-0.8701833340815202,2.4172109717234496,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,-1.258038973753803,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-1.864869059639166,0.270995778550678,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9764819741456465,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8233070367656448,-0.5481503102688159,0.9184771676276685,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.6421117319751589,-0.8701833340815202,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.2845415373982147,0.9184771676276685,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-1.201193594157096,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4836416954897562,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.20138253190192865,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.3705265054419635,-0.6887297979688279,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.2878338890420431,0.4470493730178621,-0.8701833340815202,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,1.4585566314896248,0.9184771676276685,1.4886197467137152,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.0846426621106051,-0.8701833340815202,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.5327681750371434,0.9184771676276685,1.4250403970430767,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.2660299381708094,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-1.8571427070735047,0.9184771676276685,2.1698522754169245,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.7754560779744457,-1.7645135849361147,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.7899785253100698,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,-0.4921885278537161,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,0.5772481509827424,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.2939731867977968,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.7389019168708525,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,0.9715095521911344,-0.8701833340815202,2.6026928508806457,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.17944725343182721,-0.9273044840465697,-0.8701833340815202,2.3204559381997742,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.6434538252089206,0.9184771676276685,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.3812351423791228,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.828725279026598,-0.8701833340815202,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.5999047283469973,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.7196835879968371,0.6234869649375272,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.174202826038074,-0.8701833340815202,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.5974680166535271,0.29372608028670516,0.9184771676276685,-1.601427436689871,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.6117937334450844,-0.8701833340815202,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7182250456574802,-1.7645135849361147,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.199837973443276,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.17304107162380952,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.743450219619934,0.9184771676276685,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-1.2203235777442643,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.9350037144551346,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.9684913566991636,0.024146916773074168,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.2679999527780403,0.9184771676276685,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.6169546910909554,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.9096929049527973,0.024146916773074168,0.37810951241256296,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.2845415373982147,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6796492997141513,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.2901581759113947,-0.8701833340815202,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.6896413773274241,0.024146916773074168,1.6122566565644016,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-1.168132361826232,0.3118774850855542,-1.7645135849361147,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.5968843711034428,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7196835879968371,1.0120649721293709,-0.8701833340815202,1.2941289148926294,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,-0.30264746579964313,0.9184771676276685,2.220812271924916,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.41958437505525603,0.9184771676276685,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.2258418618515272,-1.7645135849361147,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.3243915106367889,0.024146916773074168,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,0.622828550324511,-0.8701833340815202,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.28946792251309633,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.9870846309824751,-0.7351177221696129,-0.8701833340815202,2.1181021700318468,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7407091334673612,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.9001768006051394,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.581861979833523,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.270995778550678,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,0.8471170589214615,0.024146916773074168,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.10251833427823558,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.9500519187228837,-1.7645135849361147,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,1.5223659684262065,0.9184771676276685,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-1.168132361826232,-0.6316906709287615,0.9184771676276685,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.34293757897345356,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +1.2140261421940395,0.7170942963619847,0.9184771676276685,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,0.049158037317674076,-0.8701833340815202,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.15418779747967742,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-1.3204807646545005,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-2.5616057574520994,-0.5753996905428135,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.36136787900912043,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,2.315463924071049,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.2878338890420431,0.1763049132775382,0.9184771676276685,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8059919082536569,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.05684458250013819,-1.7645135849361147,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.249561855219652,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.5335263748256596,0.024146916773074168,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.9494254972124627,0.024146916773074168,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,0.8326485624349164,0.024146916773074168,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.9163847799675291,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.45944447609274913,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.8393567172180595,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.5637645282690953,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.8958888852682305,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.2912178081355871,-0.8701833340815202,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-2.2136129465191967,0.024146916773074168,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-1.658839290029222,0.024146916773074168,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.08963107707350554,-1.7645135849361147,1.0158165352161053,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.9007313188405941,-0.14878268289945443,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.6989089628615197,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+-1.168132361826232,-0.22278487572030806,-0.8701833340815202,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.44697087756068,-1.7645135849361147,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.13706249611248938,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.14938213272168033,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,0.5891333605456632,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.8233070367656448,-0.9505415798891141,0.9184771676276685,1.7314770061469296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.48438876209009274,-0.8701833340815202,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5270797453951935,0.9184771676276685,1.6122566565644016,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.014254053604786,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,0.16742660797110837,-0.8701833340815202,1.7314770061469296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.9870846309824751,-0.8673127908963916,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-1.419408361677478,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.2372967255742207,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5846123986652059,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.4089028087708907,1.6929071744373827,-1.7645135849361147,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.1363904554511066,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.12645735501432606,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.699711546570046,0.024146916773074168,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+1.7083686963912428,2.0847641111591955,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,0.305995752285377,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,2.0665293159862403,-1.7645135849361147,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,0.14328838661407511,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.0048988774125396616,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8352661824504807,-0.8701833340815202,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.864869059639166,0.8343263196636707,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.30738342365816596,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.4295699545716889,0.024146916773074168,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-0.5506038509079058,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.6607860386328506,0.9184771676276685,1.2266960166673486,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.5955857540826947,0.9184771676276685,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.6112514969755848,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.6721912732076694,1.9216455059659838,-1.7645135849361147,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,0.8828622363597508,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.4751475801392828,-0.8701833340815202,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.021697749245869,0.9184771676276685,1.2266960166673486,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-1.1045421368991482,0.9184771676276685,1.157872651879297,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-1.0407168011077441,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.751817879746369,-1.7645135849361147,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.260590529406177,-0.8701833340815202,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.4789131410521794,0.8850133509989807,0.9184771676276685,1.6723990933496,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.4831103056803951,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.8266551888028266,-1.7645135849361147,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.8520877707552964,-1.7645135849361147,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.8099904137731051,0.9184771676276685,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.053263233524950146,0.024146916773074168,0.790710020896414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.2878338890420431,0.16601916122315946,0.024146916773074168,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.6472830029739154,-1.7645135849361147,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.5864263902861151,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.7541508211092194,0.9184771676276685,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-1.0870450269365557,0.9184771676276685,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.848080723599094,-0.8701833340815202,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.5762344928847029,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.8890370371133962,-0.8701833340815202,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.5459177419992647,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-1.3705265054419635,-0.8610477022019005,0.024146916773074168,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-2.7938798157605738,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.0333113835260288,0.024146916773074168,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.064070695443022,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.5676604143861476,-0.8701833340815202,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.6225943688262727,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-0.61781686400344,-1.7645135849361147,-0.524069587159072,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.16648848097689462,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.5800407535661753,-0.8701833340815202,0.37810951241256296,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,0.7595419962838468,-1.7645135849361147,1.157872651879297,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.592843728434742,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.5307036491782056,-1.7645135849361147,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.7289757323429747,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.2825453830568732,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.2412325051405017,-0.8701833340815202,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.1250044128912331,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0995186710056148,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.3705265054419635,0.4409964323758953,-1.7645135849361147,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,1.1301513057406816,-0.8701833340815202,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.5913541661427753,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.7951741669047615,0.9184771676276685,-0.301466121277391,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.0503326470007321,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.2993458529138144,-0.8701833340815202,2.6474964086269943,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.33270967998683115,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.0918105708507295,1.4452326911786044,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,2.021950078687176,-0.8701833340815202,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.6861188294812336,0.024146916773074168,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-1.864869059639166,0.4356766833937663,-1.7645135849361147,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.5362472898802806,-0.1784887038728471,-0.8701833340815202,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.8220609223671386,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.7044842871017741,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-0.5947383240162177,0.9184771676276685,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.1172035170558898,-0.8701833340815202,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.2878338890420431,0.5150462446255809,-1.7645135849361147,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.36886272628501093,-1.7645135849361147,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.7853425234764302,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.9870846309824751,0.634628842746036,-1.7645135849361147,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.6483622211883368,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.5362472898802806,-1.2968916871839868,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.5707499739659353,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.41919103550504583,-1.7645135849361147,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.9707981653279775,0.9184771676276685,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.8620897704913175,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-1.189044047540058,-1.7645135849361147,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.3752633224683898,-0.8701833340815202,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,1.215654319025098,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-1.5459177419992647,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.8517068185023379,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.04463244508801633,-1.7645135849361147,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.7922902861806974,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4618728530926033,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,-0.0038273452346677026,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.8589660904896659,0.9184771676276685,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,0.4413755753445366,-0.8701833340815202,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.09192279376412217,-0.8701833340815202,0.19764313372674408,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.4022994288235318,-1.7645135849361147,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.03156355086263981,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.5770698364274996,-0.8701833340815202,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.6019143737291102,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.5331771509901511,-1.7645135849361147,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.4590706135137063,0.9184771676276685,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.28090915727133475,-0.8701833340815202,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.31397161421484665,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,1.1885575929214574,0.024146916773074168,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.1511821572793285,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +2.0918105708507295,2.242714897447815,-0.8701833340815202,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +2.0918105708507295,2.335404564249628,0.024146916773074168,1.902684530253145,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.4925340087405463,-0.8701833340815202,0.2890958389937262,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.05072735693212146,-0.8701833340815202,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.5999820607810264,-1.3505876803356258,0.024146916773074168,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,2.2943102181590924,0.9184771676276685,1.7314770061469296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-2.444378052901822,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.20399462102766036,-0.1121511113947324,0.9184771676276685,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,2.167499506108515,0.024146916773074168,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.36777221524449144,-0.814001366022412,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.7676432728708812,0.9184771676276685,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.735931716728659,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.09709410787451524,-0.8701833340815202,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9821909660708611,0.9184771676276685,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,1.4393052567549693,0.9184771676276685,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.047399286111720625,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.6572241362412526,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.3634900601689934,-0.8701833340815202,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.157194416874332,-1.7645135849361147,-1.9354874994743552,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.3832196143093988,-0.8701833340815202,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.26997438217857805,-1.7645135849361147,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.0038273452346677026,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.3515686599427887,2.040712547486976,-0.8701833340815202,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,0.08514257186261158,0.024146916773074168,1.6122566565644016,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,-0.03526937961693151,0.024146916773074168,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-0.8757139119019909,-1.7645135849361147,1.0158165352161053,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5989810587340788,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.6360570504701303,0.024146916773074168,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.31690788263994424,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.6446534101214754,-0.8701833340815202,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4089028087708907,-1.419408361677478,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.23635259189993624,0.9184771676276685,0.790710020896414,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.2170848633152671,-1.7645135849361147,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.12091855757970564,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,-1.0685580904527296,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.3776553727277202,0.024146916773074168,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.2162003186429162,0.9184771676276685,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.26340700114368076,0.9184771676276685,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.0645089130574243,1.4179865778317933,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.06637303326589096,0.9184771676276685,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.5319116911937641,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.7191577401387655,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1392261885931338,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,0.08564228773842482,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,1.618007495993326,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.1832711441739957,-1.7645135849361147,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0820891847902205,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.3579890974547502,-0.8701833340815202,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.2240549767296369,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.190388353733601,-0.8701833340815202,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,-0.5367618250369562,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.1739744417259306,-1.7645135849361147,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.02294689018390335,0.1818809077115242,-1.7645135849361147,-1.601427436689871,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.7821764432392455,0.1650800085798083,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9561365403500611,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.7842192073172909,-1.7645135849361147,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9109594356590391,-0.8701833340815202,1.6122566565644016,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-2.1781638829926115,0.45981823022724516,-1.7645135849361147,2.6026928508806457,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.992487973892608,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.32719139587956814,0.9184771676276685,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.20513367682089648,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-1.168132361826232,-0.157200465136036,0.024146916773074168,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6264704718793631,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-1.6705001602889722,-0.8701833340815202,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8099904137731051,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.4789131410521794,0.43947874389697433,0.9184771676276685,2.1698522754169245,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7741247466618797,0.9184771676276685,-0.8885566148438414,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.31895803004084333,0.9184771676276685,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.5134868205257854,-0.8701833340815202,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.20399462102766036,0.11038416485333973,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-1.5601753410747223,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.9631924429819175,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.853776726368572,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,-0.3480079612200745,0.9184771676276685,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.5999820607810264,0.08414255833190894,-0.8701833340815202,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6796492997141513,0.024146916773074168,1.6122566565644016,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.602385333817686,0.024146916773074168,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.9609383896022512,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.982662296108943,0.024146916773074168,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,0.17257413290892082,-0.8701833340815202,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.13032387665425876,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.8906429999755946,-0.8701833340815202,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.1346598898418097,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.8233070367656448,0.6201915222981714,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.43643799353091134,0.024146916773074168,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-2.513846875128404,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.8295669888243895,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.5225428344825612,-1.7645135849361147,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.2903479331695421,-0.4355581043724524,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.28776071143905857,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.4989308087237618,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.2745915468770928,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.24741669362265298,0.024146916773074168,2.065537068931412,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.48283445868131436,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.4153190231498993,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.0645089130574243,0.17490713620721476,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.8531897741319698,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.31940047520481774,0.024146916773074168,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.169731837552031,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-1.6822675239881504,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.8600064759879598,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.8527413567912606,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.32853598950625584,-1.7645135849361147,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.3515686599427887,0.08063638645521447,0.9184771676276685,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.9772747401966062,0.9184771676276685,-1.0194680969942889,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.088286971485364,0.024146916773074168,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-0.2903479331695421,-0.6316906709287615,0.9184771676276685,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.5057039137912442,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8991034832839071,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.5232545259503283,0.9184771676276685,1.7314770061469296,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-1.1877011423954342,0.024146916773074168,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.9405314679764167,0.9184771676276685,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.8905489796993064,0.9184771676276685,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.3131343709302673,-0.8701833340815202,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.5704070551695962,0.9184771676276685,2.012130890609527,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.9870846309824751,-0.6850899159731182,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.9971538191659439,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,0.485715230519944,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.28520760997424366,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.7196835879968371,1.5945220453619653,-0.8701833340815202,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.8610477022019005,-1.7645135849361147,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-2.4231964684739915,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.1656736019042681,0.024146916773074168,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.27876055352245976,-0.8701833340815202,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+1.2140261421940395,1.700919300263891,0.9184771676276685,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.5416302843664004,0.9184771676276685,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.7960495952705704,0.9184771676276685,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.7744428526058714,-0.8701833340815202,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.332254656934756,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.059855258630974,-0.8701833340815202,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.9821909660708611,-0.8701833340815202,1.2266960166673486,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8831100958729894,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.8926822870899767,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-0.9870846309824751,-0.7881506134781747,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.028700882624603703,-0.8701833340815202,0.19764313372674408,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,1.3443265185581093,-1.7645135849361147,0.006858926312855975,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.6896413773274241,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.3515686599427887,1.6356824940778016,0.9184771676276685,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4904144699247184,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.855116015085992,-0.8701833340815202,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,0.1385019124263602,-0.8701833340815202,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.8259314178186645,0.9184771676276685,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.08734342912304956,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-1.7741247466618797,0.9184771676276685,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.7379548257810722,-0.8701833340815202,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.395405584273037,-0.8701833340815202,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.8969595270993638,0.024146916773074168,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-1.5946831539714412,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.0631351227973544,0.024146916773074168,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.3596058070390366,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.3939584215549896,0.024146916773074168,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.5172894443811066,0.47468018702782266,0.024146916773074168,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.6548319621298557,1.5388073833809512,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.5854531930634803,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-0.6100779976794142,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.06076345814727301,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.6666014663323474,0.024146916773074168,0.6317364076235067,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.07709897911105941,0.024146916773074168,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.11136402601815405,-0.8701833340815202,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.6100779976794142,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.7861834110050226,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.3248065589153429,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.05128287208176305,-0.8701833340815202,-0.09278473996200068,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7257055738935877,0.9184771676276685,2.6474964086269943,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.5196921311745797,0.024146916773074168,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.5055460761853598,0.9184771676276685,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.3013554817389127,-1.7645135849361147,0.9424550121005936,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.583772152551548,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.22151602542610968,0.9184771676276685,1.902684530253145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.9007313188405941,1.246664840019176,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.22070148573413742,-0.8701833340815202,0.006858926312855975,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.8424331628569156,0.9184771676276685,1.95785628118201,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.7647401157216668,0.9184771676276685,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.22787280876610916,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.5687406712279073,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7386491532210315,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.6755687109437596,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.1402656985815554,0.9184771676276685,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-1.864869059639166,-0.41402663031645043,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.3720536463536046,-1.7645135849361147,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.2451944382927659,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.9012510127304115,0.9184771676276685,2.2710059235929747,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.7666748268503689,-0.8701833340815202,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.07426769392522936,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.9394602907081131,-1.7645135849361147,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.49810842098674996,0.024146916773074168,-0.7625863844258073,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.2892112922310683,-0.8701833340815202,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.26492399243317133,0.9184771676276685,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.5737317066598784,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.6263619661586286,-0.8701833340815202,-0.524069587159072,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,2.107850796138405,-1.7645135849361147,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.8622455186147777,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.5679107083202207,0.024146916773074168,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.7275825113990118,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.08514257186261158,-0.8701833340815202,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.6443805343692689,0.9184771676276685,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.7083686963912428,1.3269036196508959,0.024146916773074168,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.3772570059384482,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.1636952089857369,0.024146916773074168,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.0457033605284425,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-1.4129978142774215,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-1.0335517184948613,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.18604712521819458,-0.8701833340815202,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.25005800107908954,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.2898797837478186,0.024146916773074168,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.2140261421940395,0.20075219060594576,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-1.2189476914740385,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0648923202145708,0.024146916773074168,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.3326671805078755,-1.7645135849361147,1.2266960166673486,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.7126430122214735,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.015663496158023912,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.15901145260159322,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.2170848633152671,-1.7645135849361147,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,0.1911209427876586,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-0.8079896100020846,0.9184771676276685,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.8820508953450885,0.9184771676276685,1.0876002682121086,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.09249636037481059,0.024146916773074168,1.902684530253145,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-2.365006513655273,0.9184771676276685,1.7895274247278445,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.2647178022431281,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.5522421440334229,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.1532824772573735,0.024146916773074168,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.7407981900160938,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.2272251334493627,-1.7645135849361147,-0.1954948418550697,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,-0.6634639419929222,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.726547430338748,-1.7645135849361147,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.18002492270264447,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.606257882523891,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.5999820607810264,-0.04573846864917447,-1.7645135849361147,0.9424550121005936,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.3515103209997907,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,0.42228042194752646,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.8220609223671386,0.9184771676276685,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.20440238389586818,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.8103519000326348,0.024146916773074168,2.6026928508806457,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.13080637707915702,0.9184771676276685,1.2941289148926294,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.2095238594993918,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.7970403814552081,0.9184771676276685,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.5213403712962208,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-0.8579265443504762,-1.7645135849361147,1.157872651879297,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.2878338890420431,0.4253547267036001,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.5268070988366464,-1.7645135849361147,-1.446152233696366,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.37201391067774936,0.9184771676276685,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.7083686963912428,2.0884647749212926,-0.8701833340815202,1.157872651879297,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.7183444062868711,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.864869059639166,0.5423230948248702,-1.7645135849361147,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.4571996717451271,-0.8701833340815202,-1.601427436689871,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7101499769735495,0.9184771676276685,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,0.5338829945408807,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.587136429579318,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.6231577996861181,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.3368639577486042,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,0.096091800409488,0.9184771676276685,2.1698522754169245,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.2555700745847507,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.1693220725900519,-0.8701833340815202,0.6317364076235067,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.6679395191554858,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.4789131410521794,0.6646219665213371,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,0.595876009995791,-1.7645135849361147,2.1181021700318468,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.5367025085097402,-0.8701833340815202,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.5679107083202207,0.9184771676276685,0.8674447781905507,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.6616780548468318,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-1.168132361826232,-0.25491914384616077,-0.8701833340815202,-1.9354874994743552,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.7436478425663765,0.024146916773074168,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.7080097193124013,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.723831366480044,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,2.4091752342819883,-0.8701833340815202,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.6186796141994723,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.4789131410521794,0.7651564599365479,0.024146916773074168,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-1.5057846406795408,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.7891353421234543,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,0.8376752945694327,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.34172760417431386,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.3216966439721684,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.5410699254366262,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.8264780277578018,-0.8701833340815202,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.7145010044246421,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.205269034675227,0.9184771676276685,0.37810951241256296,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.6244767904461521,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.4570063798051685,0.9184771676276685,1.2941289148926294,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-0.2932274535176988,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.06020384147414028,0.9184771676276685,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.5580728910888668,-0.8701833340815202,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.03636739192848441,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.4089028087708907,0.6424360912895554,-1.7645135849361147,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.2881559427426412,0.9184771676276685,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.860111785512503,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.7204565782048127,-0.8701833340815202,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.2488949862868526,-0.8701833340815202,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-0.8000173193284096,0.024146916773074168,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.27086014789172,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.09536802942532471,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-1.072234319393326,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-1.0759210668859946,-0.8701833340815202,1.2941289148926294,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.2393390740278603,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.9007313188405941,0.4776321181462543,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.5186214893434488,-1.7645135849361147,-0.7625863844258073,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.8975791350704793,0.024146916773074168,0.790710020896414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+0.9007313188405941,0.28776071143905857,0.9184771676276685,1.0876002682121086,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.6951786325273495,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1169791344049713,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.8781368027101244,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.2928755737412154,-0.8701833340815202,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.7080097193124013,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.19143254138080676,-1.7645135849361147,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.08614180988836612,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.8114901663342491,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.97677444224055,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +2.405105394204176,1.0835624157804757,-0.8701833340815202,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.015663496158023912,0.9184771676276685,2.1181021700318468,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.6536720530279021,0.024146916773074168,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.5804166366717869,-1.7645135849361147,1.6723990933496,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.07370218267207239,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.6316906709287615,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.9649611699166599,0.024146916773074168,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.2140261421940395,1.8132607815571835,-1.7645135849361147,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+2.0918105708507295,1.2820895413511235,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.9870846309824751,-0.7512778714917354,0.9184771676276685,-0.7625863844258073,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,0.21980136877847217,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,0.1697689415913028,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.533530121097724,-1.7645135849361147,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-1.167723728741789,-0.8701833340815202,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.11136402601815405,0.024146916773074168,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.23501849269226532,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,0.8633388724093886,-1.7645135849361147,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.07087834334411355,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.8620897704913175,-0.8701833340815202,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.008783116243064,0.9184771676276685,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.2341638544154838,-1.7645135849361147,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.377727966027024,0.9184771676276685,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.30467503092239573,0.9184771676276685,0.5493170365187798,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.9856285422395067,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.242803654254177,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.7228952840780793,0.9184771676276685,1.0876002682121086,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 
+0.9007313188405941,0.04710052411460556,0.9184771676276685,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.7196835879968371,0.05990640739345421,0.9184771676276685,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,1.0195895368790233,-1.7645135849361147,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.2885877706351847,0.024146916773074168,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.26709577344560614,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.8937502667363918,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.1681047674738778,-0.8701833340815202,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.7324888990827545,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.9023261211529298,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.2878338890420431,0.03312570587673631,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.9685316434249494,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +2.0918105708507295,2.2678525198905097,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.9007313188405941,1.4775246058726885,-1.7645135849361147,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.3131343709302673,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.6083645434779358,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.3425283309567904,0.024146916773074168,-0.301466121277391,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.20399462102766036,1.2050096669502504,0.024146916773074168,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.7608793866321918,0.9184771676276685,0.006858926312855975,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.864869059639166,-2.2136129465191967,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.8704567957981898,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-1.4388336145456433,0.9184771676276685,2.220812271924916,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.08113785264937592,0.024146916773074168,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-2.9213656571223523,-0.8701833340815202,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.7812785045506497,-0.8701833340815202,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.32032660942562263,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.2599881839166103,0.9184771676276685,1.8465854797112444,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.8220609223671386,-0.8701833340815202,0.19764313372674408,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,1.2573747614113253,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.2503718128913762,-0.8701833340815202,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.3705265054419635,-1.4887179359473908,0.9184771676276685,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.09757769413898225,-0.8701833340815202,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,0.4341527123472398,-1.7645135849361147,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.6479377435719302,0.024146916773074168,-0.524069587159072,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,2.018610506961237,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.7681016437779716,0.024146916773074168,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,1.3274754891933684,-1.7645135849361147,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+0.9007313188405941,0.73834241410094,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.04795336798206724,-0.8701833340815202,-0.1954948418550697,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-1.0746909786115382,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.2878338890420431,0.34743074918374894,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.03676338588404028,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.3515686599427887,2.281740099563163,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.5362472898802806,-0.37987728657481445,0.024146916773074168,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.05121227030699309,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8230719207474747,0.024146916773074168,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.2878338890420431,1.0010634288448887,-1.7645135849361147,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.2285784732834433,-0.8701833340815202,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-1.1664028497233658,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.3582256242750765,-0.8701833340815202,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.1234704195172134,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.5662523845782773,0.9184771676276685,1.6122566565644016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.811994329215319,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.8261096841261264,0.024146916773074168,0.46481087889052375,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.6113572275221764,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+-0.2903479331695421,0.9502321945113027,-1.7645135849361147,1.8465854797112444,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.16445973712336395,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.11098661301485367,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.2878338890420431,2.1377662612276866,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.02634264676011784,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.3260509021985633,-0.8701833340815202,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,0.23724122460387168,-1.7645135849361147,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.521474559691811,-1.7645135849361147,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.4859449426773566,-0.8701833340815202,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.19709162743479447,-0.8701833340815202,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.5547221536718805,0.024146916773074168,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.2878338890420431,0.39348988683374747,-1.7645135849361147,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.10575935485340944,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.6299482536657656,-0.8701833340815202,2.220812271924916,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.7285220064239617,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.4919267256861641,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.482414629219901,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.1100038203376585,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,0.6092563729078653,-0.8701833340815202,2.2710059235929747,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.3515686599427887,0.9213045523499567,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.2812163294376125,0.9184771676276685,1.902684530253145,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6923799912587492,-0.8701833340815202,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.3659091137665273,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.20399462102766036,1.2703891878031444,-1.7645135849361147,1.902684530253145,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.2805523163491565,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.36777221524449144,0.13273457402653865,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.1688325190430867,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+1.7083686963912428,1.2208403413587743,0.9184771676276685,1.2941289148926294,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.3950738730377965,1.1731823275244582,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.2878338890420431,-0.9131268329230363,-0.8701833340815202,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.239134984794895,0.024146916773074168,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.8778227787714831,-1.7645135849361147,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.233237542347047,0.024146916773074168,1.3602264200472212,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.3068376454023158,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.06132331797343269,-0.8701833340815202,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,1.463024687661877,-1.7645135849361147,1.3602264200472212,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-0.8393567172180595,-0.8701833340815202,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.5637645282690953,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.5712337723661562,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5319116911937641,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.3813121612995245,0.024146916773074168,1.157872651879297,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.20399462102766036,0.735931716728659,-1.7645135849361147,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.1664028497233658,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,1.4563167799119987,-1.7645135849361147,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.734286760747295,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.864869059639166,-1.814965246353996,0.024146916773074168,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.228609891637058,0.9184771676276685,2.1698522754169245,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.4789131410521794,1.7359534860455126,-1.7645135849361147,1.6122566565644016,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.3935735442142963,-0.8701833340815202,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.3705265054419635,-1.2608771188872538,0.024146916773074168,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-2.070140172686223,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.7831062738827694,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.3022004111919005,0.024146916773074168,1.5510106274611108,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.6287422049304561,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+1.2140261421940395,1.4494004059807892,-0.8701833340815202,1.5510106274611108,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.7599160091146911,-1.7645135849361147,2.012130890609527,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.5670812796661316,-1.7645135849361147,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,2.4258595386936377,-0.8701833340815202,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.7531924593174495,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.0111216239078757,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.28733355500563035,0.024146916773074168,-1.155724360007621,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.15043492390036597,-0.8701833340815202,0.6317364076235067,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.3705265054419635,-1.5388476701067393,0.9184771676276685,1.0876002682121086,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,0.5782711898373049,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.527883812422618,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.6401638595412662,-1.7645135849361147,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.7032474092821677,-0.8701833340815202,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.5971527789106184,0.024146916773074168,1.0876002682121086,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.3121380979339703,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.020537070502105105,0.9184771676276685,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.8831100958729894,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.5974680166535271,2.0468337816576634,-0.8701833340815202,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.9601546796032658,-0.8701833340815202,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.046845442402512136,-0.8701833340815202,-1.0194680969942889,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.0335517184948613,0.9184771676276685,1.157872651879297,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.2910612818983251,0.9184771676276685,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-1.7911632411419318,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.2618773361961817,0.9184771676276685,2.065537068931412,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.10153172062387894,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.9840098187349172,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.2878338890420431,0.27746966942420725,0.024146916773074168,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,-0.2792253158511944,0.9184771676276685,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5629421261882505,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.4500652137618334,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.2878338890420431,0.643408679613227,-0.8701833340815202,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.3274754891933684,-0.8701833340815202,2.3204559381997742,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.8486081780439698,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.7351177221696129,0.024146916773074168,1.8465854797112444,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.5782220978378143,0.9184771676276685,1.7314770061469296,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.20399462102766036,-0.0988225016543111,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,0.6646219665213371,0.024146916773074168,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.6378077536873272,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.6634639419929222,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,1.9399023430912783,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.6351825900567682,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-1.279479025309302,0.9184771676276685,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.41996987015749826,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,0.6378876137537329,0.024146916773074168,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,0.5071456116978844,-1.7645135849361147,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,1.1308174931073358,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.1374458049395338,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.013725484799558268,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.8103519000326348,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.11331666314280593,-0.8701833340815202,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.10691913654254248,-0.8701833340815202,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.012963898753946785,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.4153362995557147,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.4454066153699499,0.024146916773074168,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-1.6862139814977934,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.4913598081011363,-0.8701833340815202,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7551098962332411,-1.7645135849361147,1.0158165352161053,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.7182250456574802,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-1.308632245583138,0.9184771676276685,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.5118200205651716,-0.8701833340815202,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.8233070367656448,-0.7754171569015452,0.9184771676276685,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.2812163294376125,0.024146916773074168,0.19764313372674408,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,0.5136133644449646,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.6814602871797085,0.024146916773074168,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.9007313188405941,1.3238493548261325,-0.8701833340815202,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.5139834672953743,0.024146916773074168,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.7351203928895729,-0.8701833340815202,1.157872651879297,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.9007313188405941,-0.13033672555645318,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.0146373624318314,0.9184771676276685,2.1698522754169245,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.6693948477767842,0.024146916773074168,0.790710020896414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4307041404521588,0.9184771676276685,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.20303477290284017,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+1.7083686963912428,2.410738320189429,-0.8701833340815202,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.299151050982891,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.7821764432392455,1.5156313986744103,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.015309403093213013,0.9184771676276685,0.790710020896414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.35713397365652133,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.13625823124290218,0.9184771676276685,0.37810951241256296,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1299291667228684,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9339010037591353,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.6884503735133218,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.2878338890420431,0.09906187633759772,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.5359522005242147,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.12738614636552228,-1.7645135849361147,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.6995296359802818,-0.8701833340815202,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.17944725343182721,0.1084222042616649,-0.8701833340815202,0.790710020896414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.20399462102766036,0.4735714670084882,-0.8701833340815202,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.8918610055772254,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.7145010044246421,0.9184771676276685,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8190326780364441,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-1.0041187948795498,-0.8701833340815202,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,0.572463200344689,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.554533314240912,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.5132548952898817,-1.7645135849361147,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,0.3616238456185479,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.7896892633558561,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.8687917682733318,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.276608360989972,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.8233070367656448,-0.15358610829310324,-1.7645135849361147,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.3478384433398584,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,2.6191927041199468,-1.7645135849361147,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,0.2134828821505448,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,2.345303594009422,-1.7645135849361147,2.3204559381997742,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.02924698026771496,-1.7645135849361147,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.42074051454030087,0.9184771676276685,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.36561574231669436,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +1.7083686963912428,2.144758900868038,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.7332297842939325,0.024146916773074168,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-0.24068123798813007,0.024146916773074168,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.3484581912618832,-0.8701833340815202,1.6122566565644016,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.6002172385111977,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0734620635258851,0.9184771676276685,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.8695312731185745,0.9184771676276685,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.9007313188405941,-0.33063755297424724,0.9184771676276685,1.7895274247278445,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.7541508211092194,0.9184771676276685,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.41623741273406273,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.474319143180654,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.8746607713696127,0.024146916773074168,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.9727505751602739,0.9184771676276685,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.6143716283422375,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.9007313188405941,-0.29524023815448264,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-1.5144035180971214,0.024146916773074168,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.2020069648531537,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.2340222196959683,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.3428208013319771,-0.8701833340815202,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.4890073987627655,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+-0.9870846309824751,-0.5737317066598784,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-1.2299961396133003,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-0.6518996767241932,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.3459111058026742,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.6007034789951051,-1.7645135849361147,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.42228042194752646,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,0.21845001325908134,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.5846123986652059,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.5228987293528337,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,0.5118200205651716,0.024146916773074168,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,-0.1517827246962295,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.6870590100188486,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.951579224650985,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.31281734787236537,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.8673127908963916,0.9184771676276685,1.7314770061469296,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.5352935227273163,0.9184771676276685,1.2266960166673486,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.9505415798891141,-0.8701833340815202,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7821764432392455,0.6530943982567955,0.024146916773074168,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.3097799469866166,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.42141070421870225,-0.8701833340815202,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,-0.03252843544438422,-1.7645135849361147,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.4995586376503998,0.024146916773074168,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.00865627294634702,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.1525894467574407,0.024146916773074168,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,-0.440058169608717,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.7196835879968371,0.0558223552591907,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.5038995200365869,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.4789131410521794,1.4024373203858955,-0.8701833340815202,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.8233070367656448,0.6313618091255205,-1.7645135849361147,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.6006820720103364,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.442621007216505,-1.7645135849361147,0.790710020896414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.9077151602490164,-1.7645135849361147,-0.524069587159072,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +2.0918105708507295,1.4257796963318181,0.9184771676276685,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.9007313188405941,0.20576854486813365,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.8393567172180595,0.024146916773074168,1.7314770061469296,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.5006408519337913,-0.8701833340815202,2.1181021700318468,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,0.007370269106689906,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.06499333610718552,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.8332257744712419,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,1.605012905652971,-0.8701833340815202,1.2266960166673486,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.8978451652912308,0.024146916773074168,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.24697594883654195,-1.7645135849361147,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.5806610250226196,-0.8701833340815202,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.2240549767296369,-0.8701833340815202,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.6715312075288409,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+1.4789131410521794,1.2342271202878985,-0.8701833340815202,-0.09278473996200068,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.5362472898802806,-0.6932941575756586,-0.8701833340815202,2.1181021700318468,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.2923143627788825,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.41919103550504583,0.9184771676276685,1.157872651879297,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.5028156696687675,-0.8701833340815202,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.5547034960132607,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.28387580895683195,-0.8701833340815202,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.6378876137537329,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.012424657327523568,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.4243762230642614,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.604255005380252,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.4161991013145072,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.5974680166535271,-0.3459111058026742,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.5974680166535271,0.8245084626647526,0.024146916773074168,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 diff --git a/Module5/Credit_Labels.csv b/Module5/Credit_Labels.csv new file mode 100644 index 0000000..d3c17a1 --- /dev/null +++ b/Module5/Credit_Labels.csv @@ -0,0 +1,1001 @@ +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +1 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +0 +1 +1 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +1 +0 +1 +0 +1 +1 +1 +0 +1 +1 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +0 +0 +0 +1 +1 +1 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 
+1 +0 +0 +1 +0 +0 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +1 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +1 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +1 +0 +1 +0 +1 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +0 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +1 +1 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +1 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +1 +1 +1 +0 +1 +0 +1 +0 +0 +1 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +1 +0 +1 +0 +0 +1 +1 +0 +0 +0 +1 +1 +1 +1 +1 +1 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +1 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +1 +1 +0 +0 +0 +0 +1 +1 +0 +1 +0 +0 +1 +0 +1 +1 +1 +0 +1 +1 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +0 +1 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +1 +1 +1 +1 +0 +1 +0 +1 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +1 +1 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +1 +1 +1 +0 +0 +1 +1 +0 +1 +1 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +1 +1 +0 +0 +1 +0 +1 +1 +0 +1 +0 +0 +0 +1 +0 +0 
+0 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +1 +0 +1 +1 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +1 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 diff --git a/Module5/CrossValidation.ipynb b/Module5/CrossValidation.ipynb new file mode 100644 index 0000000..5abc9c6 --- /dev/null +++ b/Module5/CrossValidation.ipynb @@ -0,0 +1,523 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction to Cross Validation and Model Selection\n", + "\n", + "In a previous lab you created a model with l2 or ridge regularization and l1 or lasso regularization. In both cases, an apparently optimum value of the regularization parameter was found. This process is an example of **model selection**. The goal of model selection is to find the best performing model for the problem at hand. Model selection is a very general term and can apply to at least the following common cases:\n", + "- Selection of optimal model **hyperparameters**. Hyperparameters are parameters which determine the characteristics of a model. Hyperparameters are distinct from the model parameters. For example, for the case of l2 regularized regression, the degree of regularization is determined by a hyperparameter, which is distinct from the regression coefficients or parameters. \n", + "- **Feature selection** is the process of determining which features should be used in a model. \n", + "- Comparing different model types is an obvious case of model selection. \n", + "\n", + "If you are thinking that the model selection process is closely related to model training, you are correct. Model selection is a component of model training. 
However, one must be careful, as applying a poor model selection method can lead to an over-fit model!\n", + "\n", + "## Overview of k-fold cross validation\n", + "\n", + "The questions remain: how good are the hyperparameter estimates previously obtained for the l2 and l1 regularization parameters, and are there better ways to estimate these parameters? The answer to both questions is to use **resampling methods**. Resampling methods repeat a calculation multiple times using randomly selected subsets of the complete dataset. In fact, resampling methods are generally the best approach to model selection problems. \n", + "\n", + "\n", + "**K-fold cross validation** is a widely used resampling method. In cross validation a dataset is divided into **k folds**. Each fold contains $\frac{1}{k}$ of the cases and is created by random sampling of the full data set without replacement. The computation is performed $k$ times. For each iteration, the model is trained on the $k-1$ folds and the $k^{th}$ fold is **held back** and used for testing the result. The results are then averaged (mean taken) over the $k$ folds. \n", + "\n", + "4-fold cross validation is illustrated in the figure below. To ensure the data are randomly sampled, the data are randomly shuffled at the start of the procedure. The random samples can then be efficiently sub-sampled as shown in the figure. The model is trained and tested four times. For each iteration the model is trained with three folds of the data and tested with the fold shown in the dark shading. \n", + "\n", + "\"Drawing\"\n", + "
**Resampling scheme for 4-fold cross validation**
\n", + "\n", + "## Introduction to nested cross validation\n", + "\n", + "Unfortunately, simple cross validation alone does not provide an unbiased approach to model selection. The problem is that evaluating model performance with simple cross validation uses the same data samples as the model selection process. This situation will lead to model over-fitting, wherein the model selection is learned from the evaluation data. The result is usually unrealistically optimistic model performance estimates.\n", + "\n", + "To obtain unbiased estimates of expected model performance while performing model selection, it is necessary to use **nested cross validation**. As the name implies, nested cross validation is performed through a pair of nested CV loops. The outer loop uses a set of folds to perform model evaluation. The inner loop performs model selection using another randomly sampled set of folds not used for evaluation by the outer loop. This algorithm allows model selection and evaluation to proceed with randomly sampled subsets of the full data set, thereby avoiding model selection bias. \n", + "\n", + "## Cross validation and computational efficiency\n", + "\n", + "As you may have surmised, cross validation can be computationally intensive. Processing each fold of a cross validation requires fitting and evaluating the model. It is desirable to compute a reasonable number of folds. Since the results are averaged over the folds, a small number of folds can lead to significant variability in the final result. However, with large data sets or complex models, the number of folds must be limited in order to complete the cross validation process in a reasonable amount of time. It is, therefore, necessary to trade off the accuracy of the cross validation result against the practical consideration of the required computational resources. \n", + "\n", + "As mentioned earlier, other resampling methods exist. 
For example, leave-one-out resampling has the same number of folds as data cases. Such methods provide nearly unbiased estimates of model performance. Unfortunately, as you might expect, such methods are computationally intensive and are only suitable for small datasets. In practice, k-fold cross validation is a reasonable way to explore the bias-variance trade-off with reasonable computational resources. " ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Load Features and Labels\n", + "\n", + "With the above theory in mind, you will now try an example. \n", + "\n", + "As a first step, execute the code in the cell below to load the packages required for this notebook. " ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import math\n", + "\n", + "%matplotlib inline" ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, load the preprocessed files containing the features and the labels. The preprocessing includes the following:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardization of numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. 
" ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "print(Features.shape)\n", + "print(Labels.shape)" ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Construct the logistic regression model\n", + "\n", + "To create a baseline for comparison, you will now create a logistic regression model without cross validation. This model uses a fixed set of hyperparameters. You will compare the performance of this model with the cross validation results computed subsequently. " ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, execute the code in the cell below to create training and testing splits of the dataset. " ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, it is time to compute the logistic regression model. The code in the cell below does the following:\n", + "1. Defines a logistic regression model object using the `LogisticRegression` class from the scikit-learn `linear_model` package.\n", + "2. Fits the logistic regression model using the numpy arrays of the features and the labels for the training data set.\n", + "\n", + "Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "logistic_mod = linear_model.LogisticRegression(C = 1.0, class_weight = {0:0.45, 1:0.55}) \n", + "logistic_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below computes and displays metrics and the ROC curve for the model using the test data subset. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + "\n", + "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## 
and threshold along with the AUC\n",
+    "    fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n",
+    "    auc = sklm.auc(fpr, tpr)\n",
+    "    \n",
+    "    ## Plot the result\n",
+    "    plt.title('Receiver Operating Characteristic')\n",
+    "    plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n",
+    "    plt.legend(loc = 'lower right')\n",
+    "    plt.plot([0, 1], [0, 1],'r--')\n",
+    "    plt.xlim([0, 1])\n",
+    "    plt.ylim([0, 1])\n",
+    "    plt.ylabel('True Positive Rate')\n",
+    "    plt.xlabel('False Positive Rate')\n",
+    "    plt.show() \n",
+    "\n",
+    "probabilities = logistic_mod.predict_proba(X_test)\n",
+    "print_metrics(y_test, probabilities, 0.3) \n",
+    "plot_auc(y_test, probabilities)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "These results look promising, with most of the metrics having reasonable values. The question is now, how will these performance estimates hold up to cross validation?"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Cross validate model\n",
+    "\n",
+    "To compute a better estimate of model performance, you can perform simple cross validation. The code in the cell below performs the following processing:\n",
+    "1. Creates a list of the metrics to be computed for each fold. \n",
+    "2. Defines a logistic regression model object.\n",
+    "3. Performs a 10-fold cross validation using the `cross_validate` function from the scikit-learn `model_selection` package.\n",
+    "\n",
+    "Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Labels = Labels.reshape(Labels.shape[0],)\n", + "scoring = ['precision_macro', 'recall_macro', 'roc_auc']\n", + "logistic_mod = linear_model.LogisticRegression(C = 1.0, class_weight = {0:0.45, 1:0.55}) \n", + "scores = ms.cross_validate(logistic_mod, Features, Labels, scoring=scoring,\n", + " cv=10, return_train_score=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below displays the performance metrics along with the mean and standard deviation, computed for each fold to the cross validation. The 'macro' versions of precision and recall are used. These macro versions average over the positive and negative cases. \n", + "\n", + "Execute this code, examine the result, and answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def print_format(f,x,y,z):\n", + " print('Fold %2d %4.3f %4.3f %4.3f' % (f, x, y, z))\n", + "\n", + "def print_cv(scores):\n", + " fold = [x + 1 for x in range(len(scores['test_precision_macro']))]\n", + " print(' Precision Recall AUC')\n", + " [print_format(f,x,y,z) for f,x,y,z in zip(fold, scores['test_precision_macro'], \n", + " scores['test_recall_macro'],\n", + " scores['test_roc_auc'])]\n", + " print('-' * 40)\n", + " print('Mean %4.3f %4.3f %4.3f' % \n", + " (np.mean(scores['test_precision_macro']), np.mean(scores['test_recall_macro']), np.mean(scores['test_roc_auc']))) \n", + " print('Std %4.3f %4.3f %4.3f' % \n", + " (np.std(scores['test_precision_macro']), np.std(scores['test_recall_macro']), np.std(scores['test_roc_auc'])))\n", + "\n", + "print_cv(scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that there is considerable variability in each of the performance metrics from fold to fold. 
Even so, the standard deviations are at least an order of magnitude smaller than the means. It is clear that **any one fold does not provide a representative value of the performance metrics**. The latter is a key point as to why cross validation is important when evaluating a machine learning model. \n",
+    "\n",
+    "Compare the performance metric values to the values obtained for the baseline model you created above. In general, the metrics obtained by cross validation are lower. However, the metrics obtained for the baseline model are within 1 standard deviation of the average metrics from cross validation. "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Optimize hyperparameters with nested cross validation\n",
+    "\n",
+    "Given the variability observed in cross validation, it should be clear that performing model selection from a single training and evaluation is an uncertain proposition at best. Fortunately, the nested cross validation approach provides a better way to perform model selection. However, there is no guarantee that a model selection process will, in fact, improve a model. In some cases, it may turn out that model selection has minimal impact. \n",
+    "\n",
+    "To start the nested cross validation process it is necessary to define the randomly sampled folds for the inner and outer loops. The code in the cell below uses the `KFold` function from the scikit-learn `model_selection` package to define fold selection objects. Notice that the `shuffle = True` argument is used in both cases. This argument specifies that a random shuffle is performed before the folds are created, ensuring that the sampling of the folds for the inside and outside loops is independent. Notice that by creating these independent fold objects there is no need to actually create nested loops for this process. \n",
+    "\n",
+    "Execute this code."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "nr.seed(123)\n",
+    "inside = ms.KFold(n_splits=10, shuffle = True)\n",
+    "nr.seed(321)\n",
+    "outside = ms.KFold(n_splits=10, shuffle = True)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "An important decision in model selection searches is the choice of performance metric used to find the best model. For classification problems scikit-learn uses accuracy as the default metric. However, as you have seen previously, accuracy is not necessarily the best metric, particularly when there is a class imbalance, as is the case here. There are a number of alternatives one could choose for such a situation. In this case AUC will be used. \n",
+    "\n",
+    "The code below uses the `inside` k-fold object to execute the inside loop of the nested cross validation. Specifically, the steps are:\n",
+    "1. A dictionary is defined with the grid of parameter values to search over. In this case there is only one parameter, `C`, with a list of values to try. In a more general case, the dictionary can contain values of multiple parameters, creating a multi-dimensional grid that the cross validation process will iterate over. In this case there are 5 hyperparameter values in the grid and 10-fold cross validation is being used. Thus, the model will be trained and evaluated 50 times. \n",
+    "2. The logistic regression model object is defined. \n",
+    "3. The cross validation search over the parameter grid is performed using the `GridSearchCV` function from the scikit-learn `model_selection` package. Notice that the cross validation folds are computed using the `inside` k-fold object.\n",
+    "\n",
+    "\n",
+    "****\n",
+    "**Note:** Somewhat confusingly, the scikit-learn `LogisticRegression` function uses a regularization parameter `C` which is the inverse of the usual l2 regularization parameter $\\lambda$. 
Thus, the smaller the value of `C`, the stronger the regularization. \n",
+    "****\n",
+    "\n",
+    "Execute this code."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {
+    "scrolled": true
+   },
+   "outputs": [],
+   "source": [
+    "nr.seed(3456)\n",
+    "## Define the dictionary for the grid search and the model object to search on\n",
+    "param_grid = {\"C\": [0.1, 1, 10, 100, 1000]}\n",
+    "## Define the logistic regression model\n",
+    "logistic_mod = linear_model.LogisticRegression(class_weight = {0:0.45, 1:0.55}) \n",
+    "\n",
+    "## Perform the grid search over the parameters\n",
+    "clf = ms.GridSearchCV(estimator = logistic_mod, param_grid = param_grid, \n",
+    "                      cv = inside, # Use the inside folds\n",
+    "                      scoring = 'roc_auc',\n",
+    "                      return_train_score = True)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The cross validated grid search object, `clf`, has been created. \n",
+    "\n",
+    "The code in the cell below fits the cross validated model using the `fit` method. The AUC for each hyperparameter value and fold is displayed as an array. Finally, the hyperparameter value for the model with the best average AUC is displayed. Execute this code and examine the results."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "## Fit the cross validated grid search over the data \n",
+    "clf.fit(Features, Labels)\n",
+    "keys = list(clf.cv_results_.keys())\n",
+    "for key in keys[6:16]:\n",
+    "    print(clf.cv_results_[key])\n",
+    "## And print the best parameter value\n",
+    "clf.best_estimator_.C"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The array of AUC metrics has dimensions 10 folds X 5 hyperparameter values. As you might expect by now, there is considerable variation in the AUC from fold to fold for each hyperparameter value, or column. \n",
+    "\n",
+    "Evidently, the optimal hyperparameter value is 0.1. 
\n",
+    "\n",
+    "To help understand this behavior a bit more, the code in the cell below does the following:\n",
+    "1. Computes and displays the mean and standard deviation of the AUC for each hyperparameter value.\n",
+    "2. Plots the AUC values for each fold vs. the hyperparameter values. The mean AUC for each hyperparameter value is shown with a red +. \n",
+    "\n",
+    "Execute this code and examine the results. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def plot_cv(clf, params_grid, param = 'C'):\n",
+    "    params = [x for x in params_grid[param]]\n",
+    "  \n",
+    "    keys = list(clf.cv_results_.keys())\n",
+    "    grid = np.array([clf.cv_results_[key] for key in keys[6:16]])\n",
+    "    means = np.mean(grid, axis = 0)\n",
+    "    stds = np.std(grid, axis = 0)\n",
+    "    print('Performance metrics by parameter')\n",
+    "    print('Parameter   Mean performance   STD performance')\n",
+    "    for x,y,z in zip(params, means, stds):\n",
+    "        print('%8.2f        %6.5f            %6.5f' % (x,y,z))\n",
+    "    \n",
+    "    params = [math.log10(x) for x in params]\n",
+    "    \n",
+    "    plt.scatter(params * grid.shape[0], grid.flatten())\n",
+    "    p = plt.scatter(params, means, color = 'red', marker = '+', s = 300)\n",
+    "    plt.plot(params, np.transpose(grid))\n",
+    "    plt.title('Performance metric vs. log parameter value\\n from cross validation')\n",
+    "    plt.xlabel('Log hyperparameter value')\n",
+    "    plt.ylabel('Performance metric')\n",
+    "    \n",
+    "plot_cv(clf, param_grid) "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "There are a number of points to notice here:\n",
+    "1. The mean AUC values for the hyperparameter settings are all within 1 standard deviation of each other. This result indicates that model performance is not sensitive to the choice of hyperparameter. \n",
+    "2. Graphically, you can see that there is noticeable variation in the AUC from fold to fold, regardless of hyperparameter. 
Keep in mind that **this variation is simply a result of random sampling of the data!**\n",
+    "\n",
+    "Finally, it is time to execute the outer loop of the nested cross validation to evaluate the performance of the 'best' model selected by the inner loop. In this case, 'best' is quite approximate since, as already noted, the differences in performance between the models are not significant. \n",
+    "\n",
+    "The code in the cell below executes the outer loop of the nested cross validation using the `cross_val_score` function from the scikit-learn `model_selection` package. The folds are determined by the `outside` k-fold object. The mean and standard deviation of the AUC are printed along with the value estimated for each fold. Execute this code and examine the result. \n",
+    "\n",
+    "Then, answer **Question 2** on the course page."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "nr.seed(498)\n",
+    "cv_estimate = ms.cross_val_score(clf, Features, Labels, \n",
+    "                                 cv = outside) # Use the outside folds\n",
+    "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n",
+    "\n",
+    "print('STD of the metric       = %4.3f' % np.std(cv_estimate))\n",
+    "print('Outcomes by cv fold')\n",
+    "for i, x in enumerate(cv_estimate):\n",
+    "    print('Fold %2d    %4.3f' % (i+1, x))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As expected, there is considerable variation in AUC across the folds. The mean AUC is a bit lower than estimated for the inner loop of the nested cross validation and for the baseline model. However, all of these values are within 1 standard deviation of each other, and thus these differences cannot be considered significant. "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now, you will build and test a model using the estimated optimal hyperparameters. Then, answer **Question 3** on the course page. 
"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "logistic_mod = linear_model.LogisticRegression(C = 0.1, class_weight = {0:0.45, 1:0.55}) \n",
+    "logistic_mod.fit(X_train, y_train)\n",
+    "probabilities = logistic_mod.predict_proba(X_test)\n",
+    "print_metrics(y_test, probabilities, 0.3) \n",
+    "plot_auc(y_test, probabilities)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Summary\n",
+    "\n",
+    "In this lab you have performed both simple cross validation and nested cross validation. Key points and observations are:\n",
+    "1. Model selection should be done using a resampling procedure such as nested cross validation. The nested sampling structure is required to prevent bias in model selection, wherein the selected model learns the best hyperparameters for the samples used, rather than hyperparameters that generalize well. \n",
+    "2. There is significant variation in model performance from fold to fold in cross validation. This variation arises from the sampling of the data alone and is not a property of any particular model.\n",
+    "3. Given the expected sampling variation in cross validation, there is generally considerable uncertainty as to which model is best when performing model selection. 
"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.6.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/Module5/CrossValidation.jpg b/Module5/CrossValidation.jpg
new file mode 100644
index 0000000..14e1c3b
Binary files /dev/null and b/Module5/CrossValidation.jpg differ
diff --git a/Module5/DimensionalityReduction.ipynb b/Module5/DimensionalityReduction.ipynb
new file mode 100644
index 0000000..2ae4589
--- /dev/null
+++ b/Module5/DimensionalityReduction.ipynb
@@ -0,0 +1,622 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Dimensionality reduction with principal components\n",
+    "\n",
+    "**Principal component analysis**, or **PCA**, is an alternative to regularization and straightforward feature elimination. PCA is particularly useful for problems with very large numbers of features compared to the number of training cases. For example, when faced with a problem with many thousands of features and perhaps a few thousand cases, PCA can be a good choice to **reduce the dimensionality** of the feature space. \n",
+    "\n",
+    "PCA is one of a family of transformation methods that reduce dimensionality. PCA is the focus here, since it is the most widely used of these methods. \n",
+    "\n",
+    "The basic idea of PCA is rather simple: find a linear transformation of the feature space which **projects the majority of the variance** onto a few orthogonal dimensions in the transformed space. The PCA transformation maps the data values to a new coordinate system defined by the principal components. 
Assuming the highest variance directions, or **components**, are the most informative, low variance components can be eliminated from the space with little loss of information. \n",
+    "\n",
+    "The projection along which the greatest variance occurs is called the **first principal component**. The next projection, orthogonal to the first, with the greatest variance is called the **second principal component**. Subsequent components are all mutually orthogonal, with decreasing variance along the projected direction. \n",
+    "\n",
+    "Widely used PCA algorithms compute the components sequentially, starting with the first principal component. This means that it is computationally efficient to compute the first several components from a very large number of features. Thus, PCA can make problems with very large numbers of features computationally tractable. \n",
+    "\n",
+    "****\n",
+    "**Note:** It may help your understanding to realize that the principal components are a scaled version of the **eigenvectors** of the covariance matrix of the features. The scale of each dimension is given by the **eigenvalues**. The normalized eigenvalues give the fraction of the variance explained by each component. \n",
+    "****"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## A simple example\n",
+    "\n",
+    "To cement the concepts of PCA you will now work through a simple example. This example is restricted to 2-d data so that the results are easy to visualize. \n",
+    "\n",
+    "As a first step, execute the code in the cell below to load the packages required for the rest of this notebook."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "import sklearn.decomposition as skde\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below simulates data from a bivariate Normal distribution. The distribution is deliberately centered on $\\{ 0,0 \\}$ and with unit variance on each dimension. There is considerable correlation between the two dimensions leading to a covariance matrix:\n", + "\n", + "$$cov(X) = \\begin{bmatrix}\n", + " 1.0 & 0.6 \\\\\n", + " 0.6 & 1.0\n", + " \\end{bmatrix}$$\n", + "\n", + "Given the covariance matrix 100 draws from this distribution are computed using the `multivariate_normal` function from the Numpy `random` package. Execute this code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(124)\n", + "cov = np.array([[1.0, 0.6], [0.6, 1.0]])\n", + "mean = np.array([0.0, 0.0])\n", + "\n", + "sample = nr.multivariate_normal(mean, cov, 100)\n", + "sample.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for this data, execute the code in the cell below to display a plot and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "plt.scatter(sample[:,0], sample[:,1])\n", + "plt.xlabel('Dimension 1')\n", + "plt.ylabel('Dimension 2')\n", + "plt.title('Sample data')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that the data have a roughly elliptical pattern. 
The correlation between the two dimensions is also visible. \n",
+    "\n",
+    "With the simulated data set created, it is time to compute the PCA model. The code in the cell below does the following:\n",
+    "1. Define a PCA model object using the `PCA` function from the scikit-learn `decomposition` package.\n",
+    "2. Fit the PCA model to the sample data.\n",
+    "3. Display the ratio of the **variance explained** by each of the components, where, for a matrix X, the ratio for the ith component is given by:\n",
+    "\n",
+    "$$VE_i(X) = \\frac{Var_i(X)}{Var_{total}(X)}$$\n",
+    "\n",
+    "Notice that by construction:\n",
+    "\n",
+    "$$\\sum_{i=1}^N VE_i(X) = 1.0$$\n",
+    "\n",
+    "In other words, the variance explained by the individual components must sum to the total variance, or 1.0 for standardized data. \n",
+    "\n",
+    "Execute this code and examine the result."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "pca_model = skde.PCA()\n",
+    "pca_fit = pca_model.fit(sample)\n",
+    "print(pca_fit.explained_variance_ratio_)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Notice that the explained variance of the first component is many times larger than that of the second component. This is exactly the desired result, indicating that the first principal component explains the majority of the variance of the sample data. \n",
+    "\n",
+    "The code in the cell below computes and prints the scaled components. Mathematically, the scaled components are the eigenvectors scaled by the eigenvalues. 
Execute this code: "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "comps = pca_fit.components_\n",
+    "for i in range(2):\n",
+    "    comps[:,i] = comps[:,i] * pca_fit.explained_variance_ratio_\n",
+    "print(comps)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Notice that the two vectors have their origins at $\\{ 0,0 \\}$, are of quite different magnitudes, and point in different directions. \n",
+    "\n",
+    "To better understand how the projections of the components relate to the data, execute the code in the cell below to plot the data along with the principal components: "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "plt.scatter(sample[:,0], sample[:,1])\n",
+    "plt.plot([0.0, comps[0,0]], [0.0,comps[0,1]], color = 'red', linewidth = 5)\n",
+    "plt.plot([0.0, comps[1,0]], [0.0,comps[1,1]], color = 'red', linewidth = 5)\n",
+    "\n",
+    "plt.xlabel('Dimension 1')\n",
+    "plt.ylabel('Dimension 2')\n",
+    "plt.title('Sample data')"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Notice that the first principal component (the long red line) is along the direction of greatest variance of the data. This is as expected. The short red line is along the direction of the second principal component. The length of each line represents the variance along the direction of its projection. \n",
+    "\n",
+    "The ultimate goal of PCA is to transform the data to a coordinate system with the highest variance directions along the axes. The code in the cell below uses the `transform` method on the PCA object to perform this operation and then plots the result. 
Execute this code: "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "trans = pca_fit.transform(sample)\n",
+    "plt.scatter(trans[:,0], trans[:,1])\n",
+    "plt.xlabel('Dimension 1')\n",
+    "plt.ylabel('Dimension 2')\n",
+    "plt.title('Sample data')"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Notice that the scales along these two coordinates are quite different. The first principal component is along the horizontal axis, with values in the range of about $\\{ -2.5, 2.5 \\}$. The range of values on the vertical axis, or second principal component, is only about $\\{ -0.2, 0.3 \\}$. It is clear that most of the variance is along the direction of the first principal component. "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Load Features and Labels\n",
+    "\n",
+    "Keeping the foregoing simple example in mind, it is time to apply PCA to some real data. \n",
+    "\n",
+    "The code in the cell below loads the dataset, which has had the following preprocessing:\n",
+    "1. Cleaning missing values.\n",
+    "2. Aggregating categories of certain categorical variables. \n",
+    "3. Encoding categorical variables as binary dummy variables.\n",
+    "4. Standardizing numeric variables. \n",
+    "\n",
+    "Execute the code in the cell below to load the features and labels as numpy arrays for the example: "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "Features = np.array(pd.read_csv('Credit_Features.csv'))\n",
+    "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n",
+    "print(Features.shape)\n",
+    "print(Labels.shape)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "There are 35 features in this data set. The numeric features have been Z-score scaled so they are zero centered (mean removed) and unit variance (divided by the standard deviation). 
\n",
+    "\n",
+    "****\n",
+    "**Note:** Before performing PCA all features must be zero mean and unit variance. Failure to do so will result in biased computation of the components and scales. In this case, the data set has already been scaled, but ordinarily scaling is a key step. \n",
+    "****\n",
+    "\n",
+    "Now, run the code in the cell below to split the data set into test and training subsets:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "## Randomly sample cases to create independent training and test data\n",
+    "nr.seed(1115)\n",
+    "indx = range(Features.shape[0])\n",
+    "indx = ms.train_test_split(indx, test_size = 300)\n",
+    "X_train = Features[indx[0],:]\n",
+    "y_train = np.ravel(Labels[indx[0]])\n",
+    "X_test = Features[indx[1],:]\n",
+    "y_test = np.ravel(Labels[indx[1]])"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Compute principal components\n",
+    "\n",
+    "The code in the cell below computes the principal components for the training feature subset. Execute this code:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "pca_mod = skde.PCA()\n",
+    "pca_comps = pca_mod.fit(X_train)\n",
+    "pca_comps"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Execute the code in the cell below to print the variance explained for each component and the sum of the variance explained:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "print(pca_comps.explained_variance_ratio_)\n",
+    "print(np.sum(pca_comps.explained_variance_ratio_))"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "These numbers are a bit abstract. However, you can see that the variance ratios are in descending order and that the sum is 1.0. 
\n",
+    "\n",
+    "Execute the code in the cell below to create a plot of the explained variance vs. the component: "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "def plot_explained(mod):\n",
+    "    comps = mod.explained_variance_ratio_\n",
+    "    x = range(len(comps))\n",
+    "    x = [y + 1 for y in x] \n",
+    "    plt.plot(x,comps)\n",
+    "\n",
+    "plot_explained(pca_comps)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This curve is often referred to as a **scree plot**. Notice that the explained variance decreases rapidly up to the 5th component and only slowly thereafter. The first few components explain a large fraction of the variance and therefore contain much of the explanatory information in the data. The components with small explained variance are unlikely to contain much explanatory information. Often the inflection point, or 'knee', in the scree curve is used to choose the number of components selected. "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now it is time to create a PCA model with a reduced number of components. The code in the cell below trains and fits a PCA model with 5 components, and then transforms the features using that model. Execute this code. "
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "pca_mod_5 = skde.PCA(n_components = 5)\n",
+    "pca_mod_5.fit(X_train)\n",
+    "Comps = pca_mod_5.transform(X_train)\n",
+    "Comps.shape"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Compute and evaluate a logistic regression model\n",
+    "\n",
+    "Next, you will compute and evaluate a logistic regression model using the features transformed by the first 5 principal components. Execute the code in the cell below to define and fit a logistic regression model, and print the model coefficients. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Define and fit the logistic regression model\n", + "log_mod_5 = linear_model.LogisticRegression(C = 10.0, class_weight = {0:0.45, 1:0.55}) \n", + "log_mod_5.fit(Comps, y_train)\n", + "print(log_mod_5.intercept_)\n", + "print(log_mod_5.coef_)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that there are now 5 regression coefficients, one for each component. This number is in contrast to the 35 features in the raw data. \n", + "\n", + "Next, evaluate this model using the code below. Notice that the test features are transformed using the same PCA transformation used for the training data. Execute this code and examine the results.\n", + "\n", + "Then, answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % 
metrics[3][0] + ' %6d' % metrics[3][1])\n",
+    "    print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n",
+    "    print('Recall    %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n",
+    "    print('F1        %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n",
+    "\n",
+    "def plot_auc(labels, probs):\n",
+    "    ## Compute the false positive rate, true positive rate\n",
+    "    ## and threshold along with the AUC\n",
+    "    fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n",
+    "    auc = sklm.auc(fpr, tpr)\n",
+    "    \n",
+    "    ## Plot the result\n",
+    "    plt.title('Receiver Operating Characteristic')\n",
+    "    plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n",
+    "    plt.legend(loc = 'lower right')\n",
+    "    plt.plot([0, 1], [0, 1],'r--')\n",
+    "    plt.xlim([0, 1])\n",
+    "    plt.ylim([0, 1])\n",
+    "    plt.ylabel('True Positive Rate')\n",
+    "    plt.xlabel('False Positive Rate')\n",
+    "    plt.show() \n",
+    "\n",
+    "probabilities = log_mod_5.predict_proba(pca_mod_5.transform(X_test))\n",
+    "print_metrics(y_test, probabilities, 0.3) \n",
+    "plot_auc(y_test, probabilities) "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "For the most part, these results look good. The question remains, was the correct number of principal components used? "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Add more components to the model\n",
+    "\n",
+    "Now you will compute and evaluate a logistic regression model using the first 10 principal components. You will compare this model to the one created with 5 principal components. Execute the code below to transform the training features using the first 10 principal components. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pca_mod_10 = skde.PCA(n_components = 10)\n", + "pca_mod_10.fit(X_train)\n", + "Comps_10 = pca_mod_10.transform(X_train)\n", + "Comps_10.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to define and fit a logistic regression model using the 10 components of the transformed features. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## define and fit the linear regression model\n", + "log_mod_10 = linear_model.LogisticRegression(C = 10.0, class_weight = {0:0.45, 1:0.55}) \n", + "log_mod_10.fit(Comps_10, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below scores the logistic regression model and displays performance metrics, the ROC curve, and the AUC. Execute this code and examine the result. \n", + "\n", + "Then, answer **Question 2** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "probabilities = log_mod_10.predict_proba(pca_mod_10.transform(X_test))\n", + "print_metrics(y_test, probabilities, 0.3) \n", + "plot_auc(y_test, probabilities) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "All of the metrics have improved compared to the 5 component model. Apparently there is useful information in the first 10 components. \n", + "\n", + "But, is this difference really significant. To find out, you will now perform cross validation on the result. Ideally, the fitting of the PCA model should be part of the cross validation process. However, at the risk of a small bias, this step is omitted for the sake of simplicity. Execute the code in the cell below to perform the cross validation and display the result. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def print_format(f,x,y,z):\n", + " print('Fold %2d %4.3f %4.3f %4.3f' % (f, x, y, z))\n", + "\n", + "def print_cv(scores):\n", + " fold = [x + 1 for x in range(len(scores['test_precision_macro']))]\n", + " print(' Precision Recall AUC')\n", + " [print_format(f,x,y,z) for f,x,y,z in zip(fold, scores['test_precision_macro'], \n", + " scores['test_recall_macro'],\n", + " scores['test_roc_auc'])]\n", + " print('-' * 40)\n", + " print('Mean %4.3f %4.3f %4.3f' % \n", + " (np.mean(scores['test_precision_macro']), np.mean(scores['test_recall_macro']), np.mean(scores['test_roc_auc']))) \n", + " print('Std %4.3f %4.3f %4.3f' % \n", + " (np.std(scores['test_precision_macro']), np.std(scores['test_recall_macro']), np.std(scores['test_roc_auc'])))\n", + " \n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "scoring = ['precision_macro', 'recall_macro', 'roc_auc']" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pca_mod = skde.PCA(n_components = 5)\n", + "pca_mod.fit(Features)\n", + "Comps = pca_mod.transform(Features)\n", + "\n", + "scores = ms.cross_validate(log_mod_5, Comps, Labels, scoring=scoring,\n", + " cv=10, return_train_score=False)\n", + "print_cv(scores) " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "pca_mod = skde.PCA(n_components = 10)\n", + "pca_mod.fit(Features)\n", + "Comps = pca_mod.transform(Features)\n", + "\n", + "scores = ms.cross_validate(log_mod_10, Comps, Labels, scoring=scoring,\n", + " cv=10, return_train_score=False)\n", + "print_cv(scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare the AUC and its standard deviation obtained above to the AUC of the 5 component model. The difference does appear to be significant. 
This difference supports the hypothesis that the first 10 components all contain useful information. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have applied principal component analysis to dimensionality reduction for supervised machine learning. The first components computed contain most of the available information. When faced with a large number of features, PCA is an effective way to make supervised machine learning models tractable. \n", + "\n", + "Specifically in this lab you have:\n", + "1. Computed PCA models with different numbers of components.\n", + "2. Compared logistic regression models with different numbers of components. In this case, using 10 components produced a significantly better model. Using 10 components is a useful reduction in dimensionality compared to the original 35 features. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/FeatureSelection.ipynb b/Module5/FeatureSelection.ipynb new file mode 100644 index 0000000..4ba0033 --- /dev/null +++ b/Module5/FeatureSelection.ipynb @@ -0,0 +1,518 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Feature Selection\n", + "\n", + "**Feature selection** can be an important part of model selection. 
In supervised learning, including features in a model which do not provide information on the label is useless at best and may prevent generalization at worst.\n", + "\n", + "Feature selection can involve application of several methods. Two important methods include:\n", + "1. Eliminating features with **low variance** and **zero variance**. Zero variance features consist of a single repeated value. Low variance features arise from features with most values the same and with few unique values. One way low variance features can arise is from dummy variables for categories with very few members. The dummy variable will be mostly 0s with very few 1s. \n", + "2. Training machine learning models with features that are **uninformative** can create a variety of problems. An uninformative feature does not significantly improve model performance. In many cases, the noise in the uninformative features will increase the variance of the model predictions. In other words, uninformative features are likely to reduce the ability of the machine learning model to generalize. \n", + "\n", + "****\n", + "**Note:** the second case of feature selection involves applying a selection statistic or hypothesis test multiple times. For a large number of features, this process is very likely to lead to false positive and false negative results. This likely outcome is known as the **multiple comparisons problem** in statistics.\n", + "\n", + "To understand this problem, consider the decision to keep a feature in a model as a hypothesis test. Any hypothesis test has some probability of both a false positive result and a false negative result. Consider a case where there are 40 uninformative features which are excluded from the model with 95% confidence. There will be an approximately 5% chance of accepting a feature which should be rejected. In this case we would expect about 2 uninformative features to be accepted because of these errors. 
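The arithmetic behind this note can be checked directly: the expected number of false positives is the number of tests times the per-test error rate, and the chance of at least one false positive grows quickly with the number of tests.

```python
# Probability calculations for the multiple comparisons example above.
n_tests = 40    # number of uninformative features tested
alpha = 0.05    # per-test false positive rate (95% confidence)

expected_false_positives = n_tests * alpha       # 40 * 0.05 = 2.0
p_at_least_one = 1.0 - (1.0 - alpha) ** n_tests  # approximately 0.87

print(expected_false_positives)
print(round(p_at_least_one, 3))
```

So even though each individual test is fairly reliable, some spurious features will almost certainly survive the screening.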
\n", + "\n", + "You may well ask, if testing features for importance can fail with large numbers of features, what is the alternative? The most general and scalable alternative is to use regularization methods. Consider applying regularization methods to a linear model. In this case, machine learning algorithm learns which features should be weighted highly and which should not. \n", + "****" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Load the dataset\n", + "\n", + "You will now apply the aforementioned principles to the bank credit data set. \n", + "\n", + "As a first step, run the code in the cell below to load the required packages. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "from sklearn import preprocessing\n", + "import sklearn.model_selection as ms\n", + "from sklearn import linear_model\n", + "import sklearn.metrics as sklm\n", + "from sklearn import feature_selection as fs\n", + "from sklearn import metrics, cross_validation\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "import scipy.stats as ss\n", + "import math\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, load the preprocessed files containing the features and the labels. The preprocessing includes the following:\n", + "1. Clean missing values.\n", + "2. Aggregate categories of certain categorical variables. \n", + "3. Encode categorical variables as binary dummy variables.\n", + "4. Standardize numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Eliminate low variance features\n", + "\n", + "As a fist step in selecting features from this dataset you will remove features with low variance. The `VarianceThreshold` function from the scikit-learn `feature_selection` package identifies features with less than some threshold of unique values. For a probability that a feature is unique $p$ the threshold is specified as;\n", + "\n", + "$$Var(x) = p(1-p)$$\n", + "\n", + "In this case a 80%, or $p=0.8$, threshold is used. \n", + "\n", + "The `fit_transform` method applies the threshold to the variance of each feature and removes features with variance below the threshold. The `get_support_` attribute shows the `True` and `False` logical for inclusion of each feature. \n", + "\n", + "Execute the code, examine the result, and answer **Question 1** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(Features.shape)\n", + "\n", + "## Define the variance threhold and fit the threshold to the feature array. \n", + "sel = fs.VarianceThreshold(threshold=(.8 * (1 - .8)))\n", + "Features_reduced = sel.fit_transform(Features)\n", + "\n", + "## Print the support and shape for the transformed features\n", + "print(sel.get_support())\n", + "print(Features_reduced.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The number of features has been reduced from 35 to 18. Apparently, there are 17 low variance features in the original array. 
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Select k best features\n", + "\n", + "The low variance features have been eliminated. But, the question remains, are all these features informative? There are a number of methods used to determine the importance of features. Many machine learning models have specialized methods to determine feature importance specifically intended for those methods. \n", + "\n", + "In this example, you will use a fairly general and robust method using cross validation. The algorithm is straight forward. Features are recursively removed. Cross validation is used to find the change in model performance, if any, to determine if a feature should be deleted altogether. \n", + "\n", + "The code in the cell below performs the following processing:\n", + "1. Create the folds for the cross validation for feature selection. These folds should be independent of any other cross validation performed. \n", + "2. The logistic regression model is defined. \n", + "3. The `RFECV` function from the scikit-learn `feature_selection` package is used to determine which features to retain using a cross validation method. Notice that AUC is used as the model selection metric as the labels are imbalanced. In this case, the default, accuracy is a poor choice. \n", + "4. The RFECV feature selector is fit to the data. \n", + "\n", + "Execute this code and examine the results." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Reshape the Label array\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "\n", + "## Set folds for nested cross validation\n", + "nr.seed(988)\n", + "feature_folds = ms.KFold(n_splits=10, shuffle = True)\n", + "\n", + "## Define the model\n", + "logistic_mod = linear_model.LogisticRegression(C = 10, class_weight = {0:0.45, 1:0.55}) \n", + "\n", + "## Perform feature selection by CV with high variance features only\n", + "nr.seed(6677)\n", + "selector = fs.RFECV(estimator = logistic_mod, cv = feature_folds,\n", + " scoring = 'roc_auc')\n", + "selector = selector.fit(Features_reduced, Labels)\n", + "selector.support_ " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "From the support you can see that some features are selected (True) and eliminated (False). \n", + "\n", + "Execute the code below to see the relative ranking of the features." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "selector.ranking_" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the features which have been selected are shown with a rank of 1. The features eliminated are shown with higher numbers. \n", + "\n", + "The code in the cell below uses the `transform` method to apply the selector to the feature array. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features_reduced = selector.transform(Features_reduced)\n", + "Features_reduced.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The features have been reduced from the 18 high variance features to 16. Two features have been found to be unimportant. \n", + "\n", + "The code in the cell below creates a plot of AUC (the metric) vs. the number of features. Execute this code. 
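The complete recursive feature elimination pattern — fit `RFECV`, inspect `support_` and `ranking_`, then `transform` — can be sketched end to end on synthetic data (illustrative only; the generated data stands in for the credit features):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression

# Synthetic data: 8 features of which only 3 carry information about the label.
X, y = make_classification(n_samples=300, n_features=8, n_informative=3,
                           n_redundant=0, random_state=42)

selector = RFECV(estimator=LogisticRegression(solver='liblinear'),
                 cv=5, scoring='roc_auc')
selector = selector.fit(X, y)

print(selector.support_)    # True for retained features
print(selector.ranking_)    # rank 1 = retained; higher = eliminated earlier
X_sel = selector.transform(X)
print(X_sel.shape)          # (300, number of retained features)
```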
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plt.plot(range(1, len(selector.grid_scores_) + 1), selector.grid_scores_)\n", + "plt.title('Mean AUC by number of features')\n", + "plt.ylabel('AUC')\n", + "plt.xlabel('Number of features')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the change in AUC is not that great across a range of features around the 16 selected. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply nested cross validation to create model\n", + "\n", + "The next step is to use nested cross validation to optimize the model hyperparameter and test the model performance. The model is constructed using the features selected. \n", + "\n", + "As a first step, construct the inside and outside folds for the nested cross validation by running the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(123)\n", + "inside = ms.KFold(n_splits=10, shuffle = True)\n", + "nr.seed(321)\n", + "outside = ms.KFold(n_splits=10, shuffle = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below performs the grid search for the optimal model hyperparameter. As before, the scoring metric is AUC. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(3456)\n", + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {\"C\": [0.1, 1, 10, 100, 1000]}\n", + "## Define the logistic regression model\n", + "logistic_mod = linear_model.LogisticRegression(class_weight = {0:0.45, 1:0.55}) \n", + "\n", + "## Perform the grid search over the parameters\n", + "clf = ms.GridSearchCV(estimator = logistic_mod, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'roc_auc',\n", + " return_train_score = True)\n", + "\n", + "## Fit the cross validated grid search over the data \n", + "clf.fit(Features_reduced, Labels)\n", + "\n", + "## And print the best parameter value\n", + "clf.best_estimator_.C" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The optimal value of the hyperparameter is 1. This parameter is larger than for the same model with all the features. Recalling that the parameter is the inverse of regularization strength, the smaller parameter means the model with fewer features requires less regularization. \n", + "\n", + "To get a feel for the results of the cross validation execute the code in the cell below and observe the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_cv(clf, params_grid, param = 'C'):\n", + " params = [x for x in params_grid[param]]\n", + " \n", + " keys = list(clf.cv_results_.keys()) \n", + " grid = np.array([clf.cv_results_[key] for key in keys[6:16]])\n", + " means = np.mean(grid, axis = 0)\n", + " stds = np.std(grid, axis = 0)\n", + " print('Performance metrics by parameter')\n", + " print('Parameter Mean performance STD performance')\n", + " for x,y,z in zip(params, means, stds):\n", + " print('%8.2f %6.5f %6.5f' % (x,y,z))\n", + " \n", + " params = [math.log10(x) for x in params]\n", + " \n", + " plt.scatter(params * grid.shape[0], grid.flatten())\n", + " p = plt.scatter(params, means, color = 'red', marker = '+', s = 300)\n", + " plt.plot(params, np.transpose(grid))\n", + " plt.title('Performance metric vs. log parameter value\\n from cross validation')\n", + " plt.xlabel('Log hyperparameter value')\n", + " plt.ylabel('Performance metric')\n", + " \n", + "plot_cv(clf, param_grid) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the mean AUCs are within 1 standard deviation of each other. The AUC for the hyperparameter value of 10 is not significantly better than the other values tested. \n", + "\n", + "Now you will perform the outer loop of the nested cross validation by executing the code in the cell below. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(clf, Features, Labels, \n", + " cv = outside) # Use the outside folds\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "\n", + "print('SDT of the metric = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The performance metric is not significantly different than for the inner loop of the cross validation. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Test the model\n", + "\n", + "With the features selected and the optimal hyperparameters estimated, it is time to test the model. the code in the cell below does the following processing;\n", + "1. Split the reduced feature subset of the data into training and test subsets.\n", + "2. Define and fit a model using the optimal hyperparameter. \n", + "\n", + "Execute this code." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features_reduced.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features_reduced[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features_reduced[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])\n", + "\n", + "## Define and fit the logistic regression model\n", + "logistic_mod = linear_model.LogisticRegression(C = 1, class_weight = {0:0.45, 1:0.55}) \n", + "logistic_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to score the model and display a sample of the resulting probabilities. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "probabilities = logistic_mod.predict_proba(X_test)\n", + "print(probabilities[:15,:])\n", + "scores = score_model(probabilities, 0.3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the model scored, execute the code in the cell below to display performance metrics for the model. Then, answer **Question 2** on the course page." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + "\n", + "def plot_auc(labels, probs):\n", + " ## Compute the false positive rate, true positive rate\n", + " ## and threshold along with the AUC\n", + " fpr, tpr, threshold = sklm.roc_curve(labels, probs[:,1])\n", + " auc = sklm.auc(fpr, tpr)\n", + " \n", + " ## Plot the result\n", + " plt.title('Receiver Operating Characteristic')\n", + " plt.plot(fpr, tpr, color = 'orange', label = 'AUC = %0.2f' % auc)\n", + " plt.legend(loc = 'lower right')\n", + " plt.plot([0, 1], [0, 1],'r--')\n", + " plt.xlim([0, 1])\n", + " plt.ylim([0, 1])\n", + " plt.ylabel('True Positive Rate')\n", + " plt.xlabel('False Positive Rate')\n", + " plt.show() \n", + " \n", + "print_metrics(y_test, probabilities, 0.3) \n", + "plot_auc(y_test, 
probabilities) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "At first glance, these performance metrics look quite good. Notice however, that the AUC is much larger than achieved with cross validation. This indicates that these results are overly optimistic, a common situation when a single split is used to evaluate a model. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have performed two types of feature selection:\n", + "1. Eliminating low variance features, which by their nature cannot be highly informative since they contain a high fraction of the same value.\n", + "2. Using recursive feature elimination, a cross validation technique for identifying uninformative features. \n", + "\n", + "With a reduced feature set less regularization was required for the model. This is expected since the most uninformative features have already been eliminated. It should be noted that for large numbers of features, these types of feature elimination algorithms should not be expected to give good generalization performance as a result of the multiple comparisons problem. In these cases, stronger regularization is a better approach. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module5/L1.jpg b/Module5/L1.jpg new file mode 100644 index 0000000..adf09d3 Binary files /dev/null and b/Module5/L1.jpg differ diff --git a/Module5/L2.jpg b/Module5/L2.jpg new file mode 100644 index 0000000..f8c7c63 Binary files /dev/null and b/Module5/L2.jpg differ diff --git a/Module5/img/CrossValidation.jpg b/Module5/img/CrossValidation.jpg new file mode 100644 index 0000000..14e1c3b Binary files /dev/null and b/Module5/img/CrossValidation.jpg differ diff --git a/Module5/img/L1.jpg b/Module5/img/L1.jpg new file mode 100644 index 0000000..adf09d3 Binary files /dev/null and b/Module5/img/L1.jpg differ diff --git a/Module5/img/L2.jpg b/Module5/img/L2.jpg new file mode 100644 index 0000000..f8c7c63 Binary files /dev/null and b/Module5/img/L2.jpg differ diff --git a/Module5/img/ROC_AUC.JPG b/Module5/img/ROC_AUC.JPG new file mode 100644 index 0000000..82f47fc Binary files /dev/null and b/Module5/img/ROC_AUC.JPG differ diff --git a/Module6/Bagging.ipynb b/Module6/Bagging.ipynb new file mode 100644 index 0000000..9d59ba1 --- /dev/null +++ b/Module6/Bagging.ipynb @@ -0,0 +1,660 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Bagging and Random Forest Models\n", + "\n", + "Using **ensemble methods** can greatly improve the results achieved with weak machine learning algorithms, also called **weak learners**. 
Ensemble methods achieve better performance by aggregating the results of many statistically independent models. This process averages out the errors and produces a better final prediction. \n", + "\n", + "In this lab you will work with a widely used ensemble method known as **bootstrap aggregating** or simply **bagging**. Bagging follows a simple procedure:\n", + "1. N learners (machine learning models) are defined. \n", + "2. N subsamples of the training data are created by **bootstrap sampling**, that is, random sampling with replacement.\n", + "3. The N learners are trained on the subsamples of the training data.\n", + "4. The ensemble is scored by averaging, or taking a majority vote, of the predictions from the N learners.\n", + "\n", + "**Classification and regression tree models** are most typically used with bagging methods. The most common such algorithm is known as the **random forest**. The random forest method is highly scalable and generally produces good results, even for complex problems. \n", + "\n", + "Classification and regression trees tend to be robust to noise or outliers in the training data. This is true for the random forest algorithm as well. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example: Iris dataset\n", + "\n", + "As a first example you will use random forest to classify the species of iris flowers. \n", + "\n", + "As a first step, execute the code in the cell below to load the required packages to run the rest of this notebook. 
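The four-step bagging procedure described above can be sketched with a deliberately trivial weak learner (a one-dimensional threshold rule). Everything here is a toy illustration, not the random forest implementation used in this lab:

```python
import numpy as np

rng = np.random.RandomState(0)

# Toy 1-D training data: class 1 tends to have larger feature values.
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
y = np.concatenate([np.zeros(100), np.ones(100)]).astype(int)

def fit_stump(x, y):
    '''Train a weak learner: split at the midpoint of the two class means.'''
    cut = 0.5 * (x[y == 0].mean() + x[y == 1].mean())
    return lambda new_x: (new_x > cut).astype(int)

# Steps 1-3: define N learners and train each on a bootstrap sample
# (random sampling with replacement).
N = 25
learners = []
for _ in range(N):
    idx = rng.choice(len(x), size=len(x), replace=True)
    learners.append(fit_stump(x[idx], y[idx]))

# Step 4: score new cases by majority vote of the N learners.
new_x = np.array([-1.0, 3.0])
votes = np.mean([learner(new_x) for learner in learners], axis=0)
predictions = (votes > 0.5).astype(int)
print(predictions)   # [0 1]
```

A random forest follows the same recipe, but with decision trees as the learners and an additional random subsampling of features at each split.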
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "from sklearn.ensemble import RandomForestClassifier\n", + "from sklearn import preprocessing\n", + "#from statsmodels.api import datasets\n", + "from sklearn import datasets ## Get dataset from sklearn\n", + "import sklearn.model_selection as ms\n", + "import sklearn.metrics as sklm\n", + "import matplotlib.pyplot as plt\n", + "import pandas as pd\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for these data, you will now load and plot them. The code in the cell below does the following:\n", + "\n", + "1. Loads the iris data as a Pandas data frame. \n", + "2. Adds column names to the data frame.\n", + "3. Displays all 4 possible scatter plot views of the data. \n", + "\n", + "Execute this code and examine the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_iris(iris):\n", + " '''Function to plot iris data by type'''\n", + " setosa = iris[iris['Species'] == 'setosa']\n", + " versicolor = iris[iris['Species'] == 'versicolor']\n", + " virginica = iris[iris['Species'] == 'virginica']\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = 'x')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = 'o')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = '+')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + " \n", + "## Import the dataset from sklearn.datasets\n", + "iris = datasets.load_iris()\n", + "\n", + "## Create a data frame from the dictionary\n", + "species = [iris.target_names[x] for x in iris.target]\n", + "iris = pd.DataFrame(iris['data'], columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "iris['Species'] = species\n", + "\n", + "## Plot views of the iris data \n", + "plot_iris(iris) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that Setosa (in blue) is well separated from the other two categories. The Versicolor (in orange) and the Virginica (in green) show considerable overlap. The question is how well our classifier will separate these categories. \n", + "\n", + "Scikit Learn classifiers require numerically coded numpy arrays for the features and as a label. The code in the cell below does the following processing:\n", + "1. Creates a numpy array of the features.\n", + "2. Numerically codes the label using a dictionary lookup, and converts it to a numpy array. \n", + "\n", + "Execute this code." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(iris[['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']])\n", + "\n", + "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n", + "Labels = np.array([levels[x] for x in iris['Species']])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to split the dataset into training and test sets. Notice that unusually, 100 of the 150 cases are being used as the test dataset. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 100)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As is always the case with machine learning, numeric features must be scaled. The code in the cell below performs the following processing:\n", + "\n", + "1. A Z-score scale object is defined using the `StandardScaler` function from the Scikit Learn preprocessing package. \n", + "2. The scaler is fit to the training features. Subsequently, this scaler is used to apply the same scaling to the test data and in production. \n", + "3. The training features are scaled using the `transform` method. \n", + "\n", + "Execute this code."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "scale = preprocessing.StandardScaler()\n", + "scale.fit(X_train)\n", + "X_train = scale.transform(X_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now you will define and fit a random forest model. The code in the cell below defines a random forest model with 5 trees using the `RandomForestClassifier` function from the Scikit Learn ensemble package, and then fits the model. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(444)\n", + "rf_clf = RandomForestClassifier(n_estimators=5)\n", + "rf_clf.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the many hyperparameters of the random forest model object are displayed. \n", + "\n", + "Next, the code in the cell below performs the following processing to score the test data subset:\n", + "1. The test features are scaled using the scaler computed for the training features. \n", + "2. The `predict` method is used to compute the scores from the scaled features. \n", + "\n", + "Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "X_test = scale.transform(X_test)\n", + "scores = rf_clf.predict(X_test)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is time to evaluate the model results. Keep in mind that the problem has been made deliberately difficult, by having more test cases than training cases. \n", + "\n", + "The iris data has three species categories. Therefore it is necessary to use evaluation code for a three category problem. The function in the cell below extends code from previous labs to deal with a three category problem. Execute this code and examine the results." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "def print_metrics_3(labels, scores):\n", + " \n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score Setosa Score Versicolor Score Virginica')\n", + " print('Actual Setosa %6d' % conf[0,0] + ' %5d' % conf[0,1] + ' %5d' % conf[0,2])\n", + " print('Actual Versicolor %6d' % conf[1,0] + ' %5d' % conf[1,1] + ' %5d' % conf[1,2])\n", + " print('Actual Virginica %6d' % conf[2,0] + ' %5d' % conf[2,1] + ' %5d' % conf[2,2])\n", + " ## Now compute and display the accuracy and metrics\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " print(' ')\n", + " print(' Setosa Versicolor Virginica')\n", + " print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1] + ' %0.2f' % metrics[3][2])\n", + " print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1] + ' %0.2f' % metrics[0][2])\n", + " print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1] + ' %0.2f' % metrics[1][2])\n", + " print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1] + ' %0.2f' % metrics[2][2])\n", + " \n", + "print_metrics_3(y_test, scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice the following:\n", + "1. The confusion matrix has dimension 3x3. You can see that most cases are correctly classified with only a few errors. \n", + "2. The overall accuracy is 0.94. Since the classes are roughly balanced, this metric indicates relatively good performance of the classifier, particularly since it was only trained on 50 cases. \n", + "3. 
The precision, recall and F1 for each of the classes are quite good.\n", + "\n", + "To get a better feel for what the classifier is doing, the code in the cell below displays a set of plots showing correctly (as '+') and incorrectly (as 'o') classified cases, with the species color-coded. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_iris_score(iris, y_test, scores):\n", + " '''Function to plot iris data by type'''\n", + " ## Find correctly and incorrectly classified cases\n", + " true = np.equal(scores, y_test).astype(int)\n", + " \n", + " ## Create data frame from the test data\n", + " iris = pd.DataFrame(iris)\n", + " levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n", + " iris['Species'] = [levels[x] for x in y_test]\n", + " iris.columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Species']\n", + " \n", + " ## Set up for the plot\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " markers = ['o', '+']\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " \n", + " for t in range(2): # loop over correct and incorrect classifications\n", + " setosa = iris[(iris['Species'] == 'setosa') & (true == t)]\n", + " versicolor = iris[(iris['Species'] == 'versicolor') & (true == t)]\n", + " virginica = iris[(iris['Species'] == 'virginica') & (true == t)]\n", + " # loop over all the dimensions\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = markers[t], color = 'blue')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = markers[t], color = 'orange')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = markers[t], color = 'green')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + "\n", + "plot_iris_score(X_test, y_test, 
scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these plots. You can see how the classifier has divided the feature space between the classes. Notice that most of the errors occur in the overlap region between Virginica and Versicolor. This behavior is to be expected. \n", + "\n", + "Is it possible that a random forest model with more trees would separate these cases better? The code in the cell below uses a model with 40 trees (estimators). This model is fit with the training data and displays the evaluation of the model. \n", + "\n", + "Execute this code and answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(444)\n", + "rf_clf = RandomForestClassifier(n_estimators=40)\n", + "rf_clf.fit(X_train, y_train)\n", + "scores = rf_clf.predict(X_test)\n", + "print_metrics_3(y_test, scores) \n", + "plot_iris_score(X_test, y_test, scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results are slightly better than for the model with 5 trees. However, this small difference in performance is unlikely to be significant. \n", + "\n", + "Like most tree-based models, random forest models have a nice property that **feature importance** is computed during model training. Feature importance can be used as a feature selection method. \n", + "\n", + "Execute the code in the cell below to display a plot of the feature importance." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "importance = rf_clf.feature_importances_\n", + "plt.bar(range(4), importance, tick_label = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "plt.xticks(rotation=90)\n", + "plt.ylabel('Feature importance')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine the plot displayed above. 
Notice that the Sepal_Length and Sepal_Width have rather low importance. \n", + "\n", + "Should these features be dropped from the model? To find out, you will create a model with a reduced feature set and compare the results. As a first step, execute the code in the cell below to create training and test datasets using the reduced features." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Create reduced feature set\n", + "Features = np.array(iris[['Petal_Length', 'Petal_Width']])\n", + "\n", + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 100)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to define the model, fit the model, score the model and print the results. \n", + "\n", + "Once you have executed the code, answer **Question 2** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(444)\n", + "rf_clf = RandomForestClassifier(n_estimators=40)\n", + "rf_clf.fit(X_train, y_train)\n", + "scores = rf_clf.predict(X_test)\n", + "print_metrics_3(y_test, scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results are nearly the same as those obtained with the full feature set. In all likelihood, there is no significant difference. Given that a simpler model is more likely to generalize, this model is preferred. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Another example\n", + "\n", + "Now, you will try a more complex example using the credit scoring data. 
You will use the prepared data which had the following preprocessing:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizing numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Nested cross validation is used to estimate the optimal hyperparameters and perform model selection for the random forest model. Since random forest models are efficient to train, 10 fold cross validation is used. Execute the code in the cell below to define inside and outside fold objects. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(123)\n", + "inside = ms.KFold(n_splits=10, shuffle = True)\n", + "nr.seed(321)\n", + "outside = ms.KFold(n_splits=10, shuffle = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below estimates the best hyperparameters using 10 fold cross validation. There are a few points to note here:\n", + "1. In this case, a grid of two hyperparameters is searched: \n", + " - max_features determines the maximum number of features used to determine the splits. Minimizing the number of features can prevent model over-fitting by inducing bias. \n", + " - min_samples_leaf determines the minimum number of samples or leaves which must be on each terminal node of the tree. 
Maintaining the minimum number of samples per terminal node is a regularization method. Having too few samples on terminal leaves allows the model training to memorize the data, leading to high variance. Forcing too many samples on the terminal nodes leads to biased predictions. \n", + "2. Since there is a class imbalance and a difference in the cost to the bank of misclassification of a bad credit risk customer, the \"balanced\" argument is used. The balanced argument ensures that the subsamples used to train each tree have balanced cases. \n", + "3. The model is fit on each set of hyperparameters from the grid. \n", + "4. The best estimated hyperparameters are printed. \n", + "\n", + "Notice that the model uses regularization rather than feature selection. The hyperparameter search is intended to optimize the level of regularization. \n", + "\n", + "Execute this code, examine the result, and answer **Question 3** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {\"max_features\": [2, 3, 5, 10, 15], \"min_samples_leaf\":[3, 5, 10, 20]}\n", + "## Define the random forest model\n", + "nr.seed(3456)\n", + "rf_clf = RandomForestClassifier(class_weight = \"balanced\") # class_weight = {0:0.33, 1:0.67}) \n", + "\n", + "## Perform the grid search over the parameters\n", + "nr.seed(4455)\n", + "rf_clf = ms.GridSearchCV(estimator = rf_clf, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'roc_auc',\n", + " return_train_score = True)\n", + "rf_clf.fit(Features, Labels)\n", + "print(rf_clf.best_estimator_.max_features)\n", + "print(rf_clf.best_estimator_.min_samples_leaf)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you will run the code in the cell below to perform the outer cross validation of the model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(rf_clf, Features, Labels, \n", + " cv = outside) # Use the outside folds\n", + "\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "print('STD of the metric = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice that the standard deviation of the AUC across folds is more than an order of magnitude smaller than its mean. This indicates that this model is likely to generalize well. \n", + "\n", + "Now, you will build and test a model using the estimated optimal hyperparameters. As a first step, execute the code in the cell below to create training and test datasets." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below defines a random forest model object using the estimated optimal model hyperparameters and then fits the model to the training data. Execute this code." 
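The comparison between the mean of the fold scores and their standard deviation can be sketched on its own. The AUC values below are invented for illustration, not the output of the credit model:

```python
import numpy as np

# Hypothetical AUC values from 10 cross validation folds (illustration only)
cv_estimate = np.array([0.78, 0.81, 0.79, 0.80, 0.77,
                        0.82, 0.80, 0.79, 0.81, 0.78])

mean_auc = np.mean(cv_estimate)
std_auc = np.std(cv_estimate)
print('Mean performance metric = %4.3f' % mean_auc)
print('STD of the metric       = %4.3f' % std_auc)

# A standard deviation much smaller than the mean indicates the
# score is stable across folds, suggesting the model generalizes well
print('STD is %0.1f%% of the mean' % (100.0 * std_auc / mean_auc))
```

For these made-up values the standard deviation (0.015) is far smaller than the mean (0.795), which is the pattern the text above describes as evidence of good generalization.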
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "rf_mod = RandomForestClassifier(class_weight = \"balanced\", \n", + " max_features = rf_clf.best_estimator_.max_features, \n", + " min_samples_leaf = rf_clf.best_estimator_.min_samples_leaf) \n", + "rf_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the hyperparameters of the random forest model object reflect those specified. \n", + "\n", + "The code in the cell below scores and prints evaluation metrics for the model, using the test data subset. \n", + "\n", + "Execute this code, examine the results, and answer **Question 4** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " 
print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + " \n", + "probabilities = rf_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Overall, these performance metrics look quite good. A large majority of negative (bad credit) cases are identified at the expense of significant false positives. The reported AUC is within a standard deviation of the figure obtained with cross validation, indicating that the model is generalizing well. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have accomplished the following:\n", + "1. Used a random forest model to classify the cases of the iris data. A model with more trees had marginally lower error rates, but the difference is likely not significant.\n", + "2. Applied feature importance for feature selection with the iris data. The model created and evaluated with the reduced feature set had essentially the same performance as the model with more features. \n", + "3. Used 10 fold nested cross validation to estimate optimal hyperparameters for a random forest model to classify credit risk cases. The model appears to generalize well. 
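The thresholding step inside `score_model` above can be illustrated on its own with a small, made-up probability array standing in for the output of `predict_proba`:

```python
import numpy as np

def score_model(probs, threshold):
    '''Assign class 1 when the class-1 probability exceeds the threshold'''
    return np.array([1 if x > threshold else 0 for x in probs[:, 1]])

# Hypothetical predict_proba output: columns are P(class 0), P(class 1)
probs = np.array([[0.90, 0.10],
                  [0.40, 0.60],
                  [0.55, 0.45],
                  [0.20, 0.80]])

print(score_model(probs, 0.5))   # -> [0 1 0 1]
# A lower threshold labels more cases positive, trading precision for recall
print(score_model(probs, 0.4))   # -> [0 1 1 1]
```

Moving the threshold is one way to account for asymmetric misclassification costs, such as the higher cost of missing a bad credit risk.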
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module6/Boosting.ipynb b/Module6/Boosting.ipynb new file mode 100644 index 0000000..07af084 --- /dev/null +++ b/Module6/Boosting.ipynb @@ -0,0 +1,812 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Boosting and the AdaBoost Method\n", + "\n", + "Using **ensemble methods** can greatly improve the results achieved with weak machine learning algorithms, also called **weak learners**. Ensemble methods achieve better performance by aggregating the results of many statistically independent models. This process averages out the errors and produces a better final prediction. \n", + "\n", + "In this lab you will work with a widely used ensemble method known as **boosting**. Boosting is a meta-algorithm since the method can be applied to many types of machine learning algorithms. In summary, boosting iteratively improves the learning of the N models by giving greater weight to training cases with larger errors. The basic boosting procedure is simple and follows these steps:\n", + "1. N learners (machine learning models) are defined.\n", + "2. Each of the i training data cases is given an initial equal weight of 1/i.\n", + "3. The N learners are trained on the weighted training data cases.\n", + "4. The prediction is computed based on an aggregation of the learners; averaging over the hypotheses of the N learners. \n", + "5. 
Weights for the training data cases are updated based on the aggregated errors made by the learners. Cases with larger errors are given larger weights. \n", + "6. Steps 3, 4, and 5 are repeated until a convergence criterion is met.\n", + "\n", + "**Classification and regression tree models** are the weak learners most commonly used with boosting. In this lab you will work with one of the most widely used and successful boosted methods, known as **AdaBoost** or **adaptive boosting**. AdaBoost uses some large number, N, of tree models. The rate at which weights are updated is **adaptive**, varying with the size of the errors. \n", + "\n", + "It is important to keep in mind that boosted machine learning is not robust to significant noise or outliers in the training data. The reweighting process gives greater weight to the large errors, and therefore can give undue weight to outliers and errors. In cases where data is noisy, the random forest algorithm may prove to be more robust. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example: Iris dataset\n", + "\n", + "As a first example you will use AdaBoost to classify the species of iris flowers. \n", + "\n", + "As a first step, execute the code in the cell below to load the required packages to run the rest of this notebook. 
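The reweighting idea in steps 3 to 5 can be sketched schematically with numpy. This is an illustration of the principle (one round of a two-class AdaBoost-style update with made-up predictions), not the exact weight-update formula scikit-learn uses:

```python
import numpy as np

# Hypothetical labels and one weak learner's predictions for 5 training cases
y_true = np.array([1, 1, 0, 0, 1])
y_pred = np.array([1, 0, 0, 1, 1])   # cases at index 1 and 3 are misclassified

# Step 2: equal initial weights of 1/i
weights = np.full(len(y_true), 1.0 / len(y_true))

# Weighted error of the weak learner and its vote weight (alpha)
miss = (y_pred != y_true).astype(float)
err = np.sum(weights * miss)
alpha = 0.5 * np.log((1.0 - err) / err)

# Step 5: increase the weights of misclassified cases, then renormalize
weights = weights * np.exp(alpha * miss)
weights = weights / weights.sum()
print(weights)  # misclassified cases now carry more weight
```

After the update, the misclassified cases have larger weights than the correctly classified ones, so the next learner concentrates on them. This also makes visible why outliers can dominate: a case that is misclassified round after round keeps gaining weight.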
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "from sklearn.ensemble import AdaBoostClassifier\n", + "from sklearn import preprocessing\n", + "#from statsmodels.api import datasets\n", + "from sklearn import datasets ## Get dataset from sklearn\n", + "import sklearn.model_selection as ms\n", + "import sklearn.metrics as sklm\n", + "import matplotlib.pyplot as plt\n", + "import pandas as pd\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for these data, you will now load and plot them. The code in the cell below does the following:\n", + "\n", + "1. Loads the iris data as a Pandas data frame. \n", + "2. Adds column names to the data frame.\n", + "3. Displays all 4 possible scatter plot views of the data. \n", + "\n", + "Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_iris(iris):\n", + " '''Function to plot iris data by type'''\n", + " setosa = iris[iris['Species'] == 'setosa']\n", + " versicolor = iris[iris['Species'] == 'versicolor']\n", + " virginica = iris[iris['Species'] == 'virginica']\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = 'x')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = 'o')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = '+')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + " \n", + "## Import the dataset from sklearn.datasets\n", + "iris = datasets.load_iris()\n", + "\n", + "## Create a data frame from the 
dictionary\n", + "species = [iris.target_names[x] for x in iris.target]\n", + "iris = pd.DataFrame(iris['data'], columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "iris['Species'] = species\n", + "\n", + "## Plot views of the iris data \n", + "plot_iris(iris) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that Setosa (in blue) is well separated from the other two categories. The Versicolor (in orange) and the Virginica (in green) show considerable overlap. The question is how well our classifier will separate these categories. \n", + "\n", + "Scikit Learn classifiers require numerically coded numpy arrays for the features and as a label. The code in the cell below does the following processing:\n", + "1. Creates a numpy array of the features.\n", + "2. Numerically codes the label using a dictionary lookup, and converts it to a numpy array. \n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(iris[['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']])\n", + "\n", + "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n", + "Labels = np.array([levels[x] for x in iris['Species']])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to split the dataset into test and training set. Notice that unusually, 100 of the 150 cases are being used as the test dataset. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 100)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As is always the case with machine learning, numeric features must be scaled. The code in the cell below performs the following processing:\n", + "\n", + "1. A Z-score scale object is defined using the `StandardScaler` function from the Scikit Learn preprocessing package. \n", + "2. The scaler is fit to the training features. Subsequently, this scaler is used to apply the same scaling to the test data and in production. \n", + "3. The training features are scaled using the `transform` method. \n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "scale = preprocessing.StandardScaler()\n", + "scale.fit(X_train)\n", + "X_train = scale.transform(X_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now you will define and fit an AdaBoosted tree model. The code in the cell below defines the model using the `AdaBoostClassifier` function from the Scikit Learn ensemble package with its default hyperparameters, and then fits the model. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "ab_clf = AdaBoostClassifier()\n", + "ab_clf.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the many hyperparameters of the AdaBoosted tree model object are displayed. 
\n", + "\n", + "Next, the code in the cell below performs the following processing to score the test data subset:\n", + "1. The test features are scaled using the scaler computed for the training features. \n", + "2. The `predict` method is used to compute the scores from the scaled features. \n", + "\n", + "Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "X_test = scale.transform(X_test)\n", + "scores = ab_clf.predict(X_test)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is time to evaluate the model results. Keep in mind that the problem has been made deliberately difficult, by having more test cases than training cases. \n", + "\n", + "The iris data has three species categories. Therefore it is necessary to use evaluation code for a three category problem. The function in the cell below extends code from previous labs to deal with a three category problem. Execute this code and examine the results." 
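As a small illustration of what the confusion matrix counts in the three category case, each entry [i, j] is the number of cases whose actual class is i and whose predicted class is j. The labels here are made up, not the iris test results:

```python
import numpy as np

# Hypothetical actual and predicted class codes
# (0 = setosa, 1 = versicolor, 2 = virginica)
labels = np.array([0, 0, 1, 1, 1, 2, 2, 2])
scores = np.array([0, 0, 1, 2, 1, 2, 1, 2])

# Build the 3x3 confusion matrix: rows = actual class, columns = predicted class
conf = np.zeros((3, 3), dtype=int)
for actual, predicted in zip(labels, scores):
    conf[actual, predicted] += 1
print(conf)

# Accuracy is the trace (correctly classified cases) over the total case count
print('Accuracy %0.2f' % (np.trace(conf) / conf.sum()))
```

The off-diagonal entries show which pairs of classes are confused; in the iris problem these concentrate in the versicolor/virginica cells, matching the overlap seen in the scatter plots.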
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def print_metrics_3(labels, scores):\n", + " \n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score Setosa Score Versicolor Score Virginica')\n", + " print('Actual Setosa %6d' % conf[0,0] + ' %5d' % conf[0,1] + ' %5d' % conf[0,2])\n", + " print('Actual Versicolor %6d' % conf[1,0] + ' %5d' % conf[1,1] + ' %5d' % conf[1,2])\n", + " print('Actual Virginica %6d' % conf[2,0] + ' %5d' % conf[2,1] + ' %5d' % conf[2,2])\n", + " ## Now compute and display the accuracy and metrics\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " print(' ')\n", + " print(' Setosa Versicolor Virginica')\n", + " print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1] + ' %0.2f' % metrics[3][2])\n", + " print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1] + ' %0.2f' % metrics[0][2])\n", + " print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1] + ' %0.2f' % metrics[1][2])\n", + " print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1] + ' %0.2f' % metrics[2][2])\n", + " \n", + "print_metrics_3(y_test, scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice the following:\n", + "1. The confusion matrix has dimension 3x3. You can see that most cases are correctly classified with only a few errors. \n", + "2. The overall accuracy is 0.95. Since the classes are roughly balanced, this metric indicates relatively good performance of the classifier, particularly since it was only trained on 50 cases. \n", + "3. 
The precision, recall and F1 for each of the classes are quite good.\n", + "\n", + "To get a better feel for what the classifier is doing, the code in the cell below displays a set of plots showing correctly (as '+') and incorrectly (as 'o') classified cases, with the species color-coded. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_iris_score(iris, y_test, scores):\n", + " '''Function to plot iris data by type'''\n", + " ## Find correctly and incorrectly classified cases\n", + " true = np.equal(scores, y_test).astype(int)\n", + " \n", + " ## Create data frame from the test data\n", + " iris = pd.DataFrame(iris)\n", + " levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n", + " iris['Species'] = [levels[x] for x in y_test]\n", + " iris.columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Species']\n", + " \n", + " ## Set up for the plot\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " markers = ['o', '+']\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " \n", + " for t in range(2): # loop over correct and incorrect classifications\n", + " setosa = iris[(iris['Species'] == 'setosa') & (true == t)]\n", + " versicolor = iris[(iris['Species'] == 'versicolor') & (true == t)]\n", + " virginica = iris[(iris['Species'] == 'virginica') & (true == t)]\n", + " # loop over all the dimensions\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = markers[t], color = 'blue')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = markers[t], color = 'orange')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = markers[t], color = 'green')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + "\n", + "plot_iris_score(X_test, y_test, 
scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these plots. You can see how the classifier has divided the feature space between the classes. Notice that most of the errors occur in the overlap region between Virginica and Versicolor. This behavior is to be expected. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Like most tree-based models, AdaBoosted tree models have the nice property that **feature importance** is computed during model training. Feature importance can be used as a feature selection method. \n", + "\n", + "Execute the code in the cell below to display a plot of the feature importance." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "importance = ab_clf.feature_importances_\n", + "plt.bar(range(4), importance, tick_label = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "plt.xticks(rotation=90)\n", + "plt.ylabel('Feature importance')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine the plot displayed above. Notice that Sepal_Length has virtually no importance. \n", + "\n", + "Should this feature be dropped from the model? To find out, you will create a model with a reduced feature set and compare the results. As a first step, execute the code in the cell below to create training and test datasets using the reduced features."
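+ , + "\n", + "The selection step can be sketched in general form. This is a minimal illustration, not the lab code; the importance values and the 0.05 cutoff are made up for the sketch:\n", + "\n", + "```python\n", + "import numpy as np\n", + "# hypothetical importances, in the column order used above\n", + "importance = np.array([0.0, 0.21, 0.46, 0.33])\n", + "names = np.array(['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "keep = importance > 0.05 # drop features with negligible importance\n", + "print(names[keep])\n", + "```\n"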
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Create reduced feature set\n", + "Features = np.array(iris[['Sepal_Width', 'Petal_Length', 'Petal_Width']])\n", + "\n", + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 100)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to define the model, fit the model, score the model and print the results. \n", + "\n", + "Once you have executed the code, answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "ab_clf = AdaBoostClassifier()\n", + "ab_clf.fit(X_train, y_train)\n", + "scores = ab_clf.predict(X_test)\n", + "print_metrics_3(y_test, scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results are essentially identical to those obtained with the full feature set; in all likelihood, there is no significant difference. Given that a simpler model is more likely to generalize, this model is preferred. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Another example\n", + "\n", + "Now, you will try a more complex example using the credit scoring data. You will use the prepared data, which has had the following preprocessing applied:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizing numeric variables. 
\n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Nested cross validation is used to estimate the optimal hyperparameters and perform model selection for the AdaBoosted tree model. Since AdaBoosted tree models are efficient to train, 10-fold cross validation is used. Execute the code in the cell below to define inside and outside fold objects. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(123)\n", + "inside = ms.KFold(n_splits=10, shuffle = True)\n", + "nr.seed(321)\n", + "outside = ms.KFold(n_splits=10, shuffle = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below estimates the best hyperparameters using 10-fold cross validation. There are a few points to notice here:\n", + "1. In this case, a grid of one hyperparameter is searched: \n", + " - `learning_rate` shrinks the contribution of each classifier; there is a trade-off between `learning_rate` and `n_estimators`.\n", + "2. There is a class imbalance and a difference in the cost to the bank of misclassification of a bad credit risk customer. This will be addressed later. \n", + "3. The model is fit on each set of hyperparameters from the grid. \n", + "4. The best estimated hyperparameters are printed. \n", + "\n", + "Notice that the model uses regularization rather than feature selection. The hyperparameter search is intended to optimize the level of regularization. 
\n", + "\n", + "Execute this code, examine the result, and answer **Question 2** on the course page. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {\"learning_rate\": [0.1, 1, 10]}\n", + "## Define the AdaBoosted tree model\n", + "nr.seed(3456)\n", + "ab_clf = AdaBoostClassifier() \n", + "\n", + "## Perform the grid search over the parameters\n", + "nr.seed(4455)\n", + "ab_clf = ms.GridSearchCV(estimator = ab_clf, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'roc_auc',\n", + " return_train_score = True)\n", + "ab_clf.fit(Features, Labels)\n", + "print(ab_clf.best_estimator_.learning_rate)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you will run the code in the cell below to perform the outer cross validation of the model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(ab_clf, Features, Labels, \n", + " cv = outside) # Use the outside folds\n", + "\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "print('STD of the metric       = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d    %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice that the standard deviation of the AUC is more than an order of magnitude smaller than the mean. This indicates that this model is likely to generalize well. \n", + "\n", + "Now, you will build and test a model using the estimated optimal hyperparameters. As a first step, execute the code in the cell below to create training and testing datasets."
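+ , + "\n", + "The rule of thumb used above, comparing the standard deviation of the fold scores to their mean, can be sketched with made-up fold values (an illustration only, not the lab's actual results):\n", + "\n", + "```python\n", + "import numpy as np\n", + "# hypothetical AUC values from 10 outer folds\n", + "cv_estimate = np.array([0.74, 0.78, 0.71, 0.76, 0.75, 0.77, 0.73, 0.75, 0.76, 0.74])\n", + "print('Mean = %4.3f' % np.mean(cv_estimate))\n", + "print('STD  = %4.3f' % np.std(cv_estimate))\n", + "# an STD much smaller than the mean suggests the score is stable across folds\n", + "```\n"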
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below defines an AdaBoosted tree model object using the estimated optimal model hyperparameters and then fits the model to the training data. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "ab_mod = AdaBoostClassifier(learning_rate = ab_clf.best_estimator_.learning_rate) \n", + "ab_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the hyperparameters of the AdaBoosted tree model object reflect those specified. \n", + "\n", + "The code in the cell below scores and prints evaluation metrics for the model, using the test data subset. Execute this code and examine the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print('                 Confusion matrix')\n", + " print('                 Score positive    Score negative')\n", + " print('Actual positive    %6d' % conf[0,0] + '             %5d' % conf[0,1])\n", + " print('Actual negative    %6d' % conf[1,0] + '             %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy        %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC             %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall    %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print('           Positive      Negative')\n", + " print('Num case   %6d' % metrics[3][0] + '        %6d' % metrics[3][1])\n", + " print('Precision  %6.2f' % metrics[0][0] + '        %6.2f' % metrics[0][1])\n", + " print('Recall     %6.2f' % metrics[1][0] + '        %6.2f' % metrics[1][1])\n", + " print('F1         %6.2f' % metrics[2][0] + '        %6.2f' % metrics[2][1])\n", + "    \n", + "probabilities = ab_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5)    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Overall, these performance metrics are poor. The majority of negative cases have been misclassified as positive. AdaBoosted methods are sensitive to class imbalance. \n", + "\n", + "It is likely that the poor performance arises from the class imbalance. Notice that there is no simple way to reweight classes with boosting methods. Some alternatives are:\n", + "1. **Impute** (synthesize) new minority cases using a statistical algorithm. \n", + "2. 
**Undersample** the majority cases. For this method, a number of cases equal to the number of minority cases are randomly sampled from the majority cases. \n", + "3. **Oversample** the minority cases. For this method, the minority cases are resampled with replacement until they equal the number of majority cases.\n", + "\n", + "The code in the cell below undersamples the majority cases (good credit customers). The `choice` function from the numpy.random package is used to randomize the undersampling. The counts of unique label values and the shapes of the resulting arrays are printed. Execute this code to create a data set with balanced cases. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "temp_Labels_1 = Labels[Labels == 1] # Save these\n", + "temp_Features_1 = Features[Labels == 1,:] # Save these\n", + "temp_Labels_0 = Labels[Labels == 0] # Undersample these\n", + "temp_Features_0 = Features[Labels == 0,:] # Undersample these\n", + "\n", + "## Sample the majority cases without replacement\n", + "indx = nr.choice(temp_Features_0.shape[0], temp_Features_1.shape[0], replace=False)\n", + "\n", + "temp_Features = np.concatenate((temp_Features_1, temp_Features_0[indx,:]), axis = 0)\n", + "temp_Labels = np.concatenate((temp_Labels_1, temp_Labels_0[indx,]), axis = 0) \n", + "\n", + "print(np.bincount(temp_Labels))\n", + "print(temp_Features.shape)\n", + "print(temp_Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There are now 300 of each label case with 600 cases overall. The question is, will using these data produce better results?\n", + "\n", + "You will perform model selection and evaluation using nested cross validation. The code in the cell below finds the optimal learning rate parameter using cross validation. \n", + "\n", + "Once you have executed the code, answer **Question 3** on the course page."
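+ , + "\n", + "For comparison, the oversampling alternative listed above can be sketched in the same style. This is a toy illustration with made-up labels, not part of the lab's processing:\n", + "\n", + "```python\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "labels = np.array([0]*7 + [1]*3) # imbalanced toy labels\n", + "minority = np.where(labels == 1)[0]\n", + "majority = np.where(labels == 0)[0]\n", + "# resample the minority indices with replacement to match the majority count\n", + "indx = nr.choice(minority, majority.shape[0], replace=True)\n", + "balanced = np.concatenate((majority, indx))\n", + "print(np.bincount(labels[balanced])) # both classes now have 7 cases\n", + "```\n"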
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1234)\n", + "inside = ms.KFold(n_splits=10, shuffle = True)\n", + "nr.seed(3214)\n", + "outside = ms.KFold(n_splits=10, shuffle = True)\n", + "\n", + "## Define the AdaBoosted tree model\n", + "nr.seed(3456)\n", + "ab_clf = AdaBoostClassifier() \n", + "\n", + "## Perform the grid search over the parameters\n", + "nr.seed(4455)\n", + "ab_clf = ms.GridSearchCV(estimator = ab_clf, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'roc_auc',\n", + " return_train_score = True)\n", + "ab_clf.fit(temp_Features, temp_Labels)\n", + "print(ab_clf.best_estimator_.learning_rate)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the estimated optimal learning rate parameter is smaller than before.\n", + "\n", + "Now, run the code in the cell below to execute the outer loop of the cross validation. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(ab_clf, temp_Features, temp_Labels, \n", + " cv = outside) # Use the outside folds and the balanced cases\n", + "\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "print('STD of the metric       = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d    %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The average AUC is improved compared to the imbalanced training cases. However, the differences are just within 1 standard deviation. Still, there is a reasonable chance the new results represent an improvement. \n", + "\n", + "Finally, you will train and evaluate a model with the balanced cases and the updated hyperparameter. The code in the cell below does the following processing:\n", + "1. 
Creates randomly sampled training and test subsets. \n", + "2. Defines an AdaBoosted model.\n", + "3. Trains the AdaBoosted model.\n", + "\n", + "Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])\n", + "\n", + "## Undersample the majority case for the training data\n", + "temp_Labels_1 = y_train[y_train == 1] # Save these\n", + "temp_Features_1 = X_train[y_train == 1,:] # Save these\n", + "temp_Labels_0 = y_train[y_train == 0] # Undersample these\n", + "temp_Features_0 = X_train[y_train == 0,:] # Undersample these\n", + "\n", + "## Sample the majority cases without replacement\n", + "indx = nr.choice(temp_Features_0.shape[0], temp_Features_1.shape[0], replace=False)\n", + "\n", + "X_train = np.concatenate((temp_Features_1, temp_Features_0[indx,:]), axis = 0)\n", + "y_train = np.concatenate((temp_Labels_1, temp_Labels_0[indx,]), axis = 0) \n", + "\n", + "print(np.bincount(y_train))\n", + "print(X_train.shape)\n", + "print(y_train.shape)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Define and fit the model\n", + "nr.seed(1115)\n", + "ab_mod = AdaBoostClassifier(learning_rate = ab_clf.best_estimator_.learning_rate) \n", + "ab_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, execute the code in the cell below to score and evaluate the model.\n", + "\n", + "Once you have executed the code, answer **Question 4** on the course page."
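+ , + "\n", + "Because the bank's misclassification costs are asymmetric, the 0.5 threshold passed to `print_metrics` is not the only reasonable choice. Here is a sketch of how shifting the threshold changes the scores, using made-up probabilities rather than model output:\n", + "\n", + "```python\n", + "import numpy as np\n", + "probs = np.array([[0.8, 0.2], [0.4, 0.6], [0.55, 0.45]]) # toy predict_proba output\n", + "def score_model(probs, threshold):\n", + "    return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "print(score_model(probs, 0.5)) # [0 1 0]\n", + "print(score_model(probs, 0.4)) # [0 1 1], a lower threshold flags more positives\n", + "```\n"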
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "probabilities = ab_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5)    " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results are significantly better at classifying the negative cases than those obtained with the imbalanced training data. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have accomplished the following:\n", + "1. Used an AdaBoosted tree model to classify the cases of the iris data. This model produced quite good results.\n", + "2. Applied feature importance for feature selection with the iris data. The model created and evaluated with the reduced feature set had essentially the same performance as the model with more features. \n", + "3. Used 10-fold nested cross validation to estimate optimal hyperparameters for an AdaBoosted tree model to classify credit risk cases. The model trained on the imbalanced data did not classify the negative cases well.\n", + "4. Applied undersampling of the majority cases to create a balanced training dataset and retrained and evaluated the model. The model created with balanced training data was significantly better. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module6/Credit_Features.csv b/Module6/Credit_Features.csv new file mode 100644 index 0000000..329e4db --- /dev/null +++ b/Module6/Credit_Features.csv @@ -0,0 +1,1001 @@ +0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34 +-1.864869059639166,-0.9339010037591353,0.9184771676276685,2.2710059235929747,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.1630458092056901,-0.8701833340815202,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.18155879674684203,-0.8701833340815202,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.4789131410521794,1.5251480607350405,-0.8701833340815202,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.9047427423164252,0.024146916773074168,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+1.2140261421940395,1.703911049605318,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.20758784255072407,0.024146916773074168,1.4886197467137152,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.3626301604781133,-0.8701833340815202,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.30557459935806763,-0.8701833340815202,1.95785628118201,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.9976214799989517,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8020057724539522,0.024146916773074168,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.7467445133529922,0.024146916773074168,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.5563470132891891,-1.7645135849361147,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.9012510127304115,0.9184771676276685,1.902684530253145,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,-0.6987928175688173,-0.8701833340815202,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.8150060579794622,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.005776555397474003,0.9184771676276685,1.4886197467137152,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,1.5558400345857029,-0.8701833340815202,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,2.1274617402310874,0.9184771676276685,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.4530740122559791,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.15840750725674918,0.9184771676276685,1.157872651879297,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,0.1191762052780822,-0.8701833340815202,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.09536802942532471,-1.7645135849361147,1.157872651879297,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-0.37486777091334544,0.024146916773074168,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-0.1982648982260506,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7257055738935877,-1.7645135849361147,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-2.2346136944792545,0.9184771676276685,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-2.287087520950489,0.024146916773074168,0.712169575197047,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.5999820607810264,0.000983552835622907,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.0918105708507295,1.3416903575955137,0.024146916773074168,2.065537068931412,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.29927526453011033,0.024146916773074168,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.6575896042051788,-0.8701833340815202,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,1.1445088090554238,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8332257744712419,0.9184771676276685,1.7314770061469296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.6351825900567682,0.9184771676276685,-0.09278473996200068,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.5974680166535271,0.8715095869768755,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.197020791620573,-1.7645135849361147,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.17910213752862986,0.9184771676276685,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.8736084908949318,-0.8701833340815202,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-2.141286590336398,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,-0.0435273700892056,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.9460830378293954,0.024146916773074168,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.2166931946515063,-0.8701833340815202,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.2131575947761968,-1.7645135849361147,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.2039613247155165,0.9184771676276685,1.7895274247278445,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.8233070367656448,-0.7080097193124013,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,-0.06244376793067328,0.9184771676276685,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7465038113077511,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.8233070367656448,1.4135375803531633,-1.7645135849361147,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.19577621107936602,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.0435273700892056,0.9184771676276685,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.7196835879968371,1.1660735452706705,-1.7645135849361147,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8352661824504807,0.024146916773074168,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.43339005037194916,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,-0.1046006161335618,0.9184771676276685,1.7314770061469296,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-1.4503017405821597,-1.7645135849361147,-0.8885566148438414,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,1.2703891878031444,-0.8701833340815202,1.4250403970430767,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +1.2140261421940395,1.7746481456438878,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.2673434116480888,0.024146916773074168,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.2140261421940395,1.221875045798567,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.7098610376907004,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.5812546967791408,0.9184771676276685,1.2941289148926294,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.27261073577272804,0.9184771676276685,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,2.3035462985602226,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.3559653537222094,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.7196835879968371,0.9867436785831314,0.9184771676276685,1.157872651879297,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.13625823124290218,-0.8701833340815202,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.126113093929714,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.2140261421940395,-0.36419823131605006,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,-0.010269945504946504,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.5655407357021147,-1.7645135849361147,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.5999820607810264,-1.540611554529536,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.3705265054419635,-0.9394240185144892,0.024146916773074168,1.3602264200472212,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.4789131410521794,1.1636952089857369,-0.8701833340815202,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.2568729234011431,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.5905095136250601,0.9184771676276685,2.220812271924916,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.4789131410521794,0.6398389276596715,0.9184771676276685,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.8233070367656448,0.8782791589714488,-0.8701833340815202,1.3602264200472212,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.9107628400069734,1.7570173834616774,-0.8701833340815202,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.595876009995791,-0.8701833340815202,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.161312474665324,-1.7645135849361147,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.8862929316691417,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.5555249926108252,0.024146916773074168,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.41035040137585416,0.9184771676276685,1.7895274247278445,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.05350733076049297,0.024146916773074168,1.4250403970430767,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6905536020519931,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,-0.8020057724539522,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,2.1308376361876697,-1.7645135849361147,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.09077642592192822,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.0029553479656197,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.7552219523560137,0.9184771676276685,1.6723990933496,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6932941575756586,0.9184771676276685,1.5510106274611108,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.4274666579207524,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.521474559691811,0.9184771676276685,-1.7642799753681382,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.7793217579792584,0.9184771676276685,1.5510106274611108,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+1.9107628400069734,2.4329905252942026,0.024146916773074168,1.7895274247278445,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.2342611082653446,0.9184771676276685,1.95785628118201,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.10694877085967346,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,-0.04132005928022949,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.20399462102766036,1.38268745847696,0.024146916773074168,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.6395608388143988,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.049062247026092645,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.2258418618515272,0.024146916773074168,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.29524023815448264,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,0.016891376735312855,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,2.0600694894399676,-0.8701833340815202,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.268395504333086,-0.8701833340815202,0.46481087889052375,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,1.1902546855206053,-0.8701833340815202,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.4985559105628365,-1.7645135849361147,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.4089028087708907,-0.6923799912587492,-1.7645135849361147,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.7980319300763852,-1.7645135849361147,2.6026928508806457,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-1.2453441186561935,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.6572241362412526,-1.7645135849361147,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,-2.3417893563402172,0.9184771676276685,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,1.2282717341633516,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.5207266331292069,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.4666264793189246,0.024146916773074168,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,0.5075057842703586,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.4789131410521794,1.403874996819156,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.15961568115645694,-0.8701833340815202,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.0645089130574243,0.7386434343965812,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.025429148231241563,0.024146916773074168,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+0.2878338890420431,-0.35291393483100025,0.024146916773074168,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.6079245718904525,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.4008410049331007,0.024146916773074168,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-1.4535971832215155,0.9184771676276685,2.065537068931412,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.2918873417280138,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.1662809633907126,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.592843728434742,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.7121648599011403,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.335477709772362,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,0.4787373511257855,0.024146916773074168,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.6204389482909707,-1.7645135849361147,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.3512676640726533,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.14853305289879132,-0.8701833340815202,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.25231870427607594,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +2.0918105708507295,1.8502419194373931,-0.8701833340815202,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.8579265443504762,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.7196835879968371,1.6394279689830877,-0.8701833340815202,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4785853749281765,0.9184771676276685,2.220812271924916,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+-0.2903479331695421,0.15801445969508324,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.32101144461029796,-0.8701833340815202,0.8674447781905507,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.5782220978378143,-0.8701833340815202,-0.7625863844258073,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.884744657979629,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.7196835879968371,0.4478039952178397,0.024146916773074168,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.025819390968304514,-0.8701833340815202,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.06862371919994346,0.9184771676276685,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.50317705443334,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.3294392065544454,-1.7645135849361147,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-1.628249904034309,0.9184771676276685,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.0309145871324998,0.024146916773074168,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.544071429465936,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7522348097976082,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.9870846309824751,-0.2918873417280138,-1.7645135849361147,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1405488754873503,0.9184771676276685,-1.155724360007621,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.5047158993834735,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.3661489309039503,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8150060579794622,-0.8701833340815202,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.808989623532786,0.024146916773074168,1.157872651879297,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-2.5289616216937225,0.9184771676276685,0.9424550121005936,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.4835157693264826,-0.8701833340815202,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.30941846072708806,-1.7645135849361147,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.22429567877487686,0.024146916773074168,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.0661130852226384,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.8352661824504807,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,1.4277206335103227,-0.8701833340815202,2.4172109717234496,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,-1.258038973753803,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-1.864869059639166,0.270995778550678,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9764819741456465,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8233070367656448,-0.5481503102688159,0.9184771676276685,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.6421117319751589,-0.8701833340815202,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.2845415373982147,0.9184771676276685,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-1.201193594157096,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4836416954897562,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.20138253190192865,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.3705265054419635,-0.6887297979688279,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.2878338890420431,0.4470493730178621,-0.8701833340815202,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,1.4585566314896248,0.9184771676276685,1.4886197467137152,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.0846426621106051,-0.8701833340815202,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.5327681750371434,0.9184771676276685,1.4250403970430767,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.2660299381708094,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-1.8571427070735047,0.9184771676276685,2.1698522754169245,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.7754560779744457,-1.7645135849361147,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.7899785253100698,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,-0.4921885278537161,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,0.5772481509827424,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.2939731867977968,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.7389019168708525,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,0.9715095521911344,-0.8701833340815202,2.6026928508806457,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.17944725343182721,-0.9273044840465697,-0.8701833340815202,2.3204559381997742,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.6434538252089206,0.9184771676276685,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.3812351423791228,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.828725279026598,-0.8701833340815202,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.5999047283469973,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.7196835879968371,0.6234869649375272,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.174202826038074,-0.8701833340815202,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.5974680166535271,0.29372608028670516,0.9184771676276685,-1.601427436689871,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.6117937334450844,-0.8701833340815202,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7182250456574802,-1.7645135849361147,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.199837973443276,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.17304107162380952,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.743450219619934,0.9184771676276685,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-1.2203235777442643,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.9350037144551346,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.9684913566991636,0.024146916773074168,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.2679999527780403,0.9184771676276685,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.6169546910909554,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.9096929049527973,0.024146916773074168,0.37810951241256296,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.2845415373982147,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6796492997141513,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.2901581759113947,-0.8701833340815202,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.6896413773274241,0.024146916773074168,1.6122566565644016,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-1.168132361826232,0.3118774850855542,-1.7645135849361147,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.5968843711034428,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7196835879968371,1.0120649721293709,-0.8701833340815202,1.2941289148926294,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,-0.30264746579964313,0.9184771676276685,2.220812271924916,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.41958437505525603,0.9184771676276685,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.2258418618515272,-1.7645135849361147,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.3243915106367889,0.024146916773074168,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,0.622828550324511,-0.8701833340815202,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.28946792251309633,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.9870846309824751,-0.7351177221696129,-0.8701833340815202,2.1181021700318468,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7407091334673612,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.9001768006051394,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.581861979833523,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.270995778550678,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,0.8471170589214615,0.024146916773074168,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.10251833427823558,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.9500519187228837,-1.7645135849361147,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,1.5223659684262065,0.9184771676276685,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-1.168132361826232,-0.6316906709287615,0.9184771676276685,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.34293757897345356,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +1.2140261421940395,0.7170942963619847,0.9184771676276685,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,0.049158037317674076,-0.8701833340815202,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.15418779747967742,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-1.3204807646545005,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-2.5616057574520994,-0.5753996905428135,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.36136787900912043,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,2.315463924071049,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.2878338890420431,0.1763049132775382,0.9184771676276685,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8059919082536569,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.05684458250013819,-1.7645135849361147,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.249561855219652,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.5335263748256596,0.024146916773074168,1.3602264200472212,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.9494254972124627,0.024146916773074168,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,0.8326485624349164,0.024146916773074168,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.9163847799675291,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.45944447609274913,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.8393567172180595,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.5637645282690953,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.8958888852682305,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.2912178081355871,-0.8701833340815202,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-2.2136129465191967,0.024146916773074168,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-1.658839290029222,0.024146916773074168,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.08963107707350554,-1.7645135849361147,1.0158165352161053,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.9007313188405941,-0.14878268289945443,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.6989089628615197,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+-1.168132361826232,-0.22278487572030806,-0.8701833340815202,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.44697087756068,-1.7645135849361147,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.13706249611248938,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.14938213272168033,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,0.5891333605456632,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.8233070367656448,-0.9505415798891141,0.9184771676276685,1.7314770061469296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.48438876209009274,-0.8701833340815202,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5270797453951935,0.9184771676276685,1.6122566565644016,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.014254053604786,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,0.16742660797110837,-0.8701833340815202,1.7314770061469296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.9870846309824751,-0.8673127908963916,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-1.419408361677478,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.2372967255742207,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5846123986652059,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.4089028087708907,1.6929071744373827,-1.7645135849361147,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.1363904554511066,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.12645735501432606,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.699711546570046,0.024146916773074168,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+1.7083686963912428,2.0847641111591955,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,0.305995752285377,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,2.0665293159862403,-1.7645135849361147,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,0.14328838661407511,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.0048988774125396616,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8352661824504807,-0.8701833340815202,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.864869059639166,0.8343263196636707,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.30738342365816596,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.4295699545716889,0.024146916773074168,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-0.5506038509079058,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.6607860386328506,0.9184771676276685,1.2266960166673486,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.5955857540826947,0.9184771676276685,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.6112514969755848,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.6721912732076694,1.9216455059659838,-1.7645135849361147,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,0.8828622363597508,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.4751475801392828,-0.8701833340815202,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.021697749245869,0.9184771676276685,1.2266960166673486,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-1.1045421368991482,0.9184771676276685,1.157872651879297,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-1.0407168011077441,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.751817879746369,-1.7645135849361147,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.260590529406177,-0.8701833340815202,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.4789131410521794,0.8850133509989807,0.9184771676276685,1.6723990933496,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.4831103056803951,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.8266551888028266,-1.7645135849361147,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.8520877707552964,-1.7645135849361147,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.8099904137731051,0.9184771676276685,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.053263233524950146,0.024146916773074168,0.790710020896414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.2878338890420431,0.16601916122315946,0.024146916773074168,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.6472830029739154,-1.7645135849361147,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.5864263902861151,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.7541508211092194,0.9184771676276685,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-1.0870450269365557,0.9184771676276685,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.848080723599094,-0.8701833340815202,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.5762344928847029,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.8890370371133962,-0.8701833340815202,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.5459177419992647,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-1.3705265054419635,-0.8610477022019005,0.024146916773074168,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-2.7938798157605738,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.0333113835260288,0.024146916773074168,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.064070695443022,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.5676604143861476,-0.8701833340815202,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.6225943688262727,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-0.61781686400344,-1.7645135849361147,-0.524069587159072,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.16648848097689462,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.5800407535661753,-0.8701833340815202,0.37810951241256296,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,0.7595419962838468,-1.7645135849361147,1.157872651879297,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.592843728434742,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.5307036491782056,-1.7645135849361147,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.7289757323429747,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.2825453830568732,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.2412325051405017,-0.8701833340815202,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.1250044128912331,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0995186710056148,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.3705265054419635,0.4409964323758953,-1.7645135849361147,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,1.1301513057406816,-0.8701833340815202,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.5913541661427753,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.7951741669047615,0.9184771676276685,-0.301466121277391,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.0503326470007321,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.2993458529138144,-0.8701833340815202,2.6474964086269943,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.33270967998683115,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.0918105708507295,1.4452326911786044,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,2.021950078687176,-0.8701833340815202,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.6861188294812336,0.024146916773074168,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-1.864869059639166,0.4356766833937663,-1.7645135849361147,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.5362472898802806,-0.1784887038728471,-0.8701833340815202,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.8220609223671386,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.7044842871017741,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-0.5947383240162177,0.9184771676276685,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.1172035170558898,-0.8701833340815202,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.2878338890420431,0.5150462446255809,-1.7645135849361147,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.36886272628501093,-1.7645135849361147,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.7853425234764302,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.9870846309824751,0.634628842746036,-1.7645135849361147,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.6483622211883368,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.5362472898802806,-1.2968916871839868,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.5707499739659353,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.41919103550504583,-1.7645135849361147,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.9707981653279775,0.9184771676276685,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.8620897704913175,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-1.189044047540058,-1.7645135849361147,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.3752633224683898,-0.8701833340815202,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,1.215654319025098,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-1.5459177419992647,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.8517068185023379,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.04463244508801633,-1.7645135849361147,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.7922902861806974,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4618728530926033,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,-0.0038273452346677026,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.8589660904896659,0.9184771676276685,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,0.4413755753445366,-0.8701833340815202,0.2890958389937262,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.09192279376412217,-0.8701833340815202,0.19764313372674408,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.4022994288235318,-1.7645135849361147,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.03156355086263981,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.5770698364274996,-0.8701833340815202,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.6019143737291102,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.5331771509901511,-1.7645135849361147,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.4590706135137063,0.9184771676276685,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.28090915727133475,-0.8701833340815202,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.31397161421484665,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,1.1885575929214574,0.024146916773074168,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.1511821572793285,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +2.0918105708507295,2.242714897447815,-0.8701833340815202,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +2.0918105708507295,2.335404564249628,0.024146916773074168,1.902684530253145,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.4925340087405463,-0.8701833340815202,0.2890958389937262,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.05072735693212146,-0.8701833340815202,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.5999820607810264,-1.3505876803356258,0.024146916773074168,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,2.2943102181590924,0.9184771676276685,1.7314770061469296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-2.444378052901822,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.20399462102766036,-0.1121511113947324,0.9184771676276685,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,2.167499506108515,0.024146916773074168,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.36777221524449144,-0.814001366022412,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.7676432728708812,0.9184771676276685,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.735931716728659,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.09709410787451524,-0.8701833340815202,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9821909660708611,0.9184771676276685,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,1.4393052567549693,0.9184771676276685,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.047399286111720625,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.6572241362412526,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.3634900601689934,-0.8701833340815202,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.157194416874332,-1.7645135849361147,-1.9354874994743552,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.3832196143093988,-0.8701833340815202,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.26997438217857805,-1.7645135849361147,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.0038273452346677026,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.3515686599427887,2.040712547486976,-0.8701833340815202,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,0.08514257186261158,0.024146916773074168,1.6122566565644016,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,-0.03526937961693151,0.024146916773074168,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-0.8757139119019909,-1.7645135849361147,1.0158165352161053,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5989810587340788,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.6360570504701303,0.024146916773074168,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.31690788263994424,0.9184771676276685,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.6446534101214754,-0.8701833340815202,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4089028087708907,-1.419408361677478,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.23635259189993624,0.9184771676276685,0.790710020896414,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.2170848633152671,-1.7645135849361147,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.12091855757970564,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,-1.0685580904527296,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.3776553727277202,0.024146916773074168,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.2162003186429162,0.9184771676276685,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.26340700114368076,0.9184771676276685,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.0645089130574243,1.4179865778317933,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.06637303326589096,0.9184771676276685,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.5319116911937641,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.7191577401387655,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1392261885931338,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,0.08564228773842482,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,1.618007495993326,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.1832711441739957,-1.7645135849361147,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0820891847902205,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.3579890974547502,-0.8701833340815202,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.2240549767296369,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.190388353733601,-0.8701833340815202,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,-0.5367618250369562,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.1739744417259306,-1.7645135849361147,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.02294689018390335,0.1818809077115242,-1.7645135849361147,-1.601427436689871,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.7821764432392455,0.1650800085798083,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9561365403500611,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.7842192073172909,-1.7645135849361147,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9109594356590391,-0.8701833340815202,1.6122566565644016,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-2.1781638829926115,0.45981823022724516,-1.7645135849361147,2.6026928508806457,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.992487973892608,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.32719139587956814,0.9184771676276685,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.20513367682089648,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-1.168132361826232,-0.157200465136036,0.024146916773074168,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6264704718793631,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-1.6705001602889722,-0.8701833340815202,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8099904137731051,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.4789131410521794,0.43947874389697433,0.9184771676276685,2.1698522754169245,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7741247466618797,0.9184771676276685,-0.8885566148438414,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.31895803004084333,0.9184771676276685,0.46481087889052375,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.5134868205257854,-0.8701833340815202,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.20399462102766036,0.11038416485333973,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-1.5601753410747223,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.9631924429819175,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.853776726368572,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,-0.3480079612200745,0.9184771676276685,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.5999820607810264,0.08414255833190894,-0.8701833340815202,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6796492997141513,0.024146916773074168,1.6122566565644016,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.602385333817686,0.024146916773074168,1.95785628118201,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.9609383896022512,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.982662296108943,0.024146916773074168,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,0.17257413290892082,-0.8701833340815202,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.13032387665425876,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.8906429999755946,-0.8701833340815202,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.1346598898418097,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.8233070367656448,0.6201915222981714,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.43643799353091134,0.024146916773074168,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-2.513846875128404,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.8295669888243895,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.5225428344825612,-1.7645135849361147,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.2903479331695421,-0.4355581043724524,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.28776071143905857,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.4989308087237618,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.2745915468770928,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.24741669362265298,0.024146916773074168,2.065537068931412,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.48283445868131436,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.4153190231498993,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.0645089130574243,0.17490713620721476,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.8531897741319698,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.31940047520481774,0.024146916773074168,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.169731837552031,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-1.6822675239881504,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.8600064759879598,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.8527413567912606,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.32853598950625584,-1.7645135849361147,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.3515686599427887,0.08063638645521447,0.9184771676276685,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.9772747401966062,0.9184771676276685,-1.0194680969942889,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.088286971485364,0.024146916773074168,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-0.2903479331695421,-0.6316906709287615,0.9184771676276685,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.5057039137912442,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8991034832839071,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.5232545259503283,0.9184771676276685,1.7314770061469296,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-1.1877011423954342,0.024146916773074168,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.9405314679764167,0.9184771676276685,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.8905489796993064,0.9184771676276685,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.3131343709302673,-0.8701833340815202,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.5704070551695962,0.9184771676276685,2.012130890609527,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+-0.9870846309824751,-0.6850899159731182,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.9971538191659439,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,0.485715230519944,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.28520760997424366,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.7196835879968371,1.5945220453619653,-0.8701833340815202,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.8610477022019005,-1.7645135849361147,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-2.4231964684739915,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.1656736019042681,0.024146916773074168,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.27876055352245976,-0.8701833340815202,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+1.2140261421940395,1.700919300263891,0.9184771676276685,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.5416302843664004,0.9184771676276685,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.7960495952705704,0.9184771676276685,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.7744428526058714,-0.8701833340815202,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.332254656934756,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.059855258630974,-0.8701833340815202,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.9821909660708611,-0.8701833340815202,1.2266960166673486,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8831100958729894,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.8926822870899767,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-0.9870846309824751,-0.7881506134781747,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.028700882624603703,-0.8701833340815202,0.19764313372674408,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,1.3443265185581093,-1.7645135849361147,0.006858926312855975,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.6896413773274241,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.3515686599427887,1.6356824940778016,0.9184771676276685,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4904144699247184,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.855116015085992,-0.8701833340815202,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,0.1385019124263602,-0.8701833340815202,-0.8885566148438414,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.8259314178186645,0.9184771676276685,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.08734342912304956,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-1.7741247466618797,0.9184771676276685,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.7379548257810722,-0.8701833340815202,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.395405584273037,-0.8701833340815202,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.8969595270993638,0.024146916773074168,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-1.5946831539714412,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.0631351227973544,0.024146916773074168,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.3596058070390366,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.3939584215549896,0.024146916773074168,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.5172894443811066,0.47468018702782266,0.024146916773074168,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.6548319621298557,1.5388073833809512,-0.8701833340815202,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.5854531930634803,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-0.6100779976794142,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.06076345814727301,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.6666014663323474,0.024146916773074168,0.6317364076235067,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.07709897911105941,0.024146916773074168,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.11136402601815405,-0.8701833340815202,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.6100779976794142,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.7861834110050226,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.3248065589153429,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.05128287208176305,-0.8701833340815202,-0.09278473996200068,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7257055738935877,0.9184771676276685,2.6474964086269943,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.5196921311745797,0.024146916773074168,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.5055460761853598,0.9184771676276685,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,0.3013554817389127,-1.7645135849361147,0.9424550121005936,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.583772152551548,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.22151602542610968,0.9184771676276685,1.902684530253145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.9007313188405941,1.246664840019176,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,0.22070148573413742,-0.8701833340815202,0.006858926312855975,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.8424331628569156,0.9184771676276685,1.95785628118201,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.7647401157216668,0.9184771676276685,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.22787280876610916,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.5687406712279073,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7386491532210315,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.7083686963912428,1.6755687109437596,-0.8701833340815202,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.1402656985815554,0.9184771676276685,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+-1.864869059639166,-0.41402663031645043,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.3720536463536046,-1.7645135849361147,0.006858926312855975,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.2451944382927659,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.9012510127304115,0.9184771676276685,2.2710059235929747,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.7666748268503689,-0.8701833340815202,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.07426769392522936,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.9394602907081131,-1.7645135849361147,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.49810842098674996,0.024146916773074168,-0.7625863844258073,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.2892112922310683,-0.8701833340815202,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.26492399243317133,0.9184771676276685,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.5737317066598784,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.6263619661586286,-0.8701833340815202,-0.524069587159072,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,2.107850796138405,-1.7645135849361147,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.8622455186147777,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.5679107083202207,0.024146916773074168,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.7275825113990118,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.08514257186261158,-0.8701833340815202,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.6443805343692689,0.9184771676276685,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.7083686963912428,1.3269036196508959,0.024146916773074168,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.3772570059384482,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.1636952089857369,0.024146916773074168,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.0457033605284425,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-1.4129978142774215,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-1.0335517184948613,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.18604712521819458,-0.8701833340815202,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.25005800107908954,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.2898797837478186,0.024146916773074168,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.2140261421940395,0.20075219060594576,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-1.2189476914740385,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0648923202145708,0.024146916773074168,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.3326671805078755,-1.7645135849361147,1.2266960166673486,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.7126430122214735,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,-0.015663496158023912,0.9184771676276685,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.15901145260159322,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.2170848633152671,-1.7645135849361147,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,0.1911209427876586,-0.8701833340815202,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-0.8079896100020846,0.9184771676276685,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.8820508953450885,0.9184771676276685,1.0876002682121086,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.09249636037481059,0.024146916773074168,1.902684530253145,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-2.365006513655273,0.9184771676276685,1.7895274247278445,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.2647178022431281,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.5522421440334229,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.1532824772573735,0.024146916773074168,-1.7642799753681382,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.7407981900160938,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.2272251334493627,-1.7645135849361147,-0.1954948418550697,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,-0.6634639419929222,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.726547430338748,-1.7645135849361147,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.18002492270264447,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.606257882523891,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.5999820607810264,-0.04573846864917447,-1.7645135849361147,0.9424550121005936,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.3515103209997907,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,0.42228042194752646,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-0.8220609223671386,0.9184771676276685,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.20440238389586818,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.8103519000326348,0.024146916773074168,2.6026928508806457,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.13080637707915702,0.9184771676276685,1.2941289148926294,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.2095238594993918,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.7970403814552081,0.9184771676276685,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.5213403712962208,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,-0.8579265443504762,-1.7645135849361147,1.157872651879297,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.2878338890420431,0.4253547267036001,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.5268070988366464,-1.7645135849361147,-1.446152233696366,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.37201391067774936,0.9184771676276685,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.7083686963912428,2.0884647749212926,-0.8701833340815202,1.157872651879297,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.7183444062868711,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-1.864869059639166,0.5423230948248702,-1.7645135849361147,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.4571996717451271,-0.8701833340815202,-1.601427436689871,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.7101499769735495,0.9184771676276685,1.2266960166673486,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,0.5338829945408807,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.587136429579318,0.024146916773074168,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.6231577996861181,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.3368639577486042,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,0.096091800409488,0.9184771676276685,2.1698522754169245,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.2555700745847507,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.1693220725900519,-0.8701833340815202,0.6317364076235067,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.6679395191554858,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.4789131410521794,0.6646219665213371,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,0.595876009995791,-1.7645135849361147,2.1181021700318468,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.5367025085097402,-0.8701833340815202,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.5679107083202207,0.9184771676276685,0.8674447781905507,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.6616780548468318,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-1.168132361826232,-0.25491914384616077,-0.8701833340815202,-1.9354874994743552,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.7436478425663765,0.024146916773074168,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.7080097193124013,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.723831366480044,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,2.4091752342819883,-0.8701833340815202,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.6186796141994723,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.4789131410521794,0.7651564599365479,0.024146916773074168,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-1.5057846406795408,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.7891353421234543,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+-0.2903479331695421,0.8376752945694327,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.34172760417431386,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.3216966439721684,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.5410699254366262,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.8264780277578018,-0.8701833340815202,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.7145010044246421,-0.8701833340815202,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.205269034675227,0.9184771676276685,0.37810951241256296,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.6244767904461521,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.4570063798051685,0.9184771676276685,1.2941289148926294,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-0.2932274535176988,0.9184771676276685,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.06020384147414028,0.9184771676276685,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.5580728910888668,-0.8701833340815202,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.03636739192848441,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.4089028087708907,0.6424360912895554,-1.7645135849361147,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.2881559427426412,0.9184771676276685,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.860111785512503,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.7204565782048127,-0.8701833340815202,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.2488949862868526,-0.8701833340815202,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0 
+-0.6737898076290296,-0.8000173193284096,0.024146916773074168,-1.2977804766708128,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.27086014789172,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.09536802942532471,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-1.072234319393326,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-1.0759210668859946,-0.8701833340815202,1.2941289148926294,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.2393390740278603,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.9007313188405941,0.4776321181462543,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.5186214893434488,-1.7645135849361147,-0.7625863844258073,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.8975791350704793,0.024146916773074168,0.790710020896414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+0.9007313188405941,0.28776071143905857,0.9184771676276685,1.0876002682121086,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.6951786325273495,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1169791344049713,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.8781368027101244,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.2928755737412154,-0.8701833340815202,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.7080097193124013,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.19143254138080676,-1.7645135849361147,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.08614180988836612,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.8114901663342491,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.97677444224055,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +2.405105394204176,1.0835624157804757,-0.8701833340815202,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.015663496158023912,0.9184771676276685,2.1181021700318468,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.6536720530279021,0.024146916773074168,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.5804166366717869,-1.7645135849361147,1.6723990933496,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-0.07370218267207239,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.6316906709287615,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.9649611699166599,0.024146916773074168,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.2140261421940395,1.8132607815571835,-1.7645135849361147,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+2.0918105708507295,1.2820895413511235,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.9870846309824751,-0.7512778714917354,0.9184771676276685,-0.7625863844258073,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,0.21980136877847217,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,0.1697689415913028,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.533530121097724,-1.7645135849361147,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-1.167723728741789,-0.8701833340815202,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.11136402601815405,0.024146916773074168,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.23501849269226532,-0.8701833340815202,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,0.8633388724093886,-1.7645135849361147,0.8674447781905507,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,-0.07087834334411355,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.8620897704913175,-0.8701833340815202,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.008783116243064,0.9184771676276685,-0.524069587159072,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.2341638544154838,-1.7645135849361147,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.377727966027024,0.9184771676276685,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.30467503092239573,0.9184771676276685,0.5493170365187798,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.9856285422395067,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.242803654254177,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.7228952840780793,0.9184771676276685,1.0876002682121086,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 
+0.9007313188405941,0.04710052411460556,0.9184771676276685,0.6317364076235067,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.7196835879968371,0.05990640739345421,0.9184771676276685,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,1.0195895368790233,-1.7645135849361147,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.2885877706351847,0.024146916773074168,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.26709577344560614,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.8937502667363918,0.9184771676276685,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.1681047674738778,-0.8701833340815202,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.7324888990827545,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.9023261211529298,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+0.2878338890420431,0.03312570587673631,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.9685316434249494,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +2.0918105708507295,2.2678525198905097,0.9184771676276685,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.9007313188405941,1.4775246058726885,-1.7645135849361147,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.3131343709302673,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.6083645434779358,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.3425283309567904,0.024146916773074168,-0.301466121277391,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.20399462102766036,1.2050096669502504,0.024146916773074168,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.7608793866321918,0.9184771676276685,0.006858926312855975,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.864869059639166,-2.2136129465191967,0.9184771676276685,-1.155724360007621,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.8704567957981898,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-1.4388336145456433,0.9184771676276685,2.220812271924916,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.08113785264937592,0.024146916773074168,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-2.9213656571223523,-0.8701833340815202,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.7812785045506497,-0.8701833340815202,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.32032660942562263,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,1.2599881839166103,0.9184771676276685,1.8465854797112444,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.8220609223671386,-0.8701833340815202,0.19764313372674408,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,1.2573747614113253,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.2503718128913762,-0.8701833340815202,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.3705265054419635,-1.4887179359473908,0.9184771676276685,0.8674447781905507,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.09757769413898225,-0.8701833340815202,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,0.4341527123472398,-1.7645135849361147,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.6479377435719302,0.024146916773074168,-0.524069587159072,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,2.018610506961237,-1.7645135849361147,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.7681016437779716,0.024146916773074168,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,1.3274754891933684,-1.7645135849361147,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+0.9007313188405941,0.73834241410094,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.04795336798206724,-0.8701833340815202,-0.1954948418550697,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.9870846309824751,-1.0746909786115382,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.2878338890420431,0.34743074918374894,0.9184771676276685,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.03676338588404028,0.9184771676276685,-1.446152233696366,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.3515686599427887,2.281740099563163,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.5362472898802806,-0.37987728657481445,0.024146916773074168,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.05121227030699309,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.8230719207474747,0.024146916773074168,0.2890958389937262,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.2878338890420431,1.0010634288448887,-1.7645135849361147,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.2285784732834433,-0.8701833340815202,1.2266960166673486,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-1.1664028497233658,-1.7645135849361147,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.3582256242750765,-0.8701833340815202,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,1.1234704195172134,0.9184771676276685,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.5662523845782773,0.9184771676276685,1.6122566565644016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.811994329215319,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.8261096841261264,0.024146916773074168,0.46481087889052375,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,-1.6113572275221764,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+-0.2903479331695421,0.9502321945113027,-1.7645135849361147,1.8465854797112444,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.16445973712336395,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.11098661301485367,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.2878338890420431,2.1377662612276866,0.9184771676276685,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.02634264676011784,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.3260509021985633,-0.8701833340815202,-0.301466121277391,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,0.23724122460387168,-1.7645135849361147,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.521474559691811,-1.7645135849361147,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.4859449426773566,-0.8701833340815202,2.065537068931412,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.19709162743479447,-0.8701833340815202,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.5547221536718805,0.024146916773074168,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.2878338890420431,0.39348988683374747,-1.7645135849361147,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.10575935485340944,0.9184771676276685,1.4250403970430767,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.6299482536657656,-0.8701833340815202,2.220812271924916,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.7285220064239617,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.4919267256861641,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.482414629219901,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.1100038203376585,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,0.6092563729078653,-0.8701833340815202,2.2710059235929747,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.3515686599427887,0.9213045523499567,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.2812163294376125,0.9184771676276685,1.902684530253145,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-0.6923799912587492,-0.8701833340815202,-0.301466121277391,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.3659091137665273,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.20399462102766036,1.2703891878031444,-1.7645135849361147,1.902684530253145,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.2805523163491565,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.36777221524449144,0.13273457402653865,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.1688325190430867,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+1.7083686963912428,1.2208403413587743,0.9184771676276685,1.2941289148926294,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.3950738730377965,1.1731823275244582,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.2878338890420431,-0.9131268329230363,-0.8701833340815202,0.46481087889052375,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.239134984794895,0.024146916773074168,0.6317364076235067,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.8778227787714831,-1.7645135849361147,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.233237542347047,0.024146916773074168,1.3602264200472212,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,0.3068376454023158,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-0.06132331797343269,-0.8701833340815202,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.02294689018390335,1.463024687661877,-1.7645135849361147,1.3602264200472212,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-0.8393567172180595,-0.8701833340815202,-1.446152233696366,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-1.5637645282690953,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.5712337723661562,0.9184771676276685,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5319116911937641,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.3813121612995245,0.024146916773074168,1.157872651879297,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.20399462102766036,0.735931716728659,-1.7645135849361147,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.1664028497233658,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,1.4563167799119987,-1.7645135849361147,-1.155724360007621,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.734286760747295,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.864869059639166,-1.814965246353996,0.024146916773074168,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.228609891637058,0.9184771676276685,2.1698522754169245,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.4789131410521794,1.7359534860455126,-1.7645135849361147,1.6122566565644016,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.3935735442142963,-0.8701833340815202,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.3705265054419635,-1.2608771188872538,0.024146916773074168,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-2.070140172686223,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.7831062738827694,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.3022004111919005,0.024146916773074168,1.5510106274611108,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.7083686963912428,0.6287422049304561,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 
+1.2140261421940395,1.4494004059807892,-0.8701833340815202,1.5510106274611108,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.864869059639166,-0.7599160091146911,-1.7645135849361147,2.012130890609527,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.5670812796661316,-1.7645135849361147,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,2.4258595386936377,-0.8701833340815202,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.7531924593174495,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.0111216239078757,0.024146916773074168,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.28733355500563035,0.024146916773074168,-1.155724360007621,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.15043492390036597,-0.8701833340815202,0.6317364076235067,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.3705265054419635,-1.5388476701067393,0.9184771676276685,1.0876002682121086,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,0.5782711898373049,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,-0.527883812422618,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.6401638595412662,-1.7645135849361147,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.7032474092821677,-0.8701833340815202,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.5971527789106184,0.024146916773074168,1.0876002682121086,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.7083686963912428,1.3121380979339703,0.9184771676276685,0.37810951241256296,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.020537070502105105,0.9184771676276685,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.8831100958729894,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.5974680166535271,2.0468337816576634,-0.8701833340815202,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.9601546796032658,-0.8701833340815202,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.046845442402512136,-0.8701833340815202,-1.0194680969942889,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.0335517184948613,0.9184771676276685,1.157872651879297,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.2910612818983251,0.9184771676276685,-1.601427436689871,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-1.7911632411419318,-1.7645135849361147,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.2618773361961817,0.9184771676276685,2.065537068931412,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.10153172062387894,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.9840098187349172,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.2878338890420431,0.27746966942420725,0.024146916773074168,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,-0.2792253158511944,0.9184771676276685,-1.2977804766708128,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.5629421261882505,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,0.4500652137618334,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.2878338890420431,0.643408679613227,-0.8701833340815202,0.10361395983652989,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,1.3274754891933684,-0.8701833340815202,2.3204559381997742,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.8486081780439698,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.168132361826232,-0.7351177221696129,0.024146916773074168,1.8465854797112444,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.5782220978378143,0.9184771676276685,1.7314770061469296,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.20399462102766036,-0.0988225016543111,0.9184771676276685,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+0.5172894443811066,0.6646219665213371,0.024146916773074168,0.790710020896414,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.6378077536873272,0.9184771676276685,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.6634639419929222,0.9184771676276685,-0.1954948418550697,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,1.9399023430912783,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.6351825900567682,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-1.279479025309302,0.9184771676276685,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.41996987015749826,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,0.6378876137537329,0.024146916773074168,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,0.5071456116978844,-1.7645135849361147,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,1.1308174931073358,0.9184771676276685,-0.7625863844258073,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-0.1374458049395338,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.013725484799558268,0.9184771676276685,0.10361395983652989,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.8103519000326348,0.9184771676276685,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9870846309824751,-0.11331666314280593,-0.8701833340815202,-1.0194680969942889,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.10691913654254248,-0.8701833340815202,-1.7642799753681382,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.012963898753946785,0.9184771676276685,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,0.4153362995557147,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.4454066153699499,0.024146916773074168,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-1.6862139814977934,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.4913598081011363,-0.8701833340815202,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-0.7551098962332411,-1.7645135849361147,1.0158165352161053,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.7182250456574802,0.9184771676276685,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-1.308632245583138,0.9184771676276685,-1.155724360007621,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,0.5118200205651716,-0.8701833340815202,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.8233070367656448,-0.7754171569015452,0.9184771676276685,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.2812163294376125,0.024146916773074168,0.19764313372674408,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.2140261421940395,0.5136133644449646,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.6814602871797085,0.024146916773074168,-0.7625863844258073,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.9007313188405941,1.3238493548261325,-0.8701833340815202,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.5139834672953743,0.024146916773074168,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,1.7351203928895729,-0.8701833340815202,1.157872651879297,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.9007313188405941,-0.13033672555645318,0.9184771676276685,0.19764313372674408,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,-1.0146373624318314,0.9184771676276685,2.1698522754169245,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.6693948477767842,0.024146916773074168,0.790710020896414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-1.4307041404521588,0.9184771676276685,1.4886197467137152,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.20303477290284017,0.9184771676276685,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+1.7083686963912428,2.410738320189429,-0.8701833340815202,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.299151050982891,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.7821764432392455,1.5156313986744103,0.024146916773074168,0.5493170365187798,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7196835879968371,0.015309403093213013,0.9184771676276685,0.790710020896414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.35713397365652133,0.9184771676276685,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.13625823124290218,0.9184771676276685,0.37810951241256296,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.1299291667228684,0.024146916773074168,0.006858926312855975,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.9339010037591353,0.9184771676276685,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +1.2140261421940395,1.6884503735133218,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.2878338890420431,0.09906187633759772,0.9184771676276685,-0.6411979185373197,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.5359522005242147,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.12738614636552228,-1.7645135849361147,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.02294689018390335,0.6995296359802818,-0.8701833340815202,0.712169575197047,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-0.17944725343182721,0.1084222042616649,-0.8701833340815202,0.790710020896414,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.20399462102766036,0.4735714670084882,-0.8701833340815202,0.8674447781905507,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +1.2140261421940395,1.8918610055772254,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.2903479331695421,-0.7145010044246421,0.9184771676276685,0.5493170365187798,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.8190326780364441,0.9184771676276685,0.19764313372674408,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.6737898076290296,-1.0041187948795498,-0.8701833340815202,-1.7642799753681382,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,0.572463200344689,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.554533314240912,-0.8701833340815202,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,0.5132548952898817,-1.7645135849361147,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,0.3616238456185479,-1.7645135849361147,-0.09278473996200068,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.2140261421940395,0.7896892633558561,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,0.8687917682733318,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.9007313188405941,0.276608360989972,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.8233070367656448,-0.15358610829310324,-1.7645135849361147,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5172894443811066,0.3478384433398584,0.9184771676276685,-0.301466121277391,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,2.6191927041199468,-1.7645135849361147,-0.1954948418550697,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.9870846309824751,0.2134828821505448,-1.7645135849361147,-0.1954948418550697,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,2.345303594009422,-1.7645135849361147,2.3204559381997742,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.02924698026771496,-1.7645135849361147,-0.09278473996200068,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.42074051454030087,0.9184771676276685,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.02294689018390335,-0.36561574231669436,0.9184771676276685,-0.6411979185373197,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +1.7083686963912428,2.144758900868038,0.9184771676276685,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.168132361826232,-0.7332297842939325,0.024146916773074168,-1.446152233696366,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 
+-0.6737898076290296,-0.24068123798813007,0.024146916773074168,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,1.3484581912618832,-0.8701833340815202,1.6122566565644016,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.6002172385111977,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-1.0734620635258851,0.9184771676276685,-1.601427436689871,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.8695312731185745,0.9184771676276685,0.46481087889052375,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.9007313188405941,-0.33063755297424724,0.9184771676276685,1.7895274247278445,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.7541508211092194,0.9184771676276685,0.790710020896414,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-0.41623741273406273,0.9184771676276685,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.474319143180654,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-1.168132361826232,-0.8746607713696127,0.024146916773074168,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.6737898076290296,-1.9727505751602739,0.9184771676276685,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.6143716283422375,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.9007313188405941,-0.29524023815448264,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-1.168132361826232,-1.5144035180971214,0.024146916773074168,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.864869059639166,-0.2020069648531537,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +2.0918105708507295,1.2340222196959683,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.5172894443811066,1.3428208013319771,-0.8701833340815202,1.0158165352161053,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.4890073987627655,-0.8701833340815202,0.9424550121005936,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+-0.9870846309824751,-0.5737317066598784,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,-1.2299961396133003,0.9184771676276685,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.5616057574520994,-0.6518996767241932,-0.8701833340815202,0.712169575197047,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.3459111058026742,0.9184771676276685,1.0158165352161053,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,1.6007034789951051,-1.7645135849361147,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.42228042194752646,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-0.6737898076290296,0.21845001325908134,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.02294689018390335,-0.5846123986652059,0.9184771676276685,0.790710020896414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.5228987293528337,-0.8701833340815202,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 
+0.02294689018390335,0.5118200205651716,0.024146916773074168,0.5493170365187798,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,-0.1517827246962295,-0.8701833340815202,-1.155724360007621,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,0.6870590100188486,0.024146916773074168,-0.6411979185373197,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.951579224650985,0.9184771676276685,-0.8885566148438414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.6737898076290296,-0.31281734787236537,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.8673127908963916,0.9184771676276685,1.7314770061469296,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.9007313188405941,0.5352935227273163,0.9184771676276685,1.2266960166673486,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-1.168132361826232,-0.9505415798891141,-0.8701833340815202,0.2890958389937262,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.7821764432392455,0.6530943982567955,0.024146916773074168,0.9424550121005936,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.5172894443811066,0.3097799469866166,0.9184771676276685,-0.4109124816337716,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +-1.864869059639166,-0.42141070421870225,-0.8701833340815202,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.2878338890420431,-0.03252843544438422,-1.7645135849361147,1.0876002682121086,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.4995586376503998,0.024146916773074168,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.00865627294634702,0.024146916773074168,0.10361395983652989,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-2.1525894467574407,0.024146916773074168,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,-0.440058169608717,0.9184771676276685,-0.8885566148438414,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.7196835879968371,0.0558223552591907,0.9184771676276685,-1.2977804766708128,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,0.5038995200365869,0.9184771676276685,1.5510106274611108,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 
+1.4789131410521794,1.4024373203858955,-0.8701833340815202,-0.524069587159072,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.8233070367656448,0.6313618091255205,-1.7645135849361147,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2903479331695421,-0.6006820720103364,0.9184771676276685,-1.446152233696366,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.442621007216505,-1.7645135849361147,0.790710020896414,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.9077151602490164,-1.7645135849361147,-0.524069587159072,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +2.0918105708507295,1.4257796963318181,0.9184771676276685,0.19764313372674408,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +0.9007313188405941,0.20576854486813365,0.9184771676276685,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5172894443811066,-0.8393567172180595,0.024146916773074168,1.7314770061469296,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 +-1.864869059639166,-1.5006408519337913,-0.8701833340815202,2.1181021700318468,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0 
+0.02294689018390335,0.007370269106689906,0.9184771676276685,0.712169575197047,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,0.06499333610718552,0.9184771676276685,1.0876002682121086,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +-0.2903479331695421,-0.8332257744712419,-0.8701833340815202,-1.0194680969942889,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.9007313188405941,1.605012905652971,-0.8701833340815202,1.2266960166673486,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +1.7083686963912428,0.8978451652912308,0.024146916773074168,-0.09278473996200068,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.2878338890420431,0.24697594883654195,-1.7645135849361147,-0.6411979185373197,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.2140261421940395,1.5806610250226196,-0.8701833340815202,-0.8885566148438414,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.5172894443811066,-0.2240549767296369,-0.8701833340815202,-0.4109124816337716,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.6715312075288409,0.9184771676276685,-1.0194680969942889,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 
+1.4789131410521794,1.2342271202878985,-0.8701833340815202,-0.09278473996200068,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.5362472898802806,-0.6932941575756586,-0.8701833340815202,2.1181021700318468,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.5172894443811066,1.2923143627788825,0.9184771676276685,-0.524069587159072,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0 +0.5172894443811066,-0.41919103550504583,0.9184771676276685,1.157872651879297,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,0.5028156696687675,-0.8701833340815202,0.2890958389937262,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.2903479331695421,-0.5547034960132607,0.9184771676276685,0.006858926312855975,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +0.02294689018390335,-0.28387580895683195,-0.8701833340815202,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.2140261421940395,0.6378876137537329,0.9184771676276685,-0.4109124816337716,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-0.012424657327523568,0.9184771676276685,1.2941289148926294,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0 
+-0.6737898076290296,-0.4243762230642614,0.024146916773074168,-0.301466121277391,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0 +0.9007313188405941,0.604255005380252,0.9184771676276685,0.5493170365187798,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,1.0,0.0,0.0,0.0 +-0.6737898076290296,-1.4161991013145072,0.9184771676276685,0.37810951241256296,0.0,0.0,0.0,1.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 +1.5974680166535271,-0.3459111058026742,0.9184771676276685,-1.2977804766708128,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0 +1.5974680166535271,0.8245084626647526,0.024146916773074168,-0.7625863844258073,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,1.0,0.0,1.0,0.0,0.0 diff --git a/Module6/Credit_Labels.csv b/Module6/Credit_Labels.csv new file mode 100644 index 0000000..d3c17a1 --- /dev/null +++ b/Module6/Credit_Labels.csv @@ -0,0 +1,1001 @@ +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +1 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +0 +1 +1 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +1 +0 +1 +0 +1 +1 +1 +0 +1 +1 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +0 +0 +0 +1 +1 +1 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 
+1 +0 +0 +1 +0 +0 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +1 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +1 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +1 +0 +1 +0 +1 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +0 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +1 +1 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +1 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +0 +1 +0 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +1 +1 +1 +0 +1 +0 +1 +0 +0 +1 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +0 +0 +1 +0 +1 +0 +0 +1 +1 +0 +0 +0 +1 +1 +1 +1 +1 +1 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +1 +1 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +0 +0 +1 +0 +0 +0 +1 +0 +1 +1 +0 +0 +0 +0 +1 +1 +0 +1 +0 +0 +1 +0 +1 +1 +1 +0 +1 +1 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +1 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +1 +0 +1 +0 +0 +1 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +1 +1 +1 +1 +0 +1 +0 +1 +0 +1 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +1 +1 +1 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +1 +1 +1 +0 +0 +1 +1 +0 +1 +1 +0 +0 +0 +0 +1 +0 +1 +0 +0 +0 +1 +0 +0 +1 +1 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 +0 +1 +1 +0 +1 +1 +1 +0 +0 +1 +0 +1 +1 +0 +1 +0 +0 +0 +1 +0 +0 
+0 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +1 +0 +1 +1 +0 +1 +1 +1 +0 +0 +0 +0 +1 +0 +0 +0 +0 +1 +0 +0 +1 +0 +0 +0 +0 +0 +1 +1 +0 +0 +0 +0 +1 +1 +1 +1 +0 +1 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +0 +1 +0 diff --git a/Module6/NaiveBayes.ipynb b/Module6/NaiveBayes.ipynb new file mode 100644 index 0000000..0bb5cac --- /dev/null +++ b/Module6/NaiveBayes.ipynb @@ -0,0 +1,719 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Naive Bayes Models\n", + "\n", + "In this lab you will work with **naive Bayes models**. Naive Bayes models are a useful and effective simplification of general Bayesian models. Naive Bayes models make the naive assumption of statistical independence of the features. In many cases, naive Bayes models are surprisingly effective despite violating this independence assumption. \n", + "\n", + "In simple terms, naive Bayes models use empirical distributions of the features to compute probabilities of the labels. Naive Bayes models can use almost any family of distributions for the features. It is important to select the correct distribution family for the data you are working with. Common cases are:\n", + "- **Gaussian:** for continuous or numerical features.\n", + "- **Bernoulli:** for features with binary values. \n", + "- **Multinomial:** for features with more than two categories. \n", + "\n", + "There is one pitfall: the model fails if a zero probability is encountered. This situation occurs when there is a 'hole' in the sample space where there are no samples. A simple smoothing procedure can deal with this problem. The smoothing hyperparameter, usually called alpha, is one of the few hyperparameters required for naive Bayes models. \n", + "\n", + "Some properties of naive Bayes models are:\n", + "- Computational complexity is linear in the number of parameters/features, making naive Bayes models highly scalable. 
There are out-of-core approaches suitable for massive datasets.\n", + "- Requires minimal data to produce models that generalize well. If there are only a few cases per category available to train a model, a naive Bayes model can be a good choice. \n", + "- Has simple, inherent regularization.\n", + "\n", + "Naive Bayes models are used in many situations, including:\n", + "\n", + "- Document classification\n", + "- Spam detection\n", + "- Image classification \n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example: Iris dataset\n", + "\n", + "As a first example, you will use a naive Bayes model to classify the species of iris flowers. \n", + "\n", + "As a first step, execute the code in the cell below to load the packages required to run the rest of this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "from sklearn import preprocessing\n", + "from sklearn.naive_bayes import GaussianNB, BernoulliNB\n", + "#from statsmodels.api import datasets\n", + "from sklearn import datasets ## Get dataset from sklearn\n", + "import sklearn.model_selection as ms\n", + "import sklearn.metrics as sklm\n", + "import matplotlib.pyplot as plt\n", + "import pandas as pd\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for these data, you will now load and plot them. The code in the cell below does the following:\n", + "\n", + "1. Loads the iris data as a Pandas data frame. \n", + "2. Adds column names to the data frame.\n", + "3. Displays all 4 possible scatter plot views of the data. \n", + "\n", + "Execute this code and examine the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_iris(iris):\n", + " '''Function to plot iris data by type'''\n", + " setosa = iris[iris['Species'] == 'setosa']\n", + " versicolor = iris[iris['Species'] == 'versicolor']\n", + " virginica = iris[iris['Species'] == 'virginica']\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = 'x')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = 'o')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = '+')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + " \n", + "## Import the dataset from sklearn.datasets\n", + "iris = datasets.load_iris()\n", + "\n", + "## Create a data frame from the dictionary\n", + "species = [iris.target_names[x] for x in iris.target]\n", + "iris = pd.DataFrame(iris['data'], columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "iris['Species'] = species\n", + "\n", + "## Plot views of the iris data \n", + "plot_iris(iris) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that Setosa (blue) is well separated from the other two categories. The Versicolor (orange) and the Virginica (green) show considerable overlap. The question is how well our classifier will separate these categories. \n", + "\n", + "Scikit Learn classifiers require numerically coded numpy arrays for the features and as a label. The code in the cell below does the following processing:\n", + "1. Creates a numpy array of the features.\n", + "2. Numerically codes the label using a dictionary lookup, and converts it to a numpy array. \n", + "\n", + "Execute this code." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "Features = np.array(iris[['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']])\n",
+ "\n",
+ "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n",
+ "Labels = np.array([levels[x] for x in iris['Species']])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Next, execute the code in the cell below to split the dataset into training and test sets. Notice that, unusually, 100 of the 150 cases are being used as the test dataset. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "## Randomly sample cases to create independent training and test data\n",
+ "nr.seed(1115)\n",
+ "indx = range(Features.shape[0])\n",
+ "indx = ms.train_test_split(indx, test_size = 100)\n",
+ "X_train = Features[indx[0],:]\n",
+ "y_train = np.ravel(Labels[indx[0]])\n",
+ "X_test = Features[indx[1],:]\n",
+ "y_test = np.ravel(Labels[indx[1]])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As is almost always the case in machine learning, numeric features must be scaled. The code in the cell below performs the following processing:\n",
+ "\n",
+ "1. A Z-score scaler object is defined using the `StandardScaler` function from the scikit-learn preprocessing package. \n",
+ "2. The scaler is fit to the training features. Subsequently, this scaler is used to apply the same scaling to the test data and in production. \n",
+ "3. The training features are scaled using the `transform` method. \n",
+ "\n",
+ "Execute this code."
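+ ,
+ "\n",
+ "\n",
+ "As a quick illustration of what the scaler computes (with made-up numbers, not the lab's data), the Z-score transform subtracts each feature's training mean and divides by its training standard deviation:\n",
+ "\n",
+ "```python\n",
+ "import numpy as np\n",
+ "X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])\n",
+ "X_scaled = (X - X.mean(axis = 0)) / X.std(axis = 0)\n",
+ "print(X_scaled.mean(axis = 0))  # approximately [0. 0.]\n",
+ "print(X_scaled.std(axis = 0))   # approximately [1. 1.]\n",
+ "```\n",
+ "\n",
+ "The scaler object simply remembers the training means and standard deviations so that the identical transform can be applied to the test data."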
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "scale = preprocessing.StandardScaler()\n", + "scale.fit(X_train)\n", + "X_train = scale.transform(X_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now you will define and fit a Gaussian naive Bayes model. A Gaussian model is appropriate here since all of the features are numeric. \n", + "\n", + "The code in the cell below defines a Gaussian naive Bayes model object using the `GaussianNB` function from the scikit-learn naive_bayes package, and then fits the model. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "NB_mod = GaussianNB()\n", + "NB_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the Gaussian naive Bayes model object has only one hyperparameter. \n", + "\n", + "Next, the code in the cell below performs the following processing to score the test data subset:\n", + "1. The test features are scaled using the scaler computed for the training features. \n", + "2. The `predict` method is used to compute the scores from the scaled features. \n", + "\n", + "Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "X_test = scale.transform(X_test)\n", + "scores = NB_mod.predict(X_test)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is time to evaluate the model results. Keep in mind that the problem has been made deliberately difficult, by having more test cases than training cases. \n", + "\n", + "The iris data has three species categories. Therefore it is necessary to use evaluation code for a three category problem. The function in the cell below extends code from previous labs to deal with a three category problem. 
\n",
+ "\n",
+ "Execute this code, examine the results, and answer **Question 1** on the course page."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "def print_metrics_3(labels, scores):\n",
+ "    \n",
+ "    conf = sklm.confusion_matrix(labels, scores)\n",
+ "    print(' Confusion matrix')\n",
+ "    print(' Score Setosa Score Versicolor Score Virginica')\n",
+ "    print('Actual Setosa %6d' % conf[0,0] + ' %5d' % conf[0,1] + ' %5d' % conf[0,2])\n",
+ "    print('Actual Versicolor %6d' % conf[1,0] + ' %5d' % conf[1,1] + ' %5d' % conf[1,2])\n",
+ "    print('Actual Virginica %6d' % conf[2,0] + ' %5d' % conf[2,1] + ' %5d' % conf[2,2])\n",
+ "    ## Now compute and display the accuracy and metrics\n",
+ "    print('')\n",
+ "    print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n",
+ "    metrics = sklm.precision_recall_fscore_support(labels, scores)\n",
+ "    print(' ')\n",
+ "    print(' Setosa Versicolor Virginica')\n",
+ "    print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1] + ' %0.2f' % metrics[3][2])\n",
+ "    print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1] + ' %0.2f' % metrics[0][2])\n",
+ "    print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1] + ' %0.2f' % metrics[1][2])\n",
+ "    print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1] + ' %0.2f' % metrics[2][2])\n",
+ "    \n",
+ "print_metrics_3(y_test, scores) "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Examine these results. Notice the following:\n",
+ "1. The confusion matrix has dimension 3x3. You can see that most cases are correctly classified. \n",
+ "2. The overall accuracy is 0.91. Since the classes are roughly balanced, this metric indicates relatively good performance of the classifier, particularly since it was only trained on 50 cases. As was mentioned previously, naive Bayes models require only small amounts of training data. \n",
+ "3. 
The precision, recall and F1 for each of the classes are relatively good. Versicolor has the worst metrics since it has the largest number of misclassified cases. \n",
+ "\n",
+ "To get a better feel for what the classifier is doing, the code in the cell below displays a set of plots showing correctly (as '+') and incorrectly (as 'o') classified cases, with the species color-coded. Execute this code and examine the results. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": false
+ },
+ "outputs": [],
+ "source": [
+ "def plot_iris_score(iris, y_test, scores):\n",
+ "    '''Function to plot iris data by type'''\n",
+ "    ## Find correctly and incorrectly classified cases\n",
+ "    true = np.equal(scores, y_test).astype(int)\n",
+ "    \n",
+ "    ## Create data frame from the test data\n",
+ "    iris = pd.DataFrame(iris)\n",
+ "    levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n",
+ "    iris['Species'] = [levels[x] for x in y_test]\n",
+ "    iris.columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Species']\n",
+ "    \n",
+ "    ## Set up for the plot\n",
+ "    fig, ax = plt.subplots(2, 2, figsize=(12,12))\n",
+ "    markers = ['o', '+']\n",
+ "    x_ax = ['Sepal_Length', 'Sepal_Width']\n",
+ "    y_ax = ['Petal_Length', 'Petal_Width']\n",
+ "    \n",
+ "    for t in range(2): # loop over correct and incorrect classifications\n",
+ "        setosa = iris[(iris['Species'] == 'setosa') & (true == t)]\n",
+ "        versicolor = iris[(iris['Species'] == 'versicolor') & (true == t)]\n",
+ "        virginica = iris[(iris['Species'] == 'virginica') & (true == t)]\n",
+ "        # loop over all the dimensions\n",
+ "        for i in range(2):\n",
+ "            for j in range(2):\n",
+ "                ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = markers[t], color = 'blue')\n",
+ "                ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = markers[t], color = 'orange')\n",
+ "                ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = markers[t], color = 'green')\n",
+ "                ax[i,j].set_xlabel(x_ax[i])\n",
+ "                ax[i,j].set_ylabel(y_ax[j])\n",
+ "\n",
+ "plot_iris_score(X_test, y_test, scores)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Examine these plots. You can see how the classifier has divided the feature space between the classes. Notice that most of the errors occur in the overlap region between Virginica and Versicolor. This behavior is to be expected. "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Another example\n",
+ "\n",
+ "Now, you will try a more complex example using the credit scoring data. You will use the prepared data which had the following preprocessing:\n",
+ "1. Cleaning missing values.\n",
+ "2. Aggregating categories of certain categorical variables. \n",
+ "3. Encoding categorical variables as binary dummy variables.\n",
+ "4. Standardizing numeric variables. \n",
+ "\n",
+ "Execute the code in the cell below to load the features and labels as numpy arrays for the example. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "Features = np.array(pd.read_csv('Credit_Features.csv'))\n",
+ "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n",
+ "Labels = Labels.reshape(Labels.shape[0],)\n",
+ "print(Features.shape)\n",
+ "print(Labels.shape)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The Features array has both numeric features and binary features (dummy variables for the categorical features). Therefore, a Gaussian model must be used. However, this model is not ideal, since numeric features are mixed with features exhibiting Bernoulli distributions, the binary features. \n",
+ "\n",
+ "The code in the cell below does the following processing:\n",
+ "1. Defines a 10 fold cross validation object. \n",
+ "2. Defines a Gaussian naive Bayes model.\n",
+ "3. Performs a 10 fold cross validation.\n",
+ "4. 
Prints results from the cross validation. \n",
+ "\n",
+ "Execute this code and examine the result. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "nr.seed(321)\n",
+ "cv_folds = ms.KFold(n_splits=10, shuffle = True)\n",
+ " \n",
+ "nr.seed(498)\n",
+ "NB_credit = GaussianNB()\n",
+ "cv_estimate = ms.cross_val_score(NB_credit, Features, Labels, \n",
+ "                                 cv = cv_folds) # Use the KFold object defined above\n",
+ "\n",
+ "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n",
+ "print('STD of the metric = %4.3f' % np.std(cv_estimate))\n",
+ "print('Outcomes by cv fold')\n",
+ "for i, x in enumerate(cv_estimate):\n",
+ "    print('Fold %2d %4.3f' % (i+1, x))"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "nr.seed(498)\n",
+ "NB_credit = GaussianNB()\n",
+ "cv_estimate = ms.cross_val_score(NB_credit, Features, Labels, \n",
+ "                                 cv = 10) # Pass the number of folds directly\n",
+ "\n",
+ "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n",
+ "print('STD of the metric = %4.3f' % np.std(cv_estimate))\n",
+ "print('Outcomes by cv fold')\n",
+ "for i, x in enumerate(cv_estimate):\n",
+ "    print('Fold %2d %4.3f' % (i+1, x))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Examine these results. Notice that the standard deviation of the cross validation metric is more than an order of magnitude smaller than its mean. This indicates that this model is likely to generalize well. \n",
+ "\n",
+ "Now, you will build and test a model using a single split of the dataset. As a first step, execute the code in the cell below to create the training and test datasets."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below defines a naive Bayes model object and then fits the model to the training data. Execute this code:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "NB_credit_mod = GaussianNB() \n", + "NB_credit_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the test data subset, the code in the cell below scores and prints evaluation metrics for the model. \n", + "\n", + "Execute this code, examine the results, and answer **Question 2** on the course page. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + " \n", + "probabilities = NB_credit_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Overall, these performance metrics are poor. Barely half the bad credit risk customers are correctly identified. The reported AUC is quite a bit better than the mean achieved with the 5 fold cross validation. It is likely these figures are optimistic. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Perhaps, a Bernoulli naive Bayes model will work better. 
Since naive Bayes models tend to be less sensitive to the quantity of training data, this approach may be reasonable. To apply this model, the numeric features must be dropped from the array. Execute the code in the cell below to remove the numeric features and examine the resulting sample. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "Features = Features[:,4:]\n",
+ "Features[:3,:]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "You can see that all of the features are binary. A Bernoulli naive Bayes model can now be applied to these features. \n",
+ "\n",
+ "10 fold nested cross validation is used to estimate the optimal hyperparameter and perform model selection for the naive Bayes model. Execute the code in the cell below to define inside and outside fold objects."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "nr.seed(123)\n",
+ "inside = ms.KFold(n_splits=10, shuffle = True)\n",
+ "nr.seed(321)\n",
+ "outside = ms.KFold(n_splits=10, shuffle = True)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "The code in the cell below estimates the best hyperparameter using 10 fold cross validation. There are two points to notice here:\n",
+ "1. In this case, the grid covers a single hyperparameter: alpha, the smoothing parameter that avoids zero probabilities. \n",
+ "2. The model is fit on the grid and the best estimated hyperparameter is printed. \n",
+ "\n",
+ "Execute this code and examine the result. 
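\n",
+ "\n",
+ "The role of alpha can be seen with a little arithmetic (illustrative counts, not the credit data, and only a rough sketch of how BernoulliNB smooths its per-class feature probabilities): if a binary feature is never 1 in some class, its raw probability estimate is zero and it vetoes that class entirely; the smoothed estimate does not.\n",
+ "\n",
+ "```python\n",
+ "count, total, alpha = 0, 50, 1.0                  # feature never seen in this class\n",
+ "p_raw = count / total                             # 0.0 zeroes out the whole product\n",
+ "p_smooth = (count + alpha) / (total + 2 * alpha)  # 1/52, about 0.019\n",
+ "print(p_raw, p_smooth)\n",
+ "```\n",
+ "\n",
+ "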
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(3456)\n", + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {\"alpha\": [0.0001, 0.001, 0.01, 0.1, 1, 10]}\n", + "## Define the NB regression model\n", + "NB_clf = BernoulliNB() \n", + "\n", + "## Perform the grid search over the parameters\n", + "clf = ms.GridSearchCV(estimator = NB_clf, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'roc_auc',\n", + " return_train_score = True)\n", + "clf.fit(Features, Labels)\n", + "print(clf.best_estimator_.alpha)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The smallest alpha hyperparameter has been selected. This indicates that there is very little problem with zero probabilities in this problem. This situation results from the fact that the probability space sampled is dense. \n", + "\n", + "The code in the cell below executes the outer loop of the cross validation to estimate model performance with the optimal hyperparameter. Execute this code and examine the result." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "#NB_credit = BernoulliNB(alpha = clf.best_estimator_.alpha)\n", + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(clf, Features, Labels, \n", + " cv = outside) # Use the outside folds\n", + "\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "print('SDT of the metric = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "At first look, the AUC seems reasonable. The standard deviation is an order of magnitude less than the AUC. 
\n", + "\n", + "Next, sample the dataset into training and testing subsets by executing the code in the cell below." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, execute the code below to fit and score the Bernoulli naive Bayes model and display the performance metrics. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "NB_credit_mod = BernoulliNB(alpha = clf.best_estimator_.alpha) \n", + "NB_credit_mod.fit(X_train, y_train)\n", + "probabilities = NB_credit_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The results for this Bernoulli naive Bayes model are much better than for the Gaussian model. Still, they could be better.\n", + "\n", + "The current model uses the empirical distribution of the label values for the prior value of $p$ of the Bernoulli distribution. This probability is invariably skewed toward the majority case. Since the bank cares more about the minority case, setting this distribution to a fixed prior value can help overcome the class imbalance. The code in the cell below redefines the model object with prior probability of 0.6 for the minority case. \n", + "\n", + "Execute this code, examine the results, and answer **Question 3** on the course page." 
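+ ,
+ "\n",
+ "\n",
+ "The effect of the prior follows directly from Bayes' theorem: the posterior is proportional to likelihood times prior, so shifting prior mass toward the minority class shifts the posteriors, and therefore the scores, toward it. A toy calculation with made-up likelihoods (not values from the credit model):\n",
+ "\n",
+ "```python\n",
+ "lik = [0.6, 0.4]  # P(x | good), P(x | bad) for one case, made-up values\n",
+ "for prior in ([0.7, 0.3], [0.4, 0.6]):  # [P(good), P(bad)]\n",
+ "    post_bad = lik[1] * prior[1] / (lik[0] * prior[0] + lik[1] * prior[1])\n",
+ "    print(post_bad)  # about 0.22 with the first prior, 0.50 with the second\n",
+ "```"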
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "NB_credit_mod = BernoulliNB(alpha = clf.best_estimator_.alpha,\n", + " class_prior = [0.4,0.6]) \n", + "NB_credit_mod.fit(X_train, y_train)\n", + "probabilities = NB_credit_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The majority of bad credit cases are now correctly identified. However, this characteristic is at the cost of a high false negative error rate. Still, given that the cost to the bank of a false negative is five times the cost of a false positive, this may be a good solution. An infinite number of other models are possible by changing the prior distribution. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have accomplished the following:\n", + "1. Used a Gaussian naive model to classify the cases of the iris data. The overall model performance was reasonable. \n", + "2. Fit a Gaussian naive Bayes model on the bank credit data. The performance of this model was poor as a result of many Bernoulli distributed dummy features.\n", + "3. Used a Bernoulli naive Bayes model for the bank credit data by eliminating the numeric features. Overall, this model was much better. \n", + "4. A model skewed toward detecting bad credit cases was created using a prior distribution rather than the empirical distribution of the labels. This model correctly classified a significant number of the bad credit cases. Selecting other prior distributions will give other models. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module6/NeuralNetworks.ipynb b/Module6/NeuralNetworks.ipynb new file mode 100644 index 0000000..c9078de --- /dev/null +++ b/Module6/NeuralNetworks.ipynb @@ -0,0 +1,632 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Classification with Neural Networks\n", + "\n", + "**Neural networks** are a powerful set of machine learning algorithms. Neural network use one or more **hidden layers** of multiple **hidden units** to perform **function approximation**. The use of multiple hidden units in one or more layers, allows neural networks to approximate complex functions. Neural network models capable of approximating complex functions are said to have high **model capacity**. This property allows neural networks to solve complex machine learning problems. \n", + "\n", + "However, because of the large number of hidden units, neural networks have many **weights** or **parameters**. This situation often leads to **over-fitting** of neural network models, which limits generalization. Thus, finding optimal hyperparameters when fitting neural network models is essential for good performance. \n", + "\n", + "An additional issue with neural networks is **computational complexity**. Many optimization iterations are required. Each optimization iteration requires the update of a large number of parameters. 
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example: Iris dataset\n", + "\n", + "As a first example you will use neutral network models to classify the species of iris flowers using the famous iris dataset. \n", + "\n", + "As a first step, execute the code in the cell below to load the required packages to run the rest of this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sklearn.neural_network import MLPClassifier\n", + "from sklearn import preprocessing\n", + "#from statsmodels.api import datasets\n", + "from sklearn import datasets ## Get dataset from sklearn\n", + "import sklearn.model_selection as ms\n", + "import sklearn.metrics as sklm\n", + "import matplotlib.pyplot as plt\n", + "import pandas as pd\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for these data, you will now load and plot them. The code in the cell below does the following:\n", + "\n", + "1. Loads the iris data as a Pandas data frame. \n", + "2. Adds column names to the data frame.\n", + "3. Displays all 4 possible scatter plot views of the data. \n", + "\n", + "Execute this code and examine the results. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_iris(iris):\n", + " '''Function to plot iris data by type'''\n", + " setosa = iris[iris['Species'] == 'setosa']\n", + " versicolor = iris[iris['Species'] == 'versicolor']\n", + " virginica = iris[iris['Species'] == 'virginica']\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = 'x')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = 'o')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = '+')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + " \n", + "## Import the dataset from sklearn.datasets\n", + "iris = datasets.load_iris()\n", + "\n", + "## Create a data frame from the dictionary\n", + "species = [iris.target_names[x] for x in iris.target]\n", + "iris = pd.DataFrame(iris['data'], columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "iris['Species'] = species\n", + "\n", + "## Plot views of the iris data \n", + "plot_iris(iris) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that Setosa (in blue) is well separated from the other two categories. The Versicolor (in orange) and the Virginica (in green) show considerable overlap. The question is how well our classifier will seperate these categories. \n", + "\n", + "Scikit Learn classifiers require numerically coded numpy arrays for the features and as a label. The code in the cell below does the following processing:\n", + "1. Creates a numpy array of the features.\n", + "2. Numerically codes the label using a dictionary lookup, and converts it to a numpy array. \n", + "\n", + "Execute this code." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "Features = np.array(iris[['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']])\n",
+ "\n",
+ "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n",
+ "Labels = np.array([levels[x] for x in iris['Species']])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Next, execute the code in the cell below to split the dataset into training and test sets. Notice that, unusually, 100 of the 150 cases are being used as the test dataset. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "## Randomly sample cases to create independent training and test data\n",
+ "nr.seed(1115)\n",
+ "indx = range(Features.shape[0])\n",
+ "indx = ms.train_test_split(indx, test_size = 100)\n",
+ "X_train = Features[indx[0],:]\n",
+ "y_train = np.ravel(Labels[indx[0]])\n",
+ "X_test = Features[indx[1],:]\n",
+ "y_test = np.ravel(Labels[indx[1]])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As is almost always the case in machine learning, numeric features must be scaled. The code in the cell below performs the following processing:\n",
+ "\n",
+ "1. A Z-score scaler object is defined using the `StandardScaler` function from the Scikit Learn preprocessing package. \n",
+ "2. The scaler is fit to the training features. Subsequently, this scaler is used to apply the same scaling to the test data and in production. \n",
+ "3. The training features are scaled using the `transform` method. \n",
+ "\n",
+ "Execute this code."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "scale = preprocessing.StandardScaler()\n",
+ "scale.fit(X_train)\n",
+ "X_train = scale.transform(X_train)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now you will define and fit a neural network model. The code in the cell below defines a single hidden layer neural network model with 50 units. The code uses the MLPClassifier function from the Scikit Learn neural_network package. The model is then fit. Execute this code."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "nr.seed(1115)\n",
+ "nn_mod = MLPClassifier(hidden_layer_sizes = (50,))\n",
+ "nn_mod.fit(X_train, y_train)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Notice that the neural network model object's many hyperparameters are displayed. Optimizing these parameters for a given situation can be quite time consuming. \n",
+ "\n",
+ "Next, the code in the cell below performs the following processing to score the test data subset:\n",
+ "1. The test features are scaled using the scaler computed for the training features. \n",
+ "2. The `predict` method is used to compute the scores from the scaled features. \n",
+ "\n",
+ "Execute this code. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "X_test = scale.transform(X_test)\n",
+ "scores = nn_mod.predict(X_test)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "It is time to evaluate the model results. Keep in mind that the problem has been made deliberately difficult, by having more test cases than training cases. \n",
+ "\n",
+ "The iris data has three species categories. Therefore it is necessary to use evaluation code for a three category problem. 
The function in the cell below extends code from previous labs to deal with a three category problem. Execute this code and examine the results."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": false
+ },
+ "outputs": [],
+ "source": [
+ "def print_metrics_3(labels, scores):\n",
+ "    \n",
+ "    conf = sklm.confusion_matrix(labels, scores)\n",
+ "    print(' Confusion matrix')\n",
+ "    print(' Score Setosa Score Versicolor Score Virginica')\n",
+ "    print('Actual Setosa %6d' % conf[0,0] + ' %5d' % conf[0,1] + ' %5d' % conf[0,2])\n",
+ "    print('Actual Versicolor %6d' % conf[1,0] + ' %5d' % conf[1,1] + ' %5d' % conf[1,2])\n",
+ "    print('Actual Virginica %6d' % conf[2,0] + ' %5d' % conf[2,1] + ' %5d' % conf[2,2])\n",
+ "    ## Now compute and display the accuracy and metrics\n",
+ "    print('')\n",
+ "    print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n",
+ "    metrics = sklm.precision_recall_fscore_support(labels, scores)\n",
+ "    print(' ')\n",
+ "    print(' Setosa Versicolor Virginica')\n",
+ "    print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1] + ' %0.2f' % metrics[3][2])\n",
+ "    print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1] + ' %0.2f' % metrics[0][2])\n",
+ "    print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1] + ' %0.2f' % metrics[1][2])\n",
+ "    print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1] + ' %0.2f' % metrics[2][2])\n",
+ "    \n",
+ "print_metrics_3(y_test, scores) "
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Examine these results. Notice the following:\n",
+ "1. The confusion matrix has dimension 3x3. You can see that most cases are correctly classified. \n",
+ "2. The overall accuracy is 0.88. Since the classes are roughly balanced, this metric indicates relatively good performance of the classifier, particularly since it was only trained on 50 cases. \n",
+ "3. 
The precision, recall and F1 for each of the classes are relatively good. Versicolor has the worst metrics since it has the largest number of misclassified cases. \n", + "\n", + "To get a better feel for what the classifier is doing, the code in the cell below displays a set of plots showing correctly (as '+') and incorrectly (as 'o') classified cases, with the species color-coded. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_iris_score(iris, y_test, scores):\n", + " '''Function to plot iris data by type'''\n", + " ## Find correctly and incorrectly classified cases\n", + " true = np.equal(scores, y_test).astype(int)\n", + " \n", + " ## Create data frame from the test data\n", + " iris = pd.DataFrame(iris)\n", + " levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n", + " iris['Species'] = [levels[x] for x in y_test]\n", + " iris.columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Species']\n", + " \n", + " ## Set up for the plot\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " markers = ['o', '+']\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " \n", + " for t in range(2): # loop over correct and incorrect classifications\n", + " setosa = iris[(iris['Species'] == 'setosa') & (true == t)]\n", + " versicolor = iris[(iris['Species'] == 'versicolor') & (true == t)]\n", + " virginica = iris[(iris['Species'] == 'virginica') & (true == t)]\n", + " # loop over all the dimensions\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = markers[t], color = 'blue')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = markers[t], color = 'orange')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = markers[t], color = 'green')\n", + " 
ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + "\n", + "plot_iris_score(X_test, y_test, scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these plots. You can see how the classifier has divided the feature space between the classes. Notice that most of the errors occur in the overlap region between Virginica and Versicolor. This behavior is to be expected. There is an error in classifying Setosa which is a bit surprising, and which probably arises from the projection of the division between classes. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Is it possible that a more complex neural network would separate these cases better? The more complex model should have greater model capacity, but will be more susceptible to over-fitting. The code in the cell below uses a neural network with 2 hidden layers and 100 units per layer, coded as (100,100). This model is fit with the training data and displays the evaluation of the model. \n", + "\n", + "Execute this code, and answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "nn_mod = MLPClassifier(hidden_layer_sizes = (100,100),\n", + " max_iter=300)\n", + "nn_mod.fit(X_train, y_train)\n", + "scores = nn_mod.predict(X_test)\n", + "print_metrics_3(y_test, scores) \n", + "plot_iris_score(X_test, y_test, scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These are remarkably good results. Apparently, adding additional model capacity allowed the neural network model to perform exceptionally well. There are only 7 misclassified cases, giving an overall accuracy of 0.93. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Another example\n", + "\n", + "Now, you will try a more complex example using the credit scoring data. 
You will use the prepared data, which had the following preprocessing applied:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizing numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Neural network training is known to be problematic when there is significant class imbalance. Unfortunately, Scikit Learn neural networks have no method for weighting cases. Some alternatives are:\n", + "1. **Synthesize** new minority cases using a statistical algorithm, such as SMOTE. \n", + "2. **Undersample** the majority cases. For this method a number of cases equal to the number of minority cases is randomly sampled from the majority cases. \n", + "3. **Oversample** the minority cases. For this method the minority cases are resampled until their number roughly equals the number of majority cases.\n", + "\n", + "The code in the cell below oversamples the minority cases (bad credit customers). Execute this code to create a data set with balanced cases. 
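Before running the cell below on the real data, the arithmetic of oversampling can be seen on a toy label array (the labels and features here are hypothetical, not the credit data):

```python
import numpy as np

# Hypothetical imbalanced labels: 8 majority (0) and 2 minority (1) cases
labels = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])
features = np.arange(10).reshape(10, 1)  # one dummy feature per case

# Oversample by appending copies of the minority cases
minority_mask = labels == 1
bal_features = np.concatenate([features] + [features[minority_mask]] * 3, axis=0)
bal_labels = np.concatenate([labels] + [labels[minority_mask]] * 3, axis=0)

# The minority class now has 2 + 3*2 = 8 cases, matching the majority
print(np.bincount(bal_labels))  # -> [8 8]
```

The credit data cell below uses the same concatenation idea, duplicating the minority cases once, which roughly (not exactly) balances the two classes.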
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "temp_Labels = Labels[Labels == 1] \n", + "temp_Features = Features[Labels == 1,:]\n", + "temp_Features = np.concatenate((Features, temp_Features), axis = 0)\n", + "temp_Labels = np.concatenate((Labels, temp_Labels), axis = 0) \n", + "\n", + "print(temp_Features.shape)\n", + "print(temp_Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Nested cross validation is used to estimate the optimal hyperparameters and perform model selection for a neural network model. 3 fold cross validation is used since training neural networks is computationally intensive. Additional folds would give better estimates but at the cost of greater computation time. Execute the code in the cell below to define inside and outside fold objects. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(123)\n", + "inside = ms.KFold(n_splits=3, shuffle = True)\n", + "nr.seed(321)\n", + "outside = ms.KFold(n_splits=3, shuffle = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below estimates the best hyperparameters using 3 fold cross validation. In the interest of computational efficiency, values for only 4 parameters will be searched. There are several points to note here:\n", + "1. In this case, a grid of four hyperparameters: \n", + " - **alpha** is the l2 regularization hyperparameter, \n", + " - **early_stopping** determines when the training metric becomes worse following an iteration of the optimization algorithm stops the training at the previous iteration. 
Early stopping is a powerful method to prevent over-fitting of machine learning models in general and neural networks in particular,\n", + " - **beta_1** and **beta_2** are hyperparameters that control the adaptive learning rate used by the **Adam** optimizer,\n", + "2. The model is fit on the grid, and\n", + "3. The best estimated hyperparameters are printed. \n", + "\n", + "The full grid is 3x3x3x2, or 54 elements, which with 3 fold cross validation would require the model to be trained 162 times. To keep the run time manageable, the alpha and early_stopping values are commented out in the code below, leaving a 3x3 grid of 9 elements and 27 trainings. Execute this code and examine the result, but expect execution to take some time. \n", + "\n", + "Once you have executed the code, answer **Question 2** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {#\"alpha\":[0.0000001,0.000001,0.00001], \n", + " #\"early_stopping\":[True, False], \n", + " \"beta_1\":[0.95,0.90,0.80], \n", + " \"beta_2\":[0.999,0.9,0.8]}\n", + "\n", + "## Define the Neural Network model\n", + "nn_clf = MLPClassifier(hidden_layer_sizes = (100,100),\n", + " max_iter=300)\n", + "\n", + "## Perform the grid search over the parameters\n", + "nr.seed(3456)\n", + "nn_clf = ms.GridSearchCV(estimator = nn_clf, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'recall',\n", + " return_train_score = True)\n", + "\n", + "nr.seed(6677)\n", + "nn_clf.fit(temp_Features, temp_Labels)\n", + "#print(nn_clf.best_estimator_.alpha)\n", + "#print(nn_clf.best_estimator_.early_stopping)\n", + "print(nn_clf.best_estimator_.beta_1)\n", + "print(nn_clf.best_estimator_.beta_2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you will run the code in the cell below to perform the outer cross validation of the model. 
The multiple trainings of this model will take some time. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(nn_clf, temp_Features, temp_Labels, \n", + " cv = outside) # Use the outside folds\n", + "\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "print('SDT of the metric = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice that the standard deviation of the mean of Recall is an order of magnitude less than the mean itself. This indicates that this model is likely to generalize well, but the level of performance is still unclear. \n", + "\n", + "Now, you will build and test a model using the estimated optimal hyperparameters. However, there is a complication. The training data subset must have the minority case oversampled. Execute the code in the cell below to create training and testing dataset, with oversampled minority cases for the training subset." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])\n", + "\n", + "## Oversample the minority case for the training data\n", + "y_temp = y_train[y_train == 1] \n", + "X_temp = X_train[y_train == 1,:]\n", + "X_train = np.concatenate((X_train, X_temp), axis = 0)\n", + "y_train = np.concatenate((y_train, y_temp), axis = 0) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below defines a neural network model object using the estimated optimal model hyperparameters and then fits the model to the training data. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "nn_mod = MLPClassifier(hidden_layer_sizes = (100,100), \n", + " #alpha = nn_clf.best_estimator_.alpha, \n", + " #early_stopping = nn_clf.best_estimator_.early_stopping, \n", + " beta_1 = nn_clf.best_estimator_.beta_1, \n", + " beta_2 = nn_clf.best_estimator_.beta_2,\n", + " max_iter = 300)\n", + "nn_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the hyperparameters of the neural network model object reflect those specified. \n", + "\n", + "The code in the cell below scores and prints evaluation metrics for the model, using the test data subset. \n", + "\n", + "Execute this code, examine the results, and answer **Question 3** on the course page. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + " \n", + "probabilities = nn_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The performance of the neural network model is less than ideal. For the negative (bad credit) case the recall is perhaps adequate, but the precision is poor. Perhaps the oversampling does not help much in this case. Challenge yourself - try and perform the undersampling method and compare the result!" 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have accomplished the following:\n", + "1. Used neural network models to classify the cases of the iris data. The model with greater capacity achieved significantly better results. \n", + "2. Used 3 fold cross validation to find estimated optimal hyperparameters for a neural network model to classify credit risk cases. Oversampling of the minority cases for the training data was required to deal with the class imbalance. Despite this approach, the results achieved are marginal at best. Perhaps a model with greater capacity would achieve better results, or a different approach to dealing with class imbalance would be more successful. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module6/SupportVectorMachines.ipynb b/Module6/SupportVectorMachines.ipynb new file mode 100644 index 0000000..aa767dd --- /dev/null +++ b/Module6/SupportVectorMachines.ipynb @@ -0,0 +1,572 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Support Vector Machine Models\n", + "\n", + "**Support vector machines (SVMs)** are a widely used and powerful category of machine learning algorithms. There are many variations on the basic idea of an SVM. An SVM attempts to **maximally separate** classes by finding the separating boundary with the maximum margin; the training cases that determine this boundary are called the **support vectors**. SVMs can use many types of **kernel functions**. 
The most common kernel functions are **linear** and the **radial basis function** or **RBF**. The linear kernel attempts to separate classes by finding hyperplanes in the feature space that maximally separate the classes. The RBF kernel uses a set of local Gaussian-shaped basis functions to find a nonlinear separation of the classes. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Example: Iris dataset\n", + "\n", + "As a first example you will use SVMs to classify the species of iris flowers. \n", + "\n", + "As a first step, execute the code in the cell below to load the required packages to run the rest of this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "from sklearn import svm, preprocessing\n", + "#from statsmodels.api import datasets\n", + "from sklearn import datasets ## Get dataset from sklearn\n", + "import sklearn.model_selection as ms\n", + "import sklearn.metrics as sklm\n", + "import matplotlib.pyplot as plt\n", + "import pandas as pd\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for these data, you will now load and plot them. The code in the cell below does the following:\n", + "\n", + "1. Loads the iris data as a Pandas data frame. \n", + "2. Adds column names to the data frame.\n", + "3. Displays all 4 possible scatter plot views of the data. \n", + "\n", + "Execute this code and examine the results. 
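To make the RBF kernel mentioned above concrete: it scores the similarity of two points as k(x, y) = exp(-gamma * ||x - y||^2), so identical points score 1 and distant points decay toward 0. A minimal numpy sketch, using hypothetical points rather than the iris data:

```python
import numpy as np

def rbf_kernel_value(x, y, gamma=1.0):
    """RBF kernel: exp(-gamma * squared Euclidean distance between x and y)."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-gamma * np.dot(diff, diff))

# Identical points give 1.0; distant points decay toward 0
print(rbf_kernel_value([1.0, 2.0], [1.0, 2.0]))       # -> 1.0
print(rbf_kernel_value([1.0, 2.0], [4.0, 6.0], 0.1))  # exp(-0.1 * 25), about 0.082
```

The gamma hyperparameter controls how quickly similarity decays with distance, which is why it reappears later when tuning the nonlinear SVM.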
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_iris(iris):\n", + " '''Function to plot iris data by type'''\n", + " setosa = iris[iris['Species'] == 'setosa']\n", + " versicolor = iris[iris['Species'] == 'versicolor']\n", + " virginica = iris[iris['Species'] == 'virginica']\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = 'x')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = 'o')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = '+')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + " \n", + "## Import the dataset from sklearn.datasets\n", + "iris = datasets.load_iris()\n", + "\n", + "## Create a data frame from the dictionary\n", + "species = [iris.target_names[x] for x in iris.target]\n", + "iris = pd.DataFrame(iris['data'], columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width'])\n", + "iris['Species'] = species\n", + "\n", + "## Plot views of the iris data \n", + "plot_iris(iris) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can see that Setosa (in blue) is well separated from the other two categories. The Versicolor (in orange) and the Virginica (in green) show considerable overlap. The question is how well our classifier will separate these categories. \n", + "\n", + "Scikit Learn classifiers require numerically coded numpy arrays for the features and as a label. The code in the cell below does the following processing:\n", + "1. Creates a numpy array of the features.\n", + "2. Numerically codes the label using a dictionary lookup, and converts it to a numpy array. \n", + "\n", + "Execute this code." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(iris[['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width']])\n", + "\n", + "levels = {'setosa':0, 'versicolor':1, 'virginica':2}\n", + "Labels = np.array([levels[x] for x in iris['Species']])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to split the dataset into training and test sets. Notice that unusually, 100 of the 150 cases are being used as the test dataset. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 100)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As is always the case with machine learning, numeric features must be scaled. The code in the cell below performs the following processing:\n", + "\n", + "1. A Z-score scale object is defined using the `StandardScaler` function from the Scikit Learn preprocessing package. \n", + "2. The scaler is fit to the training features. Subsequently, this scaler is used to apply the same scaling to the test data and in production. \n", + "3. The training features are scaled using the `transform` method. \n", + "\n", + "Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "scale = preprocessing.StandardScaler()\n", + "scale.fit(X_train)\n", + "X_train = scale.transform(X_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now you will define and fit a linear SVM model. 
The code in the cell below defines a linear SVM object using the `LinearSVC` function from the Scikit Learn SVM package, and then fits the model. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "svm_mod = svm.LinearSVC()\n", + "svm_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the SVM model object hyperparameters are displayed. \n", + "\n", + "Next, the code in the cell below performs the following processing to score the test data subset:\n", + "1. The test features are scaled using the scaler computed for the training features. \n", + "2. The `predict` method is used to compute the scores from the scaled features. \n", + "\n", + "Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "X_test = scale.transform(X_test)\n", + "scores = svm_mod.predict(X_test)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "It is time to evaluate the model results. Keep in mind that the problem has been deliberately made difficult by having more test cases than training cases. \n", + "\n", + "The iris data has three species categories. Therefore it is necessary to use evaluation code for a three category problem. The function in the cell below extends code from previous labs to deal with a three category problem. Execute this code and examine the results." 
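Scikit Learn also ships a one-call summary of per-class precision, recall, and F1, `classification_report`, which is a useful cross-check on the custom printing function below. A quick sketch on toy labels (not the iris results):

```python
from sklearn.metrics import classification_report, accuracy_score

# Toy three-class labels: one versicolor case is misclassified as virginica
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 0, 1, 2, 2, 2]

print(classification_report(y_true, y_pred,
                            target_names=['setosa', 'versicolor', 'virginica']))
print('Accuracy %0.2f' % accuracy_score(y_true, y_pred))
```

The custom function in the next cell presents the same quantities, arranged to match the layout used in earlier labs.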
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "def print_metrics_3(labels, scores):\n", + " \n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score Setosa Score Versicolor Score Virginica')\n", + " print('Actual Setosa %6d' % conf[0,0] + ' %5d' % conf[0,1] + ' %5d' % conf[0,2])\n", + " print('Actual Versicolor %6d' % conf[1,0] + ' %5d' % conf[1,1] + ' %5d' % conf[1,2])\n", + " print('Actual Virginica %6d' % conf[2,0] + ' %5d' % conf[2,1] + ' %5d' % conf[2,2])\n", + " ## Now compute and display the accuracy and metrics\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " print(' ')\n", + " print(' Setosa Versicolor Virginica')\n", + " print('Num case %0.2f' % metrics[3][0] + ' %0.2f' % metrics[3][1] + ' %0.2f' % metrics[3][2])\n", + " print('Precision %0.2f' % metrics[0][0] + ' %0.2f' % metrics[0][1] + ' %0.2f' % metrics[0][2])\n", + " print('Recall %0.2f' % metrics[1][0] + ' %0.2f' % metrics[1][1] + ' %0.2f' % metrics[1][2])\n", + " print('F1 %0.2f' % metrics[2][0] + ' %0.2f' % metrics[2][1] + ' %0.2f' % metrics[2][2])\n", + " \n", + "print_metrics_3(y_test, scores) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice the following:\n", + "1. The confusion matrix has dimension 3X3. You can see that most cases are correctly classified. \n", + "2. The overall accuracy is 0.86. Since the classes are roughly balanced, this metric indicates relatively good performance of the classifier, particularly since it was only trained on 50 cases. \n", + "3. The precision, recall and F1 for each of the classes are relatively good. Versicolor has the worst metrics since it has the largest number of misclassified cases. 
\n", + "\n", + "To get a better feel for what the classifier is doing, the code in the cell below displays a set of plots showing correctly (as '+') and incorrectly (as 'o') cases, with the species color-coded. Execute this code and examine the results. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "def plot_iris_score(iris, y_test, scores):\n", + " '''Function to plot iris data by type'''\n", + " ## Find correctly and incorrectly classified cases\n", + " true = np.equal(scores, y_test).astype(int)\n", + " \n", + " ## Create data frame from the test data\n", + " iris = pd.DataFrame(iris)\n", + " levels = {0:'setosa', 1:'versicolor', 2:'virginica'}\n", + " iris['Species'] = [levels[x] for x in y_test]\n", + " iris.columns = ['Sepal_Length', 'Sepal_Width', 'Petal_Length', 'Petal_Width', 'Species']\n", + " \n", + " ## Set up for the plot\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,12))\n", + " markers = ['o', '+']\n", + " x_ax = ['Sepal_Length', 'Sepal_Width']\n", + " y_ax = ['Petal_Length', 'Petal_Width']\n", + " \n", + " for t in range(2): # loop over correct and incorect classifications\n", + " setosa = iris[(iris['Species'] == 'setosa') & (true == t)]\n", + " versicolor = iris[(iris['Species'] == 'versicolor') & (true == t)]\n", + " virginica = iris[(iris['Species'] == 'virginica') & (true == t)]\n", + " # loop over all the dimensions\n", + " for i in range(2):\n", + " for j in range(2):\n", + " ax[i,j].scatter(setosa[x_ax[i]], setosa[y_ax[j]], marker = markers[t], color = 'blue')\n", + " ax[i,j].scatter(versicolor[x_ax[i]], versicolor[y_ax[j]], marker = markers[t], color = 'orange')\n", + " ax[i,j].scatter(virginica[x_ax[i]], virginica[y_ax[j]], marker = markers[t], color = 'green')\n", + " ax[i,j].set_xlabel(x_ax[i])\n", + " ax[i,j].set_ylabel(y_ax[j])\n", + "\n", + "plot_iris_score(X_test, y_test, scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + 
"source": [ + "Examine these plots. You can see how the classifier has divided the feature space between the classes. Notice that most of the errors occur in the overlap region between Virginica and Versicolor. This behavior is to be expected. There is an error in classifying Setosa which is a bit surprising, and which probably arises from the projection of the division between classes. \n", + "\n", + "Is it possible that a nonlinear SVM would separate these cases better? The code in the cell below uses the `SVC` function to define a nonlinear model using radial basis function. This model is fit with the training data and displays the evaluation of the model. \n", + "\n", + "Execute this code, and answer **Question 1** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "svc_mod = svm.SVC()\n", + "svc_mod.fit(X_train, y_train)\n", + "scores = svm_mod.predict(X_test)\n", + "print_metrics_3(y_test, scores) \n", + "plot_iris_score(X_test, y_test, scores)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results are identical to those obtained with the linear SVM model. Apparently, there is no advantage in a nonlinear SVM for these data. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Another example\n", + "\n", + "Now, you will try a more complex example using the credit scoring data. You will use the prepared data which had the following preprocessing:\n", + "1. Cleaning missing values.\n", + "2. Aggregating categories of certain categorical variables. \n", + "3. Encoding categorical variables as binary dummy variables.\n", + "4. Standardizing of numeric variables. \n", + "\n", + "Execute the code in the cell below to load the features and labels as numpy arrays for the example. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Credit_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Credit_Labels.csv'))\n", + "Labels = Labels.reshape(Labels.shape[0],)\n", + "print(Features.shape)\n", + "print(Labels.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Nested cross validation is used to estimate the optimal hyperparameters and perform model selection for the nonlinear SVM model. 5 fold cross validation is used since training SVMs are computationally intensive to train. Additional folds would give better estimates but at the cost of greater computation time. Execute the code in the cell below to define inside and outside fold objects. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(123)\n", + "inside = ms.KFold(n_splits=5, shuffle = True)\n", + "nr.seed(321)\n", + "outside = ms.KFold(n_splits=5, shuffle = True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below estimates the best hyperparameters using 5 fold cross validation. There are two points to notice here:\n", + "1. In this case, a grid of two hyperparameters: C is the inverse of lambda of l2 regularization, and gamma is the span of the RBF kernel. \n", + "2. Since there is a class imbalance and a difference in the cost to the bank of misclassification of a bad credit risk customer, class weights are used. \n", + "3. The model is fit on the grid and the best estimated hyperparameters are printed. \n", + "\n", + "Execute this code, examine the result, and answer **Question 2** on the course page. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(3456)\n", + "## Define the dictionary for the grid search and the model object to search on\n", + "param_grid = {\"C\": [1, 10, 100, 1000], \"gamma\":[1.0/50.0, 1.0/200.0, 1.0/500.0, 1.0/1000.0]}\n", + "## Define the SVM model\n", + "svc_clf = svm.SVC(class_weight = {0:0.33, 1:0.67}) \n", + "\n", + "## Perform the grid search over the parameters\n", + "clf = ms.GridSearchCV(estimator = svc_clf, param_grid = param_grid, \n", + " cv = inside, # Use the inside folds\n", + " scoring = 'roc_auc',\n", + " return_train_score = True)\n", + "clf.fit(Features, Labels)\n", + "print(clf.best_estimator_.C)\n", + "print(clf.best_estimator_.gamma)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, you will run the code in the cell below to perform the outer cross validation of the model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(498)\n", + "cv_estimate = ms.cross_val_score(clf, Features, Labels, \n", + " cv = outside) # Use the outside folds\n", + "\n", + "print('Mean performance metric = %4.3f' % np.mean(cv_estimate))\n", + "print('SDT of the metric = %4.3f' % np.std(cv_estimate))\n", + "print('Outcomes by cv fold')\n", + "for i, x in enumerate(cv_estimate):\n", + " print('Fold %2d %4.3f' % (i+1, x))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results. Notice that the standard deviation of the mean of the AUC is more than an order of magnitude smaller than the mean. This indicates that this model is likely to generalize well. \n", + "\n", + "Now, you will build and test a model using the estimated optimal hyperparameters. As a first step, execute the code in the cell below to create training and testing dataset." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "## Randomly sample cases to create independent training and test data\n", + "nr.seed(1115)\n", + "indx = range(Features.shape[0])\n", + "indx = ms.train_test_split(indx, test_size = 300)\n", + "X_train = Features[indx[0],:]\n", + "y_train = np.ravel(Labels[indx[0]])\n", + "X_test = Features[indx[1],:]\n", + "y_test = np.ravel(Labels[indx[1]])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below defines a nonlinear SVM model object using the estimated optimal model hyperparameters and then fits the model to the training data. Execute this code." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(1115)\n", + "svm_mod = svm.SVC(C = clf.best_estimator_.C,\n", + " gamma = clf.best_estimator_.gamma,\n", + " class_weight = {0:0.33, 1:0.67},\n", + " probability=True) \n", + "svm_mod.fit(X_train, y_train)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the hyperparameters of the SVM model object reflect those specified. \n", + "\n", + "The code in the cell below scores and prints evaluation metrics for the model, using the test data subset. \n", + "\n", + "Execute this code, examine the results, and answer **Question 3** on the course page." 
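The scoring step in the next cell just thresholds the positive-class probability column. In isolation, with toy probabilities rather than model output, the behavior looks like this:

```python
import numpy as np

def score_model(probs, threshold):
    # probs columns are [P(class 0), P(class 1)]; score 1 when P(class 1) > threshold
    return np.array([1 if p > threshold else 0 for p in probs[:, 1]])

probs = np.array([[0.9, 0.1],
                  [0.4, 0.6],
                  [0.2, 0.8]])
print(score_model(probs, 0.5))  # -> [0 1 1]
print(score_model(probs, 0.7))  # raising the threshold flips the middle case to 0
```

Adjusting the threshold is one way to trade false positives against false negatives when misclassification costs are asymmetric, as with bad credit risks here.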
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def score_model(probs, threshold):\n", + " return np.array([1 if x > threshold else 0 for x in probs[:,1]])\n", + "\n", + "def print_metrics(labels, probs, threshold):\n", + " scores = score_model(probs, threshold)\n", + " metrics = sklm.precision_recall_fscore_support(labels, scores)\n", + " conf = sklm.confusion_matrix(labels, scores)\n", + " print(' Confusion matrix')\n", + " print(' Score positive Score negative')\n", + " print('Actual positive %6d' % conf[0,0] + ' %5d' % conf[0,1])\n", + " print('Actual negative %6d' % conf[1,0] + ' %5d' % conf[1,1])\n", + " print('')\n", + " print('Accuracy %0.2f' % sklm.accuracy_score(labels, scores))\n", + " print('AUC %0.2f' % sklm.roc_auc_score(labels, probs[:,1]))\n", + " print('Macro precision %0.2f' % float((float(metrics[0][0]) + float(metrics[0][1]))/2.0))\n", + " print('Macro recall %0.2f' % float((float(metrics[1][0]) + float(metrics[1][1]))/2.0))\n", + " print(' ')\n", + " print(' Positive Negative')\n", + " print('Num case %6d' % metrics[3][0] + ' %6d' % metrics[3][1])\n", + " print('Precision %6.2f' % metrics[0][0] + ' %6.2f' % metrics[0][1])\n", + " print('Recall %6.2f' % metrics[1][0] + ' %6.2f' % metrics[1][1])\n", + " print('F1 %6.2f' % metrics[2][0] + ' %6.2f' % metrics[2][1])\n", + " \n", + "probabilities = svm_mod.predict_proba(X_test)\n", + "print_metrics(y_test, probabilities, 0.5) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Overall, these performance metrics seem acceptable. A large majority of high credit risk customers are identified, but at the cost of a large number of false positives and a low precision for the negative cases. However, one should be cautious. The reported AUC is quite a bit better than the mean achieved with the 5 fold cross validation. It is likely these figures are optimistic. 
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have accomplished the following:\n", + "1. Used a linear and nonlinear SVM model to classify the cases of the iris data. For this particular problem there was no difference between the linear and nonlinear models. \n", + "2. Used 5 fold to find estimated optimal hyperparameters for a nonlinear SVM model to classify credit risk cases. The model appears to generalize well. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module7/ApplicationOfClustering.ipynb b/Module7/ApplicationOfClustering.ipynb new file mode 100644 index 0000000..830f8f3 --- /dev/null +++ b/Module7/ApplicationOfClustering.ipynb @@ -0,0 +1,569 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Applying Cluster Models\n", + "\n", + "In this lab you will apply K-means and agglomerative to clustering to finding structure in the automotive data set. Finding meaningful clusters in such a complex data set will prove challenging. The challenge is two-fold. First, the optimal number of clusters must be determined. Then the clusters must be interpreted in some useful manner. These challenges are typical of unsupervised learning. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prepare the dataset\n", + "\n", + "Before you start building and evaluating cluster models, the dataset must be prepared. 
First, execute the code in the cell below to load the packages required to run the rest of this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "import numpy as np\n", + "import numpy.random as nr\n", + "from sklearn.cluster import KMeans, AgglomerativeClustering\n", + "from sklearn.metrics import silhouette_score\n", + "from sklearn.preprocessing import StandardScaler\n", + "import matplotlib.pyplot as plt\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below loads a prepared version of the autos dataset which has had the following preprocessing:\n", + "1. Clean missing values.\n", + "2. Aggregate categories of certain categorical variables. \n", + "3. Encode categorical variables as binary dummy variables.\n", + "4. Standardize numeric variables. \n", + "\n", + "However, for this case, some additional processing is required:\n", + "1. The log of the Label vector is taken. You know from previous analysis that the log of the Label (price) values is closer to being linearly related to several of the numeric features. Further, the log of the Label is closer to Normally distributed. \n", + "2. The Label value is Z-score standardized to ensure the variance is 1.0 and the mean 0.0. As with all machine learning, scaling is vital in cluster analysis to avoid bias in the solution. \n", + "3. The vector of scaled log transformed Label values is appended to the features to create the complete dataset. \n", + "\n", + "Execute the code in the cell below to perform the required processing to create the dataset. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Features = np.array(pd.read_csv('Auto_Data_Features.csv'))\n", + "Labels = np.array(pd.read_csv('Auto_Data_Labels.csv'))\n", + "Labels = np.log(Labels)\n", + "scaler = StandardScaler()\n", + "Labels = scaler.fit_transform(Labels)\n", + "Auto_Data = np.concatenate((Features,Labels), 1)\n", + "print(Auto_Data.shape)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the dataset has 46 columns (dimensions) for a small number of cases, 195. The small number of rows compared to the number of features adds to the challenge of this problem. \n", + "\n", + "In order to create meaningful visualizations of the cluster assignments a version of the dataset in the original units. The code in the cell below loads and preforms preparation on the original automotive dataset. Execute this code. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices = pd.read_csv('Automobile price data _Raw_.csv')\n", + "\n", + "def clean_auto_data(auto_prices):\n", + " 'Function to load the auto price data set from a .csv file' \n", + " import pandas as pd\n", + " import numpy as np\n", + " \n", + " ## Remove rows with missing values, accounting for mising values coded as '?'\n", + " cols = ['price', 'bore', 'stroke', \n", + " 'horsepower', 'peak-rpm']\n", + " for column in cols:\n", + " auto_prices.loc[auto_prices[column] == '?', column] = np.nan\n", + " auto_prices.dropna(axis = 0, inplace = True)\n", + "\n", + " ## Convert some columns to numeric values\n", + " for column in cols:\n", + " auto_prices[column] = pd.to_numeric(auto_prices[column])\n", + " \n", + " ## fix column names so the '-' character becomes '_'\n", + " cols = auto_prices.columns\n", + " auto_prices.columns = [str.replace('-', '_') for str in cols]\n", + " \n", + " return auto_prices\n", + "auto_prices = 
clean_auto_data(auto_prices)\n", + "\n", + "print(auto_prices.columns)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "So that you can view the cluster assignments in the same frame as the clustering algorithms view them the price column should be log transformed. Execute the code in the cell below to apply this transformation. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "auto_prices['price'] = np.log(auto_prices['price'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There is one last bit of preparation required. A list of marker shapes for the assignment visualization is created by the code below. Markers are assigned from a dictionary using a key that is a tuple of fuel type and aspiration. Execute the code in the cell below to create this list. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "marker_dic = {('gas','std'):'o', ('gas','turbo'):'s', ('diesel','std'):'x', ('diesel','turbo'):'^'}\n", + "markers = [marker_dic[(x,y)] for x,y in zip(auto_prices['fuel_type'], auto_prices['aspiration'])]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply K-means clustering\n", + "\n", + "With the data prepared, you will now create and evaluate a series of K-means clustering models applied to the automotive data set. \n", + "\n", + "The code in the cell below performs the following processing:\n", + "1. A dictionary is defined for mapping cluster assignment numbers to colors for the assignment visualization.\n", + "2. A K=2 K-means cluster model is defined. \n", + "3. The data is fit to the cluster model and assignments are computed. \n", + "4. The assignments are mapped to a list of colors. \n", + "\n", + "Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(2233)\n", + "col_dic = {0:'blue',1:'green',2:'orange',3:'gray',4:'magenta',5:'black'}\n", + "kmeans_2 = KMeans(n_clusters=2, random_state=0)\n", + "assignments_km2 = kmeans_2.fit_predict(Auto_Data)\n", + "assign_color_km2 = [col_dic[x] for x in assignments_km2]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, the code in the cell below plots four views of the cluster assignments. With high dimensional data many views are possible. However, given limits of perception it is often best to select a few meaningful views. In this case 5 numeric columns and 2 categorical variables are displayed, for a total of 7 of 45 possible dimensions. \n", + "\n", + "The function in the cell below performs the following processing:\n", + "1. Lists of numeric columns are displayed.\n", + "2. The outer loop iterates over the column paris, the indices of the 4 plot axes. \n", + "3. The inner loop iterates over the x and y coordinates, the color and marker shape. The points are plotted inside this loop\n", + "4. The annotation for the plot are added.\n", + "\n", + "Execute this code to display the cluster assignments for the K=2 model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_auto_cluster(auto_prices, assign_color, markers):\n", + " fig, ax = plt.subplots(2, 2, figsize=(12,11)) # define plot area \n", + " x_cols = ['city_mpg', 'curb_weight', 'curb_weight', 'horsepower']\n", + " y_cols = ['price', 'price', 'city_mpg', 'price']\n", + " for x_col,y_col,i,j in zip(x_cols,y_cols,[0,0,1,1],[0,1,0,1]):\n", + " for x,y,c,m in zip(auto_prices[x_col], auto_prices[y_col], assign_color, markers):\n", + " ax[i,j].scatter(x,y, color = c, marker = m)\n", + " ax[i,j].set_title('Scatter plot of ' + y_col + ' vs. 
' + x_col) # Give the plot a main title\n", + " ax[i,j].set_xlabel(x_col) # Set text for the x axis\n", + " ax[i,j].set_ylabel(y_col)# Set text for y axis\n", + " plt.show()\n", + "\n", + "plot_auto_cluster(auto_prices, assign_color_km2, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The K=2 clustering model has divided the data between high price, low fuel efficiency, high weight and high horsepower autos and ones that have the opposite characteristics. While this clustering is interesting, it can hardly be considered surprising. \n", + "\n", + "Next, execute the code in the cell below to compute and display the cluster assignments for the K=3 model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(4455)\n", + "kmeans_3 = KMeans(n_clusters=3, random_state=0)\n", + "assignments_km3 = kmeans_3.fit_predict(Auto_Data)\n", + "assign_color_km3 = [col_dic[x] for x in assignments_km3]\n", + "plot_auto_cluster(auto_prices, assign_color_km3, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The basic divisions of the dataset between the clusters are similar to the K=2 model case. However, there is significantly more overlap of these clusters than for the K=2 case. The standard aspiration autos are all in the cluster shown in blue. Beyond this, it is not completely clear what new information on the structure of this dataset has been learned. \n", + "\n", + "Execute the code in the cell below to compute and display the cluster assignments for the K=4 model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(223)\n", + "kmeans_4 = KMeans(n_clusters=4, random_state=0)\n", + "assignments_km4 = kmeans_4.fit_predict(Auto_Data)\n", + "assign_color_km4 = [col_dic[x] for x in assignments_km4]\n", + "plot_auto_cluster(auto_prices, assign_color_km4, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "There appears to be less overlap when compared to the k=4 model. Further, some additional interesting structure is starting to emerge. Primary divisions of these clusters are by price, weight, fuel efficiency and horsepower. All of the diesel autos are in two clusters, one with high cost, weight and horse power in blue, and one for lower cost, weight and horse power in orange. \n", + "\n", + "Execute the code in the cell below to compute and display the cluster assignments for a K=5 model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(4443)\n", + "kmeans_5 = KMeans(n_clusters=5, random_state=0)\n", + "assignments_km5 = kmeans_5.fit_predict(Auto_Data)\n", + "assign_color_km5 = [col_dic[x] for x in assignments_km5]\n", + "plot_auto_cluster(auto_prices, assign_color_km5, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The structure of these clusters is rather complex. The general pattern is similar to the K=4 model, but with finer grained divisions between the clusters. \n", + "\n", + "Finally, execute the code in the cell below to compute and display the class assignments for the K=6 model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "nr.seed(2288)\n", + "kmeans_6 = KMeans(n_clusters=6, random_state=0)\n", + "assignments_km6 = kmeans_6.fit_predict(Auto_Data)\n", + "assign_color_km6 = [col_dic[x] for x in assignments_km6]\n", + "plot_auto_cluster(auto_prices, assign_color_km6, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The structure of these clusters follows the general pattern of the K=4 and K=5 models. The difference being that there is a finer grained divisions between the clusters.\n", + "\n", + "While these visualizations are interesting, it is hard to select a best model based on just this evidence. To establish a quantitative basis for model selection, you will now compute and compare the within cluster sum of squares (WCSS), between cluster sum of squares (BCSS) and silhouette coefficient (SC) metrics. Execute the code in the cell below and examine the results." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "km_models = [kmeans_2, kmeans_3, kmeans_4, kmeans_5, kmeans_6]\n", + "\n", + "def plot_WCSS_km(km_models, samples):\n", + " fig, ax = plt.subplots(1, 2, figsize=(12,5))\n", + " \n", + " ## Plot WCSS\n", + " wcss = [mod.inertia_ for mod in km_models]\n", + " n_clusts = range(2,len(wcss) + 2)\n", + " ax[0].bar(n_clusts, wcss)\n", + " ax[0].set_xlabel('Number of clusters')\n", + " ax[0].set_ylabel('WCSS')\n", + " \n", + " ## Plot BCSS\n", + " ## Compute BCSS as TSS - WCSS \n", + " n_1 = (float(samples.shape[0]) * float(samples.shape[1])) - 1.0\n", + " tss = n_1 * np.var(samples)\n", + " bcss = [tss - x for x in wcss]\n", + " ax[1].bar(n_clusts, bcss)\n", + " ax[1].set_xlabel('Number of clusters')\n", + " ax[1].set_ylabel('BCSS')\n", + " plt.show()\n", + " \n", + "\n", + "plot_WCSS_km(km_models, Auto_Data)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The WCSS decreases with cluster number, rapidly at first. The BCSS increases with cluster number, again rapidly at first. These results indicate that higher numbers of clusters create models that are better at separating the clusters. \n", + "\n", + "Now, execute the code in the cell below to compute and display the SC for each of the cluster models. \n", + "\n", + "Then, answer **Question 1** on the course page." 
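`silhouette_score` takes the data plus a single assignment vector, so comparing candidate models is just a loop over their assignments. A sketch on three synthetic blobs, where the best K is known by construction:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.RandomState(2)
X = np.vstack([rng.normal(c, 0.3, (15, 2)) for c in (0, 5, 10)])  # three clear blobs

scores = {}
for k in (2, 3, 4):
    labels = KMeans(n_clusters=k, random_state=0, n_init=10).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print(best_k)  # the three-blob structure should make K=3 the top scorer
```

SC values near 1 indicate well separated clusters; the low values seen for the auto data reflect heavily overlapping structure.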
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "assignment_list = [assignments_km2, assignments_km3, assignments_km4, assignments_km5, assignments_km6]\n", + "\n", + "def plot_sillohette(samples, assignments, x_lab = 'Number of clusters'):\n", + " silhouette = [silhouette_score(samples, a) for a in assignments]\n", + " n_clusts = range(2, len(silhouette) + 2)\n", + " plt.bar(n_clusts, silhouette)\n", + " plt.xlabel(x_lab)\n", + " plt.ylabel('SC')\n", + " plt.show()\n", + "\n", + "plot_sillohette(Auto_Data, assignment_list)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The SC is highest for the K=6 model, but only marginally higher than for K=5. The K=5 and K=6 models have significantly higher SC than the K=2, 3 or 4 models. However, all these SC values are fairly low.\n", + "\n", + "Overall, it appears that the K=6 model is the best in terms of these metrics. It also appears that there is an improvement over the K=5 model." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Apply agglomerative clustering\n", + "\n", + "Having tried the K-means clustering model with various numbers of clusters, you will now try agglomerative clustering models. You will compare these models using both visualization and the SC metric. \n", + "\n", + "The code in the cell below computes a 2 cluster agglomerative model and displays the cluster assignments. Execute this code. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(2233)\n", + "agc_2 = AgglomerativeClustering(n_clusters=2)\n", + "assignments_ag2 = agc_2.fit_predict(Auto_Data)\n", + "assign_color_ag2 = [col_dic[x] for x in assignments_ag2]\n", + "plot_auto_cluster(auto_prices, assign_color_ag2, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine the above plots and compare them to the cluster assignments for the K=2 K-means model. Whereas the K-means model created an approximately even split of the dataset, the agglomerative clustering model has placed the majority of points in one cluster. There is considerable overlap in these views of the assignments for the agglomerative clustering model.\n", + "\n", + "Next, execute the code in the cell below to compute and display the assignments for the 3 cluster agglomerative model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(4433)\n", + "agc_3 = AgglomerativeClustering(n_clusters=3)\n", + "assignments_ag3 = agc_3.fit_predict(Auto_Data)\n", + "assign_color_ag3 = [col_dic[x] for x in assignments_ag3]\n", + "plot_auto_cluster(auto_prices, assign_color_ag3, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these plots and compare them to the 2 cluster model. It appears the 3 cluster model has split the larger cluster, but with considerable overlap in these views. \n", + "\n", + "Execute the code in the cell below to compute and display the cluster assignments for the 4 cluster agglomerative model. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(2663)\n", + "agc_4 = AgglomerativeClustering(n_clusters=4)\n", + "assignments_ag4 = agc_4.fit_predict(Auto_Data)\n", + "assign_color_ag4 = [col_dic[x] for x in assignments_ag4]\n", + "plot_auto_cluster(auto_prices, assign_color_ag4, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare these cluster assignments to the 3 cluster model. Notice that low weight, low horsepower and low cost autos have been split into two clusters. Further, all diesel cars are in two clusters. \n", + "\n", + "Execute the code in the cell below to compute and display the cluster assignments for a 5 cluster model." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(6233)\n", + "agc_5 = AgglomerativeClustering(n_clusters=5)\n", + "assignments_ag5 = agc_5.fit_predict(Auto_Data)\n", + "assign_color_ag5 = [col_dic[x] for x in assignments_ag5]\n", + "plot_auto_cluster(auto_prices, assign_color_ag5, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results are complex and hard to interpret. The data are now in multiple clusters with considerable overlap. Still, some patterns are visible with distinct fuel-aspiration combinations divided into distinct clusters, for example. \n", + "\n", + "Finally, execute the code in the cell below to compute and display the assignments for the 6 cluster agglomerative model." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(2288)\n", + "agc_6 = AgglomerativeClustering(n_clusters=6)\n", + "assignments_ag6 = agc_6.fit_predict(Auto_Data)\n", + "assign_color_ag6 = [col_dic[x] for x in assignments_ag6]\n", + "plot_auto_cluster(auto_prices, assign_color_ag6, markers)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These results appear similar to the 5 cluster agglomerative model. Some clusters are clearly separated but there is considerable overlap in other cases. \n", + "\n", + "Finally, execute the code in the cell below to compute and display the SC for the agglomerative clustering models. \n", + "\n", + "Then, answer **Question 2** on the course page." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "assignment_list = [assignments_ag2, assignments_ag3, assignments_ag4, assignments_ag5, assignments_ag6]\n", + "plot_sillohette(Auto_Data, assignment_list)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The 5 cluster agglomerative model has the largest SC. The SC for this model is marginally higher than that of the 6 cluster model. It appears that the 5 cluster model is preferred. None of these models has a particularly high SC." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have computed, evaluated and compared K-means and agglomerative clustering models with 2, 3, 4, 5 and 6 clusters applied to the automotive dataset. As is often the case with unsupervised learning, it has proven difficult to compare models. It is also challenging to determine the most interesting aspects of data structure discovered by the clustering process. \n", + "\n", + "Specifically, your analysis discovered:\n", + "1. The K=6 model appears to be the best of the K-means clustering models tried. 
Some interesting structure was revealed in this analysis, but overall the improvement in BCSS and the SC values were relatively low. \n", + "2. The 5 cluster agglomerative model appears to be the best of those tried. As with the K-means model, some interesting structure was revealed, but the SC values were relatively low. \n", + "\n", + "Cluster analysis of the automotive data can be extended in a number of ways, including:\n", + "1. Use larger numbers of clusters to determine if finer groupings reveal structure. \n", + "2. For the agglomerative clustering model, try other linkage functions and distance metrics. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.4" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/Module7/Auto_Data_Features.csv b/Module7/Auto_Data_Features.csv new file mode 100644 index 0000000..69d1d18 --- /dev/null +++ b/Module7/Auto_Data_Features.csv @@ -0,0 +1,196 @@ +0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44 +-1.683439128405708,-0.43850391155869184,-0.8397490803384212,-2.1172454168040913,-0.02101769012578214,0.2045987001229315,0.5185550398228566,-1.8202776060185226,-0.29493307013241693,-0.6851049824879971,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0
+-1.683439128405708,-0.43850391155869184,-0.8397490803384212,-2.1172454168040913,-0.02101769012578214,0.2045987001229315,0.5185550398228566,-1.8202776060185226,-0.29493307013241693,-0.6851049824879971,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +-0.7188028512711508,-0.24564625663014425,-0.1815478201248432,-0.6113626583476852,0.5044245630187714,1.3429929274160832,-2.394771141238803,0.7012021638506771,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.14773482140904332,0.18828346695909237,0.14755280998194586,0.18340879750430586,-0.4241752007203305,-0.03366985907796072,-0.5140162648572261,0.4777799057610003,-0.04812185789389084,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.0823357517728036,0.18828346695909237,0.2415815614410294,0.18340879750430586,0.5063352621211152,0.31049583754555027,-0.5140162648572261,0.4777799057610003,-0.541744282370943,-1.1549600994346305,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.14773482140904332,0.244533616313254,0.1945671857114843,-0.3185521219811608,-0.09935635332187921,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.1287208659526597,1.4820369021047775,2.5923003479180844,0.7690298702373548,0.5445492441679919,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.1287208659526597,1.4820369021047775,2.5923003479180844,0.7690298702373548,0.7547261454258133,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.1287208659526597,1.4820369021047775,2.5923003479180844,0.8526900234849311,1.0069384269351989,0.9723529464369175,-0.7352815444315295,0.4777799057610003,-0.46770091869938496,-1.311578471750175,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,-0.3133546527843883,-0.060144143433615405,0.6291876796100074,-1.4372680207219364,-0.3442953125801219,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,-0.3133546527843883,-0.060144143433615405,0.6291876796100074,-1.4372680207219364,-0.3442953125801219,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,0.28851556445391846,0.4693415436794784,-0.07148570570861922,-0.19248686850802735,-0.29493307013241693,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.3766315651358881,0.20435493820313955,-0.5106484502316322,0.18340879750430586,0.39360401508282916,0.4693415436794784,-0.07148570570861922,-0.19248686850802735,-0.29493307013241693,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.752676215544274,1.1847146840899314,0.4766534400887349,0.7690298702373548,0.9477067547625402,0.4693415436794784,-0.07148570570861922,-0.19248686850802735,-0.29493307013241693,-0.8417233548035415,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.752676215544274,1.1847146840899314,0.4766534400887349,0.7690298702373548,1.2820790976727106,2.0842728893744145,1.0717182387586144,0.4458624403196187,-0.541744282370943,-1.4681968440657196,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.752676215544274,1.5704299939470312,0.9467971973841459,-0.06757166223842598,1.5686839630242853,2.0842728893744145,1.0717182387586144,0.4458624403196187,-0.541744282370943,-1.4681968440657196,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.8154110971331918,1.8275735338517627,2.357228469270379,1.0200103299800867,1.807521350817264,2.0842728893744145,1.0717182387586144,0.4458624403196187,-0.541744282370943,-1.6248152163812641,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-1.7161386632238267,-2.6644026788590343,-2.6262953580609816,-0.27672204535737116,-2.046358738610243,-1.463281214283314,-1.546587569537307,-0.7031663155701442,-0.17152746401315389,3.38697269771616,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-1.4751138067996454,-1.0748209589861235,-0.7786829648428408,-1.3088288851055243,-0.8808469584589108,-1.104057010388702,-0.44782659203908576,-0.14684634278930137,1.9774073468762596,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-1.242077473760981,-1.0748209589861235,-0.7786829648428408,-1.2419544165234901,-0.8808469584589108,-1.104057010388702,-0.44782659203908576,-0.14684634278930137,1.9774073468762596,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-1.3050074869008366,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.19374047311462117,1.820788974560715,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-1.3050074869008366,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-0.8235113131101912,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.131133868587548,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.0890984883359838,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.0890984883359838,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-0.7031372696625298,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.719976680726153,0.02756875451863454,-0.6046772016907158,2.4840630118127027,-0.04585677845625195,-0.4043098400571264,0.039146934078531676,0.6692846984092942,-0.4183386762516799,-0.21524986554136352,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.4814961737906454,1.104724368215191,0.9979631455671799,2.073653177830114,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.0104344765869135,-2.383151932088233,-0.9337778317975015,-1.2806438843283103,-1.616451440582881,-1.1985383707267672,-1.546587569537307,0.5096973712023833,-0.14684634278930137,3.700209442347249,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-2.0104344765869135,-2.383151932088233,-0.9337778317975015,-1.2806438843283103,-1.413917335734435,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.8496009905436326,-1.9492222084989965,-0.8867634560679597,-0.527702505100106,-1.379524751892246,-1.1455898020154578,-1.546587569537307,-0.575496453804615,-0.023440736670038324,1.9774073468762596,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.9492222084989965,-0.8867634560679597,-0.527702505100106,-1.1827227443508315,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.9492222084989965,-0.8867634560679597,-0.527702505100106,-1.1521515587133302,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.39180750308994533,-0.8724336351479285,-0.8867634560679597,0.26706895075188514,-1.0489738071867631,-0.7220012523249827,-1.546587569537307,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,-1.3786849793353715,-0.9337778317975015,1.8566118624558674,-1.0222240197539496,-0.7220012523249827,-1.5097100229415907,0.5096973712023833,-0.24557082768471186,0.7244603683519035,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.39180750308994533,-0.5429684746449903,-0.32259094731346516,-0.2348919687335845,-0.6171558100570574,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.39180750308994533,-0.5429684746449903,-0.32259094731346516,-0.2348919687335845,-0.5158887576328344,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.39180750308994533,0.09185463949481859,-0.32259094731346516,0.09974864425672958,-0.4872282710976769,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,0.09185463949481859,-1.5919790920110761,0.09974864425672958,-0.3573007321382964,-0.45725840876843576,-0.661526451240095,1.0522942837058817,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.39180750308994533,0.09185463949481859,-0.32259094731346516,0.09974864425672958,-0.17960571562032013,-0.060144143433615405,-0.661526451240095,1.0522942837058817,-0.29493307013241693,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.39180750308994533,-0.4143967046926245,0.05352405852286232,-1.1969837310807312,-0.508245961223459,-0.08661842778927009,-0.661526451240095,1.0522942837058817,-0.2702519489085644,-0.058631493225819016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7515023860892718,-0.2858249347402587,-1.9210797221178653,-0.15123181548600523,-0.4241752007203305,-0.6690526836136733,-0.07148570570861922,-0.06481700674249814,-0.4183386762516799,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.47355634013524667,-0.1331459579218233,-0.32259094731346516,-1.0296634245855756,0.33437234291017043,-0.351361271345817,0.37104485343998767,-0.06481700674249814,-0.24557082768471186,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +2.305904119405,2.0365026600243574,1.7460415847863393,-0.44404235185252966,2.8794235472321534,1.9254271832404863,1.1085957853543307,2.9354247447474355,-0.5170631611470905,-1.6248152163812641,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.305904119405,2.0365026600243574,1.7460415847863393,-0.44404235185252966,2.8794235472321534,1.9254271832404863,1.1085957853543307,2.9354247447474355,-0.5170631611470905,-1.6248152163812641,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5074297044083699,1.4016795458845486,2.2161853420817503,-2.5355461830419816,2.657782451360269,4.202215637826789,0.7766978659928765,-1.5649378824874658,0.32209496046389824,-1.9380519610123532,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.9476995949979956,-1.2179702668949137,-0.7927347046088762,0.09974864425672958,-1.278257699468023,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9476995949979956,-1.2179702668949137,-0.7927347046088762,0.09974864425672958,-1.2591507084445845,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9476995949979956,-1.2179702668949137,-0.7927347046088762,0.09974864425672958,-1.2495972129328654,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9476995949979956,-0.5992186239991497,-0.7927347046088762,0.09974864425672958,-1.1731692488391123,-0.9337955271702202,-1.104057010388702,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.9476995949979956,-0.5992186239991497,-0.7927347046088762,0.09974864425672958,-1.163615753327393,-0.9337955271702202,-0.9196692774101148,-0.32015673027355657,-0.29493307013241693,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,-0.06757166223842598,-0.3324616438078266,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.28469416624923083,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,-0.06757166223842598,-0.3324616438078266,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.28469416624923083,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.2216410958718844,-1.039692664592839,0.2235346670571187,0.4458624403196187,3.08638053753539,1.6641706022451705,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.01576285268155944,0.28471229442336843,0.2885959371705678,0.6853697169897756,-0.25603367971407337,-0.5102069774797452,0.2235346670571187,0.4458624403196187,-0.3936575550278274,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.9815729592711188,0.059711697006726565,0.10053843425240075,0.2252388741280955,0.21208760036016525,0.4428672593238237,1.588003891098654,-0.28823926483217355,-0.541744282370943,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9815729592711188,0.059711697006726565,0.10053843425240075,0.2252388741280955,0.2694085734304802,-0.8278983897476014,0.37104485343998767,1.2437990763541755,2.913612688968422,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.8154110971331918,1.3373936609083668,2.0751422148931282,1.1036704832276658,1.8266283418407026,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.8154110971331918,1.3373936609083668,2.0751422148931282,2.023932168951026,2.2756426308915025,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.2758687726342033,1.064178649759588,2.0751422148931282,0.4343892572470407,1.7884143597938258,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +2.7309980720405664,2.2775747286850443,2.7333434751067065,1.0200103299800867,2.3138566129383795,0.5222901123907878,0.9242080523757454,1.2437990763541755,2.7902070828491587,-0.5284866101724526,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.7309980720405664,2.2775747286850443,2.7333434751067065,1.1036704832276658,2.2565356398680643,1.369467211771738,0.48167749322713854,-0.4797440574804674,-0.46770091869938496,-1.4681968440657196,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.375457735680886,0.48560568497394074,2.169170966352212,-1.2806438843283103,2.151447189239154,1.369467211771738,0.48167749322713854,-0.4797440574804674,-0.46770091869938496,-1.4681968440657196,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +3.5975357447207625,2.7195401878963033,2.7333434751067065,1.1873306364752452,2.5622474962430775,2.137221458085724,1.735514077481523,0.3181925785540895,-0.541744282370943,-1.7814335886968087,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+2.142406445314397,2.0043597175362655,2.874386602295328,0.6435396403659859,2.2087681623094686,2.137221458085724,1.735514077481523,0.3181925785540895,-0.541744282370943,-1.7814335886968087,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +0.6218780762717923,0.33292670815550535,0.9938115731136843,0.39255918062325107,0.6706553849226847,1.8989528988848317,1.6617589842900886,-0.4159091265977028,-0.541744282370943,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.6987059531497927,-1.2806438843283103,-1.2247581246023957,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,1.820788974560715,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.6987059531497927,-1.2806438843283103,-1.175079947941456,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.6987059531497927,-1.2806438843283103,-1.0604380018008261,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.964049362407055,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-0.7910294283703461,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4245070379080663,-0.10100301543373127,-0.22856219585438162,-1.8662649570613563,-0.36112213034298407,0.33697012190120496,-0.5877713580486605,0.6692846984092942,-0.665149888490206,-0.37186823785690803,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4245070379080663,-0.10100301543373127,-0.22856219585438162,-1.8662649570613563,-0.44137149264142495,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.058631493225819016,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.5235315540422097,1.104724368215191,0.9242080523757454,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.6916730750484669,1.104724368215191,0.9610855989714618,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.701226570560186,1.104724368215191,0.9610855989714618,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.3706756258547032,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.058631493225819016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.29424766176094996,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.058631493225819016,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.29806905996563765,0.33697012190120496,-0.5877713580486605,0.6692846984092942,-0.665149888490206,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.4245070379080663,-0.14921742916586817,-0.22856219585438162,-0.9460032713379963,-0.29806905996563765,0.33697012190120496,-0.5877713580486605,0.6692846984092942,-0.665149888490206,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.2801683985703667,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.0355989134703565,-1.2779612237937312,-1.2515671967715691,0.7012021638506771,2.888931567744569,3.0737359530850714,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.2247581246023957,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.186544142555519,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.32600361285037316,-0.9807922075270432,-0.15123181548600523,-1.0222240197539496,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.1617050542250493,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.6956474514634258,-0.9807922075270432,-0.2348919687335845,-1.0145812233445743,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-0.719754658329493,-0.9807922075270432,0.26706895075188514,-1.1234910721781726,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.32600361285037316,-0.9807922075270432,-0.15123181548600523,-0.9973849314234798,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.62070424681679,-0.9527909913681574,-0.9807922075270432,-0.2348919687335845,-1.0527952053914509,-0.9073212428145655,-0.661526451240095,0.12668778590579569,-0.1962085852370064,0.8810787406674481,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 
+-0.2773591312265229,-0.06886007294563926,-0.32259094731346516,0.3507291039994644,-0.44901428905080026,-0.16604128085623418,0.002269387482815258,0.7012021638506771,-0.4183386762516799,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2773591312265229,-0.06886007294563926,-0.32259094731346516,0.3507291039994644,-0.49104966930236454,-0.16604128085623418,0.002269387482815258,0.7012021638506771,-0.4183386762516799,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.24583342586340634,0.5981059836822594,0.2885959371705678,0.5180494104946199,1.0241347188562935,1.2900443587047739,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-1.311578471750175,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.24583342586340634,0.8311423167209238,0.2885959371705678,0.9363501767325103,1.4081852384274034,1.2900443587047739,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-1.311578471750175,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.24583342586340634,0.8311423167209238,0.2885959371705678,0.5180494104946199,0.9572602502742593,1.2900443587047739,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-1.24199540836108,-0.2858249347402587,0.9467971973841459,-1.7407747271899874,0.9782779404000415,1.5018386335500113,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.24199540836108,-0.2858249347402587,0.9467971973841459,-1.7407747271899874,1.108205479359422,2.560810007776199,0.37104485343998767,0.06285285502303108,-0.5911065248186482,-1.311578471750175,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.04963621695468259,0.34096244377752777,0.9467971973841459,-1.7407747271899874,1.108205479359422,1.5018386335500113,0.37104485343998767,0.06285285502303108,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,0.8808322861805061,-0.16604128085623418,0.48167749322713854,-0.19248686850802735,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,1.2190260272953641,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+2.502101328313724,1.9802525106701983,1.1818690760318513,2.023932168951026,1.2820790976727106,-0.16604128085623418,0.48167749322713854,-0.19248686850802735,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +2.502101328313724,1.9802525106701983,1.1818690760318513,2.023932168951026,1.6642189181414768,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,0.9859207368094168,-0.21898984956754355,0.48167749322713854,-3.384233412646255,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,1.3241144779242748,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +2.502101328313724,1.9802525106701983,1.1818690760318513,1.1873306364752452,1.3871675483016213,-0.21898984956754355,0.48167749322713854,-3.384233412646255,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+2.502101328313724,1.9802525106701983,1.1818690760318513,2.023932168951026,1.7693073687703875,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,0.9859207368094168,-0.16604128085623418,0.48167749322713854,-0.19248686850802735,-0.44301979747553244,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.472065981542927,0.999892764783404,1.1818690760318513,1.1873306364752452,1.3241144779242748,-0.21898984956754355,1.3667386115243523,0.860789491057588,2.666801476729896,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.4884157489519863,0.999892764783404,1.1348547003023064,0.8945201001087207,1.0910091874383274,1.025301515148227,1.0348406921628963,-0.12865193762526275,-0.788555494609469,-1.1549600994346305,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-1.2247581246023957,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,1.820788974560715,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.2806438843283103,-0.8235113131101912,-0.03366985907796072,-1.104057010388702,0.4458624403196187,-0.6404687672663535,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3626135080913244,-0.9807922075270432,-1.3643040375758868,-1.131133868587548,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-0.5590399458890352,-0.9807922075270432,-1.2806438843283103,-1.0890984883359838,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-0.5590399458890352,-0.9807922075270432,-1.2806438843283103,-0.7031372696625298,-0.9337955271702202,-1.3253222899630037,-0.06481700674249814,-0.1962085852370064,0.8810787406674481,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.719976680726153,0.02756875451863454,-0.6046772016907158,2.4840630118127027,-0.04585677845625195,-0.4043098400571264,0.07602448067424973,0.6692846984092942,-0.4183386762516799,-0.21524986554136352,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.489906107544306,-0.0849315441896864,0.1945671857114843,-1.5316243440710422,0.49487106750705223,1.104724368215191,0.9610855989714618,1.945983316064585,-0.788555494609469,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-0.43046817593666936,1.1348547003023064,-1.5316243440710422,0.418443103413299,1.0517757995038817,2.251799729821564,-0.44782659203908576,-0.17152746401315389,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-1.5362912217241647,-0.43046817593666936,-0.4166196987725487,-0.9460032713379963,0.3764077231617347,2.746129998265782,1.5142487979072212,-1.1180933663081134,-0.17152746401315389,-1.311578471750175,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.5362912217241647,-0.43046817593666936,-0.4166196987725487,-0.9460032713379963,0.3764077231617347,2.746129998265782,1.5142487979072212,-1.1180933663081134,-0.17152746401315389,-1.311578471750175,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-1.5362912217241647,-0.43046817593666936,-0.4166196987725487,-0.9460032713379963,0.46047848366486327,2.746129998265782,1.5142487979072212,-1.1180933663081134,-0.17152746401315389,-1.311578471750175,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 
+0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.18915921113203926,0.1781244157672768,0.7766978659928765,-0.575496453804615,-0.2184215943384737,-0.6851049824879971,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.259855077918761,0.1781244157672768,0.7766978659928765,-0.575496453804615,-0.2208897064608589,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.282783467146887,0.1781244157672768,-2.9110567935788443,-3.7672429979428426,-0.2208897064608589,-0.6851049824879971,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.3802291213664224,0.1781244157672768,0.7766978659928765,-0.575496453804615,-0.2208897064608589,-0.6851049824879971,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.4757640764836139,1.5018386335500113,0.7766978659928765,-0.575496453804615,-0.29493307013241693,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+0.03328644954562092,0.9918570291613815,0.2885959371705678,0.9363501767325103,0.5502813414750233,1.5018386335500113,0.7766978659928765,-0.575496453804615,-0.29493307013241693,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.8496009905436326,-1.3947564505794163,-1.168849710445207,-0.06757166223842598,-0.97254584309301,-0.9073212428145655,1.0717182387586144,-2.8416365001427564,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.8496009905436326,-1.3143990943591874,-1.0748209589861235,-0.06757166223842598,-0.8387969059289418,-0.8014241053919468,1.0717182387586144,-1.947947467784052,-0.3689764338039749,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.9150000601798746,-1.3626135080913244,-0.9807922075270432,0.7690298702373548,-0.6095130136476821,-0.8014241053919468,1.0717182387586144,-1.947947467784052,-0.3689764338039749,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.2773591312265229,-0.18136037165396018,-0.22856219585438162,-0.5695325817238956,-0.7910294283703461,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.17152746401315389,1.0376971129829926,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.2773591312265229,-0.18136037165396018,-0.22856219585438162,-0.5695325817238956,-0.7050479687648736,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.17152746401315389,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2773591312265229,-0.18136037165396018,-0.22856219585438162,-0.5695325817238956,-0.418443103413299,-0.24546413392319824,1.0717182387586144,-1.947947467784052,-0.29493307013241693,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.31005866604464394,-0.18136037165396018,-0.22856219585438162,0.18340879750430586,-0.3324616438078266,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.29493307013241693,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.31005866604464394,-0.18136037165396018,-0.22856219585438162,0.18340879750430586,-0.09362425601484772,0.2045987001229315,1.0717182387586144,-1.947947467784052,-0.6157876460425007,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.31005866604464394,-0.06082433732361681,-0.22856219585438162,-0.3603821986049504,-0.5139780585304905,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.29493307013241693,0.4112236237208145,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.31005866604464394,-0.06082433732361681,-0.22856219585438162,-0.3603821986049504,-0.19871270664375842,-0.24546413392319824,1.0717182387586144,-1.947947467784052,-0.29493307013241693,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.3264084334537033,-0.05278860170159438,-0.22856219585438162,0.4343892572470407,-0.2655871752257925,-0.5631555461910546,1.0717182387586144,-1.947947467784052,-0.29493307013241693,-0.37186823785690803,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.3264084334537033,-0.05278860170159438,-0.22856219585438162,0.4343892572470407,0.17387361831328862,0.2045987001229315,1.0717182387586144,-1.947947467784052,-0.6157876460425007,-0.37186823785690803,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.522605642362427,-1.2501132093830056,-1.0748209589861235,0.26706895075188514,-1.096741284745359,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,1.507552229929626,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-1.2501132093830056,-1.0748209589861235,0.26706895075188514,-0.9916528341164483,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.522605642362427,-1.2501132093830056,-1.0748209589861235,0.26706895075188514,-1.039420311675044,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.8810787406674481,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-0.3661822909604876,-1.0748209589861235,2.1912524754461815,-0.5330850495539289,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.8810787406674481,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.522605642362427,-0.3661822909604876,-1.0748209589861235,2.1912524754461815,-0.5139780585304905,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.522605642362427,-0.3661822909604876,-1.0748209589861235,2.1912524754461815,1.0527952053914509,-1.0926412333041484,-1.0303019171972674,-0.7031663155701442,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.3603821986049504,-0.9133141709203513,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.7244603683519035,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.859814596054724,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.7244603683519035,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.3603821986049504,-0.542638545065648,-1.2514869394380765,-0.21899589209148818,0.3181925785540895,3.037018295087685,1.3509338576140817,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.542638545065648,-1.2514869394380765,-0.21899589209148818,0.3181925785540895,3.037018295087685,1.9774073468762596,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.3603821986049504,-0.8884750825898814,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,1.9774073468762596,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.8349755077242541,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.4112236237208145,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.522605642362427,-0.6393973021092642,-0.6987059531497927,-0.44404235185252966,-0.8005829238820652,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.4112236237208145,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.745172649914094,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.567841996036359,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.67829818133206,-0.8808469584589108,-0.5140162648572261,-0.7031663155701442,-0.29493307013241693,0.567841996036359,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.5617455360890863,0.23107298447858618,-0.32962853187863905,-0.5435789883632319,-0.1962085852370064,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-0.44653964718071654,-0.8867634560679597,-0.527702505100106,-0.49487106750705223,0.23107298447858618,-0.32962853187863905,-0.5435789883632319,-0.1962085852370064,0.09798687908972549,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,-0.03630328294453279,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,-0.04394607935390811,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,-0.015285592818750648,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,0.22928389228125973,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0 +-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.7786829648428408,0.29615836086329383,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 
+-0.08116192231779915,0.15614052447100035,-0.13453344439530476,-0.3603821986049504,0.7948508265750337,0.33697012190120496,1.0717182387586144,0.7969545601748234,-0.2208897064608589,-0.21524986554136352,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.4343892572470407,-0.44519289084611263,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.567841996036359,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.4343892572470407,-0.15094522908516264,-0.8014241053919468,-0.21899589209148818,0.3181925785540895,3.037018295087685,0.7244603683519035,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.016088491009150316,-0.2770513698398555,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.5728287740446119,0.10792611073886346,0.2885959371705678,0.4343892572470407,-0.2770513698398555,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.5728287740446119,0.10792611073886346,0.2885959371705678,0.016088491009150316,-0.19298060933672692,-0.29841270263450764,-0.07148570570861922,0.9246244219403525,-0.3689764338039749,0.25460525140527,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.6545776110899132,0.7427492248786723,0.8527684459250624,-0.7786829648428408,0.7967615256773776,1.528312917905666,-0.21899589209148818,0.3181925785540895,-0.2208897064608589,-0.8417233548035415,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.6545776110899132,0.7427492248786723,0.8527684459250624,-0.7786829648428408,0.8731894897711308,1.528312917905666,-0.21899589209148818,0.3181925785540895,-0.2208897064608589,-0.9983417271190861,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.9161738896348767,1.0882858566256577,0.2885959371705678,0.09974864425672958,1.0929198865406713,1.3959414961273926,-0.21899589209148818,0.3181925785540895,-0.24557082768471186,-0.8417233548035415,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.9161738896348767,1.0882858566256577,0.2885959371705678,0.09974864425672958,1.131133868587548,1.3959414961273926,-0.21899589209148818,0.3181925785540895,-0.24557082768471186,-0.9983417271190861,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 
+-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.5693883324984617,-1.3573840768606953,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.820788974560715,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.6687446858203409,-0.48373269312409045,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.5636562351914302,-1.3573840768606953,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.820788974560715,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.6630125885133094,-0.48373269312409045,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.542638545065648,-0.48373269312409045,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,0.25460525140527,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.45856778456251945,-0.9337955271702202,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.820788974560715,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.2610093638174636,-0.20546757852002978,-0.1815478201248432,0.7690298702373548,-0.49487106750705223,-0.08661842778927009,-0.5140162648572261,0.4777799057610003,-0.04812185789389084,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +-0.7188028512711508,-1.2018987956508667,-0.7927347046088762,0.7271997936135651,-0.5827632262148684,-0.351361271345817,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.21524986554136352,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0 +-0.7188028512711508,-0.6876117158414033,-0.8867634560679597,-1.0296634245855756,-0.6458162965922148,-0.351361271345817,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.21524986554136352,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0 +0.24583342586340634,0.47756994935191605,0.4766534400887349,0.5180494104946199,0.19489130843907077,0.1781244157672768,-0.5140162648572261,0.4777799057610003,-0.4183386762516799,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.24583342586340634,0.47756994935191605,0.4766534400887349,0.5180494104946199,0.03821398204687662,-0.9337955271702202,-1.1778121035801363,0.4777799057610003,3.160423901206948,1.1943154852985371,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.24583342586340634,0.7106062823905803,0.4766534400887349,0.5180494104946199,0.007642796409375324,-0.4043098400571264,-0.5140162648572261,0.4777799057610003,-0.29493307013241693,-0.058631493225819016,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,0.9781802533563,0.6744767831273724,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,1.5219712494655562,0.9075820736133198,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.37186823785690803,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,0.9781802533563,0.7184228624812805,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.21524986554136352,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+0.8834743548167557,1.1686432128458866,0.6176965672773569,1.5219712494655562,0.9228676664320704,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.21524986554136352,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,0.9781802533563,0.9285997637391019,1.5547872022613207,1.0717182387586144,-0.32015673027355657,-0.665149888490206,-1.311578471750175,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +0.8834743548167557,1.1686432128458866,0.6176965672773569,1.5219712494655562,1.1425980632016108,1.5547872022613207,1.0717182387586144,-0.32015673027355657,-0.665149888490206,-1.311578471750175,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0 +1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,0.7509047472211255,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.37186823785690803,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.369926578950012,0.6853697169897756,0.9362425601484772,1.5018386335500113,1.6617589842900886,-0.32015673027355657,-0.3689764338039749,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 
+1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,0.8655466933617555,0.8135072403029894,0.9242080523757454,-1.2138457626322596,-0.3442953125801219,-1.1549600994346305,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,1.257240009342241,0.07222727834465804,-1.1778121035801363,0.4777799057610003,3.160423901206948,0.09798687908972549,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 +1.6682631904516485,1.1686432128458866,1.4169409546795568,0.6853697169897756,0.961081648478947,0.2840215531898956,1.6617589842900886,-0.32015673027355657,-0.17152746401315389,-0.9983417271190861,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0,0.0,0.0,0.0,1.0,0.0 diff --git a/Module7/Auto_Data_Labels.csv b/Module7/Auto_Data_Labels.csv new file mode 100644 index 0000000..41e5c6f --- /dev/null +++ b/Module7/Auto_Data_Labels.csv @@ -0,0 +1,196 @@ +0 +9.510074525452104 +9.711115659888671 +9.711115659888671 +9.543234787249512 +9.767094927630573 +9.632334782035558 +9.781884730776879 +9.847974842605868 +10.08058716534893 +9.706864211031315 +9.736547097780475 +9.950848123895966 +9.9572652582166 +10.109077944602749 +10.333970423619581 +10.628980909135285 +10.51542467767053 +8.546946149565585 +8.747510946478448 +8.791029857045965 +8.625509334899697 +8.760453046315272 +8.981807323377534 +8.736971085254146 +8.808668062106715 +8.937087036176523 +9.054621796976763 +9.096163326913784 +9.469931564261438 +8.776321456449958 +8.832733591996416 +8.593969030218288 +8.784009071186633 +8.871926251117628 +8.894944460956886 +8.894944460956886 +8.973984926689743 +9.115480090952223 +9.087607606593886 +9.23941361946189 +9.468464892185638 +9.244258590179644 
+8.82246957226897 +9.310004695089129 +10.381273322223919 +10.478695435231387 +10.491274217438248 +8.555451903533328 +8.715224041915372 +8.823942326585245 +8.809116258125274 +8.9085593751449 +9.087607606593886 +9.047233034106032 +9.2681374707024 +9.234545060673 +9.286838342948908 +9.327678864393416 +9.813562845008141 +9.81705782453774 +10.148470870454794 +10.248777937608223 +10.246225830735987 +10.360912399575003 +10.439512977323862 +10.464702061835247 +10.62035125971339 +10.72326738402944 +9.711297461543568 +8.592115117933497 +8.73052880173936 +8.805225202632306 +8.94754601503218 +9.20623194393164 +9.047703788498627 +9.44375103564617 +9.607033787697222 +9.581145019820722 +8.85209276347713 +9.010547069270189 +9.13550906135318 +9.13550906135318 +8.612321536507814 +8.867709208039386 +8.802221746402456 +8.831857935197906 +8.90231952852887 +8.895492631451633 +8.961750799330497 +8.92252495730139 +8.987071812848821 +9.017847259860732 +9.099297073182859 +9.164191715950203 +9.510370887608827 +9.57491403870827 +9.510370887608827 +9.752606521576492 +9.888323152016355 +9.820051594294094 +9.38429367909962 +9.487972108574462 +9.42867236629317 +9.536762272743895 +9.65374331942474 +9.735068900911164 +9.722864552377755 +9.74537068443899 +9.718963572186974 +9.795345393916424 +9.806425839692997 +8.625509334899697 +8.981807323377534 +8.736971085254146 +8.808668062106715 +8.937087036176523 +9.096163326913784 +9.454383987398135 +9.999615579630348 +10.389856535868129 +10.434938994095775 +10.519429662187102 +9.380083146563278 +9.406729185981574 +9.618468597503831 +9.649240256170584 +9.806425839692997 +9.831991550831058 +8.540519016719735 +8.86120833720818 +8.936298185228436 +8.871505346165781 +8.958668737047434 +9.206332350578643 +9.130539301772627 +9.32892308780313 +8.917712757131387 +9.2299469016151 +8.98882050177807 +9.366831168735615 +8.584477938221834 +8.754318540250866 +8.77770959579525 +8.841881989497114 +8.974364841846596 +9.080003870248179 +8.844768827529695 +8.881558488638976 
+8.974364841846596 +8.960339366492091 +8.953898535260459 +9.03097444300786 +9.133243321591216 +8.994420665751292 +9.016512874996026 +9.13755460225053 +9.16303909885817 +9.041803370152845 +9.173572647783969 +9.20923976653215 +9.323579767582693 +9.354354132115088 +9.77956697061665 +9.09918532261002 +9.277812087091196 +9.209139651399664 +9.296334565143043 +9.327945614050446 +9.71462464769875 +9.680218993408767 +9.660778845727078 +9.66459564425378 +8.958668737047434 +8.984066927653044 +8.986571625268054 +9.01127949117793 +9.047233034106032 +9.158520623246385 +9.209840246934501 +9.358329249689632 +9.20833836930551 +9.49514330367712 +9.535679435605061 +9.416541202560081 +9.468078568054892 +9.504128762859725 +9.679406061493943 +9.71202433782489 +9.821192309809298 +9.849559210510572 +9.731809155840653 +9.854559878912704 +9.975110296209095 +10.019936365179374 +10.026810768568128 diff --git a/Module7/Automobile price data _Raw_.csv b/Module7/Automobile price data _Raw_.csv new file mode 100644 index 0000000..9b36ab2 --- /dev/null +++ b/Module7/Automobile price data _Raw_.csv @@ -0,0 +1,206 @@ +symboling,normalized-losses,make,fuel-type,aspiration,num-of-doors,body-style,drive-wheels,engine-location,wheel-base,length,width,height,curb-weight,engine-type,num-of-cylinders,engine-size,fuel-system,bore,stroke,compression-ratio,horsepower,peak-rpm,city-mpg,highway-mpg,price +3,?,alfa-romero,gas,std,two,convertible,rwd,front,88.60,168.80,64.10,48.80,2548,dohc,four,130,mpfi,3.47,2.68,9.00,111,5000,21,27,13495 +3,?,alfa-romero,gas,std,two,convertible,rwd,front,88.60,168.80,64.10,48.80,2548,dohc,four,130,mpfi,3.47,2.68,9.00,111,5000,21,27,16500 +1,?,alfa-romero,gas,std,two,hatchback,rwd,front,94.50,171.20,65.50,52.40,2823,ohcv,six,152,mpfi,2.68,3.47,9.00,154,5000,19,26,16500 +2,164,audi,gas,std,four,sedan,fwd,front,99.80,176.60,66.20,54.30,2337,ohc,four,109,mpfi,3.19,3.40,10.00,102,5500,24,30,13950 
+2,164,audi,gas,std,four,sedan,4wd,front,99.40,176.60,66.40,54.30,2824,ohc,five,136,mpfi,3.19,3.40,8.00,115,5500,18,22,17450 +2,?,audi,gas,std,two,sedan,fwd,front,99.80,177.30,66.30,53.10,2507,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,25,15250 +1,158,audi,gas,std,four,sedan,fwd,front,105.80,192.70,71.40,55.70,2844,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,25,17710 +1,?,audi,gas,std,four,wagon,fwd,front,105.80,192.70,71.40,55.70,2954,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,25,18920 +1,158,audi,gas,turbo,four,sedan,fwd,front,105.80,192.70,71.40,55.90,3086,ohc,five,131,mpfi,3.13,3.40,8.30,140,5500,17,20,23875 +0,?,audi,gas,turbo,two,hatchback,4wd,front,99.50,178.20,67.90,52.00,3053,ohc,five,131,mpfi,3.13,3.40,7.00,160,5500,16,22,? +2,192,bmw,gas,std,two,sedan,rwd,front,101.20,176.80,64.80,54.30,2395,ohc,four,108,mpfi,3.50,2.80,8.80,101,5800,23,29,16430 +0,192,bmw,gas,std,four,sedan,rwd,front,101.20,176.80,64.80,54.30,2395,ohc,four,108,mpfi,3.50,2.80,8.80,101,5800,23,29,16925 +0,188,bmw,gas,std,two,sedan,rwd,front,101.20,176.80,64.80,54.30,2710,ohc,six,164,mpfi,3.31,3.19,9.00,121,4250,21,28,20970 +0,188,bmw,gas,std,four,sedan,rwd,front,101.20,176.80,64.80,54.30,2765,ohc,six,164,mpfi,3.31,3.19,9.00,121,4250,21,28,21105 +1,?,bmw,gas,std,four,sedan,rwd,front,103.50,189.00,66.90,55.70,3055,ohc,six,164,mpfi,3.31,3.19,9.00,121,4250,20,25,24565 +0,?,bmw,gas,std,four,sedan,rwd,front,103.50,189.00,66.90,55.70,3230,ohc,six,209,mpfi,3.62,3.39,8.00,182,5400,16,22,30760 +0,?,bmw,gas,std,two,sedan,rwd,front,103.50,193.80,67.90,53.70,3380,ohc,six,209,mpfi,3.62,3.39,8.00,182,5400,16,22,41315 +0,?,bmw,gas,std,four,sedan,rwd,front,110.00,197.00,70.90,56.30,3505,ohc,six,209,mpfi,3.62,3.39,8.00,182,5400,15,20,36880 +2,121,chevrolet,gas,std,two,hatchback,fwd,front,88.40,141.10,60.30,53.20,1488,l,three,61,2bbl,2.91,3.03,9.50,48,5100,47,53,5151 +1,98,chevrolet,gas,std,two,hatchback,fwd,front,94.50,155.90,63.60,52.00,1874,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,6295 
+0,81,chevrolet,gas,std,four,sedan,fwd,front,94.50,158.80,63.60,52.00,1909,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,6575 +1,118,dodge,gas,std,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,1876,ohc,four,90,2bbl,2.97,3.23,9.41,68,5500,37,41,5572 +1,118,dodge,gas,std,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,1876,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6377 +1,118,dodge,gas,turbo,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,2128,ohc,four,98,mpfi,3.03,3.39,7.60,102,5500,24,30,7957 +1,148,dodge,gas,std,four,hatchback,fwd,front,93.70,157.30,63.80,50.60,1967,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6229 +1,148,dodge,gas,std,four,sedan,fwd,front,93.70,157.30,63.80,50.60,1989,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6692 +1,148,dodge,gas,std,four,sedan,fwd,front,93.70,157.30,63.80,50.60,1989,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,7609 +1,148,dodge,gas,turbo,?,sedan,fwd,front,93.70,157.30,63.80,50.60,2191,ohc,four,98,mpfi,3.03,3.39,7.60,102,5500,24,30,8558 +-1,110,dodge,gas,std,four,wagon,fwd,front,103.30,174.60,64.60,59.80,2535,ohc,four,122,2bbl,3.34,3.46,8.50,88,5000,24,30,8921 +3,145,dodge,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2811,ohc,four,156,mfi,3.60,3.90,7.00,145,5000,19,24,12964 +2,137,honda,gas,std,two,hatchback,fwd,front,86.60,144.60,63.90,50.80,1713,ohc,four,92,1bbl,2.91,3.41,9.60,58,4800,49,54,6479 +2,137,honda,gas,std,two,hatchback,fwd,front,86.60,144.60,63.90,50.80,1819,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,31,38,6855 +1,101,honda,gas,std,two,hatchback,fwd,front,93.70,150.00,64.00,52.60,1837,ohc,four,79,1bbl,2.91,3.07,10.10,60,5500,38,42,5399 +1,101,honda,gas,std,two,hatchback,fwd,front,93.70,150.00,64.00,52.60,1940,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,30,34,6529 +1,101,honda,gas,std,two,hatchback,fwd,front,93.70,150.00,64.00,52.60,1956,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,30,34,7129 
+0,110,honda,gas,std,four,sedan,fwd,front,96.50,163.40,64.00,54.50,2010,ohc,four,92,1bbl,2.91,3.41,9.20,76,6000,30,34,7295 +0,78,honda,gas,std,four,wagon,fwd,front,96.50,157.10,63.90,58.30,2024,ohc,four,92,1bbl,2.92,3.41,9.20,76,6000,30,34,7295 +0,106,honda,gas,std,two,hatchback,fwd,front,96.50,167.50,65.20,53.30,2236,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,7895 +0,106,honda,gas,std,two,hatchback,fwd,front,96.50,167.50,65.20,53.30,2289,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,9095 +0,85,honda,gas,std,four,sedan,fwd,front,96.50,175.40,65.20,54.10,2304,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,8845 +0,85,honda,gas,std,four,sedan,fwd,front,96.50,175.40,62.50,54.10,2372,ohc,four,110,1bbl,3.15,3.58,9.00,86,5800,27,33,10295 +0,85,honda,gas,std,four,sedan,fwd,front,96.50,175.40,65.20,54.10,2465,ohc,four,110,mpfi,3.15,3.58,9.00,101,5800,24,28,12945 +1,107,honda,gas,std,two,sedan,fwd,front,96.50,169.10,66.00,51.00,2293,ohc,four,110,2bbl,3.15,3.58,9.10,100,5500,25,31,10345 +0,?,isuzu,gas,std,four,sedan,rwd,front,94.30,170.70,61.80,53.50,2337,ohc,four,111,2bbl,3.31,3.23,8.50,78,4800,24,29,6785 +1,?,isuzu,gas,std,two,sedan,fwd,front,94.50,155.90,63.60,52.00,1874,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,? +0,?,isuzu,gas,std,four,sedan,fwd,front,94.50,155.90,63.60,52.00,1909,ohc,four,90,2bbl,3.03,3.11,9.60,70,5400,38,43,? 
+2,?,isuzu,gas,std,two,hatchback,rwd,front,96.00,172.60,65.20,51.40,2734,ohc,four,119,spfi,3.43,3.23,9.20,90,5000,24,29,11048 +0,145,jaguar,gas,std,four,sedan,rwd,front,113.00,199.60,69.60,52.80,4066,dohc,six,258,mpfi,3.63,4.17,8.10,176,4750,15,19,32250 +0,?,jaguar,gas,std,four,sedan,rwd,front,113.00,199.60,69.60,52.80,4066,dohc,six,258,mpfi,3.63,4.17,8.10,176,4750,15,19,35550 +0,?,jaguar,gas,std,two,sedan,rwd,front,102.00,191.70,70.60,47.80,3950,ohcv,twelve,326,mpfi,3.54,2.76,11.50,262,5000,13,17,36000 +1,104,mazda,gas,std,two,hatchback,fwd,front,93.10,159.10,64.20,54.10,1890,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,30,31,5195 +1,104,mazda,gas,std,two,hatchback,fwd,front,93.10,159.10,64.20,54.10,1900,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,31,38,6095 +1,104,mazda,gas,std,two,hatchback,fwd,front,93.10,159.10,64.20,54.10,1905,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,31,38,6795 +1,113,mazda,gas,std,four,sedan,fwd,front,93.10,166.80,64.20,54.10,1945,ohc,four,91,2bbl,3.03,3.15,9.00,68,5000,31,38,6695 +1,113,mazda,gas,std,four,sedan,fwd,front,93.10,166.80,64.20,54.10,1950,ohc,four,91,2bbl,3.08,3.15,9.00,68,5000,31,38,7395 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2380,rotor,two,70,4bbl,?,?,9.40,101,6000,17,23,10945 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2380,rotor,two,70,4bbl,?,?,9.40,101,6000,17,23,11845 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2385,rotor,two,70,4bbl,?,?,9.40,101,6000,17,23,13645 +3,150,mazda,gas,std,two,hatchback,rwd,front,95.30,169.00,65.70,49.60,2500,rotor,two,80,mpfi,?,?,9.40,135,6000,16,23,15645 +1,129,mazda,gas,std,two,hatchback,fwd,front,98.80,177.80,66.50,53.70,2385,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,8845 +0,115,mazda,gas,std,four,sedan,fwd,front,98.80,177.80,66.50,55.50,2410,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,8495 
+1,129,mazda,gas,std,two,hatchback,fwd,front,98.80,177.80,66.50,53.70,2385,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,10595 +0,115,mazda,gas,std,four,sedan,fwd,front,98.80,177.80,66.50,55.50,2410,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,10245 +0,?,mazda,diesel,std,?,sedan,fwd,front,98.80,177.80,66.50,55.50,2443,ohc,four,122,idi,3.39,3.39,22.70,64,4650,36,42,10795 +0,115,mazda,gas,std,four,hatchback,fwd,front,98.80,177.80,66.50,55.50,2425,ohc,four,122,2bbl,3.39,3.39,8.60,84,4800,26,32,11245 +0,118,mazda,gas,std,four,sedan,rwd,front,104.90,175.00,66.10,54.40,2670,ohc,four,140,mpfi,3.76,3.16,8.00,120,5000,19,27,18280 +0,?,mazda,diesel,std,four,sedan,rwd,front,104.90,175.00,66.10,54.40,2700,ohc,four,134,idi,3.43,3.64,22.00,72,4200,31,39,18344 +-1,93,mercedes-benz,diesel,turbo,four,sedan,rwd,front,110.00,190.90,70.30,56.50,3515,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,25552 +-1,93,mercedes-benz,diesel,turbo,four,wagon,rwd,front,110.00,190.90,70.30,58.70,3750,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,28248 +0,93,mercedes-benz,diesel,turbo,two,hardtop,rwd,front,106.70,187.50,70.30,54.90,3495,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,28176 +-1,93,mercedes-benz,diesel,turbo,four,sedan,rwd,front,115.60,202.60,71.70,56.30,3770,ohc,five,183,idi,3.58,3.64,21.50,123,4350,22,25,31600 +-1,?,mercedes-benz,gas,std,four,sedan,rwd,front,115.60,202.60,71.70,56.50,3740,ohcv,eight,234,mpfi,3.46,3.10,8.30,155,4750,16,18,34184 +3,142,mercedes-benz,gas,std,two,convertible,rwd,front,96.60,180.30,70.50,50.80,3685,ohcv,eight,234,mpfi,3.46,3.10,8.30,155,4750,16,18,35056 +0,?,mercedes-benz,gas,std,four,sedan,rwd,front,120.90,208.10,71.70,56.70,3900,ohcv,eight,308,mpfi,3.80,3.35,8.00,184,4500,14,16,40960 +1,?,mercedes-benz,gas,std,two,hardtop,rwd,front,112.00,199.20,72.00,55.40,3715,ohcv,eight,304,mpfi,3.80,3.35,8.00,184,4500,14,16,45400 +1,?,mercury,gas,turbo,two,hatchback,rwd,front,102.70,178.40,68.00,54.80,2910,ohc,four,140,mpfi,3.78,3.12,8.00,175,5000,19,24,16503 
+2,161,mitsubishi,gas,std,two,hatchback,fwd,front,93.70,157.30,64.40,50.80,1918,ohc,four,92,2bbl,2.97,3.23,9.40,68,5500,37,41,5389 +2,161,mitsubishi,gas,std,two,hatchback,fwd,front,93.70,157.30,64.40,50.80,1944,ohc,four,92,2bbl,2.97,3.23,9.40,68,5500,31,38,6189 +2,161,mitsubishi,gas,std,two,hatchback,fwd,front,93.70,157.30,64.40,50.80,2004,ohc,four,92,2bbl,2.97,3.23,9.40,68,5500,31,38,6669 +1,161,mitsubishi,gas,turbo,two,hatchback,fwd,front,93,157.30,63.80,50.80,2145,ohc,four,98,spdi,3.03,3.39,7.60,102,5500,24,30,7689 +3,153,mitsubishi,gas,turbo,two,hatchback,fwd,front,96.30,173.00,65.40,49.40,2370,ohc,four,110,spdi,3.17,3.46,7.50,116,5500,23,30,9959 +3,153,mitsubishi,gas,std,two,hatchback,fwd,front,96.30,173.00,65.40,49.40,2328,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,25,32,8499 +3,?,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2833,ohc,four,156,spdi,3.58,3.86,7.00,145,5000,19,24,12629 +3,?,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2921,ohc,four,156,spdi,3.59,3.86,7.00,145,5000,19,24,14869 +3,?,mitsubishi,gas,turbo,two,hatchback,fwd,front,95.90,173.20,66.30,50.20,2926,ohc,four,156,spdi,3.59,3.86,7.00,145,5000,19,24,14489 +1,125,mitsubishi,gas,std,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2365,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,25,32,6989 +1,125,mitsubishi,gas,std,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2405,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,25,32,8189 +1,125,mitsubishi,gas,turbo,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2403,ohc,four,110,spdi,3.17,3.46,7.50,116,5500,23,30,9279 +-1,137,mitsubishi,gas,std,four,sedan,fwd,front,96.30,172.40,65.40,51.60,2403,ohc,four,110,spdi,3.17,3.46,7.50,116,5500,23,30,9279 +1,128,nissan,gas,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,1889,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,5499 +1,128,nissan,diesel,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,2017,ohc,four,103,idi,2.99,3.47,21.90,55,4800,45,50,7099 
+1,128,nissan,gas,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,1918,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,6649 +1,122,nissan,gas,std,four,sedan,fwd,front,94.50,165.30,63.80,54.50,1938,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,6849 +1,103,nissan,gas,std,four,wagon,fwd,front,94.50,170.20,63.80,53.50,2024,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7349 +1,128,nissan,gas,std,two,sedan,fwd,front,94.50,165.30,63.80,54.50,1951,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7299 +1,128,nissan,gas,std,two,hatchback,fwd,front,94.50,165.60,63.80,53.30,2028,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7799 +1,122,nissan,gas,std,four,sedan,fwd,front,94.50,165.30,63.80,54.50,1971,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7499 +1,103,nissan,gas,std,four,wagon,fwd,front,94.50,170.20,63.80,53.50,2037,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,7999 +2,168,nissan,gas,std,two,hardtop,fwd,front,95.10,162.40,63.80,53.30,2008,ohc,four,97,2bbl,3.15,3.29,9.40,69,5200,31,37,8249 +0,106,nissan,gas,std,four,hatchback,fwd,front,97.20,173.40,65.20,54.70,2324,ohc,four,120,2bbl,3.33,3.47,8.50,97,5200,27,34,8949 +0,106,nissan,gas,std,four,sedan,fwd,front,97.20,173.40,65.20,54.70,2302,ohc,four,120,2bbl,3.33,3.47,8.50,97,5200,27,34,9549 +0,128,nissan,gas,std,four,sedan,fwd,front,100.40,181.70,66.50,55.10,3095,ohcv,six,181,mpfi,3.43,3.27,9.00,152,5200,17,22,13499 +0,108,nissan,gas,std,four,wagon,fwd,front,100.40,184.60,66.50,56.10,3296,ohcv,six,181,mpfi,3.43,3.27,9.00,152,5200,17,22,14399 +0,108,nissan,gas,std,four,sedan,fwd,front,100.40,184.60,66.50,55.10,3060,ohcv,six,181,mpfi,3.43,3.27,9.00,152,5200,19,25,13499 +3,194,nissan,gas,std,two,hatchback,rwd,front,91.30,170.70,67.90,49.70,3071,ohcv,six,181,mpfi,3.43,3.27,9.00,160,5200,19,25,17199 +3,194,nissan,gas,turbo,two,hatchback,rwd,front,91.30,170.70,67.90,49.70,3139,ohcv,six,181,mpfi,3.43,3.27,7.80,200,5200,17,23,19699 
+1,231,nissan,gas,std,two,hatchback,rwd,front,99.20,178.50,67.90,49.70,3139,ohcv,six,181,mpfi,3.43,3.27,9.00,160,5200,19,25,18399 +0,161,peugot,gas,std,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3020,l,four,120,mpfi,3.46,3.19,8.40,97,5000,19,24,11900 +0,161,peugot,diesel,turbo,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3197,l,four,152,idi,3.70,3.52,21.00,95,4150,28,33,13200 +0,?,peugot,gas,std,four,wagon,rwd,front,114.20,198.90,68.40,58.70,3230,l,four,120,mpfi,3.46,3.19,8.40,97,5000,19,24,12440 +0,?,peugot,diesel,turbo,four,wagon,rwd,front,114.20,198.90,68.40,58.70,3430,l,four,152,idi,3.70,3.52,21.00,95,4150,25,25,13860 +0,161,peugot,gas,std,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3075,l,four,120,mpfi,3.46,2.19,8.40,95,5000,19,24,15580 +0,161,peugot,diesel,turbo,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3252,l,four,152,idi,3.70,3.52,21.00,95,4150,28,33,16900 +0,?,peugot,gas,std,four,wagon,rwd,front,114.20,198.90,68.40,56.70,3285,l,four,120,mpfi,3.46,2.19,8.40,95,5000,19,24,16695 +0,?,peugot,diesel,turbo,four,wagon,rwd,front,114.20,198.90,68.40,58.70,3485,l,four,152,idi,3.70,3.52,21.00,95,4150,25,25,17075 +0,161,peugot,gas,std,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3075,l,four,120,mpfi,3.46,3.19,8.40,97,5000,19,24,16630 +0,161,peugot,diesel,turbo,four,sedan,rwd,front,107.90,186.70,68.40,56.70,3252,l,four,152,idi,3.70,3.52,21.00,95,4150,28,33,17950 +0,161,peugot,gas,turbo,four,sedan,rwd,front,108.00,186.70,68.30,56.00,3130,l,four,134,mpfi,3.61,3.21,7.00,142,5600,18,24,18150 +1,119,plymouth,gas,std,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,1918,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,37,41,5572 +1,119,plymouth,gas,turbo,two,hatchback,fwd,front,93.70,157.30,63.80,50.80,2128,ohc,four,98,spdi,3.03,3.39,7.60,102,5500,24,30,7957 +1,154,plymouth,gas,std,four,hatchback,fwd,front,93.70,157.30,63.80,50.60,1967,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6229 
+1,154,plymouth,gas,std,four,sedan,fwd,front,93.70,167.30,63.80,50.80,1989,ohc,four,90,2bbl,2.97,3.23,9.40,68,5500,31,38,6692 +1,154,plymouth,gas,std,four,sedan,fwd,front,93.70,167.30,63.80,50.80,2191,ohc,four,98,2bbl,2.97,3.23,9.40,68,5500,31,38,7609 +-1,74,plymouth,gas,std,four,wagon,fwd,front,103.30,174.60,64.60,59.80,2535,ohc,four,122,2bbl,3.35,3.46,8.50,88,5000,24,30,8921 +3,?,plymouth,gas,turbo,two,hatchback,rwd,front,95.90,173.20,66.30,50.20,2818,ohc,four,156,spdi,3.59,3.86,7.00,145,5000,19,24,12764 +3,186,porsche,gas,std,two,hatchback,rwd,front,94.50,168.90,68.30,50.20,2778,ohc,four,151,mpfi,3.94,3.11,9.50,143,5500,19,27,22018 +3,?,porsche,gas,std,two,hardtop,rwd,rear,89.50,168.90,65.00,51.60,2756,ohcf,six,194,mpfi,3.74,2.90,9.50,207,5900,17,25,32528 +3,?,porsche,gas,std,two,hardtop,rwd,rear,89.50,168.90,65.00,51.60,2756,ohcf,six,194,mpfi,3.74,2.90,9.50,207,5900,17,25,34028 +3,?,porsche,gas,std,two,convertible,rwd,rear,89.50,168.90,65.00,51.60,2800,ohcf,six,194,mpfi,3.74,2.90,9.50,207,5900,17,25,37028 +1,?,porsche,gas,std,two,hatchback,rwd,front,98.40,175.70,72.30,50.50,3366,dohcv,eight,203,mpfi,3.94,3.11,10.00,288,5750,17,28,? 
+0,?,renault,gas,std,four,wagon,fwd,front,96.10,181.50,66.50,55.20,2579,ohc,four,132,mpfi,3.46,3.90,8.70,?,?,23,31,9295 +2,?,renault,gas,std,two,hatchback,fwd,front,96.10,176.80,66.60,50.50,2460,ohc,four,132,mpfi,3.46,3.90,8.70,?,?,23,31,9895 +3,150,saab,gas,std,two,hatchback,fwd,front,99.10,186.60,66.50,56.10,2658,ohc,four,121,mpfi,3.54,3.07,9.31,110,5250,21,28,11850 +2,104,saab,gas,std,four,sedan,fwd,front,99.10,186.60,66.50,56.10,2695,ohc,four,121,mpfi,3.54,3.07,9.30,110,5250,21,28,12170 +3,150,saab,gas,std,two,hatchback,fwd,front,99.10,186.60,66.50,56.10,2707,ohc,four,121,mpfi,2.54,2.07,9.30,110,5250,21,28,15040 +2,104,saab,gas,std,four,sedan,fwd,front,99.10,186.60,66.50,56.10,2758,ohc,four,121,mpfi,3.54,3.07,9.30,110,5250,21,28,15510 +3,150,saab,gas,turbo,two,hatchback,fwd,front,99.10,186.60,66.50,56.10,2808,dohc,four,121,mpfi,3.54,3.07,9.00,160,5500,19,26,18150 +2,104,saab,gas,turbo,four,sedan,fwd,front,99.10,186.60,66.50,56.10,2847,dohc,four,121,mpfi,3.54,3.07,9.00,160,5500,19,26,18620 +2,83,subaru,gas,std,two,hatchback,fwd,front,93.70,156.90,63.40,53.70,2050,ohcf,four,97,2bbl,3.62,2.36,9.00,69,4900,31,36,5118 +2,83,subaru,gas,std,two,hatchback,fwd,front,93.70,157.90,63.60,53.70,2120,ohcf,four,108,2bbl,3.62,2.64,8.70,73,4400,26,31,7053 +2,83,subaru,gas,std,two,hatchback,4wd,front,93.30,157.30,63.80,55.70,2240,ohcf,four,108,2bbl,3.62,2.64,8.70,73,4400,26,31,7603 +0,102,subaru,gas,std,four,sedan,fwd,front,97.20,172.00,65.40,52.50,2145,ohcf,four,108,2bbl,3.62,2.64,9.50,82,4800,32,37,7126 +0,102,subaru,gas,std,four,sedan,fwd,front,97.20,172.00,65.40,52.50,2190,ohcf,four,108,2bbl,3.62,2.64,9.50,82,4400,28,33,7775 +0,102,subaru,gas,std,four,sedan,fwd,front,97.20,172.00,65.40,52.50,2340,ohcf,four,108,mpfi,3.62,2.64,9.00,94,5200,26,32,9960 +0,102,subaru,gas,std,four,sedan,4wd,front,97.00,172.00,65.40,54.30,2385,ohcf,four,108,2bbl,3.62,2.64,9.00,82,4800,24,25,9233 
+0,102,subaru,gas,turbo,four,sedan,4wd,front,97.00,172.00,65.40,54.30,2510,ohcf,four,108,mpfi,3.62,2.64,7.70,111,4800,24,29,11259 +0,89,subaru,gas,std,four,wagon,fwd,front,97.00,173.50,65.40,53.00,2290,ohcf,four,108,2bbl,3.62,2.64,9.00,82,4800,28,32,7463 +0,89,subaru,gas,std,four,wagon,fwd,front,97.00,173.50,65.40,53.00,2455,ohcf,four,108,mpfi,3.62,2.64,9.00,94,5200,25,31,10198 +0,85,subaru,gas,std,four,wagon,4wd,front,96.90,173.60,65.40,54.90,2420,ohcf,four,108,2bbl,3.62,2.64,9.00,82,4800,23,29,8013 +0,85,subaru,gas,turbo,four,wagon,4wd,front,96.90,173.60,65.40,54.90,2650,ohcf,four,108,mpfi,3.62,2.64,7.70,111,4800,23,23,11694 +1,87,toyota,gas,std,two,hatchback,fwd,front,95.70,158.70,63.60,54.50,1985,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,35,39,5348 +1,87,toyota,gas,std,two,hatchback,fwd,front,95.70,158.70,63.60,54.50,2040,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,31,38,6338 +1,74,toyota,gas,std,four,hatchback,fwd,front,95.70,158.70,63.60,54.50,2015,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,31,38,6488 +0,77,toyota,gas,std,four,wagon,fwd,front,95.70,169.70,63.60,59.10,2280,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,31,37,6918 +0,81,toyota,gas,std,four,wagon,4wd,front,95.70,169.70,63.60,59.10,2290,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,27,32,7898 +0,91,toyota,gas,std,four,wagon,4wd,front,95.70,169.70,63.60,59.10,3110,ohc,four,92,2bbl,3.05,3.03,9.00,62,4800,27,32,8778 +0,91,toyota,gas,std,four,sedan,fwd,front,95.70,166.30,64.40,53.00,2081,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,30,37,6938 +0,91,toyota,gas,std,four,hatchback,fwd,front,95.70,166.30,64.40,52.80,2109,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,30,37,7198 +0,91,toyota,diesel,std,four,sedan,fwd,front,95.70,166.30,64.40,53.00,2275,ohc,four,110,idi,3.27,3.35,22.50,56,4500,34,36,7898 +0,91,toyota,diesel,std,four,hatchback,fwd,front,95.70,166.30,64.40,52.80,2275,ohc,four,110,idi,3.27,3.35,22.50,56,4500,38,47,7788 
+0,91,toyota,gas,std,four,sedan,fwd,front,95.70,166.30,64.40,53.00,2094,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,38,47,7738 +0,91,toyota,gas,std,four,hatchback,fwd,front,95.70,166.30,64.40,52.80,2122,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,28,34,8358 +0,91,toyota,gas,std,four,sedan,fwd,front,95.70,166.30,64.40,52.80,2140,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,28,34,9258 +1,168,toyota,gas,std,two,sedan,rwd,front,94.50,168.70,64.00,52.60,2169,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,29,34,8058 +1,168,toyota,gas,std,two,hatchback,rwd,front,94.50,168.70,64.00,52.60,2204,ohc,four,98,2bbl,3.19,3.03,9.00,70,4800,29,34,8238 +1,168,toyota,gas,std,two,sedan,rwd,front,94.50,168.70,64.00,52.60,2265,dohc,four,98,mpfi,3.24,3.08,9.40,112,6600,26,29,9298 +1,168,toyota,gas,std,two,hatchback,rwd,front,94.50,168.70,64.00,52.60,2300,dohc,four,98,mpfi,3.24,3.08,9.40,112,6600,26,29,9538 +2,134,toyota,gas,std,two,hardtop,rwd,front,98.40,176.20,65.60,52.00,2540,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,8449 +2,134,toyota,gas,std,two,hardtop,rwd,front,98.40,176.20,65.60,52.00,2536,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,9639 +2,134,toyota,gas,std,two,hatchback,rwd,front,98.40,176.20,65.60,52.00,2551,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,9989 +2,134,toyota,gas,std,two,hardtop,rwd,front,98.40,176.20,65.60,52.00,2679,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,11199 +2,134,toyota,gas,std,two,hatchback,rwd,front,98.40,176.20,65.60,52.00,2714,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,11549 +2,134,toyota,gas,std,two,convertible,rwd,front,98.40,176.20,65.60,53.00,2975,ohc,four,146,mpfi,3.62,3.50,9.30,116,4800,24,30,17669 +-1,65,toyota,gas,std,four,sedan,fwd,front,102.40,175.60,66.50,54.90,2326,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,29,34,8948 +-1,65,toyota,diesel,turbo,four,sedan,fwd,front,102.40,175.60,66.50,54.90,2480,ohc,four,110,idi,3.27,3.35,22.50,73,4500,30,33,10698 
+-1,65,toyota,gas,std,four,hatchback,fwd,front,102.40,175.60,66.50,53.90,2414,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,27,32,9988 +-1,65,toyota,gas,std,four,sedan,fwd,front,102.40,175.60,66.50,54.90,2414,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,27,32,10898 +-1,65,toyota,gas,std,four,hatchback,fwd,front,102.40,175.60,66.50,53.90,2458,ohc,four,122,mpfi,3.31,3.54,8.70,92,4200,27,32,11248 +3,197,toyota,gas,std,two,hatchback,rwd,front,102.90,183.50,67.70,52.00,2976,dohc,six,171,mpfi,3.27,3.35,9.30,161,5200,20,24,16558 +3,197,toyota,gas,std,two,hatchback,rwd,front,102.90,183.50,67.70,52.00,3016,dohc,six,171,mpfi,3.27,3.35,9.30,161,5200,19,24,15998 +-1,90,toyota,gas,std,four,sedan,rwd,front,104.50,187.80,66.50,54.10,3131,dohc,six,171,mpfi,3.27,3.35,9.20,156,5200,20,24,15690 +-1,?,toyota,gas,std,four,wagon,rwd,front,104.50,187.80,66.50,54.10,3151,dohc,six,161,mpfi,3.27,3.35,9.20,156,5200,19,24,15750 +2,122,volkswagen,diesel,std,two,sedan,fwd,front,97.30,171.70,65.50,55.70,2261,ohc,four,97,idi,3.01,3.40,23.00,52,4800,37,46,7775 +2,122,volkswagen,gas,std,two,sedan,fwd,front,97.30,171.70,65.50,55.70,2209,ohc,four,109,mpfi,3.19,3.40,9.00,85,5250,27,34,7975 +2,94,volkswagen,diesel,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2264,ohc,four,97,idi,3.01,3.40,23.00,52,4800,37,46,7995 +2,94,volkswagen,gas,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2212,ohc,four,109,mpfi,3.19,3.40,9.00,85,5250,27,34,8195 +2,94,volkswagen,gas,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2275,ohc,four,109,mpfi,3.19,3.40,9.00,85,5250,27,34,8495 +2,94,volkswagen,diesel,turbo,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2319,ohc,four,97,idi,3.01,3.40,23.00,68,4500,37,42,9495 +2,94,volkswagen,gas,std,four,sedan,fwd,front,97.30,171.70,65.50,55.70,2300,ohc,four,109,mpfi,3.19,3.40,10.00,100,5500,26,32,9995 +3,?,volkswagen,gas,std,two,convertible,fwd,front,94.50,159.30,64.20,55.60,2254,ohc,four,109,mpfi,3.19,3.40,8.50,90,5500,24,29,11595 
+3,256,volkswagen,gas,std,two,hatchback,fwd,front,94.50,165.70,64.00,51.40,2221,ohc,four,109,mpfi,3.19,3.40,8.50,90,5500,24,29,9980 +0,?,volkswagen,gas,std,four,sedan,fwd,front,100.40,180.20,66.90,55.10,2661,ohc,five,136,mpfi,3.19,3.40,8.50,110,5500,19,24,13295 +0,?,volkswagen,diesel,turbo,four,sedan,fwd,front,100.40,180.20,66.90,55.10,2579,ohc,four,97,idi,3.01,3.40,23.00,68,4500,33,38,13845 +0,?,volkswagen,gas,std,four,wagon,fwd,front,100.40,183.10,66.90,55.10,2563,ohc,four,109,mpfi,3.19,3.40,9.00,88,5500,25,31,12290 +-2,103,volvo,gas,std,four,sedan,rwd,front,104.30,188.80,67.20,56.20,2912,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,23,28,12940 +-1,74,volvo,gas,std,four,wagon,rwd,front,104.30,188.80,67.20,57.50,3034,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,23,28,13415 +-2,103,volvo,gas,std,four,sedan,rwd,front,104.30,188.80,67.20,56.20,2935,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,24,28,15985 +-1,74,volvo,gas,std,four,wagon,rwd,front,104.30,188.80,67.20,57.50,3042,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,24,28,16515 +-2,103,volvo,gas,turbo,four,sedan,rwd,front,104.30,188.80,67.20,56.20,3045,ohc,four,130,mpfi,3.62,3.15,7.50,162,5100,17,22,18420 +-1,74,volvo,gas,turbo,four,wagon,rwd,front,104.30,188.80,67.20,57.50,3157,ohc,four,130,mpfi,3.62,3.15,7.50,162,5100,17,22,18950 +-1,95,volvo,gas,std,four,sedan,rwd,front,109.10,188.80,68.90,55.50,2952,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,23,28,16845 +-1,95,volvo,gas,turbo,four,sedan,rwd,front,109.10,188.80,68.80,55.50,3049,ohc,four,141,mpfi,3.78,3.15,8.70,160,5300,19,25,19045 +-1,95,volvo,gas,std,four,sedan,rwd,front,109.10,188.80,68.90,55.50,3012,ohcv,six,173,mpfi,3.58,2.87,8.80,134,5500,18,23,21485 +-1,95,volvo,diesel,turbo,four,sedan,rwd,front,109.10,188.80,68.90,55.50,3217,ohc,six,145,idi,3.01,3.40,23.00,106,4800,26,27,22470 +-1,95,volvo,gas,turbo,four,sedan,rwd,front,109.10,188.80,68.90,55.50,3062,ohc,four,141,mpfi,3.78,3.15,9.50,114,5400,19,25,22625 \ No newline at end of file diff --git 
a/Module7/IntroToUnsupervisedLearning.ipynb b/Module7/IntroToUnsupervisedLearning.ipynb new file mode 100644 index 0000000..53b2f91 --- /dev/null +++ b/Module7/IntroToUnsupervisedLearning.ipynb @@ -0,0 +1,806 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Introduction to Unsupervised Learning\n", + "\n", + "In this lab you will work with **unsupervised learning**. Unsupervised learning attempts to find inherent structure or relationships in complex data. In contrast with supervised learning, unsupervised learning does not use labeled cases. \n", + "\n", + "There are a number of approaches to unsupervised learning. In this lab you will work with two widely used **clustering** methods, **K-means clustering** and **hierarchical clustering**.\n", + "\n", + "## K-means clustering\n", + "\n", + "K-means clustering separates a dataset into K clusters of equal variance. The number of clusters, K, is user defined. The basic algorithm has the following steps:\n", + "1. A set of K centroids is randomly chosen. \n", + "2. Clusters are formed by minimizing variance within each cluster. This metric is also known as the **within cluster sum of squares** (see further discussion in the section on evaluating clusters). This step partitions the data into clusters with minimum squared distance to the centroid of the cluster. \n", + "3. The mean of each cluster is computed and the centroid is moved to that mean. \n", + "4. Steps 2 and 3 are repeated until a stopping criterion is met. Typically, the algorithm terminates when the within cluster variance decreases only minimally. \n", + "5. The above steps are repeated with a new random initialization in step 1. The best set of clusters, by within cluster variance and between cluster separation, is retained. \n", + "\n", + "Since K-means clustering relies only on basic linear algebra operations, the method is massively scalable. 
Out-of-core K-means clustering algorithms are widely used. However, this method assumes equal variance of the clusters, a fairly restrictive assumption. In practice, this criterion is almost never met, and yet K-means clustering still produces useful results. \n", + "\n", + "## Hierarchical clustering\n", + "\n", + "Hierarchical clustering methods make fewer distributional assumptions when compared to K-means methods. However, K-means methods are generally more scalable, sometimes very much so. \n", + "\n", + "Hierarchical clustering creates clusters by either a **divisive** method or an **agglomerative** method. The divisive method is a \"top down\" approach starting with the entire dataset and then finding partitions in a stepwise manner. The latter method, agglomerative clustering, is a \"bottom up\" approach. In this lab you will work with agglomerative clustering, which roughly works as follows:\n", + "1. The **linkage distances** between each of the data points are computed.\n", + "2. Points are clustered pairwise with their nearest neighbor. \n", + "3. Linkage distances between the clusters are computed.\n", + "4. Clusters are combined pairwise into larger clusters.\n", + "5. Steps 3 and 4 are repeated until all data points are in a single cluster. \n", + "\n", + "The linkage function can be computed in a number of ways:\n", + "- **Ward** linkage measures the increase in variance for the clusters being linked,\n", + "- **Average** linkage uses the mean pairwise distance between the members of the two clusters, \n", + "- **Complete** or **Maximal** linkage uses the maximum distance between the members of the two clusters.\n", + "\n", + "Several different distance metrics are used to compute linkage functions:\n", + "- **Euclidean** or **l2** distance is the most widely used. This metric is the only choice for the Ward linkage method. \n", + "- **Manhattan** or **l1** distance is robust to outliers and has other interesting properties. 
\n", + "- **Cosine** similarity is the dot product between the location vectors divided by the magnitudes of the vectors. Notice that this metric is a measure of similarity, whereas the other two metrics are measures of difference. Similarity can be quite useful when working with data such as images or text documents. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## A first example\n", + "\n", + "With the above bit of theory in mind, you will now apply both K-means and agglomerative clustering models to a set of synthetic data. \n", + "\n", + "First, execute the code in the cell below to load the packages required to run the rest of this notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import numpy.random as nr\n", + "from sklearn.cluster import KMeans, AgglomerativeClustering\n", + "from sklearn.metrics import silhouette_score\n", + "import matplotlib.pyplot as plt\n", + "\n", + "%matplotlib inline" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The code in the cell below creates synthetic data from a bivariate Normal distribution with mean of $\\{ 0,0 \\}$ and a covariance matrix:\n", + "\n", + "$$cov(X) = \\begin{bmatrix}\n", + " 1.0 & 0.4 \\\\\n", + " 0.4 & 1.0\n", + " \\end{bmatrix}$$\n", + " \n", + " Execute this code." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def make_dist(mean, cov, dist_num, n = 100, seed = 123):\n", + "    nr.seed(seed)\n", + "    sample = nr.multivariate_normal(mean, cov, n) # Compute the 2-d Normally distributed data\n", + "    sample = np.column_stack((sample, np.array([dist_num]*n))) ## Add the distribution identifier\n", + "    print('Shape of sample = ' + str(sample.shape))\n", + "    return(sample)\n", + "\n", + "cov = np.array([[1.0, 0.4], [0.4, 1.0]])\n", + "mean = np.array([0.0, 0.0])\n", + "sample1 = make_dist(mean, cov, 1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to simulate data from two different distributions. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "cov = np.array([[1.0, 0.8], [0.8, 1.0]])\n", + "mean = np.array([3.0, 0.0])\n", + "sample2 = make_dist(mean, cov, 2, 100, 3344)\n", + "\n", + "mean = np.array([-3.0,0.0])\n", + "cov = np.array([[1.0,0.8],[0.8,1.0]])\n", + "sample3 = make_dist(mean, cov, 3, 100, 5566)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to display the data simulated from the three distributions." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_dat(sample1, sample2, sample3):\n", + " plt.scatter(sample1[:,0], sample1[:,1], color = 'blue')\n", + " plt.scatter(sample2[:,0], sample2[:,1], color = 'orange')\n", + " plt.scatter(sample3[:,0], sample3[:,1], color = 'green')\n", + " plt.xlabel('Dimension 1')\n", + " plt.ylabel('Dimension 2')\n", + " plt.title('Sample data')\n", + " plt.show()\n", + " \n", + "plot_dat(sample1, sample2, sample3) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice that the three distributions are mostly separated, with some overlap.\n", + "\n", + "Execute the code in the cell below to concatenate the data from the three numpy arrays and Z-Score normalize the feature columns. This is a vital step for any algorithm that relies on distances between cases, since otherwise features with larger numeric ranges would dominate the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "sample = np.concatenate((sample1, sample2, sample3))\n", + "\n", + "for i in range(2): ## Normalize both feature columns\n", + " mean_col = np.mean(sample[:,i])\n", + " std_col = np.std(sample[:,i])\n", + " sample[:,i] = [(x - mean_col)/std_col for x in sample[:,i]]\n", + "\n", + "sample.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, you will create a K-means clustering model using the following steps in the code below:\n", + "1. A K-means clustering model is defined with K = 3. \n", + "2. The model is fit to the data and cluster assignments computed. \n", + "3. The assignments to the clusters are printed. \n", + "\n", + "Execute this code. 
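Under the hood, K-means alternates two steps: assign each point to its nearest center, then move each center to the mean of its assigned points. A minimal sketch of this loop on toy data (for intuition only; the scikit-learn implementation adds careful initialization and multiple restarts):

```python
import numpy as np

rng = np.random.default_rng(0)
## Toy data: two well-separated groups in 2-d
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(5.0, 0.5, (50, 2))])

centers = np.array([[1.0, 1.0], [4.0, 4.0]])  # simple deterministic initialization
for _ in range(10):
    ## Assignment step: index of the nearest center for each point
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    ## Update step: move each center to the mean of its assigned points
    centers = np.array([X[labels == k].mean(axis=0) for k in range(2)])

print(np.round(centers, 1))  # close to [[0, 0], [5, 5]]
```

On well-separated data like this the loop converges in a couple of iterations; `KMeans.fit_predict` does the same work and also handles poor initializations.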
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "kmeans_3 = KMeans(n_clusters=3, random_state=0)\n", + "assignments_km3 = kmeans_3.fit_predict(sample[:,0:2])\n", + "print(assignments_km3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to plot the locations of the cluster assignments. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def plot_clusters(sample, assignment):\n", + " col_dic = {0:'blue',1:'green',2:'orange',3:'gray',4:'magenta',5:'black'}\n", + " colors = [col_dic[x] for x in assignment]\n", + " plt.scatter(sample[:,0], sample[:,1], color = colors)\n", + " plt.xlabel('Dimension 1')\n", + " plt.ylabel('Dimension 2')\n", + " plt.title('Sample data')\n", + " plt.show()\n", + "\n", + "plot_clusters(sample, assignments_km3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine this plot. The k-means clustering algorithm has found a structure in the data which is quite different from the distributions used to generate them. Nonetheless, these clusters look reasonable given the structure of the data. The clusters separate the data into three fairly tight regions. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now you will compare the result of the K-means clustering with agglomerative clustering. The code in the cell below performs the following processing:\n", + "1. Defines an agglomerative clustering model with three clusters.\n", + "2. Fits the model and computes the cluster assignments. \n", + "\n", + "Execute this code. 
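Conceptually, agglomerative clustering starts with every point in its own cluster and repeatedly merges the closest pair of clusters under the chosen linkage. A toy sketch with complete linkage on 1-d points (for intuition only; scikit-learn's implementation is far more efficient):

```python
import numpy as np

# Toy 1-d points; start with every point in its own cluster
points = np.array([0.0, 0.1, 1.0, 5.0, 5.2])
clusters = [[i] for i in range(len(points))]

def complete_linkage(c1, c2):
    # Complete linkage: maximum pairwise distance between members of the two clusters
    return max(abs(points[i] - points[j]) for i in c1 for j in c2)

# Agglomerate until 2 clusters remain: repeatedly merge the closest pair
while len(clusters) > 2:
    pairs = [(a, b) for a in range(len(clusters)) for b in range(a + 1, len(clusters))]
    a, b = min(pairs, key=lambda p: complete_linkage(clusters[p[0]], clusters[p[1]]))
    clusters[a] = clusters[a] + clusters[b]
    del clusters[b]

print(clusters)  # [[0, 1, 2], [3, 4]]
```

Swapping `max` for `min` or `mean` in the linkage function gives single or average linkage, which is exactly the `linkage` option explored later in this notebook.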
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "agglomerative_3 = AgglomerativeClustering(n_clusters=3)\n", + "assignments_ag3 = agglomerative_3.fit_predict(sample[:,0:2])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Execute the code in the cell below to visualize the cluster assignments for the agglomerative model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plot_clusters(sample, assignments_ag3)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The results for the agglomerative clustering model are noticeably different from those created by the k-means model. The agglomerative clusters deviate from the approximately elliptical clusters created by the k-means algorithm. \n", + "\n", + "The foregoing 3 cluster models were applied to the three simulated distributions. What will be the result of using other numbers of clusters? \n", + "\n", + "Execute the code in the cell below to compute a K-means cluster model, with K=4, and display the assignments." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "kmeans_4 = KMeans(n_clusters=4, random_state=0)\n", + "assignments_km4 = kmeans_4.fit_predict(sample[:,0:2])\n", + "plot_clusters(sample, assignments_km4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To compare the K-means model with an agglomerative clustering model with four clusters, execute the code in the cell below. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "agglomerative_4 = AgglomerativeClustering(n_clusters=4)\n", + "assignments_ag4 = agglomerative_4.fit_predict(sample[:,0:2])\n", + "plot_clusters(sample, assignments_ag4)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Compare these results to each other and to the 3 cluster models. As expected, the 4 cluster models exhibit smaller clusters showing greater detail of the structure of the data. The structure discovered is noticeably different between the k-means and the agglomerative model. \n", + "\n", + "Next, you will explore the differences between K-means and agglomerative clustering methods with 5 clusters. Execute the code in the cell below to compute the K=5 K-means model and display the assignments. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "kmeans_5 = KMeans(n_clusters=5, random_state=0)\n", + "assignments_km5 = kmeans_5.fit_predict(sample[:,0:2])\n", + "plot_clusters(sample, assignments_km5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, execute the code in the cell below to compute and display the assignments for an agglomerative clustering model with 5 clusters." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "agglomerative_5 = AgglomerativeClustering(n_clusters=5)\n", + "assignments_ag5 = agglomerative_5.fit_predict(sample[:,0:2])\n", + "plot_clusters(sample, assignments_ag5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine the differences between the assignments for the K-means and agglomerative clustering methods. Notice that the K-means method has created 5 clusters with a clearly different structure from those of the agglomerative model. 
Each model highlights a different aspect of the structure of the data. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Evaluating cluster models\n", + "\n", + "Now that you have created some clustering models, you are likely wondering how you can evaluate these models and perform model selection. There are a number of metrics you can use to evaluate and compare clustering models. However, you should always keep in mind that the best model should be selected based on the problem you are trying to solve.\n", + "\n", + "One useful metric for clusters is the **within cluster sum of squares** or **WCSS**. Intuitively, clusters should have minimal dispersion and therefore minimal WCSS, which is defined as:\n", + "\n", + "$$WCSS = \\sum_i \\sum_{j\\ \\in\\ cluster\\ i} ||x_j - c_i||^2 \\\\\n", + "where\\\\\n", + "c_i = center\\ of\\ ith\\ cluster\\\\ \n", + "and\\\\\n", + "||x_j - c_i|| = distance\\ between\\ data\\ x_j\\ and\\ center\\ c_i\n", + "$$\n", + "\n", + "We can use WCSS to compare different cluster models: models with tighter clusters have smaller WCSS. \n", + "\n", + "****\n", + "**Note:** WCSS is also referred to as **inertia**. \n", + "****\n", + "\n", + "The **between cluster sum of squares** or **BCSS** is a related metric. Whereas WCSS measures how tight the clusters are, BCSS is a measure of the separation between the clusters. To compute the BCSS, observe that the **total sum of squares** or **TSS** must equal the sum of the WCSS and BCSS:\n", + "\n", + "$$\n", + "TSS = BCSS + WCSS\\\\\n", + "where\\\\\n", + "TSS = \\sum_i (x_i - \\mu)^2\\\\\n", + "where\\\\\n", + "\\mu = mean\\ of\\ all\\ data\\ samples\n", + "$$\n", + "\n", + "Notice that the TSS is just the variance of all the data points, up to a factor of the number of samples. The BCSS is then just the difference between TSS and WCSS.\n", + "\n", + "The **silhouette coefficient** or **SC** is another clustering metric. 
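The identity TSS = BCSS + WCSS above can be verified numerically. A sketch on random, centered data (so that $\mu = 0$), using the `inertia_` attribute of a fitted `KMeans` model for the WCSS:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
X = rng.normal(size=(150, 2))
X = X - X.mean(axis=0)  # center the data so TSS is just the sum of squares

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

wcss = km.inertia_                  # within cluster sum of squares
tss = np.sum(X**2)                  # total sum of squares about the (zero) mean
sizes = np.bincount(km.labels_)     # number of points in each cluster
bcss = np.sum(sizes[:, None] * km.cluster_centers_**2)  # sum of n_k * ||c_k||^2

print(np.isclose(tss, wcss + bcss))  # True
```

The decomposition holds because each cluster's contribution to the TSS splits exactly into its spread about its own center plus its center's (size-weighted) distance from the overall mean.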
The silhouette coefficient measures the ratio between the distances within a cluster and distances to the nearest adjacent cluster. The SC for the $i^{th}$ member of a cluster is computed as follows:\n", + "\n", + "$$\n", + "SC_i = \\frac{b_i - a_i}{max(a_i, b_i)}\\\\\n", + "where\\\\\n", + "a_i = average\\ distance\\ from\\ point\\ i\\ to\\ other\\ members\\ of\\ the\\ same\\ cluster\\\\\n", + "b_i = average\\ distance\\ from\\ point\\ i\\ to\\ members\\ of\\ an\\ adjacent\\ cluster\n", + "$$\n", + "\n", + "The SC has some important properties. First, the SC values are limited as a result of the normalization:\n", + "\n", + "$$\\{ -1 \\le SC \\le 1 \\}$$\n", + "\n", + "For the case where the clusters are compact and well separated from adjacent clusters the following relationship holds:\n", + "\n", + "$$if\\ a_i \\lt b_i\\ then\\ SC \\gt 0$$\n", + "\n", + "However, for dispersed clusters that are not well separated from adjacent clusters the following relationship holds:\n", + "\n", + "$$if\\ a_i \\gt b_i\\ then\\ SC \\lt 0$$\n", + "\n", + "In other words, the tighter the cluster and the further it is from members of the adjacent cluster, the closer the SC values will be to 1. However, if the cluster is dispersed and the distances to the adjacent cluster are small, the SC will have a value less than 0. \n", + "\n", + "In summary, you want to find a model with SC values close to 1. If the SC is consistently less than zero the clustering model is probably not that useful. \n", + "\n", + "****\n", + "**Note:** The WCSS and BCSS metrics implicitly assume compact, roughly multivariate-Normal clusters. Therefore, these metrics are strictly applicable only to K-means clustering, and are not useful metrics for agglomerative clustering. The SC can be computed using various distance metrics, so it is more generally applicable to most clustering methods. 
\n", + "****" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the previous section, you compared cluster models using visualization of the cluster assignments. The code in the cell below computes and displays WCSS and BCSS for the three K-means clustering models tried. Execute this code and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": false + }, + "outputs": [], + "source": [ + "km_models = [kmeans_3, kmeans_4, kmeans_5]\n", + "\n", + "def plot_WCSS_km(km_models, samples):\n", + " fig, ax = plt.subplots(1, 2, figsize=(12,5))\n", + " \n", + " ## Plot WCSS\n", + " wcss = [mod.inertia_ for mod in km_models]\n", + " print(wcss)\n", + " n_clusts = [x+1 for x in range(2,len(wcss) + 2)]\n", + " ax[0].bar(n_clusts, wcss)\n", + " ax[0].set_xlabel('Number of clusters')\n", + " ax[0].set_ylabel('WCSS')\n", + " \n", + " ## Plot BCSS\n", + " tss = np.sum(samples[:,0:2]**2) ## TSS of both Z-Score normalized features (the mean is 0)\n", + " print(tss)\n", + " ## Compute BCSS as TSS - WCSS\n", + " bcss = [tss - x for x in wcss]\n", + " ax[1].bar(n_clusts, bcss)\n", + " ax[1].set_xlabel('Number of clusters')\n", + " ax[1].set_ylabel('BCSS')\n", + " plt.show()\n", + " \n", + "\n", + "plot_WCSS_km(km_models, sample)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Examine these results and notice the following:\n", + "1. The WCSS decreases with the number of clusters, whereas the BCSS increases with the number of clusters. This behavior is expected. Using more clusters results in smaller clusters with lower WCSS. Since the TSS is fixed, the BCSS must increase as the WCSS decreases. \n", + "2. The WCSS decreases significantly between 3 and 4 clusters. The changes in both metrics are less between the 4 and 5 cluster model. \n", + "3. These metrics indicate there is not a great difference between the 4 and 5 cluster models. 
However, the 3 cluster model appears to be inferior. \n", + "\n", + "The code in the cell below computes and displays the SC for the three K-means models. This code uses the `silhouette_score` function from the scikit-learn `metrics` package. Execute this code and examine the result. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "assignment_list = [assignments_km3, assignments_km4, assignments_km5]\n", + "\n", + "def plot_sillohette(samples, assignments, x_lab = 'Number of clusters', start = 3):\n", + " silhouette = [silhouette_score(samples[:,0:2], a) for a in assignments]\n", + " n_clusts = [x + start for x in range(0, len(silhouette))]\n", + " plt.bar(n_clusts, silhouette)\n", + " plt.xlabel(x_lab)\n", + " plt.ylabel('SC')\n", + " plt.show()\n", + "\n", + "plot_sillohette(sample, assignment_list)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The SC decreases with increasing numbers of clusters. The 4 and 5 cluster models have rather low SC. \n", + "\n", + "Taken together, the three metrics indicate that the k=5 k-means model is the most representative. While the k=3 model has higher SC, the WCSS is greater and the BCSS is lower.\n", + "\n", + "The exploration with the k-means clustering model can be extended to greater numbers of clusters. \n", + "\n", + "The code in the cell below computes and displays the SC for the three agglomerative clustering models. Execute this code and examine the results." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "assignment_list = [assignments_ag3, assignments_ag4, assignments_ag5]\n", + "plot_sillohette(sample, assignment_list)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The SC is similar for the 3 and 5 cluster model. It appears the 5 cluster model has created finer divisions of the original 3 clusters. 
This behavior is expected for hierarchical clustering models. Cluster divisions in these models are based on splits of a tree structure. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Another example\n", + "\n", + "The foregoing examples provide a basic introduction to K-means and agglomerative clustering. In this section you will work with another set of synthetic data and explore how changes in distance metric and linkage function change agglomerative clustering results. \n", + "\n", + "The code in the cell below computes another synthetic dataset comprised of three distributions. One of these distributions has a much larger covariance than in the previous example. This fact will stress the equal variance assumption of the K-means clustering model. \n", + "\n", + "Execute this code to generate and display the dataset. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(3344)\n", + "cov = np.array([[1.0, -0.98], [-0.98, 1.0]])\n", + "mean = np.array([-1.0, 0.0])\n", + "sample1 = make_dist(mean, cov, 1, 100, 3344)\n", + "\n", + "nr.seed(5566)\n", + "cov = np.array([[1.0, -0.8], [-0.8, 1.0]])\n", + "mean = np.array([6.0, 0.0])\n", + "sample2 = make_dist(mean, cov, 2, 100, 6677)\n", + "\n", + "nr.seed(7777)\n", + "cov = np.array([[1.0, 0.9], [0.9, 1.0]])\n", + "mean = np.array([-4.0, 0.0])\n", + "sample3 = make_dist(mean, cov, 3, 100, 367)\n", + "\n", + "## Plot the distributions\n", + "plot_dat(sample1, sample2, sample3) " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As expected, the dataset is comprised of three distinct distributions. The distribution in the center exhibits high covariance. \n", + "\n", + "Now, execute the code in the cell below to concatenate the three distributions into a single numpy array and Z-Score normalize the features. 
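As a sanity check on this kind of data generation, the sample covariance of a large draw should closely match the matrix passed to the multivariate Normal sampler. A sketch using the first covariance matrix above (with NumPy's newer `Generator` API rather than the notebook's `nr.seed` style):

```python
import numpy as np

rng = np.random.default_rng(3344)
cov = np.array([[1.0, -0.98], [-0.98, 1.0]])
draw = rng.multivariate_normal([0.0, 0.0], cov, size=20000)

est = np.cov(draw, rowvar=False)  # sample covariance matrix of the draw
print(np.round(est, 2))  # close to the cov matrix specified above
```

With only 100 points per distribution, as in the lab, the estimate is noisier but still clearly shows the strong negative correlation.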
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sample_2 = np.concatenate((sample1, sample2, sample3))\n", + "\n", + "for i in range(2): ## Normalize both feature columns\n", + " mean_col = np.mean(sample_2[:,i])\n", + " std_col = np.std(sample_2[:,i])\n", + " sample_2[:,i] = [(x - mean_col)/std_col for x in sample_2[:,i]]\n", + "\n", + "sample_2.shape" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "A K=3 K-means model is a useful basis for comparison. Execute the code in the cell below to compute the model and display the cluster assignments." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "nr.seed(3344)\n", + "kmeans_3 = KMeans(n_clusters=3, random_state=0)\n", + "assignments_km3 = kmeans_3.fit_predict(sample_2[:,0:2])\n", + "plot_clusters(sample_2, assignments_km3)\n", + "print(silhouette_score(sample_2[:,0:2], assignments_km3))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The k=3 K-means cluster model has largely separated the data on the right and split the data distributions on the left. \n", + "\n", + "Now, compute a 3-cluster agglomerative model using Ward linkage and the Euclidean distance norm (the defaults in scikit-learn) by executing the code in the cell below. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(3344)\n", + "agglomerative3_w = AgglomerativeClustering(n_clusters=3)\n", + "assignments3_w = agglomerative3_w.fit_predict(sample_2[:,0:2])\n", + "plot_clusters(sample_2, assignments3_w)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The cluster assignments displayed above mostly link the original distributions. However, certain points are linked into a new cluster in the lower right. \n", + "\n", + "But, what will be the effect of using other linkage functions? 
To find out, execute the code in the cell below to compute and display the assignments of a 3-cluster agglomerative model using average linkage." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(5555)\n", + "agglomerative3_a = AgglomerativeClustering(n_clusters=3, linkage = 'average')\n", + "assignments3_a = agglomerative3_a.fit_predict(sample_2[:,0:2])\n", + "plot_clusters(sample_2, assignments3_a)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The cluster assignments follow the structure of the data rather well. Two outlying points in the lower right form the third cluster.\n", + "\n", + "Now, execute the code in the cell below to compute and display the cluster assignments for a 3-cluster agglomerative model using complete or maximal linkage. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "nr.seed(987)\n", + "agglomerative3_c = AgglomerativeClustering(n_clusters=3, linkage = 'complete')\n", + "assignments3_c = agglomerative3_c.fit_predict(sample_2[:,0:2])\n", + "plot_clusters(sample_2, assignments3_c)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These cluster assignments are quite different from those seen before. However, the assignments seem a bit discordant with the data. \n", + "\n", + "Having tried several linkage functions, you will now explore using different distance metrics. The code in the cell below computes a 3-cluster agglomerative model with average linkage using the Manhattan or l1 distance metric. Execute this code and examine the class assignments displayed."
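The linkage functions just compared differ only in how they reduce the set of pairwise distances between two clusters to a single number. A minimal sketch with toy points:

```python
import numpy as np

# Two small clusters of 2-d points (illustrative values)
A = np.array([[0.0, 0.0], [0.0, 1.0]])
B = np.array([[3.0, 0.0], [4.0, 0.0]])

# All pairwise Euclidean distances between members of A and B
pairwise = np.array([[np.linalg.norm(a - b) for b in B] for a in A])

single_linkage = pairwise.min()     # single linkage: minimum pairwise distance
average_linkage = pairwise.mean()   # average linkage: mean pairwise distance
complete_linkage = pairwise.max()   # complete linkage: maximum pairwise distance

print(single_linkage, average_linkage, complete_linkage)  # single <= average <= complete
```

Because complete linkage always uses the largest pairwise distance, it tends to produce compact clusters of similar diameter, while average linkage is less sensitive to outlying members.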
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(3344)\n", + "agglomerative3_a_m = AgglomerativeClustering(n_clusters=3, linkage = 'average', affinity = 'manhattan')\n", + "assignments3_a_m = agglomerative3_a_m.fit_predict(sample_2[:,0:2])\n", + "plot_clusters(sample_2, assignments3_a_m)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "These assignments are identical to those obtained with average linkage and Euclidean distance.\n", + "\n", + "There are some other combinations of distance metric and linkage which might be tried. In this case, only average linkage and cosine similarity will be tried. Execute the code in the cell below to compute and display the assignments for a 3-cluster agglomerative model using average linkage and cosine similarity." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "nr.seed(234)\n", + "agglomerative3_a_c = AgglomerativeClustering(n_clusters=3, linkage = 'average', affinity = 'cosine')\n", + "assignments3_a_c = agglomerative3_a_c.fit_predict(sample_2[:,0:2])\n", + "plot_clusters(sample_2, assignments3_a_c)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As with the other metric choices, the division of the data into these assignments appears a bit arbitrary. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, compute the SCs for each of these models and compare them by executing the code in the cell below.\n", + "\n", + "Then, answer **Question 1** on the course page."
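Before comparing the models, the SC formula from earlier can be checked by hand against `silhouette_score` on a tiny two-cluster dataset (the helper function below is purely illustrative):

```python
import numpy as np
from sklearn.metrics import silhouette_score

# Two tiny, well-separated 1-d clusters (illustrative values)
X = np.array([[0.0], [0.2], [5.0], [5.2]])
labels = np.array([0, 0, 1, 1])

def silhouette_point(i):
    same = [j for j in range(len(X)) if labels[j] == labels[i] and j != i]
    other = [j for j in range(len(X)) if labels[j] != labels[i]]
    a = np.mean([abs(X[i, 0] - X[j, 0]) for j in same])   # mean intra-cluster distance
    b = np.mean([abs(X[i, 0] - X[j, 0]) for j in other])  # mean distance to the other cluster
    return (b - a) / max(a, b)

manual = np.mean([silhouette_point(i) for i in range(len(X))])
print(np.isclose(manual, silhouette_score(X, labels)))  # True
```

Since the two clusters are tight and far apart, both the manual value and scikit-learn's agree on an SC close to 1.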
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "assignment_list = [assignments3_w, assignments3_a, assignments3_c, assignments3_a_m, assignments3_a_c]\n", + "plot_sillohette(sample_2, assignment_list, x_lab = 'Model number', start = 1)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The first model with Ward linkage and Euclidean distance has the highest SC. The three-cluster models using average linkage and Euclidean or Manhattan distance are nearly as good. It is clear from the foregoing that the choice of distance metric and linkage can have a significant effect on the results of agglomerative clustering. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Summary\n", + "\n", + "In this lab you have worked with two commonly used clustering models, K-means clustering and hierarchical agglomerative clustering. These methods are examples of unsupervised learning since they attempt to find interesting structure in data without using labels or marked cases.\n", + "\n", + "For agglomerative cluster models the linkage can be any of:\n", + "- **Ward** or minimum increase in variance,\n", + "- **Average** linkage or mean pairwise distance between the members of pairs of clusters, \n", + "- **Complete** or **Maximal** linkage or maximum distance between the members of the two clusters.\n", + "\n", + "Several different distance metrics are used to compute linkage functions:\n", + "- **Euclidean** or **l2**, \n", + "- **Manhattan** or **l1**, \n", + "- **Cosine** similarity.\n", + "\n", + "A critical problem for clustering models is to determine the number of clusters. Since there are no labels, model selection must be performed by a combination of visualization and metrics. 
Metrics used for evaluating cluster models include:\n", + "- The **within cluster sum of squares (WCSS)** and **between cluster sum of squares (BCSS)** are used for K-means clustering only. Ideally, a good K-means cluster model should have small WCSS and large BCSS. \n", + "- Almost any clustering model can be evaluated using the **silhouette coefficient (SC)**. The SC measures the ratio between the distances within a cluster and distances to the nearest adjacent cluster." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3.6", + "language": "python", + "name": "python36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.3" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +}